I've run into an utterly hilarious bug while trying to debug my WiX installer's custom actions.
I didn't write the code, but somehow one of my three managed .dlls becomes invisible to the main .dll (the one referenced and called by WiX) whenever I turn optimisations off (to debug) for the main .dll.
As I understand it, multiple dependencies in WiX are difficult or impossible without using a tool like ILMerge, so I'll probably end up splitting everything down into separate .dlls and having loads of custom actions instead.
The main question here (out of pure curiosity): why do my dependencies load properly when optimisations are enabled, but not when they are disabled?
I am in the process of developing a modular application using C# and MEF. The application is supposed to consist of
a main application, which is the actual executable, providing core functionality, exposing a number of interfaces (extension points) and using MEF to pull in plug-in assemblies that fit into these
a set of plug-ins that provide classes that fit into the interfaces and can be used by the main application
The main application may either run all by itself or with one or more plug-ins imported. This should be a rather standard architecture for a modular MEF-based application.
Initial tests have shown that this generally works. If I deploy the main application as well as one or multiple plug-in assemblies into a folder, everything works fine.
However, I am struggling with how to organize the Visual Studio solution for this. My initial approach is that the main application, as well as each plug-in are separate projects within a solution. The main application is an exe project, whereas the plug-ins are dll projects. Plug-in projects depend on the main project, since they are implementing interfaces and using classes defined in the main application (I could have created a common lib project that does this, but it does not seem to add any benefit).
This way, I can start and debug the main application (with no plug-ins) fine.
But how can the solution be organized so I can debug the main application with one, multiple or all plug-ins?
The current approach builds each plug-in into its own folder (which is generally fine) and copies the main application into each of these (which is not quite desirable). I could potentially configure an individual plug-in project to start the main application in its output folder, but I have no idea how to do this for more than one plug-in or how to do this if the main application should not be copied into each plug-in output folder.
Any hints or best practices would be highly appreciated. I am using Visual Studio 2015 - if that makes any difference.
Plug-in projects depend on the main project, since they are
implementing interfaces and using classes defined in the main
application (I could have created a common lib project that does this,
but it does not seem to add any benefit).
There are benefits to it; here are a couple of them:
it just makes sense that if you have multiple projects which all use the same classes/functions, these reside in a separate common project - and possibly even another separate project just for the interfaces. It makes things easier to grasp, seeing a solution with a main application, some dlls with common functionality, and some plugins
suppose you one day have to add another main app; it can then just use the common dll as well, instead of having to reference the first main app
you won't end up with your main app being copied x times to other projects' output directories because they depend on it
it seems a bit strange (to me at least) that other projects have an exe project as a dependency, rather than a dll project. Just as it is really weird to me that a plugin depends on the main application it is in turn loaded into. Might be just me though - but I think it's also one of the reasons you have to ask the rest of your question in the first place.
But how can the solution be organized so I can debug the main
application with one, multiple or all plug-ins?
Typically you tell your main application at startup which plugins to load. There are different ways to do this: read a file containing the names of plugins, scan a known directory for plugins, or a combination of the two. All are supported or relatively easy to implement using MEF. For instance, in one large C# app we have, all plugins are copied into something like bin\Plugins. When the application starts it looks for dlls in bin\Plugins, filters the list based on a text file containing regexes, then loads plugins from the filtered list. When sending the application to customers, they get all or only some of the plugins. When developing, we use the text file to cut down application load time.
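The directory-scan-plus-filter approach described above can be sketched roughly like this with MEF (the "Plugins" folder name, the "PluginFilter.txt" file name, and the one-regex-per-line format are illustrative assumptions, not taken from any particular app):

```csharp
using System;
using System.ComponentModel.Composition.Hosting;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

static class PluginLoader
{
    // Scans a plugin directory, keeps only the dlls whose file names match
    // one of the regexes in the filter file, and builds a MEF container
    // from the surviving assemblies.
    public static CompositionContainer LoadPlugins(string baseDir)
    {
        var pluginDir = Path.Combine(baseDir, "Plugins");

        // One regex per line; only matching dlls are loaded.
        var patterns = File.ReadAllLines(Path.Combine(pluginDir, "PluginFilter.txt"))
                           .Select(line => new Regex(line))
                           .ToList();

        var catalog = new AggregateCatalog();
        foreach (var dll in Directory.GetFiles(pluginDir, "*.dll"))
        {
            if (patterns.Any(rx => rx.IsMatch(Path.GetFileName(dll))))
                catalog.Catalogs.Add(new AssemblyCatalog(dll));
        }
        return new CompositionContainer(catalog);
    }
}
```

The returned CompositionContainer can then be used to satisfy the main application's imports as usual.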
I have a solution which contains multiple C# projects. One of these projects is an executable, the rest are DLLs.
The problem is, some of the projects do not actually run on the same process as the executable start-up project. This is because some of the projects are really extensions to a WCF service that allow the service to play with the executable.
My question is: is it possible in any way, shape, or form to set a breakpoint in said projects? I am aware of the ability to "attach to process", but I'm not sure it is a good solution for me.
I want to:
Be able to see the source as I break
Not have two copies of Visual Studio open if possible
EDIT: the only reason I am not sure of 'attach to process' would work well for me is because I have little experience with that feature - perhaps it is what I should be using? Can I still load .pdb files?
EDIT 2: If one of the commenters would like to submit their comment as an answer, I will accept
Attaching to the WCF service seems to be the exact tool for the job:
It allows you to attach to a running process, even if you've only got the code and PDBs for a plugin/extension for that process. You can set breakpoints in your own code and they'll be hit when the 3rd-party process calls them.
You can attach to any process from an existing VS instance, even if that instance is used to debug a different executable, in this case your main EXE project. You can start debugging your app, then attach to the service before making the service call.
Make sure, though, that the DLLs called by the WCF service are the same ones you have in your VS instance - if they're loaded from the same location as the VS build output, you'll be fine. If not, make sure they're in sync.
I've got a .net windows service application that calls out to a bunch of other dlls of our own creation. However, all of these dlls were created for x86, and we've since moved on to Any CPU (on x64 environments). Sadly, thanks to .NET's delayed loading functionality, many of these dlls are not loaded unless we exercise some rare and somewhat complicated code paths. The net result? We get incorrect format exceptions days or weeks after deploying code.
I want to know if there's a way to force .NET to fully load all assemblies that it directly references so that I can see such incompatibilities without manually poring through the dozens of projects from which these dependent dlls were created, or worse, doing full regression tests to force all of the assemblies to be loaded.
Addendum: Even if there's an easier way to resolve my specific x86-dlls-in-a-x64-environment issue, I'm still interested in seeing if there's a way to force the environment to load all of its dependencies. You never know when it'll come in handy! :)
EDIT: These are all managed DLLS, and I actually have used reflection to triage the issue in production. However, in a development environment, it suffers from the same problem the other options have: manually traversing all of the DLLs in one way or another.
One way is to statically link the libraries by using ILMerge to merge them into the executable itself. Otherwise, the whole point of DLLs is that they are not loaded until they are used in code.
see The .NET equivalent of static libraries?
Of course, you could just run a diagnostic sequence on app load that touches some aspect of each of your dlls, so that at load time you at least know what is there and what won't work.
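For the more general "force everything to load" part of the question, a reflection-based sketch along these lines should work for managed references. Two caveats: the compiler only records references to assemblies that are actually used somewhere in the code, and Assembly.GetEntryAssembly() can return null in some hosting scenarios.

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

static class AssemblyPreloader
{
    // Walks the reference graph from the entry assembly, forcing every
    // directly or transitively referenced assembly to load immediately,
    // so that a mismatched (e.g. x86-only) dll fails at startup rather
    // than weeks later on some rare code path.
    public static void PreloadReferencedAssemblies()
    {
        var seen = new HashSet<string>();
        var queue = new Queue<Assembly>();
        queue.Enqueue(Assembly.GetEntryAssembly());

        while (queue.Count > 0)
        {
            foreach (AssemblyName reference in queue.Dequeue().GetReferencedAssemblies())
            {
                if (seen.Add(reference.FullName))
                    queue.Enqueue(Assembly.Load(reference)); // throws BadImageFormatException on an x86/x64 mismatch
            }
        }
    }
}
```

Calling this at service start-up turns the "days or weeks later" failure into an immediate one.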
In a quest to reduce the dependencies in my projects, I now have everything depending on and implementing interfaces, and they are glued together by an IoC container. This means projects need only to have direct references to such interface libraries.
However, if you don't mark the project as having a reference to the implementation (even though you don't need it at compile time), the implementation libraries are not included with the executable or in the setup project.
Is in a way Visual Studio promoting bad practices by requiring explicit references when they are not needed? Is it possible to have the dependencies only to the required interfaces and in this case what is the best method to get the implementation libraries available?
Is in a way Visual Studio promoting bad practices by requiring explicit references when they are not needed?
Not really. Your issue is one of deployment, not building.
Deploying an application and building it are separate things - if you have a good deployment strategy, you will be deploying implementations where they belong.
Is it possible to have the dependencies only to the required interfaces and in this case what is the best method to get the implementation libraries available?
The easiest way is indeed to reference the implementation assemblies. This will definitely make building and running locally as easy as F5, but do you really want that? To be honest, if you and your team have the discipline to only code to interfaces, that's all you need (and there are static analysis tools like nDepend that help with ensuring that remains the case).
One way forward is to create a deployment script that will deploy all dependencies whether local or elsewhere.
Visual studio does not require these references, but your IoC container does.
When adding a reference to the project, its binaries are automatically included in the output folder, which is necessary for your IoC container to glue the code together. There are other ways to get these binaries to the output folder than referencing their projects in Visual Studio - perhaps a post-build step?
No. It is simply the minimum Visual Studio needs to do to give developers working code without any extra steps (aside from hitting F5); the alternative - adding all references by default - would likely be a mess, and slow on older hard discs.
For local development builds, you can simply add a post-build step on the relevant project to copy the DLLs to the main working directory. Just make sure the project is added to the active configuration to be built, otherwise you'll go through a whole annoying debug session only to realise the post-build never ran... lol
VS 2010. Post-build. Copy files in to multiple directories/multiple output path
Copy bin files on to Physical file location on Post Build event in VS2010
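As a rough illustration, such a post-build event could look something like this ("MainApp" is a placeholder for your own executable project's folder):

```
xcopy /y /d "$(TargetDir)*.dll" "$(SolutionDir)MainApp\bin\$(ConfigurationName)\"
```

$(TargetDir), $(SolutionDir) and $(ConfigurationName) are standard Visual Studio build macros, so the same command works across Debug and Release configurations.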
For full-scale application deployment, you'd likely be looking at Visual Studio setup projects at a bare minimum, but more ideally something like WiX or another deployment tool-set.
In VS, I've only tested code and debugged it, but never actually prepared anything for a finalized program or release. Some of the programs I've downloaded have had dlls that need to be in the folder they're in, and I've had programs that come as just one .exe. Is there a way to compile all the files into one application and not have external dlls? Is this bad programming practice for some reason? How do I compile my VS program into one executable file?
I know this is quite an obvious question, which is why I can't really find an answer, because it would be too obvious to write any kind of tutorial on it.
With a managed language like C# or VB.NET, ILMerge is a utility that you can use.
ILMerge is a utility for merging multiple .NET assemblies into a single .NET assembly. It works on executables and DLLs alike and comes with several options for controlling the processing and format of the output.
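A typical invocation (assembly names here are placeholders) merges an executable and its libraries into a single file:

```
ilmerge /target:exe /out:Merged\MyApp.exe MyApp.exe LibA.dll LibB.dll
```

The first input assembly is the primary one, whose entry point the output inherits; options like /targetplatform let you control which .NET runtime version the merged assembly targets.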
If the question is just about getting Visual Studio to build executable programs: it does that every time you run your program within it. If you are using all of the default settings, open your project folder and look for a /bin directory. Underneath it there are /debug and /release directories. If you build your program in debug mode, look in /debug; if you build it in release mode, look in /release. VS will put everything your program needs in that directory. You can copy all of those files to another machine that has the .NET runtime installed and it should run.
If the question is more about combining multiple dlls into a single exe, there is actually a tutorial on it at CodeProject:
As you know, traditional linking of object code is no longer necessary
in .NET. A .NET program will usually consist of multiple parts. A
typical .NET application consists of an executable assembly, a few
assemblies in the program directory, and a few assemblies in the
global assembly cache. When the program is run, the runtime combines
all these parts to a program. Linking at compile time is no longer
necessary.
But sometimes, it is nevertheless useful to combine all parts a
program needs to execute into a single assembly. For example, you
might want to simplify the deployment of your application by combining
the program, all required libraries, and all resources, into a single
.exe file.
http://www.codeproject.com/KB/dotnet/mergingassemblies.aspx
Lastly, if the question is about building an installer for broad distribution, Jan Willem B has a sample for WIX: Using WIX to create an installer: quickstart tutorial
Yes, you can use ILMerge for embedding managed .NET DLLs! Example here.