Basically, I developed a small library with some common functionality that I use in all my projects. For political reasons, I cannot choose a generic name for that library (including the namespace and assembly name). Usually, it must include the name of the enterprise, something like this for the namespace: Enterprise.ProjectName.XXX.YYY.
At the moment, I make a copy of my library, rename the namespaces manually with Visual Studio, and finally recompile the whole thing.
So my question is the following: is it possible to create a small program that takes an assembly as input and renames all namespaces from MyLibrary.XXX.YYY to Enterprise.ProjectName.XXX.YYY, as well as the assembly name?
What are the steps to follow?
[Edit]
Generating the assembly automatically seems like too much work. I will use ReSharper and/or Ctrl+Alt+F as I have done so far. Thanks for the answers...
You could use Mono's Cecil project to disassemble the assembly, inspect each type, rename or recreate the type with a new namespace, and generate the resulting assembly.
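As a minimal sketch of that approach with Mono.Cecil (the file names and namespace prefixes are taken from the question; nested types, attribute arguments, and strings that embed the old namespace would need extra handling, and a strong-named input would have to be re-signed):

```csharp
using Mono.Cecil;

static class NamespaceRenamer
{
    static void Main()
    {
        // Load the assembly to rewrite.
        var assembly = AssemblyDefinition.ReadAssembly("MyLibrary.dll");

        // Rename the assembly identity and its main module.
        assembly.Name.Name = "Enterprise.ProjectName";
        assembly.MainModule.Name = "Enterprise.ProjectName.dll";

        // Rewrite the namespace of every top-level type.
        // (Nested types inherit the namespace of their declaring type.)
        foreach (var type in assembly.MainModule.Types)
        {
            if (type.Namespace.StartsWith("MyLibrary"))
                type.Namespace = "Enterprise.ProjectName" +
                                 type.Namespace.Substring("MyLibrary".Length);
        }

        // Save the renamed assembly under the new file name.
        assembly.Write("Enterprise.ProjectName.dll");
    }
}
```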
That being said, it might be simpler to use a tool like ReSharper, which allows you to rename namespaces correctly within the code base.
Some options:
If you are copying the entire source code for your library into your new project, you can use a refactoring tool like ReSharper to "Adjust Namespaces". This is a pretty quick and safe refactoring.
If you just need to avoid shipping the internally named assembly, you may be able to use ILMerge to 'hide' the internal assembly during a post-build step (see the sketch after this list). This is viable if it's just a perception issue for the final assembly names in the binary output directory.
Deal with the issue at the political level by describing your internal library as being no different from any other third-party dependency. Then the naming is no longer a problem. This may solve other problems if you're shipping the source code of this library to multiple clients, as it clarifies that you are not giving full ownership of your 'shared' code to each client. Otherwise they could potentially argue that you are not allowed to use that 'shared' code in projects for other clients, since it is clearly owned by them, having their enterprise name in the namespace.
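For the ILMerge option above, the post-build step could look something like the following. The assembly names are illustrative; /out and /internalize are real ILMerge switches (/internalize makes the merged-in library's public types internal so they don't leak from the final assembly).

```
ilmerge /internalize /out:Enterprise.ProjectName.dll MainAssembly.dll MyLibrary.dll
```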
Related
I managed to decompile a C# executable (using dotPeek) and I want to edit a couple of simple things (using Visual Studio).
The problem is that this file has many DLL dependencies, even though the edits are only needed in the main EXE.
Obviously, if you try to build an EXE in Visual Studio without the references and dependencies in place, the compiler will complain. Are there any solutions to this?
You cannot build without the dependencies; however, there is no need to decompile the dependencies. Just add the DLLs themselves as references to the project.
This is always fine if the decompiled assembly depends on other DLLs; however, if the other DLLs depend on the decompiled assembly, this will only work if the assemblies are not signed, i.e. if they are not using strong names. The purpose of signing is precisely to disallow such hacks.
No, you can't build without the dependencies because the compiler has to check that types match and have the indicated members etc.
I have a MEF-based application which uses adapters to process files. It uses configuration files to determine which directories to watch and which adapter to use to process each type of file. Plugins take the form of a .dll that implements a common interface.
Each .dll requires its own set of dependent libraries. For instance, plugin1.dll might need to use apilibrary.dll and xmllibrary.dll. It is also possible that at a later date I might want to add plugin2.dll, and plugin2.dll might use xmllibrary.dll as well. These dependent libraries are updated regularly, so I can't count on plugin2.dll using the exact same version of xmllibrary.dll used in plugin1.dll.
I'd like to compile each plugin to one .dll file that invisibly includes within itself all of its dependent libraries, which seems like one way to solve this problem. Alternatively, I'd like to figure out how each .dll file can look for its dependent libraries in a subfolder, which I believe would also reduce the possibility of versioning conflicts. Or maybe there's a dead simple solution to this problem that I haven't even considered (which is always very, very likely).
Any thoughts?
You should probably try to get this to work with standard .NET loading rules. However, if you do need to control exactly how assemblies are loaded and which versions are loaded, this blog post shows how: Using Loading contexts effectively
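If you do end up needing manual control, one common technique is to hook AppDomain.AssemblyResolve and probe a folder of your choosing. The "plugins" subfolder below is a made-up example:

```csharp
using System;
using System.IO;
using System.Reflection;

static class PluginDependencyResolver
{
    public static void Install()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            // args.Name is the full assembly name; take the simple name.
            string fileName = new AssemblyName(args.Name).Name + ".dll";

            // Probe a "plugins" subfolder next to the application.
            string candidate = Path.Combine(
                AppDomain.CurrentDomain.BaseDirectory, "plugins", fileName);

            // Returning null lets the CLR's normal loading rules continue.
            return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
        };
    }
}
```

The handler only fires for assemblies the CLR could not find on its own, so it layers on top of the standard loading rules rather than replacing them.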
I guess you need to weigh up deployability vs. maintenance. The simple solution is to use a tool called ILMerge. ILMerge takes your project output and can take other assemblies and merge them together. This enables you to wrap up all of the assemblies that your plugin is dependent on, and merge them into a single assembly. Optionally you can do things like re-signing with your public key, etc. Here is a good read: Leveraging ILMerge to simplify deployment and your users experience by Daniel Cazzulino.
But while that is good, what happens if a new version of the referenced assembly is distributed that fixes bugs in the one you have embedded? By the rules of Fusion's assembly loader, when it loads the types from your referenced assembly, it will see that they have already been loaded, so there is no reason for it to load the updated version. This would then mean you need to recompile your plugin and merge the newer referenced assembly again.
My question would be, is it really that important to ensure a specific version is used? If a newer version provides an updated implementation (that doesn't break backwards compatibility) then surely this should benefit all plugins that need to reference it?
As for how assemblies are loaded in reference to each other, have a read of Understanding .Net Assemblies and References, which is an invaluable piece of information.
MEF uses standard .NET assembly loading, and everything's loaded in a single AppDomain. You have very little control over how dependencies are loaded - as they just get loaded automatically by the CLR when the assembly is injected via MEF. Normal CLR assembly loading rules apply when using MEF, so dependencies will be loaded as if they were a dependency of your application - no matter where they're located or referenced.
For the most part, if the plugins and their dependencies are properly written, you most likely will not need to worry about this. As long as the versioning in the dependencies is correct, it will likely just work.
I'm trying to use Code Contracts and I'm running into a problem that is blocking me. With Contract Reference Assembly set to Build, ccrewrite fails while trying to access assemblies that are referenced indirectly by assemblies that are referenced directly. These indirect assemblies are not needed to build the solution, so I'm wondering why they're required by Code Contracts. Also, is there a way to work around this problem without having to provide all runtime dependencies as part of the build?
I assume ccrewrite is trying to walk the dependency chain to analyze it for pre/postconditions, etc. If the assemblies are referenced by assemblies which you in turn reference, then they would be required for your program to run, so ccrewrite is just performing normal analysis before you actually run the program.
That's based on using JML; I've only just started looking at the .NET Code Contracts myself. But I believe both tools operate on roughly the same principles.
The rewriter looks into method bodies of referenced assemblies in order to extract contracts (the C# compiler never does that). As a result, the rewriter often chases more dependencies than C# which is the problem you ran into.
There are two ways to address this.
Add extra paths to directories where the desired libraries can be found (in the contract library paths options). This is the preferred method (see the project-file sketch below).
As a last resort, you can add the option -ignoreMetadataErrors to the runtime contract options. Note that this is dangerous. In the case that the rewriter truly needs some aspect of the referenced code in order to create proper IL, you might end up with incorrect IL. To guard against this, use peverify on the resulting bits.
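As a sketch of how those settings end up in the project file (the CodeContracts* property names are assumptions based on the Code Contracts MSBuild integration; they are normally written for you by the Code Contracts pane in the project properties):

```xml
<PropertyGroup>
  <!-- Preferred: tell ccrewrite where to find the indirect dependencies. -->
  <CodeContractsLibPaths>..\ThirdParty\bin</CodeContractsLibPaths>
  <!-- Last resort, as described above; verify the output with peverify. -->
  <CodeContractsExtraRewriteOptions>-ignoreMetadataErrors</CodeContractsExtraRewriteOptions>
</PropertyGroup>
```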
Hope this helps.
Let's assume I have two assemblies:
MyExecutable.dll version 1.0.0
MyClassLibrary.dll version 1.0.0
Now, MyExecutable.dll currently uses MyClassLibrary.dll's classes and methods (which include some algorithms). Most of those algorithms were written on the fly, and I'll want to refine them later if needed. This means I won't change the interface of those classes, but the code itself will see some changes.
The question at hand is: MyExecutable.dll will be expecting MyClassLibrary.dll 1.0.0, and I'll want it to use version 1.0.1 (or something like that). I don't want to have to recompile MyExecutable.dll (because there might actually be more than one executable using MyClassLibrary.dll). Is there a solution for this problem? I've heard about the GAC, but if possible I'd like to stay away from it.
Thanks
You are looking for Assembly Binding Redirection - this is a configurable way to tell .NET which assembly versions to use.
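For the scenario above, a minimal app.config sketch for the executable might look like this. The publicKeyToken is a placeholder, binding redirects only apply to strong-named assemblies, and the version numbers assume the 1.0.0 to 1.0.1 example from the question:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- publicKeyToken is a placeholder; use your library's real token. -->
        <assemblyIdentity name="MyClassLibrary"
                          publicKeyToken="0123456789abcdef"
                          culture="neutral" />
        <bindingRedirect oldVersion="1.0.0.0" newVersion="1.0.1.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```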
The first solution is Assembly Binding Redirection, already recommended by Oded.
It is advantageous if you have a small .dll and want existing executables to work with its newer versions.
The second option is creating a separate assembly for the interfaces, and referencing only that from the executable.
This way, you can allow third parties to build stuff against your library without giving them the exact library assembly. (E.g., they can't decompile it with Reflector, so it is more secure this way.)
As long as the interface assembly doesn't change, you can change other stuff in the library pretty much as you want.
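Here is a minimal sketch of that layout; all type and assembly names are made up for illustration. The executable compiles only against the interface assembly and resolves the concrete type at run time, so the library can be replaced without recompiling:

```csharp
using System;
using System.Reflection;

// --- MyInterfaces.dll: the only assembly the executable references at compile time ---
public interface IAlgorithm
{
    int Compute(int input);
}

// --- MyExecutable: resolves the implementation at run time, so
//     MyClassLibrary.dll can be swapped without recompiling ---
static class Program
{
    static void Main()
    {
        Assembly lib = Assembly.Load("MyClassLibrary");

        // "MyClassLibrary.DefaultAlgorithm" is a hypothetical implementation type.
        Type implType = lib.GetType("MyClassLibrary.DefaultAlgorithm");
        var algorithm = (IAlgorithm)Activator.CreateInstance(implType);

        Console.WriteLine(algorithm.Compute(42));
    }
}
```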
I'm building a tool in managed code (mostly C++/CLI) in two versions, a 'normal user' version and a 'pro' version.
The fact that the core code is identical between the two versions has caused me a little trouble: I want to package the resulting tool as a single assembly (DLL), and I don't want to have to include the .cpp files for the common code in the projects of the two versions of the tool. I'd rather have one project for the common code and one project for each version of the tool, and have each tool project depend on the common code and link it in as desired.
In unmanaged C++ I'd do this by placing the common code in a static library and linking both versions of the tool to it. I don't seem to be able to get this to work in C++/CLI. It seems that I'm forced to build the common code into a DLL assembly, and that results in more DLLs than I'd like.
So, in summary, I can't work out how to build the common code in one project and link it with each of the final product projects to produce two single DLL assemblies that both include the common code.
I'm probably doing something wrong, but I tried to work out how to do this using netmodules and whatnot, and I just couldn't get it to work. In the end, the only way I got it working was to tell the linker to link the intermediate build products of the common code project rather than its final output, which works but is a bit of a hack IMHO.
Anyway, does anyone have any suggestions for how I SHOULD be solving this problem?
Edited: I guess I should have mentioned the fact that the assemblies generated are not 100% managed code, they contain a mix of managed and unmanaged code as is, probably, quite common with assemblies produced with C++/CLI...
If you are annoyed by all the DLLs, download ILMerge. I use it to bundle multiple DLLs into an easy-to-use .EXE for my clients.
If I'm understanding this correctly, you have a solution which contains two projects: one project for the "normal" user and one project for the "pro" user. Visual Studio allows you to add a "link" to a source file from another project. If your "pro" version has the real core code file, then in your "normal" version use Add -> Existing Item, find the file in the "pro" project, click the down arrow next to the Add button, and select "Add as Link". Now you have a single file that is literally the same between the two projects.
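For a C# project, the linked file shows up in the .csproj like the sketch below (paths and file names are made up); a C++ project records the shared file simply as a relative path:

```xml
<ItemGroup>
  <!-- Compile the 'pro' project's copy of the file without duplicating it. -->
  <Compile Include="..\ProVersion\CoreEngine.cs">
    <Link>CoreEngine.cs</Link>
  </Compile>
</ItemGroup>
```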
As said, ILMerge is one way. Personally, if you're bundling an exe with a lot of DLLs, I favor Netz.
You could use modules. You can link them into an assembly using the assembly linker, al.exe.
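Since the question is C++/CLI, the toolchain sketch below uses the MSVC compiler and linker rather than al.exe: /LN compiles to a .netmodule, and passing .netmodule files as linker input merges them into the output assembly. Mixed managed/unmanaged code (as mentioned in the edit above) can complicate this, so treat it as a starting point:

```
rem Compile the common code to a .netmodule (MSIL module, no manifest).
cl /clr /LN Common.cpp

rem Merge the module into each tool's single DLL assembly.
link /DLL /OUT:ProTool.dll ProTool.obj Common.netmodule
link /DLL /OUT:NormalTool.dll NormalTool.obj Common.netmodule
```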
That's the downside of the .NET compilation process: you can't have things like static libraries and the header files that hold them together. Everything is held in one big DLL file, and the only way to share information is either to build a common DLL and reference it from other assemblies, or to duplicate the code in each DLL (possibly by copying/linking .cs files between projects).
Note that the 2nd way will declare different types, even though they have the same name. This will bite you on the ass with stuff like remoting (or anything that requires casting to specific shared interfaces between processes).
Remotesoft Salamander will hook you up. It's basically a native compiler and linker.
When using Mono (or if Cygwin is an option), mkbundle may also be a valid choice.