I was given a task to edit an existing website, but the client no longer has access to the source code, so I retrieved what I could from the virtual machine. Unfortunately, the only files I could get from the VM are the front-end files. I need to change something in the controllers, so I was looking into reverse engineering the project DLL.
Is it possible to just decompile the current DLL, create a new project, copy everything over, compile a new working DLL, and re-upload the DLL file to the server? In my experience as a programmer, DLLs are given a GUID identifier that is compiled into the solution when it is deployed. I think recreating the project will assign a different GUID to the new DLL. Will this be all right when it is uploaded to the server?
Any help and clarification is appreciated. Thanks.
Yes, the reuploaded DLL should be fine.
However, if the assembly is strongly named, anything that references that DLL won't be too happy with a rebuilt copy. In that case, you'll need to repeat the same process for those assemblies and link them against your new project, which you'll probably want to do anyway so you aren't trapped by their behavior.
IIS itself won't care; it's just looking for an assembly with the entry point.
Here's an answer to a similar question: How do I decompile a .dll file?
It's more of a maybe... Try it and see if it works. You will need to know which framework and assemblies are tied into the library, and what language it's programmed in.
Yes, rebuilding the DLL from scratch will work unless the "uploading" process has certain particularities we're unaware of (such as assigning GUIDs to DLLs).
In practice, however, depending on the language used, automatic decompilation may not be so straightforward, and doing it manually is a complex and difficult task, so I would approach it with caution.
When you create a project of type "Class Library", you can usually generate a DLL when compiling. But how can I generate a DLL without losing the other DLLs I have already included?
Let me explain with an example: it turns out NuGet downloaded an S22.Imap DLL alongside the one I was working on. Later I generated my DLL in the traditional way I described above, but when I tried to use the DLL on another computer, I got errors saying the functions contained in the S22.Imap DLL could not be found. To solve this problem, I had to copy my project's DLL and, additionally, S22.Imap to a specific path on the other computer.
My question is:
How could you generate a dll that includes the ones included in the project you were working with?
All the referenced third-party DLLs (S22.Imap.dll in your example) are copied to the output folder together with your own DLL file (let's say a.dll) when you build your project. That means you should always copy them together (S22.Imap.dll + a.dll) to wherever you want to reference them on another computer/folder/place.
If you really want to ship only one file (although it is not recommended), you can embed the S22 DLL as a "nested resource". Then you get a single a.dll file with the S22 DLL inside it. See the page below for reference:
Embedding one dll inside another as an embedded resource and then calling it from my code
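For illustration, here is a minimal sketch of that approach, assuming S22.Imap.dll has been added to your project as an Embedded Resource and that "MyApp" stands in for your project's default namespace. You hook AppDomain.AssemblyResolve so the runtime can load the embedded copy:

    using System;
    using System.IO;
    using System.Reflection;

    static class EmbeddedAssemblyLoader
    {
        // Call once at startup, before any type from S22.Imap is used.
        public static void Install()
        {
            AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
            {
                // Embedded resources are named "<DefaultNamespace>.<FileName>";
                // "MyApp" is a placeholder for your default namespace.
                string name = new AssemblyName(args.Name).Name + ".dll";
                using (Stream s = Assembly.GetExecutingAssembly()
                                          .GetManifestResourceStream("MyApp." + name))
                {
                    if (s == null)
                        return null; // not one of our embedded assemblies

                    byte[] raw = new byte[s.Length];
                    s.Read(raw, 0, raw.Length);
                    return Assembly.Load(raw);
                }
            };
        }
    }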
Alternatively, ILMerge is a tool that can merge the assemblies for you.
In general, you don't. A DLL is a dynamic-link library, and you would normally only combine static libraries during a build. Here is an answer on the difference between static and dynamic linking.
Typically you would include all the DLLs you need in the installer package. If you use Visual Studio to create the installer, it can detect the dependencies for you. When you run the installer, all of the necessary DLLs are deployed. Nearly all commercial .NET software follows this pattern.
It is possible to merge an assembly into another assembly using a tool called ILMerge. This would be a very unusual thing to do, and could cause issues with intellectual property and code signing, so it is not recommended.
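For reference, a basic ILMerge invocation looks something like this (the primary assembly comes first; the output name is illustrative):

    ILMerge.exe /out:Merged.dll a.dll S22.Imap.dll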
I am using Visual Studio 2010 and I'm trying to create a .dll. My .dll uses an external library (.lib). This library in turn contains a collection of other libraries (.lib).
So: my main.lib is a container for the collection of libs - and as a result it is about 300 MB big.
Now when I use the lib in my dll, it links fine and works correctly on my PC. But when I deploy my compiled program to another computer, it can't load the .lib. It simply can't find it, even when I've put it into the directory of the .dll.
Now my question: is there a way I can store all the functionality of my .lib in a .dll? The .dll file would then be about 300 MB big, but I wouldn't need to deploy the .lib anymore.
Update:
Thank you all very much for your answers. To describe my problem, I want to show you this output of my program:
Unhandled Exception: System.Runtime.InteropServices.SEHException: External component has thrown an exception.
I've spent many hours on Google trying to solve this error. I've found out that it's either a problem with a missing file (one of my "external components" (.dll) couldn't find definitions of classes and so on) or a problem with access rights.
I tried my best to fix this, and on one attempt I had success and could use the program. I know that was because I had put main.lib into the right folder, so my program could find it. But now I don't know where to put this main.lib. So: my program is broken again and now I want to fix it... I hope this description helps. It's hard to describe because I don't know exactly what the problem is...
Update 2:
Thanks to your help I solved my problem. At first I misunderstood how .dll and .lib files work. If anyone else has this problem and gets redirected to this post, then @D Stanley's answer will help.
Thanks to @David Heffernan I found out that it isn't a missing .lib or anything like that causing this error. It's a problem in my native C++ code (which is in the .lib). So I fixed that problem (which caused an exception) and now everything is working fine.
Thank you all for your help.
You cannot statically link static libraries into other static libraries. What you should be doing instead is statically linking all those individual static libraries into your DLL. Does the linker not warn you about this?
Also, you can't deploy static libraries to another machine, as they can't be linked at run time.
If I understand your situation:
You have several static libraries (.lib)
They are linked together into one big static library (main.lib)
You want to use this library in your dynamic library (.dll)
I'm not certain what's happening locally, but lib files are not "loaded" at run time - they are linked either into a dynamic library that is loaded at run time (hence the name "dynamic") or into the executable itself. So if your application is working now, then either you're already linking part of it into your dll or it's getting linked into the executable.
So to answer your question, yes, you can link your lib file into your dll - and it will include all of the necessary object code into it. Note that it may not be as big as the source library - that depends on how much of the original code is used by your library.
I also don't see how C# is part of your situation.
From your description it seems that you are linking against the *.lib import stubs that accompany DLLs for load-time linking. You have those DLLs on your computer but not on the other computers where you try to use your DLL. So to make everything work, find those DLLs and copy them together with your DLL.
I have created several small applications that use my own DLL. The problem is, this DLL is constantly changing. My current solution to this problem is that I have a Setup project in the class library solution that creates and registers the DLL. In all my applications I then have to open the solution and re-reference the newly created/registered DLL. Then I have to re-compile their setup projects, uninstall the old applications, and then re-install the new application.
There has to be a better way and I'm just not sure because I'm fairly new to all this. I have looked into ClickOnce but I don't think that will solve my issue as I cannot publish a class library. I have looked into checking version numbers but I must be doing something wrong because it doesn't work either.
I understand that once a DLL is created and being used in an application it should essentially not be touched. I do not have that option in this situation. It is constantly updated. Done.
So, is there a better way? A point in the direction of a guide or related question/answer/forum would be greatly appreciated.
Edit: The DLL is not constantly changing during runtime, but it is constantly evolving to allow more functionality and detail within the other applications. Also, one big thing I guess I should have mentioned: the public interface is constantly changing - usually by adding new methods.
Make sure the references to your DLL specify SpecificVersion=false. Then just deploy each new version into the GAC and that should do the trick.
If necessary, you can also manually force versions using binding redirection.
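In the .csproj, such a reference looks something like this (the assembly name and hint path are placeholders):

    <Reference Include="MyLibrary">
      <SpecificVersion>False</SpecificVersion>
      <HintPath>..\libs\MyLibrary.dll</HintPath>
    </Reference>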
A solution you can try is to keep everything in a single solution and add a project reference wherever the DLL is needed.
Check out NuGet
You could set up an internal NuGet repository (really just a folder that stores .nupkg files). Then when you build a new DLL, you can update the apps as needed in Studio. This would ensure they have the latest version. They shouldn't need a redeployment unless there are bugs in the DLL that you're fixing.
One solution is as follows:
Physically separate the interface from the implementation. E.g. AssemblyA is the interface, and the application (AssemblyB, say) knows only the interface at compile time. The implementation (AssemblyC) also knows/references AssemblyA, of course. The point is that AssemblyB does not reference AssemblyC. This will require you to use an IoC container (like MS Unity 2.0, but there are many others) in order to resolve and instantiate your concrete types at runtime.
Write an update process that finds the new AssemblyC.dll, replaces the local copy, and uses reflection along with the IoC container to load the new implementation at whatever interval you require, typically app start-up.
The above relies on your interface being stable. If it isn't, you may be able to write a (more) stable Facade.
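Here is a minimal sketch of step 2, with the IoC container left out for brevity; IService, AssemblyC.dll and the folder layout are hypothetical names:

    using System;
    using System.IO;
    using System.Linq;
    using System.Reflection;

    // Defined in AssemblyA, referenced by both AssemblyB and AssemblyC.
    public interface IService
    {
        void Run();
    }

    static class ImplementationLoader
    {
        // Called by AssemblyB at app start-up to pick up the latest AssemblyC.dll.
        public static IService Load(string path)
        {
            Assembly impl = Assembly.LoadFrom(Path.GetFullPath(path));

            // Find the first concrete type implementing the shared interface.
            Type concrete = impl.GetTypes()
                .First(t => typeof(IService).IsAssignableFrom(t)
                            && !t.IsAbstract && !t.IsInterface);

            return (IService)Activator.CreateInstance(concrete);
        }
    }

    // Usage: IService service = ImplementationLoader.Load("AssemblyC.dll");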
I'm building a tool in managed code (mostly C++/CLI) in two versions, a 'normal user' version and a 'pro' version.
The fact that the core code is identical between the two versions has caused me a little trouble as I want to package the resulting tool as a single assembly (DLL) and I don't want to have to include the .cpp files for the common code in the projects of the two versions of the tools. I'd rather have a project for the common code and a project for each version of the tool and have each version of the tools project depend on the common code and link it in as desired.
In unmanaged C++ I'd do this by placing the common code in a static library and linking both versions of the tool to it. I don't seem to be able to get this to work in C++/CLI. It seems that I'm forced to build the common code into a DLL assembly, and that results in more DLLs than I'd like.
So, in summary, I can't work out how to build the common code in one project and link it with each of the final product projects to produce two single DLL assemblies that both include the common code.
I'm probably doing something wrong, but I tried to work out how to do this using netmodules and whatnot and I just couldn't get it to work. In the end, the only way I got it working was to tell the linker to link against the build products (the object files) of the common-code project rather than the resulting assembly, which works but is a bit of a hack IMHO.
Anyway, does anyone have any suggestions for how I SHOULD be solving this problem?
Edited: I guess I should have mentioned the fact that the assemblies generated are not 100% managed code, they contain a mix of managed and unmanaged code as is, probably, quite common with assemblies produced with C++/CLI...
If you are annoyed at all the DLLs, download ILMerge. I use this to bundle together multiple DLLs into an easy-to-use .EXE for my clients.
If I'm understanding this correctly, you have a solution which contains two projects: one project for the "normal" user and one project for the "pro" user. Visual Studio allows you to add a "link" to a source file from another project. If your "pro" version has the real core code file, then in your "normal" version choose Add -> Existing Item, find the file in the "pro" project, click the down arrow by the Add button, and select "Add as Link". Now you have a single file that is literally the same between the two projects.
As said, ILMerge is one way. Personally, if you're bundling an exe with a lot of DLLs, I favor NETZ.
You could use modules. You can link them into an assembly using the assembly linker, al.exe.
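For example, from a Visual Studio command prompt (file names are illustrative):

    csc /target:module /out:Common.netmodule Common.cs
    csc /target:module /addmodule:Common.netmodule /out:ProFeatures.netmodule ProFeatures.cs
    al /target:library /out:ProTool.dll Common.netmodule ProFeatures.netmodule

Note that, as far as I know, this produces a multi-file assembly - the .netmodule files still have to ship alongside the final DLL - so it shares compiled code between projects but doesn't by itself reduce the file count.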
That's the downside of the .NET compilation process: you can't have things like static libraries and the header files that hold them together. Everything is held in one big DLL file, and the only way to share code is to either build a common DLL and reference it from other assemblies, or duplicate the code in each DLL (possibly by copying/linking .cs files between projects).
Note that the 2nd way will declare different types, even though they have the same name. This will bite you on the ass with stuff like remoting (or anything that requires casting to specific shared interfaces between processes).
Remotesoft Salamander will hook you up. It's basically a native compiler and linker.
When using Mono (or if Cygwin is an option), mkbundle may also be a valid choice.
To preface, I've been working with C# for a few months, but I'm completely unfamiliar with concepts like deployment and assemblies, etc. My questions are many and varied, although I'm furiously Googling and reading about them to no avail (I currently have Pro C# 2008 and the .NET 3.5 Platform in front of me).
We have this process and it's composed of three components: an engine, a filter, and logic for the process. We love this process so much we want it reused in other projects. So now I'm starting to explore the space beyond one solution, one project.
Does this sound correct? One huge Solution:
Process A, exe
Process B, exe
Process C, exe
Filter, dll
Engine, dll
The engine is shared code for all of the processes, so I'm assuming that can be a shared assembly? If a shared assembly is in the same solution as a project that consumes it, how does it get consumed if it's supposed to be in the GAC? I've read something about a post-build event. Does that mean the engine.dll has to be redeployed on every build?
Also, the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process, so that the process executable doesn't need to be updated. Regardless of whether that's best practice, let's just roll with it. Is this possible? I've read that assemblies link to specific versions of other assemblies, so if I update the DLL only, it's actually considered tampering. How can I update the DLL without changing the EXE? Is that what a publisher policy is for?
By the way, is any of this stuff Google-able or Amazon-able? What should I look for? I see lots of books about C# and .NET, but none about deployment or building or testing or things not related to the language itself.
I agree with Aequitarum's analysis. Just a couple additional points:
The engine is shared code for all of the processes, so I'm assuming that can be a shared assembly?
That seems reasonable.
If a shared assembly is in the same solution as a project that consumes it, how does it get consumed if it's supposed to be in the GAC?
Magic.
OK, it's not magic. Let's suppose that in your solution your process project has a reference to the engine project. When you build the solution, you'll produce a process assembly that has a reference to the engine assembly. Visual Studio then copies the various files to the right directories. When you execute the process assembly, the runtime loader knows to look in the current directory for the engine assembly. If it cannot find it there, it looks in the global assembly cache. (This is a highly simplified view of loading policy; the real policy is considerably more complex than that.)
Stuff in the GAC should be truly global code; code that you reasonably expect large numbers of disparate projects to use.
Does that mean the engine.dll has to be redeployed on every build?
I'm not sure what you mean by "redeployed". Like I said, if you have a project-to-project reference, the build system will automatically copy the files around to the right places.
the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process so that the process executable doesn't need to be updated
I question whether that's actually valuable. Scenario one: no filter assembly, all filter code is in project.exe. You wish to update the filter code; you update project.exe. Scenario two: filter.dll, project.exe. You wish to update the filter code; you update filter.dll. How is scenario two cheaper or easier than scenario one? In both scenarios you're updating a file; why does it matter what the name of the file is?
However, perhaps it really is cheaper and easier for your particular scenario. The key thing to understand about assemblies is assemblies are the smallest unit of independently versionable and redistributable code. If you have two things and it makes sense to version and ship them independently of each other, then they should be in different assemblies; if it does not make sense to do that, then they should be in the same assembly.
I've read that assemblies link to specific versions of other assemblies, so if I update the DLL only, it's actually considered tampering. How can I update the DLL without changing the EXE? Is that what a publisher policy is for?
An assembly may be given a "strong name". When you name your assembly Foo.DLL, and you write Bar.EXE to say "Bar.EXE depends on Foo.DLL", then the runtime will load anything that happens to be named Foo.DLL; file names are not strong. If an evil hacker gets their own version of Foo.DLL onto the client machine, the loader will load it. A strong name lets Bar.EXE say "Bar.exe version 1.2 written by Bar Corporation depends on Foo.DLL version 1.4 written by Foo Corporation", and all the verifications are done against the cryptographically strong keys associated with Foo Corp and Bar Corp.
So yes, an assembly may be configured to bind only against a specific version from a specific company, to prevent tampering. What you can do to update an assembly to use a newer version is create a little XML file that tells the loader "you know how I said I wanted Foo.DLL v1.4? Well, actually, if 1.5 is available, it's OK to use that too."
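Concretely, that little XML file is the application's config (Bar.exe.config, next to Bar.EXE). A minimal sketch, with a made-up public key token:

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <assemblyIdentity name="Foo" publicKeyToken="0123456789abcdef" culture="neutral" />
            <!-- "You know how I said I wanted Foo.DLL v1.4? 1.5 is OK too." -->
            <bindingRedirect oldVersion="1.4.0.0" newVersion="1.5.0.0" />
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>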
What should I look for? I see lots of books about C# and .NET, but none about deployment or building or testing or things not related to the language itself.
Deployment is frequently neglected in books, I agree.
I would start by searching for "ClickOnce" if you're interested in deployment of managed Windows applications.
Projects can reference assemblies or projects.
When you reference another assembly/project, you are allowed to use all the public classes/enums/structs etc in the referenced assembly.
You do not need to have all of them in one solution. You can have three solutions, one for each Process, and all three solutions can load Engine and Filter.
Also, you could have Process B and Process C reference the compiled assemblies (the .dlls) of the Engine and Filter, with a similar effect.
As long as you don't set the property in the reference to an assembly to require a specific version, you can freely update DLLs without much concern, providing the only code changes were to the DLL.
Also, the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process so that the process executable doesn't need to be updated. Regardless of whether that's best practice, let's just roll with it. Is this possible?
I actually prefer this method of updating. There's less overhead when you update only the files that changed rather than everything every time.
As for using the GAC, whole other level of complexity I won't get into.
Tamper-proofing your assemblies can be done by signing them, which is required to use the GAC in the first place, but you should still be fine so long as a specific version is not required.
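For reference, a minimal signing sketch (the key file name is a placeholder; in newer Visual Studio versions you'd typically use the project's Signing tab rather than the attribute):

    // AssemblyInfo.cs - strong-name the assembly so it can go into the GAC.
    // Generate the key pair once with:  sn -k MyKey.snk
    // Install into the GAC with:        gacutil /i MyLibrary.dll
    using System.Reflection;

    [assembly: AssemblyKeyFile("MyKey.snk")]
    [assembly: AssemblyVersion("1.0.0.0")]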
My recommendation is to read a book about the .NET framework. This will really help you understand the CLR and what you're doing.
Applied Microsoft .NET Framework Programming was a book I really enjoyed reading.
You mention the engine is shared code, which is why you put it in a separate project under your solution. There's nothing wrong with doing it this way, and it's not necessary to add this DLL to the GAC. During your development phase, you can just add a reference to your engine project, and you'll be able to call the code from that assembly. When you want to deploy this application, you can either deploy the engine DLL with it, or you can add the engine DLL to the GAC (which is another ball of wax in and of itself). I tend to lean against GAC deployments unless it's truly necessary. One of the best features of .NET is the ability to deploy everything you need to run your application in one folder without having to copy stuff to system folders (i.e. the GAC).
If you want to achieve something like dynamically loading DLL's and calling member methods from your processor without caring about specific version, you can go a couple of routes. The easiest route is to just set the Specific Version property to False when you add the reference. This will give you the liberty of changing the DLL later, and as long as you don't mess with method signatures, it shouldn't be a problem. The second option is the MEF (which uses Reflection and will be part of the framework in .NET 4.0). The idea with the MEF is that you can scan a "plugins" style folder for DLL's that implement specific functionality and then call them dynamically. This gives you some additional flexibility in that you can add new assemblies later without the need to modify your references.
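To give a flavor of the MEF route, here's a minimal sketch; IFilter, UpperCaseFilter, and the "plugins" folder are hypothetical names:

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;

    // Shared contract, known to both the host and the plugin DLLs.
    public interface IFilter
    {
        string Apply(string input);
    }

    // Lives in a plugin DLL dropped into the "plugins" folder.
    [Export(typeof(IFilter))]
    public class UpperCaseFilter : IFilter
    {
        public string Apply(string input) { return input.ToUpperInvariant(); }
    }

    public class FilterHost
    {
        [ImportMany]
        public IEnumerable<IFilter> Filters { get; set; }

        public void Compose()
        {
            // Scan the folder for any DLL that exports IFilter.
            var catalog = new DirectoryCatalog("plugins");
            var container = new CompositionContainer(catalog);
            container.ComposeParts(this); // populates Filters
        }
    }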
Another thing to note is that there are Setup and Deployment project templates built into Visual Studio that you can use to generate MSI packages for deploying your projects. MSDN has lots of documentation related to this subject that you can check out, here:
http://msdn.microsoft.com/en-us/library/ybshs20f%28VS.80%29.aspx
Do not use the GAC on your build machine; it is a deployment detail. Visual Studio automatically copies the DLL into the build directory of your application when you reference the DLL. That ensures that you'll run and debug with the expected version of the DLL.
When you deploy, you've got a choice. You can ship the DLL along with the application that uses it, stored in the EXE installation folder. Nothing special is needed, the CLR can always find the DLL and you don't have to worry about strong names or versions. A bug fix update is deployed simply by copying the new DLL into the EXE folder.
When you have several installed apps with a dependency on the DLL, deploying bug-fix updates can start to get awkward, since you have to copy the DLL repeatedly, once for each app. And you can get into trouble when you update some apps but not others, especially when there's a breaking change in the DLL interface that requires the app to be recompiled. That's DLL Hell knocking; the GAC can solve that.
We found some guidance on this issue at MSDN. We started with two separate solutions with no shared code, and then abstracted the commonalities into shared assemblies. We struggled with ways to isolate changes in the shared code so they impacted only the projects that were ready for them. We were terrible at Open/Closed.
We tried
branching the shared code for each project that used it and including it in the solution
copying the shared assembly from the shared solution when we made changes
coding pre-build events to build the shared code solution and copy the assembly
Everything was a real pain. We ended up using one large solution with all the projects in it. We branch each project as we want to stage features closer to production. This branches the shared code as well. It's simplified things a lot and we get a better idea of what tests fail across all projects, as the common code changes.
As far as deployment, our build scripts are setup to build the code and copy only the files that have changed, including the assemblies, to our environments.
By default, you have a hardcoded version number in your project (1.0.0.0). As long as you don't change it, you can use all Filter builds with the Process assembly (it only knows it should use the 1.0.0.0 version). This is not the best solution, however, because how do you distinguish between various builds yourself?
Another option is to use different versions of the Filter with the same Process. You should add an app.config file to the Process project and include a bindingRedirect element (see the docs). Whenever the runtime looks for a particular version of the Filter, it's "redirected" to the version indicated in the config. Unfortunately, this means that although you don't have to update the Process assembly, you'll have to update the config file with the new version.
Whenever you encounter versioning problems, you can use Fuslogvw.exe (fusion log viewer) to troubleshoot these.
Have fun!
ulu