Is DLL right for me? [closed] - c#

I have quite a bit of code that I find myself using over and over again in various applications. I recently decided to take the time to get it all organized in one location so I can reference it from any of my solutions. I started with a Class Library project and added a bunch of different functions and some derived controls that I use. Some of the functions are used by the derived controls as well as being called directly from applications. I built the project and started using the DLL in my projects.
I have a few questions regarding this custom DLL (I am using Visual C# 2010 Express):
The first thing I noticed is that it doesn't seem to work quite like the stock Microsoft libraries. For instance, I can have "using System.Windows.Controls;" in a program, but I can only reference the namespace of my custom library with the using directive. It would be nice to organize my classes in a similar fashion if it makes sense to do so. Is it possible to have the classes nested like the Microsoft ones?
edit: For example... If I type at the top of my application code "using System.", I get a bunch of sub-namespaces to choose from. I continue typing "Windows.", and I get more sub-categories. If I add a reference to MyCustom.dll and start typing "using MyCustom.", I don't see any of my classes or subclasses. I'm just asking how to organize things within my class library so they behave like the stock libraries.
Is it a good idea to make one big DLL file with all of my reusable code? I don't have a ton of code at the moment, but I want to leave room for expansion. What is the best practice for this?
How do I deal with my DLL when I make an installer for my applications? I normally use InnoSetup to make my installers.
Here is my main concern... Say I use a function from my custom DLL in Application1 and someone installs it. I later add some features or make a change to that function for Application2 that I am building. If the new features break Application1 or change something in an undesirable way, what happens when a user installs Application2? Is my custom DLL shared between the applications or does each application have its own version of my DLL? Obviously, I would build a new version of Application1 to make it work with the new DLL, but there is no guarantee that the user would update it.
Thanks!

1) You can nest your code in its own sub-namespaces, so to speak, by changing the namespace in the class file from SomeNamespace to SomeNamespace.SubNamespace. Incidentally, if you do "Add -> New -> Folder" on your project in the Solution Explorer of Visual Studio, VS will automatically use a "nested" namespace, named after the folder, when you do "Add -> New -> Class". So if your original namespace is MyDLLProject and you add a folder named HelpfulStuff, then when you add a new class file to that folder you will see the namespace is now "MyDLLProject.HelpfulStuff". Consequently, you will need to reference it in your code that way, either by fully qualifying it or by including a "using MyDLLProject.HelpfulStuff" at the top of any class file that needs the code contained therein.
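For illustration, here's a minimal sketch of that layout (the project and folder names are the ones from the example above; the class and method are hypothetical):

    // MyDLLProject\HelpfulStuff\StringHelpers.cs
    namespace MyDLLProject.HelpfulStuff
    {
        // A hypothetical helper living in the nested namespace.
        public static class StringHelpers
        {
            public static bool IsBlank(string s)
            {
                return s == null || s.Trim().Length == 0;
            }
        }
    }

    // In a consuming application:
    //   using MyDLLProject.HelpfulStuff;
    //   ...
    //   bool blank = StringHelpers.IsBlank("   ");  // true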
2) Wrapping any and all reusable code into a DLL is certainly good practice: it is then easily used in other projects, and if a change is required it only ever has to be made once, no matter how many projects use it. If, however, you can be fairly certain you will never use the code anywhere else, it may not be worth the time and effort.
3) InnoSetup won't detect your .NET references for you; you list the files to ship in your setup script. Since Visual Studio copies referenced assemblies (.NET DLLs) into your project's output (bin) folder, including the contents of that folder covers them. You can see which DLLs are referenced, and add references, from the "References" node in the project tree in the Solution Explorer of VS. This is where you will need to add a reference to your own DLL, should you choose to make one.
4) Unless you actually go through the trouble of making your DLL a shared component by registering it in the GAC (Global Assembly Cache), you will not need to worry about this. Each application will have its own copy of the DLLs it uses, so they will not step on each other's toes in the way you are imagining.

DLLs are great for what you are trying to do.
1) I can have "using System.Windows.Controls;" in a program, but I can only reference the namespace of my custom library with the using directive.
The reason is that Microsoft's assemblies declare their types across many nested namespaces, so a single DLL exposes a whole tree of sub-namespaces to the client code that references it. You get the same effect by declaring nested namespaces in your own library, in effect exporting your classes to the client using your code.
2) Is it a good idea to make one big DLL file with all of my reusable code?
DLLs are generally a unit of deployment. So, if you have clients that may only be interested in subsets of your code, break the code up into smaller DLLs; otherwise there's no need.
Splitting makes it possible to provide other dev teams with specific API components for just their needs, so that you ship smaller-grained components only when you need to; the bigger the DLL and the more clients it has, the bigger the impact of any change on them.
3) How do I deal with my DLL when I make an installer for my applications? I normally use InnoSetup to make my installers.
Modern installers for Windows can all distribute DLLs alongside your application (and, for COM components, register them in the Windows registry; plain .NET assemblies don't need registration).
4) Here is my main concern... (multiple versions of the same DLL across deployments)
In the old days, when people deployed their DLLs to Windows system folders, this was a big problem. It no longer is for well-deployed Windows applications. Windows keeps DLLs local to the installing application, so even an app that thinks it's installing to system32, or loading from there, truly is not. You don't have to worry about conflicts as long as you follow the rules of DLL deployment: ship your DLLs with your app and don't try to force them into system directories (which Windows will likely not allow anyway).
Hope that helps

1) I don't understand your problem. Please re-word it.
2) It varies. Generally, reusable libraries are specific to your solution (such as a shared library containing common enum types, or a database model). If you just have snippets of code that you reuse in different projects, it might be an idea to copy the source files into your projects, or to reference a source file without making a copy of it (VS does let you do this). Just be careful if you ever use source control.
3) There's nothing to worry about; the DLL is just another dependency of your application. VS will (by default) copy the DLL to your application project's output directory, so you just need to configure your installer to include all of the files in your bin folder (except maybe XML documentation files).
4) Unless you go to a lot of trouble to make your DLLs shared in the filesystem (which was a popular thing to do in the 1990s when HDD space was limited; see the "Microsoft Shared" folder for an example), this is not an issue, because each distribution of your application will have its own copy of the shared DLL ("sharing" is at compile time, not run time). The .NET CLR will throw an error if an application tries to launch with a DLL that is incompatible with the one it was compiled against (and you can enforce this further with strong-named assemblies).

Related

How to call tool DLLs in C# when the DLL-path is different on the target PC?

I might be a bit stupid, but I want to create a tool in Visual Studio in C# and want to call third party tools via their API-DLLs. The only topics I found here are dealing with one of the two methods that I already know:
Compile time: add a reference to "C:\FooTool\foo.dll" in my project + "using fooToolNamespace.fooToolClass" in my code --> I can "naturally" use the classes of the DLL and will even get full IntelliSense support if a suitable XML file is available with the DLL. Compile-time checks are also done on my usage of the DLL.
Dynamic (run time): calling e.g. Assembly.LoadFile(@"C:\FooTool\foo.dll") and then using reflection on it to find functions, fields and so on --> no IntelliSense, no compile-time checks.
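(For concreteness, a minimal sketch of that second route; the type name is from the example above, and the method name is hypothetical:)

    using System;
    using System.Reflection;

    class Program
    {
        static void Main()
        {
            // Load the assembly from a path resolved at run time.
            Assembly foo = Assembly.LoadFile(@"C:\FooTool\foo.dll");

            // Everything below is looked up by name: no IntelliSense,
            // no compile-time checks.
            Type type = foo.GetType("fooToolNamespace.fooToolClass");
            object instance = Activator.CreateInstance(type);
            MethodInfo doWork = type.GetMethod("DoWork"); // hypothetical method
            doWork.Invoke(instance, null);
        }
    }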
So I actually have the DLL at hand and thus option 1) would be nice during development. But if my tool is used on a different PC, the third-party DLL might be in a different path there, e.g. "C:\foo\foo.dll" and "C:\bar\foo.dll".
In my understanding, using a copy of "foo.dll" will not work, because "foo.dll" might have dependencies, e.g. other files from the FooTool directory. Thus, in my understanding, I have to call the DLL that is "installed" on the target PC and not a local copy of it.
So can I somehow change the path where my tool accesses the "foo.dll" at runtime and still use method 1) during development?
Or is there another way of doing things?
Or am I just dumb and there is a simple solution for all this?
Thanks a lot for the help and have a great day
Janis
To be able to use option 1 (a referenced DLL), you need to put the DLL somewhere "where your EXE (or, more precisely, the Assembly Resolver) can find it" on the customer's PC.
So where does the assembly resolver look for your DLL?
In the directory where the EXE resides (for desktop/console applications) or the bin subdirectory (for web applications). Since you mention that your DLL requires other dependencies, you'd need to copy them to that location as well.
The Global Assembly Cache (GAC). If your dependency supports this, installing it to the GAC ensures that it can be found by your application.
These two are the "supported" scenarios. There is also the possibility to tweak the assembly resolver to look into other directories as well, but that should be reserved for special cases where the other two options failed. (We had such a case and solved it with a custom AssemblyResolve handler on the application domain.)
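For illustration, a minimal sketch of such a handler; the install directory is a stand-in here (in practice you'd discover it from the registry or a config file):

    using System;
    using System.IO;
    using System.Reflection;

    static class FooToolResolver
    {
        // Hypothetical location; discover it however your setup allows.
        static readonly string FooToolDir = @"C:\FooTool";

        // Call once, early in Main, before any foo.dll type is touched.
        public static void Install()
        {
            AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
            {
                string path = Path.Combine(FooToolDir,
                    new AssemblyName(args.Name).Name + ".dll");
                return File.Exists(path) ? Assembly.LoadFrom(path) : null;
            };
        }
    }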

How to create a dll that includes all the others?

When creating a project of type "Class Library", one can usually generate a DLL when compiling, but how could I generate a DLL without losing the others that I have already included?
Let me explain with an example: NuGet downloaded the S22.Imap DLL that I worked with, and later I generated my DLL in the traditional way I explained at the beginning. But when I wanted to use my DLL on another computer, I got errors saying that functions contained in the S22.Imap DLL could not be found. To solve the problem, I had to copy the S22.Imap DLL, in addition to my project's DLL, to a specific path on the other computer.
My question is:
How could you generate a dll that includes the ones included in the project you were working with?
All the referenced 3rd-party DLLs (S22.Imap.dll in your example) will be copied to the output folder together with your own DLL file (let's say a.dll) when you build your project. That means you should always copy them together (S22 + a.dll) to wherever you want to reference them, on another computer/folder/place.
If you really want to make them a single file (although it is not recommended), you can embed the S22 one as an "embedded resource". Then you will get only one a.dll file, with the S22 one inside it. See the page below for some reference:
Embedding one dll inside another as an embedded resource and then calling it from my code
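A rough sketch of that approach, assuming the S22.Imap DLL has been added to the project as an embedded resource (the exact resource name depends on your project's default namespace, so it's matched by suffix here):

    using System;
    using System.Linq;
    using System.Reflection;

    static class EmbeddedAssemblyLoader
    {
        // Hook this up at startup, before any S22.Imap type is used.
        public static void Install()
        {
            AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
            {
                Assembly self = Assembly.GetExecutingAssembly();
                string wanted = new AssemblyName(args.Name).Name + ".dll";

                string resource = self.GetManifestResourceNames()
                    .FirstOrDefault(r => r.EndsWith(wanted, StringComparison.OrdinalIgnoreCase));
                if (resource == null)
                    return null;

                using (var stream = self.GetManifestResourceStream(resource))
                {
                    var bytes = new byte[stream.Length];
                    stream.Read(bytes, 0, bytes.Length);
                    return Assembly.Load(bytes);
                }
            };
        }
    }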
ILMerge is also a tool that can help you do so.
In general, you don't. A DLL is a dynamic linked library, and you would normally only combine static libraries during a build. Here is an answer on the difference between static and dynamic linking.
Typically you would include all the DLLs you need in the installer package. If you use Visual Studio to create the installer, it can detect the dependencies for you. When you run the installer, all of the necessary DLLs are deployed. Nearly all commercial .NET software follows this pattern.
It is possible to merge an assembly into another assembly using a tool called ILMerge. This would be a very unusual thing to do, and could cause issues with intellectual property and code signing, so it is not recommended.

How to share libraries without using the GAC?

I have a library that is meant to be used by many websites. The way I am doing it now is in the library's properties, I set the "Post-build event command line" to: copy "$(TargetPath)" "$(SolutionDir)\MyWebsite\bin\$(TargetFileName)"
Every time I want a new website to use the shared library, I add a new line like this: copy "$(TargetPath)" "$(SolutionDir)\MyWebsite2\bin\$(TargetFileName)"
Is there an easy or better way to do this besides using the GAC?
In my opinion your problem here is a lack of control over how this library is produced and used by other projects. If I were you (which I'm not :) I'd set about developing the library through a unit-test co-project where new functionality can be developed and tested independently. Once that functionality has been implemented and tested to be working within your unit-test parameters, manually copy the assembly into a "library" folder of the web project that required the extension of the library in the first place (this folder holds all the compiled assemblies used by that project).
Even better would be to maintain a version system in which you tag the new version of the library so as to keep track of the exact source revision that it's using.
The reason I suggest what may seem like a cumbersome methodology is that your current practice makes your existing websites quite brittle: a change made in the library for one site may in fact break one of the other sites, and as the number of sites increases you can't be forever retro-testing new versions of the shared library against the existing sites.
It's also for these reasons that I don't recommend using the GAC either.

How do I work with shared assemblies and projects?

To preface, I've been working with C# for a few months, but I'm completely unfamiliar with concepts like deployment and assemblies, etc. My questions are many and varied, although I'm furiously Googling and reading about them to no avail (I currently have Pro C# 2008 and the .NET 3.5 Platform in front of me).
We have this process and it's composed of three components: an engine, a filter, and logic for the process. We love this process so much we want it reused in other projects. So now I'm starting to explore the space beyond one solution, one project.
Does this sound correct? One huge Solution:
Process A, exe
Process B, exe
Process C, exe
Filter, dll
Engine, dll
The engine is shared code for all of the processes, so I'm assuming that can be a shared assembly? If a shared assembly is in the same solution as a project that consumes it, how does it get consumed if it's supposed to be in the GAC? I've read something about a post-build event. Does that mean the engine.dll has to be redeployed on every build?
Also, the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process, so that the process executable doesn't need to be updated. Regardless of whether that's best practice, let's just roll with it. Is this possible? I've read that assemblies link to specific versions of other assemblies, so if I update the DLL only, it's actually considered tampering. How can I update the DLL without changing the EXE? Is that what a publisher policy is for?
By the way, is any of this stuff Google-able or Amazon-able? What should I look for? I see lots of books about C# and .NET, but none about deployment or building or testing or things not related to the language itself.
I agree with Aequitarum's analysis. Just a couple additional points:
The engine is shared code for all of the processes, so I'm assuming that can be a shared assembly?
That seems reasonable.
If a shared assembly is in the same solution as a project that consumes it, how does it get consumed if it's supposed to be in the GAC?
Magic.
OK, it's not magic. Let's suppose that in your solution your process project has a reference to the engine project. When you build the solution, you'll produce a project assembly that has a reference to the engine assembly. Visual Studio then copies the various files to the right directories. When you execute the process assembly, the runtime loader knows to look in the current directory for the engine assembly. If it cannot find it there, it looks in the global assembly cache. (This is a highly simplified view of loading policy; the real policy is considerably more complex than that.)
Stuff in the GAC should be truly global code; code that you reasonably expect large numbers of disparate projects to use.
Does that mean the engine.dll has to be redeployed on every build?
I'm not sure what you mean by "redeployed". Like I said, if you have a project-to-project reference, the build system will automatically copy the files around to the right places.
the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process so that the process executable doesn't need to be updated
I question whether that's actually valuable. Scenario one: no filter assembly, all filter code is in project.exe. You wish to update the filter code; you update project.exe. Scenario two: filter.dll, project.exe. You wish to update the filter code; you update filter.dll. How is scenario two cheaper or easier than scenario one? In both scenarios you're updating a file; why does it matter what the name of the file is?
However, perhaps it really is cheaper and easier for your particular scenario. The key thing to understand about assemblies is assemblies are the smallest unit of independently versionable and redistributable code. If you have two things and it makes sense to version and ship them independently of each other, then they should be in different assemblies; if it does not make sense to do that, then they should be in the same assembly.
I've read that assemblies link to specific versions of other assemblies, so if I update the DLL only, it's actually considered tampering. How can I update the DLL without changing the EXE? Is that what a publisher policy is for?
An assembly may be given a "strong name". When you name your assembly Foo.DLL, and you write Bar.EXE to say "Bar.EXE depends on Foo.DLL", then the runtime will load anything that happens to be named Foo.DLL; file names are not strong. If an evil hacker gets their own version of Foo.DLL onto the client machine, the loader will load it. A strong name lets Bar.EXE say "Bar.exe version 1.2 written by Bar Corporation depends on Foo.DLL version 1.4 written by Foo Corporation", and all the verifications are done against the cryptographically strong keys associated with Foo Corp and Bar Corp.
So yes, an assembly may be configured to bind only against a specific version from a specific company, to prevent tampering. What you can do to update an assembly to use a newer version is create a little XML file that tells the loader "you know how I said I wanted Foo.DLL v1.4? Well, actually, if 1.5 is available, it's OK to use that too."
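As a sketch of where that identity lives in code (the version and key file name here are made up; the "Signing" tab of the project properties achieves the same thing as the key-file attribute):

    // AssemblyInfo.cs in Foo.DLL's project
    using System.Reflection;

    [assembly: AssemblyVersion("1.4.0.0")]       // the version Bar.EXE binds against
    [assembly: AssemblyKeyFile(@"FooCorp.snk")]  // strong-name key (hypothetical file)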
What should I look for? I see lots of books about C# and .NET, but none about deployment or building or testing or things not related to the language itself.
Deployment is frequently neglected in books, I agree.
I would start by searching for "ClickOnce" if you're interested in deployment of managed Windows applications.
Projects can reference assemblies or projects.
When you reference another assembly/project, you are allowed to use all the public classes/enums/structs etc in the referenced assembly.
You do not need to have all of them in one solution. You can have three solutions, one for each Process, and all three solutions can load Engine and Filter.
Also, you could have Process B and Process C reference the compiled assemblies (the .dlls) of the Engine and Filter, to a similar effect.
As long as you don't set the reference's property to require a specific version, you can freely update DLLs without much concern, provided the only code changes were to the DLL.
Also, the principal reason we separated the filter from the process (only one process uses it) is so that we can deploy the filter independently from the process so that the process executable doesn't need to be updated. Regardless of whether that's best practice, let's just roll with it. Is this possible?
I actually prefer this method of updating. There's less overhead in updating only the files that changed rather than everything every time.
As for using the GAC, that's a whole other level of complexity I won't get into.
Tamper-proofing your assemblies can be done by signing them, which is required to use the GAC in the first place, but you should still be fine so long as a specific version is not required.
My recommendation is to read a book about the .NET framework. This will really help you understand the CLR and what you're doing.
Applied Microsoft .NET Framework Programming was a book I really enjoyed reading.
You mention the engine is shared code, which is why you put it in a separate project under your solution. There's nothing wrong with doing it this way, and it's not necessary to add this DLL to the GAC. During your development phase, you can just add a reference to your engine project, and you'll be able to call the code from that assembly. When you want to deploy this application, you can either deploy the engine DLL with it, or you can add the engine DLL to the GAC (which is another ball of wax in and of itself). I tend to lean against GAC deployments unless it's truly necessary. One of the best features of .NET is the ability to deploy everything you need to run your application in one folder without having to copy stuff to system folders (i.e. the GAC).
If you want to achieve something like dynamically loading DLLs and calling member methods from your processor without caring about the specific version, you can go a couple of routes. The easiest route is to just set the Specific Version property to False when you add the reference. This gives you the liberty of changing the DLL later, and as long as you don't mess with method signatures, it shouldn't be a problem. The second option is MEF (which uses reflection and will be part of the framework in .NET 4.0). The idea with MEF is that you can scan a "plugins"-style folder for DLLs that implement specific functionality and then call them dynamically. This gives you some additional flexibility in that you can add new assemblies later without the need to modify your references.
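A bare-bones sketch of the MEF route (the IFilter contract and plugin class are hypothetical; it needs a reference to System.ComponentModel.Composition):

    using System;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;

    // Contract shared between the host and its plugin assemblies.
    public interface IFilter
    {
        void Apply();
    }

    // Lives in a plugin DLL dropped into the "plugins" folder.
    [Export(typeof(IFilter))]
    public class SharpenFilter : IFilter
    {
        public void Apply() { Console.WriteLine("Sharpening..."); }
    }

    class Host
    {
        static void Main()
        {
            // Scan the folder for any assembly exporting IFilter and run them all.
            using (var catalog = new DirectoryCatalog("plugins"))
            using (var container = new CompositionContainer(catalog))
            {
                foreach (IFilter filter in container.GetExportedValues<IFilter>())
                    filter.Apply();
            }
        }
    }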
Another thing to note is that there are Setup and Deployment project templates built into Visual Studio that you can use to generate MSI packages for deploying your projects. MSDN has lots of documentation related to this subject that you can check out, here:
http://msdn.microsoft.com/en-us/library/ybshs20f%28VS.80%29.aspx
Do not use the GAC on your build machine; it is a deployment detail. Visual Studio automatically copies the DLL into the build directory of your application when you reference the DLL. That ensures that you'll run and debug with the expected version of the DLL.
When you deploy, you've got a choice. You can ship the DLL along with the application that uses it, stored in the EXE installation folder. Nothing special is needed; the CLR can always find the DLL, and you don't have to worry about strong names or versions. A bug-fix update is deployed simply by copying the new DLL into the EXE folder.
When you have several installed apps with a dependency on the DLL, deploying bug-fix updates can start to get awkward, since you have to copy the DLL repeatedly, once for each app. And you can get into trouble when you update some apps but not others, especially when there's a breaking change in the DLL interface that requires the app to be recompiled. That's DLL Hell knocking; the GAC can solve that.
We found some guidance on this issue on MSDN. We started with two separate solutions with no shared code, and then abstracted the commonalities into shared assemblies. We struggled with ways to isolate changes in the shared code so they impacted only the projects that were ready for them. We were terrible at the Open/Closed principle.
We tried
branching the shared code for each project that used it and including it in the solution
copying the shared assembly from the shared solution when we made changes
coding pre-build events to build the shared code solution and copy the assembly
Everything was a real pain. We ended up using one large solution with all the projects in it. We branch each project as we want to stage features closer to production. This branches the shared code as well. It's simplified things a lot and we get a better idea of what tests fail across all projects, as the common code changes.
As far as deployment, our build scripts are setup to build the code and copy only the files that have changed, including the assemblies, to our environments.
By default, you have a hardcoded version number in your project (1.0.0.0). As long as you don't change it, you can use all Filter builds with the Process assembly (it only knows it should use the 1.0.0.0 version). This is not the best solution, however, because how do you distinguish between various builds yourself?
Another option is to use different versions of the Filter with the same Process. You should add an app.config file to the Process project and include a bindingRedirect element (see the docs). Whenever the runtime looks for a particular version of the Filter, it's "redirected" to the version indicated in the config. Unfortunately, this means that although you don't have to update the Process assembly, you'll have to update the config file with the new version.
Whenever you encounter versioning problems, you can use Fuslogvw.exe (fusion log viewer) to troubleshoot these.
Have fun!
ulu

Working with Common/Utility Libraries

At the company I work for we have a "Utility" project that is referenced by pretty much every application we build. It's got lots of things like NullHelpers, ConfigSettingHelpers, common ExtensionMethods, etc.
The way we work is that when we want to make a new project, we get the latest version of the Utility project from source control, add it to the solution, and then reference it from any new projects that get added to the solution.
This has worked ok, however there have been a couple of instances where people have made "breaking changes" to the common project, which works for them, but doesn't work for others.
I've been thinking that, rather than adding the common library as a project reference, perhaps we should start developing it as a standalone DLL, publish different versions, and target a particular version for each project, so that changes can be made without any risk to the other projects using the common library.
Having said all that I'm interested to see how others reference or use their common libraries.
That's exactly what we're doing. We have a Utility project with some non-project-specific useful functions. We increase the version manually (minor), build the project in Release mode, sign it, and put it in a shared location.
People then use the specific version of the library.
If some useful methods are implemented in a specific project and could find their way into the main Utility project, we put them in a special helper class in that project and mark them as possible Utility candidates (a simple //TODO). At the end of the project, we review the candidates and, if they stick, we move them to the main library.
Breaking changes are a no-no and we mark methods and classes as [Obsolete] if needed.
But, it doesn't really matter because we increase the version on every publish.
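For example (names hypothetical), an old method marked that way stays in place but warns every caller at compile time:

    using System;
    using System.Globalization;

    public static class DateHelpers
    {
        [Obsolete("Use ParseDateInvariant instead; this will be removed in 2.0.")]
        public static DateTime ParseDate(string s)
        {
            return DateTime.Parse(s);
        }

        public static DateTime ParseDateInvariant(string s)
        {
            return DateTime.Parse(s, CultureInfo.InvariantCulture);
        }
    }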
Hope this helps.
We use branching in source control; everyone uses the head branch until they make a release. When they branch the release, they'll branch the common utilities project as well.
Additionally, our utilities project has its own unit tests. That way, other teams can know if they would break the build for other teams.
Of course, we still have problems like you mention occasionally. But when one team checks in a change that breaks another team's build, it usually means the contract for that method/object has been broken somewhere. We look at these as opportunities to improve the design of the common utilities project... or at least to write more unit tests :/
I've had the EXACT same issue!
I used to use project references, but it all seems to go bad when, as you say, you have many projects referencing it.
I now compile to a DLL, and set the CopyLocal property for the DLL reference to false after the first build (otherwise I find it can override sub projects and just become a mess).
I guess in theory it should probably be GAC'ed, but if it's a library that is changing a lot (as mine is) this can become problematic.
