Managed DLLs as embedded resources - C#

Is it good practice to bundle all required managed DLLs as embedded resources in a .NET library project in order to ship just one DLL?
Background:
I have created an API as a .NET DLL (in C#), and it works fine. But the library depends on quite a few other managed libraries (around 15 DLLs), so I need to distribute those as well.
When users of my API build an application, they in turn have to make sure to distribute all those DLLs along with it. It would seem better if they had just one DLL to consider.
The main downside I can see is the extra load-time plumbing: a purely managed assembly can be loaded straight from memory via Assembly.Load(byte[]), but anything containing native code still has to be unpacked to disk first, which may or may not have performance and robustness issues.
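For reference, the pattern I have in mind hooks AppDomain.AssemblyResolve and loads the embedded bytes directly. A minimal sketch; the resource-name convention is illustrative and would need to match how the DLLs are actually embedded:

    using System;
    using System.IO;
    using System.Reflection;

    static class EmbeddedAssemblyLoader
    {
        // Call once at startup, before any type from an embedded dependency is used.
        public static void Install()
        {
            AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
            {
                // Resource name convention "MyApi.Resources.<SimpleName>.dll" is an
                // assumption; adjust it to your project's embedded resource names.
                string name = new AssemblyName(args.Name).Name;
                string resource = "MyApi.Resources." + name + ".dll";

                using (Stream stream = Assembly.GetExecutingAssembly()
                                               .GetManifestResourceStream(resource))
                {
                    if (stream == null)
                        return null; // not one of our embedded dependencies

                    using (var ms = new MemoryStream())
                    {
                        stream.CopyTo(ms);
                        return Assembly.Load(ms.ToArray()); // loaded from memory, no temp folder
                    }
                }
            };
        }
    }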

There are a lot of questions around this. What happens if you need to load a platform-specific dependency (i.e. x86 vs. x64), or if the app consuming your API is platform-specific? Does that mean you need to include both x86 and x64 assemblies in your package as well? It gets hairy quickly.
You should consider using ClickOnce deployment for these types of scenarios. Then, all of the dependencies would be packaged together.
Realistically, it's a problem for the API consumer to solve, not for the API producer. Your API might be less popular if it has a lot of external dependencies, but you'll have to make decisions there about what's really crucial to your API's success.

Related

How to list used/unused native libraries from C#?

I have a .NET (4.7.2) application that calls into a 3rd-party native library using [DllImport("Foo.dll"...)]. That native Foo.dll is written in C++ and has a lot of dependencies: 90 assemblies, 360 MB (!) are currently shipped. I know that some of the shipped dependencies are no longer used. Asking the 3rd party to clean up had no effect; it grows every few months.
Q: Is there any way to distinguish required, actively loaded native DLLs from DLL bloat?
I have experimented with AppDomain.GetAssemblies() on application exit, but it only returns managed assemblies.
I have experimented with Dependency Walker and its modern successor Dependencies, but it seems most of the truly required dependencies of Foo.dll are loaded on demand - only a small fraction shows up in those tools.
You could use Procmon to track DLLs being dynamically loaded.
But if some DLLs are only loaded when a feature is in use, then you have no choice but to look at the code (if you can) or go through all the possible code paths and track the loaded DLLs.
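If you'd rather capture this from inside the application, Process.Modules (unlike AppDomain.GetAssemblies()) includes native DLLs. A minimal sketch, to be called at application exit after exercising the features you care about:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Linq;

    static class ModuleDump
    {
        // Writes the full path of every module (managed and native) currently
        // mapped into this process. Modules loaded on demand only appear after
        // the code path that needs them has actually run.
        public static void WriteLoadedModules(string path)
        {
            var lines = Process.GetCurrentProcess().Modules
                .Cast<ProcessModule>()
                .Select(m => m.FileName)
                .OrderBy(f => f, StringComparer.OrdinalIgnoreCase);

            File.WriteAllLines(path, lines);
        }
    }

Comparing that output across several runs against the 90 shipped assemblies should separate the actively loaded set from candidates for removal.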

NuGet or assemblies folder for shared 3rd party DLLs?

I have about 10-15 projects with separate solutions that reference 3rd-party DLLs in a Microsoft .NET shop. One of the problems we want to address is consistency of the DLL versions used across projects (e.g. Newtonsoft 8.0.3 in all projects, as opposed to separate versions depending on when each project was created).
I have seen this done in two separate ways in my previous positions and was wondering if there are any other options to solve this problem.
I have used a corporate NuGet feed for all third-party DLLs referenced within any solution in the company. The DLLs would be updated and then made available for developers to pull down and upgrade (if needed) within their solutions on their own.
Another company had an assemblies folder in source control that housed all "approved" third-party DLLs, and all references lived within this directory.
I did see this question but it only offered one of the two solutions above: Where should you store 3rd party assemblies?
Are there other options aside from the ones listed above?
Whenever possible, use NuGet. The primary reason is that Git doesn't handle large binaries well, and using LFS for this doesn't make much sense since there is a valid alternative. TFVC has fewer issues with large binaries, but I'd keep a future migration to Git in mind if you're on TFVC.
Keep in mind that not just NuGet, but likely also npm and other package sources are of interest in this case.
If you want to enforce a certain version being used, create a custom task that you hook into the CI pipeline. That way you can easily emit warnings or set up some kind of policy. The custom task could take the packages.config file, scan the referenced packages, and then query the TFS/VSTS package management feed to see whether each is using the latest version (or the latest minor version, or at most x versions back), or fetch the approved versions from a JSON or XML file somewhere and validate against that.
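As a rough illustration of that validation step (the file path and the source of the approved-versions table are assumptions, not a real API):

    using System;
    using System.Collections.Generic;
    using System.Xml.Linq;

    static class PackageAudit
    {
        // Compares every <package id="..." version="..." /> entry in packages.config
        // against an approved-versions table and reports mismatches.
        public static IEnumerable<string> FindViolations(
            string packagesConfigPath,
            IReadOnlyDictionary<string, string> approvedVersions)
        {
            foreach (var pkg in XDocument.Load(packagesConfigPath).Descendants("package"))
            {
                string id = (string)pkg.Attribute("id");
                string version = (string)pkg.Attribute("version");

                string approved;
                if (approvedVersions.TryGetValue(id, out approved) && approved != version)
                    yield return id + ": found " + version + ", approved " + approved;
            }
        }
    }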
In your source control, commit and push the desired dependency DLLs to master when the repository is first populated. Since all users, even on other branches, will then be pulling from the repository, you're ensuring they receive all the DLLs they need. This only becomes a problem if you're referring to DLLs in the GAC, which is resolved either with gacutil or by making sure everyone is using the same Windows version.

Is packaging JSON serialization dependencies into binary still a good idea for a .NET Core library?

When I've built a shared .NET library in the past (for internal use, not a public library), I've embedded a serialization library (usually Newtonsoft) as a resource and used assembly resolving to load it in a manner similar to Embedding DLLs in a compiled executable. I've done this so the users of the library aren't forced to worry about the version of Newtonsoft that the library was built with.
In a .NET Core world, is this still a good approach? It seems like, with NuGet so deeply integrated, it might not be needed anymore, but I haven't built enough .NET Core applications to know how bad dependency-graph management can get.
I have moved away from doing that. The ability to specify dependencies, and which specific versions of them are required, makes the dependency more explicit to consumers (ours is only for our team). Having dependencies be less explicit has led to headaches for us, especially with larger applications, as keeping your binding redirects happy can become an issue. Might as well make it easy to remember that you might need some other package, and which version has been successfully integrated, by using NuGet dependency management in the nuspec.
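For illustration, the nuspec equivalent is just an explicit dependency entry; the package and the version range shown are examples, not a recommendation:

    <!-- fragment of the library's .nuspec -->
    <dependencies>
      <dependency id="Newtonsoft.Json" version="[9.0.1,13.0.0)" />
    </dependencies>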

Use .NET DLL on multiple platforms

I am using C# to create a game using MonoGame which I wish to use on multiple platforms (which I know MonoGame can do).
Is there a way to create a .dll in C# and load it from other operating systems (preferably iOS, Android and MacOS) without recompiling the library? I am prepared to write a "loader" application for each platform, but would not like to rewrite the entire project.
I'm hoping there is a way to load functions from a .NET DLL (i.e. load the game) on these platforms without using paid products such as MonoTouch.
Basically, .NET/Mono assemblies are all compiled to CIL code, which is an ECMA standard.
However, you still have to write your game in a way that does not depend on OS specifics:
Don't reference certain .NET assemblies (e.g. no WCF, since Mono has only limited WCF support; no WPF at all; no WF).
Don't reference OS-specific DLLs (i.e. no P/Invokes to kernel32.dll, etc.).
Don't use OS-specific paths, or use preprocessor directives to isolate OS-specific code (yes, this will require a recompile, and you probably won't get around it easily).
Threading on Mono has some catches, so you will probably have to write platform-specific code using preprocessor directives (see the sketch after this list).
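For example, a sketch of the preprocessor-directive approach. The __MOBILE__ symbol is the one Xamarin/Mono projects conventionally define; if your setup doesn't define it, declare your own symbol in each platform project's build configuration:

    using System;
    using System.IO;

    static class Platform
    {
        // One DLL is compiled per platform project; the right branch is baked in
        // at compile time via the symbols that project defines.
        public static string SaveDirectory()
        {
    #if __MOBILE__
            // iOS/Android: the app-private sandbox area.
            return Environment.GetFolderPath(Environment.SpecialFolder.Personal);
    #else
            // Desktop: per-user application data ("MyGame" is an illustrative name).
            return Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                "MyGame");
    #endif
        }
    }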
Honestly, I don't see a problem with having to recompile your code. If it's cleanly written, it's just a matter of adding a new project to your solution and setting the preprocessor flags; then all you have to do is compile the solution and you have multiple DLLs in your bin folder. No one ever said you have to rewrite the complete project (unless the project is already finished and has any of the dependencies mentioned above, in which case it's your own fault for not thinking about it before starting development).
You'll just have to deploy your apps (on iOS etc.) with the required Mono runtime, and for this you will probably need something like Xamarin, or wire up your own Mono runtime.
Reference links:
http://www.mono-project.com/Compatibility
http://www.mono-project.com/MoMA
You can look at the Xamarin framework, but you always have to recompile your project.
Xamarin
on these platforms without using paid products such as MonoTouch.
This will be difficult.

How to locate DLLs for inter-component referencing

My SaaS company has two C#/.NET products; call them Application Alpha and Application Beta. Both of them reference some libraries that we can call Core.
At the moment, the codebase is monolithic, stored in a single SVN repository with a single .NET solution, and we're trying to make it more modular/componentized. We've split off into separate repositories for the two applications and the core library, but we're now running into the following problem:
Alpha and Beta must reference Core, but we're trying to avoid having a direct code reference because then we're practically back to square one: you would need to check out and co-locate all repositories. So how should we go about referencing assemblies between these components?
Each component could have a directory containing DLLs from the other components that need to be referenced, stored in SVN, but this would mean extra effort any time Core is updated to push out the new DLLs to Alpha and Beta.
Or we could store the DLLs in a central SVN'd location (and/or in the GAC), but that would mean extra effort any time Core is updated for everyone else to pull the new DLLs.
Is there a third option that we're overlooking?
I have something similar in which I have 5 applications utilizing a series of web controls I built. The controls are compiled into a series of DLLs for modularization and the applications that utilize them live on separate servers.
What I do is use VS2008's build events to execute a batch file that copies the compiled (updated) DLLs to the production servers when a release build executes.
You do this by going to the project that builds the DLL (or DLLs), right-clicking it, and going to Properties, then the Build Events tab. There you will see the pre-build and post-build event command line textboxes.
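For example, a post-build event command line along these lines (the server share is hypothetical; $(TargetDir) and $(TargetName) are standard Visual Studio macros):

    xcopy /Y "$(TargetDir)$(TargetName).dll" "\\ProductionServer\WebApps\bin\"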
That way your release builds can be fully automated, and you never have to worry about DLL-hell-like differences between versions of your production DLLs.
Hope this helps,
JP
You could have your build scripts for Alpha and Beta create artifacts (namely, build Core), place the result of the Core build at a specific location, and reference that location.
You could use svn:externals. It was designed for this type of scenario.
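For example, pinning Core's build output into each application's working copy (the repository URL and paths are hypothetical):

    svn propset svn:externals "https://svn.example.com/core/trunk/build lib/Core" .
    svn update

Each checkout of Alpha or Beta then pulls the Core binaries automatically, and you can pin a revision with -r to control when consumers pick up a new Core.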
If you want to avoid that, these are probably your better options. I'd avoid putting the files in the GAC, though, unless your Core project is very stable and not changing often. Keeping the DLLs local provides much more flexibility.
Each component could have a directory containing DLLs from the other components that need to be referenced, stored in SVN, but this would mean extra effort any time Core is updated to push out the new DLLs to Alpha and Beta.
This could be handled fairly easily with a good build system. The approach has some disadvantages (i.e. executable dependencies in the build system), but also some advantages, including allowing each dependent project to use different versions as needed.
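For instance, each dependent project can reference the dropped binary through a HintPath into the artifact location (paths are illustrative):

    <!-- fragment of Alpha's .csproj -->
    <Reference Include="Core">
      <HintPath>..\artifacts\Core\Core.dll</HintPath>
    </Reference>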
