Download latest NuGet during run-time - c#

During runtime, I want to download the latest DLL from the NuGet library using .NET Core and load it into the container.
I am looking at a scenario where a set of dependencies has changed and new package versions have been released to nuget.org, after which I will have to download the latest packages and run some tests. Each time, the application will check whether it has the latest package version and, if necessary, download and dynamically load the new assembly without the application stopping.

I'll start off by saying that this doesn't sound like a good idea. You didn't explain why you want to do this, so we have no way of suggesting better approaches. If it's a web application that you don't want downtime for, a better approach might be to have a load balancer: when you release a new version of your web app, configure the load balancer to send all new requests to the new container/version while the existing connections on the old version drain.
If your app is a queue listener, you don't need hot-loading at all. Just have an app which creates a pull request on your source code to update NuGet package version numbers when they're available, and have CI automatically build the change. If the tests pass, you can automate approving/merging the PR, and then your CD can automatically deploy the new version and decommission the previous one. This would be less risky than automatically loading new versions, since you risk the new version having bugs or not being binary compatible, and then your app is going to crash.
Anyway, if you really do have a good reason to hot-reload assemblies, on .NET Core you'll need to use AssemblyLoadContext. However, it's such an unusual case to need this that I can't find any realistic examples of how to use it. All the "examples" I've seen use AssemblyLoadContext.Default, which I assume is equivalent to using a single context and therefore won't let you load different versions of the same assembly. So, if you want to go down this path, you're probably going to have to figure it out yourself: lots of trial and error, debugging, and possibly reading the coreclr source code.
As some commenters on your question mentioned, you'll also need to handle the case where an assembly was compiled against one version of a dependency, but you're now going to load a different version. The .NET Framework uses something called assembly binding redirects for this. The way most people use them is with an app.config or web.config file, but that won't work for you since the redirects will change at runtime. I'm sure there's an API to deal with them programmatically, which you'll also need to figure out. I'm not sure how, or whether, binding redirects differ in .NET Core compared to the .NET Framework, so that's yet another thing you'll need to figure out, but a couple of good searches on Google should give you an answer to that.
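For what it's worth, here is a minimal sketch of the isolated-context idea, based on the collectible AssemblyLoadContext pattern from the .NET Core plugin-loading docs. The class name and paths are placeholders, and note that AssemblyDependencyResolver expects a .deps.json file next to the DLL, which a bare DLL pulled out of a package won't have:

    using System.Reflection;
    using System.Runtime.Loader;

    // Usage (hypothetical path): each new package version gets its own context,
    // so two versions of the same assembly can coexist in one process.
    string pluginPath = @"C:\plugins\MyPackage\MyPackage.dll";
    var context = new PluginLoadContext(pluginPath);
    Assembly assembly = context.LoadFromAssemblyPath(pluginPath);
    // ... call into it via reflection or a shared interface assembly ...
    context.Unload(); // the old version becomes collectible once nothing references it

    // A collectible load context that resolves the plugin's own dependencies,
    // deferring to the default context for shared framework assemblies.
    class PluginLoadContext : AssemblyLoadContext
    {
        private readonly AssemblyDependencyResolver _resolver;

        public PluginLoadContext(string pluginPath) : base(isCollectible: true)
        {
            _resolver = new AssemblyDependencyResolver(pluginPath);
        }

        protected override Assembly Load(AssemblyName assemblyName)
        {
            // Returning null falls back to the default load context.
            string path = _resolver.ResolveAssemblyPath(assemblyName);
            return path != null ? LoadFromAssemblyPath(path) : null;
        }
    }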
Once you get assembly hot-reloading working, you can either read the NuGet server API and implement your own client, or you can use the NuGet client SDK. But the NuGet client team's primary focus is the tools (Visual Studio integration, the dotnet CLI, nuget.exe); they publish the packages for convenience (plus as a way to share the libraries with the dotnet CLI and Mono), so there are no docs on using the SDK. Also note that while the NuGet client team tries not to break the packages' binary API, that's a secondary concern when implementing features and bug fixes for the tools. The NuGet client SDK tracks the Visual Studio version number, not semantic versioning, so when you update to newer NuGet client packages you might find you need to change your code when you otherwise wouldn't expect to. It's pretty rare, but I generally recommend the "ports and adapters" pattern, and this is an excellent example of where it's particularly useful. At least with the NuGet client packages there are blog posts from people who figured it out and shared how they did whatever they needed. Otherwise, since NuGet itself is open source, you can look through NuGet's code to see how it uses the packages internally, and use that as a guide to figure out what you want to do. If you go down this path, you'll need to do most of what the NuGet tools do:
Query a NuGet feed to find what versions of a package are available, then choose which version you want to use and check whether that version is already downloaded (see the sketch after this list).
Download and extract the package.
Asset selection: particularly when a package has libraries for multiple TFMs, you need to know which one to use based on your project's TFM.
The NuGet client at this stage would either write the project.assets.json file used by the .NET SDK, or edit the packages.config and csproj files, depending on whether the project uses PackageReference or packages.config. In your case, this is where you'd integrate with assembly hot-reload.
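To make the first three steps concrete, here's a hedged sketch using the NuGet client packages (NuGet.Protocol, NuGet.Packaging, NuGet.Frameworks). The package id and TFM are just examples, and as noted above these APIs have no docs, so treat it as a starting point rather than the blessed way:

    using System.IO;
    using System.Linq;
    using System.Threading;
    using NuGet.Common;
    using NuGet.Frameworks;
    using NuGet.Packaging;
    using NuGet.Protocol;
    using NuGet.Protocol.Core.Types;
    using NuGet.Versioning;

    var cache = new SourceCacheContext();
    var repo = Repository.Factory.GetCoreV3("https://api.nuget.org/v3/index.json");
    var byId = await repo.GetResourceAsync<FindPackageByIdResource>();

    // 1. Find the latest stable version of the package.
    var versions = await byId.GetAllVersionsAsync(
        "Newtonsoft.Json", cache, NullLogger.Instance, CancellationToken.None);
    NuGetVersion latest = versions.Where(v => !v.IsPrerelease).Max();

    // 2. Download the .nupkg (it's a zip archive) and open it.
    using var nupkg = new MemoryStream();
    await byId.CopyNupkgToStreamAsync(
        "Newtonsoft.Json", latest, nupkg, cache, NullLogger.Instance, CancellationToken.None);
    using var reader = new PackageArchiveReader(nupkg);

    // 3. Asset selection: pick the lib/ group nearest to the project's TFM.
    var projectTfm = NuGetFramework.Parse("net6.0");
    var libGroups = reader.GetLibItems().ToList();
    var nearest = new FrameworkReducer().GetNearest(
        projectTfm, libGroups.Select(g => g.TargetFramework));
    var dlls = libGroups.First(g => g.TargetFramework.Equals(nearest))
                        .Items.Where(i => i.EndsWith(".dll"));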
Note that in the general case the process above (or a single step of it, depending on how you implement things) can be recursive, because a package can have dependencies, so the dependencies also need to be retrieved. But a package can have different dependencies depending on which TFM is selected, so you need to decide whether to download all dependencies, even ones that will never be used after asset selection, or to do asset selection first to minimise the packages you download. Also, when multiple packages depend on different versions of a single package, you need to decide which version of that package you want to use.
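As an idea of the recursive part, here's a sketch of a dependency walk using DependencyInfoResource from the same client packages. It makes the naive policy choices (first version of an id wins; take each range's minimum version), which are exactly the decisions described above that you'd need to make properly:

    using System.Collections.Generic;
    using System.Threading;
    using System.Threading.Tasks;
    using NuGet.Common;
    using NuGet.Frameworks;
    using NuGet.Packaging.Core;
    using NuGet.Protocol.Core.Types;

    static async Task WalkAsync(
        PackageIdentity package, NuGetFramework tfm, SourceRepository repo,
        SourceCacheContext cache, IDictionary<string, SourcePackageDependencyInfo> found)
    {
        if (found.ContainsKey(package.Id)) return; // naive: first version wins

        var resource = await repo.GetResourceAsync<DependencyInfoResource>();
        var info = await resource.ResolvePackage(
            package, tfm, cache, NullLogger.Instance, CancellationToken.None);
        if (info == null) return; // id/version not found on this feed

        found[package.Id] = info;
        foreach (var dep in info.Dependencies)
            await WalkAsync( // naive: take each range's minimum version
                new PackageIdentity(dep.Id, dep.VersionRange.MinVersion),
                tfm, repo, cache, found);
    }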
So, that's the high-level algorithm you'll need to implement. As I mentioned, most of this doesn't have API docs with examples of exactly how to implement it, and very few examples, if any, exist on the internet. Like I wrote in the first paragraph, I don't think this is a good idea. It's probably easier to automate your CI/CD pipelines, and if necessary automate changes to a load balancer's configuration, than to build such a complex app. Sometimes orchestrating a bunch of simple apps is easier than making one complex app.

Related

Is it possible to develop using local projects instead of having to publish every change to nuget?

I'm working on a project that has several Azure-based applications, as well as several Windows services, etc. Needless to say, it's a bunch of individual applications that are deployed out to Azure, or elsewhere, and are all expected to work together.
We use NuGet for versioning our underlying library projects. Every feature or change results in a bump to the NuGet version, a package published to our private NuGet server, and a subsequent update to every other application that needs it. This is currently a tedious manual task, but it is not even my most immediate source of frustration.
The thing I struggle with the most, currently, is doing development on a feature that requires changes across the entire set of applications, from bottom to top, and having to constantly push out and update NuGet packages just to develop and debug.
Prior to using NuGet, we might have just added all of these projects as direct dependencies on disk, which removes versioning but instantly lets me develop against my local changes.
Now, with NuGet, I can't develop against local changes without pushing out a new package.
Is there a workflow I'm missing that would allow me to still use NuGet, but also be able to make changes and work locally without having to push and pull NuGet packages all the time?
Can I somehow develop against local projects, but also somehow have the project dependencies know to use the NuGet packages?
I ran into this issue when setting up a shared NuGet repo for my company. You can set up a local NuGet feed and 'publish' just by dropping files into a folder. This is extremely useful for local testing before you're ready to publish to the shared repo.
Also, NuGet uses semantic versioning. I find it useful to have pre-release versions by using a tag like MyLibrary.1.0.0-prerelease-12345, so you can still have incremental builds, but most other apps will not be notified of the changes until you create a stable release such as MyLibrary.1.0.1. This could require you to make some changes to your DevOps process, but it allows multiple developers to test your package before 'officially' releasing it.
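As a sketch using the modern dotnet CLI (the path, feed name, and version are placeholders; older toolchains would use nuget.exe and a nuget.config entry instead):

    # Pack with an explicit pre-release version into a local folder feed
    dotnet pack -p:PackageVersion=1.0.0-prerelease-12345 -o C:\LocalNuGet

    # Register the folder as a package source (once per machine)
    dotnet nuget add source C:\LocalNuGet --name local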
If your issue is that you want to be able to easily update multiple applications locally and test those changes, I have occasionally found it useful to create a single solution file encompassing all my projects, so I can quickly open, update, and build everything in one Visual Studio instance. However, this is not particularly scalable, so you might be better off writing PowerShell scripts for automation.
Update: Another solution that you might find useful is NuLink. I have never tried it, so I can't actually endorse it, but it purports to provide similar functionality to npm link (and actually uses symlinks, just like npm does).
Given the projects are all in the same repo, just use project references instead of package references.
When you pack a project, NuGet will convert project references into NuGet dependencies, and the dependency version will match the version of the referenced project if/when it is packed.
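For example, with a project reference like this (the path is illustrative):

    <ItemGroup>
      <!-- A live project reference during development -->
      <ProjectReference Include="..\MyCompany.Core\MyCompany.Core.csproj" />
    </ItemGroup>

running dotnet pack produces a package whose .nuspec lists MyCompany.Core as a normal NuGet dependency, with the version taken from that project.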
Check this answer, where you could:
build the dependency's code locally to produce DLLs.
replace the DLLs in your machine's NuGet cache folder corresponding to your dependency with the local DLLs produced in the previous step.
That's a quick way to see changes locally without publish-consume cycles.
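As a sketch, assuming the default global cache location and made-up version/TFM folders (note this affects every project on the machine that restores that exact package version):

    rem Build the dependency locally
    dotnet build C:\src\MyLibrary -c Release

    rem Overwrite the cached copy (package folders in the cache are lowercase)
    copy /Y C:\src\MyLibrary\bin\Release\netstandard2.0\MyLibrary.dll ^
        %USERPROFILE%\.nuget\packages\mylibrary\1.2.3\lib\netstandard2.0\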

Is there a way to have multiple applications reference a global project that has other project references

After searching quite a bit, I seem to be unable to find an answer to my problem. Usually I find that means it's a non-existent or incorrect approach, but I think it's worth having an answer floating around on the internet nonetheless.
Essentially, we have 4 applications referencing 5 different "source" projects. So the scenario is, when we add a 5th application (for example), we will need to create project references to those 5 different projects, as the application requires their output.
It's not a difficult task because the number of projects is small, but it got us thinking: what if you could create a single project, maybe called Libs or something, reference all 5 projects from that project, and then have the applications reference only Libs? The idea seems cool, but I don't know if it will work, because when you create a project reference, it points to Libs' single output, libs.dll.
So, to actually ask a question: is this possible, and if so, how can it be done? Currently, having Libs reference the other "source" projects and then having the applications reference the Libs project does not work, as it says there are missing assemblies.
And just to go over how this was created: the 5 source projects reside in a couple of different solutions, so the only tedious part of this approach is "add existing project" at the initial creation of each application's solution.
The way we manage this kind of thing in my organisation is to make a NuGet package for each of these shared "source" projects (e.g. in our case we have an error logging library, an XML utils library, a bespoke HTTP client, and others). These are published to our private NuGet feed URL (hosted on Azure DevOps, but you can just use a standard Windows fileshare if necessary) for our developers to use in their applications.
This has some clear advantages over your approach:
1) Dependencies - this seems most relevant to your question. If the project you built the NuGet package from itself depends on any other NuGet packages (either publicly available ones, or others from our private feed), then when someone installs that package in their project, it will automatically install all the other packages it depends on.
So in your case you could create a shell "libs" package which doesn't deliver any content itself but has dependencies on all your other packages, causing them to be installed automatically. In our case we have several layers of dependency (e.g. a "base" error logging package which is relied on by error handling modules tailored to different app types, e.g. MVC, Web API, Windows services), and it works very well; a csproj sketch of such a meta-package follows after point 2.
2) Updates and maintenance. In your scenario, if you make a breaking change to one of your "source" projects then, because of the direct project reference declared in Visual Studio, any project which references it will have to make related changes to cope with the update before you can re-compile it and do whatever feature changes you're trying to achieve. This could be a pain, and an untimely problem, especially in the case of major updates. However, if you instead install a NuGet package containing that functionality, the developer of each application can choose if and when to install an updated version of the package.
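Picking up the promise from point 1, a modern SDK-style meta-package might look like this (package ids and versions are made up; IncludeBuildOutput=false keeps the package's own empty assembly out of the .nupkg):

    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <TargetFramework>netstandard2.0</TargetFramework>
        <PackageId>MyCompany.Libs</PackageId>
        <Version>1.0.0</Version>
        <!-- Ship no assembly of our own; this package exists only for its dependencies -->
        <IncludeBuildOutput>false</IncludeBuildOutput>
      </PropertyGroup>
      <ItemGroup>
        <PackageReference Include="MyCompany.ErrorLogging" Version="2.1.0" />
        <PackageReference Include="MyCompany.XmlUtils" Version="1.4.2" />
        <PackageReference Include="MyCompany.HttpClient" Version="3.0.0" />
      </ItemGroup>
    </Project>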
There are other minor benefits as well which I won't go into, but there are some very good reasons why almost all major programming languages now provide similar "package" and "feed" functionality as a way of managing dependencies on external projects and libraries. Your approach is, I feel, outdated and naive, resulting in the issue you've described and the potential for other irritations as well.

NuGet or assemblies folder for shared 3rd party DLLs?

I have about 10-15 projects with separate solutions that reference 3rd-party DLLs in a Microsoft .NET shop. One of the problems we want to address is consistency in the DLL versions used across projects (e.g. Newtonsoft 8.0.3 in all projects, as opposed to separate versions depending on when each project was created).
I have seen this done in two separate ways in my previous positions and was wondering if there are any other options to solve this problem.
I have used a corporate NuGet feed for all third-party DLLs referenced within any solution across the company. The DLLs would be updated and then made available to the developers to pull down and upgrade (if needed) within their solutions on their own.
Another company had an assemblies folder in source control that housed all "approved" third-party DLLs, and all references pointed into this directory.
I did see this question but it only offered one of the two solutions above: Where should you store 3rd party assemblies?
Are there other options aside from the ones listed above?
Whenever possible use NuGet. The primary reason is that Git doesn't handle large binaries well, and using LFS for them doesn't make much sense when there is a valid alternative. TFVC has fewer issues with large binaries, but I'd keep a future migration to Git in mind if you're on TFVC.
Keep in mind that not just NuGet, but likely also npm and other package sources are of interest in this case.
If you want to enforce a certain version being used, create a custom task that you hook into the CI pipeline. That way you can easily emit warnings or set up some kind of policy. The custom task could take the packages.config file, scan the referenced packages, and then query the TFS/VSTS package management feed to see whether each one is using the latest version (or the latest minor version, or is at most x versions behind), or fetch the approved versions from a JSON or XML file somewhere and validate against those.
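A hedged sketch of such a check, here against a hard-coded approved list (the feed query or JSON/XML source would slot in where the dictionary is built):

    using System;
    using System.Collections.Generic;
    using System.Xml.Linq;

    // Approved versions; in practice fetched from a feed, JSON, or XML file.
    var approved = new Dictionary<string, string>
    {
        ["Newtonsoft.Json"] = "8.0.3", // example entry
    };

    foreach (var package in XDocument.Load("packages.config").Descendants("package"))
    {
        string id = (string)package.Attribute("id");
        string version = (string)package.Attribute("version");

        if (approved.TryGetValue(id, out string wanted) && wanted != version)
            Console.WriteLine($"WARNING: {id} is {version}, approved version is {wanted}");
    }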
In your source control, commit and push the desired dependency DLLs to master when the repository is first populated. Since all users, even on other branches, will then be pulling from the repository, you're ensuring they receive all the DLLs they need. This only becomes a problem if you're referring to DLLs in the GAC, which is resolved either by gacutil or by just making sure everyone is using the same Windows version.

How do you go about using 'lib' folders in .Net projects for building in dependencies?

I need to get my external DLL dependencies (automap, others ...) into a situation where I can build them on my build server. I would also like to get them into Subversion, so the build server can pick them up.
So I'm new to the whole 'lib' folder thing. I've searched Google, but it seems it's kind of assumed knowledge; there are no write-ups of the basics of what to do here. The books I own don't go into it. It's been a long time since I had a mentor at work, or even someone I could ask questions of... and I'd really love to understand the fundamentals of what I should be doing here.
I write in .NET, use Jenkins as my CI server (new to that) and MSBuild (new to that too). I'm hearing svn:externals (doesn't compute), NuGet...
Please help!
Suppose my solution is called MySolution and is stored in C:\MySolution; then I have three directories for binaries, all managed by source control.
vendor: the source code of third-party frameworks. If needed, they are built and signed (with my key) and treated as if the code were my own. This is sometimes necessary to "fix" defects in a framework or to debug its source to understand why it fails.
src\packages: modules managed by NuGet (I wished to combine this with my "lib" folder, but that isn't yet supported).
lib: compiled libraries for which I don't have the source and that are not managed by NuGet.
(I have omitted folders like "src", "sample", "setup", "documentation" and "scripts" to keep the answer specific to the OP).
In recent months I have started to create my own NuGet packages for the binaries in the lib folder, so I can migrate all of them to "packages". They are published to a private NuGet server. This also simplifies managing the binaries across solutions.
I used to use svn:externals, but they become a branching nightmare after a while, because you have to branch and pin the external dependencies too. With NuGet this is no longer needed.
I would definitely avoid putting binaries in source control. You'd be far better off creating your own NuGet repository containing your preferred versions of packages, and either using nuget restore or some other way of "rehydrating" your dependencies for building. I use a simple batch file called nuget-update.bat which just looks at all packages.config files and gets any dependencies it finds.
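I don't have the exact file to hand, but it amounts to something like this (a reconstruction; assumes nuget.exe is on the PATH):

    @echo off
    rem Restore every packages.config found under the current directory
    rem into a shared packages folder.
    for /r %%f in (packages.config) do (
        nuget install "%%f" -OutputDirectory packages
    )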
It seems that you posted a sequence of questions on the same topic. I recommend NuGet, as it has become critical and is promoted hard by Microsoft. However, many old libraries are not available there, so you may still need to keep a lib folder. My open-source project #SNMP is a good example: I tried to use as many NuGet packages as possible, and even stepped up to maintain some of the dependencies, such as DockPanel Suite.

Looking for an msbuild and xbuild task to fetch referenced libraries (without nuget.exe)

I have an issue with creating an easy solution for my build system, which is based on Mono.
The current situation is that I keep my referenced libraries inside the git repository, which is not good, for many obvious reasons.
What I want to achieve is something like what NuGet provides: automatically download DLLs from the web, put them in some directory, and forget about them.
I want to do this at build time, so it would not require any additional actions like downloading libraries manually. The best option would be an MSBuild (xbuild on Mono) task, but I want it to be system-independent, so the popular approach of executing NuGet.exe is out of the question (consider parallel Mono installations, etc.).
I've tried the Pepita project, but it's... wrong. No, really, it is; it has too many design mistakes to be easy to use or repair. Making a proper configuration would require a serious rewrite of the whole project.
What I would love is a library that employs the NuGet.Core library and is available as a task. If such a lib isn't out there, I could use any solution that would download a NuGet package and unpack it to a directory specified in the .csproj.
Even better, it would be nice if such a library could resolve dependencies without them being specified explicitly in a packages.config (or similar) file, e.g. if I want to include Castle.Windsor, I don't want to also have to include Castle.Core in my config file.
I know about the OpenWrap project (with its NuGet gallery); it looks promising, but I can't find a solution where I would just put a constant set of libraries in my repo once, modify the csproj files and some configs, and have it done.
I can tell you that OpenWrap at its core has everything built in to do what you want. Everything you can do with the openwrap-shell is also available to be called from MSBuild. So it seems to me that you would just need to add a before-build hook to call out to OpenWrap to perform an "update-wrap". Several months back I actually looked into doing something similar; AFAIR I wrote an MSBuild script to call OpenWrap tasks, but didn't really hook them into the normal build process.
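Something like this in the project file, I'd guess (untested; assumes the openwrap-shell's "o" command is on the PATH):

    <!-- Hypothetical pre-build hook: pull updated wraps before compiling -->
    <Target Name="BeforeBuild">
      <Exec Command="o update-wrap" />
    </Target>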
I don't know exactly what you mean by "put a constant set of libraries in your repo once". For OpenWrap, all you need to do is maintain the "OpenWrap descriptor" for your project. That file contains all direct dependencies of your project (with or without restrictions on version numbers); indirect dependencies are pulled in automatically.
Or are you wondering how to get started when you have a bunch of binary DLLs? I can tell you what I did. Basically, I do not use any NuGet packages; I created OpenWrap packages for everything, including all our binary dependencies (some of which are open source). This is really super simple: you fill in the correct dependencies in the OpenWrap descriptor and specify that the package must contain only the given DLLs. We had a bunch of binary dependencies, but once you start packaging them, it's really not that much work.
If you want to see an example, you can check this one:
http://code.google.com/p/ppwcode/source/browse/dotnet/External/Apache.Log4Net/trunk/Apache.Log4Net.wrapdesc
That is all you have to do to package your binary dependencies. This is a package I created, and we currently use it in the company where I work. I know Log4Net is probably available as a NuGet package, and I could probably use that. The advantage of creating these binary packages myself is that I have full control over them: over their version numbering, over how a big project is split into several smaller packages, and so on.
As an OpenWrap repo you can use a folder on your local filesystem, or a folder on a network share. What we use is actually a WebDAV repository that we mount locally as a drive (using Windows 7). This works fine for us and also allows us to specify who has read and write access to the repository.
You mention Mono... well, that might be a problem: the currently released version of OpenWrap (2.0.2) does not run on Mono, AFAIK. The good news is that Sebastien Lambla has been working hard to get OpenWrap running on Mono+xbuild for the new version that is going to be released very soon: 2.0.3. No alpha/beta builds are available yet, but you can build from git (in that case you would need to build both openwrap-shell and openwrap). Sebastien Lambla, who created OpenWrap, normally keeps an eye on questions on Stack Overflow and will probably be able to give you a more complete answer on the Mono status.
By the way, where I work we have been using OpenWrap for over a year now. Back then we compared both NuGet and OpenWrap, and at that moment OpenWrap was way, way ahead of NuGet. Basically, to me, NuGet was not a tool for dependency management but a tool to assist you in Visual Studio in pulling binary dependencies from a remote server (meaning: copy a DLL from the remote server to a local folder and add a reference to the local DLL in the project file). In the meantime, NuGet has been playing catch-up with OpenWrap and has added functionality that already existed in OpenWrap. In my opinion there are only two things NuGet has over OpenWrap: integration in Visual Studio (i.e. an overview of remotely available packages and click-click-click adding of packages), and the fact that it is maintained by Microsoft people (AFAIK). Both are really political points: it's easier to convince people with a pretty interface and Microsoft support. Personally, however, I think OpenWrap is technically superior, and it's really a pity that it doesn't get the attention it deserves.
