I am relatively new to the new .NET project format, and was looking to solicit some opinions on how best to manage a library's dependencies.
Let's say I am writing a library (intended to be shared as a NuGet package) which contains some ASP.NET core functionality.
In my library's csproj file, I can add a reference to Microsoft.AspNetCore and happily use everything I want from it or any of its transitive dependencies.
An alternative approach would be to explicitly add a reference for each part of ASP.NET Core that I use, as I start using types from those packages. For example, adding a package reference to Microsoft.AspNetCore.Http.Abstractions the first time I use HttpContext.
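Concretely, the difference in my csproj would look something like this (a rough sketch; the version numbers are just placeholders):

<!-- First approach: reference the metapackage and rely on its transitive dependencies -->
<ItemGroup>
  <PackageReference Include="Microsoft.AspNetCore" Version="2.1.0" />
</ItemGroup>

<!-- Second approach: reference only the packages whose types I actually use,
     e.g. Microsoft.AspNetCore.Http.Abstractions for HttpContext -->
<ItemGroup>
  <PackageReference Include="Microsoft.AspNetCore.Http.Abstractions" Version="2.1.0" />
</ItemGroup>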
To me, the latter approach seems preferable. It provides a clearer description of my package's actual dependencies, and it is not susceptible to breaking the way a reference to Microsoft.AspNetCore could be if a later version dropped Microsoft.AspNetCore.Http.Abstractions from its transitive dependency graph.
If the general consensus is indeed the latter approach described above, is there any way to enforce this behaviour? For example, by failing a build if a project references types in a transitive dependency?
Really interested to hear everyone's thoughts on this!
I'm an experienced C# developer and the maintainer of several NuGet packages, though those packages don't actually have any dependencies of their own.
I agree that the latter approach of referencing only the exact packages you use is preferable. This has several advantages:
Helps prevent conflicts between assembly versions by reducing the number of packages your package pulls in
Minimizes the total added size to projects using your package
May enable advanced scenarios for users of your package
I recently had to file a bug due to the third scenario. The xUnit library provides its assertions in two different packages: xunit.assert and xunit.assert.source. The second enables you to extend the assertion class with custom assertions. I was using the xunit.categories package, and it references xunit, which in turn references several packages including xunit.assert. So when I tried to switch to xunit.assert.source I got a conflict, because the two packages contain the same classes. However, xunit.categories doesn't even use the assertion portion of xUnit; it should reference only xunit.core.
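To make the conflict concrete, my test project's references ended up looking something like this (a rough sketch; the versions are placeholders, not the exact ones I had):

<ItemGroup>
  <!-- what I actually want from xUnit itself -->
  <PackageReference Include="xunit.core" Version="2.4.0" />
  <PackageReference Include="xunit.assert.source" Version="2.4.0" />
  <!-- pulls in xunit, and with it xunit.assert, which clashes with xunit.assert.source -->
  <PackageReference Include="xunit.categories" Version="2.0.0" />
</ItemGroup>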
I think I prefer explicitly referencing packages. However, I can understand the answer may be subjective.
In any case, there is a tool that helps you find transitive package references:
https://github.com/spectresystems/snitch
I want to build my app once using the latest and greatest version of the main NuGet package.
But due to an unresolved error I also need to build it with a lower version of the same NuGet package for the hosted use case.
From this question, I gather it is not possible in one project. Do I need to split it into several projects?
I am using Microsoft.NET.Sdk.
The actual code does not change when switching packages. (The APIs I use from the NuGet package stay the same.)
Edit:
At the moment I am just changing one number in the NuGet reference and building again to have both versions. Can this be automated with some build infrastructure?
Edit 2: Twitter suggests it's possible.
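To clarify what I have in mind: the version in the csproj could be driven by an MSBuild property (a rough sketch, assuming an SDK-style project; the package name and version numbers are placeholders), so the two builds could then be scripted:

<PropertyGroup>
  <!-- default version, can be overridden from the command line -->
  <MainPackageVersion Condition="'$(MainPackageVersion)' == ''">2.0.0</MainPackageVersion>
</PropertyGroup>

<ItemGroup>
  <PackageReference Include="Some.Package" Version="$(MainPackageVersion)" />
</ItemGroup>

and then something like:

dotnet build
dotnet build -p:MainPackageVersion=1.0.0 -o bin\hosted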
It's either not possible or not worth the effort. Use a lower version of the same NuGet package or fix the bug. It might be possible by dynamically loading the DLLs, e.g. Assembly.LoadFile("mylibv1.dll") and Assembly.LoadFile("mylibv2.dll"), but that sounds more complicated than just using a lower version or fixing the bug. Otherwise, how will the application know which FSharp.Core.dll it needs to load and use?
After searching quite a bit, I seem to be unable to find an answer to my problem. Usually I find that means the approach is nonexistent or incorrect, but I think it is worth having an answer floating around on the internet nonetheless.
Essentially, we have 4 applications referencing 5 different "source" projects. So the scenario is, when we add a 5th application (for example), we will need to create project references to those 5 projects, as the application requires their output.
It's not a difficult task because the number of projects is small, but it got us thinking: what if you could create a single project, maybe called Libs or something, reference all 5 projects in that project, and then have the applications reference only Libs? The idea seems cool, but I don't know if it will work, because when you create a project reference, it points to Libs' single output, Libs.dll.
So, to actually ask a question: is this possible, and if so, how can it be done? Currently, having Libs reference the other "source" projects and then having the applications reference the Libs project does not work, as the build complains about missing assemblies.
And just to go over how this was created: the 5 source projects reside in a couple of different solutions, so the only tedious part of this approach is "Add Existing Project" when initially setting up the application's solution.
The way we manage this kind of thing in my organisation is to make a NuGet package for each of these shared "source" projects (e.g. in our case we have an error logging library, an XML utils library, a bespoke HTTP client, and others). These are published to our private NuGet feed URL (hosted on Azure DevOps, but you can just use a standard Windows fileshare if necessary) for our developers to use in their applications.
This has some clear advantages over your approach:
1) Dependencies - this seems most relevant to your question. If the project you built the NuGet package from itself depends on any other NuGet packages (either publicly available ones, or others from our private feed) then when someone installs that package in their project it will automatically install all the other packages it depends on.
So in your case you could create a shell "libs" package which doesn't deliver any content itself, but has dependencies on all your other packages, causing them to be installed automatically (a rough sketch of such a package follows after this list). In our case we have several layers of dependencies (e.g. a "base" error logging package which is relied on by error handling modules tailored to different app types, e.g. MVC, Web API, Windows Services), and it works very well.
2) Updates and maintenance. In your scenario, if you make a breaking change to one of your "source" projects, then, because you have a direct project reference declared in Visual Studio, any project which references it will have to be updated to cope with that change before you can re-compile it and make whatever feature changes you're actually trying to achieve. This could be a pain, and an untimely one, especially in the case of major updates. However, if you instead install a NuGet package containing that functionality, the developer of the application can choose if and when to move to an updated version of the package.
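For illustration, such a shell "libs" package can be little more than a .nuspec with a dependencies list and no content. All of the ids and versions below are made-up examples:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.Libs</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Meta-package that pulls in all of our shared library packages.</description>
    <dependencies>
      <!-- no files are shipped; installing this package just installs the dependencies -->
      <dependency id="MyCompany.ErrorLogging" version="1.0.0" />
      <dependency id="MyCompany.XmlUtils" version="1.0.0" />
      <dependency id="MyCompany.HttpClient" version="1.0.0" />
    </dependencies>
  </metadata>
</package>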
There are other minor benefits as well which I won't go into, but there are some very good reasons why almost all major programming languages now provide similar "package" and "feed" functionality as a way of managing dependencies on external projects and libraries. Your approach is, I feel, outdated and naive, resulting in the issue you've described and the potential for other irritations as well.
I have about 10-15 projects, with separate solutions, that reference 3rd-party DLLs in a Microsoft .NET shop. One of the problems we want to address is consistency of the DLL versions used across projects (e.g. Newtonsoft.Json 8.0.3 in all projects, as opposed to different versions depending on when the project was created).
I have seen this done in two separate ways in my previous positions and was wondering if there are any other options to solve this problem.
I have used a corporate NuGet feed for all third-party DLLs referenced within a solution for any project within the company. The DLLs would be updated and then made available to the developers to pull down and upgrade (if needed) within their solutions on their own.
Another company had an assemblies folder in source control that housed all "approved" third-party DLLs, and all references pointed into this directory.
I did see this question but it only offered one of the two solutions above: Where should you store 3rd party assemblies?
Are there other options aside from the ones listed above?
Whenever possible, use NuGet. The primary reason is that Git doesn't handle large binaries very well, and using LFS for this doesn't make much sense since there is a valid alternative. TFVC has fewer issues with large binaries, but I'd keep a future migration to Git in mind if you're on TFVC.
Keep in mind that not just NuGet, but likely also npm and other package sources are of interest in this case.
If you want to enforce a certain version being used, create a custom task that you hook into the CI pipeline. That way you can easily give off warnings or set up some kind of policy. The custom task could take the packages.config file, scan the referenced packages and then query the TFS/VSTS package management feed to see whether each one is using the latest version (or the latest minor version, or a version at most x versions back), or it could fetch the approved versions from a JSON or XML file somewhere and validate against that.
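As a rough illustration of the same idea - though as an MSBuild target that runs at build time rather than as a CI task, and assuming PackageReference-style projects rather than packages.config - something like this in a shared Directory.Build.targets could fail the build on an unapproved version (the package id and version are just examples):

<Project>
  <Target Name="CheckApprovedPackageVersions" BeforeTargets="Build">
    <!-- batches over every PackageReference in the project -->
    <Error Condition="'%(PackageReference.Identity)' == 'Newtonsoft.Json' And '%(PackageReference.Version)' != '8.0.3'"
           Text="Newtonsoft.Json must be version 8.0.3, but this project references %(PackageReference.Version)." />
  </Target>
</Project>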
In your source control, commit and push the desired dependency DLLs to master when the repository is first populated. Since all users, even on other branches, will then be pulling from the repository, you ensure they receive all the DLLs they need. This only becomes a problem if you're referring to DLLs in the GAC, which is resolved either with gacutil or by just making sure everyone is using the same Windows version.
I'm looking for some thoughts on which approach would be considered best practice.
Scenario:
.NET SVN repository consisting of multiple projects, some sharing the same NuGet packages. Multiple solution files also exist, each containing a subset (some overlapping) of the projects.
Example:
a.sln
--> proj1 - log4net, nunit(v1.0.0)
--> proj2 - log4net, json
--> proj3 - log4net, nunit(v1.0.0)
b.sln
--> proj2 - log4net, json
--> proj4 - nunit(v1.0.0)
The problem arises when a developer opens a.sln and updates the nunit(v1.0.0) package for all projects within that solution to, say, v2.0.0. This leaves proj4's nunit still on v1.0.0, and assuming all binaries are copied to one output folder, the solution that is built first will determine which version of nunit becomes available.
Approach #1:
Compose one project, used by both a.sln and b.sln, whose sole purpose is to contain all the NuGet references for all the projects, and have it output all files to a folder, say Externals. Modify all projects to manually reference the DLLs inside this folder (a rough sketch follows after the list of approaches).
Approach #2:
Be diligent when updating packages and repeat the process for each solution.
Approach #3:
Create one solution that contains all the projects and avoid multiple solutions.
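To illustrate Approach #1 a bit more concretely (a rough sketch only; it assumes classic csproj files, an Externals folder at the repository root, and a post-build step that copies the packages' DLLs there - all names are invented), the consuming projects would swap their NuGet references for plain file references:

<ItemGroup>
  <Reference Include="nunit.framework">
    <HintPath>..\Externals\nunit.framework.dll</HintPath>
    <Private>True</Private>
  </Reference>
  <Reference Include="log4net">
    <HintPath>..\Externals\log4net.dll</HintPath>
    <Private>True</Private>
  </Reference>
</ItemGroup>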
I have a strong preference but would like to get community feedback and be open minded.
Approach #1 - the problem I see with that is that it forces updates onto other projects, leading to potential compilation issues. Also, you may have a project that would prefer the older NuGet package.
Approach #2 - as per #1, and in addition, unless developers are aware of the other solutions, they might not think to update them. We developers can be finicky when working on a bug with blinders on.
Approach #3 - though somewhat easier from a maintenance perspective, it does not guarantee that all projects are using the same NuGet package, e.g. two different versions of nunit. In Visual Studio, when installing and/or updating a NuGet package, the developer can override which projects will receive the package. Depending on the size of your projects, some developers might not like the idea of creating a monolithic solution (even if projects can be unloaded via right-click, Unload Project) due to the general performance impact.
In my opinion, Approach #3 may be easier due to the convenience of mass-installing the same package into all your projects in one step. Approach #1 comes close too, but I fear you will find that you're going to have to deploy the NuGet packages into the other projects regardless, due to dependencies - "Type xxx is defined in an assembly that is not referenced. Please add it to references".
Incompatible Versions
This leaves proj4's nunit still on v1.0.0, and assuming all binaries are copied to one output folder, the solution that is built first will determine which version of nunit becomes available.
Agreed, this strikes me as the ultimate problem and something we have run into. Even if your team is disciplined about using, say, log4net version X everywhere, there is a good chance you will run into a situation where some third-party library you use also requires log4net, but version Y. So the issue of incompatible versions arises again.
AppDomains
If you find that your solution must be deployed with multiple versions of 3rd-party assemblies, you may want to look into child .NET AppDomains. The idea is that you deploy your assemblies not into one big folder (where older files can be clobbered by newer ones, or vice versa), but into a root folder plus a child folder for each set of assemblies bound to a particular version of a third-party .dll.
The reason you need child AppDomains is that an assembly may not be loaded more than once per AppDomain, even if the copies are located in different folders.
e.g.
<my app folder>
|--folder a
|--log4net v1.dll
|--nunit 3.dll
|--folder b
|--log4net v2.dll
|--nunit 4.dll
After all, NUnit uses child AppDomains when running your tests.
I was working on a solution with multiple projects and I updated Newtonsoft.Json in one of the projects (a class library). This caused some errors in another project (an ASP.NET Web API project) because it also needed its reference to Newtonsoft.Json to be updated.
The error was thrown at run time, not compile time, and I had to manually install Newtonsoft.Json in each of the projects within the solution.
Install-Package Newtonsoft.Json
This led me to wonder whether it is possible to update all third-party libraries at the solution level.
I know you can create a folder to store the libraries and add that to source control, but I would like to avoid that.
I want NuGet to download the libraries after fetching the solution from the repository. I was wondering if it is possible to:
Create a new project in the solution called something like www.solution.com.thirdparty
Add references to all the third-party software using NuGet
Expose all the third-party libraries via www.solution.com.thirdparty
I don't know how I can do the third step, and I would also like to know if there is a better way to do this.
Use packages at solution level, as described in this post.
Alternatively you can have a look at Paket, it has a nice transitive dependency resolution algorithm that may work for you.