What is the expected behavior when different, independent .csproj projects are built/published to the same folder using dotnet publish?
Imagine that Alpha.csproj requires NuGet package foo at 1.0.0, and that Bravo.csproj requires version 2.0.0 of the same package. My concern is that two separate dotnet publish invocations on these distinct projects, pointed at the same target folder, will cause the foo dependency to be overwritten, thus breaking the deployment.
I know that NuGet often stores its binaries in version-specific subfolders, but it is also common for publish to place dependencies directly into the root of the publish folder. So, at face value, there is room for unexpected conflicts.
So far the way I've solved this anticipated problem is to place both projects into the same solution, and then publish the solution. I figure that a single publish command is smart enough to resolve the differences (storing different dependency versions of foo into different subfolders). However, what if these projects are in different solutions?
I say "anticipated" problem because I've not actually spent time trying it. I have tight deadlines and can't afford to deal with potentially obscure bugs emerging at runtime; I don't want to discover the answer to my question "the hard way."
Understanding this dynamic is particularly important for me, because I'm building .NET Core 3.1 plugins that I'd like to publish/deploy into an existing framework folder. The application is designed to scan for new plugins and load them.
The expected behavior is that the last project published will overwrite existing files in the publish directory, which can lead to dependency version conflicts.
Single-file deployments have a better chance of not creating a conflict when publishing to the same directory, at the expense of a larger overall deployment size.
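Since the scenario here is plugins, one way to sidestep the overwriting entirely is to publish each plugin into its own subfolder and load it through an isolated AssemblyLoadContext, the standard plugin-loading mechanism in .NET Core 3.x. A minimal sketch, assuming the host already knows each plugin's path (the PluginLoadContext name is mine):

    using System.Reflection;
    using System.Runtime.Loader;

    // Each plugin folder gets its own load context, so two plugins can carry
    // different versions of the same NuGet dependency side by side.
    class PluginLoadContext : AssemblyLoadContext
    {
        private readonly AssemblyDependencyResolver _resolver;

        public PluginLoadContext(string pluginPath)
        {
            // Resolves dependencies using the plugin's own .deps.json.
            _resolver = new AssemblyDependencyResolver(pluginPath);
        }

        protected override Assembly Load(AssemblyName assemblyName)
        {
            string path = _resolver.ResolveAssemblyToPath(assemblyName);
            // Returning null defers to the default (host) context, which is
            // what you want for types shared with the host application.
            return path != null ? LoadFromAssemblyPath(path) : null;
        }
    }

    // Usage sketch:
    //   var ctx = new PluginLoadContext(pluginDllPath);
    //   var asm = ctx.LoadFromAssemblyPath(pluginDllPath);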
Related
I have one base nuget library called Foundation.dll.
I have another five NuGet libraries, each of which depends on a different version of Foundation.dll.
Everything is in one project.
My question is: when I build the project, VS/.NET is obviously going to put only one Foundation.dll in the bin/debug folder. So how does VS/.NET decide which NuGet package's Foundation.dll is put in the bin/debug folder? Is it random?
If I reference Foundation.dll directly in the project, then my directly referenced version is put into the bin/debug folder, but for some other developers on the team an older version is put there instead.
It is very scary that the exact same branch code works differently on two different machines. I added one argument to one of Foundation.dll's methods, and for one developer it works, but for another developer the exact same code gives a compilation error.
What is the ultimate solution to this problem? What change should I make in my project?
Thank you.
This is a difficult topic, and yes, there are a lot of factors that determine which version ends up in the bin folder. Normally, the build chooses the latest version from all the dependencies automatically. But particularly if you have several "final" assemblies in your solution (e.g. exes, unit test libraries), it sometimes gets this wrong. Usually, the code works anyway, but I agree, this is scary.
The actual outcome may depend on the build order and the build environment (whether building from the command line or within VS, etc.). My team and I have had a hard time figuring out the best way around this problem.
The safest approach we found is to reference the latest version of your package directly in the project. This does not need to be the latest version available, but the latest version used anywhere within your solution. Of course, this only works if the versions are backwards compatible. If some library requires an older version of the dependency and you can't rebuild that library, you are in for really big trouble.
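In csproj terms, that direct reference is just an explicit top-level package reference at the highest version used anywhere in the solution. A minimal sketch, assuming an SDK-style project (the 5.0.0 version number is a placeholder for whatever your highest version actually is):

    <!-- Pin Foundation directly so the direct reference wins over the
         versions pulled in transitively by the other five packages. -->
    <ItemGroup>
      <PackageReference Include="Foundation" Version="5.0.0" />
    </ItemGroup>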
I've hit the same issue. One project in the solution depended on the .NET Standard version of a library, while the others expected a classic framework version.
As PMF wrote, which library finally ends up in the output depends on the build order.
I solved it by copying the .NET Standard version into the output folder in a post-build event.
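For reference, that kind of post-build copy can be expressed as an MSBuild target in the project file. A sketch only; the source path and file name are placeholders for wherever the .NET Standard build actually lives:

    <!-- Hypothetical post-build step: overwrite whatever the build put in
         the output folder with the .NET Standard build of the library. -->
    <Target Name="CopyNetStandardLib" AfterTargets="Build">
      <Copy SourceFiles="$(SolutionDir)libs\netstandard2.0\MyLib.dll"
            DestinationFolder="$(OutDir)" />
    </Target>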
The output should not depend on random things; there are some (complex) rules for these situations:
https://learn.microsoft.com/en-us/nuget/concepts/dependency-resolution
It should choose the direct dependency over the indirect ones.
Do you still get different results when doing "rebuild all"?
After searching quite a bit, I seem to be unable to find an answer to my problem. Usually, I find that means it is a non-existent or incorrect approach, but I think it is worth having an answer floating around on the internet nonetheless.
Essentially, we have 4 applications referencing 5 different "source" projects. So the scenario is, when we add a 5th application (for example), we will need to create project references to those 5 projects, as the application requires their output.
It's not a difficult task because the number of projects is small, but it got us thinking. What if you could create a single project, maybe called Libs or something, reference all 5 projects in that project, and then have the applications reference only Libs? The idea seems cool, but I don't know if it will work, because when you create a project reference, it points to Libs' single output, libs.dll.
So to actually ask a question, is this possible, and if so, how can it be done? Currently, having Libs reference the other "source" projects, and then having the applications reference the Lib project does not work, as it says there are missing assemblies.
And just to go over how this was created: the 5 source projects reside in a couple of different solutions, so the only tedious part of this approach is "add existing project" at the initial setup of the application's solution.
The way we manage this kind of thing in my organisation is to make a NuGet package for each of these shared "source" projects (e.g. in our case we have an error logging library, an XML utils library, a bespoke HTTP client, and others). These are published to our private NuGet feed URL (hosted on Azure DevOps, but you can just use a standard Windows fileshare if necessary) for our developers to use in their applications.
This has some clear advantages over your approach:
1) Dependencies - this seems most relevant to your question. If the project you built the NuGet package from itself depends on any other NuGet packages (either publicly available ones, or others from our private feed) then when someone installs that package in their project it will automatically install all the other packages it depends on.
So in your case you could create a shell "libs" package which doesn't deliver any content itself, but has dependencies on all your other packages, causing them to be installed automatically (see the sketch after point 2 below). In our case we have several layers of dependency (e.g. a "base" error logging package which is relied on by error handling modules tailored to different app types, e.g. MVC, Web API, Windows Services), and it works very well.
2) Updates and maintenance. In your scenario, if you make a breaking change to one of your "source" projects then, because there is a direct project reference declared in Visual Studio, every project referencing it must immediately adapt to the change before it can be re-compiled, regardless of whatever feature changes you were actually trying to make. This could be a pain, and an untimely problem, especially for major updates. If instead you install a NuGet package containing that functionality, the developer of the application can choose if and when to install an updated version of the package.
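Coming back to the shell "libs" package from point 1, a sketch of what its .nuspec might look like (all ids and versions here are placeholders):

    <?xml version="1.0"?>
    <package>
      <metadata>
        <id>MyCompany.Libs</id>
        <version>1.0.0</version>
        <authors>MyCompany</authors>
        <description>Meta-package: no content of its own, only dependencies.</description>
        <dependencies>
          <dependency id="MyCompany.ErrorLogging" version="1.2.0" />
          <dependency id="MyCompany.XmlUtils" version="2.0.1" />
          <dependency id="MyCompany.HttpClient" version="3.4.0" />
        </dependencies>
      </metadata>
    </package>

Packing this with nuget pack produces a package that delivers no files but pulls in the whole set on install.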
There are other minor benefits as well which I won't go into, but there are some very good reasons why almost all major programming languages now provide similar "package" and "feed" functionality as a way of managing dependency on external projects and libraries. Your approach is, I feel, outdated and naive, resulting in the issue you've described and the potential for other irritations as well.
I have about 10-15 projects with separate solutions that reference 3rd party DLLs in a Microsoft .NET shop. One of the problems we want to address is consistency of the DLL versions used across projects (e.g. Newtonsoft 8.0.3 in all projects, as opposed to whatever version was current when each project was created).
I have seen this done in two separate ways in my previous positions and was wondering if there are any other options to solve this problem.
I have used a corporate NuGet feed for all third party DLLs referenced by any project within the company. The DLLs would be updated and then made available to the developers, who would pull them down and upgrade (if needed) within their solutions on their own.
Another company had an assemblies folder in source control that housed all "approved" third party DLLs, and all references pointed into this directory.
I did see this question but it only offered one of the two solutions above: Where should you store 3rd party assemblies?
Are there other options aside from the ones listed above?
Whenever possible, use NuGet. The primary reason is that Git doesn't handle large binaries well, and using LFS for this doesn't make much sense when a valid alternative exists. TFVC has fewer issues with large binaries, but I'd keep a future migration to Git in mind if you're on TFVC.
Keep in mind that not just NuGet, but likely also npm and other package sources are of interest in this case.
If you want to enforce a certain version being used, create a custom task that you hook into the CI pipeline. That way you can easily emit warnings or set up some kind of policy. The custom task could take the packages.config file, scan the referenced packages, and then query the TFS/VSTS package management feed to see if each package is on the latest version (or the latest minor version, or at most x versions back), or fetch the approved versions from a JSON or XML file somewhere and validate against that.
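As a rough illustration of the core of such a task, here is a sketch that checks a packages.config against a hard-coded approved list (the list, and the TeamCity-style service message used for reporting, are stand-ins for whatever policy source and CI server you actually use):

    using System;
    using System.Collections.Generic;
    using System.Xml.Linq;

    class PackageVersionCheck
    {
        static void Main(string[] args)
        {
            // Hypothetical policy: the approved version for each package id.
            var approved = new Dictionary<string, string>
            {
                ["Newtonsoft.Json"] = "8.0.3",
            };

            // packages.config lists each referenced package with its version.
            foreach (var pkg in XDocument.Load(args[0]).Descendants("package"))
            {
                string id = (string)pkg.Attribute("id");
                string version = (string)pkg.Attribute("version");
                if (approved.TryGetValue(id, out string want) && want != version)
                    Console.WriteLine(
                        $"##teamcity[message text='{id} {version} does not match approved {want}' status='WARNING']");
            }
        }
    }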
In your source control, commit and push the desired dependency DLLs to master when the repository is first populated. Since all users, even on other branches, will then be pulling from the repository, you're ensuring they receive all the DLLs they need. This only becomes a problem if you're referring to DLLs in the GAC, which is resolved either by gacutil or just by making sure everyone is using the same Windows version.
I'm looking for some thoughts on which approach would be considered best practice.
Scenario:
.NET SVN repository consisting of multiple projects, some sharing the same NuGet packages. Multiple solution files also exist, each containing a (sometimes overlapping) subset of the projects.
Example:
a.sln
--> proj1 - log4net, nunit(v1.0.0)
--> proj2 - log4net, json
--> proj3 - log4net, nunit(v1.0.0)
b.sln
--> proj2 - log4net, json
--> proj4 - nunit(v1.0.0)
The problem arises when a developer opens a.sln and updates the nunit (v1.0.0) package for all projects within that solution to, say, v2.0.0. This leaves proj4's nunit still on v1.0.0, and assuming all binaries are copied to one output folder, the solution that is built first will determine which version of nunit becomes available.
Approach #1:
Compose one project, used by both a.sln and b.sln, whose sole purpose is to contain all the NuGet references for all the projects, and have it output all files to a folder, say Externals. Modify all projects to manually reference the DLLs inside this folder (see the sketch after the list of approaches).
Approach #2:
Be diligent when updating packages and repeat the process for each solution.
Approach #3:
Create one solution that contains all the projects and avoid multiple solutions.
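For Approach #1, the manual references in each consuming project would look something like the following (a sketch; the Externals location and the assembly chosen are placeholders):

    <!-- Hypothetical manual reference into the shared Externals folder,
         bypassing per-project NuGet restore for this assembly. -->
    <ItemGroup>
      <Reference Include="nunit.framework">
        <HintPath>$(SolutionDir)Externals\nunit.framework.dll</HintPath>
      </Reference>
    </ItemGroup>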
I have a strong preference but would like to get community feedback and keep an open mind.
Approach #1 - the problem I see with it is that it forces updates onto other projects, leading to potential compilation issues. Also, you may have a project that would prefer the older NuGet package.
Approach #2 - as per #1, and in addition, unless developers are across the other solutions, they might not think to update them. We developers can be finicky when working on a bug with blinders on.
Approach #3 - though somewhat easier from a maintenance perspective, it does not guarantee that all projects use the same NuGet package version, e.g. you can still end up with two different versions of nunit, because when installing and/or updating a NuGet package in Visual Studio the developer can override which projects receive it. Also, depending on the size of your projects, some developers might not like the idea of a monolithic solution (even if projects can be right-click Unloaded) due to the general performance impact.
In my opinion Approach #3 may be easier due to the convenience of mass installing the same package to all your projects in one step. Approach #1 comes close too but I fear you will find that you're going to have to deploy the NuGet packages into the other projects regardless due to dependencies - "Type xxx is defined in an assembly that is not referenced. Please add it to references".
Incompatible Versions
This leaves proj4's nunit still on v1.0.0, and assuming all binaries are copied to one output folder, the solution that is built first will determine which version of nunit becomes available.
Agreed, this strikes me as the ultimate problem, and something we have run into. Even if your team is disciplined about using, say, log4net version X everywhere, chances are good that some third party library you use will itself require log4net version Y. So the issue of incompatible versions rises again.
AppDomains
If you find that your solution must be deployed with multiple versions of 3rd party assemblies, you may want to look into child .NET AppDomains. The idea is that you deploy your assemblies not into one big folder (where older files can be clobbered by newer ones, or vice versa), but into a root folder plus a child folder for each set of assemblies bound to a particular third-party .dll version.
The reason you need child AppDomains is that an assembly may not be loaded more than once per AppDomain even if the other assemblies are located in different folders.
e.g.
<my app folder>
|--folder a
|  |--log4net v1.dll
|  |--nunit 3.dll
|--folder b
   |--log4net v2.dll
   |--nunit 4.dll
After all, NUnit itself uses child AppDomains when running your tests.
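A sketch of the host side, assuming the folder layout above and a hypothetical Plugins assembly in each child folder that exposes a MarshalByRefObject-derived Worker type:

    using System;
    using System.IO;

    class ChildDomainHost
    {
        static void Main()
        {
            string root = AppDomain.CurrentDomain.BaseDirectory;
            foreach (string dir in new[] { "folder a", "folder b" })
            {
                // ApplicationBase makes each child domain resolve assemblies
                // (its own log4net/nunit versions) from its own folder.
                var setup = new AppDomainSetup
                {
                    ApplicationBase = Path.Combine(root, dir)
                };
                AppDomain child = AppDomain.CreateDomain(dir, null, setup);

                // Hypothetical entry point living in each folder's copy.
                object worker = child.CreateInstanceAndUnwrap(
                    "Plugins", "Plugins.Worker");
                // ... call into worker through an interface the host defines ...

                AppDomain.Unload(child);
            }
        }
    }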
POSSIBLY POINTLESS BACKSTORY
In an effort to get out of NuGet hell (formerly dependency hell) my team has decided to switch to considerably larger solutions focused on our major divisions in the company. In the past we would have a core DLL that contained the most common code for a particular system and multiple solutions that pulled that DLL in through NuGet. This resulted in a lot of packages, and even more solutions, which ultimately meant a highly fractured collection of code for each division.
Our new approach is to move towards using a large comprehensive core library that gets exposed through a Web API. We also plan to serve up the core DLL through a NuGet package for situations where Web API performance is not suitable.
A basic solution will have 3 projects: Core, API, and Wrapper. The Wrapper project provides methods for accessing the API through simple methods rather than re-writing all the Web API calls in our apps that will use it. More complex solutions will have additional projects such as Windows services and console apps to run as scheduled tasks.
The 3 basic projects will all share the same version number because they're tightly coupled. The problem arises when including other projects that should not use the same version number in their builds.
THE REAL QUESTION
I have a solution with 4 projects: A, B, C, and D. A-C are always updated together and share the same version number. D is a service project with its own logic that can change independently of the other projects, and therefore should have its own version numbering. However, to avoid confusion, D should be deployed with the correctly versioned DLLs from A-C. (Meaning that if D is on 2.0.0, and A-C are on 1.0.0, the DLLs from A-C in the install directory should show 1.0.0 in their details.)
With that in mind, is there a way in TeamCity to build and control the version numbering so that the projects that need to be unique can be, and still reference the correct dependencies?
Notes:
I know that the easiest solution is to simply move the special projects to their own solutions and reference the core DLL, but that's part of what we're trying to get away from.
We want to be able to have proper build numbers on our finished files, as well as have tags in Git from successful builds.
As far as I can tell, the AssemblyInfo Patcher feature in TeamCity can only overwrite the whole version. If it could simply overwrite the build number then I could control version numbers right in my source code.
Well, you are right: AssemblyInfo Patcher won't help in this case, as it updates all AssemblyInfo files (or the global assembly info file). There is no straightforward way, I'm afraid, but I think you could try something like the following:
Rather than building the solution, build the individual projects using their .csproj files, i.e. break compilation into 4 steps, one for each project (A-D).
Use AssemblyInfo Patcher for the A-C projects so they can simply use the TeamCity version %build.number%.
Add a pre-build event to D's csproj to update its assembly info. You'll need to define a variable in TeamCity for the first three parts of its version, say DVersion (1.0.0); the last part can come from the %build.counter% variable. Pass this version (%DVersion%.%build.counter%) as a parameter to the D csproj build step.
Finally, update the TeamCity build number by writing a service message to the build log, something like ##teamcity[buildNumber '%build.number%-%DVersion%']