Split a large software project on Azure DevOps Repos - C#

We are currently migrating from IBM ClearCase as a centralized version control system to Git on Azure DevOps. It is a large, long-grown software project in C#, with more than 100 C# projects distributed over about a dozen solutions and some dependencies in between, currently handled via project references.
So far all those solutions were managed with ClearCase in a common folder; for Git, however, the best practice seems to be one solution per repository, with cross-repo dependencies handled via NuGet packages.
I wanted to ask about experiences migrating and splitting such projects: did you come across situations where it was better to include multiple C# solutions in one repo?
How do you manage multiple repositories that belong to one piece of software with a single release cycle? We plan on using release branches, and I think we will have to write a script that branches all repositories belonging to the project. Or is there a more convenient way, perhaps one provided by Azure DevOps?
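Such a branching script can be quite small. A hedged sketch (the repository names, list file, and release-branch name are assumptions; local throwaway repositories stand in for the real Azure DevOps clone URLs, which would normally look like https://dev.azure.com/&lt;org&gt;/&lt;project&gt;/_git/&lt;repo&gt;, so that the sketch is runnable as-is):

```shell
set -e
work=$(mktemp -d) && cd "$work"

# Local stand-ins for the real Azure DevOps remotes; in practice
# repos.txt would simply list your clone URLs.
for r in CoreLibs MainApp; do
  git init -q "$r"
  git -C "$r" -c user.email=ci@example.com -c user.name=ci \
      commit -q --allow-empty -m "initial"
  echo "$work/$r" >> repos.txt
done

# The actual branching loop: clone, branch, push, for every listed repo.
BRANCH="release/1.0"
mkdir checkout && cd checkout
while read -r url; do
  name=$(basename "$url")
  git clone -q "$url" "$name"
  git -C "$name" checkout -q -b "$BRANCH"
  git -C "$name" push -q origin "$BRANCH"
done < ../repos.txt
```

The same loop could call the Azure DevOps REST API or `az repos` instead of plain git, but plain git keeps the script portable.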

I have written before on ClearCase migration to Git.
In all instances, the scenario was the same:
don't import the full history, only major labels or UCM baselines
split VOBs per project, each project becoming one Git repository
revisit what was versioned in the VOBs: some large files/binaries might need to be .gitignore'd in the new Git repository.
You can still reference all your Git repositories (C# projects) in one parent Git repository, through Git submodules.
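A hedged sketch of that parent-repository layout (a local throwaway repository stands in for a real per-solution Azure DevOps remote so the commands run as-is; `protocol.file.allow` is only needed because the submodule URL is a local path):

```shell
set -e
root=$(mktemp -d) && cd "$root"

# Local stand-in for one per-solution repository.
git init -q SolutionA
git -C SolutionA -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "initial"

# Parent repository that aggregates the solutions as submodules.
git init -q parent && cd parent
git -c protocol.file.allow=always submodule add "$root/SolutionA" SolutionA
git -c user.email=dev@example.com -c user.name=dev \
    commit -qm "Add SolutionA as a submodule"
```

The parent repository then pins each solution to an exact commit, which gives you a reproducible "release set" across repositories.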
You can also go the monorepo route: after all, that is what Microsoft is doing with its "The largest Git repo on the planet".
But in that case, you might need to use Scalar, and sparse checkouts.
See:
"Bring your monorepo down to size with sparse-checkout"
"Make your monorepo feel small with Git’s sparse index"
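The sparse-checkout workflow from those articles looks roughly like this (a hedged sketch; a throwaway local repository with two solution folders stands in for the monorepo so the commands run as-is):

```shell
set -e
demo=$(mktemp -d) && cd "$demo"

# Throwaway stand-in for the monorepo: two solution folders.
git init -q mono
mkdir -p mono/SolutionA mono/SolutionB
echo "class A {}" > mono/SolutionA/A.cs
echo "class B {}" > mono/SolutionB/B.cs
git -C mono add .
git -C mono -c user.email=dev@example.com -c user.name=dev \
    commit -qm "monorepo layout"

# Clone, then restrict the working tree to the solution you work on.
git clone -q mono working
git -C working sparse-checkout init --cone   # cone mode: directory patterns
git -C working sparse-checkout set SolutionA # keep only SolutionA on disk
```

After `sparse-checkout set`, files outside the selected cone disappear from the working tree but remain in history.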

Related

Dotnet Publish to Same Folder

What is the expected behavior when different, independent .csproj projects are built/published to the same folder using dotnet publish?
Imagine that Alpha.csproj requires NuGet package foo at 1.0.0, and that Bravo.csproj requires version 2.0.0 of the same dependency. My concern is that two separate dotnet publish invocations on these distinct projects, pointed at the same target folder, will cause the foo dependency to be overwritten, thus breaking the deployment.
I know that NuGet often stores its binaries in subfolders, often distinguished by version numbers, but it is also common for it to place dependencies directly into the main publish folder. So, at face value, there is room for unexpected conflicts.
So far the way I've solved this anticipated problem is to place both projects into the same solution, and then publish the solution. I figure that a single publish command is smart enough to resolve the differences (storing different dependency versions of foo into different subfolders). However, what if these projects are in different solutions?
I say "anticipated" problem because I've not actually spent time trying it. I have tight deadlines and can't afford to deal with potentially obscure bugs emerging at runtime; I don't want to discover the answer to my question "the hard way."
Understanding this dynamic is particularly important for me, because I'm building .NET Core 3.1 plugins that I'd like to publish/deploy to an existing framework folder. The application is designed to scan for new plugins and load them.
The expected behavior is that the last project published will overwrite existing files in the publish directory and thus can lead to dependency version conflicts.
Single-file deployments have a better chance of not creating a conflict when publishing to the same directory at the expense of a larger overall deployment size.
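A single-file, self-contained publish can be requested in the project file. A hedged sketch (these are standard .NET SDK publish properties, but the runtime identifier is illustrative, and whether a single-file deployment suits a plugin that the host scans and loads depends on the host's loading mechanism):

```xml
<!-- Illustrative publish settings; trades deployment size for
     fewer loose files that could collide in a shared folder. -->
<PropertyGroup>
  <PublishSingleFile>true</PublishSingleFile>
  <SelfContained>true</SelfContained>
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
</PropertyGroup>
```

Alternatively, publishing each project to its own subfolder sidesteps the overwrite problem entirely.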

Is there a way to have multiple applications reference a global project that has other project references

After searching quite a bit, I have been unable to find an answer to my problem. Usually that means the approach is nonexistent or incorrect, but I think it is worth having an answer floating around on the internet nonetheless.
Essentially, we have 4 applications referencing 5 different "source" projects. So when we add a 5th application (for example), we will need to create project references to those 5 projects, as the application requires their output.
It's not a difficult task because the number of projects is small, but it got us thinking: what if you could create a single project, maybe called Libs, reference all 5 projects from it, and then have the applications reference only Libs? The idea seems cool, but I don't know if it will work, because a project reference points only to Libs' single output, libs.dll.
So, to actually ask a question: is this possible, and if so, how can it be done? Currently, having Libs reference the other "source" projects and then having the applications reference the Libs project does not work, as it reports missing assemblies.
And just to go over how this was created: the 5 source projects reside in a couple of different solutions, so the only tedious part of this approach is "add existing project" at the initial start of each application's solution.
The way we manage this kind of thing in my organisation is to make a NuGet package for each of these shared "source" projects (e.g. in our case we have an error logging library, an XML utils library, a bespoke HTTP client, and others). These are published to our private NuGet feed URL (hosted on Azure DevOps, but you can just use a standard Windows fileshare if necessary) for our developers to use in their applications.
This has some clear advantages over your approach:
1) Dependencies - this seems most relevant to your question. If the project you built the NuGet package from itself depends on any other NuGet packages (either publicly available ones, or others from our private feed) then when someone installs that package in their project it will automatically install all the other packages it depends on.
So in your case you could create a shell "libs" package which doesn't deliver any content itself, but has dependencies on all your other packages, causing them to be installed automatically. In our case we have several cases of dependency (e.g. a "base" error logging package which is relied on by error handling modules which are tailored to different app types, e.g. MVC, Web API, Windows Services), and it works very well.
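The shell "libs" package described above is, in NuGet terms, a dependency-only metapackage. A hedged .nuspec sketch (package ids and versions are hypothetical, loosely following the shared libraries mentioned earlier):

```xml
<!-- Hypothetical metapackage: installing MyCompany.Libs delivers no
     content itself, but pulls in every shared library as a dependency. -->
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>MyCompany.Libs</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Meta-package that installs all shared source libraries.</description>
    <dependencies>
      <dependency id="MyCompany.ErrorLogging" version="1.0.0" />
      <dependency id="MyCompany.XmlUtils" version="1.0.0" />
      <dependency id="MyCompany.HttpClient" version="1.0.0" />
    </dependencies>
  </metadata>
</package>
```

Packing this with `nuget pack` and pushing it to the private feed gives every application a single install step.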
2) Updates and maintenance. In your scenario, if you make a breaking change to one of your "source" projects then, because of the direct project reference declared in Visual Studio, any project which references the source one will have to adapt to that change before it can be re-compiled, on top of whatever feature changes you're trying to achieve. This can be a pain, and an untimely problem, especially for major updates. If you instead install a NuGet package containing that functionality, the developer of the application can choose if and when to install an updated version of the package.
There are other minor benefits as well which I won't go into, but there are some very good reasons why almost all major programming languages now provide similar "package" and "feed" functionality as a way of managing dependency on external projects and libraries. Your approach is, I feel, outdated and naive, resulting in the issue you've described and the potential for other irritations as well.

NuGet or assemblies folder for shared 3rd party DLL's?

I have about 10-15 projects with separate solutions that reference 3rd party DLLs in a Microsoft .NET shop. One of the problems we want to address is the consistency of the DLL versions used across projects (e.g. Newtonsoft 8.0.3 in all projects, as opposed to whatever version was current when each project was created).
I have seen this done in two separate ways in my previous positions and was wondering if there are any other options to solve this problem.
I have used a corporate NuGet for all third party DLL's referenced within a solution for any project within the company. The DLL's would be updated and then made available to the developers in the projects to pull down and upgrade (if needed) within the solutions on their own.
Another company had an assemblies folder in source that housed all "approved" third party DLL's and all references lived within this directory.
I did see this question but it only offered one of the two solutions above: Where should you store 3rd party assemblies?
Are there other options aside from the ones listed above?
Whenever possible, use NuGet. The primary reason is that Git doesn't handle large binaries well, and using LFS for this doesn't make much sense since there is a valid alternative. TFVC has fewer issues with large binaries, but I'd keep a future migration to Git in mind if you're on TFVC.
Keep in mind that not just NuGet, but likely also npm and other package sources are of interest in this case.
If you want to enforce that a certain version is used, create a custom task that you hook into the CI pipeline. That way you can easily emit warnings or set up some kind of policy. The custom task could take the packages.config file, scan the referenced packages, and then query the TFS/VSTS package management feed to see whether each package is using the latest version (or the latest minor version, or is at most x versions behind), or fetch the approved versions from a JSON or XML file somewhere and validate against that.
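The core of such a policy check fits in a few lines. A hedged sketch (the file names, the "id version" approved-list format, and the sample package versions are all assumptions; a real task would read the files from the build workspace instead of generating them):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"

# Hypothetical inputs: a packages.config as NuGet writes it, and a
# plain-text list of approved versions.
cat > packages.config <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Newtonsoft.Json" version="8.0.3" targetFramework="net45" />
  <package id="NLog" version="4.3.0" targetFramework="net45" />
</packages>
EOF

cat > approved-versions.txt <<'EOF'
Newtonsoft.Json 8.0.3
NLog 4.4.12
EOF

# Warn for every package that is not at its approved version.
warnings=$(while read -r id version; do
  grep -q "id=\"$id\" version=\"$version\"" packages.config \
    || echo "WARNING: $id is not at approved version $version"
done < approved-versions.txt)
echo "$warnings"
```

A pipeline task would then turn non-empty output into a build warning or failure, depending on the policy.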
In your source control, commit and push the desired dependency DLLs when the repository is first populated. Since all users, even on other branches, will then be pulling from the repository, you ensure they receive all the DLLs they need. This only becomes a problem if you're referring to DLLs in the GAC, which is resolved either by GACUtil or simply by making sure everyone is using the same Windows version.

Suggested setup for OS project with several NuGet packages

I decided to share a small project I have been working on for a while. It's basically a development framework for distributed applications.
Now I'm setting up the GitHub repository and I wanted to use AppVeyor for continuous integration and I'm struggling with the setup.
As of now, my framework is composed of a few packages:
Interfaces: defines the basic interfaces, extracted as a package so that you don't need to bring the whole framework in your business logic library
Core: contains the basic components of the framework
CastleWindsor: contains support for the Castle IoC container
RabbitMQ: contains the implementation of the engine based on RabbitMQ
Now, it's pretty easy to set up AppVeyor to build and push a single project/package to NuGet.
But what I'm looking for is:
build everything on push (includes testing)
create packages and publish them only on a specific action.
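Those two goals map onto an appveyor.yml along these lines (a hedged sketch: the image, solution name, and encrypted key are placeholders, and it assumes "specific action" means pushing a tag):

```yaml
# Build and test on every push; pack and publish only on a tag.
image: Visual Studio 2019

build_script:
  - dotnet build MyFramework.sln -c Release
test_script:
  - dotnet test MyFramework.sln -c Release --no-build
after_build:
  - dotnet pack MyFramework.sln -c Release --no-build -o artifacts

artifacts:
  - path: artifacts\*.nupkg

deploy:
  - provider: NuGet
    api_key:
      secure: <encrypted-key>
    on:
      appveyor_repo_tag: true   # publish only when a tag is pushed
```

The `on:` condition is what separates "build everything on push" from "publish only on a specific action".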
I'm wondering if I should create multiple repositories on GitHub (one for each package) and have multiple projects on AppVeyor as well, or maybe a single repository with multiple projects.
Thanks for your insights.

How do you go about using 'lib' folders in .Net projects for building in dependencies?

I need to get my external DLL dependencies (AutoMap, others...) into a situation where I can build with them on my build server. I would also like to get them into Subversion, so the build server can pick them up.
So I'm new to the whole 'lib' folder thing. I've searched Google, but it seems it's kind of assumed, there are no basics of what to do here. The books I own don't go into it. It's been a long time since I had a mentor at work, or even someone I could ask questions of ... and I'd really love to understand the fundamentals of what I should be doing here.
I write in .NET, use Jenkins as my CI server (new to that) and MSBuild (new to that too). I'm hearing about svn:externals (doesn't compute), NuGet...
Please help!
Suppose my solution is called MySolution and is stored in C:\MySolution; then I have three directories for binaries, all managed by source control.
vendor the source code of third-party frameworks. If needed, they are built and signed (with my key) and treated as if the code were my own. This is sometimes necessary to "fix" defects in a framework, or to debug its source to understand why it fails.
src\packages modules managed by NuGet (I wish I could combine this with my "lib" folder, but that isn't yet supported)
lib compiled libraries for which I don't have the source and that are not managed by NuGet.
(I have omitted folders like "src", "sample", "setup", "documentation" and "scripts" to keep the answer specific to the OP).
In recent months I have started to create my own NuGet packages for the binaries in the lib folder, so I can migrate all of them to "packages". They are published to a private NuGet server. This also simplifies managing the binaries across solutions.
I used to use svn:externals, but they become a branching nightmare after a while, because you have to branch and pin the external dependencies too. With NuGet this is no longer needed.
I would definitely avoid putting binaries in source control. You'd be far better off creating your own NuGet repository containing your preferred versions of packages, and either using nuget restore or some other way of "rehydrating" your dependencies for building. I use a simple batch file called nuget-update.bat which just looks at all packages.config files and fetches any dependencies it finds.
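The "rehydrate" step amounts to walking the tree for packages.config files and restoring each one. A hedged sketch (folder layout is hypothetical; the loop prints the restore command it would run, so the sketch is runnable without the NuGet CLI; swap the echo for the real invocation in use):

```shell
set -e
demo=$(mktemp -d) && cd "$demo"

# Hypothetical solution layout with two projects using packages.config.
mkdir -p src/App src/Lib
touch src/App/packages.config src/Lib/packages.config

# Dry run: one restore command per packages.config found.
cmds=$(find . -name packages.config | while read -r cfg; do
  echo "nuget restore $cfg -PackagesDirectory src/packages"
done)
printf '%s\n' "$cmds"
```

On modern SDK-style projects, `dotnet restore` at the solution level replaces this loop entirely.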
It seems that you have posted a sequence of questions on the same topic. I recommend NuGet, as it has become critical and is promoted heavily by Microsoft. However, many old libraries are not available there, and you may still need to keep a lib folder. My open source project #SNMP is a good example:
I tried to use as many NuGet packages as possible, and even stepped up to maintain some of the dependencies, such as DockPanel Suite.
