Is C# compile/build an incremental process?

Our solution contains lots of C# projects with complicated dependency relationships between them, e.g. projects A, B, and C, where A depends on B and B depends on C. If I change one file in project C and then rebuild the solution, projects A, B, and C are all rebuilt.
In C++, the build has two stages, compile and link. If I change one file in project C and build the solution, only the affected files in A and B are compiled (the other files are not; their .obj files are reused in the link stage), and then the link runs.
In Java, only the changed file in project C is recompiled; the other files' .class outputs are kept and packaged into the .jar. The previous output (the unchanged files' .class files) is reused.
In short, C# doesn't seem to reuse any previous output. It has no intermediate files like Java's .class or C++'s .obj, so from this point of view C# doesn't appear to build incrementally: a small change causes a big rebuild. I don't understand why C# doesn't reuse previous output to speed up the build.
I am not sure whether my understanding of the C# compile/build process is right. Could you please explain? Thanks a lot.

The C# compiler does incremental compilation; I'm not sure where you got the idea that it doesn't. Maybe, due to the complexity of your solution, you are not reading the dependencies correctly, and projects that you assumed would not need to be recompiled in fact do.
The best way to check the compiler's behavior is to create a simple dummy solution and play around with it:
Setup:
Create an empty Visual Studio C# solution.
Add any two projects, A and B.
Make project B a reference in project A.
Implement a class FooInB in B and use it from a class BarInA in A, for example:
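Here is a minimal sketch of those two classes (the method names are arbitrary; only the fact that BarInA uses FooInB matters):

// Project B
namespace B
{
    public class FooInB
    {
        public string Greet() => "Hello from B";
    }
}

// Project A, which references Project B
namespace A
{
    public class BarInA
    {
        public string CallFoo() => new B.FooInB().Greet();
    }
}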
Now let's play around a bit with this setup:
Compile the solution. You will see that both projects compile.
Compile the solution again. You will see that none of the projects compile, both are up to date.
Change the implementation in BarInA and compile again. You will see that only one project compiles, A. There is no need to compile B again as there are no changes.
Change the implementation in FooInB and compile one last time. You will see that both projects compile. This behaviour is correct: A depends on B, so any change in B requires A to be recompiled to make sure it points at the latest version of B. A hypothetical compiler that could detect that the changes in B have no consequences for A, and could therefore "optimize" away building A again, would create a nightmare scenario in which each project could be referencing different and outdated assembly versions.
That said, I'd like to point out that, AFAIK, the C# compiler only performs incremental compilation at the project level. I am not aware of any incremental compilation optimizations at the class level inside any given assembly. Someone with much more insight into the inner workings of the compiler might be able to clarify this behavior.

You are kind of right.
If project A depends on project B, a change in project B does make a recompile of project A necessary.
If project B depends on project C, a change in project B does not make project C recompile.
In Java, each class/file is compiled to its own .class file, so the whole project does not need to be recompiled when a single file changes.
Before distribution these .class files are combined into a .jar. If a single .class file changes, the whole .jar also needs to be reassembled.
In .NET a project is compiled directly to an assembly (.dll or .exe), so a single file change requires a recompile of the whole assembly.
The same behaviour is seen in Java environments where the .jar (or a package such as an .apk) is used to run/debug the application: the whole .jar is needed during development, and the whole project is recompiled when a single file changes.
You can always place projects outside of your solution and add a reference only to the .dll they produce. This will minimize compile times.

OK, you are actually on the right track. If there are three projects in the solution, you can add a reference from one project to another, so that project A depends on project B, which depends on project C.
When you build a project, all the built files and DLLs (from the post-build events) end up in the bin folder of that project.
So when you build project A, project C builds first (because A -> B -> C). Project B uses the built components of project C and creates its own components, and project A uses the components of B and C to create its own.
Because of this, if you build only project A and the referencing is correct, you will see all the build outputs of B and C in project A's bin folder.

I have a solution with many projects and even more cross-project references.
For simplicity, let's say I have projects A, B, and C.
A is my main desktop exe, which references B and C.
B contains Windows Forms code and references C.
C contains business and database logic.
My main problem is that B takes about forty-five seconds to compile. Now let's say I change only a small line of code in C: the compiler will compile C, after that B, then A.
Since I mostly change code in C, which compiles very quickly, I always have to wait for B to compile.
I discovered .NET Demon from Redgate, a tool that changes the build process and rebuilds B only if the public API of C changes. This can have a massive impact on your build times; based on my personal experience, I would say it saves about 80% of the time spent building over a day.
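To illustrate what "only if the public API changes" means, here is a hedged example (the class is made up): editing a method body in C leaves C's public surface unchanged, while changing a signature does not.

// Project C
public class PriceCalculator
{
    // Editing only this body leaves C's public API unchanged,
    // so a tool like .NET Demon could skip rebuilding B.
    public decimal Calculate(decimal net)
    {
        return net * 1.19m;
    }

    // Changing this method's signature (for example, adding or removing a parameter)
    // changes the public API, so B would have to be rebuilt.
    public decimal Calculate(decimal net, decimal discount)
    {
        return (net - discount) * 1.19m;
    }
}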
However, I just looked and they aren't selling it anymore: https://www.red-gate.com/products/dotnet-development/dotnet-demon/
Visual Studio 2015 will introduce Microsoft's new Roslyn compiler, with improvements which we believe make .NET Demon redundant.
I am still on Visual Studio 2013, but I think my initial observation of the build process, and the other answers about the .NET build, may no longer hold for Visual Studio 2015 with Roslyn. Maybe it behaves more like the .NET Demon build.
On a side note: my newer projects use IoC containers, and everything is wired up in the main project without many cross-project references. This also improves the build process, since almost everything can be built in parallel.

All versions of C# (MSBuild, really) support incremental compilation of non-dependent projects.
C# in VS2003 supported incremental compilation within projects. There is a UserVoice suggestion to bring the feature forward into the latest C# version.

I've tested with the Jenkins MSBuild plugin; the incremental build worked as expected, and only the changed projects were recompiled.
What I want is an incremental deploy, deploying only the changed/recompiled DLLs. How can I efficiently find the recompiled DLLs?
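One simple way to approach that, as a sketch only (the paths and the marker file are assumptions, and it relies on the incremental build rewriting only the assemblies that actually changed): compare each DLL's last-write time against a timestamp recorded at the previous deploy and copy only the newer ones.

using System;
using System.IO;

class IncrementalDeploy
{
    static void Main()
    {
        // Assumed locations; adjust for your build layout.
        var buildOutput = @"C:\build\bin";
        var deployTarget = @"\\server\app\bin";
        var markerFile = Path.Combine(deployTarget, "last-deploy.txt");

        // Time of the previous deploy (or "never" on the first run).
        var lastDeploy = File.Exists(markerFile)
            ? File.GetLastWriteTimeUtc(markerFile)
            : DateTime.MinValue;

        foreach (var dll in Directory.EnumerateFiles(buildOutput, "*.dll"))
        {
            // Copy only the assemblies written after the previous deploy.
            if (File.GetLastWriteTimeUtc(dll) > lastDeploy)
            {
                File.Copy(dll, Path.Combine(deployTarget, Path.GetFileName(dll)), true);
            }
        }

        // Record this deploy so the next run has a baseline.
        File.WriteAllText(markerFile, DateTime.UtcNow.ToString("o"));
    }
}

If timestamps are unreliable on your build agents, comparing file hashes against the previously deployed copies is a more robust variant of the same idea.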

Related

Impact of adding dll reference vs project reference

Is there any build-time impact? We have around 30 projects in our .NET solution, and the shared projects are added by project reference. I am thinking of changing to DLL references instead to see if there is any build performance gain.
Does anyone have experience along similar lines? Please share your thoughts.
Yes, there is potentially a huge impact depending on how you have your CI set up.
The solution to this is to group logical sections of your application (data access, presentation, whatever else) into separate solutions and turn them into NuGet packages. I've had a lot of success combining TFS build, Release Management, and NuGet to automate the continuous delivery of NuGet packages from "prerelease" to "stable".
You can have it package up PDB files as well for debugging purposes, and using NuGet also helps with sharing code between different disparate projects. If Project A is using version 1.2.3 of Package X, but you've updated Package X to version 2.0.0 for Project B, Project A can happily keep consuming version 1.2.3.
One thing to keep in mind when doing a split like this:
const variables are replaced at compile time across all assemblies with the literal value. If you change a const value in Assembly A, and Assembly B references the const value, the value will not change in Assembly B if you don't recompile it. You can avoid that by using readonly fields instead of const.
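A small sketch of that pitfall (the names are illustrative):

// Assembly A
public static class Limits
{
    public const int MaxItems = 100;          // baked into referencing assemblies at compile time
    public static readonly int MaxUsers = 50; // read from Assembly A at run time
}

// Assembly B, which references Assembly A
public class Consumer
{
    // If Assembly A later changes MaxItems to 200 and only A is recompiled,
    // this still returns 100 until Assembly B is recompiled as well.
    public int GetMaxItems() { return Limits.MaxItems; }

    // readonly fields are resolved at run time, so this always reflects
    // the value in the deployed copy of Assembly A.
    public int GetMaxUsers() { return Limits.MaxUsers; }
}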
I don't know why DLL references would save you time; the only added expense of project references is resolving the dependency tree, which is something you definitely want it to do.
Otherwise, if you aren't careful, you will end up with a project that breaks every time you rebuild and requires a couple of normal builds to start working again.

How to custom-resolve an assembly version conflict

In our .NET application we have project A, which references an infrastructure project; let's call it Infra. Project A also references project B, which resides in a different solution and references another version of Infra. The top-level application, project C, references project A.
While this compiles just fine, it has a side effect: Infra.dll is not copied to project C's bin folder.
I increased the verbosity of the build output to see what exactly the problem was, and saw this:
There was a conflict between "Infra, Version=1.1.14.9..." and "Infra, Version=1.0.0.0 ...". "Infra, Version=1.1.14.9" was chosen because it had a higher version.
That actually makes a lot of sense: with two conflicting references to the same assembly, the build should pick the higher version. The problem is that project B's reference to Infra points to a path that is not accessible, so the file is not copied.
I could surely solve this issue by adding a reference from project C directly to Infra, but I would prefer not to. I was wondering whether I could tell the compiler explicitly which version to use.
Any ideas?
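One hedged sketch of a run-time workaround (the assembly name comes from the question; the path and everything else is an assumption): if you make sure one copy of Infra.dll ends up next to the executable, you can handle AppDomain.AssemblyResolve so that any requested Infra version resolves to that copy. A binding redirect in the application's configuration file achieves something similar declaratively.

using System;
using System.IO;
using System.Reflection;

static class InfraRedirect
{
    // Call this once, early in application startup, before any type from Infra is used.
    public static void Install()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            // Only intervene for requests for "Infra", whatever version was asked for.
            var requested = new AssemblyName(args.Name);
            if (requested.Name != "Infra")
                return null;

            // Load the Infra.dll we actually ship next to the executable.
            var path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Infra.dll");
            return File.Exists(path) ? Assembly.LoadFrom(path) : null;
        };
    }
}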

Unity not finding dll references

We are doing a project that uses interfaces and Unity to resolve concrete implementations of classes.
My question is the following: I need to get all my DLLs into the same folder, otherwise Unity will not be able to resolve the interfaces. As I see it, I have a couple of options:
1. Add the projects with the implementations as references and let VS copy the files to the output folder (for some reason this just feels like a hack)
2. Change the build location of all my projects to build to the same folder
3. Create a post-build event to copy all the files needed to wherever they need to go
I have implemented the second option, but it can lead to files in your build folder that should not be there. I am not a big fan of post-build events, so I would like to ask other people using Unity what they found to be the best solution for them.
Thanks in advance
The first approach sounds like the right one to me. Your project does depend on the implementation libraries; it doesn't express that dependency directly in code, but it requires them, so it seems reasonable to add a reference to them.
This is basically the same situation as where you've got three projects, where project A depends on project B, which depends on project C - you need to explicitly add project C as a reference within project A. Visual Studio doesn't work out transitive dependencies for you (at least it didn't the last time I checked).
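For context, a minimal sketch of the kind of registration involved, using the classic Microsoft.Practices.Unity container (the interface and implementation names are made up). The mapping only works if the assembly containing the implementation can actually be loaded at run time, which is why getting the DLLs into the output folder matters:

using Microsoft.Practices.Unity;

public interface ILogger
{
    void Log(string message);
}

// Lives in a separate "implementations" project, i.e. a separate assembly.
public class FileLogger : ILogger
{
    public void Log(string message)
    {
        System.IO.File.AppendAllText("app.log", message + System.Environment.NewLine);
    }
}

public static class Bootstrapper
{
    public static ILogger CreateLogger()
    {
        var container = new UnityContainer();
        container.RegisterType<ILogger, FileLogger>();
        return container.Resolve<ILogger>();
    }
}

Registering the mapping in code like this also needs a compile-time reference to the implementation project, which lines up with the first option being the most natural one.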

Dealing with dependencies in a large project... Is it possible to reliably alter build order without using project dependencies in C#/VS2008?

I've got a legacy project in VS2008 that we're about to start refactoring for better componentization. The references between the 150 projects in the solution are messy, so as a starting point I'm trying to at least get to a point where a few projects use binary references to other projects while others use project references (for build-time reasons).
To illustrate, given projects A, B, and C, I'd like to see...
A references C.dll
B references C.csproj
Now the problem is I need to make sure that C.csproj builds before A.csproj. I know I can control build order using project dependencies, but this appears to cause exactly the behavior I'm trying to avoid... building A always causes C to build. I'm sure I can monkey with the proj or sln files directly to get things to build in the order I want, but I'm also sure that will get overwritten in short order by VS's automatic magic.
Is there some way to control this order reliably, or am I missing something obvious here?
Thanks...
Separate related components (.csproj) into individual solutions. This enforces binary references across package boundaries. It also forces you and other developers to group components by layer.
Then use your build process to build solutions in correct order starting with the least dependent packages.
In my estimation, from an SCM standpoint Solution == UML Package == Merge Module (all solutions create a merge module)
You could write custom MSBuild files instead of relying on the .csproj and .sln files, such that, depending on the target chosen, only certain assemblies are built. It would require learning MSBuild if you don't already know it, though.

What scenarios are possible where the VS C# compiler would not compile a reference of a reference?

I'm probably asking this question wrong (and that may be why Google isn't helping), but here goes:
In Visual Studio I am compiling a C# project (let's call it Project A, the startup project) which has a reference to Project B. Project B has a reference to Project C, so when A gets built, the DLL for B gets placed in A's bin directory, as does the DLL for C (because B requires C, and A requires B). However, I have apparently made some change recently so that the DLL for Project C no longer goes into Project A's bin directory when rebuilding the solution. I have no idea what I've done to make this happen.
I have not modified the setup of the solution itself, and I have only added additional references to the project files. Code-wise, I have commented out most of the actual code in Project B that references classes in Project C, but I did not remove the reference from the project itself (I don't think this matters). I was told that perhaps the C# compiler was optimizing somehow so that it was not building Project C, but really I'm out of ideas. I would think someone has run into something similar before.
Any thoughts? Thanks!
Have you changed your build configuration? In Visual Studio 2008, the default Solution Configurations are Debug and Release, while the default Solution Platform is Any CPU. My experience suggests each Solution Configuration/Platform pair has its own build configuration. In other words, Debug/Any CPU and Release/Any CPU are two independent build configurations, each with their own settings. If you've selected a different configuration, the settings for the original configuration do not automatically apply; you'll need to set the dependencies for all of your configurations, as well as for any new projects you add to your solution, in order to seamlessly switch between them.
You can right click on the solution in the solution explorer, and check your project dependencies.
My guess is that, somehow, you've flagged that B doesn't rely on C.
Alternatively: In the solution properties, make sure that the current Configuration is set to build Project C.
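If the cause is simply that Project B no longer uses any type from Project C (because that code is commented out), the compiler can drop the reference from B's metadata, and the build then sees no reason to copy C.dll through to Project A. A workaround that is sometimes used (hedged; the type name below is made up) is to keep one trivial usage alive in B:

// Somewhere in Project B: referencing a single type from Project C keeps the
// dependency on C real, so C.dll is copied through to Project A's bin folder again.
internal static class KeepProjectCReference
{
    // The typeof() expression is enough; nothing here is ever called.
    private static readonly System.Type Anchor = typeof(ProjectC.SomeClassInC);
}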
