Is there any build-time impact? We have around 30 projects in our .NET solution, and the shared projects are added by project reference. I am thinking of changing to DLL references instead to see if there is any build performance gain.
Does anyone have experience along similar lines? Please share your thoughts.
Yes, there is potentially a huge impact depending on how you have your CI set up.
The solution to this is to group logical sections of your application (data access, presentation, whatever else) into separate solutions and turn them into NuGet packages. I've had a lot of success combining TFS build, Release Management, and NuGet to automate the continuous delivery of NuGet packages from "prerelease" to "stable".
You can have it package up PDB files as well for debugging purposes, and using NuGet also helps with sharing code between different disparate projects. If Project A is using version 1.2.3 of Package X, but you've updated Package X to version 2.0.0 for Project B, Project A can happily keep consuming version 1.2.3.
One thing to keep in mind when doing a split like this:
const values are inlined at compile time: every assembly that consumes them gets the literal value baked in. If you change a const value in Assembly A, and Assembly B references that const, the value will not change in Assembly B until B is recompiled. You can avoid that by using static readonly fields instead of const.
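A minimal sketch of the pitfall; the assembly and member names are made up for illustration:

    // In Assembly A (SharedConstants.dll):
    public static class SharedConstants
    {
        // Inlined as the literal 30 into every assembly compiled against it.
        public const int TimeoutSeconds = 30;

        // Read at run time, so consumers pick up a new value without recompiling.
        public static readonly int RetryCount = 3;
    }

    // In Assembly B, compiled against the version above:
    public static class Consumer
    {
        public static void Print()
        {
            // If Assembly A is rebuilt with TimeoutSeconds = 60 and only its
            // DLL is swapped in, this still prints 30 until B is recompiled.
            System.Console.WriteLine(SharedConstants.TimeoutSeconds);

            // This always reads the value from the currently loaded Assembly A.
            System.Console.WriteLine(SharedConstants.RetryCount);
        }
    }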
I don't know why DLL references would save you time; the only added expense of project references is resolving the dependency tree, which is something you definitely want done.
Otherwise, if you aren't careful, you will end up with a project that breaks every time you rebuild and needs a couple of normal builds before it starts functioning again.
Related
So, I have a BIG project with lots of stuff in it, but when I fix something in the code, for example, VS2015 compiles the whole project again, which takes a lot of time. Can I somehow compile only the file I edited?
EDIT: I have one solution containing one project, and that project contains lots of files.
The build process is smart: it will skip projects that haven't changed and whose dependencies haven't changed either. If you change something inside a library that is used by more or less the entire solution, then there is just no alternative but to rebuild everything that was (potentially) touched.
You could try tuning the dependencies yourself: right-click the solution and select 'Project Dependencies...'. But you can't remove dependencies that are needed or inferred.
Get an SSD as build drive. A fast one.
All the other tips given here hold - the build is smart, so it will only recompile what needs to be recompiled. That is hardly a help, though, if you have a base library that triggers dozens of projects to update.
Compilation is IO-limited, so an SSD helps.
Otherwise it may be time to break up that large solution and generate internal NuGet packages for the base libraries. This decouples recompilation of the base libraries from the actual applications, which is particularly useful if you do maintenance on the base libraries.
Our solution contains lots of C# projects. There are complicated dependency relationships between them, e.g. projects A/B/C, where A depends on B and B depends on C. If I change one file in project C and then rebuild the solution, projects A, B, and C are all rebuilt together.
In C++, the build has two phases, compile and link. If I change one file in project C and then build the solution, only the affected files in A and B are compiled (the other files are not; their .obj files are reused in the link phase), and then the link runs.
In Java, only the changed file in project C is recompiled; the other files' output is kept and then packaged into the .jar. It reuses the previous work output (the unchanged files' .class files).
In a word, C# doesn't seem to reuse any previous work output. It has no intermediate file like Java's .class or C++'s .obj. So on this point, I feel C# doesn't do an incremental build: some little change causes a big build. I don't understand why C# doesn't use previous work output to accelerate the build process.
I am not sure whether my understanding of the C# compile/build process is right. Could you please explain more? Thanks a lot.
The C# build does incremental compilation; I'm not sure where you got the idea that it doesn't. Maybe, due to the complexity of your solution, you are not reading the dependencies correctly, and projects that you assumed would not need to be recompiled in fact do.
The best way to check the compiler's behavior is to create a simple dummy solution and play around with it:
Setup:
Create an empty Visual Studio C# solution.
Add any two projects, A and B.
Add project B as a reference in project A.
Implement a class FooInB in B and use it in another class, BarInA, in A (a minimal sketch of both classes follows the steps below).
Now let's play around a bit with this setup:
Compile the solution. You will see that both projects compile.
Compile the solution again. You will see that none of the projects compile, both are up to date.
Change the implementation in BarInA and compile again. You will see that only one project compiles: A. There is no need to compile B again, as there are no changes.
Change the implementation in FooInB and compile one last time. You will see that both projects compile. This behaviour is correct: A depends on B, so any change in B necessarily requires A to recompile to make sure it is pointing to the latest version of B. A theoretical world where the C# compiler could detect that the changes in B have no consequences in A, and could therefore 'optimize away' building A again, would be a nightmare scenario where each project could be referencing different, outdated assembly versions.
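For reference, a minimal version of the two classes could look like this (the class names come from the steps above; the members are invented for illustration):

    // In project B (class library):
    namespace B
    {
        public class FooInB
        {
            public string Greet() => "Hello from B";
        }
    }

    // In project A (which has a reference to project B):
    namespace A
    {
        public class BarInA
        {
            // Editing this method rebuilds only A; editing FooInB rebuilds B, then A.
            public string Greet() => new B.FooInB().Greet();
        }
    }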
That said, I'd like to point out that, AFAIK, the C# build will only perform incremental compilation at the project level. I am not aware of any incremental compilation optimizations at the class level inside any given assembly. Someone with much more insight into the inner workings of the compiler might be able to clarify this behavior.
You are kind of right.
If project A depends on project B, a change in project B does make a recompile of project A necessary.
If project B depends on project C, a change in project B does not make project C recompile.
In Java, each class/file is compiled to a single .class file. Therefore the whole project does not need to be recompiled when a single file changes.
Before distribution these .class files are combined into a .jar. If a single .class file changes, the whole .jar also needs to be reassembled.
In .NET you compile a project directly to an assembly (.dll or .exe), so a single file change requires a recompile of the whole assembly.
The same behavior is seen in Java environments where the .jar (or artifacts like an .apk) is used to run/debug the application: the whole .jar is also needed during development, and the whole project is recompiled when a single file changes.
You can always place projects outside of your solution and only add a reference to the .dll they produce. This will minimize compile times; a sketch of such a reference is below.
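For example, the consuming project's .csproj would carry a plain file reference instead of a project reference (the library name and path here are hypothetical):

    <!-- A file reference to the prebuilt DLL; the library is never
         rebuilt as part of this solution. -->
    <ItemGroup>
      <Reference Include="MyCompany.BaseLibrary">
        <HintPath>..\ExternalLibs\MyCompany.BaseLibrary.dll</HintPath>
        <Private>true</Private> <!-- Copy Local: copy it to the output folder -->
      </Reference>
    </ItemGroup>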
OK, you are actually on the right track. If there are 3 projects in the solution, you can add a reference from one project to another, so that project A depends on project B, which depends on project C.
When you build a project, all the built files and DLLs (including those from the post-build events) end up in the bin folder of that project.
So when you build project A, project C builds first (because A -> B -> C). Project B uses the built components of project C and creates its own components, and project A uses the components of B and C to create its own.
Because of this, if you build only project A and the referencing is correct, you will see all the build output of B and C in the bin folder of project A.
I have a solution with many projects and even more cross-project references.
For simplicity, let's say I have projects A, B, and C.
A is my main desktop exe, which references B and C.
B contains Windows Forms and references C.
C contains business and database logic.
My main problem is: B takes about forty-five seconds to compile. Now let's say I change only a small line of code in C; the compiler will compile C, after that B, and then A.
Since I mostly change code in C, which compiles very fast, I always have to wait for B to compile.
I discovered .NET Demon from Redgate. This is a tool that changes the build process and will only rebuild B if the public API of C changes. This can have a massive impact on your build times: based on my personal experience, it saved about 80% of the time spent building over a day.
However, I just looked and they aren't selling it anymore: https://www.red-gate.com/products/dotnet-development/dotnet-demon/
Visual Studio 2015 will introduce Microsoft's new Roslyn compiler, with improvements which we believe make .NET Demon redundant.
I am still on Visual Studio 2013, but I think my initial observations of the build process, and the other answers about the .NET build, may no longer be true for Visual Studio 2015 with Roslyn. Maybe it behaves more like the .NET Demon build.
On a side note: my newer projects use IoC containers; everything is wired up in the main project without many cross-project references. This also improves the build process, since almost everything can be built in parallel.
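A rough sketch of that layout (the container shown, Microsoft.Extensions.DependencyInjection, and all type names are just examples; any IoC container works the same way):

    using Microsoft.Extensions.DependencyInjection;

    // Defined in a small contracts project that all other projects reference:
    public interface IOrderRepository { void Save(string order); }

    // Defined in a data-access project that references only the contracts project:
    public class SqlOrderRepository : IOrderRepository
    {
        public void Save(string order) { /* persist the order */ }
    }

    // Composition root in the main (exe) project - the only project that
    // references everything and wires implementations to interfaces.
    public static class Program
    {
        public static void Main()
        {
            var services = new ServiceCollection();
            services.AddSingleton<IOrderRepository, SqlOrderRepository>();
            using (var provider = services.BuildServiceProvider())
            {
                provider.GetRequiredService<IOrderRepository>().Save("demo");
            }
        }
    }

Because the library projects reference only the contracts project and not each other, MSBuild can compile them in parallel.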
All versions of C# (MSBuild really) support incremental compilation of non-dependent projects.
C# in VS2003 supported incremental compilation within projects. There is a UserVoice suggestion to bring the feature forward into the latest C# version.
I've tested with the Jenkins MSBuild plugin; the incremental build worked as expected, and only the changed projects were recompiled.
What I want is an incremental deploy: deploy only the changed/recompiled DLLs. Now, how can I efficiently find the recompiled DLLs?
So the problem is quite simple: My project references assembly X but not Z. But assembly X does reference assembly Z. Assembly Z updates somewhat frequently, so whenever I build my project, I'd like to get the latest version of Z as well.
So far I've come up with 3 options:
Reference assembly Z directly. This has the advantage of always getting the new version, but it does pollute the references with something that isn't strictly required in there.
Add a post-build event that copies the required DLL(s) from where they are updated (a sketch of such an event follows this list). I think this is quite OK, until I need multiple different DLLs, which would make the script quite long and tedious to maintain.
Add the assembly Z as an item in the project and set 'Copy to Output Directory' to true. This one I would probably prefer, except that when I add the DLL to the project, Visual Studio actually copies the then-current version into the project, with no link to the original source. So when the assembly is updated, this is not reflected in my project in any way. Unless I combine this approach with option 2, but then I might as well just use option 2 alone.
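For option 2, the post-build event might look something like this in the consuming project's .csproj (the source path is hypothetical):

    <!-- Copies the indirectly required DLL(s) into this project's output
         folder after every build. -->
    <PropertyGroup>
      <PostBuildEvent>
        xcopy /Y "$(SolutionDir)Libs\Z.dll" "$(TargetDir)"
      </PostBuildEvent>
    </PropertyGroup>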
So, am I missing something, or are these my only options?
I would go with option 1. I think it's entirely reasonable for your project to reference everything it depends on, even if those dependencies can sometimes be indirect.
It also seems like the simplest option to me - and one which fits in with Visual Studio's view of the code dependencies your app requires... so anything Visual Studio does with those dependencies should just flow naturally, rather than you having to think about this at every stage.
EDIT: As an alternative option, have you considered using NuGet? That way you'd only express a dependency on X within your project's NuGet dependencies, but it would "know" that it depended on Z. I believe it should all just work... You should be able to do this even if these are internal projects, as you can set your own NuGet source rather than the public repository.
In a Visual Studio C# solution, are there any disadvantages for all projects to share the same output path? I'd like to do this because we use dependency injection and the files don't get copied automatically (since they are not referenced). Will this cause me any problems?
(This is related to: C# - Copy dlls to the exe output directory when using dependency injection with no references?)
We are doing this on our current project. We have about 30 projects output to the same bin folder and we have not had any problems.
There's a potential problem if you have two different projects depending on different versions of the same assembly. If I have project A depending on X.dll version 1, and project B depending on X.dll version 2, you're not going to be able to place both versions of X.dll into the same output folder (without a rename). Admittedly, the chances of this aren't high, but they're not zero.
The point of having multiple projects is to produce multiple assemblies, and assemblies are a deployment mechanism in .NET. If you always plan to bundle all the output DLLs into one package, then I do not see disadvantages in this approach.
However, if you are planning to deploy assemblies A, B, and C separately from assemblies D, E, and F for one reason or another, keeping the output directories separate will ensure that the correct assembly, and only its dependencies, are in each output folder. It will then be easier to write a script that bundles those assemblies into the packaging you want.
Generally not a disadvantage; I find it quite beneficial to have a single output location for my target assemblies. When VS sorts the projects for compilation by determining dependencies, it will overwrite any existing instances of compiled assemblies that were built earlier in the build order.
It saves me having to hunt around for the compiled assemblies when everything is built.
One instance where you may have issues: if you have two projects with references to external assemblies of different versions, you may end up with an assembly bundled with the incorrect version of a dependency.
You may run into issues with TeamBuild or whatever automated build process you are using if you are planning on deploying your assemblies in different ways (core package vs. add-ons, etc.).
How you customize your build scripts will be different (not necessarily disadvantageous) depending on how much customization you've done to your csproj files.
One big disadvantage of this approach is that you will often run into file-access violations during the build if your startup project holds references to any DLLs that other projects also reference, even with Copy Local = false. I run into this issue every day now. I would suggest having a project with all the needed references and Copy Local = true for those references. This will also help with making an installer.
I've got a legacy project in VS2008 that we're about to start refactoring for better componentization. The references between the 150 projects in the solution are messy, so as a starting point, I'm trying to at least get to a point where I can have a few projects use binary references to other projects while others use project references (for build-time reasons).
To illustrate, given projects A, B, and C, I'd like to see...
A references C.dll
B references C.csproj
Now the problem is I need to make sure that C.csproj builds before A.csproj. I know I can control the build order using project dependencies, but this appears to cause exactly the behavior I'm trying to avoid: building A always causes C to build. I'm sure I could monkey with the .csproj or .sln files directly to get things to build in the order I want, but I'm also sure that would get overwritten in short order by VS's automatic magic.
Is there some way to control this order reliably, or am I missing something obvious here?
Thanks...
Separate related components (.csproj) into individual solutions. This enforces binary references across package boundaries. It also forces you and other developers to group components by layer.
Then use your build process to build solutions in correct order starting with the least dependent packages.
In my estimation, from an SCM standpoint Solution == UML Package == Merge Module (all solutions create a merge module)
You could write custom MSBuild files instead of relying on the .csproj and .sln files, such that, depending on the target chosen, only certain assemblies are built. It would require learning MSBuild if you don't know it already, though. A rough sketch is below.
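A minimal sketch of such a custom build file, with made-up paths and target names, and assuming A consumes the already-built C.dll as a binary reference:

    <!-- build.proj: invoke as, e.g., msbuild build.proj /t:BuildA -->
    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
             DefaultTargets="BuildA">

      <Target Name="BuildC">
        <MSBuild Projects="C\C.csproj" Targets="Build" />
      </Target>

      <!-- A references C.dll as a binary, so building A does not rebuild C. -->
      <Target Name="BuildA">
        <MSBuild Projects="A\A.csproj" Targets="Build" />
      </Target>

      <!-- B has a project reference to C, so make sure C builds first. -->
      <Target Name="BuildB" DependsOnTargets="BuildC">
        <MSBuild Projects="B\B.csproj" Targets="Build" />
      </Target>
    </Project>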