How to embed EDMX in a Code First assembly?

We're using Code First in EF 6.1. Our model now has over 300 tables and startup time is ridiculous. We've already tried pre-generating views, but it didn't help much; it's the model compilation in the Code First pipeline that takes most of the time.
We're going to try using the Database/Model First approach for initializing the context, by using an entity connection string with metadata links to CSDL, SSDL and MSL files instead of a direct SQL connection. This would be our ideal process:
After the project containing the Code First model is compiled, a post-build task runs that generates an EDMX file from our DbContext, splits it into its component CSDL, SSDL and MSL files, and embeds these files in the assembly as resources.
When we create a context via our own factory, we wrap the original SQL connection string in an EntityConnectionStringBuilder, with the Metadata property pointing to the embedded resources, and use the builder's connection string to initialize the DbContext.
Initial testing shows an ~80% improvement in startup time - the tricky part here is doing the post-build resource embedding in step 1!
Can anyone provide any clues as to how step 1 could be done in MSBuild? Is there an alternative strategy that would work as well? Basically we want a zero-maintenance solution so that developers don't have to manually do anything other than build their code, with no special deployment considerations either.
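For what it's worth, the generation half of step 1 (the part the post-build task would actually run) boils down to something like the sketch below. It assumes EF6's EdmxWriter API and matches the EDMX sections by element name; the output file names are just illustrative.
```csharp
// Sketch only: dump the Code First model as EDMX, then split it into the
// CSDL / SSDL / MSL parts that the entity connection string will point at.
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.IO;
using System.Linq;
using System.Xml;
using System.Xml.Linq;

static class EdmxSplitter
{
    public static void Split(DbContext context, string outputDirectory)
    {
        string edmxText;
        using (var stringWriter = new StringWriter())
        using (XmlWriter xmlWriter = XmlWriter.Create(stringWriter))
        {
            // EF6: writes the runtime model of a Code First context as EDMX.
            EdmxWriter.WriteEdmx(context, xmlWriter);
            xmlWriter.Flush();
            edmxText = stringWriter.ToString();
        }

        XDocument edmx = XDocument.Parse(edmxText);

        // The EDMX wraps the three metadata sections; matching by local name
        // avoids hard-coding the schema namespaces here.
        SaveSection(edmx, "ConceptualModels", Path.Combine(outputDirectory, "Model.csdl"));
        SaveSection(edmx, "StorageModels",    Path.Combine(outputDirectory, "Model.ssdl"));
        SaveSection(edmx, "Mappings",         Path.Combine(outputDirectory, "Model.msl"));
    }

    static void SaveSection(XDocument edmx, string sectionName, string path)
    {
        // Each section holds exactly one child: the Schema or Mapping element.
        XElement section = edmx.Descendants()
                               .First(e => e.Name.LocalName == sectionName);
        section.Elements().Single().Save(path);
    }
}
```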
EDIT:
We ended up using a new, separate class library project that references the project containing the Code First model. This project contains a T4 template that writes the EDMX from the DbContext into memory, then saves the component parts into project files that are already marked as embedded resources, so we get source control as well.
The build order guarantees that the resources will always be up to date, and the entity connection string references this resource assembly at runtime. The MSBuild integration was done by using the T4 MSBuild integration targets so that the template always runs during a build of the project.
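At runtime, the factory side is then just a matter of pointing an entity connection string at the embedded resources. A minimal sketch, where "MyCompany.Model.Resources" and MyContext are placeholders for the resource assembly and our DbContext type:
```csharp
using System.Data.Entity;
using System.Data.Entity.Core.EntityClient;

// Placeholder context type with the (string) constructor the factory needs.
public class MyContext : DbContext
{
    public MyContext(string nameOrConnectionString) : base(nameOrConnectionString) { }
    // DbSet<...> properties for your entities go here.
}

static class ContextFactory
{
    public static MyContext Create(string sqlConnectionString)
    {
        var builder = new EntityConnectionStringBuilder
        {
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = sqlConnectionString,
            Metadata = "res://MyCompany.Model.Resources/Model.csdl|" +
                       "res://MyCompany.Model.Resources/Model.ssdl|" +
                       "res://MyCompany.Model.Resources/Model.msl"
        };

        // Handing EF an entity connection string means the metadata is loaded
        // from the resources instead of being rebuilt by the Code First pipeline.
        return new MyContext(builder.ConnectionString);
    }
}
```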

You should certainly be able to do this with MSBuild. You will have to pick up a little bit of build scripting, but it shouldn't be too bad.
How are you doing it now? Do you have a console application that you run to generate the EDMX? If so, it sounds like you have already done the hard part; integrating with MSBuild should be easy. I will assume that you do, and go from there.
By the way: one thing to know up front is that your .csproj files are MSBuild scripts, so any custom MSBuild scripting can go into those .csproj files.
In order of increasing complexity, you could:
Add an "After build" event to your project that executes your console app. This option does not require any MSBuild script; you just set up the post-build event in the project options. It will always run, though (I don't think you can make a post-build event dependent on configuration), so it could slow down your compile times.
You could use the Exec task in MSBuild to execute your console application. This will require a little editing of your csproj file, but you can make it conditional if you need to. Here's a link to the Exec task: http://msdn.microsoft.com/en-us/library/x8zx72cd.aspx If you put it in a target named "AfterBuild", it will automatically execute after your build has completed.
You could write your own build task. This is a C# class that will be loaded and executed during the build. It is the most sophisticated way to do it, but it also gives you the most control: http://blogs.msdn.com/b/msbuild/archive/2006/01/21/515834.aspx
One of the nice things about the last option (custom build tasks) is that you can write error messages back into the build process. That should help you get useful information if the task fails, and if you use a build server, those messages should be picked up by it in the same manner as any other build message.
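A skeleton for that last option might look like the sketch below. The task and property names are made up, and the actual EDMX generation is left as a stub; the point is the logging plumbing:
```csharp
// Minimal custom MSBuild task skeleton. Reference Microsoft.Build.Framework
// and Microsoft.Build.Utilities (the version matching your toolset).
using System;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

public class GenerateEdmxResources : Task
{
    // Set from the .csproj / .targets file that invokes the task.
    [Required]
    public string ModelAssemblyPath { get; set; }

    [Required]
    public string OutputDirectory { get; set; }

    public override bool Execute()
    {
        try
        {
            Log.LogMessage(MessageImportance.Normal,
                "Generating EDMX metadata from {0}", ModelAssemblyPath);

            // ... your EDMX generation / splitting logic goes here ...

            return true;
        }
        catch (Exception ex)
        {
            // Errors logged here surface like any other build error,
            // including on a build server.
            Log.LogErrorFromException(ex, true);
            return false;
        }
    }
}
```
You'd register the task with a UsingTask element and call it from an AfterBuild target in the .csproj.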

Related

Post build events in Script#

I'm trying to create post build events to copy the final .js and .debug.js files for my script# projects into the proper directories. I can't use the regular output folder, since I have more than one project that references another project, and that always results in a build error (Unable to copy referenced script because it is being used by another process).
The problem is that the C# compiler appears to run the post build events BEFORE it writes the actual .js files, so they don't exist when the post build event happens.
Is there any other solution to make this work?
You can set up a DeploymentPath property in your csproj and the generated scripts will be copied there.
All of the logic is in here: https://github.com/nikhilk/scriptsharp/blob/cc/src/Core/Build/Tasks/ScriptCompilerTask.cs ... so another option is to customize the build task to your exact requirements.
The latest work in the GitHub repo also runs the Script# part of the build during the build step of an MSBuild project, so that should free up the post-build step for you to do what you'd like with the generated scripts. See https://github.com/nikhilk/scriptsharp/blob/cc/src/Core/Build/ScriptSharp.targets. Again, it's just MSBuild stuff, so you could also customize the .targets file to your liking if it doesn't fit your needs.
I got around this by adding the "copy" command as a pre-build step on the projects that were using the Script# project output, then adding a dependency so that the Script# project would be built first.

Bringing a C# application under assembly versioning and using it to create and manage patches

We have a C# desktop application which we run for clients on various servers, on a software-as-a-service model. We are still on .NET Framework 2.0.
The software has an architecture in which an independent application catches external data sent by a server, another application performs calculations based on it, and a third application shows the output to the client. The link between the three applications is a fourth application which communicates with the database.
The four solutions are in SVN for source control, but release management is still manual: patches are made by hand, by checking the log and including the DLLs, PDBs, XML files, etc. for the projects whose code has changed.
There is no assembly versioning implemented, and patch and release management is done in the dark.
I want to know the industry practice for generating patches automatically from the code. I also want a patch for each revision in SVN. Is assembly versioning helpful for this?
I have read a lot about continuous integration, but it doesn't work for us because we do not have unit tests or other checks to monitor the correctness of the code.
The only thing I would be interested in at this time is a way to make patches which can be applied and removed easily. I also want an automated way to determine which release an installation is at (or which patches have been applied), rather than maintaining a log manually.
We use a build script which creates a SvnVersion.cs file containing the last committed revision. This file is placed in the root of the solution and then added to all projects in the solution (added as a link, not copied).
The template for the file (SvnVersion.Template.cs) looks like this:
using System.Reflection;
[assembly: AssemblyVersion("1.0.0.$WCREV$")]
[assembly: AssemblyFileVersion("1.0.0.$WCREV$")]
And we simply use TortoiseSVN's SubWCRev tool to fill in these placeholders from a batch script (TRUNKPATH is expected to end with a trailing backslash):
type "%TRUNKPATH%SvnVersion.Template.cs" > "%TRUNKPATH%\SvnVersion.tmp"
SubWcRev "%TRUNKPATH%\" "%TRUNKPATH%SvnVersion.tmp" "%TRUNKPATH%SvnVersion.cs" -f
IF ERRORLEVEL 1 GOTO ERROR
DEL "%TRUNKPATH%SvnVersion.tmp"
If you don't use TortoiseSVN, there are other ways to get this info in the file.
You will also need to remove this same information from your AssemblyInfo.cs files or you'll get a compile error. Also, to speed up Debug builds, this is only executed in Release builds (and in Debug builds only if the file doesn't already exist, e.g. after a fresh checkout).
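To cover the "which release is at which level" part of the question: once the revision is baked into the version like this, any of the applications can report it at runtime, so you don't need a manual log. A minimal sketch (plain .NET 2.0 code, nothing project-specific is assumed):
```csharp
using System;
using System.Diagnostics;
using System.Reflection;

static class ReleaseInfo
{
    // Reads the version stamped in at build time; the last part of the
    // version carries the SVN revision from the $WCREV$ placeholder above.
    public static void Report()
    {
        Assembly asm = Assembly.GetExecutingAssembly();
        Version version = asm.GetName().Version;
        FileVersionInfo fileInfo = FileVersionInfo.GetVersionInfo(asm.Location);

        Console.WriteLine("AssemblyVersion:     {0}", version);
        Console.WriteLine("AssemblyFileVersion: {0}", fileInfo.FileVersion);
        Console.WriteLine("SVN revision:        {0}", version.Revision);
    }
}
```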

What's a good approach for white labeling DLLs?

What's a good approach for white labeling DLLs and EXEs with Visual Studio?
In essence, we want the names of the DLLs and EXE to change based on the client we are packaging the solution for, e.g.:
Instead of myCompany.exe and myCompany.db.dll, I would like yourCompany.exe and yourCompany.db.dll, or acme.exe and acme.db.dll, etc.
Edit:
Currently we are using a straight Visual Studio build process with a WiX project to create an MSI.
If the only justification for rebuilding it is to change the name, can you just use something generic in the first place? Imagine having to patch 50 identical DLLs, and building/deploying each one separately because they all must be named differently. Even if it's only for a few clients, I would hate to have to maintain that. Versioning could be a hassle too.
If you must do it, I would probably go with a build task (which can perform fairly advanced operations). You mention that you are "packaging the solution"; the viability of a build task would depend on how it is being packaged.
In response to your comment about naming the EXEs with client-specific names... My obvious suggestion there would be to have those applications contain as little code as possible.
The simplest build integration I can think of would be to create a post-build task which ran upon successful compilation in release mode. The task could then read a config file which defined the unique names, and copy the successfully built EXEs to an output directory.
Some of the operations can be accomplished just from the task config file: http://msdn.microsoft.com/en-us/library/ms171466.
Alternatively, you might want to create a little application to do all the work for you, and just pass config switches to it.
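As a rough sketch of that config-driven idea (every file name here is made up; adapt it to however your WiX packaging lays things out), the little application could be as simple as:
```csharp
// Hypothetical post-build helper: reads a list of client names and copies the
// freshly built binaries under client-specific names. All names and paths
// (clients.txt, myCompany.exe, myCompany.db.dll) are stand-ins for your own.
using System;
using System.IO;

class WhiteLabelCopy
{
    static void Main(string[] args)
    {
        string clientListFile = args[0];   // e.g. clients.txt, one client name per line
        string buildDir = args[1];         // e.g. $(TargetDir) from the post-build event
        string outputRoot = args[2];       // where the branded copies should land

        foreach (string client in File.ReadAllLines(clientListFile))
        {
            if (client.Trim().Length == 0) continue;

            string clientDir = Path.Combine(outputRoot, client);
            Directory.CreateDirectory(clientDir);

            // myCompany.exe -> <client>.exe, myCompany.db.dll -> <client>.db.dll
            File.Copy(Path.Combine(buildDir, "myCompany.exe"),
                      Path.Combine(clientDir, client + ".exe"), true);
            File.Copy(Path.Combine(buildDir, "myCompany.db.dll"),
                      Path.Combine(clientDir, client + ".db.dll"), true);
        }
    }
}
```
Note that this only renames the files on disk; the internal assembly name is unchanged, which matters for the dependency concerns mentioned in the other answer.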
For example, here is a little post-build command that I execute to minify my JavaScript/CSS upon successful build of a web application. The concept is similar:
build
execute an app (like msbuild.exe, or your custom build app)
pass data to the executable (like paths, switches, etc.)
executable writes the files out
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe
"$(ProjectDir)Properties\build\minify.xml"
/p:SourceLocation="$(ProjectDir)client"
/p:CssOutputFile="$(ProjectDir)client\final\final-full.css"
/p:JavaScriptOutputDirectory="$(ProjectDir)client\final"
You could use ILMerge in whatever post-build process you want on all your outputted assemblies (dll and exe), to create one-off customer-branded builds.
ilmerge /out:CustomerName.exe internalName.exe internalName.dll
I don't know that there is a good way to do this without actually building the project as XYZ company. You could try something like this which will give you the desired result BUT it will change the physical name of the assembly as well which may cause dependency problems.

Visual Studio: long wait before starting to build

We have a moderately sized solution with about 20 projects. One of them contains my business entities. When compiling any project, Visual Studio hangs for about one and a half minutes on this BusinessEntities project.
I tried our solution in SharpDevelop and it compiles the complete solution in 18 seconds. Timing with MSBuild is similar.
My guess is that VS is trying to find out whether the project needs a compile, but this check is about 15 times slower than actually performing the compile!
I can't switch to the (great) SharpDevelop, though; it lacks some small but essential features for our debugging scenarios.
Can I prevent VS from checking this project and have it compile the projects without such a check, just like SharpDevelop does?
I already know about unchecking projects in Configuration Manager to prevent building some of them, but my developers will forget they need to compile this project after updating to the latest sources, and then they'll face problems that seem strange to them.
Edit: Interesting results from an investigation: the delay happens with only one of the projects. In Configuration Manager I unchecked all projects, then compiled each of them individually. Every project compiles in a few seconds! The point is this: if that particular project is built directly, it compiles in a few seconds; if it is built (or skipped because it is up to date) as a result of building another project that depends on it, VS hangs for about a minute and a half and then decides to compile (or skip) it. My conclusion: Visual Studio is checking whether any files have changed, but for some reason this check is extremely inefficient for this particular project.
I'd go to Tools -> Options -> Projects and Solutions -> Build and Run and then change the "MSBuild project build [output|build log] verbosity" to Diagnostic. At that level it will include timings which should help you track down the issue.
We had the same problem with an ASP.NET MVC web project running in Visual Studio 2013. We would build the project, nothing would happen for about a minute, and then the output window would show that it was compiling.
Here's what fixed it... open the .csproj file in a text editor and set MvcBuildViews to false:
<MvcBuildViews>false</MvcBuildViews>
I had to use Sysinternals Process Monitor to figure this out, but it's clearly the cause in my situation. The site compiles in less than 5 seconds now, where it previously took over a minute. During that minute, the ASP.NET compilation process was putting files and directories into the Temporary ASP.NET Files folder.
Warning: if you set this, you'll no longer precompile your views, so you will lose the ability to see syntax errors in your views at build time.
There is the possibility that you are suffering from VS inspecting other freshly built assemblies for the benefit of the currently compiling project.
When an assembly is built, VS will inspect the references of the target assembly, which, if they are freshly built or new versions, may include actually loading them into a .NET domain, which bears all the burdens of loading an assembly as though you were going to run it. The build can get progressively slower as it rebuilds more and more projects: when one assembly becomes newer, the others do a lot more work. This is one possible explanation for why building by itself, versus already built, versus building clean, all give seemingly different results. It's really that the others changed, not about the one being compiled.
VS will 'mark down' the last 'internal' build number of the referenced assembly and look to see if the referenced assembly actually changed as it rolls through its build process. If it's not different, a ton of work gets skipped. And yes, there are internal assembly build numbers that you don't control. This is probably not in any way due to the actual C# compiler or its work, or anything post-compile, but to pre-compile steps necessary for the most general cases.
There are several reference-oriented settings you can play with, and depending on your dev, test, or deployment needs, the functional differences may be irrelevant, yet they can profoundly impact how VS behaves and how long the build takes.
Go to the references of one of the projects in Solution Explorer:
1) click on a reference
2) open the Properties pane if it's not already open (not the Property Pages or the Property Manager)
3) look at 'Copy Local', 'Embed Interop Types', 'Reference Output Assembly'; those may be very applicable and are probably good to know about regardless. I strongly suggest looking up what they do on MSDN. 'Reference Output Assembly' may or may not show in the list.
4) unload the project and edit the .csproj file in VS as text. Look for the assembly reference in the XML and look for 'Private'. This determines whether the referenced assembly is to be treated as a private assembly from the referencing assembly's perspective, versus a shared one. Which is a wordy way of saying: will that assembly be deployed as a unit together with the other assemblies? This is very important for unburdening things. Background: http://msdn.microsoft.com/en-us/magazine/cc164080.aspx
So the basic idea here is that you want to configure all of these to be the least expensive, both during build and after deployment. If you are building the projects together then, for example, you probably really don't need 'Copy Local'. I'd hate to say more about how you should configure them without knowing more about your needs, but it's well worth reading a few good paragraphs about each. This gets very tricky, however, because you also influence whether VS will resolve against a stale copy before the referenced assembly is rebuilt. As a further example of why it's good to read about these: Copy Local can use the local copy even though it's stale, so having it set can be doubly bad. Just remember that the goal at the moment is to lower the burden of VS loading newly built assemblies just to compile the others.
Lastly, for now, I can easily say that hanging for only 1.5 minutes is getting off very lucky. There are people with much, much worse build times due to things like this ;)
Some troubleshooting ideas that have not been mentioned:
Clean solution?
Delete the obj and bin folders plus the .suo file? FYI, neither Clean nor Rebuild will delete non-build files, e.g. files copied during a pre-build command.
Turn off VS scanning of outside files: Tools > Options > Environment > Documents > "Detect when file is changed outside the environment"?
Roll back through the SVN history to confirm when the problem started to occur. What changed? If the day-1 project file takes the same time, recreate the project, add all the files and build.
Otherwise, could you please run Process Monitor and let us know what Visual Studio is doing in the pre-build stage?
Sounds silly, but remove all breakpoints first. It sped up my pre-build checks massively - still don't know why though.
Based on the (limited) information provided one possibility is that there could be a pre-build action specified in the project file that is slow to compile.
Try disabling the platform verification task as described here.
If your individual projects compile fine on their own, then all you can do is change the order of compilation by setting project dependencies explicitly in the configuration.
Try to visualize your project dependency hierarchy and set the dependencies accordingly. For example, if your business entities project is referenced by every other project, then in the configuration of each project it must be selected as a dependency.
When an explicit build order is not set, Visual Studio analyzes the projects to work out a build order itself. Setting explicit project dependencies will make Visual Studio skip this step and use the order you provide.
With such an extreme delay on a single project, and no other avenue seeming to provide a reason, I would attempt to build that specific project while running Procmon from Sysinternals and filter out all the success messages. You could probably also narrow it down to just the file system actions. From your description I might guess that the files are being locked by an external source, like the event collection or workflow management process services.
Another thing to consider is whether this is a totally clean build machine or whether it has also been used to test the builds. If so, is there a chance that someone mapped an IIS application path directly to the project or registered it as a service location?
If you run Procmon and see no obvious locks or conflicts, I would create a totally new solution and project and copy the files over to see if that project also has the same delay. If it does, I would create a sample project of the same type but with generic data (essentially empty) and see whether that too is slow. If the new project with the same files builds fine, you can then diff the directories to see what variance causes the problem (perhaps a config or project setting).
For me, thoroughly disabling code analyzers helped per instructions here:
https://learn.microsoft.com/en-us/visualstudio/code-quality/disable-code-analysis?view=vs-2019#net-framework-projects.
I thought my code analyzers were already off, but adding the extra XML helped.
Thanks to Kaleb for the suggestion to set "MSBuild project build [output|build log] verbosity" to Diagnostic. The first message took more than 10 seconds to display:
Property reassignment: $(Features)=";flow-analysis;flow-analysis" (previous value: ";flow-analysis") at C:\myProjectDirectory\packages\Microsoft.NetFramework.Analyzers.2.9.3\build\Microsoft.NetFramework.Analyzers.props (32,5)
Which led me to the code analyzers.
Just in case someone else trips into this issue:
In my case the delay was being caused by an invalid path entry in "additional include directories" that referred to an inaccessible UNC location.
Once this was corrected, the delay disappeared.

Custom build step for C# project

In C++ projects there is the possibility to set a custom build step for individual files. Is there similar functionality in C# projects? I couldn't really find anything.
One idea would be to create a second project (makefile or C++) and move the files there.
MSBuild should work for you, although it might take some time to figure out how it works. It appears that you can set up a step that runs prior to building each .cs file by separating each .cs file into its own build group.
In an MSBuild script for compiling each .cs file into an EXE, Dino Chiesa comments:
"By using the %(CSFile.identity) scalar, we run this task once for each file. The converse would be @(CSFile.identity). That would run the compile once, for all files, compiling them all together into a single assembly."
Also, these links might help:
Custom build step for C# files
Master Complex Builds with MSBuild
No custom build step for individual files with C# projects. You could probably hack something together with MSBuild...
Look at the BeforeBuild and AfterBuild targets in your csproj file.
I think you are on the right track with your comment about multiple projects. Combine this with the fact that you can include multiple projects within a single Solution and you may have your answer. I use this functionality to build several components at a time and it works quite well.
