We are developing a small framework within the company and there is a small weird issue with pdb files.
While developing the framework, we also commit the pdb & dll outputs, and related projects reference these dlls directly.
But when I build and commit these dlls, my colleagues cannot navigate to the framework's sources. When someone else builds, I cannot navigate to the source.
The only thing I can do is use ReSharper's navigation via "navigate to -> decompiled sources".
Something is wrong, I think. They are the same files, so I should be able to navigate to their sources directly.
Btw, we do not version the framework. All dlls use the same 1.0 version.
Anyone having an idea?
I found the answer. Using DUMPBIN I examined all the pdb files and they contained the full paths from the last build, which are different on my computer.
For example: my colleague built the framework project under d:\projects, but the working directory on my computer was c:\projects, so the pdb paths could not be resolved (which is weird; the paths should be relative IMO).
When one of us changed the framework project path so that we both used the same path, it just worked no matter who built the project last. I can navigate the source code directly in Visual Studio.
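For reference, the path recorded in a binary can be inspected with something like the following (MyFramework.dll is a placeholder name; /PDBPATH asks DUMPBIN to report the pdb path that was embedded at build time):
dumpbin /PDBPATH:VERBOSE MyFramework.dll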
It may be that you need to disable Optimize Code on the release build to make it work. Try that.
Related
I have a solution with roughly 50 projects.
The default behavior of Visual Studio is to build each project into the project subfolders bin/debug and bin/release.
In my company there are guidelines that all projects should end up in a common output directory. The advantage should be that the files are not copied so often.
This is done by defining the <OutputPath> in the .csproj file.
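For illustration, a common output directory is typically set up roughly like this in each .csproj (the relative path is just an example):
<PropertyGroup>
  <OutputPath>..\..\Build\$(Configuration)\</OutputPath>
</PropertyGroup>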
Checking the build output showed me many deletes and copies of the same file, so I would expect that there is no advantage in copy count.
So are there any other benefits to sharing the output directory? Are there any problems that I have not noticed yet?
What is the best practice handling this?
Thanks for your help :D
You can do this simply by specifying a common output folder when building a project/solution, for example via dotnet build's --output switch or by passing the OutDir/OutputPath property to MSBuild. All the dependent projects etc. will be built and copied into one output folder.
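For example (the solution/project names and paths here are placeholders):
msbuild MySolution.sln /p:OutDir=C:\Build\Output\
dotnet build MyProject.csproj --output ./artifacts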
I don't think it's particularly relevant "to reduce the number of files that are copied" - hard disks copy files all day, every day. Outputting into one folder can have a benefit when packaging for release; our Jenkins build server bundles all files into one folder before it puts them in a package (zip) file for Octopus deploy
You need to be careful though, because it's possible for different projects to be dependent on different versions of the same DLL, and by mashing all the files together in one folder, DLLs of the same name (but different version) overwrite each other. You can then end up with a situation where your app doesn't load because it's trying to find XYZ DLL version 1.0 while some other project has a reference to version 1.1, and because that project built later, it's the 1.1 version of the DLL that ends up in the folder. You can then experience runtime errors indicating "The located assembly's manifest definition does not match the assembly reference" - it's saying "I was looking for 1.0, I found a DLL with the right name, but the version of the one I found was 1.1".
This can usually be solved with binding redirects, but it isn't 100% guaranteed that future versions of DLLs are backwards compatible with earlier versions. If you're arranging for all your DLLs to copy into one folder, you're creating a bit of a lottery for yourself as to whether things will break in production when DLL versions go out of sync with what you expect.
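For reference, a binding redirect for the kind of version clash described above would look roughly like this in app.config/web.config (the assembly name and publicKeyToken are placeholders):
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="SomeSharedLibrary" publicKeyToken="32ab4ba45e0a69a1" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-1.1.0.0" newVersion="1.1.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>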
If your company operates a build and deploy process that uses locally copied versions of DLLs, then it might be that they insist you copy all DLLs to one folder to prove that the app will work in production - this is a much better reason (i.e. arrange a way for you to break the dev server so you can see and fix problems caused by DLL versions before pushing to production and hitting the same issues) than "because we don't want the hard disk to get tired copying extra files".
If you insist on doing that, you can:
Right Click Your Project
Choose "Properties" => "Build Events" => "Post-build event command line"
Put xcopy "$(ProjectDir)bin\debug\{yourprojectdll}" "{path to shared folder}" /Y /I /R
I made a WPF program which uses SQLite. Using Visual Studio 2012, it generates both Debug and Release versions of the exe file. When I go to the Debug or Release directory and run my exe file, e.g. MultiStart.exe, it runs normally.
But if I copy MultiStart.exe to my Desktop and try to run it, it fails.
Through several tests, I found that I also need to copy the files MultiStart.exe.config and System.Data.SQLite.dll to my Desktop. Then it can run.
But why? Is there a better solution so that I can make it run without additional files?
Thanks!
Why my WPF program cannot run without Visual Studio?
The question title is not really accurate since it's not really related to Visual Studio. MultiStart.exe is dependent on configuration (MultiStart.exe.config) as well as other assemblies (System.Data.SQLite.dll). Without these dependencies the application cannot run (because that is how .NET works).
WPF doesn't necessarily need a config file to run so the question is what is in your config file that the application needs. It might be possible to move this configuration information into the code (e.g. connection string) and remove the app.config but then the values will be hard coded in the application.
In terms of dependent assemblies, instead of deploying them it is possible to embed them as resources and then use the AppDomain.AssemblyResolve Event to read the assembly from a resource (see Embedding assemblies inside another assembly for an example).
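As a rough sketch (not the exact code from the linked answer), an AssemblyResolve handler that loads an embedded assembly could look something like this; the "MultiStart." resource prefix is an assumption and must match how the dll was actually embedded. Note that this only helps with managed assemblies; native dependencies (such as SQLite's interop dll) generally still have to sit next to the exe.
using System;
using System.IO;
using System.Reflection;

static class EmbeddedAssemblyResolver
{
    // Call once at startup (e.g. at the top of Main or in the App constructor),
    // before any type from the embedded assembly is touched.
    public static void Install()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            // Assumed resource naming scheme: "<default namespace>.<assembly name>.dll",
            // e.g. "MultiStart.System.Data.SQLite.dll" - adjust to match your project.
            var name = new AssemblyName(args.Name).Name + ".dll";
            var resourceName = "MultiStart." + name;

            using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName))
            {
                if (stream == null)
                    return null; // not one of our embedded assemblies, let normal probing continue

                using (var buffer = new MemoryStream())
                {
                    stream.CopyTo(buffer);
                    return Assembly.Load(buffer.ToArray());
                }
            }
        };
    }
}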
Another approach instead of embedding assemblies as resources is to merge them into one assembly. ILMerge is a popular choice for merging assemblies but I read that it can have issues with WPF assemblies (not sure if that applies to you). See Merging dlls into a single .exe with wpf for some other ideas for merging assemblies with WPF.
Note that setting PATH variables does not work because .NET does not use the PATH for resolving assemblies -- see How the Runtime Locates Assemblies for the details.
Another option, instead of copying MultiStart.exe to the desktop, is to use a shortcut on the desktop that links to the appropriate directory. Perhaps that is the simpler solution.
You can also use ILMerge to merge all dependencies into a single .exe file to simplify distribution of your application.
More details on ILMerge can be found here: ILMerge on CodeProject
Example of usage: ilmerge /target:winexe /out:YourDestinationApp.exe YourCurrentProgram.exe System.Data.SQLite.dll
A better solution that I have used with my Windows Forms apps is to copy the entire folder that contains the supporting files, place it wherever you want, and then create a shortcut to your .exe on your desktop. That has always worked for me.
Because you are missing some dependency. You can open your config file and set the dependency... but I wouldn't recommend changing the config file manually.
You can also copy the dependent dll into the System32 folder... but it's only a trick, because the exe first searches for dlls in the current folder and then in System32.
Because you're missing things from your PATH. Visual Studio is probably set to copy DLLs to the target directory on build.
You're almost certainly pulling in external libraries. Some of these are part of .NET, while others are packaged in libraries in specific folders. When you start your exe, it looks in your PATH and the current folder for everything (which includes all DLLs Visual Studio copied).
When you moved the exe to the desktop, suddenly it had no idea where those DLLs are. You haven't specifically added them to your PATH, and they are no longer in the current folder. This is why copying those DLLs to your desktop magically made them work.
Unless you stop using SQLite, there is no way to avoid needing that DLL (there are lots of ways to package/reference it though).
I've got Monoserve and Nginx running perfectly in Ubuntu however I still have to publish the website locally on a Windows box using MSBuild and then copy the files over.
Preferably I'd like to have a Linux CI server that does this instead using XBuild however I can only get it to build the project into .dlls, how do I publish and deploy it with js, css, views, etc?
Typically the "build dlls" part is the hardest part. If you've got that solved, you're 80% there. The other half is publishing content. In it's most elementary aspect, you're copying a portion of the files from the source dir to the website folder. MSDeploy is Microsoft's answer to it, and it's waaaaaay too complex. I built an NAnt task that does this, though that also doesn't apply to your specific scenario. However, the general methodology can:
Crawl the sln file looking for web projects. What makes a web project? Technically guids in the csproj file or project type ids in the sln file. I cheated and defined it as "the target folder includes a web.config file". If you've only got one website project in your solution, you can skip this step, and just hard-code the .csproj file.
Crawl the csproj file looking for <Content Include="some\file.ext" /> nodes. XPath could do this, Linq to XML could do it too. This gives you all the .aspx, .cshtml, .js, .css, .png, .config, etc, etc, while carefully leaving behind all the .cs files. You'll need to prefix the path to the .csproj file to get the true origin file location, and you want to ensure you preserve the folder structure in the destination location. But this is trivial in comparison to harvesting the file list.
Now that you've got the file list, loop through it copying from the source folder to the destination folder. (You probably want to either empty the destination folder first or afterwards prune extra files from previous deployments. I find the former easier.) The only thing the csproj file crawl didn't give you was the bin folder content, but that's cake: copy all the contents of the bin folder. :D (There's a healthy debate about whether to copy .pdb files, but I say yes.)
Form a script to do the above 3 steps, then call it either from an XBuild task or call both XBuild and this script from the CI process. Poof. You've got a deploy target. Happy coding!
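As a rough sketch of steps 2 and 3 above (the XML namespace is the classic MSBuild one used by xbuild-era projects; error handling and the bin-folder copy are left out):
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class ContentPublisher
{
    static void Main(string[] args)
    {
        var csprojPath = args[0];   // e.g. src/MySite/MySite.csproj
        var destination = args[1];  // e.g. /var/www/mysite

        var projectDir = Path.GetDirectoryName(Path.GetFullPath(csprojPath));
        XNamespace ns = "http://schemas.microsoft.com/developer/msbuild/2003";

        // Step 2: harvest every <Content Include="..." /> item from the project file.
        var contentFiles = XDocument.Load(csprojPath)
            .Descendants(ns + "Content")
            .Select(c => (string)c.Attribute("Include"))
            .Where(p => !string.IsNullOrEmpty(p));

        // Step 3: copy each content file, preserving the relative folder structure.
        // (The bin folder contents would be copied separately, as described above.)
        foreach (var include in contentFiles)
        {
            // csproj paths use backslashes; translate them for Linux.
            var relative = include.Replace('\\', Path.DirectorySeparatorChar);
            var source = Path.Combine(projectDir, relative);
            var target = Path.Combine(destination, relative);

            Directory.CreateDirectory(Path.GetDirectoryName(target));
            File.Copy(source, target, overwrite: true);
        }
    }
}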
Preferably I'd like to have a Linux CI server that does this instead using XBuild
The good news is that you CAN do this with a workaround I found in this article. Here is an excerpt and the workaround link from the above article:
The build server does not have Microsoft MVC (any version) installed. However, this is very easy to work around: Microsoft MVC is available on NuGet at http://nuget.org/packages/Microsoft.AspNet.Mvc. If you need to install an older version, click on the older version at the bottom of the page and you'll reach instructions on how to install that version of the framework.
Hope this makes it easy for you!
@DHarun's idea works great!
I just wrote a small script based on @DHarun's idea, hope it may help others.
https://github.com/z-ji/MonoWebPublisher
Is the .pdb file enough to debug and step into the code of a dll? Or do you actually have to reference the corresponding project source code?
I tried referencing a dll with the .pdb file in the same directory, and was unable to step into the code of the dll. So I was wondering what the rules around dlls and .pdb files were.
Thanks in advance.
The .pdb file will allow you to debug, but it will not provide any sources. Check out this blog post for an excellent description of PDB files and their purpose.
http://www.wintellect.com/CS/blogs/jrobbins/archive/2009/05/11/pdb-files-what-every-developer-must-know.aspx
The PDB file is how Visual Studio knows how the executing code in the assembly corresponds to the lines in the source code. The answer to your question is yes, Visual Studio needs the source code that the corresponding pdb was built from.
The pdb does not contain the source code packaged inside of it (well, it can, but that is a bit of a hack and not many people do it); however, the symbol server should automatically download the source if it has it available. The pdb must match the exact version of the dll you are working with for it to download the source.
I have a small suspicion that you are trying to do the .NET Framework source stepping and it is not stepping into it. Microsoft has not updated the symbol servers with the current versions of the pdb files, so source stepping is broken if you are running an up-to-date version of .NET (at least until they release the new versions of the source files).
I'm still learning the basics of how VS2010 sees the world. Apparently, you can optionally "include" a file in a project. I'm a bit confused by this: If a file is version-controlled, AND the file is within the project directory, shouldn't it implicitly be "included" in the project? If not, what's the use case where a version-controlled file in the project directory should NOT be included in the project?
=== Addition ===
Based on the answers I've gotten so far, maybe I should rephrase my question: What does it mean for a file to be "included" in a project?
A project needs to know about files in order for compilation and distribution to occur. Just because you have a file that's under source-control, doesn't mean that it will be compiled if the project is unaware of it.
Also, you may want to include files as part of a distribution package. We do this quite often for our web projects that we distribute using web app gallery.
Conversely, you could have documentation or sql scripts that you version control, but do not want them to be part of the project.
EDIT: In answer to your update, what it means for a file to be included in a project is that the file is actually added to the .csproj or .vbproj file and will be used during compilation and/or distribution. VS does differentiate if the file is Content or if it needs to Compile it. This can be seen by clicking on the file in Solution Explorer and looking at the Build Action property.
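For illustration, this is roughly what that difference looks like inside an old-style .csproj (the file names are placeholders; each item name corresponds to the Build Action shown in the Properties window):
<ItemGroup>
  <Compile Include="Helpers\StringUtils.cs" />
  <Content Include="Scripts\site.js" />
  <None Include="docs\notes.txt" />
</ItemGroup>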
No, you don't want random files that happen to be in the project directory automatically included in the project.
We do sometimes put documentation (pdfs) or drawings/schematics in the project folder and under version control but you don't need them inside the visual studio project (especially when they are not being distributed because they are for internal use only).
Excluding the file from your project can be useful if the file is related to the project but not necessarily needed in the solution.
Example
If I need some test XML for an application that I'm writing, which is designed to normally pull this data from a WCF service, it can be useful to keep that file in the directory for a development environment where I use IO to get the XML for testing, but I don't necessarily want it in my solution, which is source controlled.
When you exclude a file from a project, it is no longer compiled or embedded; then, when you want to include it again, you can do so without having lost your settings.
If you e.g. copy a file (containing a helpful class which you want to have in your project) into a folder of your project, then you will see ... nothing. You have to check the "Show all files" option in Solution Explorer, and then the copied file can be seen, but it is still greyed out. Now you can choose the menu item Include in project, and that file will be integrated into your project; a pending change (add) for your source control is added too. Visual Studio doesn't automatically include every file it finds in the project folder - and that is a good feature.
One of my colleagues explained to me a scenario in which a version-controlled file should NOT be part of the project. Here's the idea:
A developer writes some new code.
The code is experimental, and not intended to be part of the normal build.
The easiest way to exclude the file from the build is to NOT include it in the project, but still version-control it.
This way, the file can be shared with other developers, but not break the build.