How to control the output of NuGet package dependencies during build - C#

I would like to support backward compatibility in my application.
Simply put - one app needs to work with different versions of a DLL depending on a flag which the app gets at runtime.
I've simplified everything and created a test solution with 2 projects in it.
Each project has its own version of the same NuGet package.
I picked System.Drawing.Common because it has no dependencies.
ClassLibrary1 contains System.Drawing.Common of version 4.5.0.
ClassLibrary2 contains System.Drawing.Common of version 6.0.0.
Both projects have the same output path:
<OutputPath>..\DEBUG\</OutputPath>
When I build my solution I get just one System.Drawing.Common.dll in my output folder, because both DLLs have the same name and only the version differs.
The desired behavior is shown in the pictures below:
Distribute the NuGet package dependencies into different folders according to their versions.
Add a suffix to the NuGet package dependencies according to their versions.
The idea is to control the output of the NuGet package dependencies.
Do you have any idea how I can achieve that?
P.S. All other logic - resolving dependencies according to versions etc. - is out of scope of this question.

It's possible.
First you need to add GeneratePathProperty to the PackageReference element in the csproj file:
<ItemGroup>
  <PackageReference Include="System.Drawing.Common">
    <Version>4.5.0</Version>
    <GeneratePathProperty>true</GeneratePathProperty>
  </PackageReference>
</ItemGroup>
This allows us to use the $(PkgSystem_Drawing_Common) property, which contains the path to the restored NuGet package.
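As a quick sanity check (a hypothetical helper target, not part of the original answer), you can print the generated property after a build to confirm it resolves to the restored package folder:
<Target Name="ShowPkgPath" AfterTargets="Build">
  <!-- $(PkgSystem_Drawing_Common) is only populated when GeneratePathProperty is set on the PackageReference -->
  <Message Importance="high" Text="System.Drawing.Common restored to: $(PkgSystem_Drawing_Common)" />
</Target>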
Then we need to create an MSBuild targets file:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="CopyNugetDll" BeforeTargets="BeforeCompile" Outputs="System.Drawing.Common.dll">
    <!-- Read the package version straight out of the project file -->
    <XmlPeek XmlInputPath="$(ProjectPath)" Query="Project/ItemGroup/PackageReference[@Include='System.Drawing.Common']/Version/text()">
      <Output TaskParameter="Result" PropertyName="NugetPackageVersion" />
    </XmlPeek>
    <ItemGroup>
      <NugetDll Include="$(PkgSystem_Drawing_Common)\lib\net461\System.Drawing.Common.dll" />
    </ItemGroup>
    <Message Text="Copying @(NugetDll) to $(OutDir)" Importance="high" />
    <Exec Command="copy $(PkgSystem_Drawing_Common)\lib\net461\System.Drawing.Common.dll $(OutDir)\System.Drawing.Common.$(NugetPackageVersion).dll" />
  </Target>
</Project>
Here, using XPath, we read the package version from the project file and store it in the NugetPackageVersion property. Exec copy is then used to copy the DLL to the output folder under a name that carries the version from NugetPackageVersion as a suffix.
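As a side note (a sketch, not part of the original answer), the same step could use MSBuild's built-in Copy task instead of shelling out to the Windows copy command, which avoids the dependency on cmd.exe:
<!-- Alternative to the Exec line above: copy and rename with the Copy task -->
<Copy SourceFiles="@(NugetDll)" DestinationFiles="$(OutDir)System.Drawing.Common.$(NugetPackageVersion).dll" />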
Lastly, you need to import the targets file into the project:
<Import Project="CopyDll.targets" />

This just isn't how package resolution works in .NET: you get one version of each package, and that version is decided at restore time.
There may be some funky options if you have a very niche problem, but it sounds like you may be trying to solve a common problem in an uncommon way, which is generally a bad idea.
Typically, for the problem of backwards compatibility, the onus is on the publisher of the library rather than the consumer of the library to make sure it all works by not making breaking API changes.

Related

Nuget package creation and file removal on uninstall

I am creating a package for use in dotnet core web applications.
Recently, it would appear that PowerShell scripts are no longer used to populate and remove files on NuGet package installation/uninstallation; it is all managed using MSBuild. I have the installation side working successfully. The files are populated in a Build target like so:
<Target Name="CopyMyAssets" BeforeTargets="Build">
<ItemGroup>
<MyAssets Include="$(MyAssetsPath)" />
</ItemGroup>
<Copy SourceFiles="#(MyAssets)" DestinationFiles="#(MyAssets->'$(MSBuildProjectDirectory)\App_Plugins\MyPlugin\%(RecursiveDir)%(Filename)%(Extension)')" SkipUnchangedFiles="true" />
</Target>
And I have a removal method with a Clean target:
<Target Name="ClearMyAssets" BeforeTargets="Clean">
<ItemGroup>
<MyPluginDir Include="$(MSBuildProjectDirectory)\App_Plugins\MyPlugin\" />
</ItemGroup>
<RemoveDir Directories="#(MyPluginDir)" />
</Target>
My question is, on package uninstall through the Nuget package manager, how do I ensure that these files are removed during the uninstall process? The user may not remember to clean before uninstalling, which will leave these orphaned files, and with no package to tell the project what to do, they will no longer be removed on subsequent cleans. Do I need to somehow invoke a project clean on uninstallation, or is there a way of scripting file removal? I have read the MsBuild documentation and I see no uninstall target or replacement for the old uninstall.ps1 scripts that used to be used.
Edit: Additional context. This is a package designed for Umbraco CMS 9 and up. I generated the project using Umbraco templates, but it doesn't differ hugely from the usual template. You use a .targets file to determine file transfers on build.

Dynamic Link Library - include NuGet package libraries in final DLL in Visual Studio [duplicate]

I want to merge one .NET DLL assembly and one C# Class Library project referenced by a VB.NET Console Application project into one command-line console executable.
I can do this with ILMerge from the command line, but I want to integrate this merging of referenced assemblies and projects into the Visual Studio project. From my reading, I understand that I can do this through an MSBuild Task or a Target and just add it to a C#/VB.NET project file, but I can find no specific example since MSBuild is a large topic. Moreover, I found some references that add the ILMerge command to the post-build event.
How do I integrate ILMerge into a Visual Studio (C#/VB.NET) project, which are just MSBuild projects, to merge all referenced assemblies (copy-local=true) into one assembly?
How does this tie into a possible ILMerge.Targets file?
Is it better to use the Post-build event?
The "MSBuild ILMerge task" (or MSBuild.ILMerge.Task) NuGet package makes this process quite simple. It defaults to merging any "copy local" references into your main assembly.
Note: Although the packages have similar names, this one is different from ILMerge.MSBuild.Tasks that Davide Icardi mentioned in his answer. The one I'm suggesting here was first published in August 2014.
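For completeness (the exact command is my addition here, assuming the package id is exactly the one named above), installing it from the Package Manager Console would look like:
PM> Install-Package MSBuild.ILMerge.Task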
Here is an alternative solution:
1) Install the ILMerge.MSBuild.Tasks package from NuGet
PM> Install-Package ILMerge.MSBuild.Tasks
2) Edit the *.csproj file of the project that you want to merge by adding the code below:
<!-- Code to merge the assemblies into one: setup.exe -->
<UsingTask TaskName="ILMerge.MSBuild.Tasks.ILMerge" AssemblyFile="$(SolutionDir)\packages\ILMerge.MSBuild.Tasks.1.0.0.3\tools\ILMerge.MSBuild.Tasks.dll" />
<Target Name="AfterBuild">
  <ItemGroup>
    <MergeAsm Include="$(OutputPath)$(TargetFileName)" />
    <MergeAsm Include="$(OutputPath)LIB1_To_MERGE.dll" />
    <MergeAsm Include="$(OutputPath)LIB2_To_MERGE.dll" />
  </ItemGroup>
  <PropertyGroup>
    <MergedAssembly>$(ProjectDir)$(OutDir)MERGED_ASSEMBLY_NAME.exe</MergedAssembly>
  </PropertyGroup>
  <Message Text="ILMerge @(MergeAsm) -> $(MergedAssembly)" Importance="high" />
  <ILMerge InputAssemblies="@(MergeAsm)" OutputFile="$(MergedAssembly)" TargetKind="SameAsPrimaryAssembly" />
</Target>
3) Build your project as usual.
Some more information that might be useful to some people implementing Scott Hanselman's solution.
When I first set this up it would complain about not being able to resolve references to System.Core, etc.
It is something to do with .NET 4 support. Including a /lib argument pointing to the .NET 4 Framework directory fixes it (in fact just include the $(MSBuildBinPath)).
/lib:$(MSBuildBinPath)
I then found that ILMerge would hang while merging. It was using a bit of CPU and a lot of RAM but wasn't outputting anything. I found the fix on Stack Overflow, of course.
/targetplatform:v4
I also found that some of the MSBuild properties used in Scott's blog article relied on executing MsBuild from the project's directory, so I tweaked them a bit.
I then moved the targets & ilmerge.exe to the tools folder of our source tree which required another small tweak to the paths...
I finally ended up with the following Exec element to replace the one in Scott's original article:
<Exec Command="&quot;$(MSBuildThisFileDirectory)Ilmerge.exe&quot; /lib:$(MSBuildBinPath) /targetplatform:v4 /out:@(MainAssembly) &quot;$(MSBuildProjectDirectory)\@(IntermediateAssembly)&quot; @(IlmergeAssemblies->'&quot;%(FullPath)&quot;', ' ')" />
UPDATE
I also found Logic Labs' answer about keeping the CopyLocal behaviour and just excluding ILMerged assemblies from CopyLocal essential if you are using NuGet packages. Otherwise you need to specify a /lib argument for each package directory of referenced assemblies that aren't being merged.
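For reference, the way assemblies get marked for merging in that approach is custom metadata on the reference itself; a minimal sketch (the assembly name and HintPath below are made up) looks like this, and the metadata then flows through to ReferenceCopyLocalPaths, which the target further down filters on:
<Reference Include="SomeLibToMerge">
  <HintPath>..\packages\SomeLibToMerge.1.0.0\lib\net45\SomeLibToMerge.dll</HintPath>
  <!-- custom metadata later read as %(ReferenceCopyLocalPaths.IlMerge) -->
  <IlMerge>true</IlMerge>
</Reference>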
The article Mixing Languages in a Single Assembly in Visual Studio seamlessly with ILMerge and MSBuild at http://www.hanselman.com/blog/MixingLanguagesInASingleAssemblyInVisualStudioSeamlesslyWithILMergeAndMSBuild.aspx demonstrates how to use ILMerge and MSBuild within a Visual Studio Project.
One issue I found with the article at http://www.hanselman.com/blog/MixingLanguagesInASingleAssemblyInVisualStudioSeamlesslyWithILMergeAndMSBuild.aspx: if you have any references that you do not wish to ILMerge, the code in the article fails because it overrides the default CopyLocal behaviour to do nothing.
To fix this, instead of:
<Target Name="_CopyFilesMarkedCopyLocal"/>
Add this entry to the targets file instead (.NET 3.5 only), to filter out the non-ILMerge CopyLocal files and treat them as normal:
<Target Name="AfterResolveReferences">
<Message Text="Filtering out ilmerge assemblies from ReferenceCopyLocalPaths" Importance="High" />
<ItemGroup>
<ReferenceCopyLocalPaths Remove="#(ReferenceCopyLocalPaths)" Condition="'%(ReferenceCopyLocalPaths.IlMerge)'=='true'" />
</ItemGroup>
</Target>
This is a great article that will show you how to merge your referenced assemblies into the output assembly. It shows exactly how to merge assemblies using msbuild.
My 2 cents - I picked up @Jason's response and made it work for my solution, where I wanted to generate the *.exe in the bin/Debug folder with all the *.dlls inside the same folder.
<Exec Command="&quot;$(SolutionDir)packages\ILMerge.2.13.0307\Ilmerge.exe&quot; /wildcards /out:&quot;$(SolutionDir)..\$(TargetFileName)&quot; &quot;$(TargetPath)&quot; $(OutDir)*.dll" />
Note: This solution is obviously hardcoded to the ILMerge NuGet package version. Please let me know if you have any suggestions to improve it.
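One possible way to avoid hardcoding the version in the path, if the project consumes ILMerge via PackageReference rather than packages.config, is to let NuGet generate the path for you. A sketch only, reusing the 3.0.29 tools layout mentioned elsewhere in this thread:
<ItemGroup>
  <PackageReference Include="ILMerge" Version="3.0.29" GeneratePathProperty="true" />
</ItemGroup>
<Target Name="MergeAfterBuild" AfterTargets="Build">
  <!-- $(PkgILMerge) points at the restored package folder, so no version number appears in the command -->
  <Exec Command="&quot;$(PkgILMerge)\tools\net452\ILMerge.exe&quot; /wildcards /out:&quot;$(SolutionDir)..\$(TargetFileName)&quot; &quot;$(TargetPath)&quot; $(OutDir)*.dll" />
</Target>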
Edit the *.csproj file of the project that you want to merge by adding the code below:
<Target Name="AfterBuild" Condition=" '$(ConfigurationName)' == 'Release' " BeforeTargets="PostBuildEvent">
<CreateItem Include="#(ReferenceCopyLocalPaths)" Condition="'%(Extension)'=='.dll'">
<Output ItemName="AssembliesToMerge" TaskParameter="Include" />
</CreateItem>
<Exec Command=""$(SolutionDir)packages\ILMerge.3.0.29\tools\net452\ILMerge.exe" /internalize:"$(MSBuildProjectPath)ilmerge.exclude" /ndebug /out:#(MainAssembly) "#(IntermediateAssembly)" #(AssembliesToMerge->'"%(FullPath)"', ' ')" />
<Delete Files="#(ReferenceCopyLocalPaths->'$(OutDir)%(DestinationSubDirectory)%(Filename)%(Extension)')" />
</Target>
Notes:
Replace $(SolutionDir)packages\ILMerge.3.0.29\tools\net452\ILMerge.exe with whatever path you have the ILMerge.exe in.
You can remove the Condition on the target to also merge Debug builds, but then the debugger might not work.
If you are not excluding anything you can remove /internalize:"$(MSBuildProjectPath)ilmerge.exclude" (an example exclude file is sketched below).
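For context, the file passed to /internalize is just a list of .NET regular expressions, one per line, matching the full names of types that should stay public rather than be internalized. A made-up ilmerge.exclude might look like:
^Newtonsoft\.Json\..*
^MyCompany\.PublicContracts\..*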
Check out this article by Jomo. He has a quick process to hack ILMerge into the msbuild system
http://blogs.msdn.com/jomo_fisher/archive/2006/03/05/544144.aspx

Specifying files to add to a nuget package in .csproj file

I am creating a nuget package from some code, but also need to deploy some tools with the package.
In a .nuspec file, I can do this with the <files> element, and this all works well.
However, when using a .nuspec file, the PackageReferences from the csproj file aren't included, and I am seeing some problems when including them manually (with the <dependencies> element).
The package created also always seems to restore as a .NET Framework package, even though it is targeting .NET, as in this question.
I am hoping that all these problems would go away if I moved to using the .csproj format for specifying the nuget package details, but having read the docs I can't find out how to do it.
Does anyone know how it is done?
If not, can anyone shed any light on creating a .NET Framework / .NET Core NuGet package from a .nuspec file that restores to the correct target version and respects package dependencies?
It's not easy to find/discover, but NuGet's MSBuild tasks docs page has a section called "including content in a package", which tells you about the PackagePath metadata on MSBuild items, which NuGet uses to copy files into the package.
So, in your csproj, you could have something like this:
<ItemGroup>
  <None Include="..\MyTool\Tool.exe" PackagePath="tools" Pack="true" />
</ItemGroup>
and then your package will contain tools\Tool.exe. The Pack="true" attribute is required for None elements.
You can use MSBuild's globbing to copy entire directories, if that's easier: Include="..\MyTool\*". My MSBuild skills are not so advanced, so I don't know how to glob ..\MyTool\**\*, which means all files in all subdirectories, while maintaining the correct directory layout in the PackagePath="???" metadata. So the best I can suggest is one glob per directory.
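For what it's worth, one pattern that is often suggested for this (I haven't verified it on every SDK version, so treat it as an assumption) is to put %(RecursiveDir) into the PackagePath metadata so the subfolder structure is preserved:
<ItemGroup>
  <!-- hypothetical: pack everything under ..\MyTool\, keeping subfolders under tools\ -->
  <None Include="..\MyTool\**\*" Pack="true" PackagePath="tools\%(RecursiveDir)" />
</ItemGroup>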

Constrain PackageReference upgrade version when update-package run

Under .NET's older packages.config system for NuGet, I could constrain the possible versions of a package that are considered when packages are updated by using the allowedVersions attribute on the Package element
<package id="Newtonsoft.Json" version="10.0.3" allowedVersions="[10.0.3]" />
When update-package is run within Visual Studio for a project including the above, no update will occur for Newtonsoft.Json because I've pinned it to 10.0.3 using the allowedVersions attribute.
How can I achieve this under PackageReference? Applying semver syntax to the Version attribute only affects the version restored - it doesn't constrain updates. So if I specify the below PackageReference and run update-package, I will for example be upgraded to 11.0.1 if 11.0.1 is in my NuGet repository.
<PackageReference Include="Newtonsoft.Json" Version="[10.0.3]" />
Background
We rely on command line tooling to update packages because we have both fast-moving internal packages (updated multiple times a day) and more stable, slow-moving packages (e.g. ASP.NET). On large codebases, updating each dependency by hand in .csproj files is simply not scalable for us (and error prone). Under packages.config we can 'pin' the third-party packages which we don't want upgraded and also update to the latest fast-moving dependencies.
From this answer:
At the moment, this is not possible. See this GitHub issue for tracking.
The cli commands for adding references however support updating single packages in a project by re-running dotnet add package The.Package.Id.
From GitHub Issue 4358:
There is no PackageReference replacement for update yet, the command to modify references is only in dotnet.
You might want to weigh in on the open feature request GitHub issue 4103 about this (4358 was closed as a duplicate). Microsoft hasn't put a high priority on this feature (it was originally opened in October, 2016).
Possible Workarounds
Option 1
It is possible to "update" a dependency by removing and adding the reference. According to this post, specifying the version explicitly with the command will install the exact version, not the latest version. I have also confirmed you can add version constraints with the command:
dotnet remove NewCsproj.csproj package Newtonsoft.Json
dotnet add NewCsproj.csproj package Newtonsoft.Json -v [10.0.3]
What you could do with these commands:
Keep the version numbers of packages around in a text file (perhaps just keep it named packages.config).
Use a script to create your own "update" command that reads the text file and processes each dependency in a loop using the above 2 commands. The script could be set up to be passed a .sln file and process each of the projects within it (a minimal sketch follows this list).
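A minimal sketch of such a script (PowerShell, with a made-up pin file called packages.txt containing lines like Newtonsoft.Json [10.0.3]) might look like:
# update-pins.ps1 (hypothetical): re-pin every package listed in packages.txt for one project
param([string]$Project)

Get-Content packages.txt | ForEach-Object {
    $id, $version = $_ -split '\s+', 2
    dotnet remove $Project package $id
    dotnet add $Project package $id -v $version
}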
Option 2
Use MSBuild to "import" dependencies from a common MSBuild file, where you can update the versions in one place.
You can define your own <IncludeDependencies> property to include specific dependencies in each project.
SomeProject.csproj
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <IncludeDependencies>Newtonsoft.Json;FastMoving</IncludeDependencies>
  </PropertyGroup>
  <Import Project="..\..\..\Dependencies.proj" />
  ...
</Project>
Dependencies.proj
<Project>
  <ItemGroup>
    <PackageReference Condition="$(IncludeDependencies.Contains('Newtonsoft.Json'))" Include="Newtonsoft.Json" Version="[10.0.3]" />
    <PackageReference Condition="$(IncludeDependencies.Contains('FastMoving'))" Include="FastMoving" Version="3.332.0" />
  </ItemGroup>
</Project>
This has now been implemented as of https://github.com/NuGet/NuGet.Client/pull/2201. If you are using any version of NuGet 5, PackageReference semver constraints should now work as expected.
Pinning - Yet another workaround
This doesn't prevent the update, but it triggers a build error if someone does update. It won't help much with the use case of automated updates, but it may help others who do manual updates and need some way of pinning.
<ItemGroup>
  <PackageReference Include="MongoDB.Driver" Version="2.13.*" GeneratePathProperty="true" />
</ItemGroup>
<Target Name="CheckPkgVersions" AfterTargets="AfterBuild">
  <Error Condition="!$(PkgMongoDB_Driver.Contains('2.13.'))" Text="MongoDB.Driver must remain at version 2.13.* to be compatible with MongoDB 3.4.21" />
</Target>

Change the referenced dll in a SSIS script task at build / deploy or runtime

Recently I have added all of our SSIS projects into a continuous integration pipeline. The projects are built using MSBuild in TeamCity, packaged and pushed to a NuGet feed. We deploy them using Octopus and some hand-cranked PowerShell built on the back of SQL Server Management Objects (SMO). It all works, with the exception of one project. The project in question contains some script tasks which reference an external assembly. That assembly is built in the same pipeline and its assembly version numbers are updated as part of the process. The problem lies in the fact that the SSIS project now references a strong-named DLL in the GAC which does not exist because the version numbers have changed.
Does anyone know of a way to either update the reference at build time on the CI server or override the version number at the point of deployment?
I know this post is quite old but it has been viewed a lot so here's a solution I have found.
You need to include a 'targets' file for the build.
Here's an example Targets file:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <E10BOs>C:\Integrations\E10 Uplifts\Epicor 10.2.600.3</E10BOs>
  </PropertyGroup>
  <Target Name="BeforeResolveReferences">
    <CreateProperty Value="$(E10BOs);$(AssemblySearchPaths)">
      <Output TaskParameter="Value" PropertyName="AssemblySearchPaths" />
    </CreateProperty>
  </Target>
</Project>
The E10BOs property (and there can be more than one) defines the path to the DLLs of the version you want to build against.
It needs to be saved as myTargetsFile.targets
Then, in a regular VS project, you could add the second import line below to the project file (edited outside of VS, in Notepad), alongside the existing Microsoft.CSharp.targets import:
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), myTargetsFile.targets))\myTargetsFile.targets" />
</Project>
This uses GetDirectoryNameOfFileAbove to search up the folder tree until it finds your targets file and then imports it. Very useful if you have a lot of different projects all requiring a version change, and you don't have to figure out any relative paths!
In SSIS, however, this doesn't seem to work.
What does work is hard-wiring the path to the targets file in the package .dtsx file. Again, edit it in Notepad and add the following line (you will probably see the csharp entry near the end of the project tag, as before):
<Import Project="C:\Integrations\E10 Uplifts\Epicor 10.2.600.3\myTargetsFile.targets" />
This passes the information about the project references through to the scripts.
Then, with all of your projects using a targets file, changing the version is done by changing the path to the version folder you want them to use.
Hope that helps?
