Keep only one copy of each ClickOnce dependency? - c#

I'm publishing a ClickOnce application that has a few dependencies in the form of DLLs. When I publish the initial version 0.0.1 all the dependencies are copied into the deployment folder as I would expect.
If I make an update and publish version 0.0.2 all of the same dependencies, which are unchanged and have the same file hash, are copied into a second subfolder within the deployment folder.
The folder structure looks like this:
deployment/
    MyApplication.application
    setup.exe
    Application Files/
        MyApplication_1_0_0_1/
            Dll_A.dll - Hash: 0x111111111111111
            Dll_B.dll - Hash: 0x222222222222222
        MyApplication_1_0_0_2/
            Dll_A.dll - Hash: 0x111111111111111
            Dll_B.dll - Hash: 0x222222222222222
Is there any way to only have one copy of Dll_A.dll/Dll_B.dll when they are the same file and only have a new copy if they change?
I'm referring to the size of the deployment directory on the developer's end, not the size of the download and cache on the client side; as I understand it, ClickOnce already takes care to minimize those.

My issue came from a misunderstanding of how the Application Files directory works.
Yes, a new folder with the same DLLs is created each time, but for upgrades to work you do not need to keep all of those folders. You can erase every one except the most recent release with no ill effects.
Even if someone missed upgrades and is several versions behind, they will simply pull the latest version when their program updates. They don't need the intermediate versions they missed in between.
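If you want to automate that cleanup on the publish share, a small script can prune everything except the newest version folder. A minimal C# sketch, assuming the MyApplication_x_x_x_x folder naming shown in the question (the share path is a placeholder):

using System;
using System.IO;
using System.Linq;

class PruneOldClickOnceVersions
{
    static void Main()
    {
        // Adjust to your own publish location; this layout matches the tree above.
        string appFiles = @"\\server\deployment\Application Files";
        const string prefix = "MyApplication_";

        // Version folders are named like "MyApplication_1_0_0_2"; order them by
        // the version encoded in the folder name.
        var ordered = Directory.GetDirectories(appFiles, prefix + "*")
            .OrderByDescending(dir =>
                new Version(Path.GetFileName(dir).Substring(prefix.Length).Replace('_', '.')))
            .ToList();

        // Keep only the newest folder; the deployment manifest points at it.
        foreach (string oldDir in ordered.Skip(1))
        {
            Console.WriteLine("Deleting " + oldDir);
            Directory.Delete(oldDir, recursive: true);
        }
    }
}

Running something like this after each publish keeps the developer-side directory down to a single copy of each dependency.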

I don't believe ClickOnce allows for that granular of control.
Edit:
Found this; it may help with the cleanup part of your question:
ClickOnce deployment is leaving multiple versions (yes, more than two)

Related

.NET 5 DLL is present but does not load

This question is about why an assembly cannot find a library that resides in the same folder. As far as I can tell, the bitness of the caller (x64) agrees with the unfindable NuGet package (AnyCPU).
Even when I recompile everything to AnyCPU, the error message remains:
Could not load file or assembly 'ChoETL.Core, Version=1.2.1.28, etc.' The system cannot find the file specified.
My published .NET 5 exe (x64) calls a published .NET 5 library that in turn calls the NuGet package ChoETL.Core named in the error message (exe -> lib -> ChoETL.Core).
Here is the command line that I used to build the program exe and intermediate library .csproj files. I do not use Visual Studio and do not use the solution .sln file. All files build without errors or warnings.
msbuild -t:restore;publish /p:Platform=x64 /p:PlatformTarget=x64 /p:Configuration=Debug /p:TargetFramework=net5.0-windows7.0 /p:RuntimeIdentifier=win-x64 /p:PublishDir=c:\dev\holding\core.plt xxx.csproj
Here is the reference in the .csproj file of the library that references the ChoETL.Core package. Keep in mind that the msbuild command line properties override the AnyCPU values in the csproj file below.
<PropertyGroup>
  <UseWindowsForms>true</UseWindowsForms>
  <Platforms>AnyCPU</Platforms>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <TargetFramework>net5.0-windows7.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
  <PackageReference Include="ChoETL" Version="1.2.1.28" />
</ItemGroup>
I thought maybe the CorFlags values might be the issue, so I checked them. They were as follows (32BITREQ/32BITPREF were zero):
Program.exe: PE32+, flags 0x9, ILONLY 1, Signed 1
Library.dll: PE32+, flags 0x9, ILONLY 1, Signed 1
ChoETL.Core: PE32, flags 0x1, ILONLY 1, Signed 0
PE32+ means x64, and PE32 means AnyCPU (if the assembly is loaded into a 64-bit process, it runs as a 64-bit assembly). So they should be compatible (and besides, I cannot recompile the NuGet package).
I looked at this SO question, but it did not help.
Could not load file or assembly, system could not find the file specified
Does anyone have any idea why the library cannot find the ChoETL.Core.dll library when all three assemblies are side by side in the same folder?
I've spent days on the problem without progress or success. Thank you.
After yet another day of debugging, I have things working, but I don't know exactly why. I started with a simple console program, then exe + lib, then exe + lib (that called ChoETL), and finally made a test case with part of my real system. By the time I got to experiments with part of the real system, things were working okay. But fragile, I imagine.
Here are some theories that might help the next person.
One key issue (in the real system that was failing) was that the ChoETL package was not being published by the msbuild ... -t:clean;restore;publish operation. I think this was a major part of the problem. Yet, as I built up my test experiments using the same msbuild command lines, ChoETL was always being published correctly beside the executables and libraries that consumed the package.
At that point, I realized that my experiments always completely deleted the bin and obj folders before starting a new build, whereas my real-system build files did not take that drastic action (I had assumed that msbuild clean/restore would be sufficient for a clean build). But not so; MSBuild is not that smart. I modified my real-system build files to completely delete all obj/bin folders across the 70-project build sequence at the start of the build, and ChoETL was then published correctly beside the assemblies that consume it.
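As a rough sketch, the per-project clean-then-publish now amounts to something like this (plain Windows command prompt, the same msbuild invocation as above; folder layout and project name are placeholders):

rd /s /q obj
rd /s /q bin
msbuild -t:restore;publish /p:Platform=x64 /p:PlatformTarget=x64 /p:Configuration=Debug /p:TargetFramework=net5.0-windows7.0 /p:RuntimeIdentifier=win-x64 /p:PublishDir=c:\dev\holding\core.plt xxx.csproj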
And at last, the original problem was solved (but not really understood). The system could find ChoETL and load it. All I know is that completely deleting the obj and bin folders at the start enabled a successful build. I do not have a clear idea of what file(s) or data elements in the obj/bin folders might have prevented a successful publish operation. Maybe it was the project.assets.json file, or maybe it was one of the other json files.
I do know that msbuild does not readily regenerate those JSON files when csproj files or msbuild command-line properties change. I had assumed msbuild would do dependency and timestamp analysis the way make does and regenerate the JSON files when they were "out of date" (whatever that might mean in this context; I don't know), but it does not. Because of such problems I have been developing the habit of completely deleting the obj/bin folders before a build in the months since I started working with .NET 5. Obviously, this experience (a development delay of weeks) shows that I still have further to go in making that a rigid habit. I hope this answer might help someone else.

Wix ToolSet Patch Creation Using "Patch Creation Properties"

First of all, I'd like to state that I am a complete noob regarding installers and patches, and have been living by articles from the interwebz.
Quick background: We've created an installer that "installs" a web app - creates IIS services, databases, etc. For the succeeding releases, we plan on using patches for the minor upgrades. We use .NET C# for this app.
I've been trying to create a patch for the project using the "Patch Creation Properties" tutorial from the wix site: http://wixtoolset.org/documentation/manual/v3/patching/patch_building.html
I managed to create a patch, and it does work, but I've noticed that it only changes HTML, JS, and CSS files, along with the web.config. Any changes that I made to .cs files were not reflected, so I'm assuming that DLL files are not being replaced by the patch.
Below is my config for the patch.wxs:
<PatchCreation
    Id="{real guid here}"
    CleanWorkingFolder="yes"
    OutputPath="C:\Outputpath\patch.pcp"
    WholeFilesOnly="yes">
  <PatchInformation
      Description="Project 3.0.10 Patch"
      Manufacturer="project"/>
  <PatchMetadata
      AllowRemoval="yes"
      Description="Patch v3.0.10"
      ManufacturerName="ManufacturerName"
      TargetProductName="TargetProductName"
      MoreInfoURL="www.google.com"
      Classification="Update"
      DisplayName="ManufacturerNamePatch"
      MinorUpdateTargetRTM="1"/>
  <Family DiskId="5000"
      MediaSrcProp="Patch"
      Name="patchtest">
    <UpgradeImage SourceFile="C:\output\test\new\admin\Setup.msi"
        Id="UpgradeImage">
      <TargetImage SourceFile="C:\output\test\old\admin\Setup.msi"
          Order="2"
          Id="TargetImage"
          IgnoreMissingFiles="no" />
    </UpgradeImage>
  </Family>
  <PatchSequence PatchFamily="SCMPatchFamily"
      Supersede="yes" />
</PatchCreation>
I did a patch install with log, and noticed this:
MSI (s) (18:F0) [17:37:18:316]: File: C:\Location\Website.Web.dll; Won't Overwrite; Won't patch; Existing file is of an equal version
I've been scouring the net for answers and haven't found a fix for this.
Thank you very much.
The clue is in the message! Service packs, patches, and hotfixes all decide whether to update a binary based on its file version. Not only does this speed things up by not installing files that don't need changing, it ensures that you don't overwrite a higher version with an older one. Creation dates (as in the idea that a file should be replaced because mine is newer) do not apply. File versions are also useful for identifying whether a client has an up-to-date version. So increment the file version. If it's managed code you don't need to change AssemblyVersion; just add an AssemblyFileVersion to the files that have actually changed, incremented above the versions already installed.
https://msdn.microsoft.com/en-us/library/aa367835(v=vs.85).aspx
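To make that concrete, a minimal AssemblyInfo.cs sketch (the version numbers here are illustrative, not taken from your project):

using System.Reflection;

[assembly: AssemblyVersion("3.0.0.0")]        // unchanged, so the assembly identity and bindings stay the same
[assembly: AssemblyFileVersion("3.0.10.0")]   // bumped above the installed file version so the patch replaces the file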

How to force deployment project to update files during installation of newer version?

I have a Deployment Project for my VS2008 C# application. When installing a new version of my application I want to make sure that files are updated with the files contained in the installer.
I want my installer to automatically remove any old version of the installed application. To do that I follow this procedure (also described here):
Set RemovePreviousVersions to True
Set DetectNewerInstalledVersion to True
Increment the Version of the installer
Choose Yes to change the ProductCode
For the assemblies I make sure the AssemblyVersion is set to a higher version:
[assembly: AssemblyVersion("1.0.*")]
Everything is working as intended except for my configuration files (xml files). Since I cannot find a way to "version" these files I cannot make sure that the files are updated if they have been modified on the target machine.
Is there any way to do this and how?
UPDATE: I have tried the approach/solution/workaround found here
Include the file directly in a project with the following properties: "Build Action -> Content file" and "Copy to Output Directory -> Copy always"
Then add it to the deployment project via Project Output -> Database -> Content Files
Unfortunately it did not make any difference. The behavior is exactly the same.
Add the following property to the Property table of the MSI:
Property REINSTALLMODE with Value amus
Note: This will cause all the files in the MSI (versioned and nonversioned) to overwrite the files that are on the system.
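For reference, a minimal sketch of that entry as it appears when the MSI is authored with WiX (as in the patching question earlier on this page); with a Visual Studio deployment project you would have to add the same row to the built MSI's Property table yourself:

<Property Id="REINSTALLMODE" Value="amus" />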
If you're willing to use Orca (there may be another way to do this, but it's the only one I know of) you may be able to set up RemoveFile directives.
See here for a typically light-weight MSDN document -- could be a starting point.
http://msdn.microsoft.com/en-us/library/aa371201.aspx
Alternatively you could always create a very simple bootstrapper executable that simply calls "msiexec /i REINSTALLMODE=oums" (or whichever command-line switches are needed). Call it setup.exe if you like...
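As a rough sketch (the package name is a placeholder), such a bootstrapper might simply run:

msiexec /i MyApp.msi REINSTALLMODE=oums

Substituting amus, as in the earlier answer, forces every file to be overwritten regardless of version.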
Better option long-term may be to switch to InstallShield or similar -- VS2010 includes a light version of IS and I believe they're moving away from vdproj projects.
Have you tried the approach/solution/workaround found here?
Include the file directly in a project with the following properties: "Build Action -> Content file" and "Copy to Output Directory -> Copy always"
Then add it to the deployment project via Project Output -> Database -> Content Files
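In the project file itself, those two IDE settings correspond to item metadata roughly like this (the file name is a placeholder):

<ItemGroup>
  <!-- "Build Action -> Content", "Copy to Output Directory -> Copy always" -->
  <Content Include="Settings.xml">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>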
I may be incorrect here, and therefore I am exposing myself to down votes, but here goes anyway!
I believe it is by design that config files do not automatically get overwritten, the principle being that you would not normally want your application's configuration overwritten when you install a new version of the program... at least not without numerous warnings and/or a chance to merge the configuration first.
Having your application configuration get overwritten by an updated version of a program could make for a very upset end user (in this case, web site admin).
Further, I know that sometimes the developer is the person doing the deployment. In that case this behavior might not seem so logical when deploying a single site to a single server; but when roles are split and/or you have multiple servers with different configurations, it can be a life saver.
You need to include the new version of your files in your custom installer and install them manually when the custom install routine is called.
This applies to any file that does not have a version the installer can track.

.NET Dependency Management and Tagging/Branching

My company is having trouble figuring out the best way to manage our builds, releases, and branches. Our basic setup: we maintain 4 applications, 2 WPF and 2 ASP.NET, and all 4 share common libraries, so currently they all live in one folder, /trunk/{app1, app2, app3, app4}.
This makes it very hard to branch/tag a single application, because you end up branching all 4 at the same time. We would like to separate them out into something like {app1,app2,app3,app4}/{trunk,tags,branches}, but then we run into the issue of where to put the shared libraries.
We can't put the shared libraries as SVN externals because then when you branch/tag the branch is still referencing the trunk shared libs instead of having them branched as well.
Any tips? Ideas?
We are currently using svn and cruisecontrol.net.
EDIT: The shared libraries are changing often right now, which is why we can't reference them as svn externals pointing at trunk: we might be changing them in the branch. For the same reason we can't use them as binary references.
It's also very hard to test and debug when the libraries are consumed as prebuilt binaries instead of being included as source.
I guess it all depends on how stable the shared libraries are. My preference would be for the shared libraries to be treated as their own project, built in CruiseControl like the others. Then the four main applications would have binary references to the shared libraries.
The primary advantage with this approach is the stability of the applications now that the shared libraries are static. A change to the libraries wouldn't affect the applications until they explicitly updated the binaries to the newer version. Branching brings the binary references with it. You won't have the situation where a seemingly innocuous change breaks the other three applications.
Can you clarify why you don't like branching all four applications at the same time?
This makes it very hard to branch/tag a single application because you are branching all 4 at the same time
I usually put all my projects directly under trunk as you are currently doing. Then when I create a release branch or a feature branch, I just ignore the other projects that get carried along. Remember, the copies are cheap, so they're not taking up space on your server.
To be specific, here's how I would lay out the source tree you've described:
trunk
    WPF1
    WPF2
    ASP.NET 1
    ASP.NET 2
    lib1
    lib2
branches
    WPF1 v 1.0
        WPF1
        WPF2
        ASP.NET 1
        ASP.NET 2
        lib1
        lib2
    WPF1 v 1.1
        WPF1
        WPF2
        ASP.NET 1
        ASP.NET 2
        lib1
        lib2
    lib1 payment plan
        WPF1
        WPF2
        ASP.NET 1
        ASP.NET 2
        lib1
        lib2
We are kicking off an open source project to try and deal with this issue. If anyone is interested in commenting on it or contributing to it, it's at:
http://refix.codeplex.com
I agree with Brian Frantz. There's no reason not to treat the shared libraries as their own project that is built daily, with your projects taking a binary dependency on the daily builds.
But even if you want to keep them as a source dependency and build them with the app, why wouldn't the SVN externals approach work for you? When you branch a particular app, there's no need to branch the shared library as well, unless you need a separate copy of it for that branch. But that would mean it's not a shared library anymore, right?
I've tried solving this problem several ways over the years, and I can honestly say there is no best solution.
My team is currently in a huge development phase and everyone basically needs to be working off of the latest and greatest of the shared libs at any given time. This being the case we have a folder on everyone's C: drive called SharedLibs\Latest that is automatically synced up with the latest development release of each of our shared libraries. Every project that should be drinking from the firehose has absolute file references to this folder. As people push out new versions of the shared libs, the individual projects end up picking them up transparently.
In addition to the latest folder, we have a SharedLibs\Releases folder which has a hierarchy of folders named for each version of each shared lib. As projects mature and get towards release candidate phase, the shared lib references are pointed to these stable folders.
The biggest downside to this is that this structure needs to be in place for any project to build. If someone wants to build an app 10 years from now, they will need this structure. It is important to note that these folders need to exist on the build/CI server as well.
Before doing this, each solution had a lib folder under source control containing the binaries, and each project owner was tasked with propagating new shared DLLs. Since most people owned several projects, things often fell through the cracks for the projects that were still in the non-stable phase. Additionally, TFS didn't seem to track changes to binary files that well. If TFS were better at tracking DLLs, we probably would have used a shared-libs solution/project instead of the file-system approach we are taking now.
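In csproj terms, those absolute file references look roughly like this (library names, versions, and paths are made up); active projects point at Latest, stabilizing projects at a Releases folder:

<ItemGroup>
  <!-- projects in heavy development drink from the firehose -->
  <Reference Include="Company.SharedLib1">
    <HintPath>C:\SharedLibs\Latest\Company.SharedLib1.dll</HintPath>
  </Reference>
  <!-- projects nearing release pin to a specific version -->
  <Reference Include="Company.SharedLib2">
    <HintPath>C:\SharedLibs\Releases\SharedLib2\1.4.0\Company.SharedLib2.dll</HintPath>
  </Reference>
</ItemGroup>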
Apache NPanday + Apache Maven Release
... might solve your problems.
It gives you dependency management (with transitive resolution), strong versioning support, and automatic tagging/branching on 14+ version control systems, including SVN.
Give me a hint if I should elaborate more.
I think there is no way to avoid versioning and distributing your shared libs as separate artifacts, but Maven helps you a lot with that!
And you can always do tricks to get it all opened in one solution :-)
A sample workflow:
Dev 1 builds A locally using Maven and checks in the sources.
The build server builds A and deploys so-called SNAPSHOT versions to a repository manager (e.g. Nexus).
Dev 2 loads B; NPanday automatically resolves the A libs from the repository manager (no need to get the source and build it).
Dev 1 wants to release A: Maven Release creates a branch or a tag with your source, finalizes the version (removing -SNAPSHOT) and deploys the artifacts to the repository manager.
Dev 2 can now upgrade B to use the final release of A (change an entry in the XML, or use the VS add-in to do so).
Now Dev 2 can release B, again with automatic creation of a tag or branch and deployment of the built artifacts.
If you want to provide zipped packages as output from your build, the Maven Assembly Plugin will help you do that.
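To make the "NPanday resolves the A libs" step concrete, B's pom.xml just declares A as a dependency by version. A rough sketch with made-up coordinates (the .NET-specific packaging details NPanday adds are omitted here):

<dependencies>
  <dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>A</artifactId>
    <!-- a SNAPSHOT while A is in development; the final version once Dev 1 releases -->
    <version>1.0.0-SNAPSHOT</version>
  </dependency>
</dependencies>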
You can use Apache Ivy in standalone mode:
http://ant.apache.org/ivy/history/latest-milestone/standalone.html
I need to emphasize "standalone" mode. If you google for examples, you will find a lot of non-standalone ones.
Basically, IVY works on this premise.
You publish binaries (or any kind of file, but I'll say binaries from this point forward) as little binary packages.
Below is PSEUDO code, do not rely on my memory.
java.exe ivy.jar -publish MyBinaryPackageOne.xml --revision 1.2.3.4 (where the .xml lists the N files that make up the one package)
"Package" simply means a group of files. You can include .dll and .xml and .pdb files in a package (what I do with a DotNet build of assemblies). Or whatever. IVY is file-type agnostic. If you want to put WordDocs up there you could, but sharepoint is better for documents.
As you make bug fixes to your code, you increment the revision.
java.exe ivy.jar -publish MyBinaryPackageOne.xml --revision 1.2.3.5
then later you can retrieve from IVY what you want.
java.exe ivy.jar -retrieve PackagesINeed.xml
PackagesINeed.xml would contain information about the packages you want.
something like
"I want version '1.2+ of the MyBinaryPackageOne"
(defined in xml)
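Concretely, that retrieve configuration is an Ivy file; a minimal sketch with made-up organisation and module names:

<ivy-module version="2.0">
  <info organisation="mycompany" module="MyConsumingApp"/>
  <dependencies>
    <!-- "1.2.+" asks Ivy for the latest published revision starting with 1.2 -->
    <dependency org="mycompany" name="MyBinaryPackageOne" rev="1.2.+"/>
  </dependencies>
</ivy-module>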
As you build your framework binaries...you PUBLISH to IVY.
Then, as you develop and build your code...you RETRIEVE from IVY.
In a NUTSHELL, IVY is a repository for FILES (not source code).
Ivy then becomes the definitive source of your binaries.
None of the "Hey, Developer-Joe has the binaries we need" kind of bull-mess.
.......
Advantages:
1. You do NOT keep your binaries in source control. (and thus do not BLOAT your source control).
2. You have ONE definitive source for binaries.
3. Through xml configuration, you say which versions you need for a library.
(In the example above, if version 2 (2.0.0.0) of MyBinaryPackageOne is published to IVY, let's assume with breaking changes from 1.2.x.y, then you are still OK, because you declared in your retrieve configuration (the xml file) that you only want "1.2+". Your project will thus ignore anything 2+ unless you change that configuration.)
Advanced:
If you have a build machine (CruiseControl.NET for example)....you can write logic to publish your (newly built) binaries to IVY after each build.
(Which is what I do).
I use the SVN revision as the last number in the build number.
If my SVN revision was "3333", then I would run something like this:
java.exe ivy.jar -publish MyBinaryPackageOne.xml --revision 1.2.3.3333
Thus whenever I retrieve the package for revision "1.2.3+", I'll get the latest build.
In this case, I would get version 1.2.3.3333 of the package.
It's sad that IVY was started in 2005 (well, that's the good news)...but that NUGET didn't come out til 2010? (2011?)
Microsoft was 5-6 years behind on this one, IMHO.
I would never go back to putting binaries in source control.
IVY is very good. It is time proven. It solves the problem of DEPENDENCY management.
Does it take a little bit of time to get comfortable with it?
Yep.
But it is worth it in the end.
My 2 cents.
.................
But idea #2 is:
Learn how to use NUGET with a local (as in, local to your company) repository.
That is about the same thing as IVY.
But having looked at NUGET, I still like IVY.

How to publish a beta version of a ClickOnce application?

I want to publish a beta version of my application every time it builds, so users can access the "beta" version and test features out before a general release.
I tried doing this by overriding the ProductName to [product]-beta when publishing. The problem is that the publish process still creates a [product].application, and the ClickOnce magic doesn't seem to know the difference between a [product].application at one URL and a [product].application at another.
Any idea of how I would get around this?
I ran into a very similar problem and here is the solution I came up with.
I put all of my GUI forms into a DLL including the main startup form. I then created 2 EXE projects which reference my GUI dll. One has the name Product and the other ProductBeta.
The code in the EXE is virtually the same between both of them. Namely Application.Run(new MainForm()).
I then set them to publish to sub-directories on the same share.
It's annoying and has a bit of overhead but the results work very well.
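A sketch of what each EXE's entry point looks like under that setup; the namespace and form name come from whatever your GUI DLL exposes (MainForm here, per the answer), and only the project/assembly name differs between Product and ProductBeta:

using System;
using System.Windows.Forms;
using MyCompany.Gui;   // hypothetical namespace of the shared GUI DLL that contains MainForm

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        Application.Run(new MainForm());   // identical in Product.exe and ProductBeta.exe
    }
}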
As you've discovered, modifying the product name isn't sufficient. You need to modify the assembly name.
Details from http://weblogs.asp.net/sweinstein/archive/2008/08/24/top-5-secrets-of-net-desktop-deployment-wizards.aspx
The most important thing is having support for multiple environments - this isn't built in, and if you attempt to deploy two different ClickOnce builds with the same deployment name to different sites, the latest build will take precedence and effectively overwrite the existing deployment on the desktop.
The fix for this is relatively straightforward - you need to provide a different deployment name for each build. Like so -
<MSBuild
    Projects="ClickOnce.csproj"
    Targets="Publish"
    Properties="
        MinimumRequiredVersion=$(MinimumRequiredVersion);
        ApplicationVersion=$(ApplicationVersion);
        ApplicationRevision=$(ApplicationRevision);
        CodeBranch=$(CodeBranch);
        DeployEnv=$(DeployEnv);
        AssemblyName=ClickOnce.$(DeployEnv);
        PublishUrl=$(PublishUrl);
        ProductName=ClickOnce $(CodeBranch) $(DeployEnv)" />
The one limitation of this approach is that project references will no longer work. Use file-based assembly refs, and it'll be fine.
