For our new project we decided to use .NET 5 for compatibility, and we chose to split the project into different libraries to make it easier to maintain.
The problems appear now that we are trying to put them together.
We would like to have a hierarchy of libraries where the middle-level libraries reference the low-level ones, and the main project references the middle-level libraries and automatically gains the references to the low-level ones. But this is not working as expected.
We started with direct references to the .dll files, differentiated by Debug and Release configuration for debugging, as follows:
<ItemGroup Condition=" '$(Configuration)' == 'Debug' ">
<Reference Include="Model.Core">
<HintPath>..\..\Model.Core\Model.Core\bin\Debug\net5.0\Model.Core.dll</HintPath>
</Reference>
<Reference Include="Services.Logging">
<HintPath>..\..\..\Services\Services.Logging\Services.Logging\bin\Debug\net5.0\Services.Logging.dll</HintPath>
</Reference>
</ItemGroup>
<ItemGroup Condition=" '$(Configuration)' == 'Release' ">
<Reference Include="Model.Core">
<HintPath>..\..\Model.Core\Model.Core\bin\Release\net5.0\Model.Core.dll</HintPath>
</Reference>
<Reference Include="Services.Logging">
<HintPath>..\..\..\Services\Services.Logging\Services.Logging\bin\Release\net5.0\Services.Logging.dll</HintPath>
</Reference>
</ItemGroup>
The problem with this approach is that indirect references are not resolved (and they are not included inside the middle library), so when running a program we get a FileNotFoundException about the low-level library.
Low-level libraries: Logger (a wrapper for the Serilog NuGet package), Model.Core
Middle-level library: Model.FileSystem
Top level: the running program
This is an example call chain:
Running program -> Model.FileSystem.ReadConfiguration() -> Logger.Log()
At this point we get the FileNotFoundException.
I know we can solve the problem by manually copying the necessary DLLs directly into the bin folder of the running program, but we would like to have that done automatically.
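For example, something like the following in the running program's project file could automate that copy. This is only a sketch: the path to Logger.dll is a guess and depends on the actual folder layout, and each missing DLL needs its own item per configuration.
<ItemGroup Condition=" '$(Configuration)' == 'Debug' ">
  <!-- hypothetical path to the low-level library's Debug output -->
  <None Include="..\..\..\Logger\Logger\bin\Debug\net5.0\Logger.dll"
        Link="Logger.dll"
        CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>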
An alternative would be to pack all the libraries into NuGet packages and then reference those. That way references are resolved automatically, but it seems over-complicated for our needs. I'm surely missing something, but for now this seems to be the procedure, and it's quite painful while debugging:
1. Create the package with the Debug configuration
2. Move the package to the local or private repository
3. Compile the project referencing the package
4. Debug
5. Once debugging ends, repeat steps 1, 2 and 3 with the Release configuration
IMHO, this way there are too many manual steps needed each time we change something, and it could lead to problems mixing Debug and Release packages if we forget to update the package (which can easily happen when compiling many, many times).
I liked the direct-reference approach, where you can target the Release or Debug version of the same library according to the main project's configuration, but it doesn't work.
Sorry for being so verbose, but it's quite complicated to explain this problem and I'm not sure I was clear. I have also asked this question here: How to reference private library in debug/release mode.
Thanks in advance to anyone who tries to help.
I have chosen to include in each solution the projects needed as references. This way there are no missing libraries at runtime.
Not really what I wanted, but it's the easiest way.
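For reference, a minimal sketch of what that looks like in the consuming project files (the .csproj paths follow the layout above and are assumptions):
<ItemGroup>
  <!-- project references follow the active Debug/Release configuration automatically -->
  <ProjectReference Include="..\..\Model.Core\Model.Core\Model.Core.csproj" />
  <ProjectReference Include="..\..\..\Services\Services.Logging\Services.Logging\Services.Logging.csproj" />
</ItemGroup>
With project references, the indirect outputs (the low-level libraries and their NuGet dependencies such as Serilog) should also be copied into the running program's bin folder, so the FileNotFoundException goes away.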
Related
I have a library project that extends some functionality on EntityFrameworkCore. I'm looking to support both 2.* and 3.*. My project is set up like so:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFrameworks>netstandard2.0;netcoreapp3.0</TargetFrameworks>
[...]
</PropertyGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netcoreapp3.0' ">
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="3.0.0" />
</ItemGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="2.2.6" />
</ItemGroup>
[...]
</Project>
In the code I'm using the function EntityTypeExtensions.FindProperty(...). The signature of this function changes between 2.2.6 and 3.0.0.
The project's code (incorrectly?) uses the signature for 2.2.6. This compiles properly (which shouldn't be the case?) in both target frameworks.
I have a unit test project that multi-targets and has conditional references, much like the original project:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFrameworks>netcoreapp3.0;netcoreapp2.0</TargetFrameworks>
[...]
</PropertyGroup>
[...]
<ItemGroup Condition=" '$(TargetFramework)' == 'netcoreapp2.0' ">
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="2.2.6" />
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="2.2.6" />
</ItemGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netcoreapp3.0' ">
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="3.0.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="3.0.0" />
</ItemGroup>
[...]
</Project>
All unit tests (incorrectly?) pass in both target frameworks.
Note that even though it builds and the tests pass, when the library is used in a netcore3 project (which references efcore 3.0.0 directly) it throws the following exception, which seems completely reasonable; I just don't understand why it allowed me to get to this point.
System.MissingMethodException: Method not found: 'Microsoft.EntityFrameworkCore.Metadata.IProperty Microsoft.EntityFrameworkCore.EntityTypeExtensions.FindProperty(Microsoft.EntityFrameworkCore.Metadata.IEntityType, System.Reflection.PropertyInfo)'.
Questions:
Is there a way around this so it gets picked up as an error/warning/something, at least, during the build?
Is the solution to this to use preprocessor directives around the call to .FindProperty(...) and based on the framework make the correct method call? Isn't there a way to do this based on the version of efcore instead of the dependency?
Is there a way to unit test this properly with the different packages? Right now as it is, I expected the unit tests to fail in one of the versions since the method does not exist.
Source repository and specifically the call to FindProperty can be found here.
Sample netcore3 project that results in a MissingMethodException when calling the library can be found here.
Stack trace of the exception can be found here.
I have good news and bad news. The good news is that the problem is with your package, and everything works just how you appear to believe it should work. The bad news is I don't know how your package got incorrectly authored.
Steps to verify:
1. Download Panner.Order version 1.1.0 from nuget.org (you've published 1.1.1 since asking this question; it has the same kind of problem, just mirrored).
2. If you have NuGet Package Explorer installed, open the nupkg with it, expand the lib/ folder and double-click each of the .dll files. Alternatively, extract the nupkg as a zip file and use ILSpy, ILDasm, or whatever else you like to inspect the assemblies.
3. Notice that both the netstandard2.0 and netcoreapp3.0 assemblies have the same assembly references. In particular, the Microsoft.EntityFrameworkCore.dll reference is for version 2.2.6.0, even though we'd expect the netcoreapp3.0 assembly to use version 3.0.0.0.
Therefore I conclude that your netstandard2.0 assembly was copied incorrectly into the netcoreapp3.0 folder of your package. Your 1.1.1 package has the opposite problem: both the netstandard2.0 and netcoreapp3.0 folders contain the netcoreapp3.0 assembly, so your package doesn't work with projects that try to use the netstandard2.0 assembly.
However, I have no idea why this happens. When I clone your repo and run dotnet pack and check the generated nupkg, I can see that the netstandard2.0 and netcoreapp3.0 assemblies have different references, so I'm confident that the package I generated locally should work. You need to investigate why the packages you publish are not being generated correctly.
To quickly answer your questions:
Is there a way around this so it gets picked up as an error/warning/something, at least, during the build?
It will, as the problem was not with the project, but with the package. If you multi-target your project and call an API that does not exist in at least one of the TFMs, you will get a compile error.
Is the solution to this to use preprocessor directives around the call to .FindProperty(...) and based on the framework make the correct method call? Isn't there a way to do this based on the version of efcore instead of the dependency?
When you call APIs that are different in different TFMs, yes, you can use #if to change your code per project TFM, as described in ASP.NET Core's docs when migrating to 3.0.
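For illustration, a minimal sketch of that pattern (the class and strings are invented; NETCOREAPP3_0 and NETSTANDARD2_0 are the symbols the SDK defines automatically for those TargetFrameworks):
// Sketch only: branch per target framework using the SDK-defined symbols.
public static class EfCoreSurface
{
    public static string Describe()
    {
#if NETCOREAPP3_0
        return "built against the netcoreapp3.0 / EF Core 3.0 API surface";
#elif NETSTANDARD2_0
        return "built against the netstandard2.0 / EF Core 2.2 API surface";
#else
        return "unknown target";
#endif
    }
}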
I'm going to ignore the "based on the version of efcore" part, because I'm a detail-oriented person and I don't want to write a thousand words about something that ultimately doesn't matter. The key is that in this scenario you don't need to. You used conditions on your package references to bring in a different version of efcore per project TFM, so each time your project gets compiled it's using a different version of efcore, but only one version per compile target. Therefore you don't need runtime selection of different versions of efcore.
Is there a way to unit test this properly with the different packages? Right now as it is, I expected the unit tests to fail in one of the versions since the method does not exist.
You multi-target your test project, but I see you've done that already. Since you're using a project reference, the test won't detect package authoring problems like what's happening.
If you really want to test the package, rather than your code, you could use a nuget.config file to add a local folder as a package source, and have your multi-targeting test project reference the package rather than the project. You'd probably also want to use the nuget.config file to set the globalPackagesFolder to something that's in .gitignore, because NuGet considers packages to be immutable: if a debug version of your package gets into your user-profile global packages folder, every project you use on that machine (that uses your user-profile global packages folder) will use that debug version, making it more difficult for you to make updates. For anyone who wants to test packages, rather than projects, I highly recommend using SemVer2 pre-release labels and creating a unique package version for every single build, to reduce the risk of testing a different version than you intend.
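A sketch of such a nuget.config (the folder names are just examples):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- keep test restores out of the user-profile global packages folder -->
    <add key="globalPackagesFolder" value="packages-local" />
  </config>
  <packageSources>
    <!-- local folder you copy freshly packed .nupkg files into -->
    <add key="local-packages" value="artifacts" />
  </packageSources>
</configuration>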
Using package reference rather than project reference is a pain, because it's no longer as simple as writing code and then running the test. You'll need to change code, compile the project that gets generated into a package, copy the package into the package source folder if you haven't automated that, update the package version in your test project, then compile and run the test project. I think you're better off keeping the project reference. Fix the package authoring problem and then trust the tooling works.
Rather than answering all the questions above one by one, I'll just describe the cause of the original issue and give some suggestions.
In the code I'm using the function EntityTypeExtensions.FindProperty(...). The signature of this function changes between 2.2.6 and 3.0.0.
According to your description, I assume you may use code like EntityTypeExtensions.FindProperty(entityType, propertyInfo); in your original project.
For Microsoft.EntityFrameworkCore 2.2:
FindProperty(this Microsoft.EntityFrameworkCore.Metadata.IEntityType entityType, System.Reflection.PropertyInfo propertyInfo); the second parameter is a PropertyInfo.
For Microsoft.EntityFrameworkCore 3.0:
FindProperty(this Microsoft.EntityFrameworkCore.Metadata.IEntityType entityType, System.Reflection.MemberInfo memberInfo); the second parameter is a MemberInfo.
However, if you check the PropertyInfo class, you'll find:
Inheritance: Object -> MemberInfo -> PropertyInfo
And I think that's the reason why the project's code uses the signature for 2.2.6 yet compiles properly in both target frameworks. It's also the cause of the other strange behaviors you ran into after that...
So for this issue, you could use the 3.0.0 signature (MemberInfo) in code instead of the 2.2.6 one (PropertyInfo) to test this. I think the build will then fail as you expected. And as Heretic suggests in a comment, for a multi-target project, using #if is a good choice.
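A small sketch of that test (the helper class is invented for illustration):
// Sketch: bind explicitly to the EF Core 3.0 shape (MemberInfo).
// Against the 2.2.6 package this should no longer compile, surfacing the difference at build time.
using System.Reflection;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata;

internal static class FindPropertyProbe
{
    public static IProperty Probe(IEntityType entityType, PropertyInfo propertyInfo)
    {
        MemberInfo member = propertyInfo;        // upcast: PropertyInfo derives from MemberInfo
        return entityType.FindProperty(member);  // binds to FindProperty(IEntityType, MemberInfo) in 3.0
    }
}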
Hope the above is of some help, and if I have misunderstood anything, please feel free to correct me :)
I created a C# application (MyAppV1) that requires a third-party API. My application needs to work with multiple versions of this API, but only a single version at a time. I have set up my solution to change the reference and using statement for different build configurations, and I create multiple executable files that each target a different API version.
Presently I have this situation:
MyAppV1_ThirdPartyV1.exe uses ThirdPartyV1.dll
MyAppV1_ThirdPartyV2.exe uses ThirdPartyV2.dll
MyAppV1_ThirdPartyV2_5.exe uses ThirdPartyV2.dll (they didn't change the library name for the minor version of their software)
MyAppV1_ThirdPartyV3.exe uses ThirdPartyV3.dll
I would like to be able to maintain a list of the versions, perhaps in an App.config, and load the appropriate DLL at runtime. I'm having trouble knowing where to begin with this. Is this an appropriate strategy? I'm not sure how best to handle this situation. Having multiple versions of my application that only differ in the referenced library seems very clunky to me.
Much of the information I find is related to supporting multiple frameworks, handling the requirement of two versions of the same library downstream at the same time, or needing to load both at the same time. I can't find information on how to handle my particular situation.
This is possible at the project level. You can build different configurations in the solution, and when you add references as below, the build will pick up the desired DLLs:
<Choose>
<When Condition="'$(Configuration)|$(Platform)'=='YourSpecialConfiguration1|x64'"><!-- attention here -->
<ItemGroup>
<Reference Include="your.dllv1.name">
<HintPath>yourDllPath_v1\your.dllv1.dll</HintPath><!-- attention here -->
<Private>true</Private>
</Reference>
<!-- more references here -->
</ItemGroup>
</When>
<When Condition="'$(Configuration)|$(Platform)'=='YourSpecialConfiguration2|x64'"><!-- attention here -->
<ItemGroup>
<Reference Include="your.dllv2.name">
<HintPath>yourDllPath_v2\your.dllv2.dll</HintPath><!-- attention here -->
<Private>true</Private>
</Reference>
<!-- more references here -->
</ItemGroup>
</When>
<Otherwise>
<ItemGroup>
<Reference Include="your.dllname">
<HintPath>yourRegularPath\your.dllname.dll</HintPath><!-- attention here -->
<Private>true</Private>
</Reference>
<!-- AND more references here -->
</ItemGroup>
</Otherwise>
</Choose>
What you see above is option 1.
Option 2: a different project for each version. Downside: if you add a file or a reference, you need to add it to each project.
Option 3: add all references, but declare a different namespace alias (in the reference properties window) for each. Then in code, use conditional compilation like:
ISomething myVar;
#if V1
myVar = new namespace1.ClassX();
#elif V2
myVar = new namespace2.ClassX();
#else
. . . .
#endif
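For option 3, a sketch of the project-file side (configuration names, paths and alias values are illustrative):
<PropertyGroup Condition="'$(Configuration)'=='V1Build'">
  <DefineConstants>$(DefineConstants);V1</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)'=='V2Build'">
  <DefineConstants>$(DefineConstants);V2</DefineConstants>
</PropertyGroup>
<ItemGroup>
  <Reference Include="ThirdPartyV1">
    <HintPath>libs\v1\ThirdPartyV1.dll</HintPath>
    <Aliases>namespace1</Aliases>
  </Reference>
  <Reference Include="ThirdPartyV2">
    <HintPath>libs\v2\ThirdPartyV2.dll</HintPath>
    <Aliases>namespace2</Aliases>
  </Reference>
</ItemGroup>
In C#, aliased references are then brought into a file with extern alias namespace1; before they can be used.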
And lastly:
"I would like to be able to maintain a list of the versions, perhaps in an App.config and load the appropriate dll library at runtime."
- you probably don't need any of these. You just need to produce your packages with different versions. Loading at runtime will require more coding work while still shipping all the DLLs, because you don't know which one you are going to load next time.
I have a solution containing several projects. Let's say PackageA and PackageB, where PackageB depends on PackageA with a ProjectReference.
Each project is set to also output a NuGet package on build. This process itself works perfectly but I am unable to specify a package version-range for individual builds.
E.g. I'd like to restrict PackageB to only refer to PackageA version 1.0.* (patch steps).
<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
<PropertyGroup>
<TargetFrameworks>netstandard2.0;netcoreapp2.0;net46</TargetFrameworks>
<RootNamespace>PackageB</RootNamespace>
<Company>MyCompany</Company>
<Authors>John Doe</Authors>
<Description>This package depends on a specific version of PackageA.</Description>
<Version>1.1.0</Version>
<Copyright>Copyright © 2018 John Doe</Copyright>
<GeneratePackageOnBuild>true</GeneratePackageOnBuild>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\PackageA\PackageA.csproj" />
</ItemGroup>
</Project>
MSBuild seems to ignore any Version="1.0.*" or AllowVersion="1.0.*" arguments within the ProjectReference tag.
Is there a possibility to specify a version range without breaking the ProjectReference or using PackageReference?
Short Answer
No, there is no way to limit a project reference by a version attribute of that project.
Longer Answer
If you want your dependent package to vary independently from its dependency and limit the range of changes it will depend upon, you are very much in need of using a package reference rather than a project reference (yes, even if those projects are in the same solution).
Project Reference Now
When you reference a project, you're declaring to your IDE that you want to include the referenced project's design-time state in your dependent project's design-time state, so that you can use it and see changes to it in your IDE before it's built. When your dependent project is built, its dependency is built too. So a project reference is always a latest-version reference. You cannot reference a previous version of a project, but you can reference the versioned result of a project that was built previously.
Packing a Project Reference
In line with project references being built when the dependent project is built, when you pack a project that depends on another project via a project reference, dotnet pack and nuget pack assume that you're also going to pack each of those projects as packages, and they write the project reference as a package dependency at the same version as the dependent project's package. So, if you pack projB @ v1.2.3, the package will have a dependency reference to projA @ v1.2.3. If you don't pack projA @ v1.2.3, or you don't publish that package (because maybe there weren't any changes to it), consumers of projB @ v1.2.3 will fail the install because NuGet won't find projA @ v1.2.3. If you're going to insist on using project references for packages, those referenced projects should also be packages that are versioned with their host (whether they change or not).
A minor exception to the above reference rule
The exception to project references being listed as package dependencies at the same version as the host is a project reference that has its assets marked as private. In that situation you either need to create a build target that includes those assets in the package, or have some other convention in place to deliver the dependency to the runtime. Using the private-assets route does not allow you to do what you're asking, but it is an exception to the rule of a project reference becoming a LISTED dependency of your package.
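For example, a reference like the following (sketch; the path is illustrative) is built with the host but not written into the generated .nuspec as a dependency:
<ItemGroup>
  <ProjectReference Include="..\PackageA\PackageA.csproj" PrivateAssets="all" />
</ItemGroup>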
Existing NuGet targets don't support this directly. A couple of issues on GitHub (1, 2) requesting this functionality have been open for years. However, with a bit of MSBuild item trickery, I was able to 'extend' ProjectReference with two attributes, PackageVersion and ExactVersion:
<!-- MyProject.csproj -->
<Project Sdk="Microsoft.NET.Sdk" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
...
<ItemGroup>
<ProjectReference Include="..\MyProject1\MyProject1.csproj" PackageVersion="[1.1.0, 2.0.0)" />
<ProjectReference Include="..\MyProject2\MyProject2.csproj" ExactVersion="true" />
<ProjectReference Include="..\MyProject3\MyProject3.csproj" />
</ItemGroup>
...
<Target Name="UseExplicitPackageVersions" BeforeTargets="GenerateNuspec">
<ItemGroup>
<_ProjectReferenceWithExplicitPackageVersion Include="@(ProjectReference->'%(FullPath)')"
                                             Condition="'%(ProjectReference.PackageVersion)' != ''" />
<_ProjectReferenceWithExactPackageVersion Include="@(ProjectReference->'%(FullPath)')"
                                          Condition="'%(ProjectReference.ExactVersion)' == 'true'" />
<_ProjectReferenceWithReassignedVersion Include="@(_ProjectReferencesWithVersions)"
                                        Condition="'%(Identity)' != '' And '@(_ProjectReferencesWithVersions)' == '@(_ProjectReferenceWithExplicitPackageVersion)'">
  <ProjectVersion>@(_ProjectReferenceWithExplicitPackageVersion->'%(PackageVersion)')</ProjectVersion>
</_ProjectReferenceWithReassignedVersion>
<_ProjectReferenceWithReassignedVersion Include="@(_ProjectReferencesWithVersions)"
                                        Condition="'%(Identity)' != '' And '@(_ProjectReferencesWithVersions)' == '@(_ProjectReferenceWithExactPackageVersion)'">
  <ProjectVersion>[@(_ProjectReferencesWithVersions->'%(ProjectVersion)')]</ProjectVersion>
</_ProjectReferenceWithReassignedVersion>
<_ProjectReferencesWithVersions Remove="@(_ProjectReferenceWithReassignedVersion)" />
<_ProjectReferencesWithVersions Include="@(_ProjectReferenceWithReassignedVersion)" />
</ItemGroup>
</Target>
...
</Project>
Given package versions specified in other projects like this
<!-- ..\MyProject1\MyProject1.csproj -->
<!-- ..\MyProject2\MyProject2.csproj -->
<!-- ..\MyProject3\MyProject3.csproj -->
<Project Sdk="Microsoft.NET.Sdk" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Version>1.1.3</Version>
</PropertyGroup>
...
</Project>
the generated MyProject.nuspec file will contain the following dependencies:
<?xml version="1.0" encoding="utf-8" ?>
<package>
<metadata>
<dependencies>
<group targetFramework="...">
<dependency id="MyProject1" version="[1.1.0, 2.0.0)" />
<dependency id="MyProject2" version="[1.1.3]" />
<dependency id="MyProject3" version="1.1.3" />
</group>
</dependencies>
</metadata>
</package>
This useful target can be put into Directory.Build.targets to cover all projects in your solution.
As far as I know it's not possible with ProjectReference; however, there are some open issues on this topic on GitHub, so they might implement it at some point.
For now this functionality is only available on PackageReference. Docs.
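For comparison, a sketch of what a version range looks like on a PackageReference (the package name is just an example; floating versions such as 1.0.* also work):
<ItemGroup>
  <PackageReference Include="PackageA" Version="[1.0.0, 1.1.0)" />
</ItemGroup>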
Well, let's think that through, shall we? The project may have a version number embedded in it somewhere, but it's likely to be the latest or a previous version, which might not even build, and there's no guarantee that a subsequent build step won't update that value. The point at which a build system produces a versioned artifact is near the end of the build, usually the last step, which is normally the packaging or publishing step.
If your project must limit version ranges for any of its dependencies, it should take dependencies on other packages, not the projects that build them. This provides a natural asynchronous set of workflows to feed into a single product.
If you want the convenience of having dependencies built to their latest, then you must keep all the projects in sync with each other wrt compatibility. Project dependencies really only make sense for developer builds, not CI builds.
One thing you should never do is produce two different packages with the same version number. Visual Studio projects are broken by design in the area of versioning, as they default to using a static version string that must be set prior to the build. If you happen to forget to bump that number, you will violate this semantic-versioning rule.
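One common mitigation (a sketch; BUILD_ID stands in for whatever variable your CI system exposes) is to let the build stamp a unique pre-release version instead of relying on a hand-edited value:
<PropertyGroup>
  <VersionPrefix>1.2.3</VersionPrefix>
  <!-- produces e.g. 1.2.3-ci.42 when the CI variable is present -->
  <VersionSuffix Condition="'$(BUILD_ID)' != ''">ci.$(BUILD_ID)</VersionSuffix>
</PropertyGroup>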
Even if the NuGet/VS devs give you what you are asking for, it's not a good solution. What if the currently checked-out project is for a version outside of the specified range? Assuming the devs can figure out what code to check out of revision control, is that really what you want to happen on your dev box? Any solution they come up with will be complex and prone to errors. Even if you've got the right version checked out, NuGet can't know you didn't make a breaking change to it.
It's better to run independent pipelines of code, review, build, package, test and publish, using only published packages as dependencies.
Are you basing your question on how NodeJS versioning works (^ and ~)? In .NET that's not possible, and not necessary.
NodeJS needs this because, you know, it's JavaScript. Since JavaScript doesn't have strict type checking, you need some way of verifying whether packages are compatible with each other. Some properties and methods might or might not exist on certain objects, so the only way the build system (node) can verify this is through the package version selectors.
As I said, in .NET we don't need this, because it's a strictly typed language. If a field, property or method doesn't exist on a class, the project simply won't build.
I have a Visual Studio C# solution which consists of some projects. One of the projects needs to reference another project which is not part of the solution.
At the beginning I was referencing dlls:
<ItemGroup>
<Reference Include="ExternalProj1, Version=1.0.0.0, Culture=neutral, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\Proj1\ExternalProj1.dll</HintPath>
</Reference>
</ItemGroup>
However, I must reference the projects so that they will generate their DLLs. In fact, if I reference the DLLs and they have not been created yet, I need to build those projects separately.
So I switched to referencing the projects:
<ItemGroup>
<ProjectReference Include="..\..\Proj1\ExternalProj1">
<Project>{3341b552-a569-4313-aabc-34452fff60ac}</Project>
<Name>ExternalProj1</Name>
</ProjectReference>
</ItemGroup>
However, when building, the compiler cannot find those assemblies. The strange thing is that the build is reported as completed successfully, but the error window reports one warning:
The referenced component ExternalProj could not be found.
So, what am I doing wrong? Thank you
I see you are using ProjectReference, which is what I'm familiar with in plain (non-.NET) C++ projects. The Include attribute needs to name the project file, not just the base name; e.g.
<ProjectReference Include="..\..\Proj1\ExternalProj1.vcxproj">
That is, ProjectReference is not Reference. See Common MSBuild Project Items
Also, the metadata that determines whether to link the LIB automatically is determined via the supplied props files if it is not specified for that item. Is a managed project even producing a LIB? So this should (with the filename corrected) cause the nominated project to be built as a dependency as well; doing something with its products is another issue altogether.
Try building from the MSBuild.exe command line, not the IDE, to see the pure behavior before the IDE messes things up or adds more issues to figure out. And feed it the specific proj file you want, not the "solution" file. The .sln file is a strange beast: not only is it possible to have project references that are not present in the sln, MSBuild has no inherent concept of a sln file at all. Other than being a list of projects to show in the IDE, it is a magic file converted into a master proj on the fly that lets you name various targets individually without having to know which proj file (or the path to it) builds them, which is handy enough, but it is mainly there for compatibility with VSBuild according to The Books. So avoid it, at least to simplify things while you explore the behavior you want. Then add any complications back in if you still want them :)
I'm working with a large (270+ project) VS.NET solution. Yes, I know this is pushing the friendship with VS, but it's inherited and blah blah. Anyway, to speed up solution load and compile times I've removed all the projects that I'm not currently working on... which in turn has removed those project references from the projects I want to retain. So now I'm going through a mind-numbing process of adding binary references to the retained projects so that the referenced types can be found.
Here's how I'm working at present:
1. Attempt to compile; get thousands of 'type or namespace missing' errors
2. Copy the first line of the error list to the clipboard
3. Using a Perl script hooked up to a hotkey (AHK), extract the type name from the error message and store it in the Windows clipboard
4. Paste the type name into the Source Insight symbol browser and note the assembly containing the type
5. Go back to VS and add that assembly as a binary reference to the relevant project
So now, after about 30 mins I'm thinking there's just got to be a quicker way...
These solutions come to my mind:
You can try to use Dependency Walker or a similar program to analyze the dependencies.
Parse the MSBuild files (*.csproj) to get the list of dependencies.
EDIT:
Just found two cool tools on CodePlex, Dependency Visualizer and Dependency Finder. I think they can help you greatly.
EDIT:
@edg, I totally misread your question. Since you lose the references from the csproj files, you have to use a static analysis tool like NDepend, or try to analyze the dependencies at run time.
No, there currently isn't a built-in quicker way.
I would suggest not modifying the existing solution, and instead creating a new solution with new projects that duplicate (e.g. rename and edit) the projects you want to work on. If you find that the solution with hundreds of projects is an issue for you, then you'll likely just need to work on a subset. Start with a couple of new projects, add the binary (not project) references and go from there.
One thing you can try is opening up the old .csproj file in notepad and replacing the ProjectReference tags with Reference tags. If you can write a parser, feel free to share. :)
Entry in the .csproj file if it is a project reference:
<ItemGroup>
<ProjectReference Include="..\WindowsApplication2\WindowsApplication2.csproj">
<Project>{7CE93073-D1E3-49B0-949E-89C73F3EC282}</Project>
<Name>WindowsApplication2</Name>
</ProjectReference>
</ItemGroup>
Entry in the .csproj file if it is an assembly reference:
<ItemGroup>
<Reference Include="WindowsApplication2, Version=1.0.0.0, Culture=neutral, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<ExecutableExtension>.dll</ExecutableExtension>
<HintPath>..\WindowsApplication2\bin\Release\WindowsApplication2.dll</HintPath>
</Reference>
</ItemGroup>
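For what it's worth, here is a rough sketch of the kind of converter hinted at above: it rewrites ProjectReference items as Reference items with a HintPath. The assumed output layout (bin\Release\Name.dll) and the use of the Name element are guesses; adjust for your tree before trusting it.
// Rough sketch: convert ProjectReference entries in an old-style csproj into binary references.
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class ProjectRefToBinaryRef
{
    static void Main(string[] args)
    {
        var csprojPath = args[0];
        XNamespace ns = "http://schemas.microsoft.com/developer/msbuild/2003";
        var doc = XDocument.Load(csprojPath);

        foreach (var projRef in doc.Descendants(ns + "ProjectReference").ToList())
        {
            var include = projRef.Attribute("Include")?.Value ?? "";
            var name = projRef.Element(ns + "Name")?.Value
                       ?? Path.GetFileNameWithoutExtension(include);
            var refDir = Path.GetDirectoryName(include) ?? "";

            // Assumed output layout: <referenced project dir>\bin\Release\<Name>.dll
            var hintPath = Path.Combine(refDir, "bin", "Release", name + ".dll");

            var reference = new XElement(ns + "Reference",
                new XAttribute("Include", name),
                new XElement(ns + "SpecificVersion", "False"),
                new XElement(ns + "HintPath", hintPath));

            projRef.ReplaceWith(reference);
        }

        doc.Save(csprojPath);
        Console.WriteLine($"Rewrote project references in {csprojPath}");
    }
}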
Instead of removing the project files from the solution, you could unload the projects you aren't working on (right-click the project and select Unload Project). As long as the unloaded project has been built once, any other project with a reference to it will be able to find the assembly in the project's output directory and build with it.