I am trying to update a large solution to .NET Framework 4.7.2 with VS2019. One of the problems with this solution is that it is a large plugin type architecture, where (for many reasons) I am not able to recompile and release the plugins to production with the updated shared set of libraries that the solution provides.
Normally this is OK, but we have to be very careful to ensure full backwards binary compatibility. When we recently upgraded to .NET 4.7.2, we started getting ambiguous-call conflicts between System.Linq and MoreLinq, generally on the .ToHashSet() extension that we commonly use. The problem is outlined somewhat on MoreLinq's GitHub.
I think the only way to correct this is to isolate MoreLinq's usage into a single DLL that I control, and once all plugins reference that DLL, upgrade to 4.7.2 and fix the .ToHashSet() calls in that common location.
Does anyone know of a better/more efficient way to do this, without re-releasing all the plugins at once? Some kind of global redirect that I am not aware of?
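For reference, the conflict looks roughly like this (the two ToHashSet overloads are the real System.Linq and MoreLinq APIs; the surrounding class is just an illustration). Per call site, the usual workaround is to invoke one of the extension methods as a plain static method, but that only helps in code we can actually recompile:

```csharp
using System.Collections.Generic;
using System.Linq;   // .NET Framework 4.7.2 adds Enumerable.ToHashSet
using MoreLinq;      // MoreEnumerable.ToHashSet has existed for much longer

static class HashSetConflictExample
{
    static HashSet<int> Build(IEnumerable<int> source)
    {
        // With both namespaces imported and the project targeting 4.7.2,
        // the extension-method form no longer compiles:
        //   var set = source.ToHashSet();   // error CS0121: ambiguous call
        //
        // Calling one of the static methods directly removes the ambiguity:
        return Enumerable.ToHashSet(source);
    }
}
```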
Evidently this just works. Installing the solution's common DLLs at 4.7.2 alongside older plugins at 4.5.2 worked fine, even though those plugins had compilation errors when compiled in Visual Studio against 4.7.2.
I'd be curious to know what internals are making this succeed. Does the system load multiple versions of .NET framework DLLs for the running process?
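For anyone who wants to poke at that empirically, dumping what is actually loaded into the running process will show whether more than one version of a framework assembly is present. A small diagnostic sketch, nothing project-specific assumed:

```csharp
using System;

static class LoadedAssemblyDump
{
    public static void Dump()
    {
        // Lists every assembly currently loaded into the AppDomain, together
        // with the runtime version recorded in its header.
        foreach (var asm in AppDomain.CurrentDomain.GetAssemblies())
        {
            var name = asm.GetName();
            Console.WriteLine($"{name.Name} {name.Version} (built for CLR {asm.ImageRuntimeVersion})");
        }
    }
}
```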
I'm trying to update an old project to the newer C# and .NET versions. The DLLs are:
-CrystalDecisions.Enterprise.Framework
-CrystalDecisions.Enterprise.InfoStore
-Microsoft.VisualStudio.TeamSystem.Data.UnitTesting
It is the .NET version that must be compatible with any loaded DLLs, and newer Visual Studio versions do support older .NET versions. If you are updating from .NET Framework to .NET Core/5/6 you might just have to test it.
.NET Core/5/6 is not backwards compatible with .NET Framework, since some functionality has been removed. I think you can still try to load a DLL written for .NET Framework, but if any removed functionality is called you will get a runtime failure. So you need to be diligent with testing.
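As a rough sketch of what that failure mode looks like from a .NET 6 host (the file and type names below are made up; only the load/invoke pattern is real):

```csharp
using System;
using System.Reflection;

static class LegacyLoadDemo
{
    public static void Run()
    {
        // Loading an assembly that targets .NET Framework usually succeeds...
        var asm = Assembly.LoadFrom("LegacyLib.dll");        // hypothetical file name
        var type = asm.GetType("LegacyLib.ReportHelper");    // hypothetical type name

        try
        {
            // ...the failure only surfaces when a removed API is actually hit,
            // typically as PlatformNotSupportedException or MissingMethodException.
            type?.GetMethod("Render")?.Invoke(null, null);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Legacy DLL hit removed functionality: " + ex.GetBaseException().Message);
        }
    }
}
```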
I would expect libraries that do things like UI and communication to work less well, and libraries that mostly do things like computation, or wrap native functionality, to work better.
I currently have a utilities library built as .NET Framework 4.6.1 and referenced by various .NET Framework applications.
I now want to create a new .NET Core application and therefore I want to convert the utilities library to .NET Standard 2.0 so that it can be used by applications of both types.
If I simply open the source code for the library, change the target to .NET Standard 2.0 and rebuild it (assuming that it does only use APIs available in .NET Standard), can I just drop the new assembly in to replace the existing one and should the existing applications still work? Or would the applications need to be rebuilt against the new version?
And the more general related question is, what are the differences in the metadata produced for a .NET Standard DLL compared to a .NET Framework one, and how/if do they affect the assembly resolver?
(to pre-empt the comment "why not just try it and see", I want to know if this is a supported scenario, not just whether technically it might work for me)
.NET Standard is a compatible cross-section between (but not limited to) .NET Framework and .NET Core.
There are a lot of things in .NET Framework that just don't make a lot of sense in .NET Core; Windows-specific things, for instance.
However, what you can do is use the .NET Portability Analyzer to work out any glaring compatibility problems:
Want to make your libraries multi-platform? Want to see how much work is required to make your application compatible with other .NET implementations and profiles, including .NET Core, .NET Standard, UWP, and Xamarin for iOS, Android, and Mac? The .NET Portability Analyzer is a tool that provides you with a detailed report on how flexible your program is across .NET implementations by analyzing assemblies. The Portability Analyzer is offered as a Visual Studio Extension and as a console app.
To answer your question
If I simply open the source code for the library, change the target to .NET Standard 2.0 and rebuild it, can I just drop the new assembly in to replace the existing one and should the existing applications still work? Or would the applications need to be rebuilt against the new version?
It needs to be rebuilt as far as I know; try it, see what happens, and let us know.
There are so many things that could make this not work.
And the more general related question is, what are the differences in the metadata produced for a .NET Standard DLL compared to a .NET Framework one, and how/if do they affect the assembly resolver?
Not entirely sure what you mean; however, it will resolve just the same once rebuilt.
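If it helps, one way to see the metadata difference for yourself is to dump what a DLL references: a library retargeted to .NET Standard 2.0 references the netstandard assembly instead of the .NET Framework's mscorlib/System.* set, which is exactly the kind of identity change a drop-in swap would surface. A small inspection sketch (the path argument is whatever your utilities DLL is called):

```csharp
using System;
using System.Reflection;

static class ReferenceDump
{
    public static void Main(string[] args)
    {
        // args[0]: path to the library to inspect, e.g. the retargeted utilities DLL.
        var asm = Assembly.LoadFrom(args[0]);

        Console.WriteLine("Runtime header: " + asm.ImageRuntimeVersion);
        foreach (AssemblyName reference in asm.GetReferencedAssemblies())
            Console.WriteLine("  -> " + reference.FullName);
    }
}
```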
(to pre-empt the comment "why not just try it and see", I want to know if this is a supported scenario, not just whether technically it might work for me)
Just replacing the assembly is not supported as far as I know or could research. However, maybe someone else has more information here.
I'm working on a project, to which I would like to add Prometheus metrics exporting. There's a pretty great library for it over at NuGet that I've used. The new version (2.0.x, prerelease), however, is supposedly built for .NET Standard. That's fine, but my current project is built for the .NET Framework on Windows. When I add this NuGet package, however, I get over 100 assemblies added to the output of my project, including many that I don't think are related.
I added an issue for this to the project, but the project maintainer came to the conclusion that this is normal. I don't agree, however, and I'd like to know if there is a specific suggestion I can make to improve the situation.
What should prometheus-net change to avoid adding all these assemblies to the output of my project?
This is a flaw/bug/unfortunate side effect of the current .NET Standard 2.0 build situation. When a .NET Framework version before 4.7.1 is targeted, the build system cannot be sure that all the dependencies exist, so it copies all of these extra assemblies to the output.
4.7.1 has everything that .NET Standard 2.0 needs, so they will not be included if that framework is targeted (though things might break if an earlier framework is used at runtime).
There is information (and a workaround that seems to work for me) at https://github.com/dotnet/standard/issues/415#issuecomment-314288712
I taught myself coding, and I'm not sure if this is possible.
Nor do I know if what I ask here goes by some name (e.g. "What you ask is called xxxxxx"). I couldn't find anything on this topic (but did find some upgrade articles, which is not exactly what I want), so please excuse me if this sounds like a noob question to hardcore coders; I'm a beginner.
I had a small project that relied on .NET 2.0 because of the inclusion of some external libraries. The software worked well, but now it needs added functionality; things that would be much easier to program under .NET 4.0 or 4.5.
However, that included external library isn't at that .NET level, so now I wonder: can a project have multiple .NET versions?
I'm not sure, but I was also thinking that perhaps I could write my new functionality as a DLL that depends on .NET 4.5, with its public functions in a separate project, then later include that final DLL in my project that depends on .NET 2.0... not sure if this would be the way to go though.
Yes, you can include .NET 2 assemblies into a .NET 4 or .NET 4.5 project (or any version if you will). Including a .NET 4 assembly in a .NET 2 project won't work.
If you want to use new features in a base project, you need to upgrade all projects that rely on it. Make sure that the entire tree supports .NET 4 then (for example Office add-ins, or other related software you might use).
An exception is the use of mixed-mode assemblies (assemblies that use both .NET and unmanaged code), which are more tightly bound to the CLR and might cause problems due to the activation policy (MSDN).
You can only use one .NET version, and that's the version that your primary application requires (or is configured) to use. However, .NET is largely backwards compatible with older versions and can often run older assemblies unchanged in newer versions of the framework.
So even though your app may be running .NET 4 or 4.5, you can use assemblies (or libraries) that were written for .NET 2 (although some restrictions may apply, as others have mentioned). Those assemblies just run under the 4.5 CLR, assuming there are no backwards-compatibility issues with them, which are rare but do happen.
But the key thing to remember is that your running application must be at the highest version of any contained assemblies. Meaning, if you have 4.5 assemblies, you can't run your app as a 2.0 app; it must be the other way around.
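To see that single-runtime behaviour for yourself, here is a tiny sketch (the legacy file name is made up): the process runs on one CLR, and the older assembly's header merely records what it was built against:

```csharp
using System;
using System.Reflection;

static class RuntimeCheck
{
    public static void Main()
    {
        // The CLR the whole process runs on (e.g. 4.0.30319.x for a .NET 4.x app):
        Console.WriteLine("Process CLR: " + Environment.Version);

        // A library originally compiled for .NET 2.0 still loads here; its header
        // reports the runtime it was built against (v2.0.50727), but it executes
        // on the process's CLR like everything else.
        var oldAsm = Assembly.LoadFrom("OldNet2Library.dll");   // hypothetical file name
        Console.WriteLine("Assembly built for CLR: " + oldAsm.ImageRuntimeVersion);
    }
}
```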
It sounds like you are looking for Side-by-Side execution, especially In-Process Side-by-Side execution.
Yes, you can include it; that's what the CLS does.
If you want to use new features in a base project, you need to upgrade all projects that are based on it.
I've been reading and trying to use NUnit, and so far the books/articles I am reading say that in the bin folder of NUnit, there should be nunit.framework.dll, which I need to reference in my project.
But the strange thing is that there is no nunit.framework.dll, but there are two folders: net-1.1 and net-2.0.
I use neither; I am working in VS 2010 with .NET 4.0.
Is NUnit deprecated?
Why would they only have two folders for old versions of .NET?
To start off, NUnit is in no way deprecated and has become the de facto primary unit testing framework for .NET.
However, as a testing framework, it doesn't involve compiling expressions, doesn't use LINQ or dynamic language constructs, and in its current implementation relies on features found even in .NET 1.0, such as attributes.
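To illustrate that point, the core of NUnit's surface is just attributes and assertions, none of which needs anything newer than the earliest runtimes; a minimal classic-style example:

```csharp
using NUnit.Framework;

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Adding_two_and_two_gives_four()
    {
        // Plain attributes plus an assertion: no LINQ, expression trees,
        // or dynamic features required from the runtime.
        Assert.AreEqual(4, 2 + 2);
    }
}
```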
However, with the release of .NET 2.0, a new CLR was shipped. Old .NET 1.1 assemblies had to be “remapped” in the application configuration file in order to be picked up by the new CLR, and people would get issues with running their .NET 2.0 applications with a library compiled for .NET 1.1.
When .NET 3.0 and .NET 3.5 came out, they didn't carry a new CLR along. Both of them still use the .NET 2.0 CLR and only add some libraries (WPF, WCF) and language features (such as LINQ). Because these releases don't bring changes to the CLR, there is absolutely no sense in distributing separate library versions for them, as the binaries would stay exactly the same, referencing the version of mscorlib found in .NET 2.0.
I'm not sure about .NET 4.0 and whether it requires an entry in App.config to properly load a .NET 2.0-compiled NUnit library, so I am open to comments on this.
There's a quote from the NUnit blog explaining the separate 1.1 and 2.0 packaging:
So, if it’s possible to run the original, built-with-net-1.1 version of NUnit under any version of the CLR, why do we need a separate .Net 2.0 version? Strictly speaking, we don’t. But it’s convenient for a few reasons:

Folks seem to have an inordinate amount of trouble getting NUnit to run under the proper framework version. Having a special version will make their lives a bit easier, not to mention mine, since I have to answer all the questions.

Eventually, we will need a separate version. It’s inevitable that people will begin to use 2.0-only features in their tests. Imagine an Assert on a nullable value. Or a generic TestFixture class. We’ll be able to deal with some of those things from an NUnit built with .Net 1.1, but many of them will either require .Net 2.0 or be much simpler to implement with .Net 2.0.

For now, the .Net 2.0 builds are identical in features to the .Net 1.1 builds. We have reflected this in not changing the version numbering for now. Those using one or the other framework version exclusively can download a copy of NUnit built to use that version without missing out on any features. Those using both versions have the choice of installing both versions side by side – just be careful which one you reference – or using the command-line /framework option to select the correct version on the fly.
However, it was posted a long time ago (November 2006) so probably by now the versions do differ.
The net-2.0 version was built using .NET 2.0, but it will run successfully under .NET 4.0 (it's got entries in its .exe.config file to enable this). And it will successfully run tests that were built with .NET 4.0.
I agree that it would be less confusing if they provided binaries that were more obviously current, but there's no technical need for them. Use the net-2.0 folder; it should work for you.
Go to http://nuget.org/ and install NuGet. Then follow the instructions on how to add a NuGet package to your project. Then search for NUnit and it will automatically import the correct DLL for your project.
By the way, with .NET 4.0 you can use any DLL built for 2.0 or later, so you can use the NUnit .NET 2.0 assembly with your project. It will automatically be loaded and run with the CLR v4 along with your own assemblies, so no performance penalty there.