Where can i view the source code in System.IdentityModel.Services namespace? - c#

This is my first foray into the .NET source code. I'm trying to understand the event handlers in WSFederationAuthenticationModule so I can override them successfully.
I downloaded the .NET source (4.5.1) from referencesource.microsoft.com. When I opened the solution I found the System.IdentityModel project, but there is no project for System.IdentityModel.Services, nor does it appear as a child namespace under System.IdentityModel.
I've searched the entire solution for WSFederationAuthenticationModule, but it doesn't exist. My expectation was to see a one-for-one mapping of what I see in Intellisense with the projects and classes in the ndp.sln.
Can someone shed some light on why the source doesn't map to the namespaces and classes in the compiled framework and/or tell me where I can get the source code for System.IdentityModel.Services?

The sources available at http://referencesource.microsoft.com/ do not represent the complete set of assemblies available in .NET.
As you can probably notice, the set of assemblies that we have is not
complete. We don't intend to keep it that way, so we plan to expand
the set of assemblies over time.
A new look for .NET Reference Source, Immo Landwerth
The same post explains how to contact Microsoft to request that additional assemblies be made available; they are presumably added in order of priority and of how many people actually want to browse them.
Having said that, I already went through the process you're about to embark on, and the dotPeek-decompiled source was enough for my needs, so you may want to try that while the actual source code is not available.

The WSFederationAuthenticationModule class lives in the System.IdentityModel.Services assembly (System.IdentityModel.Services.dll), which is located in your GAC. You can view its source using a decompiler such as dotPeek or ILSpy.
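If you just need a starting point for overriding its behaviour, a minimal sketch of a subclass looks something like the following (the OnRedirectingToIdentityProvider override and the home-realm value are only examples of one hook point; check the decompiled source for the members you actually care about):

using System.IdentityModel.Services;

// Sketch: hook one of the module's protected virtual On* methods by subclassing it,
// then register this module in web.config in place of WSFederationAuthenticationModule.
public class CustomFederationModule : WSFederationAuthenticationModule
{
    protected override void OnRedirectingToIdentityProvider(RedirectingToIdentityProviderEventArgs e)
    {
        // Example tweak: set the home realm before the redirect is issued
        // ("urn:example:realm" is a placeholder value).
        e.SignInRequestMessage.HomeRealm = "urn:example:realm";
        base.OnRedirectingToIdentityProvider(e);
    }
}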

Related

Dynamically change namespace and assembly name

Basically, I developed a small library with some common functionality that I use in all my projects. For political reasons, I cannot choose a generic name for that library (including namespace and assembly name); it must include the name of the enterprise, something like this for the namespace: Enterprise.ProjectName.XXX.YYY.
For the moment, I'm doing a copy of my library, then I'm renaming the namespaces manually with Visual Studio, and finally I'm recompiling the whole thing.
So my question is the following: Is it possible to create a small program that takes an assembly as input, rename all namespaces from MyLibrary.XXX.YYY to Enterprise.ProjectName.XXX.YYY as well as the assembly name?
What are the steps to follow?
[Edit]
Generating the assembly automatically seems like too much work. I will use ReSharper and/or CTRL+ALT+F as I have so far. Thanks for the answers...
You could use Mono's Cecil project to disassemble the assembly, inspect each type, rename or recreate the type with a new namespace, and generate the resulting assembly.
That being said, it might be simpler to use a tool like Resharper which allows you to rename namespaces correctly within the code base.
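For illustration, a Cecil-based rename could look roughly like this (a sketch only, using the names from the question; nested types, resources and string references are not handled):

using Mono.Cecil;

class RenameNamespaces
{
    static void Main()
    {
        // Read the original assembly, rename the assembly and its namespaces,
        // then write the result out under the enterprise-prefixed name.
        var assembly = AssemblyDefinition.ReadAssembly("MyLibrary.dll");
        assembly.Name.Name = "Enterprise.ProjectName";

        foreach (var type in assembly.MainModule.Types)
        {
            if (type.Namespace.StartsWith("MyLibrary"))
                type.Namespace = "Enterprise.ProjectName" +
                                 type.Namespace.Substring("MyLibrary".Length);
        }

        assembly.Write("Enterprise.ProjectName.dll");
    }
}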
Some options:
If you are copying the entire source code for your library into your new project, you can use a refactoring tool like Resharper to "Adjust Namespaces". This is a pretty quick and safe refactoring.
If you just need to avoid shipping the internally named assembly, you may be able to use ILMerge to 'hide' the internal assembly during a post-build step. This is viable if it's just a perception issue for the final assembly names in the binary output directory.
Deal with the issue at the political level by describing your internal library as being no different from any other third-party dependency. Then the naming is no longer a problem. This may solve other problems if you're shipping the source code of this library to multiple clients, as it clarifies that you are not giving full ownership of your 'shared' code to each client. Otherwise they could potentially argue that you are not allowed to use that 'shared' code in projects for other clients, since it is clearly owned by them, having their enterprise name in the namespace.

How to pack referenced libraries into a new library

When creating a new library MyAPI.dll, I am referencing many other (non-standard) libraries such as RestSharp.dll, Newtonsoft.dll, MyUtilities.dll, etc. My library works fine in my development environment because I've downloaded all of those other libraries and they're sitting in my project's bin folder, but as soon as I try to publish that library and use it in a new location, it fails because the referenced libraries cannot be found.
How do I set up my MyAPI.csproj project so that these DLLs get packaged into my published .dll file, and future users of MyAPI.dll don't have to worry about downloading and referencing those dependencies?
Thought this would be simple, but my google-fu is weak today. Setting those external references to CopyLocal = False removes them from the /bin/ directory, giving the illusion that they are getting packaged into MyAPI.dll, but really Visual Studio is just adding them to my Global Assembly Cache (GAC), which doesn't help future users of the API.
There are two options (as far as I know):
ILMerge
Embedded resource and the AppDomain.AssemblyResolve event (see Jeffrey Richter)
First, you can use ILMerge, which is a command-line program that can merge multiple .NET assemblies together, creating one output file. It can't merge WPF projects. It can be added to post-build events to make the merge automatic.
Second is adding the library as an embedded resource in your project, then handling the AppDomain.AssemblyResolve event and loading the assembly from resources when it's needed. Article from Jeffrey Richter about this method: Jeffrey Richter.
The second method has a major drawback: it doesn't work for merging multiple libraries into one (it can only be used to embed libraries in an executable), at least in C# without another tool. To embed a library in another library you have to use a further tool, mentioned in the comments on the second page of Jeffrey's article (Module initializer injection). The problem with embedding a library inside another library is that you can't (at least in C#) subscribe to the AssemblyResolve event before the embedded library is needed, so you need to inject the subscription into a module initializer using that tool. It can also be set up as a build event, as described on the tool's page. It may sound complicated, but once you set it up it's easy.
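The core of Richter's technique is only a few lines. A minimal sketch, assuming the dependency DLLs have been added to the project as embedded resources:

using System;
using System.Linq;
using System.Reflection;

static class EmbeddedAssemblyLoader
{
    // Call once, before any type from the embedded assemblies is touched
    // (this is the registration that module initializer injection automates).
    public static void Register()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            var shortName = new AssemblyName(args.Name).Name + ".dll";
            var executing = Assembly.GetExecutingAssembly();

            // Find the embedded resource whose name ends with the requested DLL name.
            var resourceName = executing.GetManifestResourceNames()
                                        .FirstOrDefault(n => n.EndsWith(shortName));
            if (resourceName == null)
                return null; // not one of ours; let normal probing continue

            using (var stream = executing.GetManifestResourceStream(resourceName))
            {
                var raw = new byte[stream.Length];
                stream.Read(raw, 0, raw.Length);
                return Assembly.Load(raw);
            }
        };
    }
}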
There is a free NuGet package, "Costura.Fody", that packs dependency assemblies as resources into your assembly. The solution works with WPF and other managed assemblies.
If the dependency assemblies are not in the executing folder, the packed copies are used automatically. It also configures your MSBuild targets automatically to pack the dependencies during the build. You do not have to add any code to your assemblies.
It also lets you configure which assemblies to pack (or not) in an XML file.
It uses a combination of two methods:
Jeffrey Richter's suggestion of using embedded resources as a method of merging assemblies.
Einar Egilsson's suggestion of using Cecil to create module initializers.
You can find documentation here: https://github.com/Fody/Costura/blob/master/README.md
It's not free (well, there's a trial), but a friend of mine told me about a program called .NET Reactor, which has the ability to package an exe with its dependent DLLs into a single executable. Well worth a look.
I would say the next most straightforward alternative would be ClickOnce; a good tutorial is here.

Plugin Situation: What to do with dependent libraries?

I have a MEF-based application which uses adapters to process files. It uses configuration files to determine which directories to watch and which adapter to use to process each type of file. Plugins take the form of a .dll that implements a common interface.
Each .dll requires its own set of dependent libraries. For instance, plugin1.dll might need to use apilibrary.dll and xmllibrary.dll. It is also possible that at a later date I might want to add plugin2.dll, and plugin2.dll might use xmllibrary.dll as well. These dependent libraries are updated regularly, so I can't count on plugin2.dll using the exact same version of xmllibrary.dll used in plugin1.dll.
I'd like to compile each plugin to one .dll file that invisibly includes within itself all of its dependent libraries, which seems like one way to solve this problem. Alternatively, I'd like to figure out how each .dll file can look for its dependent libraries in a subfolder, which I believe would also reduce the possibility of versioning conflicts. Or maybe there's a dead simple solution to this problem that I haven't even considered (which is always very, very likely).
Any thoughts?
You should probably try to get this to work with standard .NET loading rules. However, if you do need to control exactly how assemblies are loaded and which versions are loaded, this blog post shows how: Using Loading contexts effectively
I guess you need to weigh up deployability vs. maintenance. The simple solution is to use a tool called ILMerge. ILMerge takes your project output and can take other assemblies and merge them together. This enables you to wrap up all of the assemblies that your plugin is dependent on, and merge them into a single assembly. Optionally you can do things like re-signing with your public key, etc. Here is a good read: Leveraging ILMerge to simplify deployment and your users experience by Daniel Cazzulino.
But while that is good, what happens if a new version of the referenced assembly is distributed that corrects bugs in the one you have embedded? By the rules of Fusion's assembly loader, when it loads the types from your referenced assembly, it will see that they have already been loaded, so there is no reason for it to load the updated version. This would then mean you need to recompile your plugin and merge the newer referenced assembly again.
My question would be, is it really that important to ensure a specific version is used? If a newer version provides an updated implementation (that doesn't break backwards compatibility) then surely this should benefit all plugins that need to reference it?
As far as how assemblies are loaded in relation to each other, have a read of Understanding .NET Assemblies and References, which is an invaluable piece of information.
MEF uses standard .NET assembly loading, and everything's loaded in a single AppDomain. You have very little control over how dependencies are loaded - as they just get loaded automatically by the CLR when the assembly is injected via MEF. Normal CLR assembly loading rules apply when using MEF, so dependencies will be loaded as if they were a dependency of your application - no matter where they're located or referenced.
For the most part, if the plugins and their dependencies are properly written, you most likely will not need to worry about this. As long as the versioning in the dependencies is correct, it will likely just work.
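To make the loading behaviour concrete, here is a stripped-down sketch of a typical MEF host (the IFileProcessor contract and the plugin folder are invented for the example); each plugin's dependencies are resolved by the normal CLR rules the first time they are used:

using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Hypothetical contract shared between the host and the plugin assemblies.
public interface IFileProcessor
{
    void Process(string path);
}

public class PluginHost
{
    [ImportMany]
    public IEnumerable<IFileProcessor> Processors { get; set; }

    public void LoadPlugins(string pluginDirectory)
    {
        // DirectoryCatalog loads every plugin assembly in the folder into the
        // single AppDomain; dependencies are pulled in automatically when needed.
        var catalog = new DirectoryCatalog(pluginDirectory);
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }
}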

How do I create and use a .NET metadata-only 'Reference Assembly'?

Since version 3.0, .NET installs a bunch of different 'reference assemblies' under C:\Program Files\Reference Assemblies\Microsoft...., to support different profiles (say .NET 3.5 client profile, Silverlight profile). Each of these is a proper .NET assembly that contains only metadata - no IL code - and each assembly is marked with the ReferenceAssemblyAttribute. The metadata is restricted to those types and members available under the applicable profile - that's how IntelliSense shows a restricted set of types and members. The reference assemblies are not used at runtime.
I learnt a bit about it from this blog post.
I'd like to create and use such a reference assembly for my library.
How do I create a metadata-only assembly - is there some compiler flag or ildasm post-processor?
Are there attributes that control which types are exported to different 'profiles'?
How does reference assembly resolution work at runtime? If I had the reference assembly present in my application directory instead of the 'real' assembly, and not in the GAC at all, would probing continue and my AssemblyResolve event fire so that I can supply the actual assembly at runtime?
Any ideas or pointers to where I could learn more about this would be greatly appreciated.
Update: Looking around a bit, I see the .NET 3.0 'reference assemblies' do seem to have some code, and the Reference Assembly attribute was only added in .NET 4.0. So the behaviour might have changed a bit with the new runtime.
Why? For my Excel-DNA ( http://exceldna.codeplex.com ) add-in library, I create single-file .xll add-in by packing the referenced assemblies into the .xll file as resources. The packed assemblies include the user's add-in code, as well as the Excel-DNA managed library (which might be referenced by the user's assembly).
It sounds rather complicated, but works wonderfully well most of the time - the add-in is a single small file, so no installation or distribution issues. I run into (not unexpected) problems because of different versions - if there is an old version of the Excel-DNA managed library present as a file, the runtime will load that instead of the packed one (I never get a chance to interfere with the loading).
I hope to make a reference assembly for my Excel-DNA managed part that users can point to when compiling their add-ins. But if they mistakenly have a version of this assembly at runtime, the runtime should fail to load it, and give me a chance to load the real assembly from resources.
To create a reference assembly, you would add this line to your AssemblyInfo.cs file:
[assembly: ReferenceAssembly]
To load others, you can reference them as usual from your Visual Studio project references, or dynamically at runtime using:
Assembly.ReflectionOnlyLoad()
or
Assembly.ReflectionOnlyLoadFrom()
If you have added a reference to a metadata/reference assembly using Visual Studio, then IntelliSense and building your project will work just fine; however, if you try to execute your application against one, you will get an error:
System.BadImageFormatException: Cannot load a reference assembly for execution.
So the expectation is that at runtime you would substitute in a real assembly that has the same metadata signature.
If you have loaded an assembly dynamically with Assembly.ReflectionOnlyLoad(), then you can only perform reflection operations against it (read the types, methods, properties, attributes, etc.), but you cannot dynamically invoke any of them.
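For example, a reflection-only load might be used like this (a small sketch; "MyApi.ref.dll" stands in for whatever metadata-only assembly you produced):

using System;
using System.Reflection;

class InspectReferenceAssembly
{
    static void Main()
    {
        // Loads the assembly for inspection only; no code in it can be executed.
        var assembly = Assembly.ReflectionOnlyLoadFrom("MyApi.ref.dll");

        foreach (var type in assembly.GetExportedTypes())
        {
            Console.WriteLine(type.FullName);
            // Invoking members of these types (e.g. via MethodInfo.Invoke) would
            // throw, because reflection-only code cannot run.
        }
    }
}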
I am curious as to what your use case is for creating a metadata-only assembly. I've never had to do that before, and would love to know if you have found some interesting use for them...
If you are still interested in this possibility, I've made a fork of the il-repack project based on Mono.Cecil which accepts a "/meta" command-line argument to generate a metadata-only assembly for the public and protected types.
https://github.com/KarimLUCCIN/il-repack/tree/xna
(I tried it on the full XNA Framework and it's working, AFAIK...)
Yes, this is new for .NET 4.0. I'm fairly sure this was done to avoid the nasty versioning problems in the .NET 2.0 service packs. Best example is the WaitHandle.WaitOne(int) overload, added and documented in SP2. A popular overload because it avoids having to guess at the proper value for exitContext in the WaitOne(int, bool) overload. Problem is, the program bombs when it is run on a version of 2.0 that's older than SP2. Not a happy diagnostic either. Isolating the reference assemblies ensures that this can't happen again.
I think those reference assemblies were created by starting from a copy of the compiled assemblies (as was done in previous versions) and running them through a tool that strips the IL from the assembly. That tool is, however, not available to us; there is nothing in the bin/netfx 4.0 tools subdirectory of the Windows 7.1 SDK that could do this. Not exactly a tool that gets used often, so it is probably not production quality :)
You might have luck with the Cecil library (from Mono); I think it supports ILMerge-style functionality, and it might just as well be able to write metadata-only assemblies.
I have scanned the code base (documentation is sparse), but haven't found any obvious clues yet...
YMMV

The .NET equivalent of static libraries?

I'm building a tool in managed code (mostly C++/CLI) in two versions, a 'normal user' version and a 'pro' version.
The fact that the core code is identical between the two versions has caused me a little trouble, as I want to package the resulting tool as a single assembly (DLL) and I don't want to have to include the .cpp files for the common code in the projects of both versions of the tool. I'd rather have a project for the common code and a project for each version of the tool, and have each version's project depend on the common code and link it in as desired.
In unmanaged C++ I'd do this by placing the common code in a static library and linking both versions of the tool to it. I don't seem to be able to get this to work in C++/CLI. It seems that I'm forced to build the common code into a DLL assembly, and that results in more DLLs than I'd like.
So, in summary, I can't work out how to build the common code in one project and link it with each of the final product projects to produce two single DLL assemblies that both include the common code.
I'm probably doing something wrong, but I tried to work out how to do this using netmodules and whatever, and I just couldn't get it to work. In the end the only way I got it working was to tell the linker to link the intermediate build products (object files) of the common code project rather than the resulting assembly, which works but is a bit of a hack IMHO.
Anyway, does anyone have any suggestions for how I SHOULD be solving this problem?
Edited: I guess I should have mentioned the fact that the assemblies generated are not 100% managed code, they contain a mix of managed and unmanaged code as is, probably, quite common with assemblies produced with C++/CLI...
If you are annoyed at all the DLLs, download ILMerge. I use this to bundle together multiple DLLs into an easy-to-use .EXE for my clients.
If I'm understanding this correctly, you have a solution which contains two projects: one project for the "normal" user and one project for the "pro" user. Visual Studio allows you to add a "link" to a source file from another project. If your "pro" project contains the real core code file, then in your "normal" project choose Add -> Existing Item, browse to the file in the "pro" project, click the down arrow next to the Add button, and select "Add as Link". Now you have a single file that is literally shared between the two projects.
As said, ILmerge is one way. Personally, if you're bundling some exe with a lot of DLLs, I favor Netz.
You could use modules. You can link them into an assembly using the assembly linker, al.exe.
That's the downside of the .NET compilation process: you can't have things like static libraries and the header files that hold them together. Everything is held in one big DLL file, and the only way to share information is to either build a common DLL and reference it from other assemblies, or to duplicate the code in each DLL (possibly by copying/linking .cs files between projects).
Note that the 2nd way will declare different types, even though they have the same name. This will bite you on the ass with stuff like remoting (or anything that requires casting to specific shared interfaces between processes).
Remotesoft Salamander will hook you up. It's basically a native compiler and linker.
When using Mono (or if Cygwin is an option), mkbundle may also be a valid choice.
