Microsoft add-in with .NET Framework - C#

I am extending an application using MAF in .NET Framework. I have implemented a pipeline with the required folder structure, and it works fine when the add-in is implemented in a single DLL.
However, I now have a more complex add-in in which one DLL implements the contract and a supporting DLL performs the internal logic.
When I build this add-in project, both the main DLL and the supporting DLLs are copied to the add-in folder, and the framework is then unable to find the add-in token in that folder.

You need to put the supporting assemblies into the GAC, because the pipeline's application domains cannot resolve external dependencies from within the pipeline folders. Some of the interfaces in System.AddIn.Contract look like they might be targeted at a scenario like the one you describe (IServiceProvider and IProfferServiceContract), but there are no examples from Microsoft showing how to use them.
It's a real shame that Microsoft has been so completely silent on MAF for the last two years. The lack of complex real-world examples is a big hindrance given how complicated the framework is to use. The silence is deafening...
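For reference, a minimal sketch of how a host typically rebuilds the pipeline store and discovers add-in tokens; the pipeline path and the host view type are placeholders for illustration, and the warnings returned by Rebuild often explain why a token is not being discovered:

```csharp
using System;
using System.AddIn.Hosting; // requires a reference to System.AddIn.dll

// Placeholder for your actual host-side view type.
interface IMyHostView { }

class HostProgram
{
    static void Main()
    {
        // Hypothetical pipeline root; it must contain the standard MAF
        // folders (AddInSideAdapters, AddInViews, Contracts,
        // HostSideAdapters, AddIns).
        string pipelineRoot = @"C:\MyApp\Pipeline";

        // Rebuild the store caches so newly copied add-in DLLs are seen.
        string[] warnings = AddInStore.Rebuild(pipelineRoot);
        foreach (string warning in warnings)
            Console.WriteLine(warning);

        // Discover tokens for add-ins matching the host view.
        var tokens = AddInStore.FindAddIns(typeof(IMyHostView), pipelineRoot);
        Console.WriteLine("Found {0} add-in token(s).", tokens.Count);
    }
}
```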

Related

How to call a function in a C# DLL from another C# DLL in the same folder

We are 2 teams (2 different vendors) working on developing 2 different C# DLLs.
The 1st DLL, which I am working on, has to call a public function in the 2nd DLL developed by the other team. The function will NOT be COM-visible.
We have agreed upon some method signatures. The 2nd DLL will be placed in the same folder as the 1st DLL.
Whenever the other team releases a new version of the 2nd DLL, I want to just replace the file in the folder without the need to recompile my DLL, or register the other DLL using regasm, etc.
How should I write my code to invoke the function in the 2nd DLL? If there is a solution that avoids Reflection, I would prefer it.
Further Background:
Both the DLLs will be shipped as part of an application. I do not want to have a static reference to the 2nd DLL in the 1st DLL, as finally I want to generate 2 DLLs to be shipped. The objective is that if later on, the other team wants to change the DLL, they can simply replace their DLL in the user installation without needing me to recompile my DLL.
Thanks.
MEF might be what you are looking for:
The Managed Extensibility Framework or MEF is a library for creating lightweight, extensible applications. It allows application developers to discover and use extensions with no configuration required. It also lets extension developers easily encapsulate code and avoid fragile hard dependencies. MEF not only allows extensions to be reused within applications, but across applications as well.
Managed Extensibility Framework (MEF)
Eventually you will be able to load components from a folder:
// Load parts from the available DLLs in the specified path using a directory catalog
var directoryCatalog = new DirectoryCatalog(directoryPath, "*.dll");
// Compose the discovered parts into the host object
var container = new CompositionContainer(directoryCatalog);
container.ComposeParts(this);
Take a look at this Tutorial :
Its basic purpose is to plug components into an already running application
An Introduction to Managed Extensibility Framework (MEF) - Part I
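As a rough sketch of how the two DLLs could cooperate under MEF (the interface name, class names, and assembly layout below are assumptions for illustration): the teams share a small contract assembly, the 2nd DLL exports an implementation, and the 1st DLL imports it without any static reference to the implementing assembly.

```csharp
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// --- Shared contract assembly, referenced by both teams ---
public interface ICalculator
{
    int Add(int a, int b);
}

// --- 2nd DLL: the other team's implementation ---
[Export(typeof(ICalculator))]
public class Calculator : ICalculator
{
    public int Add(int a, int b) { return a + b; }
}

// --- 1st DLL: discovers and calls whatever implementation is present ---
public class Consumer
{
    [Import]
    public ICalculator Calculator { get; set; }

    public int Compute()
    {
        // Pick up any exporting DLL sitting next to this one.
        var catalog = new DirectoryCatalog(".", "*.dll");
        using (var container = new CompositionContainer(catalog))
        {
            container.ComposeParts(this);
            return Calculator.Add(2, 3);
        }
    }
}
```

As long as the shared contract assembly is unchanged, the other team can drop in a new build of their DLL without you recompiling yours.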

How do I find all assemblies containing type/member matching a pattern?

I have a folder (possibly, with nested sub-folders) containing thousands of files, some of them are DLLs, and some of those DLLs are .NET assemblies. I need to find all assemblies containing types/members matching a certain pattern (e.g. "*Collection", or "Create*"). What is the best (fastest) way to do this?
It is OK to suggest open-source libraries as long as their usage does not require to open my source code.
Maybe this API is useful to you: http://cciast.codeplex.com/
Microsoft Research Common Compiler Infrastructure (CCI) is a set of
libraries and an application programming interface (API) that supports
some of the functionality that is common to compilers and related
programming tools. CCI is used primarily by applications that create,
modify or analyze .NET portable executable (PE) and debug (PDB) files.
Or you can load each candidate with Assembly.LoadFrom(path) and call Assembly.GetExportedTypes()
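A rough sketch of that reflection approach, assuming the wildcard patterns are translated into regular expressions. Loading every DLL into the current domain executes no code but does lock the files; on .NET Framework, ReflectionOnlyLoadFrom is the lighter-weight option used here:

```csharp
using System;
using System.IO;
using System.Reflection;
using System.Text.RegularExpressions;

class AssemblyScanner
{
    static void Main(string[] args)
    {
        string root = args[0];    // folder to scan (including sub-folders)
        string pattern = args[1]; // e.g. "*Collection" or "Create*"
        var regex = new Regex(
            "^" + Regex.Escape(pattern).Replace(@"\*", ".*") + "$");

        foreach (string dll in Directory.EnumerateFiles(
                     root, "*.dll", SearchOption.AllDirectories))
        {
            try
            {
                // Metadata-only load: no code runs in the scanned assembly.
                Assembly asm = Assembly.ReflectionOnlyLoadFrom(dll);
                foreach (Type type in asm.GetExportedTypes())
                {
                    if (regex.IsMatch(type.Name))
                        Console.WriteLine("{0}: {1}", dll, type.FullName);
                }
            }
            catch (BadImageFormatException)
            {
                // Native DLL or otherwise not a .NET assembly - skip it.
            }
            catch (ReflectionTypeLoadException)
            {
                // Some dependency could not be resolved - skip or log.
            }
        }
    }
}
```

Matching member names like "Create*" would work the same way, iterating type.GetMembers() instead of just the type names.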

Managed dlls as embedded resources

Is it good practice to bundle all required managed dlls as embedded resources to a .NET library project in order to ship just one dll?
Background:
I have created an API as a .NET dll (in C#), and it's working all fine. But the library has quite some dependencies on other managed libraries (around 15 dlls) so I need to distribute those as well.
When the users of my API have created an application they again have to make sure to distribute all those dlls along with the application. To me it would seem better if they had just one dll to consider.
The main downside I can see to using embedded dlls is that they must be unpacked to a temporary folder before being loaded dynamically, which may or may not have performance and robustness issues.
There are a lot of questions around this. What happens if you need to load a platform-specific dependency (x86 vs. x64), or if the app consuming your API is platform-specific? Does that mean you need to include both x86 and x64 assemblies in your package as well? It gets hairy quickly.
You should consider using ClickOnce deployment for these types of scenarios. Then, all of the dependencies would be packaged together.
Realistically, it's a problem for the API consumer to solve, not for the API producer. Your API might be less popular if it has a lot of external dependencies, but you'll have to make decisions there about what's really crucial to your API's success.
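For what it's worth, purely managed embedded assemblies do not necessarily need to be unpacked to a temporary folder: a common sketch hooks AppDomain.AssemblyResolve and loads the dependency straight from memory. The resource name prefix below is an assumption; it follows the DefaultNamespace.Folder.FileName convention you get when a file's Build Action is set to Embedded Resource:

```csharp
using System;
using System.IO;
using System.Reflection;

static class EmbeddedAssemblyLoader
{
    // Call once at startup, before any type from the embedded
    // dependencies is touched.
    public static void Install()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, e) =>
        {
            // e.Name is a full assembly name such as
            // "MyDependency, Version=1.0.0.0, Culture=neutral, ...".
            string shortName = new AssemblyName(e.Name).Name;
            string resourceName = "MyApi.Embedded." + shortName + ".dll";

            Assembly self = Assembly.GetExecutingAssembly();
            using (Stream stream = self.GetManifestResourceStream(resourceName))
            {
                if (stream == null)
                    return null; // not one of ours; let the CLR keep probing

                using (var buffer = new MemoryStream())
                {
                    stream.CopyTo(buffer);
                    // Load from memory - no temp folder required.
                    return Assembly.Load(buffer.ToArray());
                }
            }
        };
    }
}
```

Mixed-mode and native dependencies are the exception: those do have to be written to disk before loading, which is exactly where the x86/x64 concerns bite.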

Understanding .NET assembly signing

This question is about the public key that, AFAIK, every .NET assembly has (you can see it by opening the DLL in Notepad; the public key is at the bottom).
I understand what signing is and why it exists in .NET, but I am working on a plug-and-play (plugin) project. I understand how to use System.Type and such to build a plugin system.
My question is about how the plugins will work with some "APIs" I write (notice the quotes), such as access to the app's internal routines (like an operating system).
Now for the question: if, on an app update, the updater replaces the "API" DLL with a newer version, would that break the plugins because of the signature? (If so, I am perfectly fine with writing a little code file, using System.Type, that plugin authors include to access the "API".)
If you are looking to create a plugin architecture, I suggest you take a look at MEF (Managed Extensibility Framework) from MS.
The Managed Extensibility Framework (MEF) is a composition layer for .NET that improves the flexibility, maintainability and testability of large applications. MEF can be used for third-party plugin extensibility, or it can bring the benefits of a loosely-coupled plugin-like architecture to regular applications.
MEF makes it very easy to create an application that uses plugins.
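On the signing part of the question: a plugin's reference records the API assembly's full identity, which is its simple name, version, culture, and public key token. Replacing the DLL with a build signed with the same key and carrying the same version does not break the binding; bumping the version requires a binding redirect in the host's config. A small sketch for inspecting the identity a plugin would bind against (the path is a placeholder):

```csharp
using System;
using System.Reflection;

class StrongNameInspector
{
    static void Main()
    {
        // Placeholder path to the "API" assembly shipped with the app.
        AssemblyName name = AssemblyName.GetAssemblyName(@"C:\MyApp\MyApi.dll");

        Console.WriteLine("Name:    {0}", name.Name);
        Console.WriteLine("Version: {0}", name.Version);
        Console.WriteLine("Token:   {0}",
            BitConverter.ToString(name.GetPublicKeyToken() ?? new byte[0]));
    }
}
```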

How to locate DLLs for inter-component referencing

My SAAS company has two C#.NET products, call them Application Alpha and Application Beta. Both of them reference some libraries that we can call Core.
At the moment, the codebase is monolithic, stored in a single SVN repository with a single .NET solution, and we're trying to make it more modular/componentized. We've split off into separate repositories for the two applications and the core library, but we're now running into the following problem:
Alpha and Beta must reference Core, but we're trying to avoid having a direct code reference because then we're practically back to square one: you would need to check out and co-locate all repositories. So how should we go about referencing assemblies between these components?
Each component could have a directory containing DLLs from the other components that need to be referenced, stored in SVN, but this would mean extra effort any time Core is updated to push out the new DLLs to Alpha and Beta.
Or we could store the DLLs in a central SVN'd location (and/or in the GAC), but that would mean extra effort any time Core is updated for everyone else to pull the new DLLs.
Is there a third option that we're overlooking?
I have something similar in which I have 5 applications utilizing a series of web controls I built. The controls are compiled into a series of DLLs for modularization and the applications that utilize them live on separate servers.
What I do is use VS2008's build utility to execute a batch file that copies the compiled (updated) DLLs to the production servers when a Release build executes.
You do this by going to the project that builds into the DLL (or DLLs), right-clicking it, and choosing Properties. Then go to the Build Events tab, where you will see the pre-build and post-build event command line textboxes.
Therefore your release builds can be fully automated and you never have to worry about DLL hell-like differences between versions of your production DLLs.
Hope this helps,
JP
You could have your build script for Alpha and Beta create artifacts (namely, build Core) and place the result of the Core build at a specific location, then reference that location.
You could use SVN:externals. It was designed for this type of scenario.
If you want to avoid that, these are probably your better options. I'd avoid putting the files in the GAC, though, unless your core project is very stable, and not changing very often. Having the DLLs local provides much more flexibility.
Each component could have a directory containing DLLs from the other components that need to be referenced, stored in SVN, but this would mean extra effort any time Core is updated to push out the new DLLs to Alpha and Beta.
This could be handled fairly easily with a good build system. This approach has some disadvantages (i.e., executable dependencies in the build system), but also some advantages, including allowing each dependent project to use different versions as needed, etc.

Categories

Resources