I'm more or less new to .NET development as well as to technologies like WCF and COM(+). I have some basic knowledge of these topics, although I've only skimmed over parts of them.
Problem:
I want to develop a multi-client-capable web service that makes "session"-based use of an existing COM object. (The COM object already exists and cannot be changed.)
The COM object (a DLL) itself loads two unmanaged DLLs. And now here comes the tricky part:
If I create ONE instance of the COM object in a sample C# client (a console app), everything works fine, because the instance runs in process (the unmanaged DLLs are loaded only once per process). But if I create another instance of the COM object in the same process, the app crashes. This is expected behavior, because those DLLs are not thread-safe.
My first conclusion was that I have to create just ONE COM instance per (isolated) process!
But how do I create multiple instances of the COM object in a web app that is based on user sessions?
What I have tried already:
1. I created a new AppDomain for each COM instance.
-> Doesn't work, because unmanaged code knows nothing about AppDomains.
2. I turned the COM object into a COM+ library application, created a WCF service from it with the ComSvcConfig utility, and finally hosted it in IIS (WAS).
-> Doesn't work, because everything still runs in the same worker process.
3. I created a first WCF service that wraps the COM object and exposes a service operation for each COM function. Then I created a second WCF service that just spins up a new instance of the first, self-hosted WCF service: it starts the host and returns a unique URL for it, so that each client gets its own self-hosted service in a separate process (a rough sketch of this launcher pattern follows below).
-> Works! But it seems very complicated, and it probably isn't good programming style IMO.
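For illustration, a rough sketch of that launcher pattern in C# might look like the following; the names (ISessionBroker, ComHost.exe, the net.pipe address scheme) are invented for the example and are not part of the actual solution:

    using System;
    using System.Diagnostics;
    using System.ServiceModel;

    [ServiceContract]
    public interface ISessionBroker
    {
        [OperationContract]
        string StartSession();   // returns the URL of a freshly started per-client host
    }

    public class SessionBroker : ISessionBroker
    {
        public string StartSession()
        {
            // One isolated host process per client, so the non-thread-safe
            // unmanaged DLLs behind the COM object are loaded once per session.
            string url = string.Format("net.pipe://localhost/ComSession/{0:N}", Guid.NewGuid());
            Process.Start("ComHost.exe", url);   // ComHost.exe self-hosts the COM-wrapping service at this URL
            return url;
        }
    }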
Conclusion:
Maybe I haven't considered some aspects that would lead to a more suitable solution (due to my lack of knowledge). Are there other (better) ways to solve this problem?
Do you have any advice or tips for solving it more conveniently?
Thanks in advance!
Related
I essentially want to make an API for an application, but I only want one instance of that DLL to be running at any one time.
Multiple applications also need to be able to use the DLL at the same time, as you would expect from a normal API.
However, I want the different applications to use the same instance of the DLL. This is because it communicates with hardware, and I don't want those interactions to overlap.
DLLs are usually loaded once per process, so if your application is guaranteed to only be running in single-instance mode, there's nothing else you have to do. Your single application instance will have only one loaded DLL.
Now, if you want to "share" a "single instance" of a DLL across applications, you will inevitably have to resort to a client-server architecture. Your DLL will have to be wrapped in a Windows Service, which would expose an HTTP (or WCF) API.
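A minimal sketch of that shape might look like the following; the contract, the net.pipe binding and the LegacyDevice.dll P/Invoke are assumptions made up for the example, not an existing API:

    using System;
    using System.Runtime.InteropServices;
    using System.ServiceModel;
    using System.ServiceProcess;

    [ServiceContract]
    public interface IDeviceService
    {
        [OperationContract]
        int DoWork(string command);
    }

    // A single service instance with serialized calls, so only one caller touches the DLL at a time.
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Single)]
    public class DeviceService : IDeviceService
    {
        [DllImport("LegacyDevice.dll")]              // hypothetical native DLL
        private static extern int DoWorkNative(string command);

        public int DoWork(string command) { return DoWorkNative(command); }
    }

    public class DeviceWindowsService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            _host = new ServiceHost(typeof(DeviceService),
                                    new Uri("net.pipe://localhost/DeviceService"));
            _host.AddServiceEndpoint(typeof(IDeviceService), new NetNamedPipeBinding(), string.Empty);
            _host.Open();
        }

        protected override void OnStop()
        {
            if (_host != null) _host.Close();
        }
    }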
You can't do that as you intend to do. The best way to do this would be having a single process (a DLL is not a process) which receives and processes messages, and have your multiple clients use an API (this would be your DLL) that just sends messages to this process.
The communication between those two processes (your single server process and the clients sending or receiving messages via your API) could be done in many ways; choose the one that suits you best (basically, any kind of client/server architecture works, even if the clients and the server run on the same machine).
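As a rough illustration of that message-passing idea, here is a hedged sketch using named pipes; the pipe name "hw-broker" and the one-line text protocol are invented for the example:

    using System;
    using System.IO;
    using System.IO.Pipes;

    class HardwareBroker
    {
        static void Main()
        {
            while (true)
            {
                // The single server process owns the hardware; clients connect one at a time.
                using (var pipe = new NamedPipeServerStream("hw-broker", PipeDirection.InOut))
                {
                    pipe.WaitForConnection();
                    using (var reader = new StreamReader(pipe))
                    using (var writer = new StreamWriter(pipe) { AutoFlush = true })
                    {
                        string request = reader.ReadLine();   // e.g. "READ sensor1"
                        writer.WriteLine("OK: " + request);   // talk to the hardware here, never concurrently
                    }
                }
            }
        }
    }

    // Client side (this would live in the DLL handed out to the applications):
    //   using (var pipe = new NamedPipeClientStream(".", "hw-broker", PipeDirection.InOut))
    //   {
    //       pipe.Connect();
    //       var writer = new StreamWriter(pipe) { AutoFlush = true };
    //       var reader = new StreamReader(pipe);
    //       writer.WriteLine("READ sensor1");
    //       string reply = reader.ReadLine();
    //   }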
This is an XY-Problem type of question. Your actual requirement is serializing interactions with the underlying hardware, so they do not overlap. Perhaps this is what you should explicitly and specifically be asking about.
Your proposed solution is to have a DLL that is kind of an OS-wide singleton or something like that. This is actually what you are asking about; although it is still not the right approach, in my opinion. The OS is in charge of managing the lifetime of the DLL modules in each process. There are many aspects to this, but for one: most DLL instances are already being shared between every process (mostly code sections, resources and such - data, of course, is not shared by default).
To solve your actual problem, you would have to resort to multi-process synchronization techniques. In Windows, this works mostly through named kernel objects like mutexes, semaphores, events and such. Another approach would be to use IPC, as other folks have already mentioned in their respective answers, which then again would require in itself some kind of synchronization.
Maybe all this is already handled by that hardware's device driver. What would be the real scenarios in which overlapped interactions with the underlying hardware would have a negative impact on the applications that use your DLL?
To ensure you have only one "active" DLL per machine, you could run a controlling assembly in a separate AppDomain, then try creating a named pipe for remoting (with IpcChannel) and claim the hardware resources there. Creating an IpcChannel will fail the second time in the same environment. If you need high-performance communication with your hardware, use remoting only for claiming and releasing the resource from another assembly used by the applications.
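A short sketch of that claiming step, relying on the behaviour described above (that a second IpcChannel with the same port name cannot be created); the port name HardwareClaim is illustrative:

    using System;
    using System.Runtime.Remoting;
    using System.Runtime.Remoting.Channels;
    using System.Runtime.Remoting.Channels.Ipc;

    class HardwareClaim
    {
        static void Main()
        {
            try
            {
                // Named-pipe based server channel; per the answer above, only the
                // first process on the machine can create it with this port name.
                var channel = new IpcChannel("HardwareClaim");
                ChannelServices.RegisterChannel(channel, false);
                // ...expose a MarshalByRefObject here for claim/release calls...
                Console.WriteLine("Hardware claimed.");
                Console.ReadLine();
            }
            catch (RemotingException)
            {
                Console.WriteLine("Another process already holds the claim.");
            }
        }
    }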
A Mutex is one solution for exclusive control across multiple processes.
But be careful: a Mutex can sometimes lead to deadlock if it is used incorrectly.
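For illustration, here is a minimal sketch of cross-process exclusion with a named mutex; the mutex name is invented, and the timeout plus AbandonedMutexException handling are one way to reduce the hang/deadlock risk mentioned above:

    using System;
    using System.Threading;

    class HardwareLock
    {
        static void Main()
        {
            // Any process opening a mutex with the same name gets the same kernel object.
            using (var mutex = new Mutex(false, @"Global\MyHardwareMutex"))
            {
                bool owned = false;
                try
                {
                    try
                    {
                        owned = mutex.WaitOne(TimeSpan.FromSeconds(5));   // never wait forever
                    }
                    catch (AbandonedMutexException)
                    {
                        owned = true;   // the previous owner died while holding it; we own it now
                    }

                    if (!owned)
                        throw new TimeoutException("Hardware is busy.");

                    // ...exclusive access to the hardware goes here...
                }
                finally
                {
                    if (owned) mutex.ReleaseMutex();
                }
            }
        }
    }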
We have a service running that connects to hundreds of devices over TCP. Every time we want to update this service, we need to restart it, and this causes a connection loss for all devices.
To prevent this, we want to divide our application into a connection part and a business logic/data layer part. This will give us the option to update the business logic/data layer without restarting the connection part. This could be done with WCF services, but the system should respond as fast as possible, and introducing another connection would add extra delay.
Would it be possible to update a DLL file without restarting the application, and give the application an instruction so that it loads the new DLL and discards the old one? Of course, only as long as the interface between the layers doesn't break.
According to MSDN:
"There is no way to unload an individual assembly without unloading all of the application domains that contain it. Even if the assembly goes out of scope, the actual assembly file will remain loaded until all application domains that contain it are unloaded."
Reference: http://msdn.microsoft.com/en-us/library/ms173101(v=vs.90).aspx
My approach would probably involve some sort of local communication between the communication layer and the business logic, each in a different context (AppDomain) - via named pipes or memory-mapped files, for example.
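A minimal sketch of the memory-mapped-file option might look like this; the map name and the single-integer "slot" layout are invented, and real code would add synchronization (for example a named EventWaitHandle) around reads and writes:

    using System.IO.MemoryMappedFiles;

    class SharedSlot
    {
        // Keep the map open for the lifetime of the layer that owns it:
        // a named map disappears once every handle to it has been closed.
        private readonly MemoryMappedFile _map;
        private readonly MemoryMappedViewAccessor _view;

        public SharedSlot()
        {
            _map = MemoryMappedFile.CreateOrOpen(@"Local\BizToComm", 4096);
            _view = _map.CreateViewAccessor();
        }

        public void Write(int message) { _view.Write(0, message); }

        public int Read() { return _view.ReadInt32(0); }
    }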
Here is a good example of loading/unloading an assembly dynamically:
http://www.c-sharpcorner.com/uploadfile/girish.nehte/how-to-unload-an-assembly-loaded-dynamically-using-reflection/
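In outline, the AppDomain-based load/unload approach from that article looks roughly like this; PluginAssembly and IBusinessLogic are invented names, and the loaded type has to derive from MarshalByRefObject so that only a proxy crosses the domain boundary:

    using System;

    // Shared contract, defined in an assembly referenced by both domains.
    public interface IBusinessLogic
    {
        string Process(string input);
    }

    class Host
    {
        static void Main()
        {
            AppDomain domain = AppDomain.CreateDomain("BusinessLogicDomain");
            try
            {
                // Loads PluginAssembly.dll into the new domain and hands back a proxy.
                var logic = (IBusinessLogic)domain.CreateInstanceAndUnwrap(
                    "PluginAssembly",                  // assembly name, without .dll
                    "PluginAssembly.BusinessLogic");   // type implementing IBusinessLogic
                Console.WriteLine(logic.Process("hello"));
            }
            finally
            {
                // Unloading the domain is the only way to unload PluginAssembly.dll.
                AppDomain.Unload(domain);
            }
        }
    }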
Be careful about speed: since MethodInfo.Invoke is slow, you might want to look into using DynamicMethod. Also, creating/destroying app domains is slow.
http://www.wintellect.com/blogs/krome/getting-to-know-dynamicmethod
You can also use what is called a "plugin" framework. CodePlex has one called MEF, the "Managed Extensibility Framework":
http://mef.codeplex.com/
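As a rough sketch, classic MEF composition looks something like this; the IPlugin contract and the "Plugins" folder are invented for the example:

    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;

    public interface IPlugin
    {
        string Name { get; }
    }

    [Export(typeof(IPlugin))]
    public class SamplePlugin : IPlugin
    {
        public string Name { get { return "Sample"; } }
    }

    class PluginHost
    {
        [ImportMany]
        public IPlugin[] Plugins { get; set; }

        public void Load()
        {
            // Picks up every [Export(typeof(IPlugin))] found in the Plugins folder.
            var catalog = new DirectoryCatalog("Plugins");
            var container = new CompositionContainer(catalog);
            container.ComposeParts(this);
        }
    }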
I wish to understand some unexpected behaviour in the running of COM DLLs, where it appears static C++ data is being shared across multiple processes. The environment is a little complex, and my understanding of the various COM threading models is rather weak, which I am hoping someone can help with.
The environment
An IIS Server on a 64 bit OS running multiple C# web services, with each service in its own 32-bit application pool, and therefore process
The pools have the "Enable 32-Bit Applications"=True setting
Each 32-bit C# service calls a different in-process 32-bit COM DLL (so service A calls COM DLL 1, service C calls COM DLL 2). The COM DLLs are written in C++ using Qt 4.8 ActiveQt
The COM DLLs depend on a number of 32 bit C++ DLLs, which are shared i.e. both COM DLLs 1 and 2 depend on Utilities.dll
As far as I can tell, there is no ThreadingModel set for the COM DLLs, so I am expecting the system will fall back on the main STA.
I am aware this is frowned upon, but I do not have enough knowledge to change it currently.
Utilities.dll contains some static C++ data
The COM DLLs were registered using "regsvr32" and do not appear to be listed in "Component Services", though my knowledge of the latter is minimal.
The observed issue is that the static data in Utilities.dll appears to end up being shared between the different IIS processes, with undesirable consequences. I had expected that, since the COM objects live in the main STA, they would be treated as not thread-safe and that each process would get its own copy of the DLL's static data, but this appears not to be the case.
Can someone explain how the static data ends up shared between processes?
How can I avoid this? (apart from refactoring the code to remove all static data, which is not really viable currently)
If you are seeing data shared between COM objects, it means they are hosted in the same process. Yes, it's possible to share data between processes, but not accidentally. Since your app pools are different processes, it must be that these COM objects are hosted out of process, and it's just stubs that are loaded into the app pool.
If you have control of the Utilities.dll (and it sounds like you do) I would try adding some debugging information to find out what process id is hosting the COM objects. I expect you'll find that it doesn't match the app pool id, and you'll be able to use that id to find out what's going on.
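Something along these lines, from the C# service side, could provide that debugging information; note that GetHostProcessId() is a hypothetical method you would have to add to the COM object (or to Utilities.dll) so it can report the process it actually runs in:

    using System.Diagnostics;

    static class ComDiagnostics
    {
        public static void LogHostingInfo(dynamic comObject)
        {
            int servicePid = Process.GetCurrentProcess().Id;    // the app pool's w3wp.exe
            int comPid = comObject.GetHostProcessId();           // hypothetical diagnostic call
            Trace.WriteLine(string.Format("app pool PID = {0}, COM host PID = {1}", servicePid, comPid));
            // If the two differ, the COM object is hosted out of process, which would
            // explain why Utilities.dll's static data looks shared between app pools.
        }
    }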
Ideally it shouldn't matter where well-designed COM objects live, that's supposed to be something of an implementation detail. Is it possible to do away with the shared data structures?
I'm not really sure how to ask this question because I really don't know what I'm talking about. I have two DLLs (.NET); each is an add-in that runs in a different application process, i.e. application one loads DLL one and application two loads DLL two. I want these DLLs to be able to communicate while loaded. In each DLL, I know the exact class that will be instantiated by the host process, and I want these two living objects to be able to communicate (call methods on each other). This seems like it should be possible. Has anyone done something like this before?
Although some might call it a deprecated technology, .NET Remoting is well suited to this kind of inter-process object-instance communication on the same host.
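For illustration, a .NET Remoting setup between two add-ins might be sketched like this; the port name, URI and ChatBridge type are invented, and the shared type has to derive from MarshalByRefObject and be referenced by both add-ins:

    using System;
    using System.Runtime.Remoting;
    using System.Runtime.Remoting.Channels;
    using System.Runtime.Remoting.Channels.Ipc;

    // Shared type, referenced by both add-ins.
    public class ChatBridge : MarshalByRefObject
    {
        public string Ping(string from) { return "pong to " + from; }
    }

    class AddInOne          // runs inside host application #1
    {
        public static void Expose()
        {
            ChannelServices.RegisterChannel(new IpcChannel("AddInBridge"), false);
            RemotingConfiguration.RegisterWellKnownServiceType(
                typeof(ChatBridge), "bridge", WellKnownObjectMode.Singleton);
        }
    }

    class AddInTwo          // runs inside host application #2
    {
        public static void Call()
        {
            ChannelServices.RegisterChannel(new IpcChannel(), false);   // client-side channel
            var bridge = (ChatBridge)Activator.GetObject(
                typeof(ChatBridge), "ipc://AddInBridge/bridge");
            Console.WriteLine(bridge.Ping("add-in two"));
        }
    }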
Please try to specify your requirements more precisely... There is .NET Remoting to access and consume instances of objects running in another process or on another machine, but it should be used only when required.
In general, WCF can be used to communicate between applications and processes, but again it depends on whether you only want to call methods or absolutely need object-level IPC.
The system I'm working with consists of:
A front-end application, most likely written in VB or else VC++ (I don't know; I don't have and can't get the sources for it)
An unmanaged VC++ .dll
A C# .dll
The application calls the first DLL, and the first DLL calls various methods from the second one.
In order to make the first DLL able to see and call the C# code, I followed this guide:
http://support.microsoft.com/kb/828736
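For context, the managed side of that guide boils down to something like the following sketch (the GUIDs are placeholders to replace with your own, and Compute is just an illustrative method); the assembly is then registered with regasm /tlb so the unmanaged DLL can #import the generated type library:

    using System.Runtime.InteropServices;

    [ComVisible(true)]
    [Guid("8D2A5F63-0B1C-4E7A-9C55-3F1D2B4A6E01")]   // placeholder GUID
    public interface MyManagedInterface
    {
        int Compute(int value);   // illustrative method
    }

    [ComVisible(true)]
    [Guid("4C9E2B17-6A3D-4F80-B2E4-7D5C1A9F3B02")]   // placeholder GUID
    [ClassInterface(ClassInterfaceType.None)]
    public class MyManagedClass : MyManagedInterface
    {
        public int Compute(int value) { return value * 2; }
    }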
The only difference is that I am not compiling with /clr:OldSyntax; if I do, then changing the other dependent compile options makes the first DLL load incorrectly from the application.
Everything compiles smoothly, and the whole setup even worked fine initially. However, after fully developing my code across the two DLLs, I now get an error in the application. The error is:
Run-time error '-2147417848 (80010108)':
Automation Error
The object invoked has disconnected from its clients.
It occurs when the following line is executed in the first DLL:
MyManagedInterfacePtr ptrName(__uuidof(MyManagedClass));
I tried reproducing a fully working setup but without success.
Any ideas on how the heck I managed to do it in the first place?
Or alternatively on other approaches for making the two dlls work together?
Thanks in advance!
It is a low-level COM error associated with RPC. RPC is normally used by out-of-process servers, which doesn't sound like your setup, but it is also used when you make calls on a COM interface from another thread. One possible cause is that the thread that created the COM object was allowed to exit, calling CoUninitialize and tearing down the COM object; a subsequent call made from another thread would then generate this error. Getting reference counting wrong (calling Release too often) could cause this too.
Tackle this by carefully tracing which threads create a COM object and how long they survive.