I'm unsure how I should design my program. I'll try to be clear, but it's hard to explain!
Basically, I inject a managed C#/.NET DLL into a certain process. Right now, this DLL can load any other DLL dynamically via reflection (provided the other DLL implements the IRunnable interface).
What I would like to have is the following:
A master GUI that injects the DLL into a process. From this GUI, you can load an extension DLL (via reflection) into the process (all of this remotely; the GUI MUST NOT be inside that process). I want the GUI to communicate via WCF or a named pipe. Also, once the extension DLL has been loaded via reflection, it must create a GUI in the master user interface.
I'm really confused on how to approach this problem. Any feedback would be appreciated.
Thanks!
EDIT 1:
Sorry for the late reply; I was out of town. I do know how to inject a DLL and start the CLR in the remote process, so that's not a problem. I'll try to reformulate the problem in better terms:
A : Host process (injector)
B : Target process (the one that gets injected)
C : Dll to be injected
D : Dll loaded via reflection from C. Belongs to process B.
Basically, the problem is that I would like D, once it gets loaded, to create a GUI window in A. Then I would like that GUI window to be able to communicate with D. How is that possible? I'm probably looking at the problem the wrong way, but I'm out of ideas.
Do you already know how to inject a DLL into another process? This is not completely trivial, and may involve using hooks or modifying the process' memory (of course, both require the appropriate privileges).
Assuming that you do, I would use .NET Remoting or WCF to create a two-way communication channel between the two processes. If you have any more specific questions, feel free to ask.
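To make the two-way channel concrete, here is a minimal sketch using a duplex named pipe, which works for same-machine communication between the GUI process (A) and the injected code (D). The pipe name "InjectorPipe" and the command strings are assumptions for illustration, not anything from the question.

```csharp
using System.IO;
using System.IO.Pipes;

// In process A (the GUI): create the server end and wait for D to connect.
using (var server = new NamedPipeServerStream("InjectorPipe", PipeDirection.InOut))
{
    server.WaitForConnection();
    var reader = new StreamReader(server);
    var writer = new StreamWriter(server) { AutoFlush = true };
    writer.WriteLine("LoadExtension:MyPlugin.dll");  // send a command to D
    string reply = reader.ReadLine();                // read D's response
}

// In the injected DLL (D), running inside process B: connect back to A.
using (var client = new NamedPipeClientStream(".", "InjectorPipe", PipeDirection.InOut))
{
    client.Connect();
    var reader = new StreamReader(client);
    var writer = new StreamWriter(client) { AutoFlush = true };
    string command = reader.ReadLine();  // receive the command from A
    writer.WriteLine("OK");              // acknowledge it
}
```

In practice you would run the read loop on a background thread so neither side blocks its UI, or layer WCF's NetNamedPipeBinding on top for typed calls instead of raw strings.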
You won't be able to inject a DLL from one process into another. That sort of thing just won't work (even in native (unmanaged) applications). One application can "instruct" (or ask) another "cooperating" application to load a specific DLL, provided that DLL is available locally on the machine. So application "A" could send the DLL to application "B" and ask "B" to load it.
The way application "A" communicates with application "B" is not so important. If you use named pipes, both applications will have to be on the same network. If you want this to work across networks (the internet), then any TCP-based technology can be used. HTTP might be the better choice so you can get past firewalls and such. So HTTP REST could be one option; whether you use ASP.NET or WCF is your choice.
Related
I've created a .NET app, hosted on a secure SharePoint site. I want to be able to run this app via VBA (or other apps).
Through asking and research yesterday, I found this is easy to do via Shell filePath, vbNormalFocus - and I can even pass arguments to the app. But what I really want is to control the app as an object, similar to controlling a COM object. After researching and much confusion today, I found I can't use COM because this isn't a class library - it's an .exe app (although I've learned SO much today!)
Imagine this type of deal:
Dim x As Object
Set x = Shell("filePath", vbNormalFocus)  ' illustrative only - Shell actually returns a task ID, not an object
x.NotSureHowThisWorks
I want to start the app in my Excel environment, and make many calls to and from the app, to give it instructions. The calls could be strings or whatever but the point is I need the running app coupled to an object variable in the VBA code so I can repeat calls and receive responses.
Is this possible?
Since the app is yours, the simplest thing you can do is break it into two projects: one DLL and one EXE. Have the EXE project reference the DLL for all the functionality, make the DLL COM-visible, and then just reference that DLL from your COM application.
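A minimal sketch of what the COM-visible DLL side could look like. The interface, ProgId, and GUIDs below are made-up examples; you would generate your own GUIDs and register the assembly with regasm before VBA can see it.

```csharp
using System.Runtime.InteropServices;

// A dual interface so VBA can both early-bind and late-bind.
[ComVisible(true)]
[Guid("0E2E3C11-5A0B-4F6E-9D2A-7C1B8E4F6A21")]
[InterfaceType(ComInterfaceType.InterfaceIsDual)]
public interface IAppApi
{
    string SendCommand(string command);
}

[ComVisible(true)]
[Guid("6F4D9B02-8C3E-4A17-B5D0-2E9F1C7A8B34")]
[ClassInterface(ClassInterfaceType.None)]
[ProgId("MyApp.Api")]  // the name VBA will use with CreateObject
public class AppApi : IAppApi
{
    public string SendCommand(string command)
    {
        // In the real app this would delegate into the shared DLL's logic.
        return "Handled: " + command;
    }
}
```

After registering with regasm (e.g. regasm MyApp.Api.dll /codebase), the VBA side becomes simply: Dim x As Object : Set x = CreateObject("MyApp.Api") : x.SendCommand "hello" - which is exactly the object-style control the question asks for.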
If it's meant to be a shared process, then it might be simpler to just run it as a service rather than an .exe.
Ok so the scenario is as followed.
Application1 has the ability to load and make calls to an unmanaged C++ DLL.
I want to write another user interface in C# to make my life easier.
I then want the DLL to be able to send information to the C# executable, and the C# executable to be able to send information to the DLL.
The information being passed back and forth is not complex. It will simply be a string.
Any ideas on how this can be done?
This should answer your question. Basically, the easiest options are named pipes for communication on the same machine, and sockets for different machines.
Update
After better consideration, the answer depends on 'Who is in control?' in your scenario.
1. If the C# executable is responsible for calling your unmanaged DLL and sending/retrieving information, then you should go with Platform Invoke.
2. If you want your unmanaged DLL to decide when to send data to the application, then first of all you should transform your DLL into a full-fledged application, and after that go with interprocess communication.
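For option 1, a sketch of both directions of P/Invoke - a straightforward call into the DLL, and a callback delegate handed to it so the native side can push strings back. The DLL name "native.dll" and its exports SendMessageToNative/RegisterCallback are hypothetical names for illustration.

```csharp
using System;
using System.Runtime.InteropServices;

class NativeBridge
{
    // C# -> C++: pass a simple string to the DLL (export name is hypothetical).
    [DllImport("native.dll", CallingConvention = CallingConvention.Cdecl,
               CharSet = CharSet.Ansi)]
    public static extern void SendMessageToNative(string message);

    // C++ -> C#: the delegate type the native side will invoke.
    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    public delegate void MessageCallback(string message);

    [DllImport("native.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern void RegisterCallback(MessageCallback callback);

    // Keep a reference so the garbage collector doesn't free the delegate
    // while native code still holds a pointer to it.
    static MessageCallback _keepAlive;

    public static void Hook()
    {
        _keepAlive = msg => Console.WriteLine("From native: " + msg);
        RegisterCallback(_keepAlive);
    }
}
```

Note the _keepAlive field: forgetting to root the delegate is a classic P/Invoke bug, because the native side ends up calling into collected memory.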
I have two programs that call a DLL. I want them both to call the same instance of the DLL so it can be used to pass information back and forth.
How can I correctly P/Invoke the same instance so that both programs are talking to the same DLL, and information can be passed back and forth using the DLL as an intermediary with reverse P/Invokes and callbacks?
Is pinvoke not the way to do this? Is there a better way?
It is called a "shared section" in a DLL, and it lets you share data between all processes that load that DLL.
You will not be able to share callbacks as code is running in different processes. You need some sort of IPC (inter-process communication) mechanism to do that.
Overall, I would recommend against doing this, as it is an unusual approach to sharing data between applications. You are unlikely to find help and samples showing how to do it, and you would need to read a book (Windows Internals - a useful read anyway) to do it properly on your own.
Use Interprocess Communication with WCF
DLLs are used for shared code, not shared data.
I have some VBA code that needs to talk to a running C# application.
For what it's worth, the C# application runs as a service and exposes an interface via .NET Remoting.
I posted a question regarding a specific problem I'm having already (From VB6 to .net via COM and Remoting...What a mess!) but I think I may have my structure all wrong...
So I'm taking a step back - what's the best way to go about doing this?
One thing that's worth taking into account is that I want to call into the running application - not just call a precompiled DLL...
In the past, one way I accomplished something similar was with Microsoft Message Queuing (MSMQ). Both languages/platforms can read from and write to a queue.
In my scenario, we had a legacy Access database that we had to maintain. We wanted to migrate away from it and replace it with a more robust .NET solution. To get real-time data out of the current system into the new system, we added VBA code that wrote data to a message queue. Then we wrote a C# Windows service to process that data in the new system.
I'm not entirely sure of what you're doing, so this may not be a fit, but I thought I'd mention it.
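A sketch of the C# service side of that pattern, reading what the VBA code wrote. The queue path ".\Private$\LegacyData" is an example name; the project needs a reference to System.Messaging.dll.

```csharp
using System.Messaging;  // requires a reference to System.Messaging.dll

class QueueProcessor
{
    public static void Run()
    {
        // Open (or create) the private queue the VBA side writes to.
        const string path = @".\Private$\LegacyData";
        if (!MessageQueue.Exists(path))
            MessageQueue.Create(path);

        var queue = new MessageQueue(path);
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

        while (true)
        {
            Message msg = queue.Receive();  // blocks until a message arrives
            string body = (string)msg.Body;
            // ... hand the data off to the new .NET system here ...
        }
    }
}
```

Because the queue persists messages, the two sides don't even have to be running at the same time, which is one reason MSMQ suits this legacy-migration scenario.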
I've come up with a solution using my original structure...
That is, the VBA application calls a COM wrapper application that translates all of the types from .NET to COM-safe types. This wrapper then calls the main service using .NET Remoting.
The problem I was having was that the common DLLs shared between the wrapper and the service needed to be in the C:\Program Files\Microsoft Office\Office12 folder (alongside msaccess.exe).
While I was using the AssemblyResolve event to provide the DLLs at runtime, this wasn't working... So for now I'll just have to have the DLLs copied to the folder - a far from elegant solution, but at least the communication is working for now.
Looking for the advantages of loading DLLs dynamically as opposed to letting your application load the DLLs by default.
One advantage is for supporting a plugin architecture.
Suppose, for example, you want to write a service that performs different types of tasks on a schedule. What those tasks do isn't actually relevant to your core service, which is just there to kick them off at the right time. And it's more than likely you'll want to add support for other types of tasks in the future (or another developer might want to). In that scenario, a plugin approach lets you drop in more (interface-compatible) DLLs, which can be coded independently of the core service. So adding support for a new task does not require a new build/deployment of the whole service. If a particular task needs to change, only that DLL needs to be redeployed, and it is then automatically picked up.
It also means other developers don't need to be concerned with the service itself; they just need to know what interface to implement so their DLL can be picked up.
We use this architecture in our processing applications to handle the differences that our various customers require. Each DLL has a similar structure, implements the same interface, and exposes the same entry method, Process(). We have an XML file that defines which class to load based on the customer, and whether there are methods besides Process() that need to be called. Performance should not be an issue until your transaction count gets very high.
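The loading side of this pattern can be sketched in a few lines of reflection code. IProcessor and the folder scan are illustrative stand-ins for the interface and XML-driven configuration described above.

```csharp
using System;
using System.IO;
using System.Reflection;

// The shared contract every plugin DLL implements.
public interface IProcessor
{
    void Process();
}

class PluginLoader
{
    public static void RunAll(string pluginFolder)
    {
        foreach (string path in Directory.GetFiles(pluginFolder, "*.dll"))
        {
            Assembly asm = Assembly.LoadFrom(path);

            // Find every concrete type implementing the shared interface.
            foreach (Type type in asm.GetTypes())
            {
                if (typeof(IProcessor).IsAssignableFrom(type)
                    && !type.IsAbstract && !type.IsInterface)
                {
                    var plugin = (IProcessor)Activator.CreateInstance(type);
                    plugin.Process();
                }
            }
        }
    }
}
```

In a real deployment you would typically read the class name from the XML configuration and use Assembly.CreateInstance on that specific type, rather than scanning every DLL in a folder.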
Loading shared objects dynamically is the mechanism that allows plugins to be added ad hoc to running applications. Without plugins, a modular application would have to be put together at link time or compile time (look at the code of nginx).
Your question is about C#/.NET, and in this world dynamic DLL loading requires advanced programming skills, which could offset all the potential benefits of dynamic loading. You would simply have to write a lot of 'low-level' code.
In C++/Win32, I often have to load a DLL dynamically when it exposes some new API function that is not available on older operating systems. In that case I need to check for the availability of the API at runtime; I cannot just link against the DLL, because that would cause application loading errors on legacy operating systems.
As mentioned, you can also benefit in a plugin-based environment: loading DLLs dynamically gives you more control over your resources. Essentially, COM is a good example of dynamic DLL handling.
If you only load the DLLs you need, then the startup time of the application should be faster.
Another reason to load DLLs dynamically is robustness.
It is possible to load a DLL into what is known as an AppDomain. An AppDomain is basically a sandboxed container that you can put things into (either portions of DLLs or whole EXEs) to run in isolation, but within your application.
Unless you call into a type contained within an AppDomain, it has no way to interact with your application.
So, if you have a dodgy third-party DLL, or a DLL that you don't otherwise have the source code for, you can load it into an AppDomain to keep it isolated from your main application flow.
The end result is that if the third-party DLL throws a wobbly, only the AppDomain is affected, not your entire application.
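A minimal sketch of that isolation. The assembly and type names ("ThirdParty", "ThirdParty.Worker") are hypothetical; note that a type called across the domain boundary must derive from MarshalByRefObject, and this is .NET Framework only (AppDomain creation was removed in .NET Core/.NET 5+).

```csharp
using System;

class Sandbox
{
    public static void RunIsolated()
    {
        // Create the sandbox domain.
        AppDomain domain = AppDomain.CreateDomain("PluginSandbox");
        try
        {
            // Load the suspect DLL inside the sandbox and get a proxy to it.
            var worker = (MarshalByRefObject)domain.CreateInstanceAndUnwrap(
                "ThirdParty",           // assembly name (hypothetical)
                "ThirdParty.Worker");   // type name (hypothetical)

            // ... call into worker through a shared interface here ...
        }
        catch (Exception)
        {
            // A failure in the plugin surfaces here instead of
            // corrupting the host application's state.
        }
        finally
        {
            // Tear down the sandbox; this also unloads the plugin DLL,
            // something you cannot do for a DLL loaded into the main domain.
            AppDomain.Unload(domain);
        }
    }
}
```

The unloadability is often as valuable as the isolation: it is the only way in .NET Framework to evict a loaded assembly without restarting the process.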