I have a C# application that simulates multiple mathematical models written in C++ in parallel (each model on its own thread). Each model has four methods: Init, Run, Free and Save. Each method is loaded dynamically: LoadLibrary from kernel32 loads the DLL, and GetProcAddress retrieves the four methods. The application has Play/Pause/Stop controls.
When we Play the simulation for the first time there is no problem, but if we Stop and Play again, after a couple of seconds the program throws an AccessViolationException in the Run method of a RANDOM model. There is no pattern, which makes me believe the problem lies in the P/Invoke operation. Look at the following example:
//The C++ DLL.
extern "C" void __stdcall Init(const char* name, int size, void* buffer, void** memory);
extern "C" void __stdcall Run(int in_size, int out_size, void* in_buffer, void* out_buffer, void** memory);
extern "C" void __stdcall Free(void** memory);
extern "C" void __stdcall Save(int size, void* buffer, void** memory);
// The C# delegate signatures, obtained via GetProcAddress and Marshal.GetDelegateForFunctionPointer:
ModelInit(string name, int size, IntPtr buffer, ref IntPtr memory);
ModelRun(int in_size, int out_size, IntPtr in_buffer, IntPtr out_buffer, ref IntPtr memory); // <---- THE EXCEPTION IS THROWN HERE
ModelFree(ref IntPtr memory);
ModelSave(int size, IntPtr buffer, ref IntPtr memory);
I have no idea how to debug this. I've already tried building a C++/CLI wrapper to avoid P/Invoke, but some DLLs use third-party libraries built with the /MTd option, which is incompatible with C++/CLI.
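With this pattern, the delegates produced by Marshal.GetDelegateForFunctionPointer are only valid while the DLL is still loaded, and any delegate handed to native code must also be kept alive by managed code. A minimal sketch of the round trip (no native DLL needed; in the real code the IntPtr would come from GetProcAddress, and FreeLibrary must not run while any thread can still call through a cached delegate):

```csharp
using System;
using System.Runtime.InteropServices;

int calls = 0;

// A managed delegate standing in for a native entry point.
Action work = () => calls++;

// Convert it to an unmanaged function pointer, then back to a delegate -
// the same Marshal.GetDelegateForFunctionPointer call the question uses
// on the GetProcAddress result.
IntPtr fp = Marshal.GetFunctionPointerForDelegate(work);
Action viaPointer = Marshal.GetDelegateForFunctionPointer<Action>(fp);

viaPointer(); // calls through the unmanaged thunk into the managed code
Console.WriteLine(calls);

// The original delegate (or, in the real case, the loaded DLL) must stay
// alive for as long as anything can still call through the pointer.
GC.KeepAlive(work);
```

If Stop tears down the model (FreeLibrary, or freeing the `memory` block) while a worker thread is still inside Run, calls through these pointers land in freed memory, which matches the random AccessViolationException described above.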
This question already has answers here:
How do I pinvoke to GetWindowLongPtr and SetWindowLongPtr on 32-bit platforms?
I am trying to use GetWindowLongPtrA but I keep getting "Unable to find an entry point named 'GetWindowLongPtrA' in DLL 'user32.dll'" (SetWindowLongPtrA fails with the same error). I've tried many solutions found on Google, but none of them solved it.
Here is the declaration of the function I've written:
[DllImport("user32.dll")]
public static extern IntPtr GetWindowLongPtrA(IntPtr hWnd, int nIndex);
I tried setting EntryPoint = "GetWindowLongPtrA", changing GetWindowLongPtrA to GetWindowLongPtr, adding CharSet = CharSet.Ansi, and switching to GetWindowLongPtrW with CharSet = CharSet.Unicode; none of it worked.
My computer is definitely 64-bit (yet it cannot call that 64-bit WinAPI function?), and the OS is Windows 10.
My system drive is also running out of free space. Is this a possible cause?
What is the solution for this problem?
There is no function named GetWindowLongPtr, GetWindowLongPtrA or GetWindowLongPtrW in the 32-bit version of user32.dll; the export only exists in the 64-bit version. The reason that using GetWindowLongPtr regardless of target bitness works in C and C++ WinAPI code is that in 32-bit builds it is a macro that expands to GetWindowLong(A|W).
The entry for GetWindowLongPtr on pinvoke.net includes a code sample showing how to make the import transparent to target bitness (remember, the error is thrown when you actually try to call the imported function that doesn't exist, not on the DllImport line):
[DllImport("user32.dll", EntryPoint = "GetWindowLong")]
private static extern IntPtr GetWindowLongPtr32(IntPtr hWnd, int nIndex);

[DllImport("user32.dll", EntryPoint = "GetWindowLongPtr")]
private static extern IntPtr GetWindowLongPtr64(IntPtr hWnd, int nIndex);

// This static method is required because Win32 does not support
// GetWindowLongPtr directly
public static IntPtr GetWindowLongPtr(IntPtr hWnd, int nIndex)
{
    if (IntPtr.Size == 8)
        return GetWindowLongPtr64(hWnd, nIndex);
    else
        return GetWindowLongPtr32(hWnd, nIndex);
}
I'm using the Windows API function ReadFile() to read system metafiles. But what I'm confused about is how to actually process the data that is returned from that function. I'm assuming that it's stored in the lpBuffer parameter, and that I somehow need to decode the contents of that buffer in order to interpret the actual data.
I'm running Windows 10 and am using C# to make interop calls.
Here's my wrapper:
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool ReadFile(SafeFileHandle hFile, IntPtr lpBuffer, uint nNumberOfBytesToRead, out uint lpNumberOfBytesRead, ref NativeOverlapped lpOverlapped);
And here's my call:
NativeMethods.ReadFile(_volumeHandle, (IntPtr)buffer, (uint)len, out read, ref overlapped);
//do something with the buffer???
The data contained in the buffer after the call is a pointer to an int - which is what I expected - but where is the actual file data?
You need to supply the buffer. See https://learn.microsoft.com/en-us/windows/desktop/api/fileapi/nf-fileapi-readfile
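To spell that out: ReadFile does not hand you the data, it writes the raw bytes into memory that the caller allocates and passes as lpBuffer. A sketch of the buffer handling (the ReadFile call itself is Windows-only, so it is shown only as a comment; the written value here is fake demo data, not real file content):

```csharp
using System;
using System.Runtime.InteropServices;

const int len = 4096;
IntPtr buffer = Marshal.AllocHGlobal(len); // caller-owned native buffer
byte[] data = Array.Empty<byte>();
try
{
    // bool ok = NativeMethods.ReadFile(volumeHandle, buffer, (uint)len,
    //                                  out uint read, ref overlapped);
    // On success, the first 'read' bytes at 'buffer' are the file data.
    uint read = 4;                          // pretend ReadFile reported 4 bytes
    Marshal.WriteInt32(buffer, 0x12345678); // pretend file content, demo only

    // Copy the reported number of bytes into managed memory.
    data = new byte[read];
    Marshal.Copy(buffer, data, 0, (int)read);
}
finally
{
    Marshal.FreeHGlobal(buffer); // the caller also frees the buffer
}
Console.WriteLine(data.Length);
```

What you then do with `data` depends entirely on the file format you are reading; ReadFile itself performs no decoding.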
What I am trying to do is write a C# application to generate pictures of fractals (Mandelbrot and Julia sets). I am using unmanaged C++ with CUDA to do the heavy lifting, and C# for the user interface. When I try to run this code, I am not able to call the method I wrote in the DLL - I get an unhandled exception error for an invalid parameter.
The C++ DLL is designed to return a pointer to the pixel data for a bitmap, which is used by the .NET Bitmap to create a bitmap and display it in a PictureBox control.
Here is the relevant code:
C++ (CUDA kernel omitted for conciseness):
extern "C" __declspec(dllexport) int* generateBitmap(int width, int height)
{
    int *bmpData = (int*)malloc(3*width*height*sizeof(int));
    int *dev_bmp;
    gpuErrchk(cudaMalloc((void**)&dev_bmp, (3*width*height*sizeof(int))));

    kernel<<<BLOCKS_PER_GRID, THREADS_PER_BLOCK>>>(dev_bmp, width, height);
    gpuErrchk(cudaPeekAtLastError());
    gpuErrchk(cudaDeviceSynchronize());

    // copy the result back to host memory before freeing the device buffer
    gpuErrchk(cudaMemcpy(bmpData, dev_bmp, 3*width*height*sizeof(int), cudaMemcpyDeviceToHost));
    cudaFree(dev_bmp);
    return bmpData;
}
C#:
public class NativeMethods
{
    [DllImport(@"C:\...\FractalMaxUnmanaged.dll")]
    public static unsafe extern int* generateBitmap(int width, int height);
}
//...
private unsafe void mandlebrotButton_Click(object sender, EventArgs e)
{
    int* ptr = NativeMethods.generateBitmap(FractalBox1.Width, FractalBox1.Height);
    IntPtr iptr = new IntPtr(ptr);
    fractalBitmap = new Bitmap(
        FractalBox1.Width,
        FractalBox1.Height,
        3,
        System.Drawing.Imaging.PixelFormat.Format24bppRgb,
        iptr);
    FractalBox1.Image = fractalBitmap;
}
Error:
************** Exception Text **************
Managed Debugging Assistant 'PInvokeStackImbalance' has detected a problem in 'C:\...WindowsFormsApplication1.vshost.exe'.
I believe the problem I am having is with the IntPtr - is this the correct method to pass a pointer from unmanaged C++ to a C# application? Is there a better method? Is passing a pointer the best method to accomplish what I am trying to do or is there a better way to pass the pixel data from unmanaged C++ w/ CUDA to C#?
EDIT:
From what I gather from the error I get when I debug the application, PInvokeStackImbalance implies that the signatures for the unmanaged and managed code don't match. However, they sure look like they match to me.
I feel like I'm missing something obvious here, any help or recommended reading would be appreciated.
You need to define the same calling convention in C and C#:
In C:
extern "C" __declspec(dllexport) int* __cdecl generateBitmap(int width, int height)
In C#:
[DllImport(@"C:\...\FractalMaxUnmanaged.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr generateBitmap(int width, int height);
Instead of Cdecl you can also use StdCall; it only needs to be the same on both sides.
Having handled a lot of managed/unmanaged interop myself, I also advise you to pass the image array as an argument and do the memory allocation in C#. That way you don't have to take care of manually freeing native memory from the managed world.
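A sketch of that allocate-in-C# approach. The native import would then take a destination pointer (the three-parameter `generateBitmap` signature below is hypothetical, not the question's current DLL, and the native call is simulated so the sketch runs standalone):

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical revised import - native code fills a caller-supplied buffer:
//   [DllImport("FractalMaxUnmanaged.dll", CallingConvention = CallingConvention.Cdecl)]
//   static extern void generateBitmap(int width, int height, IntPtr pixels);
int width = 2, height = 2;
int[] pixels = new int[3 * width * height]; // C# owns the memory

// Pin the managed array so the GC cannot move it while native code writes.
GCHandle handle = GCHandle.Alloc(pixels, GCHandleType.Pinned);
try
{
    IntPtr p = handle.AddrOfPinnedObject();
    // generateBitmap(width, height, p);  // native side would fill the buffer
    Marshal.WriteInt32(p, 0, 0x00FF00FF); // simulate one native write
}
finally
{
    handle.Free(); // always unpin, even if the native call throws
}
Console.WriteLine(pixels[0]);
```

No malloc/free on the native side, nothing to release from C#; the garbage collector reclaims `pixels` normally. (For a blittable `int[]` parameter the P/Invoke marshaller can even pin automatically for the duration of the call, so the explicit GCHandle is only needed if the pointer must outlive the call.)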
I need to use the following function from a native DLL (wsnmp32.dll) in my C# code.
SNMPAPI_STATUS SnmpStartupEx( _Out_ smiLPUINT32 nMajorVersion,...);
//Considering just one for purpose of discussion
For this, I have the dllimport declaration as
[DllImport("wsnmp32.dll")]
internal static extern Status SnmpStartupEx(out IntPtr majorVersion, ...);
//Considering just one for purpose of discussion
I am using the function as
IntPtr majorVersion = Marshal.AllocHGlobal(sizeof(uint));
status = SnmpStartupEx(out majorVersion, out minVersion,
                       out level, out translateMode, out retransmitMode);
After the allocation of memory, I am printing the values of the IntPtr.
<<<DEBUG OUTPUT>>> IntPtr Value = 112235522816
However, after the call to SnmpStartupEx, I find that the IntPtr has changed!
<<<DEBUG OUTPUT>>> IntPtr after calling SnmpStartupEx
<<<DEBUG OUTPUT>>> IntPtr Value = 111669149698
Should I be allocating memory through Marshal.AllocHGlobal before the call?
Is it valid for the IntPtr's address to change after the call?
Try:
[DllImport("wsnmp32.dll")]
internal static extern Status SnmpStartupEx(out UInt32 majorVersion,
out UInt32 minorVersion,
out UInt32 level,
out UInt32 translateMode,
out UInt32 retransmitMode);
Every out parameter is actually a pointer to a variable which the function overwrites. You don't want to write out IntPtr unless the native code has a double-pointer.
You could do all of that yourself with AllocHGlobal and a normal (pass-by-value, not out) IntPtr parameter... but why go to all that trouble when the compiler can do it for you (and the compiler will be faster, since it will take the address of local variables on the stack instead of allocating buffer space dynamically and then copying)?
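For comparison, here is what that manual route looks like: allocate a 4-byte slot, pass its address by value, and read the result back afterwards. This is exactly the work the marshaller does for an `out UInt32` parameter (the native call is simulated here so the sketch runs without wsnmp32.dll):

```csharp
using System;
using System.Runtime.InteropServices;

IntPtr slot = Marshal.AllocHGlobal(sizeof(uint)); // room for one UInt32
uint majorVersion = 0;
try
{
    // SnmpStartupEx(slot, ...);   // native side writes *slot = version
    Marshal.WriteInt32(slot, 2);   // simulate the native write

    // Read the value the "native" code stored through the pointer.
    majorVersion = (uint)Marshal.ReadInt32(slot);
}
finally
{
    Marshal.FreeHGlobal(slot); // manual allocation means manual cleanup
}
Console.WriteLine(majorVersion);
```

With `out UInt32` in the DllImport signature, all of this - taking an address, passing it, reading back, cleanup - happens automatically, and the original symptom ("the IntPtr changed") disappears: the function was overwriting the pointer value itself, not the memory it pointed to.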
First off I would like to say, that I am not trying to hack a game. I am actually employed by the company whose process I am trying to inject. :)
I would like to know how to call a function from an already injected DLL.
So, I have successfully injected and loaded my DLL in the target using CreateRemoteThread(). Below you can see a snippet of the injection:
private static bool Inject(Process pToBeInjected, string sDllPath, out string sError, out IntPtr hwnd, out IntPtr hLibModule)
{
    IntPtr zeroPtr = (IntPtr)0;
    hLibModule = zeroPtr;
    IntPtr hProcess = NativeUtils.OpenProcess(
        (0x2 | 0x8 | 0x10 | 0x20 | 0x400), // create thread, query info, operation, write, and read
        1,
        (uint)pToBeInjected.Id);
    hwnd = hProcess;

    IntPtr loadLibH = NativeUtils.GetProcAddress(NativeUtils.GetModuleHandle("kernel32.dll"), "LoadLibraryA");

    IntPtr dllAddress = NativeUtils.VirtualAllocEx(
        hProcess,
        (IntPtr)null,
        (IntPtr)sDllPath.Length, //520 bytes should be enough
        (uint)NativeUtils.AllocationType.Commit |
        (uint)NativeUtils.AllocationType.Reserve,
        (uint)NativeUtils.MemoryProtection.ExecuteReadWrite);

    byte[] bytes = CalcBytes(sDllPath);
    IntPtr ipTmp = IntPtr.Zero;
    NativeUtils.WriteProcessMemory(
        hProcess,
        dllAddress,
        bytes,
        (uint)bytes.Length,
        out ipTmp);

    IntPtr hThread = NativeUtils.CreateRemoteThread(
        hProcess,
        (IntPtr)null,
        (IntPtr)0,
        loadLibH,   // handle to LoadLibrary function
        dllAddress, // address of the DLL path in the remote process
        0,
        (IntPtr)null);

    uint retV = NativeUtils.WaitForSingleObject(hThread, NativeUtils.INFINITE_WAIT);
    bool exitR = NativeUtils.GetExitCodeThread(hThread, out hLibModule);
    return true;
}
Note: Error checking and freeing resources were removed for brevity, but rest assured I check all the pointers and free my resources.
After the function above exits, I have a non-zero module handle to my DLL returned by LoadLibrary through hLibModule, meaning that the DLL was loaded correctly.
My DLL is a C# class library meant to show a message box (for testing). I have tried testing the function and the message box pops up. It looks like this:
public class Class1
{
    public static void ThreadFunc(IntPtr param)
    {
        IntPtr libPtr = LoadLibrary("user32.dll");
        MessageBox(IntPtr.Zero, "I'm ALIVE!!!!", "InjectedDll", 0);
    }

    [DllImport("kernel32", SetLastError = true)]
    public static extern IntPtr LoadLibrary(string lpFileName);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int MessageBox(IntPtr hWnd, String text, String caption, int options);
}
I compile it from Visual Studio and the DLL appears in the Debug folder. I then pass the full path of my DLL to the injector.
After injection into the target process, I don't know how to call my ThreadFunc from the injected DLL, so it never executes.
I cannot use GetProcAddress(hLibModule, "ThreadFunc") since I am out of process, so the answer must lie in calling CreateRemoteThread() somehow. Also, I have read that DllMain is no longer allowed for .NET DLLs, so I cannot get any free execution that way either.
Does anyone have any idea how to call a function from an injected DLL?
Thank you in advance.
Well, you already got a thread running inside that process. You make it do something boring: it only loads a DLL. This works completely by accident; LoadLibrary just happens to have the correct function signature for a thread entry point.
It can do much more. That had better be unmanaged code though; just like with LoadLibrary(), you cannot count on any managed code running properly. Getting managed code to run takes a heck of a lot more work: you have to load and initialize the CLR and tell it to load and execute the assembly you want to run. And no, you cannot load the CLR from DllMain().
Keywords to look for are CorBindToRuntimeEx() and ICLRRuntimeHost::ExecuteInAppDomain(). This is gritty stuff to get going but I've seen it done. COM and C++ skills and generous helpings of luck required.