First off, I would like to say that I am not trying to hack a game. I am actually employed by the company whose process I am trying to inject into. :)
I would like to know how to call a function from an already injected DLL.
So, I have successfully injected and loaded my DLL into the target process using CreateRemoteThread(). Below is a snippet of the injection:
private static bool Inject(Process pToBeInjected, string sDllPath, out string sError, out IntPtr hwnd, out IntPtr hLibModule)
{
    sError = string.Empty; // error reporting removed for brevity (see note below)
    hLibModule = IntPtr.Zero;

    IntPtr hProcess = NativeUtils.OpenProcess(
        (0x2 | 0x8 | 0x10 | 0x20 | 0x400), // PROCESS_CREATE_THREAD | PROCESS_VM_OPERATION | PROCESS_VM_READ | PROCESS_VM_WRITE | PROCESS_QUERY_INFORMATION
        1,
        (uint)pToBeInjected.Id);
    hwnd = hProcess; // note: this out parameter receives the process handle, not a window handle

    IntPtr loadLibH = NativeUtils.GetProcAddress(NativeUtils.GetModuleHandle("kernel32.dll"), "LoadLibraryA");

    IntPtr dllAddress = NativeUtils.VirtualAllocEx(
        hProcess,
        (IntPtr)null,
        (IntPtr)(sDllPath.Length + 1), // the ANSI path plus its terminating NUL
        (uint)NativeUtils.AllocationType.Commit |
        (uint)NativeUtils.AllocationType.Reserve,
        (uint)NativeUtils.MemoryProtection.ExecuteReadWrite);

    byte[] bytes = CalcBytes(sDllPath);
    IntPtr ipTmp = IntPtr.Zero;
    NativeUtils.WriteProcessMemory(
        hProcess,
        dllAddress,
        bytes,
        (uint)bytes.Length,
        out ipTmp);

    IntPtr hThread = NativeUtils.CreateRemoteThread(
        hProcess,
        (IntPtr)null,
        (IntPtr)0,
        loadLibH,   // address of the LoadLibraryA function
        dllAddress, // address of the DLL path in the remote process
        0,
        (IntPtr)null);

    uint retV = NativeUtils.WaitForSingleObject(hThread, NativeUtils.INFINITE_WAIT);
    bool exitR = NativeUtils.GetExitCodeThread(hThread, out hLibModule);
    return true;
}
Note: Error checking and freeing resources were removed for brevity, but rest assured I check all the pointers and free my resources.
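For context, a hypothetical call site for this helper might look like the following (the target process name and DLL path are placeholders):

Process target = Process.GetProcessesByName("TargetApp")[0]; // hypothetical target process
string sError;
IntPtr hProcess, hModule;
bool ok = Inject(target, @"C:\path\to\MyInjectedDll.dll", out sError, out hProcess, out hModule);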
After the function above exits, hLibModule holds a non-zero module handle to my DLL, returned by LoadLibrary through the thread's exit code, meaning that the DLL was loaded correctly. (GetExitCodeThread only reports a 32-bit DWORD, so this trick would truncate the handle in a 64-bit target process.)
My DLL is a C# class library meant to show a message box (for testing). I have tested the function directly and the message box pops up. It looks like this:
public class Class1
{
    public static void ThreadFunc(IntPtr param)
    {
        IntPtr libPtr = LoadLibrary("user32.dll");
        MessageBox(IntPtr.Zero, "I'm ALIVE!!!!", "InjectedDll", 0);
    }

    [DllImport("kernel32", SetLastError = true)]
    public static extern IntPtr LoadLibrary(string lpFileName);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int MessageBox(IntPtr hWnd, String text, String caption, int options);
}
I compile it from Visual Studio and the DLL appears in the Debug folder. I then pass the full path of my DLL to the injector.
After injection into the target process, I don't know how to call my ThreadFunc from the injected DLL, so it never executes.
I cannot use GetProcAddress(hLibModule, "ThreadFunc") since I am out of process, so the answer must lie in calling CreateRemoteThread() somehow. Also, I have read that DllMain is no longer allowed for .NET DLLs, so I cannot get any free execution that way either.
Does anyone have any idea how to call a function from an injected DLL?
Thank you in advance.
Well, you already got a thread running inside that process. You make it do something boring: it only loads a DLL. This works completely by accident; LoadLibrary just happens to have the correct function signature.
It can do much more. That, however, had better be unmanaged code; just like LoadLibrary(), you cannot count on any managed code running properly. Running managed code takes a heck of a lot more work: you have to load and initialize the CLR and tell it to load and execute the assembly you want to run. And no, you cannot load the CLR in DllMain().
Keywords to look for are CorBindToRuntimeEx() and ICLRRuntimeHost::ExecuteInAppDomain(). This is gritty stuff to get going but I've seen it done. COM and C++ skills and generous helpings of luck required.
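On the managed side, note that ICLRRuntimeHost::ExecuteInDefaultAppDomain (the commonly used sibling of ExecuteInAppDomain) can only invoke a method with one specific signature. A minimal sketch of what the injected assembly's entry point would have to look like; the class and method names here are placeholders:

using System;
using System.Runtime.InteropServices;

public class Bootstrap
{
    // The CLR host can only call a method shaped exactly like this:
    // public static int Method(string arg). Any other signature makes
    // ExecuteInDefaultAppDomain fail.
    public static int EntryPoint(string argument)
    {
        MessageBox(IntPtr.Zero, "Running inside the target via the CLR host", "InjectedDll", 0);
        return 0; // handed back to the native host as the method's result
    }

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int MessageBox(IntPtr hWnd, string text, string caption, int options);
}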
Related
I am currently trying to write a console application in C# with two screen buffers, which should be swapped back and forth (much like VSync on a modern GPU). Since the System.Console class does not provide a way to switch buffers, I had to P/Invoke several methods from kernel32.dll.
This is my current code, grossly simplified:
static void Main(string[] args)
{
    IntPtr oldBuffer = GetStdHandle(-11); // gets the handle for the default console buffer
    IntPtr newBuffer = CreateConsoleScreenBuffer(0, 0x00000001, IntPtr.Zero, 1, 0); // creates a new console buffer

    /* Write data to newBuffer */

    SetConsoleActiveScreenBuffer(newBuffer);
}
The following things occurred:
- The screen remains empty, even though it should be displaying newBuffer.
- When writing to oldBuffer instead of newBuffer, the data appears immediately. Thus, my way of writing into the buffer should be correct.
- Upon calling SetConsoleActiveScreenBuffer(newBuffer), the error code is 6, which means invalid handle. This is strange, as the handle is not -1, which the documentation describes as invalid.
I should note that I have very rarely worked with the Win32 API directly and have very little understanding of common Win32-related problems. I would appreciate any sort of help.
As IInspectable points out in the comments, you're setting dwDesiredAccess to zero. That gives you a handle with no access permissions. There are some edge cases where such a handle is useful, but this isn't one of them.
The only slight oddity is that you're getting "invalid handle" rather than "access denied". I'm guessing you're running Windows 7, so the handle is a user-mode object (a "pseudohandle") rather than a kernel handle.
At any rate, you need to set dwDesiredAccess to GENERIC_READ | GENERIC_WRITE as shown in the sample code.
Also, as Hans pointed out in the comments, the declaration on pinvoke.net was incorrect, specifying the last argument as a four-byte integer rather than a pointer-sized integer. I believe the correct declaration is
[DllImport("kernel32.dll", SetLastError = true)]
static extern IntPtr CreateConsoleScreenBuffer(
uint dwDesiredAccess,
uint dwShareMode,
IntPtr lpSecurityAttributes,
uint dwFlags,
IntPtr lpScreenBufferData
);
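With that declaration in place, the failing call would be rewritten along these lines (a minimal sketch; the constants are the standard Win32 values):

const uint GENERIC_READ = 0x80000000;
const uint GENERIC_WRITE = 0x40000000;
const uint FILE_SHARE_READ = 0x00000001;
const uint CONSOLE_TEXTMODE_BUFFER = 1;

IntPtr newBuffer = CreateConsoleScreenBuffer(
    GENERIC_READ | GENERIC_WRITE, // request read/write access instead of none
    FILE_SHARE_READ,
    IntPtr.Zero,
    CONSOLE_TEXTMODE_BUFFER,
    IntPtr.Zero);                 // pointer-sized, matching the corrected declaration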
I'm working on some code that involves using P/Invoke to call unmanaged functions from a few C++ DLLs. I'd like to be able to build the application as either 32 or 64 bit.
Currently, it only works as x86.
I have 32 and 64 bit copies of each of the referenced C++ DLLs and am using the following code to change the DllDirectory depending on whether the app is built as x86 or x64 (/lib/x64 holds the 64-bit dlls, /lib/x86 holds the 32-bit ones):
[DllImport("kernel32.dll", CharSet = CharSet.Auto)]
static extern bool SetDllDirectory(string lpPathName);
string libPath = Path.Combine(Environment.CurrentDirectory, "lib", (Environment.Is64BitProcess == true ? "x64" : "x86"));
SetDllDirectory(libPath);
The rest of my unmanaged C++ functions are defined as follows:
[DllImport("libgobject-2.0-0.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
static extern void g_type_init();
[DllImport("libgobject-2.0-0.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
static extern void g_object_unref(IntPtr pixbuf);
[DllImport("librsvg-2-2.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
static extern IntPtr rsvg_pixbuf_from_file_at_size(string file_name, int width, int height, out IntPtr error);
[DllImport("libgdk_pixbuf-2.0-0.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
static extern bool gdk_pixbuf_save(IntPtr pixbuf, string filename, string type, out IntPtr error, __arglist);
The code that actually uses these functions looks similar to this:
g_type_init();

IntPtr ptrError;
IntPtr ptrPixbuf = rsvg_pixbuf_from_file_at_size(filePath, width, height, out ptrError);
if (ptrError == IntPtr.Zero)
{
    bool isSaved = gdk_pixbuf_save(ptrPixbuf, outputPath, outputFormat, out ptrError, __arglist(null)); // this line fails when compiled as x64!
    if (isSaved && File.Exists(outputPath))
    {
        return outputPath;
    }
}
g_object_unref(ptrPixbuf);
As I mentioned, everything works fine when running the application as x86 on my local machine (Windows 7 x64). However, when I compile it as an x64 application, I get an "AccessViolationException" at the call to gdk_pixbuf_save().
Any ideas? I'm relatively new to interop code, but I think it might have something to do with how the IntPtr variables are sent to/from the unmanaged code? But why is it different from x86 to x64?
Many thanks to all who commented -- you sent me down the right path, and helped me solve an underlying problem.
Originally, I wanted to have an x64 build just in case it was necessary...and it ended up being just that.
As it turns out, these comments are correct. On x64 builds, the undocumented __arglist keyword does not function as intended.
I can't comment on what, specifically, goes wrong. The comment I linked mentions the possibility of the calling convention not being set correctly. I'm not sure how this works...doesn't x64 only have one calling convention, anyway?
Whatever, back to the point:
I changed my DllImport for gdk_pixbuf_save to look like this:
[DllImport("libgdk_pixbuf-2.0-0.dll", CallingConvention = CallingConvention.Cdecl)]
static extern bool gdk_pixbuf_save(UIntPtr pixbuf, string filename, string type, out UIntPtr error, UIntPtr arglist);
The key here is that I'm passing in the final parameter, arglist, as IntPtr instead of __arglist.
Actually, I'm passing it in as UIntPtr, because I switched all of my original IntPtr objects to UIntPtr.
That being said, when I call the function, it looks like this:
bool isSaved = gdk_pixbuf_save(ptrPixbuf, outputFilePath, outputFileFormat, out ptrError, UIntPtr.Zero);
As my __arglist is empty (there are other, optional parameters that could be specified), the documentation tells me it should be null-terminated. To accomplish this, I pass in IntPtr.Zero (or UIntPtr.Zero, in my case).
Now, my code compiles, runs and I (more importantly) have access to 64-bits' worth of memory.
Thanks again to those who commented on my post - without your pointers toward the __arglist parameter, I'd have been completely clueless.
I have an API function in my application:
<Runtime.InteropServices.DllImport("kernel32.dll", SetLastError:=True, CharSet:=Runtime.InteropServices.CharSet.Ansi, ExactSpelling:=True)>
Private Shared Function GetProcAddress(ByVal hModule As IntPtr, ByVal procName As String) As IntPtr
End Function
I just want to obtain the pointer (IntPtr) value of this function. How can I do that?
Note: I will show you the exact thing that I want in C++
void* fnGetProcAddress;
fnGetProcAddress = GetProcAddress;
Well, you can continue using P/Invoke...
(Note, this is in C#, but easily convertible)
[System.Runtime.InteropServices.DllImport("kernel32.dll")]
public static extern IntPtr GetProcAddress(IntPtr hModule, string procName);
[System.Runtime.InteropServices.DllImport("kernel32.dll")]
public static extern IntPtr GetModuleHandle(string moduleName);
var hModule = GetModuleHandle("kernel32.dll");
var procAddress = GetProcAddress(hModule, "GetProcAddress");
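For example, to inspect the resolved address (a trivial usage sketch):

Console.WriteLine("GetProcAddress lives at 0x{0:X}", procAddress.ToInt64());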
I want to get this address and write it in a BinaryStream as UInt32
This is a very troublesome plan. Quite apart from the wrong data type (a pointer is 64 bits in a 64-bit process, so it does not fit in a UInt32), you have no guarantees whatsoever that the address you write is still valid when you read the stream:
- The DLL might simply not be loaded when you read the stream. It does require a LoadLibrary() call to get it into the process. So at a very minimum you'd also have to serialize the DLL path.
- DLLs are not guaranteed to be loaded at the exact same address again. The load address embedded in the DLL header is merely a request; it is very common for the requested address to already be in use by another DLL, forcing Windows to relocate the DLL. That relocated address is not predictable. A far bigger problem is that relocation is done intentionally on modern Windows versions by a feature called Address Space Layout Randomization (ASLR), enabled when the DLL is linked with the /DYNAMICBASE linker option. It is an anti-malware feature, making it intentionally hard for malware to patch code.
Surely there's a better way to do what you want to do. However, you made the common mistake of not explaining your reasons, so it is impossible to guess at.
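If the underlying goal is to persist "which function", a more robust approach is to serialize what identifies the function (module name and export name) and re-resolve the address at read time. A minimal sketch, reusing the GetModuleHandle/GetProcAddress declarations from the answer above (the file name is a placeholder; BinaryWriter/BinaryReader live in System.IO):

// Writing: store what identifies the function, not the raw pointer.
using (var writer = new BinaryWriter(File.Create("proc.bin")))
{
    writer.Write("kernel32.dll");   // module name
    writer.Write("GetProcAddress"); // export name
}

// Reading: re-resolve the address inside the current process.
using (var reader = new BinaryReader(File.OpenRead("proc.bin")))
{
    string module = reader.ReadString();
    string export = reader.ReadString();
    IntPtr address = GetProcAddress(GetModuleHandle(module), export);
}

Note that GetModuleHandle only works if the module is already loaded; for an arbitrary DLL you would serialize its full path and call LoadLibrary instead.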
I am trying to load a simple DLL, compiled with GCC under Cygwin, into a C# .NET application. The DLL looks like this:
#ifndef __FOO_H
#define __FOO_H
#if _WIN32
#define EXPORT extern "C" __declspec(dllexport)
#else //__GNUC__ >= 4
#define EXPORT extern "C" __attribute__((visibility("default")))
#endif
EXPORT int bar();
#endif // __FOO_H
The function bar() just returns 42.
I compiled and linked the DLL with
g++ -shared -o foo.dll foo.cpp
Now I want to load this super simple DLL into a C# WinForms application.
public partial class Form1 : Form
{
    [DllImport("kernel32", CharSet = CharSet.Ansi, ExactSpelling = true, SetLastError = true)]
    static extern IntPtr GetProcAddress(IntPtr hModule, string procName);

    [DllImport("kernel32", SetLastError = true)]
    static extern IntPtr LoadLibrary(string lpFileName);

    public delegate IntPtr Action2();

    unsafe public Form1()
    {
        InitializeComponent();
        IntPtr pcygwin = LoadLibrary("cygwin1.dll");
        IntPtr pcyginit = GetProcAddress(pcygwin, "cygwin_dll_init");
        Action init = (Action)Marshal.GetDelegateForFunctionPointer(pcyginit, typeof(Action));
        init();
    }

    unsafe private void button1_Click(object sender, EventArgs e)
    {
        IntPtr foo = LoadLibrary("foo.dll"); // CRASH ... sometimes
        IntPtr barProc = GetProcAddress(foo, "bar");
        Action2 barAction = (Action2)Marshal.GetDelegateForFunctionPointer(barProc, typeof(Action2));
        IntPtr inst = barAction();
    }
}
Now the strange thing is: sometimes it works and sometimes it doesn't. When it doesn't work it crashes when it loads foo.dll. I run it in debug mode but I don't even get an exception. The debugger just stops as if I stopped it myself!
I also tried to load foo.dll in the same stack frame where I load cygwin1.dll. Same thing!
Any hints why this happens and what I can do to make it work?
Update 1: We use the latest cygwin and Visual Studio 2010.
Update 2: An assumption is that it has to do with timing and garbage collection. It seems to me that the time between loading cygwin1.dll and loading foo.dll matters: the shorter the time between the two LoadLibrary calls, the more likely it seems to work.
Update 3: If loading foo.dll succeeds the first time it succeeds always during a session. I can click button1 as often as I want.
Note: LoadLibrary("foo.dll") does not simply fail to load foo.dll. That'd be nice. I crashes and the debugger stops working. Not even an exception is thrown. AND it does not crash always. Sometimes it works!
Look at the "UPDATED" part of my old answer about the close problem. I recommend you to to compile your DLL with respect of MinGW tools instead of CygWin tools. If nothing is changed in the since the time the requirement "Make sure you have 4K of scratch space at the bottom of your stack" makes CygWin DLLs incompatible to .NET. I don't know how to realized the requirement in a .NET application.
You should try Process Monitor from Microsoft. This is most likely being caused by a failure to load a dependent DLL. Process Monitor will show you which DLL is failing to load and why.
Try the following...
[DllImport("kernel32", CharSet=CharSet.Unicode)]
static extern IntPtr LoadLibrary(string lpLibFileName);
maybe even use
[DllImport("kernel32", CharSet=CharSet.Unicode, SetLastError=true)]
and check for return values from both your calls to LoadLibrary and GetProcAddress before trying to use them.
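For instance, a sketch of that checking (Win32Exception lives in System.ComponentModel):

IntPtr foo = LoadLibrary("foo.dll");
if (foo == IntPtr.Zero)
    throw new Win32Exception(Marshal.GetLastWin32Error(), "LoadLibrary(foo.dll) failed");

IntPtr barProc = GetProcAddress(foo, "bar");
if (barProc == IntPtr.Zero)
    throw new Win32Exception(Marshal.GetLastWin32Error(), "GetProcAddress(bar) failed");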
I am working on an application in C# that needs to send a message to a C++ application.
I imported
[DllImport("user32.dll")]
public static extern IntPtr SendMessage(
int hWnd, // handle to destination window
uint Msg, // message
IntPtr wParam, // first message parameter
IntPtr lParam // second message parameter
);
but now my problem is that I need to pass strings through wParam and lParam. I tried unsafe code, but it seems a string just doesn't work like the rest of the variables. How can I achieve that? Thanks.
The declaration is wrong: the hWnd argument should be IntPtr, not int, since window handles are pointer-sized. (wParam and lParam are correctly declared as IntPtr.)
There is a complication because you are trying to send strings. What matters is whether the target window is Unicode-enabled or not. There are two versions of SendMessage: SendMessageA() and SendMessageW(). The former needs to be used if the program is dated and uses 8-bit character strings rather than UTF-16 encoded strings.
You can find out by using Spy++. Use the finder tool to select the window of the application. In the General property tab, you'll see "Window proc". It will say (Unicode) if the window is Unicode enabled. If you don't see it then the strings have to be translated to 8-bit characters.
To generate the string pointers you need to pass, you can use Marshal.StringToHGlobalAnsi or StringToHGlobalUni (respectively 8-bit and Unicode). You can however play a trick to let the P/Invoke marshaller translate the string for you. Saves you the hassle of having to free the strings after the call. For the Ansi version, you can declare the API function like this:
[DllImport("user32.dll", CharSet = CharSet.Ansi, EntryPoint = "SendMessageA", ExactSpelling = true)]
private static extern IntPtr SendMessageStrings(IntPtr hWnd, int msg, string wParam, string lParam);
And the Unicode version like this:
[DllImport("user32.dll", CharSet = CharSet.Unicode, EntryPoint = "SendMessageW", ExactSpelling = true)]
private static extern IntPtr SendMessageStrings(IntPtr hWnd, int msg, string wParam, string lParam);
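As a usage sketch, here is WM_SETTEXT, one of the few standard messages that takes a string in lParam (targetHwnd is a placeholder for a window handle obtained elsewhere; WM_SETTEXT also happens to be one of the messages Windows itself marshals across processes):

const int WM_SETTEXT = 0x000C;
// wParam is unused for WM_SETTEXT; null marshals as a NULL pointer.
SendMessageStrings(targetHwnd, WM_SETTEXT, null, "new window title");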
One final note: this will not work as-is if the window belongs to another application; you'll crash it. The pointer values you pass are only valid in your own process, not in the C++ process. To work around that, you have to allocate memory in the target process so that the pointer is valid:
1. OpenProcess to get a handle to the process.
2. VirtualAllocEx() to allocate memory in the target process, big enough to store the string.
3. WriteProcessMemory to write the string.
Now you can call SendMessage(): use a version that is declared with IntPtr for the wParam and lParam arguments, and pass the value you got from VirtualAllocEx. Next, use VirtualFreeEx() to release the memory and CloseHandle to clean up. Or keep the memory around for the next time if you do this often.
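A minimal sketch of that sequence, assuming the standard P/Invoke declarations for OpenProcess, VirtualAllocEx, WriteProcessMemory, VirtualFreeEx, and CloseHandle (targetPid, targetHwnd, and msg are placeholders; Encoding is in System.Text; the constants are standard Win32 values):

const uint PROCESS_VM_OPERATION = 0x0008, PROCESS_VM_WRITE = 0x0020;
const uint MEM_COMMIT = 0x1000, MEM_RESERVE = 0x2000, PAGE_READWRITE = 0x04, MEM_RELEASE = 0x8000;

byte[] payload = Encoding.Unicode.GetBytes(text + "\0"); // include the terminator

IntPtr hProc = OpenProcess(PROCESS_VM_OPERATION | PROCESS_VM_WRITE, false, targetPid);
IntPtr remote = VirtualAllocEx(hProc, IntPtr.Zero, (uint)payload.Length, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
IntPtr written;
WriteProcessMemory(hProc, remote, payload, (uint)payload.Length, out written);

SendMessage(targetHwnd, msg, IntPtr.Zero, remote); // the IntPtr-based overload

VirtualFreeEx(hProc, remote, 0, MEM_RELEASE); // safe here: SendMessage is synchronous
CloseHandle(hProc);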
Quite a lot of P/Invoke to get wrong there. Not to mention security issues: WriteProcessMemory requires admin privileges, so UAC elevation is required.
Passing the address of the string would involve unsafe code, since Win32 API calls expect addresses (in a native C/C++ environment). What Windows message are you sending that requires a string in wParam or lParam?
We ended up using "WmCpyDta_d.dll" to deal with all that.