Running a program from a byte array using VirtualAlloc? - c#

I'm working on an SFX/protector application in C#, and I want the protected assembly to be executed from a byte array instead of being written to disk, so that it is much harder to reverse engineer.
I have a program in a byte array (which has a valid entry point) and I want to execute it.
I found a similar question on this site about how to do that. I know it can be done using the code snippet below, but can someone guide me on how to run a program from a byte array with it?
Technically, the code below lets me do this:
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

namespace DynamicX86
{
    class Program
    {
        const uint PAGE_EXECUTE_READWRITE = 0x40;
        const uint MEM_COMMIT = 0x1000;

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern IntPtr VirtualAlloc(IntPtr lpAddress, uint dwSize, uint flAllocationType, uint flProtect);

        private delegate int IntReturner();

        static void Main(string[] args)
        {
            // Hand-assembled x86: mov eax, 42 / ret
            List<byte> bodyBuilder = new List<byte>();
            bodyBuilder.Add(0xb8);                           // mov eax, imm32
            bodyBuilder.AddRange(BitConverter.GetBytes(42)); // the immediate 42
            bodyBuilder.Add(0xc3);                           // ret
            byte[] body = bodyBuilder.ToArray();

            // Allocate executable memory, copy the instructions in,
            // and wrap the buffer in a callable delegate.
            IntPtr buf = VirtualAlloc(IntPtr.Zero, (uint)body.Length, MEM_COMMIT, PAGE_EXECUTE_READWRITE);
            Marshal.Copy(body, 0, buf, body.Length);
            IntReturner ptr = (IntReturner)Marshal.GetDelegateForFunctionPointer(buf, typeof(IntReturner));
            Console.WriteLine(ptr()); // prints 42
        }
    }
}
How can I use this answer to run a program from an array of bytes? I can't understand exactly what I can do with this code. Please help.
This is the link where I found the answer:
Is it possible to execute an x86 assembly sequence from within C#?
Any help would be highly appreciated.

What is the valid entry point, and what is its signature? Where do you get these bytes from? Do you generate IL? If you do, it might be easier to simply do that.
What the code above is trying to do is allocate unmanaged memory, fill it with x86 instructions and then have .NET create a delegate from this "function pointer" and execute it - which is different from what you want.
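If the byte array holds a managed .NET assembly (as in the SFX/protector scenario), a much simpler route is Assembly.Load(byte[]). A minimal sketch, assuming rawAssembly contains the protected assembly; note this only works for managed assemblies, not native executables:

using System;
using System.Reflection;

static class InMemoryLoader
{
    public static void RunFromBytes(byte[] rawAssembly)
    {
        // Load the assembly straight from memory; nothing touches the disk.
        Assembly asm = Assembly.Load(rawAssembly);

        // Invoke the entry point (Main). Pass an empty string[] if it expects args.
        MethodInfo entry = asm.EntryPoint;
        object[] invokeArgs = entry.GetParameters().Length == 0
            ? null
            : new object[] { new string[0] };
        entry.Invoke(null, invokeArgs);
    }
}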

Related

Error reading address with ReadProcessMemory in C#

I get an error reading a memory address of the game.
My code is this:
public partial class MainWindow : Window
{
    [DllImport("kernel32.dll")]
    public static extern IntPtr OpenProcess(int dwDesiredAccess, bool bInheritHandle, int dwProcessId);

    [DllImport("kernel32.dll")]
    public static extern bool ReadProcessMemory(int hProcess, int lpBaseAddress, byte[] lpBuffer, int dwSize, ref int lpNumberOfBytesRead);

    const int PROCESS_WM_READ = 0x0010;

    public MainWindow()
    {
        arguments dir = new arguments();
        Process process = Process.GetProcessesByName(dir.proccessname)[0];
        IntPtr processHandle = OpenProcess(PROCESS_WM_READ, false, process.Id);
        int bytesRead = 0;
        byte[] buffer = new byte[4];
        ReadProcessMemory((int)processHandle, dir.heal_Act, buffer, buffer.Length, ref bytesRead);
    }
}
However, nothing is read. In Cheat Engine, the values read correctly:
Read address with Cheat Engine
The image shows the reading of the memory address and the value it contains, 4 bytes in size.
Also, I don't understand how the address is summed, as shown in box 2, to obtain the contained value.
Could you help me read from the indicated address? All I get is zero.
You need to run your process as administrator, and you should compile for the same architecture as the target process. If you're targeting x86, compile as x86.
Keep in mind that your ReadProcessMemory declaration only takes 4-byte addresses as arguments. You will want to use IntPtr for the addresses, so that when you someday switch to x64 the arguments will be large enough for 64-bit pointers.
If these things don't solve your problem, then your offsets or addresses are wrong. In that case, the best thing to do is to use the Visual Studio debugger and compare what you see against what you see in Cheat Engine.
Also, in PROCESS_WM_READ you have a 'W' where it should be a 'V' (PROCESS_VM_READ).
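For reference, a pointer-sized sketch of the declaration (IntPtr for both the handle and the address, constant spelled correctly), reusing dir.heal_Act from the question:

[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, [Out] byte[] lpBuffer, int dwSize, out int lpNumberOfBytesRead);

const int PROCESS_VM_READ = 0x0010;

// Usage: read 4 bytes from the target address.
int bytesRead;
byte[] buffer = new byte[4];
ReadProcessMemory(processHandle, (IntPtr)dir.heal_Act, buffer, buffer.Length, out bytesRead);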

P/Invoking "wglChoosePixelFormatARB" Unbalances the Stack

I am trying to use some pinvokes to set up a wgl context for some unit tests, but (so far) one of my methods unbalances the stack.
What I have done is to first create a window and get its DC. This all works, as I am talking to the kernel32, user32, and gdi32 libraries. I want to draw to a pbuffer with OpenGL, and in order to create the pbuffer, I need to use the extensions, which requires that I already have a context... This is all sadly normal and working so far.
The problem comes when I am trying to create the pbuffer. When I try get the configurations using wglChoosePixelFormatARB, this appears to unbalance the stack. I have just executed another ARB method (wglGetExtensionsStringARB) earlier to check the extensions - and that works fine using the same parent DC.
So, on to code... My delegate looks like this:
[UnmanagedFunctionPointer(CallingConvention.Winapi)]
[return: MarshalAs(UnmanagedType.Bool)]
public delegate bool wglChoosePixelFormatARBDelegate(
    IntPtr dc,
    [In] int[] attribIList,
    [In] float[] attribFList,
    uint maxFormats,
    [Out] int[] pixelFormats,
    out uint numFormats);
I find it like this:
[DllImport(opengl32)]
public static extern IntPtr wglGetProcAddress(string lpszProc);
// ...
var ptr = wglGetProcAddress("wglCreatePbufferARB");
wglCreatePbufferARB = Marshal.GetDelegateForFunctionPointer(ptr, typeof(wglChoosePixelFormatARBDelegate));
And I am invoking it like this:
var iAttrs = new int[]
{
    Wgl.WGL_ACCELERATION_ARB, Wgl.WGL_FULL_ACCELERATION_ARB,
    Wgl.WGL_DRAW_TO_WINDOW_ARB, Wgl.TRUE,
    Wgl.WGL_SUPPORT_OPENGL_ARB, Wgl.TRUE,
    Wgl.NONE, Wgl.NONE
};
var fAttrs = new float[2];
var piFormats = new int[1];
uint nFormats;
wglChoosePixelFormatARB(
    parentDC,
    iAttrs,
    fAttrs,
    (uint)piFormats.Length,
    piFormats,
    out nFormats);
if (nFormats == 0)
{
    return IntPtr.Zero;
}
var pbuf = extensions.wglCreatePbufferARB(parentDC, piFormats[0], 1, 1, null);
The native signature (this function is not exported from opengl32.dll) is:
BOOL WINAPI wglChoosePixelFormatARB(
    HDC hdc,
    const int *piAttribIList,
    const FLOAT *pfAttribFList,
    UINT nMaxFormats,
    int *piFormats,
    UINT *nNumFormats);
And the function pointer typedef is this:
typedef BOOL (WINAPI * PFNWGLCHOOSEPIXELFORMATARBPROC) (
    HDC hdc,
    const int *piAttribIList,
    const FLOAT *pfAttribFList,
    UINT nMaxFormats,
    int *piFormats,
    UINT *nNumFormats);
The code looks OK to me, but there must be something wrong :) I hope someone can point out my error.
In case more code is required, I have it all here in a single file that is just a plain console app:
https://gist.github.com/mattleibow/755eba3c8ff5eafb9549842a0abb0426
(the code has large chunks of comments, this is just because I am busy porting from C++ to C#. And a third of the code is just the dllimports/structs/enums)
The function declaration looks reasonable but it seems that you are simply importing the wrong function.
You want wglChoosePixelFormatARB but actually import wglCreatePbufferARB. This smells like a classic copy/paste SNAFU.
Fix this by correcting the name that you pass to wglGetProcAddress.
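In other words, based on the question's own code, the lookup should be:

var ptr = wglGetProcAddress("wglChoosePixelFormatARB");
wglChoosePixelFormatARB = (wglChoosePixelFormatARBDelegate)Marshal.GetDelegateForFunctionPointer(ptr, typeof(wglChoosePixelFormatARBDelegate));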

Edit Memory Address via c#

I want to edit an active app (edit a memory address).
At address 00498D45, I want to change the value.
Current value:
MOV BYTE PTR SS:[EBP-423],7
to the updated value:
MOV BYTE PTR SS:[EBP-423],8
What I have so far is this (I searched for it on the net, and this is how far I got). Thanks in advance!
using System.Runtime.InteropServices;

[Flags]
public enum ProcessAccessFlags : uint
{
    All = 0x001F0FFF,
    Terminate = 0x00000001,
    CreateThread = 0x00000002,
    VMOperation = 0x00000008,
    VMRead = 0x00000010,
    VMWrite = 0x00000020,
    DupHandle = 0x00000040,
    SetInformation = 0x00000200,
    QueryInformation = 0x00000400,
    Synchronize = 0x00100000
}

[DllImport("kernel32.dll")]
private static extern IntPtr OpenProcess(ProcessAccessFlags dwDesiredAccess, [MarshalAs(UnmanagedType.Bool)] bool bInheritHandle, int dwProcessId);

[DllImport("kernel32.dll", SetLastError = true)]
private static extern bool WriteProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, uint nSize, out int lpNumberOfBytesWritten);

[DllImport("kernel32.dll", SetLastError = true)]
static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, [Out] byte[] lpBuffer, int dwSize, out int lpNumberOfBytesRead);

[DllImport("kernel32.dll")]
public static extern Int32 CloseHandle(IntPtr hProcess);

Process process = Process.GetProcessesByName("My Apps Name").FirstOrDefault();

public static bool WriteMemory(Process process, int address, long value, out int bytesWritten)
{
    IntPtr hProc = OpenProcess(ProcessAccessFlags.All, false, process.Id);
    byte[] val = BitConverter.GetBytes(value);
    bool worked = WriteProcessMemory(hProc, new IntPtr(address), val, (UInt32)val.LongLength, out bytesWritten);
    CloseHandle(hProc);
    return worked;
}
From your other question:
WriteMemory(Process process,00498D45 , MOV BYTE PTR SS:[EBP-423],8)
There are so many problems with this, I don't know where to begin. First of all, that's not anywhere near correct C# syntax.
You're calling a function, but you have Process there like it's a signature.
00498D45 is not a valid constant in any base. If you mean hex, (which you probably do since you're dealing with addresses) then like all other C-like languages, that should be expressed as 0x00498D45.
That's x86 assembly code in ASCII (but it's not in a string, you just have a mess). You can't just plop ASCII assembly code into another process's address space!
Perhaps you should do a little more research on how compilation and assembly work when building a program, and on what your CPU is actually doing when it executes one. Also, I recommend reading through the sample code you've very obviously taken from somewhere and trying to understand it. You'll be way better off learning what's going on than asking everyone to help fix the stuff you've cobbled together. </rant>
Anyway, after you assemble your code, it looks like this (re-disassembled):
C68559FEFFFF08 mov byte [ebp-0x1a7],0x8
That means that your instruction is actually the string of bytes C6 85 59 FE FF FF 08. So that is what you need to write into your target application.
This is the basis of what you're trying to do:
byte[] new_instr = new byte[] {0xC6, 0x85, 0x59, 0xFE, 0xFF, 0xFF, 0x08};
IntPtr target_addr = (IntPtr)0x00498D45;
int bytesWritten;
WriteProcessMemory(hProcess, target_addr, new_instr, (UInt32)new_instr.Length, out bytesWritten);
The WriteMemory function you've copied and pasted won't help you here. The problem is that it only writes a long, which in C# is a fixed 8 bytes; you need to write exactly 7 bytes. So you'll either have to modify that function to take a byte[] parameter, or do it yourself.
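A sketch of that modification, reusing the P/Invoke declarations from the question:

public static bool WriteMemory(Process process, IntPtr address, byte[] newBytes, out int bytesWritten)
{
    IntPtr hProc = OpenProcess(ProcessAccessFlags.All, false, process.Id);
    // Write exactly newBytes.Length bytes at the target address.
    bool worked = WriteProcessMemory(hProc, address, newBytes, (uint)newBytes.Length, out bytesWritten);
    CloseHandle(hProc);
    return worked;
}

// e.g. WriteMemory(process, (IntPtr)0x00498D45, new_instr, out bytesWritten);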
I admire your ambition, but you should really start a bit lower. Write some C# code to get yourself familiar with programming in a C-like language. Then write some C to get familiar with crashing when you don't do things perfectly. Then try dabbling in assembly - perhaps writing small inline pieces into your C code. Finally then, you'll be ready to go hacking around the instructions of other running processes.
You can use an injection library like MemorySharp (I'm the author). All these functions are wrapped in a handy API that integrates an assembler (FASM syntax). Thus this piece of code performs what you want:
using (var m = new MemorySharp(ApplicationFinder.FromProcessName("My App").First()))
{
    m.Assembly.Inject("MOV BYTE [EBP-423],8", new IntPtr(0x00498D45));
}

How to use RtlCaptureStackBackTrace() from the Kernel32.dll library?

I know about the StackTrace class, but for what I am doing it is too slow due to all the symbol lookups.
I heard about the RtlCaptureStackBackTrace function in Kernel32.dll, but I can't seem to get it to work correctly. I am unfamiliar with P/Invoke, as I rarely have to do anything with it, and what I do need is usually well covered on the net. This, however, is not :(.
This is the definition I have:
[DllImport("kernel32.dll")]
public static extern ushort RtlCaptureStackBackTrace(uint FramesToSkip, uint FramesToCapture, out IntPtr BackTrace, out uint BackTraceHash);
And this is how I am using it:
IntPtr trace = IntPtr.Zero;
uint traceHash = 0;
int framesCaptured = 0;
framesCaptured = RtlCaptureStackBackTrace(0, 4, out trace, out traceHash);
int[] frames = new int[framesCaptured];
Marshal.Copy(trace, frames, 0, framesCaptured);
What I see is that framesCaptured is 1, when it should be at least 4 user frames plus system calls for my test app.
EDIT: I should probably be explicit: my question is, what am I doing wrong with RtlCaptureStackBackTrace such that I am not receiving the correct stack address information? Or am I using it right and just don't know how to interpret the data?
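For comparison, the native signature takes a PVOID* pointing at a caller-allocated array and fills in up to FramesToCapture entries, so the conventional P/Invoke has the caller allocate the pointer array up front. A sketch of that approach:

[DllImport("kernel32.dll")]
public static extern ushort RtlCaptureStackBackTrace(uint FramesToSkip, uint FramesToCapture, [Out] IntPtr[] BackTrace, out uint BackTraceHash);

// The caller supplies the buffer; the return value is the number of
// frame addresses actually written into it.
IntPtr[] trace = new IntPtr[4];
uint traceHash;
ushort framesCaptured = RtlCaptureStackBackTrace(0, (uint)trace.Length, trace, out traceHash);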

Strange problem calling a .DLL from C#

I'm trying to call the HtmlTidy library DLL from C#. There are a few examples floating around on the net, but nothing definitive... and I'm having no end of trouble. I'm pretty certain the problem is with the P/Invoke declaration... but danged if I know where I'm going wrong.
I got the libtidy.dll from http://www.paehl.com/open_source/?HTML_Tidy_for_Windows which seems to be a current version.
Here's a console app that demonstrates the problem I'm having:
using System;
using System.Collections.Generic;
using System.Text;
using System.Runtime.InteropServices;

namespace ConsoleApplication5
{
    class Program
    {
        [StructLayout(LayoutKind.Sequential)]
        public struct TidyBuffer
        {
            public IntPtr bp;      // Pointer to bytes
            public uint size;      // # bytes currently in use
            public uint allocated; // # bytes allocated
            public uint next;      // Offset of current input position
        };

        [DllImport("libtidy.dll")]
        public static extern int tidyBufAlloc(ref TidyBuffer tidyBuffer, uint allocSize);

        static void Main(string[] args)
        {
            Console.WriteLine(CleanHtml("<html><body><p>Hello World!</p></body></html>"));
        }

        static string CleanHtml(string inputHtml)
        {
            byte[] inputArray = Encoding.UTF8.GetBytes(inputHtml);
            byte[] inputArray2 = Encoding.UTF8.GetBytes(inputHtml);

            TidyBuffer tidyBuffer2;
            tidyBuffer2.size = 0;
            tidyBuffer2.allocated = 0;
            tidyBuffer2.next = 0;
            tidyBuffer2.bp = IntPtr.Zero;

            //
            // tidyBufAlloc overwrites inputArray2... why? how? It seems like
            // tidyBufAlloc is stomping on the stack a bit too much... but
            // how? I've tried changing the calling convention to cdecl and
            // stdcall, but no change.
            //
            Console.WriteLine(inputArray2 == null ? "Array2 null" : "Array2 not null");
            tidyBufAlloc(ref tidyBuffer2, 65535);
            Console.WriteLine(inputArray2 == null ? "Array2 null" : "Array2 not null");
            return "did nothing";
        }
    }
}
All in all I'm a bit stumped. Any help would be appreciated!
You are working with an old definition of the TidyBuffer structure. The current structure is larger, so when you call the allocation function it overwrites the stack location holding inputArray2. The current definition is:
[StructLayout(LayoutKind.Sequential)]
public struct TidyBuffer
{
    public IntPtr allocator; // Pointer to custom allocator
    public IntPtr bp;        // Pointer to bytes
    public uint size;        // # bytes currently in use
    public uint allocated;   // # bytes allocated
    public uint next;        // Offset of current input position
};
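With the larger struct, the field-by-field initialization in CleanHtml would zero the new field as well (a sketch against the question's code):

TidyBuffer tidyBuffer2;
tidyBuffer2.allocator = IntPtr.Zero; // new field in the current definition
tidyBuffer2.bp = IntPtr.Zero;
tidyBuffer2.size = 0;
tidyBuffer2.allocated = 0;
tidyBuffer2.next = 0;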
For what it's worth, we tried Tidy at work and switched to HtmlAgilityPack.
Try changing your tidyBufAlloc declaration to:
[DllImport("libtidy.dll", CharSet = CharSet.Ansi)]
private static extern int tidyBufAlloc(ref TidyBuffer Buffer, int allocSize);
Note the CharSet.Ansi addition and the "int allocSize" (instead of uint).
Also, see this sample code for an example of using HTML Tidy in C#.
In your example, if inputHtml is large, say 50K, inputArray and inputArray2 will also be 50K each.
You are then also trying to allocate 65K in the tidyBufAlloc call.
If a pointer is not initialised correctly, it is quite possible that a random .NET heap address is being used, hence the overwriting of part or all of a seemingly unrelated variable/buffer. It is probably just luck, or the fact that you have already allocated large buffers, that you are not overwriting a code block, which would likely cause an invalid memory access error.
