How to read a MultiLevelPointer? - C#

I'm struggling a bit with this part.
I want to do in my C# app what Cheat Engine does here (that is, read the value 20).
However, my code is not working:
[DllImport("kernel32.dll")]
public static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, int dwSize, ref IntPtr lpNumberOfBytesRead);

public int ReadInt32(IntPtr address, int[] pointers)
{
    /* FOR REFERENCE ONLY! PSEUDO-CODE
    ReadProcessMemory(..., ModuleBaseAddress + 0x010F418, Temporary, ..., ...); // -> 0x02A917F8
    ReadProcessMemory(..., 0x02A917F8 + 0x48, Temporary, ..., ...);             // -> 0x02A9A488
    [0x02A9A488] = 20
    */
    IntPtr bytesRead = IntPtr.Zero;
    byte[] _buff = new byte[sizeof(int)];
    int offIndex = 0;
    IntPtr finalval = address;
    Console.WriteLine("[BASE] {0:x}", (int)address);
    foreach (int PointerOffs in pointers)
    {
        ReadProcessMemory(hProcess, address, _buff, _buff.Length, ref bytesRead);
        finalval += pointers[offIndex];
        Console.WriteLine("[Curr ADDRESS] {0:x}", finalval);
        offIndex++;
    }
    return BitConverter.ToInt32(_buff, 0);
}
And this is how I call the method:
int currAmmo = (int) pReader.ReadInt32((IntPtr)LocalPlayer.BaseAddress, LocalPlayer.oMGAmmo);
Console.Write("[AMMO] {0}\n", currAmmo);

Your function has enough problems to warrant a replacement; I tried to fix it, but it was easier to just start fresh. The important detail is that each pointer is dereferenced before the next offset is added (read first, then add offsets[i]), which is what you want.
public static int ReadInt32(IntPtr hProc, IntPtr ptr, int[] offsets)
{
    // Note: this assumes a ReadProcessMemory P/Invoke whose last parameter is
    // 'out IntPtr lpNumberOfBytesRead' rather than 'ref'.
    IntPtr addr = ptr;
    var buffer = new byte[4];
    for (int i = 0; i < offsets.Length; ++i)
    {
        // Dereference the current pointer, then add the next offset.
        ReadProcessMemory(hProc, addr, buffer, buffer.Length, out var read1);
        addr = IntPtr.Add(new IntPtr(BitConverter.ToInt32(buffer, 0)), offsets[i]);
    }
    // Final read: the value itself.
    ReadProcessMemory(hProc, addr, buffer, 4, out var read);
    return BitConverter.ToInt32(buffer, 0);
}
I learned C# today just to answer this question :)
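For completeness, calling it stays almost the same as in the question, only with the process handle passed in explicitly (here hProc stands for whatever handle OpenProcess returned):
int currAmmo = ReadInt32(hProc, (IntPtr)LocalPlayer.BaseAddress, LocalPlayer.oMGAmmo);
Console.Write("[AMMO] {0}\n", currAmmo);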

Related

ReadProcessMemory Read string until NULL

I'm trying to read the string "845120" from a process's memory, but I'm having some trouble.
I know "845120" looks numeric, but in some cases it can be alphanumeric, which is why it's a string and not a 4-byte int.
Here is my Memory class, where I have all functions that deal with memory:
[DllImport("kernel32.dll", SetLastError = true)]
private static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, [Out] byte[] lpBuffer, int dwSize, out IntPtr lpNumberOfBytesRead);

public static IntPtr FindDMAAddy(IntPtr hProc, IntPtr ptr, int[] offsets)
{
    var buffer = new byte[IntPtr.Size];
    foreach (int i in offsets)
    {
        ReadProcessMemory(hProc, ptr, buffer, buffer.Length, out var read);
        ptr = (IntPtr.Size == 4)
            ? IntPtr.Add(new IntPtr(BitConverter.ToInt32(buffer, 0)), i)
            : IntPtr.Add(new IntPtr(BitConverter.ToInt64(buffer, 0)), i);
    }
    return ptr;
}

public static IntPtr GetModuleBaseAddress(Process proc, string modName)
{
    IntPtr addr = IntPtr.Zero;
    foreach (ProcessModule m in proc.Modules)
    {
        if (m.ModuleName == modName)
        {
            addr = m.BaseAddress;
            break;
        }
    }
    return addr;
}

public static string ReadStringUntilNULL(string EXENAME, int Address)
{
    string value = "";
    bool endOfString = false;
    int counter = 0;
    while (!endOfString)
    {
        if (ReadInt8(EXENAME, Address + counter) > (byte)0)
        {
            value += (char)ReadInt8(EXENAME, Address + counter);
        }
        else
        {
            return value;
        }
        counter++;
    }
    return value;
}
And here's the code that I'm using to invoke those functions:
Process process = null;
while (process == null)
{
    process = Process.GetProcessesByName("client_dx").FirstOrDefault();
}
var hProc = Memory.OpenProcess(0x00000010, false, process.Id);
var modBase = Memory.GetModuleBaseAddress(process, "client_dx.exe");
var addr = Memory.FindDMAAddy(hProc, (IntPtr)(modBase + 0x003393AC), new int[] { 0x30, 0x374, 0x2C, 0x0, 0x14, 0x48, 0x10 });
var acc = Memory.ReadStringUntilNULL("client_dx.exe", addr);
Debug.WriteLine(acc);
It's working perfectly until this line:
var acc = Memory.ReadStringUntilNULL("client_dx.exe", addr);
So var addr has the correct address, but var acc is not getting the expected result.
Here I'm getting this error: cannot convert from 'System.IntPtr' to 'int'
Ok, so it expects an integer where I'm giving a pointer... so I tested with ToInt32()
var acc = Memory.ReadStringUntilNULL("client_dx.exe", addr.ToInt32());
The addr.ToInt32() operation returns 262959880, and as far as I know that's not even an address.
I'm getting an empty string; the ReadStringUntilNULL function from the Memory class is only looping once.
Values are: addr 0x0fac7308 System.IntPtr and acc "" string
How can I read that string from memory? Or how can I pass the parameter correctly?
I finally wrote a class that lets me read strings until null:
public class NewMem
{
    [DllImport("kernel32.dll")]
    public static extern IntPtr OpenProcess(int dwDesiredAccess, bool bInheritHandle, int dwProcessId);

    [DllImport("kernel32.dll")]
    public static extern bool ReadProcessMemory(int hProcess, int lpBaseAddress, byte[] lpBuffer, int dwSize, ref int lpNumberOfBytesRead);

    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, [Out] byte[] lpBuffer, int dwSize, out IntPtr lpNumberOfBytesRead);

    public Process Process { get; set; }

    public static IntPtr GetModuleBaseAddress(Process proc, string modName)
    {
        IntPtr addr = IntPtr.Zero;
        foreach (ProcessModule m in proc.Modules)
        {
            if (m.ModuleName == modName)
            {
                addr = m.BaseAddress;
                break;
            }
        }
        return addr;
    }

    public static IntPtr FindDMAAddy(IntPtr hProc, IntPtr ptr, int[] offsets)
    {
        var buffer = new byte[IntPtr.Size];
        foreach (int i in offsets)
        {
            ReadProcessMemory(hProc, ptr, buffer, buffer.Length, out var read);
            ptr = (IntPtr.Size == 4)
                ? IntPtr.Add(new IntPtr(BitConverter.ToInt32(buffer, 0)), i)
                : IntPtr.Add(new IntPtr(BitConverter.ToInt64(buffer, 0)), i);
        }
        return ptr;
    }

    public string ReadStringASCII(IntPtr address)
    {
        var myString = "";
        for (int i = 1; i < 50; i++)
        {
            var bytes = ReadMemory(address, i);
            if (bytes[i - 1] == 0)
            {
                return myString;
            }
            myString = Encoding.ASCII.GetString(bytes);
        }
        return myString;
    }

    public byte[] ReadMemory(IntPtr address, int size)
    {
        var buffer = new byte[size];
        var bytesRead = 0;
        ReadProcessMemory((int)Process.Handle, (int)address, buffer, buffer.Length, ref bytesRead);
        return buffer;
    }
}
This is my code:
NewMem MClass = new NewMem();
var client = Process.GetProcessesByName("client_dx").FirstOrDefault();
MClass.Process = client;
// Get handle to process
var hProc = NewMem.OpenProcess(0x00000010, false, client.Id);
// Get base module
var modBase = NewMem.GetModuleBaseAddress(client, "client_dx.exe");
// Get relative base address
var vBasePointer = NewMem.FindDMAAddy(hProc, (IntPtr)(modBase + 0x55F870), new int[] { 0 });
// Get string
if (vBasePointer != IntPtr.Zero)
{
    var vNameAddress = vBasePointer + 0x20;
    var vName = MClass.ReadStringASCII(vNameAddress);
}
It stops reading when it finds a '0', but you can always set up some exceptions or tricks. I didn't find a cleaner way to do this, but it's working :)
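For what it's worth, a slightly cleaner variant (just a sketch, reusing the ReadMemory helper above and assuming the string is plain ASCII and shorter than some fixed cap) is to read one block and cut it at the first zero byte instead of growing the read one byte at a time:
public string ReadStringASCII(IntPtr address, int maxLength = 50)
{
    // One read of maxLength bytes; if the read fails the buffer stays zeroed
    // and we simply return an empty string.
    var bytes = ReadMemory(address, maxLength);
    int end = Array.IndexOf(bytes, (byte)0);   // position of the NULL terminator
    if (end < 0) end = bytes.Length;           // no terminator found within the cap
    return Encoding.ASCII.GetString(bytes, 0, end);
}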

Using ReadProcessMemory with process module base address and offsets

How does one read memory using a process module's base address and offsets? I have grabbed the desired module's base address with the following:
Process process = Process.GetProcessesByName("process")[0];
ProcessModule bClient;
ProcessModuleCollection bModules = process.Modules;
IntPtr processHandle = OpenProcess(0x10, false, process.Id);
int firstOffset = 0xA4C58C;
int anotherOffset = 0xFC;
for (int i = 0; i < bModules.Count; i++)
{
    bClient = bModules[i];
    if (bClient.ModuleName == "module.dll")
    {
        IntPtr baseAddress = bClient.BaseAddress;
        Console.WriteLine("Base address: " + baseAddress);
    }
}
After that I added the first offset to the base address:
IntPtr firstPointer = IntPtr.Add(baseAddress, (int)firstOffset);
This gives me a pointer; 440911244 in this case.
I can use this pointer in Cheat Engine, for instance, to browse its memory region and find the value the second offset points to. However, I can't find the proper way to add that offset to firstPointer.
My question is, do I have to use ReadProcessMemory just before adding the final anotherOffset to the pointer? If so, what is the proper way of using it in this case?
[DllImport("kernel32.dll", SetLastError = true)]
static extern bool ReadProcessMemory(
IntPtr hProcess,
IntPtr lpBaseAddress,
IntPtr lpBuffer,
int dwSize,
out IntPtr lpNumberOfBytesRead);
Change the ReadProcessMemory lpBuffer parameter to:
byte[] lpBuffer,
then:
byte[] buffer = new byte[sizeof(float)];
IntPtr bytesRead = IntPtr.Zero;
IntPtr readAddress = IntPtr.Add(baseAddress, firstOffset);
readAddress = IntPtr.Add(readAddress, anotherOffset);
ReadProcessMemory(processHandle, readAddress, buffer, buffer.Length, out bytesRead);
float value = BitConverter.ToSingle(buffer, 0);
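If the value stored at baseAddress + firstOffset is itself a pointer (the usual Cheat Engine pointer-chain situation the question describes), then you do need a ReadProcessMemory call before adding the final offset: read the pointer first, then add anotherOffset and read the value. A sketch of that variant, assuming a 32-bit target process and the byte[] overload above:
byte[] ptrBuffer = new byte[4]; // holds a 32-bit pointer
IntPtr bytesRead;
// Dereference the pointer stored at baseAddress + firstOffset
ReadProcessMemory(processHandle, IntPtr.Add(baseAddress, firstOffset), ptrBuffer, ptrBuffer.Length, out bytesRead);
IntPtr valueAddress = IntPtr.Add(new IntPtr(BitConverter.ToInt32(ptrBuffer, 0)), anotherOffset);
// Read the float at [baseAddress + firstOffset] + anotherOffset
byte[] valueBuffer = new byte[sizeof(float)];
ReadProcessMemory(processHandle, valueAddress, valueBuffer, valueBuffer.Length, out bytesRead);
float value = BitConverter.ToSingle(valueBuffer, 0);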

Read memory with module base address

How can I read memory using a module base address?
For example, how can I read this memory: "winCap64.dll"+0x123456 + offsets?
I have added example code of what I could produce after some research, but I still can't read anything in C#. The addresses themselves are fine, since they return the correct value when I enter them in Cheat Engine.
Edit: added sample code
[DllImport("kernel32.dll")]
static extern IntPtr OpenProcess(UInt32 dwDesiredAccess, Boolean bInheritHandle, UInt32 dwProcessId);
[DllImport("kernel32.dll")]
static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress,
byte[] lpBuffer, UIntPtr nSize, uint lpNumberOfBytesWritten);
static IntPtr Handle;
static void Main(string[] args)
{
Process[] Processes = Process.GetProcessesByName("process");
Process nProcess = Processes[0];
Handle = OpenProcess(0x10, false, (uint)nProcess.Id);
IntPtr pointer = IntPtr.Add(nProcess.Modules[125].BaseAddress, 0x020C5150);
int curhp = ReadOffset(pointer, 0x4D8);
int curhp2 = ReadOffset((IntPtr)curhp, 0x0);
int curhp3 = ReadOffset((IntPtr)curhp2, 0x1c0);
Console.WriteLine(curhp3.ToString());
Console.ReadKey();
}
public static int ReadOffset(IntPtr pointer, uint offset)
{
byte[] bytes = new byte[24];
uint adress = (uint)ReadPointer(pointer) + offset;
ReadProcessMemory(Handle, (IntPtr)adress, bytes, (UIntPtr)sizeof(int), 0);
return BitConverter.ToInt32(bytes, 0);
}
public static int ReadPointer(IntPtr pointer)
{
byte[] bytes = new byte[24];
ReadProcessMemory(Handle, pointer, bytes, (UIntPtr)sizeof(int), 0);
return BitConverter.ToInt32(bytes, 0);
}
How about something like this?
IntPtr pointer = IntPtr.Add(nProcess.Modules[125].BaseAddress, BaseAddress);
Console.WriteLine("Final: " + pointer.ToString("X"));
int hp = ReadInt32(pointer, Handle);
string hexPrefix = "80" + hp.ToString("X"); //because int32 will cut some digits. I sugget using int64. Even UInt64.
long hexToint = long.Parse(hexPrefix, NumberStyles.HexNumber);
hp = ReadInt32((IntPtr)hexToint + 0x00, Handle);
hexPrefix = "80" + hp.ToString("X");
hexToint = long.Parse(hexPrefix, NumberStyles.HexNumber);
hp = ReadInt32((IntPtr)hexToint + 0x1c0, Handle);
hexPrefix = "80" + hp.ToString("X");
hexToint = long.Parse(hexPrefix, NumberStyles.HexNumber);
hp = ReadInt32((IntPtr)hexToint + 0x0, Handle);
IntPtr is the architecture-agnostic way to store a pointer and pass it around, for example to ReadProcessMemory:
IntPtr pointer = IntPtr.Add(nProcess.Modules[125].BaseAddress, 0x02093458);
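Building on that, a pointer-size-aware version of the question's ReadPointer could look like this (a sketch, reusing the Handle field and the ReadProcessMemory declaration from the question):
public static IntPtr ReadPointer(IntPtr pointer)
{
    // Read IntPtr.Size bytes so the same code works for 32- and 64-bit targets.
    byte[] bytes = new byte[IntPtr.Size];
    ReadProcessMemory(Handle, pointer, bytes, (UIntPtr)bytes.Length, 0);
    return (IntPtr.Size == 4)
        ? new IntPtr(BitConverter.ToInt32(bytes, 0))
        : new IntPtr(BitConverter.ToInt64(bytes, 0));
}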

DeviceIoControlCE for I2C

I have been banging my head against a problem for days now. I would like your help.
I am trying to interface to I2C from a board running Windows CE7. The board is a Boundary Devices Nitrogen6X.
I am trying to code this in C#.
After a lot of googling and trial and error, I can now do almost everything with the I2C (by that I mean I wrapped most commands in methods that work). Of course, the one thing that I cannot do yet is reading/writing. I have been trying a few different implementations, porting C and C++ code that supposedly worked, to no avail. Currently I am putting the most effort into the two implementations I will copy here.
Neither of these implementations works for me. Both enter the error-handling branch, and both report error number 87 (ERROR_INVALID_PARAMETER).
Does anyone have experience with this kind of issue? Could someone point out what I am doing wrong?
Edit 1: I should probably mention that I am trying to "see" some signals on the SDA and SCL pins of the board's I2C3 bus by simply connecting an oscilloscope to them. There is no actual device connected on the I2C bus. I would expect this to give me some sort of error after the first byte (address + read/write bit) has been sent, because no acknowledge bit would be received. However, I get that error 87 in my code, and no change on the signals as seen from the scope (both remain high, as in idle).
(Code snippets follow)
The first one uses pointers and stuff, and is probably closer to C++ code:
[StructLayout(LayoutKind.Sequential)]
unsafe public struct UNSAFE_I2C_PACKET
{
//[MarshalAs(UnmanagedType.U1)]
public byte byAddr; //I2C slave device address
//[MarshalAs(UnmanagedType.U1)]
public byte byRw; //Read = I2C_Read or Write = I2C_Write
//[MarshalAs(UnmanagedType.LPArray)]
public byte* pbyBuf; //Message Buffer
//[MarshalAs(UnmanagedType.U2)]
public Int16 wLen; //Message Buffer Length
//[MarshalAs(UnmanagedType.LPStruct)]
public int* lpiResult; //Contain the result of last operation
}
[StructLayout(LayoutKind.Sequential)]
unsafe public struct UNSAFE_I2C_TRANSFER_BLOCK
{
//public I2C_PACKET pI2CPackets;
public UNSAFE_I2C_PACKET* pI2CPackets;
public Int32 iNumPackets;
}
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
unsafe internal static extern int DeviceIoControlCE(
IntPtr hDevice, //file handle to driver
uint dwIoControlCode, //a correct call to CTL_CODE
[In, Out]byte* lpInBuffer,
uint nInBufferSize,
byte* lpOutBuffer,
uint nOutBufferSize,
uint* lpBytesReturned,
int* lpOverlapped);
unsafe public void ReadI2C(byte* pBuf, int count)
{
int ret;
int iResult;
UNSAFE_I2C_TRANSFER_BLOCK I2CXferBlock = new UNSAFE_I2C_TRANSFER_BLOCK();
UNSAFE_I2C_PACKET i2cPckt = new UNSAFE_I2C_PACKET();
//fixed (byte* p = pBuf)
{
i2cPckt.pbyBuf = pBuf;// p;
i2cPckt.wLen = (Int16)count;
i2cPckt.byRw = I2C_READ;
i2cPckt.byAddr = BASE;
i2cPckt.lpiResult = &iResult;
I2CXferBlock.iNumPackets = 1;
//fixed (I2C_PACKET* pck = &i2cPckt)
{
I2CXferBlock.pI2CPackets = &i2cPckt; // pck;
//fixed (I2C_TRANSFER_BLOCK* tBlock = &I2CXferBlock)
{
if (DeviceIoControlCE(_i2cFile,
I2C_IOCTL_TRANSFER,
(byte*)&I2CXferBlock,//tBlock,
(uint)sizeof(UNSAFE_I2C_TRANSFER_BLOCK),//Marshal.SizeOf(I2CXferBlock),
null,
0,
null,
null) == 0)
{
int error = GetLastError();
diag("Errore nella TRANSFER");
diag(error.ToString());
}
}
}
}
}
The second option I am working on marshals stuff around between managed and unmanaged:
[StructLayout(LayoutKind.Sequential)]
public struct I2C_PACKET
//public class I2C_PACKET
{
//[MarshalAs(UnmanagedType.U1)]
public Byte byAddr; //I2C slave device address
//[MarshalAs(UnmanagedType.U1)]
public Byte byRw; //Read = I2C_Read or Write = I2C_Write
//[MarshalAs(UnmanagedType.LPArray)]
public IntPtr pbyBuf; //Message Buffer
//[MarshalAs(UnmanagedType.U2)]
public Int16 wLen; //Message Buffer Length
//[MarshalAs(UnmanagedType.LPStruct)]
public IntPtr lpiResult; //Contain the result of last operation
}
[StructLayout(LayoutKind.Sequential)]
public struct I2C_TRANSFER_BLOCK
{
//public I2C_PACKET pI2CPackets;
//[MarshalAs(UnmanagedType.LPArray)]
public IntPtr pI2CPackets;
//[MarshalAs(UnmanagedType.U4)]
public Int32 iNumPackets;
}
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern int DeviceIoControlCE(
IntPtr hDevice, //file handle to driver
uint dwIoControlCode, //a correct call to CTL_CODE
[In] IntPtr lpInBuffer,
uint nInBufferSize,
[Out] IntPtr lpOutBuffer,
uint nOutBufferSize,
out uint lpBytesReturned,
IntPtr lpOverlapped);
unsafe public void ReadI2C(byte[] buffer)
{
int count = buffer.Length;
I2C_TRANSFER_BLOCK I2CXFerBlock = new I2C_TRANSFER_BLOCK();
I2C_PACKET I2CPckt = new I2C_PACKET();
I2CPckt.byAddr = BASE;
I2CPckt.byRw = I2C_READ;
I2CPckt.wLen = (Int16)count;
I2CPckt.lpiResult = Marshal.AllocHGlobal(sizeof(int));
I2CPckt.pbyBuf = Marshal.AllocHGlobal(count);
//GCHandle packet = GCHandle.Alloc(I2CPckt, GCHandleType.Pinned);
I2CXFerBlock.iNumPackets = 1;
I2CXFerBlock.pI2CPackets = Marshal.AllocHGlobal(Marshal.SizeOf(I2CPckt)); //(Marshal.SizeOf(typeof(I2C_PACKET)));// //(sizeof(I2C_PACKET));//
Marshal.StructureToPtr(I2CPckt, I2CXFerBlock.pI2CPackets, false);
//I2CXFerBlock.pI2CPackets = packet.AddrOfPinnedObject();
//GCHandle xferBlock = GCHandle.Alloc(I2CXFerBlock, GCHandleType.Pinned);
IntPtr xfer = Marshal.AllocHGlobal(Marshal.SizeOf(I2CXFerBlock)); //(sizeof(I2C_TRANSFER_BLOCK)); //
Marshal.StructureToPtr(I2CXFerBlock, xfer, false);
//IntPtr xfer = xferBlock.AddrOfPinnedObject();
uint size = (uint)sizeof(I2C_TRANSFER_BLOCK);//Marshal.SizeOf(I2CXFerBlock);
uint getCnt = 0;
if ((DeviceIoControlCE(_i2cFile,
I2C_IOCTL_TRANSFER,
xfer,
size,
xfer, //IntPtr.Zero,
size, //0,
out getCnt,
IntPtr.Zero)) == 0)
{
int error = GetLastError();
diag("Errore nella TRANSFER.");
diag(error.ToString());
}
else
{
//success
I2CXFerBlock = (I2C_TRANSFER_BLOCK)Marshal.PtrToStructure(xfer, typeof(I2C_TRANSFER_BLOCK));
I2CPckt = (I2C_PACKET)Marshal.PtrToStructure(I2CXFerBlock.pI2CPackets, typeof(I2C_PACKET));
Marshal.Copy(I2CPckt.pbyBuf, buffer, 0, count);
diag("Success in TRANSFER: " + buffer[0].ToString());
}
Marshal.FreeHGlobal(I2CPckt.pbyBuf);
Marshal.FreeHGlobal(I2CXFerBlock.pI2CPackets);
Marshal.FreeHGlobal(xfer);
//packet.Free();
//xferBlock.Free();
}
Edit 2: The (supposedly) working code I have (which I am unable to run) comes from drivers I have been given, which might be partially proprietary (hence I cannot share it). However, I found online a header for an I2C bus that contains the following definition:
#define I2C_MACRO_TRANSFER(hDev, pI2CTransferBlock) \
(DeviceIoControl(hDev, I2C_IOCTL_TRANSFER, (PBYTE) pI2CTransferBlock, sizeof(I2C_TRANSFER_BLOCK), NULL, 0, NULL, NULL))
I initially tried passing NULL for those parameters, as is done here, but I still got the same error code.
Edit 3: From the same driver, the struct definitions:
// I2C Packet
typedef struct
{
BYTE byAddr; // I2C slave device address for this I2C operation
BYTE byRW; // Read = I2C_READ or Write = I2C_WRITE
PBYTE pbyBuf; // Message Buffer
WORD wLen; // Message Buffer Length
LPINT lpiResult; // Contains the result of last operation
} I2C_PACKET, *PI2C_PACKET;
// I2C Transfer Block
typedef struct
{
I2C_PACKET *pI2CPackets;
INT32 iNumPackets;
} I2C_TRANSFER_BLOCK, *PI2C_TRANSFER_BLOCK;
Edit 4: I tried implementing a version passing a ref to my structures, as #ctacke suggested in his comment. I still get the same error, so I guess I must have done something different from what he had in mind. Here is the snippet:
[StructLayout(LayoutKind.Sequential)]
public struct REF_I2C_PACKET //public class REF_I2C_PACKET //
{
//[MarshalAs(UnmanagedType.U1)]
public Byte byAddr; //I2C slave device address
//[MarshalAs(UnmanagedType.U1)]
public Byte byRw; //Read = I2C_Read or Write = I2C_Write
//[MarshalAs(UnmanagedType.LPArray)]
public IntPtr pbyBuf; //Message Buffer
//[MarshalAs(UnmanagedType.U2)]
public Int16 wLen; //Message Buffer Length
//[MarshalAs(UnmanagedType.LPStruct)]
public IntPtr lpiResult; //Contain the result of last operation
}
[StructLayout(LayoutKind.Sequential)]
public struct REF_I2C_TRANSFER_BLOCK //public class REF_I2C_TRANSFER_BLOCK //
{
//public I2C_PACKET pI2CPackets;
public IntPtr pI2CPackets;
//[MarshalAs(UnmanagedType.U4)]
public Int32 iNumPackets;
//[MarshalAs(UnmanagedType.LPArray)]
//[MarshalAs(UnmanagedType.ByValArray, ArraySubType = UnmanagedType.LPStruct, SizeConst = 2)]
//public REF_I2C_PACKET[] pI2CPackets;
}
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
unsafe internal static extern int DeviceIoControlCE(
IntPtr hDevice, //file handle to driver
uint dwIoControlCode, //a correct call to CTL_CODE
//[MarshalAs(UnmanagedType.AsAny)]
//[In] object lpInBuffer, //
ref REF_I2C_TRANSFER_BLOCK lpInBuffer,
uint nInBufferSize,
byte* lpOutBuffer, //ref REF_I2C_TRANSFER_BLOCK lpOutBuffer,
uint nOutBufferSize,
out uint lpBytesReturned, //uint* lpBytesReturned,
int* lpOverlapped);
unsafe public void RefReadI2C(byte[] buffer)
{
int count = buffer.Length;
REF_I2C_TRANSFER_BLOCK I2CXFerBlock = new REF_I2C_TRANSFER_BLOCK();
REF_I2C_PACKET[] I2CPckt = new REF_I2C_PACKET[2];
I2CPckt[0] = new REF_I2C_PACKET();
I2CPckt[1] = new REF_I2C_PACKET();
I2CPckt[0].byAddr = BASE;
I2CPckt[0].byRw = I2C_READ;
I2CPckt[0].wLen = (Int16)count;
I2CPckt[0].lpiResult = Marshal.AllocHGlobal(sizeof(int));
I2CPckt[0].pbyBuf = Marshal.AllocHGlobal(count);
Marshal.Copy(buffer, 0, I2CPckt[0].pbyBuf, count);
I2CPckt[1].byAddr = BASE;
I2CPckt[1].byRw = I2C_READ;
I2CPckt[1].wLen = (Int16)count;
I2CPckt[1].lpiResult = Marshal.AllocHGlobal(sizeof(int));
I2CPckt[1].pbyBuf = Marshal.AllocHGlobal(count);
Marshal.Copy(buffer, 0, I2CPckt[0].pbyBuf, count);
I2CXFerBlock.iNumPackets = 2;
I2CXFerBlock.pI2CPackets = Marshal.AllocHGlobal(Marshal.SizeOf(I2CPckt[0])*I2CPckt.Length);
uint size = (uint)Marshal.SizeOf(I2CXFerBlock); //sizeof(REF_I2C_TRANSFER_BLOCK);//Marshal.SizeOf(I2CXFerBlock);
//size += (uint)(Marshal.SizeOf(I2CPckt[0]) * I2CPckt.Length);
uint getCnt = 0;
if ((DeviceIoControlCE(_i2cFile,
I2C_IOCTL_TRANSFER,
ref I2CXFerBlock,
size,
null, //IntPtr.Zero,
0, //0,
out getCnt,
null)) == 0)
{
int error = GetLastError();
diag("Errore nella TRANSFER.");
diag(error.ToString());
}
else
{
//success
I2CPckt = (REF_I2C_PACKET[])Marshal.PtrToStructure(I2CXFerBlock.pI2CPackets, typeof(REF_I2C_PACKET[]));
Marshal.Copy(I2CPckt[0].pbyBuf, buffer, 0, count);
diag("Success in TRANSFER: " + buffer[0].ToString());
}
Marshal.FreeHGlobal(I2CPckt[0].pbyBuf);
Marshal.FreeHGlobal(I2CPckt[0].lpiResult);
}
Edit 5:
I found the following code online (http://em-works.googlecode.com/svn/trunk/WINCE600/PLATFORM/COMMON/SRC/SOC/COMMON_FSL_V2_PDK1_9/I2C/PDK/i2c_io.cpp):
BOOL I2C_IOControl(DWORD hOpenContext, DWORD dwCode, PBYTE pBufIn,
DWORD dwLenIn, PBYTE pBufOut, DWORD dwLenOut,
PDWORD pdwActualOut)
{
/*stuff*/
case I2C_IOCTL_TRANSFER:
{
#define MARSHAL 1
#if MARSHAL
DuplicatedBuffer_t Marshalled_pInBuf(pBufIn, dwLenIn, ARG_I_PTR);
pBufIn = reinterpret_cast<PBYTE>( Marshalled_pInBuf.ptr() );
if( (dwLenIn > 0) && (NULL == pBufIn) )
{
return FALSE;
}
#endif
I2C_TRANSFER_BLOCK *pXferBlock = (I2C_TRANSFER_BLOCK *) pBufIn;
if (pXferBlock->iNumPackets<=0)
{
return FALSE;
}
#if MARSHAL
MarshalledBuffer_t Marshalled_pPackets(pXferBlock->pI2CPackets,
pXferBlock->iNumPackets*sizeof(I2C_PACKET),
ARG_I_PTR);
I2C_PACKET *pPackets = reinterpret_cast<I2C_PACKET *>(Marshalled_pPackets.ptr());
if( (NULL == pPackets) )
{
return FALSE;
}
#else
I2C_PACKET *pPackets = pXferBlock->pI2CPackets;
#endif
#if MARSHAL
struct Marshalled_I2C_PACKET
{
MarshalledBuffer_t *pbyBuf;
MarshalledBuffer_t *lpiResult;
} *Marshalled_of_pPackets;
Marshalled_of_pPackets = new Marshalled_I2C_PACKET[pXferBlock->iNumPackets];
if (Marshalled_of_pPackets==0)
{
return FALSE;
}
MarshalledBuffer_t *pMarshalled_ptr;
int i;
// Map pointers for each packet in the array
for (i = 0; i < pXferBlock->iNumPackets; i++)
{
switch( pPackets[i].byRW & I2C_METHOD_MASK )
{
case I2C_RW_WRITE:
pMarshalled_ptr = new MarshalledBuffer_t(
pPackets[i].pbyBuf,
pPackets[i].wLen,
ARG_I_PTR,
FALSE, FALSE);
if (pMarshalled_ptr ==0)
{
bRet = FALSE;
goto cleanupPass1;
}
if (pMarshalled_ptr->ptr()==0)
{
bRet = FALSE;
delete pMarshalled_ptr;
goto cleanupPass1;
}
break;
case I2C_RW_READ:
pMarshalled_ptr = new MarshalledBuffer_t(
pPackets[i].pbyBuf,
pPackets[i].wLen,
ARG_O_PTR, FALSE, FALSE);
if (pMarshalled_ptr ==0)
{
bRet = FALSE;
goto cleanupPass1;
}
if (pMarshalled_ptr->ptr()==0)
{
bRet = FALSE;
delete pMarshalled_ptr;
goto cleanupPass1;
}
break;
default:
{
bRet = FALSE;
goto cleanupPass1;
}
}
pPackets[i].pbyBuf = reinterpret_cast<PBYTE>(pMarshalled_ptr->ptr());
Marshalled_of_pPackets[i].pbyBuf = pMarshalled_ptr;
}
for (i = 0; i < pXferBlock->iNumPackets; i++)
{
pMarshalled_ptr = new MarshalledBuffer_t(
pPackets[i].lpiResult, sizeof(INT),
ARG_O_PDW, FALSE, FALSE);
if (pMarshalled_ptr ==0)
{
bRet = FALSE;
goto cleanupPass2;
}
if (pMarshalled_ptr->ptr()==0)
{
bRet = FALSE;
delete pMarshalled_ptr;
goto cleanupPass2;
}
pPackets[i].lpiResult = reinterpret_cast<LPINT>(pMarshalled_ptr->ptr());
Marshalled_of_pPackets[i].lpiResult = pMarshalled_ptr;
}
#endif
bRet = pI2C->ProcessPackets(pPackets, pXferBlock->iNumPackets);
#if MARSHAL
DEBUGMSG (ZONE_IOCTL|ZONE_FUNCTION, (TEXT("I2C_IOControl:I2C_IOCTL_TRANSFER -\r\n")));
i = pXferBlock->iNumPackets;
cleanupPass2:
for (--i; i>=0; --i)
{
delete Marshalled_of_pPackets[i].lpiResult;
}
i = pXferBlock->iNumPackets;
cleanupPass1:
for (--i; i>=0; --i)
{
delete Marshalled_of_pPackets[i].pbyBuf;
}
delete[] Marshalled_of_pPackets;
#endif
break;
}
/*stuff*/
}
I cannot claim to understand 100% of it, but from the Windows naming conventions (http://msdn.microsoft.com/en-us/library/windows/desktop/aa378932(v=vs.85).aspx) it would appear that the size parameter I should send is the total number of bytes of my transfer, including everything. I have tried to figure that number out myself, but so far I have not been able to. Alternatively, I guess it would be possible to flatten my structures into a byte array, but I suppose the bytes would need to be in a specific order for the system to understand them.
Can anyone pitch in on that?
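(Purely to make that reading concrete: if "the total number of bytes of the transfer, including everything" really is what the driver expects — which is not verified and may well be wrong — it could be computed from the managed structs roughly like this, where I2C_TRANSFER_BLOCK and I2C_PACKET are the marshalled structs from the second snippet above.)
static uint TotalTransferSize(int numPackets, int[] bufferLengths)
{
    // The transfer block itself plus the packet array...
    uint total = (uint)Marshal.SizeOf(typeof(I2C_TRANSFER_BLOCK))
               + (uint)(numPackets * Marshal.SizeOf(typeof(I2C_PACKET)));
    // ...plus each message buffer and its result int.
    foreach (int len in bufferLengths)
        total += (uint)(len + sizeof(int));
    return total;
}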

Hook recv and unreadable buffer

I have an application that shows a WebBrowser component, which contains a Flash application that creates an XMLSocket connection to a server.
I'm now trying to hook recv (luckily a LocalHook) for logging purposes, but when I try to read the socket content I get only strange characters; if I set the hook with SpyStudio instead, I get readable strings.
Here is the code I use:
I set the hook with:
CreateRecvHook = LocalHook.Create(
    LocalHook.GetProcAddress("ws2_32.dll", "recv"),
    new Drecv(recv_Hooked),
    this);
CreateRecvHook.ThreadACL.SetExclusiveACL(new Int32[] { 0 });
I set up everything I need with:
[DllImport("ws2_32.dll")]
static extern int recv(
    IntPtr socketHandle,
    IntPtr buf,
    int count,
    int socketFlags
);

[UnmanagedFunctionPointer(CallingConvention.StdCall,
    CharSet = CharSet.Unicode,
    SetLastError = true)]
delegate int Drecv(
    IntPtr socketHandle,
    IntPtr buf,
    int count,
    int socketFlags
);
static int recv_Hooked(
    IntPtr socketHandle,
    IntPtr buf,
    int count,
    int socketFlags)
{
    byte[] test = new byte[count];
    Marshal.Copy(buf, test, 0, count);
    IntPtr ptr = IntPtr.Zero;
    ptr = Marshal.AllocHGlobal(count);
    Marshal.Copy(test, 0, ptr, count);
    string s = System.Text.UnicodeEncoding.Unicode.GetString(test);
    Debug.WriteLine(s);
    System.IO.StreamWriter file = new System.IO.StreamWriter("log.txt");
    file.WriteLine(s);
    file.Close();
    return recv(socketHandle, buf, count, socketFlags);
}
I've already tried using different encodings without success. As a side note, the WebBrowser doesn't seem to have any problem.
You're saving the contents of the uninitialized buffer, so no wonder it's garbage.
There is nothing in that buffer until after recv (the real one) fills it in. You also can't know how many bytes are actually valid except by inspecting the return code from the real recv.
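In other words, the hook needs to call the real recv first and only then copy out the number of bytes it reports. A sketch of what that might look like, reusing the question's recv P/Invoke (the ASCII decoding is an assumption — XMLSocket traffic is typically plain text rather than UTF-16, so adjust the encoding to whatever your server actually sends):
static int recv_Hooked(
    IntPtr socketHandle,
    IntPtr buf,
    int count,
    int socketFlags)
{
    // Let the real recv fill the buffer first.
    int received = recv(socketHandle, buf, count, socketFlags);
    if (received > 0)
    {
        // Only 'received' bytes are valid, not 'count'.
        byte[] data = new byte[received];
        Marshal.Copy(buf, data, 0, received);
        string s = System.Text.Encoding.ASCII.GetString(data);
        Debug.WriteLine(s);
        System.IO.File.AppendAllText("log.txt", s);
    }
    return received;
}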
