PInvoke byte array to char not behaving properly in 64 bit [duplicate] - c#

Follow-up to PInvoke byte array to char not behaving properly in 64 bit. (Stale question and my suspicions were wrong, thus the title and description were unfitting.)
I am using P/Invoke to call C++ code from C#. I have both the C# and C++ projects set to build as x64 in the VS build configurations. When I run the program, the parameters of the P/Invoke call are shifted by 32 bits, as follows:
C# : |Parameter 1|Parameter 2|Parameter 3|Parameter 4|
                 |           |           |           |
                 V           V           V           V
C++:             |Parameter 1|Parameter 2|Parameter 3|Parameter 4|
So if I pass 1,2,3,4 from the C# side, the C++ side receives 2,3,4,garbage.
I have worked around this by passing an extra int in front of the C# parameters without changing the C++ side. This offsets the parameters by 32 bits, realigns them, and the program works perfectly.
Does anyone know what is causing this strange offset and the proper way to correct it?
Here is a simplified example showing my method signatures
C# side:
[DllImport(@"C:\FullPath\CppCode.dll", EntryPoint = "MethodName",
    CallingConvention = CallingConvention.Cdecl)]
private static extern bool MethodName(parameters);
C++ side:
extern "C" __declspec(dllexport)
bool CppClass::MethodName(parameters)
I suspect that, since the parameters are off by 32 bits, something isn't really being handled as 64-bit properly. Perhaps when calling a method there is an implicit pointer that is passed before the arguments? If that is only a 32-bit pointer on the C# side while the C++ side expects a 64-bit pointer, that could cause this offset situation, but I'm not sure.

MethodName should not be a class instance method, as the first bytes (4 in x86 world, 8 in x64 world) will be used for the class instance pointer in the unmanaged world.
So, it should be a static method (or a C style method).
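As a minimal sketch of how the aligned pair could look once the export is a plain C-style function (the two int parameters here are placeholders, since the real parameter list isn't shown):
// Matching C++ export (sketch): a free function, not a member of CppClass:
//
//   extern "C" __declspec(dllexport)
//   bool MethodName(int a, int b) { /* ... */ return true; }

[DllImport(@"C:\FullPath\CppCode.dll", EntryPoint = "MethodName",
    CallingConvention = CallingConvention.Cdecl)]
[return: MarshalAs(UnmanagedType.I1)]   // a C++ bool is a single byte
private static extern bool MethodName(int a, int b);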

Related

Getting Memory Exception Error when calling C++ method from C#

I have a C++ DLL with the method signature below. I would like to know what the compatible C# signature should be to call the C++ DLL. On execution I am getting the error "Attempted to read or write protected memory". The DLL used here is a third-party DLL. I am able to call simple methods without pointers from the C++ DLL.
C++ Declaration:
int __stdcall Deduct(int V, unsigned char *AI);
C# Declaration
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.Cdecl)]
public static extern long Deduct(long i, ref string AI);
As per the documentation of the third-party DLL:
AI: Used as input and output buffer. The buffer shall have at least 80 bytes.
Input: Additional Information for the transaction. It is a byte pointer containing 7 bytes.
e.g. Assume the usage information is not used,
if receipt number = 1234567,
hex value = 0x12D687,
7 bytes AI = D6 87 00 00 00 D6 87
Output: On return, the AI contains the UD.
Please help.
I think the C# signature looks wrong.
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.Cdecl)]
public static extern long Deduct(int i, ref char[] AI);
If it is expecting a char[80] as its input, you may need to declare it as such and null-terminate it yourself, since whatever reads from it might otherwise read past the end of the array.
usage
char[] tmp = new char[80];
tmp[0] = (char)0x00;
int i = 0;
Deduct(i, ref tmp);
I'm not sure if this will work but hopefully it helps.
Besides the string marshaling, I found another likely reason for the exception.
You have specified the return type and the first parameter as int in C, but as long in C#. This can cause problems.
The C language only specifies minimum widths for the integer types, which means they differ between environments:
int in C : signed integer, at least 16 bits
long in C : signed integer, at least 32 bits
int in C# : 32-bit signed integer, always
long in C# : 64-bit signed integer, always
Most major 32/64-bit OSes and compilers use 32 bits for int in C, while long is always 64 bits in C#, so the signatures mismatch. I would suggest revising:
public static extern int Deduct(int i, ref string AI); // Still not sure about string
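A sketch of how that revised signature might be fleshed out, assuming the decorated name _Deduct@8 really is the __stdcall export from the C++ declaration and that AI is the documented 80-byte in/out buffer (so a byte array rather than a string):
// Sketch only: match __stdcall and pass AI as a raw byte buffer.
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct@8",
    CallingConvention = CallingConvention.StdCall)]
public static extern int Deduct(int V, byte[] AI);

// usage
byte[] ai = new byte[80];   // documentation asks for at least 80 bytes
// ...fill the first 7 bytes with the Additional Information here...
int result = Deduct(0, ai); // the native side writes its output back into ai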

Is std::byte in C++ 17 equivalent to byte in C#?

I just noticed std::byte in C++17.
I am asking this question because I use the code below to send a byte array to C++ to play an audio sound.
C#:
[DllImport ("AudioStreamer")]
public static extern void playSound (byte[] audioBytes);
C++:
#define EXPORT_API __declspec(dllexport)
extern "C" void EXPORT_API playSound(unsigned char* audioBytes)
With the new byte type in C++ 17, it looks like I might be able to do this now:
C#:
[DllImport ("AudioStreamer")]
public static extern void playSound (byte[] audioBytes);
C++:
#define EXPORT_API __declspec(dllexport)
extern "C" void EXPORT_API playSound(byte[] audioBytes)
I am not sure if this will even work because the compiler I use does not support byte in C++ 17 yet.
So, is std::byte in C++ 17 equivalent to byte in C#? Is there a reason to not use std::byte over unsigned char* ?
According to C++ reference,
Like the character types (char, unsigned char, signed char) std::byte can be used to access raw memory occupied by other objects.
This tells me that you can freely replace
unsigned char audioBytes[]
with
std::byte audioBytes[]
in a function header, and everything is going to work, provided that you plan to treat bytes as bytes, not as numeric objects.
std::byte is equivalent to both unsigned char and char in C++ in the sense that it is a type that represents one byte of raw memory.
If you used unsigned char* in your interface, you can easily replace it with std::byte.
In your C# code this will result in no changes at all; on the C++ side it will make your type system stricter (which is a good thing), because you will no longer be able to treat your std::bytes as text characters or as small integers.
Of course, this is a C++17 feature, which may or may not be properly supported by your compiler.
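If the native compiler does support it, a sketch of the pair could look like this; the managed declaration is untouched and only the native parameter type changes:
// C# side is unchanged: byte[] still marshals as a pointer to raw bytes.
//
// C++ side (sketch):
//   #include <cstddef>
//   extern "C" void EXPORT_API playSound(std::byte* audioBytes) { /* ... */ }
[DllImport("AudioStreamer")]
public static extern void playSound(byte[] audioBytes);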

What is the equivalent of unsigned long int in c#?

I have a C struct that is defined as follows:
typedef struct {
    unsigned long int a;
} TEST;
I want to create a C# equivalent of this struct.
Any help? What confuses me is the statement that "unsigned long int" is "at least 32-bit". What does that mean? It's either 32-bit, 64-bit, or 16-bit, right?
You want a uint or a ulong, depending on what an unsigned long int is on your native C platform:
C# uint is 32 bits
C# ulong is 64 bits
The "at least" wording and the platform dependency are a necessary concern in C because C is compiled straight to machine code and was developed for many architectures with varying word sizes. C#, by contrast, is defined against a virtual machine (much like Java or JavaScript) and can therefore abstract the hardware's word size in favor of sizes defined by the language's standard VM (the CLR in C#'s case). Differences between the VM's and the hardware's word sizes are taken care of by the VM and hidden from the hosted code.
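A sketch of the managed struct, assuming the native library was built where unsigned long int is 32 bits (Windows, or 32-bit Linux):
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct TEST
{
    public uint a;   // unsigned long int, 32 bits on the assumed platform
                     // (use ulong instead if the native build is LP64, e.g. 64-bit Linux)
}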

How to pass an unsigned long to a Linux shared library using P/Invoke

I am using C# in Mono and I'm trying to use P/Invoke to call a Linux shared library.
The C# call is defined as:
[DllImport("libaiousb")]
extern static ulong AIOUSB_Init();
The Linux function is defined as follows:
unsigned long AIOUSB_Init() {
    return(0);
}
The compile command for the Linux code is:
gcc -ggdb -std=gnu99 -D_GNU_SOURCE -c -Wall -pthread -fPIC
-I/usr/include/libusb-1.0 AIOUSB_Core.c -o AIOUSB_Core.dbg.o
I can call the function fine, but the return result is bonkers. It should be 0 but I'm getting some huge mangled number.
I've put printf's in the Linux code just before the function value is returned, and the value there is correct.
One thing I have noticed that is a little weird: the printf should occur before the function returns, yet I see the function return to C#, then C# prints the return result, and only then is the printf output displayed.
From here:
An unsigned long can hold all the values between 0 and ULONG_MAX inclusive. ULONG_MAX must be at least 4294967295. The long types must contain at least 32 bits to hold the required range of values.
For this reason a C unsigned long is usually translated to a .NET UInt32:
[DllImport("libaiousb")]
extern static uint AIOUSB_Init();
You're probably running that on a system where a C unsigned long is 32 bits. A C# ulong is 64 bits. If you want to make sure the return value is a 64-bit unsigned integer, include stdint.h and return a uint64_t from AIOUSB_Init().
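A sketch of that second option, assuming you can change the native code as well as the managed declaration:
// Native side (sketch): fix the width explicitly so it matches on every platform.
//   #include <stdint.h>
//   uint64_t AIOUSB_Init(void) { return 0; }
//
// Managed side then keeps the 64-bit type:
[DllImport("libaiousb")]
extern static ulong AIOUSB_Init();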

Using P/Invoke correctly

I need to call an external dll from c#. This is the header definition:
enum WatchMode {
    WATCH_MODE_SYSTEM = 0,
    WATCH_MODE_APPLICATION = 1
};
LONG ADS_API WDT_GetMode ( LONG i_hHandle, WatchMode * o_pWatchMode );
I've added the enum and the call in C#:
public enum WatchMode
{
    WATCH_MODE_SYSTEM = 0,
    WATCH_MODE_APPLICATION = 1
}
[DllImport("AdsWatchdog.dll")]
internal static extern long WDT_GetMode(long hHandle, ref WatchMode watchmode);
This generates an AccessViolationException. I know the DLL is 'working' because I've also added a call to GetHandle, which returns the hHandle mentioned above. I've tried changing the param to an int (ref int watchmode) but get the same error. Does anyone know how I can P/Invoke the above call?
You're running into a parameter size difference between C# and C++. In the C++/Windows world, LONG is a 4-byte signed integer. In the C# world, long is an 8-byte signed integer. You should change your C# signature to take an int.
ffpf is wrong in saying that you should use an IntPtr here. It will fix this particular problem on a 32-bit machine, since an IntPtr will marshal as an int. If you run this on a 64-bit machine it will marshal as an 8-byte signed integer again and will crash.
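A sketch of the corrected declaration; only the integer widths change, and the enum from the question can stay as it is, since it is typically backed by a 32-bit int on both sides:
// LONG in the Windows headers is 32 bits on both x86 and x64, so map it to int.
[DllImport("AdsWatchdog.dll")]
internal static extern int WDT_GetMode(int hHandle, ref WatchMode watchmode);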
The Managed, Native, and COM Interop Team released the PInvoke Interop Assistant on codeplex. Maybe it can create the proper signature.
http://www.codeplex.com/clrinterop/Release/ProjectReleases.aspx?ReleaseId=14120
