I have a C++ DLL with the method signature below. I would like to know what the compatible C# method should be to call the C++ DLL. On execution I get the error "attempted to read or write protected memory". The DLL used here is a third-party DLL. I am able to call simple methods without pointers from the C++ DLL.
C++ Declaration:
int __stdcall Deduct(int V, unsigned char *AI);
C# Declaration:
[System.Runtime.InteropServices.DllImport("rwl.dll",EntryPoint = "_Deduct#8", CallingConvention = CallingConvention.Cdecl)]
public static extern long Deduct(long i,ref string AI);
As per the documentation of the third-party DLL:
AI: Used as input and output buffer. The buffer shall have at least 80 bytes.
Input: Additional Information for the transaction. It is a byte pointer containing 7 bytes.
e.g. Assume the usage information is not used,
if receipt number = 1234567,
hex value = 0x12D687,
7 bytes AI = D6 87 00 00 00 D6 87
Output: On return, the AI contains the UD.
Please help.
I think the C# signature looks wrong.
[System.Runtime.InteropServices.DllImport("rwl.dll",EntryPoint = "_Deduct#8", CallingConvention = CallingConvention.Cdecl)]
public static extern long Deduct(int i,ref char[] AI);
If it is expecting a char[80] as its input, you may need to declare it as such and null-terminate it yourself, as whatever is reading from it might be reading past the end of the array.
usage
char[] tmp = new char[80];
tmp[0] = (char)0x00;
int i = 0;
Deduct(i, ref tmp);
I'm not sure if this will work but hopefully it helps.
Besides string marshaling, I found a likely reason for getting an exception.
You have specified the return type and the first parameter as int in C, but long in C#. This could cause problems.
The C language only specifies the minimum width of each integer type, which means the actual sizes differ between environments.
int in C : signed integer, at least 16 bits
long in C : signed integer, at least 32 bits
int in C# : 32-bit signed integer, always
long in C# : 64-bit signed integer, always
Most major 32/64-bit OSes and compilers use 32 bits for int in C. However, long is always 64 bits in C#, so the signatures mismatch. I would suggest revising to:
public static extern int Deduct(int i, ref string AI); // Still not sure about string
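A minimal sketch of how the whole import could look once the buffer is marshaled as raw bytes and the calling convention matches the __stdcall declaration (the entry-point string is copied from the question; with __stdcall decoration the export is usually named _Deduct@8, so verify it against the DLL's export table):
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct#8", CallingConvention = CallingConvention.StdCall)]
public static extern int Deduct(int v, byte[] ai);
usage
byte[] ai = new byte[80];     // documentation: at least 80 bytes
ai[0] = 0xD6; ai[1] = 0x87;   // the 7 input bytes from the example above
ai[5] = 0xD6; ai[6] = 0x87;   // (bytes 2-4 stay 0x00)
int result = Deduct(10, ai);  // 10 is only a placeholder for V
// on return, ai holds the output the documentation calls the UD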
I just noticed std::byte in C++17.
I am asking this question because I use the code below to send byte array to C++ to play audio sound.
C#:
[DllImport ("AudioStreamer")]
public static extern void playSound (byte[] audioBytes);
C++:
#define EXPORT_API __declspec(dllexport)
extern "C" void EXPORT_API playSound(unsigned char* audioBytes)
With the new byte type in C++ 17, it looks like I might be able to do this now:
C#:
[DllImport ("AudioStreamer")]
public static extern void playSound (byte[] audioBytes);
C++:
#define EXPORT_API __declspec(dllexport)
extern "C" void EXPORT_API playSound(byte[] audioBytes)
I am not sure if this will even work because the compiler I use does not support std::byte in C++17 yet.
So, is std::byte in C++ 17 equivalent to byte in C#? Is there a reason to not use std::byte over unsigned char* ?
According to C++ reference,
Like the character types (char, unsigned char, signed char) std::byte can be used to access raw memory occupied by other objects.
This tells me that you can freely replace
unsigned char audioBytes[]
with
std::byte audioBytes[]
in a function header, and everything is going to work, provided that you plan to treat bytes as bytes, not as numeric objects.
std::byte is equivalent to both unsigned char and char in C++ in the sense that it is a type representing one byte of raw memory.
If you used unsigned char* in your interface, you can easily replace it with std::byte*.
In your C# code this will result in no changes at all; on the C++ side it will make your type system stricter (which is a good thing), because you will no longer be able to treat your std::bytes as text characters or as small integers.
Of course, this is a C++17 feature, which may or may not be properly supported by your compiler.
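To make that concrete, here is a minimal sketch (export name taken from the question): whichever of the two native declarations shown in the comments is used, the managed declaration stays the same, because both native parameter types describe one byte of raw memory.
// extern "C" void EXPORT_API playSound(unsigned char* audioBytes);  // pre-C++17
// extern "C" void EXPORT_API playSound(std::byte* audioBytes);      // C++17
[DllImport("AudioStreamer")]
public static extern void playSound(byte[] audioBytes);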
I am developing a wrapper library that allows my project to use an x86 C++ DLL in an AnyCPU environment. I have no control over the DLL, so I am using DllImport in C#.
There is a provided function, declared in C++ as: int __stdcall Func(int V, unsigned char *A)
and provided a sample declaration in VB: Private Declare Function Func Lib "lib.dll" Alias "_Func#8" (ByVal V As Long, A As Any) As Long
This function will request a device to Add/Deduct a value to/from a card by passing Convert.ToInt64(decimalValue) as V, and some customized information in A.
Here is the description of A:
It is a byte pointer containing 7 bytes.
The first 5 bytes are used to store info that will be passed to the card log (the last 4 digits of the receipt number should be included in the first 2 bytes; the other 3 could be A3 A4 A5)
The last 2 bytes are used to store info that will be passed to the device (the last 4 digits of the receipt number)
On return, the A contains a 32 bytes data.
After hours and hours of research and attempts, I cannot get any result other than an 'Access Violation Exception'. Please see the following draft code:
[DllImport("lib.dll", EntryPoint="_Func#8")]
public static extern Int64 Func(Int64 V, StringBuilder sb);
string ReceiptNum = "ABC1234";
decimal Amount = 10m;
byte[] A = new byte[32];
A[0] = Convert.ToByte(ReceiptNum.Substring(3, 2));
A[1] = Convert.ToByte(ReceiptNum.Substring(5));
A[2] = Convert.ToByte("A3");
A[3] = Convert.ToByte("A4");
A[4] = Convert.ToByte("A5");
A[5] = Convert.ToByte(ReceiptNum.Substring(3, 2));
A[6] = Convert.ToByte(ReceiptNum.Substring(5));
StringBuilder sb = new StringBuilder(
new ASCIIEncoding().GetString(A), A.Length
);
Int64 Result = Func(Convert.ToInt64(Amount), sb);
And at this point it throws the exception. I have tried passing IntPtr, byte*, byte (via A[0]), by value, by reference, and none of them works. (I have tried deploying as x86 CPU as well.)
Would appreciate any help! Thanks for your time!
PS - The reason for using StringBuilder is that the library contains a function that accepts a "char *Data" parameter and causes the same exception; the solution there was to use a StringBuilder passed as a pointer. That function's VB declaration is: Private Declare Function Func1 Lib "lib.dll" Alias "_Func1#12" (ByVal c As Byte, ByVal o As Byte, ByVal Data As String) As Long
Your external definition is wrong.
StringBuilder is a complex structure containing an array of C# chars.
C# chars are UTF-16 (two bytes each, with complex rules for decoding multi-char Unicode characters). Probably not what you are seeking.
If your data is a raw byte buffer, you should go for byte[].
Int64 is also C# long.
Well, your native method signature takes int, and you're trying to pass a long long. That's not going to work, rather obviously. The same is true of the return value. Don't assume that VB maps cleanly to VB.NET, much less C# - Long means a 32-bit integer in classic VB, but not in .NET. Native code is a very complex environment, and you'd better know what you're doing when trying to interface with it.
StringBuilder should only be used for character data. That's not your case, and you should use byte[] instead. Despite all the conversions you're doing, you end up trying to pass invalid Unicode data instead of raw bytes. The confusion probably comes from the fact that C doesn't distinguish between byte[] and string - both are usually represented as char*.
Additionally, I don't see how you'd expect this wrapper to work in an AnyCPU environment. If the native DLL is 32-bit, you can only use it from a 32-bit process. AnyCPU isn't magic; it just defers the bitness decision to run time rather than compile time.
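Putting both answers together, a minimal sketch of what the import and the call site could look like (the entry-point string is copied from the VB declaration, the byte values are placeholders, and the process must run as 32-bit to load an x86 DLL):
[DllImport("lib.dll", EntryPoint = "_Func#8")]
public static extern int Func(int v, byte[] a);
byte[] a = new byte[32];               // large enough for the 32 bytes written back
a[0] = 0x12; a[1] = 0x34;              // last 4 digits of the receipt number (placeholder)
a[2] = 0xA3; a[3] = 0xA4; a[4] = 0xA5; // card-log info
a[5] = 0x12; a[6] = 0x34;              // receipt digits for the device
int result = Func(10, a);              // on return, a holds the 32-byte output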
Is this list correct?
unsigned int(c) -> uint(c#)
const char*(c) -> String(c#)
unsigned int*(c) -> uint[](c#)
unsigned char*(c) -> byte[](c#)
I think there's a mistake here, because with these 4 parameters for the native function I get a PInvokeStackImbalance.
C function is:
bool something
(unsigned char *a,
unsigned int a_length,
unsigned char *b,
unsigned int *b_length);
PInvoke is:
[DllImport(#"lib.dll", EntryPoint = "something")]<br>
public static extern bool something(
byte[] a,
uint a_length,
byte[] b,
uint[] b_length);
First, PInvoke.net is your friend.
Second, your conversions are correct, except that you should use a StringBuilder for functions that take a char* as a buffer to fill ([in, out]).
Your stack imbalance may be due to the use of different calling conventions. The default calling convention for C# is __stdcall, but your C function is probably __cdecl. If that is the case you will need to add the CallingConvention to your DLLImport attribute.
EDIT: Also, as Groo pointed out, if the pointer arguments in your C function are actually just pointers to a single unsigned int (as opposed to expecting an array), then you should use ref uint instead of a uint[].
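Putting that together, a minimal sketch of the corrected import (the Cdecl convention is an assumption to be checked against the native header, and the 1-byte return marshaling assumes a C/C++ bool rather than a Win32 BOOL):
[DllImport(@"lib.dll", EntryPoint = "something", CallingConvention = CallingConvention.Cdecl)]
[return: MarshalAs(UnmanagedType.I1)]  // assuming a 1-byte C/C++ bool return
public static extern bool something(
byte[] a,
uint a_length,
byte[] b,
ref uint b_length);                    // a single unsigned int, passed by reference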
So I'm trying to invoke a function I have in my unmanaged C++ dll.
void foo(char* in_file, char * out_file)
In my C# application I declare the same function as
[DllImport("dll.dll")]
public static extern void foo(byte[] in_file, byte[] out_file);
I pass the actual parameters through an ASCII encoder like so:
byte[] param1 = Encoding.ASCII.GetBytes(filename1);
byte[] param2 = Encoding.ASCII.GetBytes(filename2);
foo(param1, param2);
Everything works, except when the length of param1 is 0, 36, or 72 characters (bytes). Then for some reason the DLL reports a received length of 1, 38, or 73, with the 0x16 character appended at the very end.
I have wasted half a day trying to debug and understand why this is happening. Any ideas?
What is the significance of multiples of 36?
Edit/Solution:
After I posted this, a possible answer struck me. By appending the null character 0x0 to the end of my strings before converting them to byte arrays, I have eliminated the problem, although I'm not sure whether that is by design.
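For reference, the workaround described above amounts to something like this sketch:
byte[] param1 = Encoding.ASCII.GetBytes(filename1 + "\0"); // explicit null terminator
byte[] param2 = Encoding.ASCII.GetBytes(filename2 + "\0");
foo(param1, param2);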
You've stumbled across the answer yourself - nowhere is the length passed to the function.
You need to think carefully about what functions you are running these on and whether ASCII is really what you want.
You probably really want this, with no ASCII encoding necessary and null termination explicit in the marshaled type:
[DllImport("dll.dll")]
public static extern void foo(
[MarshalAs(UnmanagedType.LPStr)] string in_file,
[MarshalAs(UnmanagedType.LPStr)] string out_file);
But check your foo() implementation and make sure you understand what you really expect to be passed in.
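With LPStr marshaling the runtime produces a null-terminated ANSI copy of each string for the call, so the call site needs no manual encoding (the file names below are made up):
foo(@"C:\data\input.bin", @"C:\data\output.bin");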
I need to call an external DLL from C#. This is the header definition:
enum WatchMode {
WATCH_MODE_SYSTEM = 0,
WATCH_MODE_APPLICATION = 1 };
LONG ADS_API WDT_GetMode ( LONG i_hHandle, WatchMode * o_pWatchMode );
I've added the enum and the call in C#:
public enum WatchMode
{
WATCH_MODE_SYSTEM = 0,
WATCH_MODE_APPLICATION = 1
}
[DllImport("AdsWatchdog.dll")]
internal static extern long WDT_GetMode(long hHandle, ref WatchMode watchmode);
This generates an AccessViolationException. I know the DLL is 'working' because I've also added a call to GetHandle, which returns the hHandle mentioned above. I've tried changing the param to an int (ref int watchmode) but get the same error. Does anyone know how I can P/Invoke the above call?
You're running into a parameter size difference between C# and C++. In the C++/Windows world, LONG is a 4-byte signed integer. In the C# world, long is an 8-byte signed integer. You should change your C# signature to take an int.
ffpf is wrong in saying that you should use an IntPtr here. It will fix this particular problem on a 32-bit machine, since an IntPtr will marshal as an int. If you run this on a 64-bit machine it will marshal as an 8-byte signed integer again and will crash.
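A minimal sketch of the corrected import (the enum is backed by int, so passing it by ref matches the native WatchMode*; replace the placeholder handle with the value returned by the GetHandle call mentioned in the question):
[DllImport("AdsWatchdog.dll")]
internal static extern int WDT_GetMode(int hHandle, ref WatchMode watchmode);
int hHandle = 0; // placeholder: use the handle returned by GetHandle
WatchMode mode = WatchMode.WATCH_MODE_SYSTEM;
int rc = WDT_GetMode(hHandle, ref mode);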
The Managed, Native, and COM Interop Team released the PInvoke Interop Assistant on codeplex. Maybe it can create the proper signature.
http://www.codeplex.com/clrinterop/Release/ProjectReleases.aspx?ReleaseId=14120