I need to call an external DLL from C#. This is the header definition:
enum WatchMode {
WATCH_MODE_SYSTEM = 0,
WATCH_MODE_APPLICATION = 1 };
LONG ADS_API WDT_GetMode ( LONG i_hHandle, WatchMode * o_pWatchMode );
I've added the enum and the call in C#:
public enum WatchMode
{
WATCH_MODE_SYSTEM = 0,
WATCH_MODE_APPLICATION = 1
}
[DllImport("AdsWatchdog.dll")]
internal static extern long WDT_GetMode(long hHandle, ref WatchMode watchmode);
This generates an AccessViolationException. I know the DLL is 'working' because I've also added a call to GetHandle which returns the hHandle mentioned above. I've tried changing the param to an int (ref int watchmode) but get the same error. Does anyone know how I can P/Invoke the above call?
You're running into a parameter size difference between C# and C++. In the C++/Windows world LONG is a 4-byte signed integer. In the C# world long is an 8-byte signed integer. You should change your C# signature to use int for both the return type and the handle parameter; changing only the parameter still leaves the 8-byte return value mismatched.
ffpf is wrong in saying that you should use an IntPtr here. It will fix this particular problem on a 32-bit machine, since an IntPtr will marshal as an int. If you run this on a 64-bit machine it will marshal as an 8-byte signed integer again and will crash.
The Managed, Native, and COM Interop Team released the PInvoke Interop Assistant on CodePlex. Maybe it can create the proper signature.
http://www.codeplex.com/clrinterop/Release/ProjectReleases.aspx?ReleaseId=14120
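For reference, a corrected declaration would look something like this (a sketch; it assumes the native LONG return value and handle are plain 32-bit integers and that WatchMode marshals as its underlying 32-bit int, which is the default for a C# enum):
[DllImport("AdsWatchdog.dll")]
internal static extern int WDT_GetMode(int hHandle, ref WatchMode watchmode);
With both the return type and the handle declared as int, the marshaled sizes match the native LONGs on 32-bit as well as 64-bit.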
I have a C++ DLL which has the below method signature. I would like to know what the compatible C# method to call the C++ DLL should be. On execution I am getting an "Attempted to read or write protected memory" error. The DLL used here is a third-party DLL. I am able to call simple methods without pointers from the C++ DLL.
C++ Declaration:
int __stdcall Deduct(int V, unsigned char *AI);
C# Declaration
[System.Runtime.InteropServices.DllImport("rwl.dll",EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.Cdecl)]
public static extern long Deduct(long i,ref string AI);
As per the documentation of the third-party DLL:
AI Used as input and output buffer. The buffer shall have at least 80 bytes
Input Additional Information for the transaction. It is a byte pointer containing 7 bytes.
e.g. Assume the usage information is not used,
if receipt number = 1234567,
hex value = 0x12D687,
7 bytes AI = D6 87 00 00 00 D6 87
Output On return, the AI contains the UD.
Please help.
I think the C# signature looks wrong.
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.StdCall)]
public static extern int Deduct(int i, char[] AI);
If it is expecting a char[80] as its input you may need to declare it as such and null terminate it yourself, as whatever is reading from it might be reading past the end of the array.
usage
char[] tmp = new char[80];
tmp[0] = (char)0x00;
int i = 0;
Deduct(i, tmp);
I'm not sure if this will work but hopefully it helps.
Besides the string marshaling, I found a likely reason for the exception.
You have specified the return type and the first parameter as int in C, but as long in C#. This can cause problems.
The C language only specifies minimum sizes for its integer types, which means the actual sizes differ between environments:
int in C : signed integer, at least 16 bits
long in C : signed integer, at least 32 bits
int in C# : 32-bit signed integer, always
long in C# : 64-bit signed integer, always
Most major 32/64-bit OSes and compilers use 32 bits for int in C. However, long is always 64 bits in C#, so the signatures mismatch. I would suggest revising to:
public static extern int Deduct(int i, ref string AI); // Still not sure about string
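If the buffer really is raw bytes rather than text, a byte[] sidesteps the string question entirely. A minimal sketch, assuming the __stdcall convention implied by the "_Deduct@8" decorated export name and the 80-byte buffer described in the vendor documentation:
[System.Runtime.InteropServices.DllImport("rwl.dll", EntryPoint = "_Deduct@8", CallingConvention = System.Runtime.InteropServices.CallingConvention.StdCall)]
public static extern int Deduct(int V, byte[] AI);

byte[] ai = new byte[80];       // docs: the buffer shall have at least 80 bytes
ai[0] = 0xD6; ai[1] = 0x87;     // first 7 bytes carry the additional information
ai[5] = 0xD6; ai[6] = 0x87;     // (example values taken from the documentation above)
int result = Deduct(1234, ai);  // 1234 is just a placeholder value
// On return, ai should contain the output data described in the documentation.
Because byte[] is blittable, the marshaler pins the array and the native side writes straight into it, so the output comes back without any extra attributes.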
I am developing a wrapper library that allows my project to use an x86 C++ DLL in an AnyCPU environment. I have no control over the DLL, so I am using DllImport in C#.
There is a provided function which is declared in C++ as: int __stdcall Func(int V, unsigned char *A)
and a provided sample declaration in VB: Private Declare Function Func Lib "lib.dll" Alias "_Func@8" (ByVal V As Long, A As Any) As Long
This function will request a device to Add/Deduct a value to/from a card by passing Convert.ToInt64(decimalValue) as V, and some custom information in A.
Here is the description of A:
It is a byte pointer containing 7 bytes.
The first 5 bytes are used to store info that will be passed to the card log (the last 4 digits of the receipt number should be included in the first 2 bytes; the other 3 could be A3A4A5)
The last 2 bytes are used to store info that will be passed to the device (the last 4 digits of the receipt number)
On return, the A contains a 32 bytes data.
After hours and hours of research and attempts, I cannot get any result other than an AccessViolationException. Please see the following draft code:
[DllImport("lib.dll", EntryPoint="_Func#8")]
public static external Int64 Func(Int64 V, StringBuilder sb);
string ReceiptNum = "ABC1234";
decimal Amount = 10m;
byte[] A = new byte[32];
A[0] = Convert.ToByte(ReceiptNum.Substring(3, 2));
A[1] = Convert.ToByte(ReceiptNum.Substring(5));
A[2] = Convert.ToByte("A3");
A[3] = Convert.ToByte("A4");
A[4] = Convert.ToByte("A5");
A[5] = Convert.ToByte(ReceiptNum.Substring(3, 2));
A[6] = Convert.ToByte(ReceiptNum.Substring(5));
StringBuilder sb = new StringBuilder(
new ASCIIEncoding().GetString(A), A.Length
);
Int64 Result = Func(Convert.ToInt64(Amount), sb);
And at this point it throws the exception. I have tried passing IntPtr, byte*, byte (by A[0]), byval, byref and none of them works. (Tried to deploy as x86 CPU as well)
Would appreciate any help! Thanks for your time!
PS - The reason for using StringBuilder is that the library contains a function that accepts a "char *Data" parameter which causes the same exception, and the solution there was to use StringBuilder to pass it as a pointer. That function's VB declaration is: Private Declare Function Func1 Lib "lib.dll" Alias "_Func1@12" (ByVal c As Byte, ByVal o As Byte, ByVal Data As String) As Long
Your extern declaration is wrong.
StringBuilder is a complex structure containing an array of C# char.
C# chars are UTF-16 (two bytes each, with complex rules for decoding multi-char Unicode characters). Probably not what you are seeking.
If your data is a raw byte buffer you should go for byte[].
Int64 is also C#'s long, i.e. 64 bits, while the native function takes and returns a 32-bit int.
Well, your native method signature takes int, and you're trying to pass a long long. That's not going to work, rather obviously. The same is true with the return value. Don't assume that VB maps clearly to VB.NET, much less C# - Long means a 32-bit integer in VB, but not in .NET. Native code is a very complex environment, and you better know what you're doing when trying to interface with native.
StringBuilder should only be used for character data. That's not your case, and you should use byte[] instead. No matter the fun things you're doing, you're trying to pass invalid unicode data instead of raw bytes. The confusion is probably from the fact that C doesn't distinguish between byte[] and string - both are usually represented as char*.
Additionally, I don't see how you'd expect this wrapper to work in an AnyCPU environment. If the native DLL is 32-bit, you can only use it from a 32-bit process. AnyCPU isn't magic; it just defers the decision of bitness to runtime rather than compile-time.
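Putting that together, a sketch of what the declaration and call might look like (assuming the __stdcall convention implied by the "_Func@8" decorated name, and the buffer sizes from the device description above):
[DllImport("lib.dll", EntryPoint = "_Func@8", CallingConvention = CallingConvention.StdCall)]
public static extern int Func(int V, byte[] A);

byte[] a = new byte[32];              // 32 bytes so the returned data fits
// ... fill the first 7 bytes as the device documentation requires ...
int result = Func(Convert.ToInt32(Amount), a);
And as noted above, this can only run inside a 32-bit process; with AnyCPU on a 64-bit machine you would have to target x86 instead.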
I'm currently having a problem with unsafe pointers which appears to be a compiler bug.
Note: the problem is not with the fact that I am using pointers and unsafe code; the code works pretty well. The problem is related to a confirmed compiler bug that refuses to compile legal code under certain circumstances. If you're interested in that problem, visit my other question
Since my problem is with the declaration of pointer variables, I have decided to work around the problem, and use IntPtr instead, and cast to actual pointers when needed.
However, I noticed that I cannot do something like this:
IntPtr a = something;
IntPtr b = somethingElse;
if (a > b) // ERROR. Can't do this
{
}
The > and < operators don't seem to be defined for IntPtr. Notice that I can indeed compare two actual pointers.
IntPtr has a .ToInt64() method. However, this returns a signed value, which may return incorrect values when comparing with > and < when positive and negative values are involved.
To be honest, I don't really understand what use is there to a .ToInt64() method that returns a signed value, considering that pointer comparisons are performed unsigned, but that's not my question.
One could argue that IntPtrs are opaque handles, and it is therefore meaningless to compare them with > and <. However, I would point out that IntPtr has addition and subtraction methods, which means that there is actually a notion of order for IntPtr, and therefore > and < are indeed meaningful.
I guess I could cast the result of ToInt64() to a ulong and then compare, or cast the IntPtr to a pointer and then do the comparison, but it makes me think why aren't > and < defined for IntPtr.
Why can't I compare two IntPtrs directly?
IntPtr has always been a little neglected. Until .NET 4.0 there weren't even Add/operator+ and Subtract/operator-.
Now... if you want to compare two pointers, cast them to long if they are IntPtr or to ulong if they are UIntPtr. Note that on Windows you'll need a UIntPtr only if you are using a 32-bit program with the /3GB option, because otherwise a 32-bit program can only use the lower 2 GB of address space, while 64-bit programs use much less than 64 bits of address space (48 bits at this time).
Clearly if you are doing kernel programming in .NET this changes :-) (I'm joking here, I hope :-) )
For the reason why IntPtr is preferred to UIntPtr, see: https://msdn.microsoft.com/en-us/library/system.intptr%28v=vs.110%29.aspx
The IntPtr type is CLS-compliant, while the UIntPtr type is not. Only the IntPtr type is used in the common language runtime. The UIntPtr type is provided mostly to maintain architectural symmetry with the IntPtr type.
There are some languages that don't have the distinction between signed and unsigned types. .NET wanted to support them.
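A small illustration of that point: marking an assembly CLS-compliant flags a public UIntPtr member but not an IntPtr one (a sketch; the type and member names are made up):
using System;

[assembly: CLSCompliant(true)]

public class NativeMemory
{
    public IntPtr Address;       // fine: IntPtr is CLS-compliant
    public UIntPtr RawAddress;   // warning CS3003: type of 'RawAddress' is not CLS-compliant
}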
I've done some tests using
editbin /LARGEADDRESSAWARE myprogram.exe
(I was even able to crash my graphics adapter :-) )
static void Main(string[] args)
{
Console.WriteLine("Is 64 bits", Environment.Is64BitProcess);
const int memory = 128 * 1024;
var lst = new List<IntPtr>(16384); // More than necessary
while (true)
{
Console.Write("{0} ", lst.Count);
IntPtr ptr = Marshal.AllocCoTaskMem(memory);
//IntPtr ptr = Marshal.AllocHGlobal(memory);
lst.Add(ptr);
if ((long)ptr < 0)
{
Console.WriteLine("\nptr #{0} ({1}, {2}) is < 0", lst.Count, ptr, IntPtrToUintPtr(ptr));
}
}
}
I was able to allocate nearly 4 GB of memory with a 32-bit program (on a 64-bit OS), so I got negative IntPtr values.
And here is a cast from IntPtr to UIntPtr:
public static UIntPtr IntPtrToUintPtr(IntPtr ptr)
{
if (IntPtr.Size == 4)
{
return unchecked((UIntPtr)(uint)(int)ptr);
}
return unchecked((UIntPtr)(ulong)(long)ptr);
}
Note that thanks to how sign extension works, you can't simply do (UIntPtr)(ulong)(long)ptr, because at 32 bits it will break.
But note that few programs really support > 2 GB on 32 bits... http://blogs.msdn.com/b/oldnewthing/archive/2004/08/12/213468.aspx
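If you want the comparison itself wrapped up, here is a sketch built on the conversion above (the helper name is made up):
public static bool IsLowerAddress(IntPtr a, IntPtr b)
{
    // Compare as unsigned addresses; widen to ulong because UIntPtr
    // doesn't expose < and > here either.
    return IntPtrToUintPtr(a).ToUInt64() < IntPtrToUintPtr(b).ToUInt64();
}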
Comparing IntPtr values is very, very dangerous. That is the core reason why the C# language disallows it, even though the CLR has no problem with it.
IntPtr is frequently used to store an unmanaged pointer value. The big problem is: pointer values are not signed values. Only a UIntPtr is an appropriate managed type to store them. The big problem with UIntPtr is that it is not a CLS-compliant type. Lots of languages don't support unsigned types; Java, JScript and early versions of VB.NET are examples. All the framework methods therefore use IntPtr.
It is especially nasty because it often works without a problem. Starting with the 32-bit versions of Windows, the upper 2 GB of the address space is reserved for the operating system, so all pointer values used in a program are always <= 0x7FFFFFFF. Works just fine in an IntPtr.
But that is not true in every case. You could be running as a 32-bit app in the wow64 emulator on a 64-bit operating system. That upper 2 GB of address space is no longer needed by the OS so you get a 4 GB address space. Which is very nice, 32-bit code often skirts OOM these days. Pointer values now do get >= 0x80000000. And now the IntPtr comparison fails in completely random cases. A value of, say, 0x80000100 actually is larger than 0x7FFFFE00, but the comparison will say it is smaller. Not good. And it doesn't happen very often, pointer values tend to be similar. And it is quite random, actual pointer values are highly unpredictable.
That is a bug that nobody can diagnose.
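To make that concrete, a small sketch with the values mentioned above, as seen from a 32-bit process:
IntPtr high = new IntPtr(unchecked((int)0x80000100));   // an address above 2 GB
IntPtr low = new IntPtr(0x7FFFFE00);
bool signedCompare = high.ToInt64() < low.ToInt64();               // true, but wrong for addresses
bool unsignedCompare = (uint)high.ToInt32() < (uint)low.ToInt32(); // false, the correct answer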
Programmers that use C or C++ can easily make this mistake as well; their language doesn't stop them. Microsoft came up with another way to avoid that misery: such a program must be linked with a specific linker option (/LARGEADDRESSAWARE) to get more than 2 GB of address space.
IMHO, IntPtr wasn't designed for things like greater/less comparison. It is a structure which stores a memory address and should only be tested for equality. You should not care about the relative position of anything in memory (which is managed by the CLR). It's like asking which IP address on the Internet is higher.
Follow-up to "PInvoke byte array to char not behaving properly in 64 bit". (That question is stale and my suspicions were wrong, so its title and description were unfitting.)
I am using P/Invoke to call C++ code from C#. I have both the C# and C++ projects set to build as x64 in the build configurations of VS. When I run this program, the parameters of the P/Invoke call are shifted by 32 bits, as follows:
C# : | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 |
            |             |             |             |
            v             v             v             v
C++: |     ???     | Parameter 1 | Parameter 2 | Parameter 3 |   (Parameter 4 gets garbage)
So if I pass 1,2,3,4 from the C# side, the C++ side receives 2,3,4,garbage.
I have worked around this by passing an extra int in front of the C# parameters without changing the C++ side. This offsets the parameters by 32 bits and realigns them, and the program works perfectly.
Does anyone know what is causing this strange offset and the proper way to correct it?
Here is a simplified example showing my method signatures
C# side:
[DllImport(#"C:\FullPath\CppCode.dll", EntryPoint = "MethodName",
CallingConvention = CallingConvention.Cdecl))]
private static extern bool MethodName(parameters);
C++ side:
extern "C" __declspec(dllexport)
bool CppClass::MethodName(parameters)
I suspect that since the parameters are off by 32 bits, there might be something that isn't really being done in 64 bit properly. Perhaps when calling a method, there is an implicit pointer that is passed before the variables? If that is only a 32 bit pointer on the C# side and the C++ is expecting a 64 bit pointer, that could cause this offset situation, but I'm not sure.
MethodName should not be a class instance method, as the first bytes (4 in the x86 world, 8 in the x64 world) will be used for the class instance pointer in the unmanaged world.
So it should be a static member function (or a C-style free function).
I am using C# in Mono and I'm trying to use pinvoke to call a Linux shared library.
The c# call is defined as:
[DllImport("libaiousb")]
extern static ulong AIOUSB_Init();
The Linux function is defined as follows:
unsigned long AIOUSB_Init() {
return(0);
}
The compile command for the Linux code is:
gcc -ggdb -std=gnu99 -D_GNU_SOURCE -c -Wall -pthread -fPIC
-I/usr/include/libusb-1.0 AIOUSB_Core.c -o AIOUSB_Core.dbg.o
I can call the function ok but the return result is bonkers. It should be 0 but I'm getting some huge mangled number.
I've put printf's in the Linux code just before the function value is returned and it is correct.
One thing I have noticed that is a little weird is that the printf should occur before the function returns. However, I see the function return to C#, then the C# code prints the return result, and finally the printf output is displayed.
From here:
An unsigned long can hold all the values between 0 and ULONG_MAX inclusive. ULONG_MAX must be at least 4294967295. The long types must contain at least 32 bits to hold the required range of values.
For this reason a C unsigned long is usually translated to a .NET UInt32:
[DllImport("libaiousb")]
extern static uint AIOUSB_Init();
You're probably running that on a system where the C unsigned long is 32 bits, while C#'s ulong is always 64 bits, so the upper half of the value you read back is garbage. (On 64-bit Linux, unsigned long is 64 bits under the LP64 model, so this only applies if the library is built as 32-bit code.) If you want to make sure the return value is a 64-bit unsigned integer, include stdint.h and return a uint64_t from AIOUSB_Init().