JNI - Disconnected from the target VM, during the execution of native code - c#

I'm getting a Java exception while executing my program. Although Java catches the exception, the native code keeps executing without producing the intended result.
Here is the native code that I suspect is at fault:
System::Collections::Generic::List<cli::array<unsigned char>^>^ processImages(unsigned char* frontImage, unsigned char* backImage);
jbyteArray convertUnsignedCharIntoJByteArray(JNIEnv *env,array<unsigned char>^ inputArray);
unsigned char* convertJByteArrayIntoUnsignedChar(JNIEnv *env, jbyteArray inImage);
JNIEXPORT jobject JNICALL Java_com_me_NativeCaller_processImages
    (JNIEnv *env, jclass c, jbyteArray front, jbyteArray back){
    unsigned char* frontBuffer = convertJByteArrayIntoUnsignedChar(env, front);
    unsigned char* backBuffer = convertJByteArrayIntoUnsignedChar(env, back);
    jbyteArray intermediateArray1 = convertUnsignedCharIntoJByteArray(env,(array<unsigned char>^)returnedValue[0]);
    jbyteArray intermediateArray2 = convertUnsignedCharIntoJByteArray(env,(array<unsigned char>^)returnedValue[1]);
    int aLen1 = strlen(reinterpret_cast<const char*>(&intermediateArray1));
    int aLen2 = strlen(reinterpret_cast<const char*>(&intermediateArray2));
    jbyteArray finalArray = env->NewByteArray(2);
    env->SetByteArrayRegion(finalArray,0,1,(jbyte*)&intermediateArray1);
    env->SetByteArrayRegion(finalArray,1,2,(jbyte*)&intermediateArray2);
    return static_cast<jobject>(finalArray);
}
/*takes an array<unsigned char>^ as input and convert it into a jbyteArray*/
jbyteArray convertUnsignedCharIntoJByteArray(JNIEnv *env,array<unsigned char>^ inputArray){
    int aLen = strlen(reinterpret_cast<const char*>(&inputArray));
    jbyteArray intermediateArray = env->NewByteArray(aLen);
    env->SetByteArrayRegion(intermediateArray,0,aLen,(jbyte*)&inputArray);
    return intermediateArray;
}
To be precise, I think the way I convert the managed unsigned char array into a jbyteArray is not correct.
Can someone show me where I went wrong and a possible way to overcome this problem?

This looks incorrect:
jbyteArray intermediateArray1 = convertUnsignedCharIntoJByteArray(env,(array<unsigned char>^)returnedValue[0]);
jbyteArray intermediateArray2 = convertUnsignedCharIntoJByteArray(env,(array<unsigned char>^)returnedValue[1]);
//...
env->SetByteArrayRegion(finalArray,0,1,(jbyte*)&intermediateArray1);
env->SetByteArrayRegion(finalArray,1,2,(jbyte*)&intermediateArray2);
I'm not familiar with the C++/CLI "^" syntax, so I'm focusing on the jbyteArray and the SetByteArrayRegion() calls. A jbyteArray is an alias for a pointer. Given that, the calls to SetByteArrayRegion() are not correct. They should be:
env->SetByteArrayRegion(finalArray,0,1,(jbyte*)intermediateArray1);
env->SetByteArrayRegion(finalArray,1,2,(jbyte*)intermediateArray2);
You may also want to check the other places in your code where you take the address of an array with &array. Again, I'm not familiar with the nuances of the "^" syntax, but I have used traditional C++ to implement JNI code.

Related

Passing a return char* from C++ to C# with DllImport

I am a newbie at both C# WPF and C++.
Recently, I got an external .dll that returns a char*, and I want to receive the return value in C# using DllImport, then use str.Split(';') to separate the values. To test this, I created a button that shows the first part of the split string on a label when I click it.
So I use an IntPtr to receive the char* coming from the C++ .dll and call Marshal.PtrToStringAnsi() to turn it into a string. However, when I execute the code, it sometimes works and sometimes crashes. The error always shows:
Unhandled exception at 0x00007FFC06839269 (ntdll.dll) in UITest.exe: 0xC0000374: heap corruption (parameters: 0x00007FFC068A27F0).
I thought my code was reasonable and I couldn't find the root cause. Can anyone help me? Thanks!
The below shows the content in .dll and also the code I used in C#.
C++ code in Dlltest.h:
#define DLL_EXPORT extern "C" __declspec(dllexport)
char* getRbtData = nullptr;
DLL_EXPORT char* func_getRbtData();
C++ code in Dlltest.cpp:
char* func_getRbtData()
{
    getRbtData = new char(128);
    memset(getRbtData, 0, strlen(getRbtData));
    char* _getRbtData = "1.0;23.0;55.0;91.0;594.0;";
    memcpy(getRbtData, _getRbtData, strlen(_getRbtData));
    return getRbtData;
};
C# code in UITest.xaml.cs:
[DllImport("DllTest.dll",EntryPoint = "func_getRbtData", CharSet = CharSet.Ansi)]
public static extern IntPtr func_getRbtData();
string[] words;
private void btn_test_Click(object sender, RoutedEventArgs e)
{
    IntPtr intptr = func_getRbtData();
    string str = Marshal.PtrToStringAnsi(intptr);
    words = str.Split(';');
    lb_content.Content = words[1];
}
There are several problems with your code.
On the C++ side, your DLL function is implemented all wrong:
getRbtData = new char(128);
You are allocating a single char whose value is 128, not an array of 128 chars. You need to use new char[128] instead for that.
memset(getRbtData, 0, strlen(getRbtData));
getRbtData is not a pointer to a null-terminated string, so strlen(getRbtData) is undefined behavior. It reads into surrounding memory while calculating the length until it finds a random 0x00 byte in memory.
And then the subsequent memset() into getRbtData will overwrite that surrounding memory. If it doesn't just crash outright.
char* _getRbtData = "1.0;23.0;55.0;91.0;594.0;";
Prior to C++11, this assignment is OK but discouraged. In C++11 and later, this assignment is actually illegal and won't compile.
String literals are read-only data, so you need to use const char instead of char in your pointer type. You should do that even in older compilers.
memcpy(getRbtData, _getRbtData, strlen(_getRbtData));
strlen(_getRbtData) is OK since _getRbtData is a pointer to a null-terminated string.
However, since getRbtData is not allocated with enough memory to receive all of the copied chars, memcpy() into getRbtData is also undefined behavior and will trash memory, if not crash outright.
return getRbtData;
This is OK to pass the pointer to C#.
However, since the memory is being allocated with new (better, new[]), it needs to be freed with delete (delete[]), which you are not doing. So you are leaking the memory.
Marshal.PtrToStringAnsi() on the C# side will not (and cannot) free new'ed memory for you. So your C# code will need to pass the pointer back to the DLL so it can delete the memory properly.
Otherwise, you will need to allocate the memory using the Win32 API LocalAlloc() or CoTaskMemAlloc() function so that the Marshal class can be used on the C# side to free the memory directly without passing it back to the DLL at all.
On the C# side, you are using the wrong calling convention on your DllImport statement. The default is StdCall for compatibility with most Win32 API functions. But your DLL function is not specifying any calling convention at all. Most C/C++ compilers will default to __cdecl unless configured differently.
With that said, try this instead:
Dlltest.h
#define DLL_EXPORT extern "C" __declspec(dllexport)
DLL_EXPORT char* func_getRbtData();
DLL_EXPORT void func_freeRbtData(char*);
Dlltest.cpp
char* func_getRbtData()
{
    const char* _getRbtData = "1.0;23.0;55.0;91.0;594.0;";
    int len = strlen(_getRbtData);
    char *getRbtData = new char[len+1];
    // alternatively:
    /*
    char *getRbtData = (char*) LocalAlloc(LMEM_FIXED, len+1);
    if (!getRbtData) return NULL;
    */
    memcpy(getRbtData, _getRbtData, len+1);
    return getRbtData;
}

void func_freeRbtData(char *p)
{
    delete[] p;
    // alternatively:
    // LocalFree((HLOCAL)p);
}
UITest.xaml.cs
[DllImport("DllTest.dll", EntryPoint = "func_getRbtData", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr func_getRbtData();
[DllImport("DllTest.dll", EntryPoint = "func_freeRbtData", CallingConvention = CallingConvention.Cdecl)]
public static extern void func_freeRbtData(IntPtr p);
string[] words;
private void btn_test_Click(object sender, RoutedEventArgs e)
{
    IntPtr intptr = func_getRbtData();
    string str = Marshal.PtrToStringAnsi(intptr);
    func_freeRbtData(intptr);
    // alternatively:
    // Marshal.FreeHGlobal(intptr);
    words = str.Split(';');
    lb_content.Content = words[1];
}
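If you prefer the CoTaskMemAlloc() route mentioned above instead of LocalAlloc(), you don't need to export a free function at all, because the Marshal class knows that allocator. A minimal sketch, assuming the native side is changed to allocate with CoTaskMemAlloc():
// Native side (assumed change): getRbtData = (char*) CoTaskMemAlloc(len + 1);
IntPtr intptr = func_getRbtData();
string str = Marshal.PtrToStringAnsi(intptr);
// Memory from CoTaskMemAlloc can be released directly from managed code.
Marshal.FreeCoTaskMem(intptr);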
new char(128) returns a pointer to one character, with initial value 128.
I could tell you how to allocate 128 characters, but the chief problem is that you would have no way to clean that memory up, so it's not a useful answer. Check the existing questions about returning a string to C#.

C DLL does NOT Work in C# but Does work in C++

I don't understand why the code below works with my C++ but just does not want to work with C#.
Can you please help me understand what is wrong here? I should also say that I am absolutely new to C#.
My_Lib.h
extern "C" __declspec(dllexport) void __cdecl get(char** buffer);
My_lib.c
void get(char** buffer)
{
    *buffer = (char*)calloc(6, sizeof(char));
    assert(*buffer);
    buffer[5] = '\0';
    *buffer = "Hello";
}
In my C#:
public static class NativeMethods
{
    [DllImport("My_C_Lib.dll", CallingConvention = CallingConvention.Cdecl)]
    unsafe public static extern void get(char** buffer);
}

//////////////////// Main()///////
unsafe
{
    char* my_buf;
    NativeMethods.get(&my_buf);
    string s = new string(my_buf);
    Console.WriteLine(s);
}
Note: the DLL does work when I call this C function from C++, but as I said above, not from C#. There are no errors, but the string s variable in C# prints some undefined symbols, even though the DLL "works".
Thanks in advance!
The code is nearly correct, but...
C DLL does NOT Work in C# but Does work in C++
In C and C++ char is a 8-bit data type. In C# char is a 16-bit data type.
This means that C# expects the pointer returned by the get() function to point to a "wide" string, while C/C++ treats it as an "ANSI" string.
I simply changed one single line in your program:
*buffer = "H\0e\0l\0l\0o\0\0\0";
... and it works!
You may of course also use the "wide string" functions of the modern C compilers:
void get(wchar_t** buffer)
{
    *buffer = L"Hello";
}
By the way
There is another error in your program:
*buffer = (char*)calloc(6, sizeof(char));
...
*buffer = "Hello";
This makes no sense:
The second line (*buffer = "Hello";) does not copy the string into the memory allocated by calloc; it overwrites *buffer with the address of the string literal "Hello", so the address returned by calloc is lost (and that allocation is leaked).
Your best bet is to change the PInvoke signature of your C function to taking a ref IntPtr:
[DllImport("My_C_Lib.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern void get(ref IntPtr buffer);
To call this, you'll need to instantiate an IntPtr and pass it by reference to the C function (this is analogous to declaring a char * in C and passing it by address):
IntPtr ptr;
get(ref ptr);
// ptr is now an unmanaged pointer to the allocated string
Now, you need to convert the pointed-to string to a managed string. To do this you must make a copy of the string.
string str = Marshal.PtrToStringAnsi(ptr);
Be sure you understand the distinctions between ANSI and Unicode strings and make sure you call the appropriate PtrToString...() variant.
Note that the .NET runtime will not manage the allocated ptr for you since it was allocated by unmanaged code. You must free it yourself using whatever the appropriate mechanism is for your DLL (ideally, the DLL should provide a corresponding free function since it is the one who allocated the memory in the first place).
For robustness, add appropriate error / null pointer checking. Make sure get() succeeded and that ptr is not IntPtr.Zero.
To avoid leaks, if there is any possibility of any code throwing an exception between the time get() returns and the time you free the pointer, the try/finally construct is your friend.
(Aside: Note that get is a Contextual Keyword in C# and thus while you can use it as an identifier, you may prefer not to in order to avoid confusion.)
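Putting those pieces together, here is a minimal sketch of the whole pattern. The free_buffer export is hypothetical (your DLL does not currently provide one); it stands in for whatever release function the DLL would ideally expose:
using System;
using System.Runtime.InteropServices;

public static class NativeMethods
{
    // Existing export, declared with ref IntPtr instead of char**.
    [DllImport("My_C_Lib.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern void get(ref IntPtr buffer);

    // Hypothetical export: the DLL would need to add this so the buffer
    // is freed by the same allocator that created it.
    [DllImport("My_C_Lib.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern void free_buffer(IntPtr buffer);
}

class Program
{
    static void Main()
    {
        IntPtr ptr = IntPtr.Zero;
        try
        {
            NativeMethods.get(ref ptr);
            if (ptr == IntPtr.Zero)
                throw new InvalidOperationException("get() returned a null pointer.");

            // Copy the unmanaged ANSI string into a managed string.
            string s = Marshal.PtrToStringAnsi(ptr);
            Console.WriteLine(s);
        }
        finally
        {
            // Free the unmanaged buffer even if an exception was thrown.
            if (ptr != IntPtr.Zero)
                NativeMethods.free_buffer(ptr);
        }
    }
}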

Issues import C DLL into C# program

I've been reading about this problem a lot before deciding to post it here. I have a C DLL that I need to access from a C# program. The DLL's code is pretty simple, as it's nothing more than a hook into a biometrics driver (it's written in C because it has to include lib and header files from the driver for the code to work). Here is the code for the DLL:
#include <stdio.h>
#include <windows.h>
#include <DPCN_TM.h>
__declspec(dllexport) unsigned char * ConvertPlatinumTemplateToDpTemplate(const unsigned char* inputData);
HRESULT Convert(
    const unsigned char* inputData,
    const size_t size,
    DPCN_DATA_TYPE inputDataType,
    DPCN_DATA_TYPE outputDataType,
    DPCN_PURPOSE purpose,
    unsigned char** ppbOutputData,
    const void * outputParameters,
    size_t * pcbData)
{
    HRESULT hr = 0;
    size_t cbData = 0;
    if (FAILED(hr = DPCNConvertFingerprintData(inputData, size, inputDataType, purpose, NULL, outputDataType, outputParameters, NULL, &cbData))) {
        return hr;
    }
    if (!(*ppbOutputData = (unsigned char *)malloc(cbData))) {
        return DPCN_ERR_NO_MEMORY;
    }
    hr = DPCNConvertFingerprintData(inputData, size, inputDataType, purpose, NULL, outputDataType, outputParameters, *ppbOutputData, &cbData);
    *pcbData = cbData;
    return hr;
}

unsigned char * ConvertPlatinumTemplateToDpTemplate(const unsigned char* inputData) {
    HRESULT hr = 0;
    const size_t inputSize = sizeof(inputData);
    DPCN_DATA_TYPE inputDataType = DPCN_DT_DP_TEMPLATE;
    DPCN_DATA_TYPE outputDataType = DPCN_DT_DP_PLATINUM_TEMPLATE;
    unsigned char *pbOutputData = NULL;
    size_t cbData = 0;
    hr = Convert(inputData, inputSize, inputDataType, outputDataType, DPCN_PURPOSE_IDENTIFICATION, &pbOutputData, NULL, &cbData);
    return pbOutputData;
}
As you can see, the contents of the DLL are pretty straightforward. From the code you can see that I need to access this function from the C# program:
unsigned char * ConvertPlatinumTemplateToDpTemplate(const unsigned char* inputData);
Now in my C# code I have done this:
[DllImport(@"path_to_dll\DPFPTemplateConvert.dll")]
public static extern byte[] ConvertPlatinumTemplateToDpTemplate(byte[] inputData);
When I call the function I end up getting this error:
A first chance exception of type 'System.Runtime.InteropServices.MarshalDirectiveException' occurred in DLLImportTest.exe
Additional information: Cannot marshal 'return value': Invalid managed/unmanaged type combination.
What am I doing wrong?
An unsigned char * cannot be converted to a .NET byte array, for one simple reason: what should be the length of that array?
And even then, it's a bad idea to pass a pointer out of your function, if that pointer is pointing to memory allocated by that function. Who will release this memory?
You should have the .NET side allocate the byte[] for the result, and pass it into the function.
If the .NET side does not know beforehand how big the allocated array needs to be, use a callback as explained here: http://blog.getpaint.net/2012/04/30/marshaling-native-arrays-back-as-managed-arrays-without-copying/
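For illustration, here is a minimal sketch of the caller-allocated-buffer approach. The reworked native signature (shown in the comment), the Cdecl convention, and the 4096-byte buffer size are all assumptions, not the DLL's current export:
// Assumed reworked native export:
//   int ConvertPlatinumTemplateToDpTemplate(const unsigned char* inputData, size_t inputSize,
//                                           unsigned char* outputBuffer, size_t* outputSize);
[DllImport(@"path_to_dll\DPFPTemplateConvert.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int ConvertPlatinumTemplateToDpTemplate(
    byte[] inputData, UIntPtr inputSize,
    byte[] outputBuffer, ref UIntPtr outputSize);

// Usage: the managed side owns the buffer, so nothing needs to be freed by hand.
byte[] input = GetPlatinumTemplateBytes();      // hypothetical source of the input template
byte[] output = new byte[4096];                 // caller-allocated; the size is an assumption
UIntPtr outputSize = (UIntPtr)(uint)output.Length;
int hr = ConvertPlatinumTemplateToDpTemplate(input, (UIntPtr)(uint)input.Length, output, ref outputSize);
// On success, the first outputSize bytes of output hold the converted template.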

How do I call this c function in c# (unmarshalling return struct)?

I want to use C# interop to call a function from a DLL written in C. I have the header files.
Take a look at this:
enum CTMBeginTransactionError {
    CTM_BEGIN_TRX_SUCCESS = 0,
    CTM_BEGIN_TRX_ERROR_ALREADY_IN_PROGRESS,
    CTM_BEGIN_TRX_ERROR_NOT_CONNECTED
};

#pragma pack(push)
#pragma pack(1)
struct CTMBeginTransactionResult {
    char * szTransactionID;
    enum CTMBeginTransactionError error;
};

struct CTMBeginTransactionResult ctm_begin_customer_transaction(const char * szTransactionID);
How do I call ctm_begin_customer_transaction from C#? The const char * maps well to string, but despite various attempts (looking at Stack Overflow and other sites), I fail to marshal the return structure. If I define the function to return IntPtr, it works OK...
Edit
I changed the return type to IntPtr and use:
CTMBeginTransactionResult structure = (CTMBeginTransactionResult)Marshal.PtrToStructure(ptr, typeof(CTMBeginTransactionResult));
but it throws AccessViolationException
I also tried:
IntPtr ptr = Transactions.ctm_begin_customer_transaction("");
int size = 50;
byte[] byteArray = new byte[size];
Marshal.Copy(ptr, byteArray, 0, size);
string stringData = Encoding.ASCII.GetString(byteArray);
stringData == "70e3589b-2de0-4d1e-978d-55e22225be95\0\"\0\0\a\0\0\b\b?" at this point. The "70e3589b-2de0-4d1e-978d-55e22225be95" is the szTransactionID from the struct. Where is the Enum? Is it the next byte?
There's a memory management problem hidden in this struct. Who owns the C string pointer? The pinvoke marshaller will always assume that the caller owns it, so it will try to release the string by passing the pointer to CoTaskMemFree(), the same function as the one called by Marshal.FreeCoTaskMem(). These functions use the COM memory allocator, the universal interop memory manager in Windows.
This rarely comes to a good end; C code does not typically use that allocator unless the programmer designed the code with interop in mind. In that case he'd never have used a struct as a return value anyway, since interop works with much less trouble when the caller supplies the buffers.
So you cannot afford to let the marshaller do its normal duty. You must declare the return value type as IntPtr so it doesn't try to release the string. And you must marshal it yourself with Marshal.PtrToStructure().
That, however, still leaves the question unanswered: who owns the string? There is nothing you can do to release the string buffer, since you don't have access to the allocator used in the C code. The only hope you have is that the string wasn't actually allocated on the heap. That's possible; the C program might be using string literals. You need to verify that guess. Call the function a billion times in a test program. If that doesn't explode the program then you're good. If not, then only C++/CLI can solve your problem. Given the nature of the string (a "transaction ID" ought to change a lot), I'd say you do have a problem.
I hate to answer my own question, but I found the solution to marshal the resulting struct. The struct is 8 bytes long (4 bytes for the char * and 4 bytes for enum). Marshalling the string does not work automatically, but the following works:
// Native (unmanaged)
public enum CTMBeginTransactionError
{
    CTM_BEGIN_TRX_SUCCESS = 0,
    CTM_BEGIN_TRX_ERROR_ALREADY_IN_PROGRESS,
    CTM_BEGIN_TRX_ERROR_NOT_CONNECTED
};

// Native (unmanaged)
[StructLayout(LayoutKind.Sequential, Pack = 1, CharSet = CharSet.Ansi)]
internal struct CTMBeginTransactionResult
{
    public IntPtr szTransactionID;
    public CTMBeginTransactionError error;
};

// Managed wrapper around native struct
public class BeginTransactionResult
{
    public string TransactionID;
    public CTMBeginTransactionError Error;

    internal BeginTransactionResult(CTMBeginTransactionResult nativeStruct)
    {
        // Manually marshal the string
        if (nativeStruct.szTransactionID == IntPtr.Zero) this.TransactionID = "";
        else this.TransactionID = Marshal.PtrToStringAnsi(nativeStruct.szTransactionID);
        this.Error = nativeStruct.error;
    }
}

[DllImport("libctmclient-0.dll")]
internal static extern CTMBeginTransactionResult ctm_begin_customer_transaction(string ptr);

public static BeginTransactionResult BeginCustomerTransaction(string transactionId)
{
    CTMBeginTransactionResult nativeResult = Transactions.ctm_begin_customer_transaction(transactionId);
    return new BeginTransactionResult(nativeResult);
}
The code works, but I still need to investigate whether calling the unmanaged code results in memory leaks.
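One way to run that leak check with the wrapper above: call it in a tight loop and watch the process's memory (for example in Task Manager or PerfMon). The loop count is arbitrary; steadily growing memory would suggest the native side heap-allocates szTransactionID and a native or C++/CLI release path is needed:
// Rough leak probe, not production code.
for (int i = 0; i < 1000000; i++)
{
    BeginTransactionResult r = BeginCustomerTransaction("test");
    if (r.Error != CTMBeginTransactionError.CTM_BEGIN_TRX_SUCCESS)
        break;
}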

Correct way to marshall uchar[] from native dll to byte[] in c#

I'm trying to marshal some data that my native DLL allocated via CoTaskMemAlloc into my C# application, and I'm wondering if the way I'm doing it is just plain wrong or if I'm missing some subtle decoration of the method on the C# side.
Currently I have, on the C++ side:
extern "C" __declspec(dllexport) bool __stdcall CompressData( unsigned char* pInputData, unsigned int inSize, unsigned char*& pOutputBuffer, unsigned int& uOutputSize)
{ ...
pOutputBuffer = static_cast<unsigned char*>(CoTaskMemAlloc(60000));
uOutputSize = 60000;
And on the C# side:
private const string dllName = "TestDll.dll";
[System.Security.SuppressUnmanagedCodeSecurity]
[DllImport(dllName)]
public static extern bool CompressData(byte[] inputData, uint inputSize, out byte[] outputData, out uint outputSize );
...
byte[] outputData;
uint outputSize;
bool ret = CompressData(packEntry.uncompressedData, (uint)packEntry.uncompressedData.Length, out outputData, out outputSize);
Here outputSize is 60000 as expected, but outputData has a size of 1, and when I memset the buffer on the C++ side, only 1 byte seems to be copied across. So is this approach just wrong, and do I need to marshal the data outside the call using an IntPtr plus outputSize, or is there something subtle I'm missing to get what I already have working?
Thanks.
There are two things.
First, the P/Invoke layer does not handle C++ reference parameters; it can only work with pointers. The last two parameters (pOutputBuffer and uOutputSize) in particular are not guaranteed to marshal correctly.
I suggest you change your C++ method declaration to (or create a wrapper of the form):
extern "C" __declspec(dllexport) bool __stdcall CompressData(
unsigned char* pInputData, unsigned int inSize,
unsigned char** pOutputBuffer, unsigned int* uOutputSize)
That said, the second problem comes from the fact that the P/Invoke layer also doesn't know how to marshal back "raw" arrays (as opposed to, say, a SAFEARRAY in COM, which knows about its size) that are allocated in unmanaged code.
This means that on the .NET side, you have to marshal the pointer that is created back, and then marshal the elements in the array manually (as well as dispose of it, if that's your responsibility, which it looks like it is).
Your .NET declaration would look like this:
[System.Security.SuppressUnmanagedCodeSecurity]
[DllImport(dllName)]
public static extern bool CompressData(byte[] inputData, uint inputSize,
ref IntPtr outputData, ref uint outputSize);
Once you have the outputData as an IntPtr (this will point to the unmanaged memory), you can convert it into a byte array by calling the Copy method on the Marshal class, like so:
var bytes = new byte[(int) outputSize];
// Copy.
Marshal.Copy(outputData, bytes, 0, (int) outputSize);
Note that if the responsibility is yours to free the memory, you can call the FreeCoTaskMem method, like so:
Marshal.FreeCoTaskMem(outputData);
Of course, you can wrap this up into something nicer, like so:
static byte[] CompressData(byte[] input, int size)
{
    // The output buffer.
    IntPtr output = IntPtr.Zero;
    // Wrap in a try/finally, to make sure the unmanaged array
    // is cleaned up.
    try
    {
        // Length.
        uint length = 0;
        // Make the call.
        CompressData(input, (uint) size, ref output, ref length);
        // Allocate the bytes.
        var bytes = new byte[(int) length];
        // Copy.
        Marshal.Copy(output, bytes, 0, bytes.Length);
        // Return the byte array.
        return bytes;
    }
    finally
    {
        // If the pointer is not zero, free.
        if (output != IntPtr.Zero) Marshal.FreeCoTaskMem(output);
    }
}
The pinvoke marshaller cannot guess how large the returned byte[] might be. Raw pointers to memory in C++ do not have a discoverable size of the pointed-to memory block. Which is why you added the uOutputSize argument. Good for the client program but not quite good enough for the pinvoke marshaller. You have to help and apply the [MarshalAs] attribute to pOutputBuffer, specifying the SizeParamIndex property.
Do note that the array is getting copied by the marshaller. That's not so desirable; you can avoid it by allowing the client code to pass in an array. The marshaller will pin it and pass the pointer to the managed array. The only trouble is that the client code has no decent way to guess how large to make the array. The typical solution is to allow the client to call it twice: first with uOutputSize = 0, in which case the function returns the required array size. That would make the C++ function look like this:
extern "C" __declspec(dllexport)
int __stdcall CompressData(
const unsigned char* pInputData, unsigned int inSize,
[Out]unsigned char* pOutputBuffer, unsigned int uOutputSize)
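For reference, the matching C# side of that two-call pattern might look something like the sketch below; the exact declaration is an assumption based on the revised signature above (the __stdcall convention matches the DllImport default):
[System.Security.SuppressUnmanagedCodeSecurity]
[DllImport(dllName)]
public static extern int CompressData(byte[] inputData, uint inputSize,
    [Out] byte[] outputBuffer, uint outputSize);

// First call with a null buffer to learn the required size, then allocate a
// managed array and let the marshaller pin it and pass it for the second call.
byte[] input = packEntry.uncompressedData;
int required = CompressData(input, (uint)input.Length, null, 0);
byte[] output = new byte[required];
CompressData(input, (uint)input.Length, output, (uint)output.Length);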
