I am calling a C++ COM component using interop and the marshalling requires one of the parameters to be passed in as ref Byte. The argument is actually a string. How do I convert a string (or a char array) to Byte to pass to this method?
Method IDL
[helpstring("method Read")]
HRESULT _stdcall Read(
[in] unsigned char* path,
[in] unsigned char command,
[in] unsigned char nShortAddr,
[out] short* pnDataSize,
[out] unsigned char** ppbyData,
[out] unsigned long* pnError);
The IL
.method public hidebysig newslot virtual
instance void Read([in] uint8& path,
[in] uint8 command,
[in] uint8 nShortAddr,
[out] int16& pnDataSize,
[out] native int ppbyData,
[out] uint32& pnError) runtime managed internalcall
The method in the wrapper as seen in Visual Studio
public virtual void Read(ref byte path, byte command, byte nShortAddr,
out short pnDataSize, System.IntPtr ppbyData, out uint pnError)
This is just a poorly authored COM interface. It can only work when used in-process from C++ code, which is probably all the author intended. An argument type declaration like unsigned char* is ambiguous: it could mean a single byte passed by reference, but it could also indicate an array of bytes. IDL attributes can distinguish between the two, but none were used here. The type library importer therefore could not assume anything other than ref byte.
There's only one decent fix if the source IDL can't be updated: you have to edit the interop library. Do so by decompiling the existing one with ildasm.exe, then editing the argument type; that is best done by first creating a dummy C# interface with the right declaration to use as an example. Put Humpty Dumpty back together with ilasm.exe.
You cannot convert a String to a single Byte.
But a String can be converted to a Byte[], for example by using the ASCII encoding.
A string in .NET is a sequential collection of Unicode characters used to represent text; a String object is a sequential collection of System.Char objects.
You can get the String as a Byte[] using System.Text.Encoding.ASCII.GetBytes(my_string):
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(my_string);
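Tying that back to the original Read signature: encode the path, add a trailing NUL (an assumption that the native side expects a C-style string), and pass a reference to the first element, which works for in-process calls because the array is contiguous. The device object and the Read call are shown commented out, since they need the actual interop wrapper:

```csharp
using System;
using System.Text;

string path = "COM3";
// Append a 0 terminator on the assumption that the C++ side wants a C string.
byte[] pathBytes = Encoding.ASCII.GetBytes(path + "\0");

// Against the imported `ref byte path` signature, pass the first element
// by reference; the rest of the array follows it in memory:
// device.Read(ref pathBytes[0], command, nShortAddr,
//             out short size, dataPtr, out uint error);

Console.WriteLine(pathBytes.Length); // 5: four characters plus the terminator
Console.WriteLine(pathBytes[4]);     // 0
```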
Related
I have a very basic call from C# to C++. The function call takes three parameters:
extern "C" int CLIENT_API example(const char, char * const, const unsigned int);
The C# code imports that function as:
[DllImport("my.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int example(in sbyte id, byte[] buffer, in uint bufferSize);
When I call the function from C# like this:
uint bufferSize = 400;
byte[] buffer = new byte[bufferSize];
example(0, buffer, bufferSize);
On the C++ side I log varying values for both id and bufferSize (every run shows different values for those two variables). What am I doing wrong? I have other interop functions that pass pointers and strings to my C++ DLL without any issues. It seems to happen only with the more primitive types like byte and uint.
The in modifier makes the argument a read-only ref argument. It's not the same as the [In] attribute, which is a marshalling hint.
You're passing 0 and bufferSize, and C# is taking their addresses (the literal 0 by way of a temporary variable) and passing those.
You're getting varying values because the address is different each time.
Remove the in modifiers so that C# passes those values rather than addresses.
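The corrected declaration, plus a small pure-C# demonstration that `in` really does pass an address (using the Unsafe class from System.Runtime.CompilerServices, available on modern .NET); SameLocation is a throwaway helper written for this demo:

```csharp
using System;
using System.Runtime.CompilerServices;

// Corrected import: plain by-value parameters, no `in` modifiers.
// [DllImport("my.dll", CallingConvention = CallingConvention.Cdecl)]
// public static extern int example(sbyte id, byte[] buffer, uint bufferSize);

int value = 42;
// Prints True: the `in` parameter aliases the same memory as `value`.
Console.WriteLine(SameLocation(in value, ref value));

static bool SameLocation(in int a, ref int b) =>
    Unsafe.AreSame(ref Unsafe.AsRef(in a), ref b);
```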
I'm reading all about structure marshalling between C and C# in a 64-bit environment, without success.
What I need to do is to pass the structure
typedef struct DescStructTa
{
char* pszNa;
unsigned long ulTId;
char* pszT;
unsigned short usRId;
unsigned long ulOff;
unsigned long ulSi;
char szAcc[2];
unsigned char bySSize;
} DescStruct;
from a C# application to a C DLL not made by us calling the method
MT_DLL_DECL long GetAll(UINTPTR ulHandler, DescStruct **ppList, unsigned long *pulNumS);
After long search I wrote the following code:
[StructLayout(LayoutKind.Sequential, Pack = 8)]
public struct PlcSymbolDescStruct
{
[MarshalAs(UnmanagedType.LPStr)]
string pszNa;
UInt32 ulTId;
[MarshalAs(UnmanagedType.LPStr)]
string pszT;
[MarshalAs(UnmanagedType.U2)]
ushort usRId;
UInt32 ulOff;
UInt32 ulSi;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = 2)]
string szAcc;
[MarshalAs(UnmanagedType.U1)]
byte bySSize;
}
[DllImport("DataSource.dll", SetLastError = true, CallingConvention = CallingConvention.StdCall)]
[return: MarshalAs(UnmanagedType.U8)]
public static extern long GetAll(ulong ulHandler, out PlcSymbolDescStruct[] ppList, out uint pulNumS);
Unfortunately this is still giving me a heap corruption.
If I try to pack less than 8 it gives (as expected) an access violation, but I expected this to work and I'm not able to figure out what could be wrong.
Any suggestion?
Thanks in advance!
I’m not sure you need Pack = 8. The default packing appears to be more or less compatible between C# and C++.
The szAcc field is probably being marshalled as Unicode. To switch, specify CharSet = CharSet.Ansi in your [StructLayout].
Finally, if your C API returns the array through that double pointer, the memory has to be allocated in a special way: the interop marshaller frees such [out] buffers with CoTaskMemFree, so the native side must allocate them with CoTaskMemAlloc.
It’s easier, also more efficient, to implement 2 functions instead. One to obtain the required size of the array. Another one to write them into caller-allocated buffer, a single pointer in C++ as opposed to double pointer, and UnmanagedType.LPArray in C#.
Update
Allocation should not be necessary, since the documentation states that the call returns a pointer to its internal data structure.
The runtime doesn’t know what was written in that documentation, attempts to free anyway, and crashes. You can use out IntPtr in C# API, and marshal manually.
If you have modern .NET, use unsafe, cast the IntPtr to PlcSymbolDescStruct* raw pointer, and construct ReadOnlySpan<PlcSymbolDescStruct> from that raw pointer.
If you’re stuck with old desktop .NET, call Marshal.PtrToStructure&lt;PlcSymbolDescStruct&gt; in a loop, incrementing the pointer by Marshal.SizeOf&lt;PlcSymbolDescStruct&gt;() bytes between loop iterations. No need for unsafe code in this case.
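That loop can be exercised without the native DLL by simulating the returned block with AllocHGlobal. Item here is a hypothetical trimmed stand-in for PlcSymbolDescStruct, with the pointer-valued fields omitted to keep it blittable:

```csharp
using System;
using System.Runtime.InteropServices;

int size = Marshal.SizeOf<Item>();

// Simulate the block the native GetAll would return: two structs back to back.
IntPtr list = Marshal.AllocHGlobal(size * 2);
Marshal.StructureToPtr(new Item { ulOff = 0, ulSi = 4 }, list, false);
Marshal.StructureToPtr(new Item { ulOff = 4, ulSi = 8 }, list + size, false);

// Managed side: walk the pointer, advancing by SizeOf<T>() each iteration.
var items = new Item[2];
IntPtr p = list;
for (int i = 0; i < items.Length; i++, p += size)
    items[i] = Marshal.PtrToStructure<Item>(p);

// Freeing is only valid here because we allocated the block ourselves; memory
// owned by the DLL must be left alone (or released through the DLL's own API).
Marshal.FreeHGlobal(list);
Console.WriteLine(items[1].ulSi); // 8

// Hypothetical stand-in for PlcSymbolDescStruct.
[StructLayout(LayoutKind.Sequential)]
struct Item
{
    public uint ulOff;
    public uint ulSi;
}
```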
I have a structure:
typedef struct _wfs_bcr_caps
{
WORD wClass;
BOOL bCompound;
BOOL bCanFilterSymbologies;
LPUSHORT lpwSymbologies;
DWORD dwGuidLights[32];
LPSTR lpszExtra;
BOOL bPowerSaveControl;
BOOL bAntiFraudModule;
} WFSBCRCAPS, * LPWFSBCRCAPS;
I need to make a correct copy of this structure in C#.
But I have a problem with the LPUSHORT type. Could someone help me set up the correct marshal attributes for the lpwSymbologies field?
LPUSHORT is just a long pointer to a ushort value. You can marshal it as IntPtr and then read the value using Marshal.ReadInt16, casting the result to ushort (Marshal has no unsigned reader). Another option is described in the Unmanaged to Managed type translation table article, e.g. marshalling an LP&lt;struct&gt; as [In] ref &lt;struct&gt;.
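A minimal sketch of that IntPtr approach, with the native pointer simulated via AllocHGlobal (the field name is taken from the question):

```csharp
using System;
using System.Runtime.InteropServices;

// Simulate the LPUSHORT the native struct would hand us.
IntPtr lpwSymbologies = Marshal.AllocHGlobal(sizeof(ushort));
Marshal.WriteInt16(lpwSymbologies, unchecked((short)0xC0DE));

// Marshal.ReadInt16 returns a signed short; cast it back to ushort.
ushort value = unchecked((ushort)Marshal.ReadInt16(lpwSymbologies));
Marshal.FreeHGlobal(lpwSymbologies);

Console.WriteLine(value.ToString("X4")); // C0DE
```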
This question already has an answer here:
How to pass a nullable type to a P/invoked function [duplicate]
How can I skip the optional parameter (pioRecvPci) in C#?
I think the main problem is that in C the parameter is a pointer so it is possible to supply NULL while in C# the ref keyword on a struct is used which can't be null by definition.
C Code
typedef struct {
DWORD dwProtocol;
DWORD cbPciLength;
} SCARD_IO_REQUEST;
LONG WINAPI SCardTransmit(
__in SCARDHANDLE hCard,
__in LPCSCARD_IO_REQUEST pioSendPci,
__in LPCBYTE pbSendBuffer,
__in DWORD cbSendLength,
__inout_opt LPSCARD_IO_REQUEST pioRecvPci,
__out LPBYTE pbRecvBuffer,
__inout LPDWORD pcbRecvLength
);
C# Code
[StructLayout(LayoutKind.Sequential)]
public struct SCARD_IO_REQUEST
{
public int dwProtocol;
public int cbPciLength;
}
[DllImport("winscard.dll")]
public static extern int SCardTransmit(
int hCard,
ref SCARD_IO_REQUEST pioSendRequest,
ref byte SendBuff,
int SendBuffLen,
ref SCARD_IO_REQUEST pioRecvRequest,
ref byte RecvBuff,
ref int RecvBuffLen);
You can change the struct to a class and then pass null. Remember that a C# struct is quite different from a C++ struct, and here you really want to use a C# class.
Or if you always want to ignore pioRecvRequest change the signature of SCardTransmit so that pioRecvRequest is of type IntPtr. Then pass IntPtr.Zero for the value.
Actually, the SCARD_IO_REQUEST is just a header and if you want to pass it in the call you will have to manage this structure and the additional buffer space yourself anyway so IntPtr is the right choice. You will then have to use the Marshal functions to allocate and fill the structure before the call and unmarshal the data and free it after the call.
C# supports method overloads, something you can take advantage of here: redeclare the method, but now give it an IntPtr argument type (no ref), and pass IntPtr.Zero.
Second way is to marshal the structure yourself. Declare the argument type IntPtr. And use Marshal.AllocHGlobal() to allocate memory for the structure, Marshal.StructureToPtr() to copy the structure into it. Marshal.FreeHGlobal() after the call. Or pass IntPtr.Zero. Clearly the overload trick is much less painful.
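The manual-marshalling sequence can be sketched end to end; the SCardTransmit call itself is commented out because it needs a real card reader, but the alloc/copy/read-back/free steps are the part being described:

```csharp
using System;
using System.Runtime.InteropServices;

IntPtr pioRecv = Marshal.AllocHGlobal(Marshal.SizeOf<SCARD_IO_REQUEST>());
SCARD_IO_REQUEST back = default;
try
{
    var req = new SCARD_IO_REQUEST { dwProtocol = 2, cbPciLength = 8 };
    Marshal.StructureToPtr(req, pioRecv, false);

    // IntPtr overload: pass pioRecv here, or IntPtr.Zero to skip the parameter.
    // int rc = SCardTransmit(hCard, ref sendPci, ref sendBuf[0], sendLen,
    //                        pioRecv, ref recvBuf[0], ref recvLen);

    // After the call, unmarshal whatever the native side wrote back.
    back = Marshal.PtrToStructure<SCARD_IO_REQUEST>(pioRecv);
    Console.WriteLine(back.cbPciLength); // 8
}
finally
{
    Marshal.FreeHGlobal(pioRecv);
}

[StructLayout(LayoutKind.Sequential)]
public struct SCARD_IO_REQUEST
{
    public int dwProtocol;
    public int cbPciLength;
}
```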
I am working with an existing code base made up of some COM interfaces written in C++ with a C# front end. There is some new functionality that needs to be added, so I'm having to modify the COM portions. In one particular case, I need to pass an array (allocated from C#) to the component to be filled.
What I would like to do is to be able to pass an array of int to the method from C#, something like:
// desired C# signature
void GetFoo(int bufferSize, int[] buffer);
// desired usage
int[] blah = ...;
GetFoo(blah.Length, blah);
A couple of wrenches in the works:
C++/CLI or Managed C++ can't be used (COM could be done away with in this case).
The C# side can't be compiled with /unsafe (using Marshal is allowed).
The COM interface is only used (and will only ever be used) by the C# part, so I'm less concerned with interoperability with other COM consumers. Portability between 32 and 64 bit is also not a concern (everything is compiled and run on a 32-bit machine, so the code generators convert pointers to integers). Eventually it will be replaced by plain C++/CLI, but that is a ways off.
My initial attempt
is something similar to:
HRESULT GetFoo([in] int bufferSize, [in, size_is(bufferSize)] int buffer[]);
And the output TLB definition is (seems reasonable):
HRESULT _stdcall GetFoo([in] int bufferSize, [in] int* buffer);
Which is imported by C# as (not so reasonable):
void GetFoo(int bufferSize, ref int buffer);
Which I could use with
int[] b = ...;
fixed(int *bp = &b[0])
{
GetFoo(b.Length, ref *bp);
}
...except that I can't compile with /unsafe.
At the moment
I am using:
HRESULT GetFoo([in] int bufferSize, [in] INT_PTR buffer);
Which imports as:
void GetFoo(int bufferSize, int buffer);
And I need to use it like:
int[] b = ...;
GCHandle bPin = GCHandle.Alloc(b, GCHandleType.Pinned);
try
{
GetFoo(b.Length, (int)Marshal.UnsafeAddrOfPinnedArrayElement(b, 0));
}
finally
{
bPin.Free();
}
Which works..., but I'd like to find a cleaner way.
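The pinning pattern above can be verified standalone; Marshal.ReadInt32 stands in for what the native GetFoo would do with the address:

```csharp
using System;
using System.Runtime.InteropServices;

int[] b = { 10, 20, 30 };
GCHandle bPin = GCHandle.Alloc(b, GCHandleType.Pinned);
int third = 0;
try
{
    IntPtr addr = Marshal.UnsafeAddrOfPinnedArrayElement(b, 0);
    // The native side would read ints starting at addr; simulate one read.
    third = Marshal.ReadInt32(addr, 2 * sizeof(int));
    Console.WriteLine(third); // 30
}
finally
{
    bPin.Free();
}
```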
So, the question is
Is there an IDL definition that is friendly to the C# import from TLB generator for this case? If not, what can be done on the C# side to make it a little safer?
So you're asking for an IDL datatype that is 32-bits on a 32-bit machine and 64-bits on a 64-bit machine. But you don't want the marshaling code to treat it like a pointer, just as an int. So what do you expect to happen to the extra 32-bits when you call from a 64-bit process to a 32-bit process?
Sounds like a violation of physics to me.
If it's inproc only, see the bottom of this discussion: http://www.techtalkz.com/vc-net/125190-how-interop-net-client-com-dll.html.
The recommendation seems to be to use void* instead of INT_PTR and flag the method with [local] so the marshaller doesn't get involved.
I don't know much about C# COM operability, but have you tried using SAFEARRAY(INT_PTR) or something similar?
Hmmm... I've found some information that gets me closer...
Marshaling Changes - Conformant C-Style Arrays
This IDL declaration (C++)
HRESULT GetFoo([in] int bufferSize, [in, size_is(bufferSize)] int buffer[]);
Is imported as (MSIL)
.method public hidebysig newslot virtual instance void GetFoo([in] int32 bufferSize, [in] int32& buffer) runtime managed internalcall
And if changed to (MSIL)
.method public hidebysig newslot virtual instance void GetFoo([in] int32 bufferSize, [in] int32[] marshal([]) buffer) runtime managed internalcall
Can be used like (C#)
int[] b = ...;
GetFoo(b.Length, b);
Exactly what I was gunning for!
But are there any other solutions that don't require fixing up the MSIL of the runtime callable wrapper generated by tlbimp.exe?