There is a GlobalSign CA. In most cases its root certificate already exists in the Windows certificate store.
But sometimes (especially on old Windows versions) the store doesn't contain the certificate.
I need to check whether the certificate exists and import it if it does not. I exported the certificate to a file and imported it using the code below:
public void ImportCertificate(StoreName storeName,
                              StoreLocation location,
                              byte[] certificateData)
{
    X509Store x509Store = new X509Store(storeName, location);
    X509Certificate2 certificate = new X509Certificate2(certificateData);
    x509Store.Open(OpenFlags.ReadWrite);
    x509Store.Add(certificate);
    x509Store.Close();
}
The code adds the certificate, but all certificate purposes end up checked (enabled).
I don't want to add extra purposes to the certificate; I just want to enable the same set of purposes that other root CAs have.
How can I do this programmatically?
You need to use the CertSetCertificateContextProperty function to set store-attached properties.
In the dwPropId parameter you pass CERT_ENHKEY_USAGE_PROP_ID. You can find its numeric value in the Wincrypt.h C++ header file; in this case, dwPropId is 9:
#define CERT_ENHKEY_USAGE_PROP_ID 9
In the dwFlags parameter you pass zero (0).
In the pvData parameter (which is an IntPtr in the managed signature) you pass an unmanaged pointer to an ASN.1-encoded byte array that represents a collection of object identifiers, where each OID represents an explicitly enabled key usage.
Here is the interop signature:
[DllImport("Crypt32.dll", CharSet = CharSet.Auto, SetLastError = true)]
internal static extern Boolean CertSetCertificateContextProperty(
    [In] IntPtr pCertContext,
    [In] UInt32 dwPropId,
    [In] UInt32 dwFlags,
    [In] IntPtr pvData);
Add a using directive for the System.Runtime.InteropServices namespace.
Next, prepare a collection of key usages:
Create a new instance of the OidCollection class.
Add the required OIDs to the collection.
Use the created OID collection to instantiate the X509EnhancedKeyUsageExtension class. This constructor overload is fine: X509EnhancedKeyUsageExtension(OidCollection, Boolean).
The RawData property of the EKU extension will contain the ASN.1-encoded byte array that is passed to the CertSetCertificateContextProperty function.
The last parameter of the CertSetCertificateContextProperty function is of type IntPtr and expects a pointer to an unmanaged memory block, so:
Use Marshal.AllocHGlobal(eku.RawData.Length) to allocate a properly sized buffer in unmanaged memory.
Use the Marshal.Copy(byte[], IntPtr, int, int) static method overload to copy the eku.RawData byte array to the unmanaged pointer acquired in step 1.
Call the CertSetCertificateContextProperty function. If it returns true, everything is OK.
After all work is finished, you must release the unmanaged resources to avoid a memory leak. Use the Marshal.FreeHGlobal(IntPtr) method to release the pointer acquired during the Marshal.AllocHGlobal call.
I lack the reputation to comment on other answers, but Crypt32's answer is actually incorrect in two places, so I'm posting a new answer based on it.
pvData of CertSetCertificateContextProperty cannot accept X509EnhancedKeyUsageExtension.RawData directly; the data needs to be wrapped in a CRYPT_INTEGER_BLOB struct.
The Marshal.Copy overload given is incorrect; it should be Marshal.Copy(Byte[], Int32, IntPtr, Int32).
With this in mind, let's start from the top:
Make sure you obtain the X509Certificate2 you require (Certificate), probably from an X509Store that has been opened as ReadWrite.
Import the interop signature below and add a using directive for the System.Runtime.InteropServices namespace.
[DllImport("Crypt32.dll", CharSet = CharSet.Auto, SetLastError = true)]
internal static extern Boolean CertSetCertificateContextProperty(
    [In] IntPtr pCertContext,
    [In] UInt32 dwPropId,
    [In] UInt32 dwFlags,
    [In] IntPtr pvData);
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
public struct CRYPTOAPI_BLOB
{
    public uint cbData;
    public IntPtr pbData;
}
Create an OidCollection instance
Add the required OIDs of the EKUs you want to the OidCollection
Create an X509EnhancedKeyUsageExtension instance (eku) using the OidCollection
Use Marshal.AllocHGlobal(eku.RawData.Length) to allocate the properly sized buffer in unmanaged memory as pbData
Use Marshal.AllocHGlobal with the size of the CRYPT_INTEGER_BLOB struct to allocate the properly sized buffer in unmanaged memory as pvData
Create an instance of the CRYPT_INTEGER_BLOB struct, assigning pbData to its pbData field and the length, eku.RawData.Length, to cbData
Use Marshal.StructureToPtr to copy the CRYPT_INTEGER_BLOB struct into pvData
Call CertSetCertificateContextProperty with pCertContext set to Certificate.Handle, dwPropId set to CERT_ENHKEY_USAGE_PROP_ID (which is 9), dwFlags set to 0, and pvData as pvData. It returns true if successful. It may fail with a memory access exception if the certificate handle you passed in is read-only, so make sure it comes from an X509Store opened as ReadWrite.
Free allocated unmanaged memory via Marshal.FreeHGlobal(pvData) and Marshal.FreeHGlobal(pbData)
Close any opened X509Store (a minimal sketch combining these steps follows)
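Here is that minimal sketch. It assumes the P/Invoke declaration and the CRYPTOAPI_BLOB struct shown earlier, plus using directives for System, System.Runtime.InteropServices, System.Security.Cryptography and System.Security.Cryptography.X509Certificates. The OID 1.3.6.1.5.5.7.3.1 (Server Authentication) is only an example purpose; substitute the OIDs you actually want:
public static void RestrictPurposes(X509Certificate2 certificate)
{
    const uint CERT_ENHKEY_USAGE_PROP_ID = 9;

    // Build the EKU extension; its RawData is the ASN.1-encoded OID collection.
    OidCollection oids = new OidCollection();
    oids.Add(new Oid("1.3.6.1.5.5.7.3.1")); // Server Authentication (example only)
    X509EnhancedKeyUsageExtension eku = new X509EnhancedKeyUsageExtension(oids, false);

    IntPtr pbData = IntPtr.Zero;
    IntPtr pvData = IntPtr.Zero;
    try
    {
        // Copy the encoded EKU bytes into unmanaged memory.
        pbData = Marshal.AllocHGlobal(eku.RawData.Length);
        Marshal.Copy(eku.RawData, 0, pbData, eku.RawData.Length);

        // Wrap pointer and length in a blob, then marshal the struct itself.
        CRYPTOAPI_BLOB blob = new CRYPTOAPI_BLOB
        {
            cbData = (uint)eku.RawData.Length,
            pbData = pbData
        };
        pvData = Marshal.AllocHGlobal(Marshal.SizeOf(blob));
        Marshal.StructureToPtr(blob, pvData, false);

        // Attach the enhanced key usage property to the certificate context.
        if (!CertSetCertificateContextProperty(certificate.Handle, CERT_ENHKEY_USAGE_PROP_ID, 0, pvData))
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
    }
    finally
    {
        // Release the unmanaged buffers.
        if (pvData != IntPtr.Zero) Marshal.FreeHGlobal(pvData);
        if (pbData != IntPtr.Zero) Marshal.FreeHGlobal(pbData);
    }
}
As noted above, the certificate must come from an X509Store opened with OpenFlags.ReadWrite, or the call will fail.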
Again, thanks to Crypt32 for this answer.
Related
I signed a file using Signtool.exe and now I am trying to load the certificate attached to the file using the following method:
var cert = X509Certificate2.CreateFromSignedFile(filePath);
but this line throws the error "Cannot find the requested object.". When I try reading the certificate from a Microsoft-signed DLL, e.g. EntityFramework.dll, it works without any problems. I thought it could be because I don't have the certificate in the trusted store, but even after adding it there, it continues to throw the error. Does anyone know how to fix this?
You can use the wintrust component to collect the signature information:
[DllImport("wintrust.dll", EntryPoint = "WTGetSignatureInfo", CallingConvention = CallingConvention.StdCall)]
internal static extern int WTGetSignatureInfo(
    [In] [MarshalAs(UnmanagedType.LPWStr)] string pszFile,
    [In] IntPtr hFile,
    SIGNATURE_INFO_FLAGS sigInfoFlags,
    ref SIGNATURE_INFO psiginfo,
    ref IntPtr ppCertContext,
    ref IntPtr phWVTStateData);
This will collect the signature details from any signable file type that Microsoft prescribes. But make sure that you execute the given function under the single-threaded apartment (STA) model; otherwise you will get odd results for signed script files like .js/.vbs, etc.
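For illustration, here is a minimal sketch of forcing a call onto an STA thread (GetSignatureInfo is a hypothetical wrapper around WTGetSignatureInfo, not part of the API; a using directive for System.Threading is assumed):
Thread worker = new Thread(() =>
{
    // e.g. signatureInfo = GetSignatureInfo(filePath); // hypothetical wrapper call
});
worker.SetApartmentState(ApartmentState.STA); // run under the single-threaded apartment model
worker.Start();
worker.Join();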
Please refer to How to validate authenticode for Javascript in C# for more details.
I am using WebClient to get the image data from a URL and am trying to generate a video with it, like so:
//http://www.codeproject.com/Articles/7388/A-Simple-C-Wrapper-for-the-AviFile-Library
WebClient client = new WebClient();
System.Net.WebRequest request = System.Net.WebRequest.Create(images);
System.Net.WebResponse response = request.GetResponse();
System.IO.Stream responseStream = response.GetResponseStream();
Bitmap bitmap = new Bitmap(responseStream);
//create a new AVI file
AviManager aviManager = new AviManager(@"C:\Users\Laptop\Documents\tada.avi", false);
//add a new video stream and one frame to the new file
//set IsCompressed = false
VideoStream aviStream = aviManager.AddVideoStream(false, 2, bitmap);
aviManager.Close();
But it fails at the following point. In the library, on this line
int result = Avi.AVIFileCreateStream(aviFile, out aviStream, ref strhdr);
I get the following error:
System.AccessViolationException: 'Attempted to read or write protected
memory. This is often an indication that other memory is corrupt.'
You probably have "Any CPU" selected at the moment, so your app gets compiled/run as a 64-bit process (on a 64-bit Windows version).
The problem appears to be that the AVI wrapper library was probably never tested with a 64-bit .NET app: its "pinvoke" definitions aren't written so that the parameters are correctly pushed onto/popped off the stack when making the 64-bit API calls.
Change your project's "platform target" setting to x86 so that you avoid the issue and call avifil32.dll in 32-bit mode.
Windows does ship with both a 32-bit and a 64-bit version of that AVI library, so in theory it is possible to call it from a 64-bit process, but you need to define the interop/marshalling pinvoke properly.
c:\windows\system32\avifil32.dll (64bit)
c:\windows\syswow64\avifil32.dll (32bit)
In 32-bit (Microsoft uses the ILP32 data model):
an int is 4 bytes
a pointer is 4 bytes
In 64-bit (Microsoft uses the LLP64, or P64, data model):
an int is (still) 4 bytes
a pointer is (now) 8 bytes
(see https://msdn.microsoft.com/en-us/library/windows/desktop/aa384083(v=vs.85).aspx)
The mistake that often happens is that "pinvoke" definitions use "int" for pointer-typed parameters instead of the correct IntPtr type.
Thus the call works OK on 32-bit (because an int is the same size as a pointer), while on 64-bit they are different sizes.
Other things change when you are 64-bit too, such as the default boundary alignment; this can change the offsets of fields within structures, so you have to be careful when defining your pinvoke C# structures so that they match.
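For illustration only (this struct is hypothetical, not taken from the AVI wrapper), the usual way to keep a managed struct in step with its native counterpart on both 32-bit and 64-bit is to use IntPtr for pointer-sized fields and to match any #pragma pack that the native header declares:
// Hypothetical mirror of a native struct. IntPtr (not int) keeps the pointer field
// the right size on both platforms; only add Pack if the native header packs too.
[StructLayout(LayoutKind.Sequential)]
struct NativeHeader
{
    public uint Magic;     // DWORD: 4 bytes on both x86 and x64
    public IntPtr Payload; // pointer: 4 bytes on x86, 8 bytes on x64
}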
In case you are interested, the Win32 signature of AVIFileCreateStream is as follows:
STDAPI AVIFileCreateStream(
PAVIFILE pfile,
PAVISTREAM *ppavi,
AVISTREAMINFO *psi
);
And the "types" of its parameters are:
typedef IAVIFile *PAVIFILE; // i.e. just a pointer
typedef IAVIStream *PAVISTREAM; // i.e. just a pointer
typedef struct {
DWORD fccType;
DWORD fccHandler;
DWORD dwFlags;
DWORD dwCaps;
WORD wPriority;
WORD wLanguage;
DWORD dwScale;
DWORD dwRate;
DWORD dwStart;
DWORD dwLength;
DWORD dwInitialFrames;
DWORD dwSuggestedBufferSize;
DWORD dwQuality;
DWORD dwSampleSize;
RECT rcFrame;
DWORD dwEditCount;
DWORD dwFormatChangeCount;
TCHAR szName[64];
} AVISTREAMINFO;
That wrapper library defined the .NET "pinvoke" for AVIFileCreateStream using this:
//Create a new stream in an open AVI file
[DllImport("avifil32.dll")]
public static extern int AVIFileCreateStream(
int pfile,
out IntPtr ppavi,
ref AVISTREAMINFO ptr_streaminfo);
Immediately, you can see that the first parameter is defined incorrectly.
When the call is made, only 4 bytes are placed onto the stack for the first parameter instead of 8; for the second parameter (which is a pointer to a pointer, i.e. the address where an address will be written), 8 bytes are pushed because IntPtr was used; and the third parameter is the address of an AVISTREAMINFO structure.
Thus, when AVIFileCreateStream is called it reads those parameters off the stack, but they are basically junk: it tries to use a pointer with the wrong value, because only 4 bytes of the first parameter's pointer value came through and the remaining 4 bytes (of the 8-byte pointer) are filled from whatever is next on the stack. The pointer address is therefore highly likely to be garbage, which is why you get the access violation.
The way it should have been defined is something like this (note there are other ways to achieve the same):
[DllImport("avifil32.dll", SetLastError=true)]
public static extern int AVIFileCreateStream(IntPtr pfile, out IntPtr ppavi, ref AVISTREAMINFO psi);
I'm trying to call the LoadLibrary method, but it returns 0. Marshal.GetLastWin32Error returns 126 (The specified module could not be found).
Code:
[DllImport("kernel32", SetLastError = true, CharSet = CharSet.Ansi)]
static extern IntPtr LoadLibrary([MarshalAs(UnmanagedType.LPStr)]string lpFileName);
string path = @"C:\junk\测试\BlueStacksKK_DeployTool_2.5.48.7209_china_gmgr\ProgramFiles\BstkC.dll";
IntPtr ptr = LoadLibrary(path);
int error = Marshal.GetLastWin32Error();
But if I move this file to some other location like C:\Test\BstkC.dll, it works fine.
The issue could be due to 测试 in the path. So if we have a directory name in a language other than English, how can this work?
Just for your information: File.Exists(path) returns true.
You have to set the character set to Unicode, since your path contains characters that cannot be represented in the ANSI code page:
[DllImport("kernel32", CharSet=CharSet.Unicode)]
static extern IntPtr LoadLibrary(string lpFileName);
Your current declaration resolves to the LoadLibraryA (ANSI) variant. See MSDN.
Try:
[DllImport("kernel32", SetLastError = true)]
static extern IntPtr LoadLibraryW([MarshalAs(UnmanagedType.LPWStr)]string lpFileName);
The underlying Win32 API comes in two flavors: ANSI mode (the "A" functions, which only accept strings in the current ANSI code page) and Unicode mode (the "W" functions, which accept UTF-16 strings).
C# strings are UTF-16. Basically, you invoked the ANSI-flavored function with a UTF-16 string; you need to explicitly tell the CLR that you want the Unicode-flavored function (LoadLibraryW) and to keep the UTF-16 encoding of the C# string (by using LPWStr).
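For illustration, a minimal sketch combining the Unicode declaration above with error checking (the path is the one from the question; the wrapper method name is made up):
[DllImport("kernel32", SetLastError = true)]
static extern IntPtr LoadLibraryW([MarshalAs(UnmanagedType.LPWStr)] string lpFileName);

static IntPtr LoadBstk()
{
    string path = @"C:\junk\测试\BlueStacksKK_DeployTool_2.5.48.7209_china_gmgr\ProgramFiles\BstkC.dll";
    IntPtr module = LoadLibraryW(path);
    if (module == IntPtr.Zero)
    {
        // If this still reports 126, it can mean a dependency of BstkC.dll was not found.
        throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
    }
    return module;
}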
I am facing an issue with the LogonUser function.
I just want to know if I can import the LogonUser function into C# using this signature:
[DllImport("advapi32.dll", SetLastError = true)]
internal static extern int LogonUser(string username, string domain, IntPtr password, int logonType, int logonProvider, ref IntPtr token);
This is because I want to secure my password using the SecureString class rather than a string. The function is later used like below:
var passwordPtr = Marshal.SecureStringToGlobalAllocUnicode(password);
var result = LogonUser(userName, domain, passwordPtr, LOGON32_LOGON_INTERACTIVE, LOGON32_PROVIDER_DEFAULT, ref token);
I always get result = 0 and the message says that the user name or password is incorrect.
But when I change the signature to use a string password, everything works fine.
Please help; securing the password with SecureString is important to me.
As pointed out by Alex K, there's an example of using LogonUser in the SecureStringToGlobalAllocUnicode documentation. Note that the P/Invoke declaration there is:
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
internal static extern bool LogonUser(String username, String domain, IntPtr password,
int logonType, int logonProvider, ref IntPtr token);
Note that CharSet = CharSet.Unicode has been specified there. Unfortunately, for historical reasons, the default CharSet value is Ansi, and that is what your P/Invoke declaration is going to use.
This will be fine for the username parameter since the P/Invoke infrastructure will ensure that it converts the string appropriately. But it's not appropriate for the password parameter since you've already performed the string conversion and you've done it as Unicode - and all that P/Invoke is seeing is an IntPtr now.
I'd suggest updating your P/Invoke signature to match that given in the sample.
Another alternative would be to switch to SecureStringToGlobalAllocAnsi and leave your P/Invoke signature alone. But that is a second-rate solution; writing non-Unicode-aware code in 2015 is really not to be recommended.
Just get in the habit of always specifying CharSet.Unicode in any P/Invoke signatures you write.
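For illustration, a minimal sketch using the Unicode declaration from that sample. The LOGON32_* values (2 and 0) are the standard interactive-logon and default-provider constants, CloseHandle is an extra kernel32 P/Invoke the caller needs for the returned token, and using directives for System, System.Security and System.Runtime.InteropServices are assumed:
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
internal static extern bool LogonUser(String username, String domain, IntPtr password,
    int logonType, int logonProvider, ref IntPtr token);

[DllImport("kernel32.dll", SetLastError = true)]
internal static extern bool CloseHandle(IntPtr handle);

internal static IntPtr Logon(string userName, string domain, SecureString password)
{
    const int LOGON32_LOGON_INTERACTIVE = 2;
    const int LOGON32_PROVIDER_DEFAULT = 0;

    IntPtr token = IntPtr.Zero;
    IntPtr passwordPtr = IntPtr.Zero;
    try
    {
        // Unicode copy of the SecureString contents in unmanaged memory.
        passwordPtr = Marshal.SecureStringToGlobalAllocUnicode(password);
        if (!LogonUser(userName, domain, passwordPtr,
                       LOGON32_LOGON_INTERACTIVE, LOGON32_PROVIDER_DEFAULT, ref token))
        {
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
        }
        return token; // the caller is responsible for calling CloseHandle(token)
    }
    finally
    {
        if (passwordPtr != IntPtr.Zero)
            Marshal.ZeroFreeGlobalAllocUnicode(passwordPtr); // wipe and free the unmanaged copy
    }
}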
I've had some problems building a correct .dll from a C++ project for my C# project.
I played around with the C++ project properties and got a .dll file which I can add and reference in my C# web project. I use DllImport to make a function call into the .dll like this:
[DllImport("Filename.dll", CharSet = CharSet.Ansi)]
static extern void Function1([MarshalAs(UnmanagedType.LPStr)] string src,
int srcLen,
[MarshalAs(UnmanagedType.LPWStr)] StringBuilder dst,
int dstLen);
The c++ function header is:
__declspec(dllimport) void Function1(unsigned char *src,
unsigned long srclen,
unsigned char *dst,
unsigned long dstlen);
I'm calling Function1 in C# with this:
string strSrc = "Something";
StringBuilder strDest = new StringBuilder(kryptlen-1);
int l = strSrc.Length;
Function1(strSrc, l, strDest, l);
No exceptions or errors are occurring, though I'm not getting the output I'm expecting. The function is a decryption method that takes an encrypted string (src) and returns the decrypted version of it (dst).
Is it the way I've generated the .dll file, or am I calling the function the wrong way? I'm running out of ideas; I've tried most combinations.
Thanks in advance!
C++ by default (unless changed) uses a caller-cleans-the-stack calling convention (Cdecl). Your C++ code does not change the calling convention. Your C# code by default (unless you change it) will use a callee-cleans-the-stack convention (StdCall).
While this might not be exactly the problem you're having, it is still technically incorrect. Even if you were to fix your current problem, you would likely end up having a problem because of the calling convention.
"No exceptions or errors are occurring, though I'm not getting the output I'm expecting. The function is a decrypting method that takes an encrypted string (src) and returns the decrypted version of this (dst)."
What exactly are you getting?
The solution to the calling convention problem is to declare the calling convention.
[DllImport("Filename.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.Cdecl)]
static extern void Function1([MarshalAs(UnmanagedType.LPStr)] string src,
    int srcLen,
    [MarshalAs(UnmanagedType.LPWStr)] StringBuilder dst,
    int dstLen);
Where'd UnmanagedType.LPWStr come from? There are no wide strings in the C++ declaration. You're also passing the source length twice, while the variable names suggest you need the source length and the destination buffer capacity.
If src is encrypted data as you say, the correct p/invoke signature is probably:
[DllImport("Filename.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void Function1(byte[] src,
    UInt32 srcLen,
    [MarshalAs(UnmanagedType.LPStr)] StringBuilder dst,
    UInt32 dstLen);
Trying to force binary data into a Unicode string is a losing proposition.
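For illustration, a hedged usage sketch of that signature; how the encrypted bytes are obtained is an assumption (here they are Base64-decoded from a value stored in the database, held in a hypothetical cryptedStringFromDatabase variable):
// Hypothetical input: the encrypted payload as raw bytes rather than a text string.
byte[] encrypted = Convert.FromBase64String(cryptedStringFromDatabase);
StringBuilder decrypted = new StringBuilder(300);
Function1(encrypted, (uint)encrypted.Length, decrypted, (uint)decrypted.Capacity);
string plainText = decrypted.ToString();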
After some time changing the settings back and forth, I noticed that the data I started with was the same data I had after a decrypt and an encrypt, so I realized that the encryption/decryption was working. The problem was the data. When I had the correct data, I got the correct output!
This is the setup I ended up with; I hope it can help someone:
string cryptedString = "CryptedStringFromDataBase";
StringBuilder decryptedString = new StringBuilder(300);
Decrypt(cryptedString, (uint)cryptedString.Length, decryptedString, 300);
[DllImport("Filename.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.Cdecl)]
static extern void Decrypt(string src, UInt32 srcLen, [MarshalAs(UnmanagedType.LPStr)] StringBuilder dst, UInt32 dstLen);