I have a worker role that uses third-party WCF services. The developers of the WCF services told me to add two certificates to my Azure Trusted Root store for authentication.
When I try to add them via the Certificates tab in Roles --> Properties, it is not allowed.
So I came up with a solution: I wrote code in a console application, shipped along with these two certificates (.cer files), that adds them to the desired store, i.e. LocalMachine --> Trusted Root, when the worker role executes.
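For reference, the installation code is presumably something along these lines (a minimal sketch; the file name is a placeholder, and the process must run elevated to write to the LocalMachine store):

using System.Security.Cryptography.X509Certificates;

// Import a .cer into LocalMachine\Root (Trusted Root Certification Authorities).
var cert = new X509Certificate2("thirdparty-ca.cer");   // placeholder file name
var store = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadWrite);                        // requires administrative rights
store.Add(cert);
store.Close();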
However, my application still throws an error saying "Keyset does not exist".
I have checked through RDP and found that the .cer files are indeed installed in this location.
I have also tried to find the private key, but I am unable to do so; I think this is because these are ".cer" files.
The problem is that your files are .cer files, which do not store a private key. One of these certificates (the leaf, or authentication, certificate) must be in a PFX. Commonly they have a .pfx or .p12 extension.
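For illustration, a minimal sketch of the difference (file names and password are placeholders):

using System;
using System.Security.Cryptography.X509Certificates;

// A .cer file holds only the public certificate:
var publicOnly = new X509Certificate2("client.cer");    // placeholder file name
Console.WriteLine(publicOnly.HasPrivateKey);            // always False

// A .pfx/.p12 bundles the certificate with its private key:
var withKey = new X509Certificate2("client.pfx", "pfx-password",
    X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet);
Console.WriteLine(withKey.HasPrivateKey);               // True for a proper PFX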
Related
I am currently trying to access a service provided by a 3rd party. They have issued us a certificate in PKCS format. The certificate is installed in the Local Computer - Trusted Root store.
Our application at run time finds this certificate and sends it to the authentication URL hosted by the 3rd party, where it is authenticated and the SAML tokens are issued. This is then used to call the actual service that does the functionality we desire.
When I run my application consuming this service via the service reference on the development machine [Windows 7], everything works smoothly.
Now the pain point: since we have a Citrix environment where the testing takes place, we get this error:
Exception in METHOD: SOAP security negotiation with "Service URL" for target failed. Inner Exception: System.Security.Cryptography.CryptographicException: Keyset does not exist.
Can anyone help me resolve this issue? I am unable to reproduce it locally, and it happens only on the server OS, so I am not sure whether it is a privileges issue or a code issue.
There are two likely causes of this issue:
The certificate does not have a private key.
The user your process runs as does not have permissions to read the private key.
As you already have this up and running in your development environment, let's assume the cause is 2.
If you don't know it already, you need to determine the user account that your process runs as on your test server. Then open MMC on the test server and add the Certificates snap-in. Find the certificate, right-click and choose All Tasks | Manage Private Keys... and grant read access to the user.
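To confirm which cause applies, a quick diagnostic sketch like the following can help; run it as the same identity as the failing process (the store name and thumbprint are assumptions, adjust to match your certificate):

using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var found = store.Certificates.Find(X509FindType.FindByThumbprint, "THUMBPRINT", false);
store.Close();
if (found.Count == 0) { Console.WriteLine("Certificate not found."); return; }

var cert = found[0];
Console.WriteLine($"HasPrivateKey: {cert.HasPrivateKey}");  // False -> cause 1
try
{
    var _ = cert.PrivateKey;                                // throws for cause 2
    Console.WriteLine("Private key is readable.");
}
catch (CryptographicException ex)
{
    Console.WriteLine($"Cannot read private key: {ex.Message}");
}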
Read lots more about this at:
X509Certificate - Keyset does not exist
CryptographicException 'Keyset does not exist', but only through WCF
Service failure with CryptographicException – Keyset does not exist
Wcf: Keyset does not exist
Thanks for the information. The root cause of this issue was a permissions issue with the certificate. Since the certificate was installed on the server with Admin privileges, permission had to be granted to all users to access the certificate.
I'm integrating my app with Xero which requires two certificates. I uploaded them to Azure with help from this article, but I'm still unable to connect to the Xero API. I'm hoping someone has experience integrating a Xero Partner Application with an Azure Web App.
I've uploaded two pfx files; one is a self-signed certificate and the other is the partner certificate issued by Xero. The latter pfx file contains two certificates: an Entrust Commercial Private Sub CA1 (whatever that means) and a unique Entrust Id certificate for my app.
I'm using the following code to load the certificates by their unique thumbprint:
static X509Certificate2 GetCertificateFromStore(string thumbprint)
{
    // Defaults to the current user's personal (My) store.
    var store = new X509Store(StoreLocation.CurrentUser);
    try
    {
        // Strip anything that is not alphanumeric (e.g. spaces or hidden
        // characters copied from certificate tools) and normalize the case.
        thumbprint = Regex.Replace(thumbprint, @"[^\da-zA-Z]", string.Empty).ToUpper();
        store.Open(OpenFlags.ReadOnly);
        var certCollection = store.Certificates;
        // Only consider certificates that are valid right now.
        var currentCerts = certCollection.Find(X509FindType.FindByTimeValid, DateTime.Now, false);
        var signingCert = currentCerts.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (signingCert.Count == 0)
        {
            throw new Exception($"Could not find Xero SSL certificate. cert_name={thumbprint}");
        }
        return signingCert[0];
    }
    finally
    {
        store.Close();
    }
}
This works fine locally, but on my Azure web site I get a 403.7 error:
The page you are attempting to access requires your browser to have a Secure Sockets Layer (SSL) client certificate that the Web server recognizes.
I've also looked at the following references to try and resolve the issue:
Xero Partner SSL configuration in Azure (Uses a cloud service and not a web app, so I couldn't follow the steps at the end)
403 Forbidden when loading X509Certificate2 from a file (a thread posted on the Xero forums about the same issue; the resolution, once again, applies only to cloud services)
Xero partner connections and Azure Websites (Posted solution suggests using a VM)
What I haven't tried yet:
Converting my web app to a cloud service; I'm trying to avoid doing this, however, as I'm not sure what steps are involved.
Using a VM; I haven't found any detailed steps on how to do this, but it seems like a better option than the above.
Screenshot of the error:
A 403 error means we are not seeing the Xero Entrust certificate in the connection.
More details about it here - http://blog.xero.com/developer/api-overview/http-response-codes/#403
Basically, it runs on your local IIS instance because that is a "single tenant" machine, where your application doesn't need to be isolated from others.
On Azure, your application is blocked by the security model used to isolate web sites.
In summary, you have to do the following to get your certificates working on Azure:
1) Export the certificate, private key, and all intermediary certificates into a PFX file.
2) Upload the certificate using the Azure portal to the cloud service that you're running (it should appear as multiple entries).
3) Access the certificate through the machine store in code (see the sketch below).
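A minimal sketch of step 3, assuming the certificate was uploaded as a service certificate and "THUMBPRINT" stands in for the value shown in the portal:

// Cloud service roles install uploaded service certificates into LocalMachine.
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var matches = store.Certificates.Find(X509FindType.FindByThumbprint, "THUMBPRINT", false);
store.Close();
var xeroCert = matches.Count > 0 ? matches[0] : null;  // null: the upload/deployment step failed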
Based on data taken from:
https://community.xero.com/developer/discussion/626401
https://social.msdn.microsoft.com/Forums/azure/en-US/29b30f25-eea9-4e8e-8292-5ac8085fd42e/access-to-certificates-in-azure-web-sites
I hope this solves your issue.
Finally got this working and I will post my solution which will hopefully save developers a lot of time and frustration when connecting with Xero.
The Xero Partner Application will not work with Azure App Services (Web Sites). You have to upload two additional certificates along with your self-signed and the Xero partner certificates. These can be found on your local machine and can be exported in .cer format (details of these certificates below). Not being able to upload these certificates for Azure App Services is indeed the crux: they also have to be placed in specific stores (Root/CA), which you cannot do with App Services. These are the steps I took to connect with Xero.
Converted my web site to an Azure Cloud Service: I was wary of changing our environment as we already have a live site. It turns out that a cloud service is essentially the same as an app service; you are still deploying to a VM somewhere. However, you have more control over the back end and you can remote desktop in. Read more here. I used the links below to create and convert my website to a cloud service:
Create a new cloud service project
Convert existing web site to cloud service
Uploaded 4 certificates to my cloud project using the Azure portal. You will need to upload the following:
Your self-signed certificate (the one you created here)
The partner certificate issued by Xero (you probably got it here)
The Intermediate Entrust certificate (this one should be contained within the .p12 file you downloaded above)
The Entrust Root certificate (this should be in your Trusted Root Store**)
Added the certificates to my Web Role in the cloud project. You have to right-click on the properties of your web role and go to the Certificates tab. Add all 4 certificates to your web role using the thumbprints, which will be visible in the portal after you upload them. Take note of the Store Name for the two Entrust certs.
You may need a lot of patience, as I did, to get through step one. You'll have to figure out the new deployment process, how to debug your project locally, and probably a lot of other frustrating tidbits!
**This is the correct Entrust Root certificate, which you can find by using certmgr.msc.
Make sure you added the application setting from step 2 of your referenced article.
Adding an app setting named WEBSITE_LOAD_CERTIFICATES with its value set to the thumbprint of the certificate will make it accessible to your web application. You can have multiple comma-separated thumbprint values, or can set this value to "*" (without quotes), in which case all your certificates will be loaded into your web application's personal certificate store.
I'd also be more specific in specifying the certificate store, i.e. use:
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
This is how we load certs in all of our Azure Web Apps.
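Putting it together, a sketch of the full lookup for a Web App (assuming WEBSITE_LOAD_CERTIFICATES is set; "THUMBPRINT" is a placeholder). Note the contrast with cloud services: a Web App reads from the CurrentUser store, not LocalMachine:

var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var found = store.Certificates.Find(X509FindType.FindByThumbprint, "THUMBPRINT", false);
store.Close();
var cert = found.Count > 0 ? found[0] : null;  // null usually means the app setting or upload is missing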
In my ASP.NET application I'm loading a certificate from the certificate store:
var myCert = CertificateUtils.GetCertificate("thumbprint");
This certificate contains a key pair which is used to decrypt the encrypted application settings.
The certificate is installed in the Personal certificate store under Local Computer. It works well when the application is running under IIS Express, but if I execute it under the full IIS web server, the myCert instance is missing the private key.
The PrivateKey field of myCert object contains an exception:
'myCert.PrivateKey' threw an exception of type 'System.Security.Cryptography.CryptographicException'
I have checked that the other fields of the myCert object contain the same values (for example, the certificate serial number, thumbprint, and expiration), so it seems it's getting the same certificate under both IIS and IIS Express. Only the private key is missing in the case of full IIS.
The only thing I have changed was the Local Development Server in the project's properties ("Use IIS Express" / "Use IIS Web Server"). It's running inside the Azure Emulator Express in both cases.
Does anyone have an idea why this is happening?
When running on IIS Express, the program uses your credentials to access the certificate, while on IIS the pool identity's credentials are used. You can easily check the certificate's ACL to see who is allowed and who is not.
Follow these steps:
Check what Application Pool your web site uses
Open Internet Information Services Manager, select Sites in the Connections tree on the left. Select your site in the middle panel and click Basic settings under Actions on the right panel.
Check what identity the Application Pool uses
Select Application Pools in the Connections tree on the left and find the identity in the middle panel. It'll be probably "NETWORK SERVICE".
Add read permissions for the identity used by Application Pool to your certificate
Open the Microsoft Management Console (mmc), add the Certificates snap-in for the local Computer account, and find your certificate under Personal certificates. Open its context menu: All Tasks and Manage Private Keys.... Click Add..., enter the identity ("NETWORK SERVICE"), then click Check Names and OK. Under Permissions, allow only the Read permission.
You can read details in this question: How to give ASP.NET access to a private key in a certificate in the certificate store?
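If you want to inspect the key file's ACL directly, a diagnostic sketch like this (an assumption: the certificate uses a classic CAPI/RSA key; the thumbprint is a placeholder) prints the key container file name, which lives under C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys:

using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var cert = store.Certificates.Find(X509FindType.FindByThumbprint, "THUMBPRINT", false)[0];
store.Close();

// Only RSACryptoServiceProvider (CAPI) keys expose container info this way;
// note that reading PrivateKey itself throws if the identity lacks access.
if (cert.PrivateKey is RSACryptoServiceProvider rsa)
    Console.WriteLine(rsa.CspKeyContainerInfo.UniqueKeyContainerName);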
I was having this problem when debugging the application: "'.PrivateKey' threw an exception of type 'System.Security.Cryptography.CryptographicException'".
I solved it like this:
In mmc > Local Computer > Personal > Certificates > right-click on the certificate > All Tasks > Manage Private Keys:
Add the "Everyone" user and select Full Control.
I'm trying to connect to a service that a 3rd party company is publishing. For the authentication part, we use two certificates, one with a public key and one with a private key.
I've made a console application just to test the certificates in different stores, with the following possibilities:
Location: Current User; Store: Personal
Location: Local Machine; Store: Personal (installed with admin user. I don't have admin permissions)
It was working until I changed to another computer this week. I've tested on other machines and it works with both configurations. But mine only works when I try the Current User location. Why? My application needs to use the Local Machine location.
The only possibility I can think of is some kind of permission issue. But I'm not finding any clue on the web. All the similar links say something about bindings, wrong certificates, overriding ServiceCallBack, etc.
Does anyone know if any permission is needed to use a certificate from LocalMachine?
Note: the application can find the certificate, but when it uses it I get the following error:
Could not establish trust relationship for the SSL/TLS secure channel with authority 'name-of-certificate'
Note: I know there are other posts similar to this, but the problem/scenario here is really different.
Possibly the identity of the application pool lacks the rights to read the private key of the certificate from the Local Machine store.
To add the permission, go to the Certificates snap-in, right-click the certificate, select All Tasks and Manage Private Keys. From there, add the application pool identity.
Also, as always, make sure that the application pool's "Load user profile" setting is set to true.
I have installed a .pfx to my Azure website using the management portal upload certificate.
I am now trying to access them using the code below:
var certificateStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
certificateStore.Open(OpenFlags.ReadOnly);
// Collect the subject of every certificate visible in this store.
var certificates = certificateStore.Certificates;
StringBuilder sb = new StringBuilder();
foreach (var certificate in certificates)
{
    sb.AppendLine(certificate.Subject);
}
When published to Azure, a bunch of certificates are listed, but not the one that I have uploaded.
The certificates listed are here:
CN=WW.azurewebsites.windows.net, OU=CIS(RD), O=Microsoft
CN=FullOSTransport
CN=client.geo.to.stamp.azurewebsites.windows.net
CN=ma.waws-prod-am2-005.azurewebsites.windows.net, OU=OrganizationName, O=Microsoft, L=Redmond, S=WA, C=US
CN=FullOSTransport
CN=FullOSTransport
I purchased the certificate from Verisign; it appears to be uploaded correctly and does appear in the 'HTTPS' bar in the browser (in Chrome).
Any help would be really appreciated as I'm at a loss here.
Update
It looks like we would need to convert to a Cloud Service for the above code to work. But can I add the certificates to my app_data folder as suggested here?
http://blog.tylerdoerksen.ca/2015/11/29/pfx-certificate-files-and-azure-web-apps/
This seems to work for Azure-Websites without the use of web roles.
Thanks
I have faced a similar issue; below is the solution that worked for me.
Solution:
Once you have uploaded your certificate through the Azure portal, you need to add an app setting (also through the portal) called WEBSITE_LOAD_CERTIFICATES and set its value to the thumbprint of your uploaded certificate. This can be a comma-separated list of multiple thumbprints if you want, or even * to load all your uploaded certificates.
Then load your certificate using the code below.
// Azure Web Apps loads certificates listed in WEBSITE_LOAD_CERTIFICATES
// into the current user's personal (My) store.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var certs = store.Certificates.Find(X509FindType.FindByThumbprint, YOUR_THUMBPRINT, false);
store.Close();
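If certs comes back empty on Azure but not locally, double-check that the WEBSITE_LOAD_CERTIFICATES app setting is present and that its value matches the thumbprint shown in the portal exactly.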
I recently had to go through this process for an Azure Web Site, so these are the things I would try, in this order, to save you time.
What can you do to debug?
First, remote into the machine and find out whether the certificate exists there. You can check using mmc.exe and the Certificates snap-in. See here for complete instructions.
In the case of an Azure Web Site, you have to enable remote desktop via the Azure Management Portal, and then create a session into the VM that has your Web Site deployed.
Deploying certificates
If the certificate does not exist, you will have to deploy it. For testing, you could do it manually by going into the VMs using the remote session and importing the certificate.
In the case of a Web Site, if you want the certificate deployed automatically, you will have to update the service definition files for that role to make sure it is deployed properly. Also keep in mind that your certificate should be uploaded as a "Service Certificate" and not a "Management Certificate" if you want your roles to be able to use it. If you are using Visual Studio, you could also add it to your project, and that may deploy it.
Permissions
Additionally (and especially if you manually deployed the certificate, e.g. on a VM), you will need to check that IIS has permission to access the certificate. This page here explains deploying certificates and how to give appropriate permissions. If your certificate is included in the deployment package, this is not necessary, as the Azure deployment will take care of it.
FYI: It works locally because the certificate already exists in the store your code is looking into, and nothing will remove it (unless you do so manually), so it is still there every time you deploy locally (assuming your local and Azure deployments are otherwise exactly the same). In many cases the local environment and the Azure cloud environment are (unfortunately) different, because Azure provisions clean VMs and everything needs to be deployed properly, whereas our local machines accumulate a lot of "leftovers".