I have installed a .pfx to my Azure website using the management portal upload certificate.
I am now trying to access it using the code below:
X509Store certificateStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
certificateStore.Open(OpenFlags.ReadOnly);
var certificates = certificateStore.Certificates;
StringBuilder sb = new StringBuilder();
foreach (var certificate in certificates)
{
sb.AppendLine(certificate.Subject);
}
When published to Azure, a bunch of certificates are listed, but not the one that I have uploaded.
The certificates listed are here:
CN=WW.azurewebsites.windows.net, OU=CIS(RD), O=Microsoft
CN=FullOSTransport
CN=client.geo.to.stamp.azurewebsites.windows.net
CN=ma.waws-prod-am2-005.azurewebsites.windows.net, OU=OrganizationName, O=Microsoft, L=Redmond, S=WA, C=US
CN=FullOSTransport
CN=FullOSTransport
I purchased the certificate from Verisign; it appears to be uploaded correctly and shows up in the HTTPS bar in the browser (in Chrome).
Any help would be really appreciated as I'm at a loss here.
Update
It looks like we would need to convert to a Cloud Service for the above code to work. But can I add the certificates to my app_data folder as suggested here?
http://blog.tylerdoerksen.ca/2015/11/29/pfx-certificate-files-and-azure-web-apps/
This seems to work for Azure-Websites without the use of web roles.
Thanks
I have faced a similar issue; below is the solution that worked for me.
Solution:
Once you have uploaded your certificate through the Azure portal, you need to add an app setting (also through the portal) called WEBSITE_LOAD_CERTIFICATES and set its value to the thumbprint of your uploaded certificate. This can be a comma-separated list of multiple thumbprints if you want, or even * to load all your uploaded certificates.
Then load your certificate using the code below.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var certs = store.Certificates.Find(X509FindType.FindByThumbprint, YOUR_THUMBPRINT, false);
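As a sanity check, here is a slightly fuller sketch of the same lookup with a guard for the case where the thumbprint is not found (the thumbprint string below is a placeholder):
using System;
using System.Security.Cryptography.X509Certificates;

// Assumes WEBSITE_LOAD_CERTIFICATES is set in the portal; App Service then
// loads the uploaded certificate into the web app's CurrentUser\My store.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
try
{
    store.Open(OpenFlags.ReadOnly);
    var certs = store.Certificates.Find(
        X509FindType.FindByThumbprint, "YOUR_THUMBPRINT", false);
    if (certs.Count == 0)
        throw new InvalidOperationException("Certificate not found in CurrentUser\\My.");
    var certificate = certs[0];
    // ... use the certificate ...
}
finally
{
    store.Close();
}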
I recently had to go through this process for an Azure Web Site, so these are the things I would try, in this order, to save time.
What you can do to debug
First, remote into the machine and check whether the certificate exists there. You can do that using mmc.exe and the Certificates snap-in. See here for complete instructions.
In the case of an Azure Web Site, you have to enable remote desktop through the Azure Management Portal and then create a session into the VM that has your Web Site deployed.
Deploying certificates
If the certificate does not exist, you will have to deploy it. For testing, you could do it manually by going into the VMs over the remote session and importing the certificate.
In the case of a Web Site, if you want the certificate deployed automatically, you will have to update the service definition files for that role to make sure the certificate is deployed properly. Also keep in mind that your certificate should be uploaded as a "Service Certificate" and not a "Management Certificate" if you want your roles to be able to use it. If you are using Visual Studio, you could also add the certificate to your project, and that may deploy it for you.
Permissions
Additionally (and especially if you deployed the certificate manually, e.g. on a VM), you will need to check that IIS has permission to access the certificate. This page here explains deploying certificates and how to grant the appropriate permissions. If your certificate is included in the deployment package, then this is not necessary, as the Azure deployment will take care of it.
FYI: it works locally because the certificate already exists in the store your code is looking into, and nothing will remove it unless you do so manually, so if you deployed locally again the certificate would still be there (assuming your local deployment and your Azure deployment are otherwise identical). In many cases the local environment and the Azure environment are, unfortunately, different: Azure provisions clean VMs and everything needs to be deployed explicitly, whereas local machines accumulate a lot of "leftovers".
Related
I've been reading a lot lately about managing secrets with Azure Key Vault. I managed to create and install a .pfx certificate on a server running Ubuntu 20.04 and uploaded the certificate to my Azure AD following these steps.
The certificate is found correctly before connecting to my Key Vault and the secrets are retrieved when I am in development both from Windows and Linux (WSL). However, when I deploy the app to my production server, the service I created to manage kestrel throws a 'core-dump' error, similar to this issue.
But in my case, when I check the journal, I find the following:
Unhandled exception. System.InvalidOperationException: Sequence contains no elements
Surprisingly, this doesn't happen if I just manually run the application by using "dotnet app.dll".
How is this even possible? It opens the store, finds the certificate, and accesses the secrets if I run it manually, but doesn't find anything when it is run by the service.
This is the relevant code I am using to configure the access to Key Vault in my Program.cs:
// Azure Key Vault configuration.
using var store = new X509Store(StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var certs = store.Certificates.Find(X509FindType.FindByThumbprint, configuration["KeyVault:AzureADCertThumbprint"], false);
configuration.AddAzureKeyVault(
    new Uri($"https://{configuration["KeyVault:KeyVaultName"]}.vault.azure.net/"),
    new ClientCertificateCredential(
        configuration["KeyVault:AzureADDirectoryId"],
        configuration["KeyVault:AzureADApplicationId"],
        certs.OfType<X509Certificate2>().Single()),
    new KeyVaultSecretManager());
store.Close();
Can anyone help me to find the issue? Thanks in advance.
I checked some Microsoft docs and I didn't find anything wrong with the code used to access the Key Vault. I guess you may have forgotten something in the application settings on the portal, as your code has no problem.
After creating your certificate, configure Azure AD and associate the certificate; after that you can access Azure Key Vault from a .NET client using the X509 certificate.
After importing the certificate and converting the certificate data to Base64, you should create an Azure Resource Manager AD application and service principal. Successfully configuring the Azure Resource Manager application and service principal for the Azure Key Vault could solve the problem you are facing.
Check these articles, Accessing Azure Key Vaults Using Certification and Setup Key Vault using Azure AD Application and Certificates, for more information.
I have created a Blazor WebAssembly app with a server backend using Identity Server out-of-box (from the template).
I want to publish it to Azure, but I can't get it working when loading the certificate from Azure Key Vault.
I have used the wizards in Visual Studio to generate the boilerplate code. Everything has been configured too.
And I read this guide too: https://learn.microsoft.com/en-us/aspnet/core/blazor/security/webassembly/hosted-with-identity-server?view=aspnetcore-5.0&tabs=visual-studio#host-in-azure-app-service-with-a-custom-domain
And I have created a certificate in the Key Vault named IdentityServerSigning with CN=IdentityServerSigning.
When I then run the app I get an HTTP 500.30 error.
Opening the web-based console from the portal, I launch the app and get an error saying it could not find a valid certificate 'CN=IdentityServerSigning' in the 'CurrentUser/My' store.
What am I missing?
I guess you may have forgotten to add the application setting on the portal, since your code has no problem.
We need to give Azure App Service permission to use the newly uploaded certificate. For that:
Go to Configuration in the menu of your Azure App Service
Click on New application setting
In Name, put: WEBSITE_LOAD_CERTIFICATES
In Value, put the Thumbprint that you copied from the previous section.
Click OK, and don't forget to click Save. A quick check that the certificate is now visible to the app is sketched below.
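Once the setting is saved, a minimal diagnostic sketch like the following (assuming the certificate subject CN=IdentityServerSigning from the question) can confirm that App Service has loaded the certificate into CurrentUser/My:
using System;
using System.Security.Cryptography.X509Certificates;

// Assumes WEBSITE_LOAD_CERTIFICATES is set and the certificate subject is
// CN=IdentityServerSigning, as described in the question.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var certs = store.Certificates.Find(
    X509FindType.FindBySubjectDistinguishedName, "CN=IdentityServerSigning", false);
Console.WriteLine(certs.Count > 0
    ? $"Found signing certificate: {certs[0].Thumbprint}"
    : "Certificate not loaded - check WEBSITE_LOAD_CERTIFICATES.");
store.Close();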
If it doesn't work, please read the blog posts below; they should help you find the issue.
Suggestion
You can refer to this blog to check the steps; I believe it will be useful to you.
Blazor: Using a Self-Signed Certificate for IdentityServer4 in Azure App Service
To load the cert from Azure Key Vault in IdentityServer 4, you can use ActiveLogin:(https://github.com/ActiveLogin/ActiveLogin.Authentication)
More here: https://github.com/IdentityServer/IdentityServer4/issues/2705
I'm integrating my app with Xero which requires two certificates. I uploaded them to Azure with help from this article, but I'm still unable to connect to the Xero API. I'm hoping someone has experience integrating a Xero Partner Application with an Azure Web App.
I've uploaded two pfx files; one is a self-signed certificate and the other is the partner certificate issued by Xero. The latter pfx file contains two certificates: an Entrust Commercial Private Sub CA1 (whatever that means) and a unique Entrust ID certificate for my app.
I'm using the following code to load the certificates by their unique thumbprint:
using System;
using System.Security.Cryptography.X509Certificates;
using System.Text.RegularExpressions;

static X509Certificate2 GetCertificateFromStore(string thumbprint)
{
    var store = new X509Store(StoreLocation.CurrentUser);
    try
    {
        // Strip any characters that are not letters or digits from the thumbprint.
        thumbprint = Regex.Replace(thumbprint, @"[^\da-zA-Z]", string.Empty).ToUpper();
        store.Open(OpenFlags.ReadOnly);
        var certCollection = store.Certificates;
        var currentCerts = certCollection.Find(X509FindType.FindByTimeValid, DateTime.Now, false);
        var signingCert = currentCerts.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (signingCert.Count == 0)
        {
            throw new Exception($"Could not find Xero SSL certificate. cert_name={thumbprint}");
        }
        return signingCert[0];
    }
    finally
    {
        store.Close();
    }
}
This works fine locally, but on my Azure web site I get a 403.7 error:
The page you are attempting to access requires your browser to have a Secure Sockets Layer (SSL) client certificate that the Web server recognizes.
I've also looked at the following references to try and resolve the issue:
Xero Partner SSL configuration in Azure (Uses a cloud service and not a web app, so I couldn't follow the steps at the end)
403 Forbidden when loading X509Certificate2 from a file (a thread posted on the Xero forums about the same issue; it turned out the resolution, once again, only applies to cloud services)
Xero partner connections and Azure Websites (Posted solution suggests using a VM)
What I haven't tried yet:
Converting my web app to a cloud service; I'm trying to avoid this, and I'm not sure what steps are involved.
Using a VM; I haven't found any detailed steps on how to do this, but it seems like a better option than the above.
Screenshot of the error:
A 403 error means we are not seeing the Xero Entrust certificate in the connection.
More details about it here - http://blog.xero.com/developer/api-overview/http-response-codes/#403
Basically, it runs on your local IIS instance because that is a "single tenant" machine where your application doesn't need to be isolated from others, while on Azure your application is restricted by the security model used to isolate web sites.
In summary, you have to do the following to get your certificates working on Azure:
1) Export the certificate, private key, and all intermediary certificates into a PFX file.
2) Upload the certificate using the Azure portal to the cloud service that you're running (it should appear as multiple entries).
3) Access the certificate through the machine store in code.
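For step 3, here is a minimal sketch of reading the uploaded certificate from the machine store on the web role (the thumbprint value is a placeholder):
using System;
using System.Security.Cryptography.X509Certificates;

// On a cloud service, certificates uploaded as Service Certificates are
// installed into the LocalMachine\My store of the role instances.
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
try
{
    store.Open(OpenFlags.ReadOnly);
    var matches = store.Certificates.Find(
        X509FindType.FindByThumbprint, "YOUR_CERT_THUMBPRINT", false);
    if (matches.Count == 0)
        throw new InvalidOperationException("Xero certificate not found in LocalMachine\\My.");
    var xeroCert = matches[0];
}
finally
{
    store.Close();
}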
Based on data taken from:
https://community.xero.com/developer/discussion/626401
https://social.msdn.microsoft.com/Forums/azure/en-US/29b30f25-eea9-4e8e-8292-5ac8085fd42e/access-to-certificates-in-azure-web-sites
I hope this solves your issue.
Finally got this working and I will post my solution which will hopefully save developers a lot of time and frustration when connecting with Xero.
The Xero Partner Application will not work with Azure App Services (Web Sites). You have to upload two additional certificates along with your self-signed and the Xero partner certificate. These can be found on your local machine and can be exported in .cer format (details of these certificates below). Not being able to upload these certificates with Azure App Services is indeed the crux: they also have to be uploaded to specific stores (Root/CA), which you cannot do with App Services. These are the steps I took to connect with Xero.
Converted my web site to an Azure Cloud Service: I was wary of changing our environment as we already have a live site. It turns out that a cloud service is essentially the same as an app service; you are still deploying to a VM somewhere. However, you have more control over the back end and you can remote desktop in. Read more here. I used the links below to create and convert my website to a cloud service:
Create a new cloud service project
Convert existing web site to cloud service
Uploaded 4 certificates to my cloud project using the Azure portal. You will need to upload the following:
Your self-signed certificate (the one you created here)
The partner certificate issued by Xero (you probably got it here)
The Intermediate Entrust certificate (this one should be contained within the .p12 file you downloaded above)
The Entrust Root certificate (this should be in your Trusted Root Store**)
Added the certificates to my Web Role in the Cloud project. You have to right-click on the properties of your web role and go to the Certificates tab. Add all 4 certificates to your web role using the thumbprints, which will be visible in the portal after you have uploaded them. Take note of the Store Name for the two Entrust certs:
You may need a lot of patience, as I did, to get through step one. You'll have to figure out the new deployment process, how to debug your project locally, and probably a lot of other frustrating tidbits!
**This is the correct Entrust Root certificate, which you can find using certmgr.msc:
Make sure you added the application setting from step 2 of your referenced article.
Adding an app setting named WEBSITE_LOAD_CERTIFICATES with its value set to the thumbprint of the certificate will make it accessible to your web application. You can have multiple comma-separated thumbprint values, or you can set this value to "*" (without quotes), in which case all your certificates will be loaded into your web application's personal certificate store.
I'd also be more specific in specifying the certificate store, i.e. use:
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
This is how we load certs in all of our Azure Web Apps.
In my ASP.NET application I'm loading a certificate from the certificate store:
var myCert = CertificateUtils.GetCertificate("thumbprint");
This certificate contains a key pair which is used to decrypt the encrypted application settings.
The certificate is installed in the Personal certificate store under Local Computer. It works well when the application is running under IIS Express, but if I run it under the full IIS web server, the myCert instance is missing the private key.
The PrivateKey field of myCert object contains an exception:
'myCert.PrivateKey' threw an exception of type 'System.Security.Cryptography.CryptographicException'
I have checked that the other fields of the myCert object contain the same values (for example, the certificate serial number, thumbprint, and expiration), so it seems it is getting the same certificate under both IIS and IIS Express. Only the private key is missing in the case of full IIS.
The only thing I have changed was the local development server in the project's properties ("Use IIS Express" / "Use IIS Web Server"). It's running inside the Azure Emulator Express in both cases.
Does anyone have an idea why this is happening?
Running on IIS Express, the program uses your credentials to access the certificate, while on IIS the pool identity's credentials are used. You can easily check the certificate ACL to see who is allowed or not.
Follow these steps:
Check what Application Pool your web site uses
Open Internet Information Services Manager, select Sites in the Connections tree on the left. Select your site in the middle panel and click Basic settings under Actions on the right panel.
Check what identity the Application Pool uses
Select Application Pools in the Connections tree on the left and find the identity in the middle panel. It will probably be "NETWORK SERVICE".
Add read permissions for the identity used by Application Pool to your certificate
Open the Microsoft Management Console (mmc), add the Certificates snap-in for the local Computer account, and find your certificate under Personal certificates. Open its context menu, All Tasks and Manage Private Keys.... Click Add..., enter the identity ("NETWORK SERVICE"), click Check Names, and then OK. Under Permissions, allow only the Read permission.
You can read details in this question: How to give ASP.NET access to a private key in a certificate in the certificate store?
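If you want to confirm the fix from code, here is a minimal hedged sketch (the thumbprint is a placeholder) that reports whether the current process identity can open the private key:
using System;
using System.Security.Cryptography.X509Certificates;

// Illustrative check: find the certificate by thumbprint in LocalMachine\My
// and report whether the current identity can read its private key.
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var found = store.Certificates.Find(X509FindType.FindByThumbprint, "YOUR_THUMBPRINT", false);
store.Close();

if (found.Count == 0)
{
    Console.WriteLine("Certificate not found.");
}
else
{
    try
    {
        // Throws CryptographicException when the identity lacks read access to the key.
        var key = found[0].PrivateKey;
        Console.WriteLine("Private key is accessible.");
    }
    catch (System.Security.Cryptography.CryptographicException ex)
    {
        Console.WriteLine($"Private key not accessible: {ex.Message}");
    }
}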
I was having this problem when debugging the application: "'.PrivateKey' threw an exception of type 'System.Security.Cryptography.CryptographicException'".
I solved it like this:
In mmc > Local Computer > Personal > Certificates > right-click on the certificate > All Tasks > Manage Private Keys:
Add the "Everyone" user and select Full Control.
If we need to secure a web site or use HTTPS for our web site, then we need to use a certificate at the IIS level. On a development PC we often use self-signed certificates, which can be created very easily from IIS.
I visited this URL http://weblogs.asp.net/scottgu/archive/2007/04/06/tip-trick-enabling-ssl-on-iis7-using-self-signed-certificates.aspx to learn how to create and use SSL for our site.
After doing everything, when I run or test the site on my local PC, I find that self-signed certificates do not work like the real certificates that people buy. Here I am adding a couple of pictures; from them you can see what kind of problem I am talking about.
Just look at the second picture and at the URL: in the case of SSL, a lock sign appears in green.
So please guide me on what else we need to do so that a self-signed certificate works just like a real certificate on my PC. Please discuss this in detail or redirect me to the right article showing what else to configure so that the browser address bar properly reflects SSL.
Thanks
The certificate works the same. The problem is that a self-signed certificate is not included in the browser's list of trusted issuing authorities. If your sole purpose is development, you can follow the method here of adding the issuer (yourself) as a trusted authority, or adding the certificate itself as trusted.
For a production website, you need to purchase an SSL certificate, because your visitors' browsers cannot trust self-signed certificates as they cannot verify the issuer.
Having said that, for development and testing purposes the behaviour you described is fine, but if you really need to get rid of the warning, you need to register the certificate on your local PC (and on all PCs where you don't want to see the warning) and then use the same certificate for your website in IIS.
Follow this guide from step 2 onward, but here are the outlines:
First you need to copy the certificate to your local PC:
In IIS, export the certificate to a file.
Copy the file to your local PC.
Use MMC to import the certificate from the file. Make sure you import it into the Personal folder.
Repeat the last two steps for all PCs.
Now that you have the certificate registered in your local PC, you need to tell your PC to trust it:
View the certificate in MMC and go to the second "Details" tab.
Scroll down to the "Thumbprint" field and select it to display the certificate hash.
Copy the hash to the clipboard (the hash identifies your certificate).
Open Notepad and paste the hash there.
Remove all the spaces from the hash using the "Replace" feature in Notepad.
Use the hash in the following command:
netsh http add sslcert ipport=0.0.0.0:443 appid={214124cd-d05b-4309-9af9-9caa44b2b74a} certhash=PASTE_YOUR_CERT_HASH_HERE
Note: The "AppId" doesn't really matter, its just a GUID.
In MMC, move the certificate from the Personal folder to the Trusted Root Certificates folder.
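If you want to double-check the result, a minimal sketch like this (the .pfx path and password are placeholders) builds the certificate chain and reports whether the machine now trusts it:
using System;
using System.Security.Cryptography.X509Certificates;

// Load the same certificate you imported and see whether it now chains to a
// trusted root on this machine. File name and password are placeholders.
var cert = new X509Certificate2("mysite.pfx", "password");
var chain = new X509Chain();
chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck; // self-signed: no CRL to check
bool trusted = chain.Build(cert);
Console.WriteLine(trusted
    ? "Certificate chains to a trusted root on this machine."
    : "Certificate is still untrusted - check the Trusted Root store.");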