I am writing a utility that would allow me to monitor the health of our websites. This consists of a series of validation tasks that I can run against a web application. One of the tests is to warn about the upcoming expiration of a particular SSL certificate.
I am looking for a way to pre-fetch the SSL certificate installed on a web site using .NET or a WINAPI so that I can validate the expiration date of the certificate associated with a particular website.
One way I could do this is to cache the certificates when they are validated in the ServicePointManager.ServerCertificateValidationCallback handler and then match them up with configured web sites, but this seems a bit hackish. Another would be to configure the application with the certificate for the website, but I'd rather avoid this if I can in order to minimize configuration.
What would be the easiest way for me to download an SSL certificate associated with a website using .NET so that I can inspect the information the certificate contains to validate it?
EDIT:
To extend on the answer below there is no need to manually create the ServicePoint prior to creating the request. It is generated on the request object as part of executing the request.
private static string GetSSLExpiration(string url)
{
    HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;

    // The request must actually be executed before the ServicePoint
    // has the server's certificate; the response body can be discarded.
    using (WebResponse response = request.GetResponse()) { }

    if (request.ServicePoint.Certificate != null)
    {
        return request.ServicePoint.Certificate.GetExpirationDateString();
    }
    else
    {
        return string.Empty;
    }
}
I've just tried this, which "works on my machine (tm)".
It returns the text string representing the expiry date on the server's certificate, and requires an actual hit to be made on the web site.
private string GetSSLExpiryDate(string url)
{
    Uri u = new Uri(url);
    ServicePoint sp = ServicePointManager.FindServicePoint(u);

    // Use a dedicated connection group so the connection can be cleanly
    // closed once the handshake has given us the certificate.
    string groupName = Guid.NewGuid().ToString();
    HttpWebRequest req = WebRequest.Create(u) as HttpWebRequest;
    req.ConnectionGroupName = groupName;

    using (WebResponse resp = req.GetResponse())
    {
        // Ignore the response, and close it on leaving the using block.
    }
    sp.CloseConnectionGroup(groupName);

    // Implement favourite null check pattern here on sp.Certificate
    string expiryDate = sp.Certificate.GetExpirationDateString();
    return expiryDate;
}
I'm afraid I don't know all the rights and wrongs of using ServicePoint, and any other hoops that you might need to jump through, but you do get an SSL expiry date for the actual web site you want to know about.
EDIT:
ensure the "url" parameter use the https:// protocol.
e.g. string contosoExpiry = GetSSLExpiryData("https://www.contoso.com/");
I use HttpClient in my app to send my user/password to a service that returns some cookies that I can later use for all my other requests. The service is located at https://accounts.dev.example.com/login and returns two cookies with Domain=.dev.example.com. The issue I'm finding is that, on some machines (Windows Domain Controllers), these cookies are not sent when I request resources on subdomains like https://accounts.dev.example.com/health-check, even though according to the MDN docs a cookie set for a domain should be sent when requesting resources on its subdomains:
Domain= Optional
Specifies those hosts to which the cookie will be sent. If not specified, defaults to the host portion of the current document location (but not including subdomains). Contrary to earlier specifications, leading dots in domain names are ignored. If a domain is specified, subdomains are always included.
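A minimal sketch of the behaviour the quoted docs describe, using the same placeholder domains (this is what I expected to see everywhere):
var container = new CookieContainer();
container.Add(new Cookie("AK", "112233", "/", ".dev.example.com") { Secure = true });

// A cookie scoped to Domain=.dev.example.com should come back both for the
// parent domain and for any host beneath it.
Console.WriteLine(container.GetCookies(new Uri("https://accounts.dev.example.com")).Count); // expected: 1
Console.WriteLine(container.GetCookies(new Uri("https://dev.example.com")).Count);          // expected: 1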
Do you know how to properly configure HttpClient to pass the domain cookies to subdomain requests?
A bit more detail:
The cookies returned by my authentication service at https://accounts.dev.example.com/login look like this in the HTTP headers:
Set-Cookie: AK=112233;Version=1;Domain=.dev.example.com;Path=/;Max-Age=5400;Secure;HttpOnly,
Set-Cookie: AS=445566;Version=1;Domain=.dev.example.com;Path=/;Max-Age=5400;Secure;HttpOnly,
Then I can query C#'s CookieContainer with either of these calls on normal workstations (note that GetCookies takes a Uri, not a string):
cookies.GetCookies(new Uri("https://accounts.dev.example.com"))
cookies.GetCookies(new Uri("https://dev.example.com"))
Both of which return the 2 cookies, like:
$Version=1; AK=112233; $Path=/; $Domain=.dev.example.com
$Version=1; AS=445566; $Path=/; $Domain=.dev.example.com
But on the other machines (the Domain Controllers) the first call returns an empty list, while the second returns the 2 cookies.
Why does the behaviour of CookieContainer.GetCookies differ depending on which machine is running the code?
My workstations are using Microsoft Windows 10 Home Single Language (.Net 4.0.30319.42000) and the DCs are using Microsoft Windows Server 2012 R2 Datacenter (.Net 4.0.30319.36399).
The code
This is a modified version of my code:
public static async Task<string> DoAuth(CookieContainer cookies,
    Dictionary<string, string> postHeaders,
    StringContent postBody)
{
    try
    {
        using (var handler = new HttpClientHandler())
        {
            handler.CookieContainer = cookies;
            using (var client = new HttpClient(handler, true))
            {
                foreach (var key in postHeaders.Keys)
                    client.DefaultRequestHeaders.Add(key, postHeaders[key]);

                var response = await client.PostAsync("https://accounts.dev.example.com/login", postBody);
                response.EnsureSuccessStatusCode();

                // This line prints 0 on Domain Controllers, and 2 on all other machines
                Console.Write(cookies.GetCookies(new Uri("https://accounts.dev.example.com")).Count);

                return await response.Content.ReadAsStringAsync();
            }
        }
    }
    catch (HttpRequestException e)
    {
        ...
        throw;
    }
}
As I couldn't find an answer to this (not on TechNet either), I decided to go with the following solution, which works, but I'm not sure whether there is a proper way of solving the issue:
foreach (Cookie cookie in cookies.GetCookies(new Uri("https://dev.example.com")))
{
    cookies.Add(new Uri("https://accounts.dev.example.com"),
        new Cookie(cookie.Name, cookie.Value, cookie.Path, ".accounts.dev.example.com"));
}
So, I'm duplicating the cookie for each one of the subdomains that my app should send these cookies to.
The underlying issue appears to be a bug triggered by the Version= attribute in the Set-Cookie header. It makes the CookieContainer fall on its face, which results in the strange $Version and $Domain pseudo-cookies being sent in subsequent client requests. As far as I can tell there is no way to remove these broken cookies either: iterating GetCookies() with the originating domain does not reveal them.
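One possible workaround (a sketch only, assuming the Version attribute really is what trips up the container): turn off automatic cookie handling, strip Version from each Set-Cookie header yourself, and hand the cleaned value to the CookieContainer.
private static async Task LoginAndCaptureCookies(CookieContainer cookies, StringContent postBody)
{
    // UseCookies = false stops the handler from feeding the malformed
    // headers into the CookieContainer automatically.
    var handler = new HttpClientHandler { UseCookies = false };
    using (var client = new HttpClient(handler, true))
    {
        var loginUri = new Uri("https://accounts.dev.example.com/login");
        var response = await client.PostAsync(loginUri, postBody);
        response.EnsureSuccessStatusCode();

        if (response.Headers.TryGetValues("Set-Cookie", out var setCookies))
        {
            foreach (var header in setCookies)
            {
                // Drop the RFC 2109 "Version" attribute before parsing.
                var cleaned = string.Join(";", header.Split(';')
                    .Where(p => !p.TrimStart().StartsWith("Version=", StringComparison.OrdinalIgnoreCase)));
                cookies.SetCookies(loginUri, cleaned);
            }
        }
    }
}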
I have developed a C# desktop application which makes HTTPS requests to the customers' servers (usually Documentum/SharePoint/Alfresco/NemakiWare/etc HTTPS-based servers).
Several customers have asked us to support their servers, which are protected by CA SSO (the new name of SiteMinder).
QUESTION: What do I need to do to allow my application to send HTTPS requests (and receive responses) with CA SSO-protected servers?
I have developed NTLM-SSO support for our C# desktop application and it works well, but I am not sure about how to proceed for CA SSO.
I have asked the same question on the CA forum, but like most questions there it remains unanswered.
To authenticate with CA SSO and then connect to the desired URL, we need to access a protected resource on a web server configured to use CA SSO authentication. The flow looks like this:
The client requests a resource on the server, using an HTTP request.
The request is received by the web server and is intercepted by the CA SSO web agent.
The web agent determines whether or not the resource is protected, and if so, gathers the user’s credentials and passes them to the Policy server.
The Policy server authenticates the user and verifies whether or not the authenticated user is authorized for the requested resource, based on rules and policies contained in the Policy store.
After the user is authenticated and authorized, the Policy server grants access to the protected resources.
This is accomplished with the following steps:
Open a connection (HTTP request in this case) to the URI of the protected resource. Since the request has not yet been authenticated, the CA SSO agent will issue a redirect to a login page. In the code, AllowAutoRedirect is set to false. This is important, as the redirect URL will be required for the subsequent POST of login data in step 3 below. If AllowAutoRedirect were true, the response would not include a Location header, and the subsequent POST would be made to the original URL, which would then redirect to the login page again; since the redirect is reissued as a fresh request, any POST data carried in the payload of the step 3 request would be lost along the way.
Dim request As HttpWebRequest
Dim response As HttpWebResponse
Dim url As String = PROTECTED_URL

request = DirectCast(WebRequest.Create(url), HttpWebRequest)
request.AllowAutoRedirect = False
response = DirectCast(request.GetResponse(), HttpWebResponse)

' Make sure we have a valid response
If response.StatusCode <> HttpStatusCode.Found Then
    Throw New InvalidProgramException()
End If

' Get the login page
url = response.Headers("Location")
request = DirectCast(WebRequest.Create(url), HttpWebRequest)
request.AllowAutoRedirect = False
response = DirectCast(request.GetResponse(), HttpWebResponse)
The next step involves creating an HTTPS request that POSTs all the form data, including userid and password, back to the server. The purpose of an authentication agent is to verify a user's identity by validating their userid and password, so its URLs naturally use SSL (Secure Sockets Layer) and are encrypted for us; we do not require further encryption in our program. However, the formatting of the POST data is interesting inasmuch as there are two alternatives. The sample program uses the simpler approach of setting the content type to application/x-www-form-urlencoded, where the POST data is formatted like a query string and sent as the body of the next request.
Dim postData As String
postData = ""
For Each inputName As String In tags.Keys
    ' Echo back all the hidden SiteMinder ("sm...") form fields.
    If inputName.Substring(0, 2).ToLower() = "sm" Then
        postData &= inputName & "=" & _
            HttpUtility.UrlEncode(tags(inputName)) & "&"
    End If
Next
postData &= "postpreservationdata=&"
postData &= "USER=" & HttpUtility.UrlEncode(USERNAME) & "&"
postData &= "PASSWORD=" & HttpUtility.UrlEncode(PASSWORD)

request = DirectCast(WebRequest.Create(url), HttpWebRequest)
cookies = New CookieContainer()
request.CookieContainer = cookies
request.ContentType = FORM_CONTENT_TYPE
request.ContentLength = postData.Length
request.Method = POST_METHOD
request.AllowAutoRedirect = False ' Important

Dim sw As StreamWriter = New StreamWriter(request.GetRequestStream())
sw.Write(postData)
sw.Flush()
sw.Close()
response = DirectCast(request.GetResponse(), HttpWebResponse)
Same idea as Mohit's answer, but it can be done with much simpler code:
//Make initial request for SM to give you some cookies and the authentication URI
RestClient client = new RestClient("http://theResourceDomain/myApp");
client.CookieContainer = new CookieContainer();
IRestResponse response = client.Get(new RestRequest("someProduct/orders"));
//Now add credentials.
client.Authenticator = new HttpBasicAuthenticator("username", "password");
//Get resource from the SiteMinder URI which will redirect back to the API URI upon authentication.
response = client.Get(new RestRequest(response.ResponseUri));
Although this uses RestSharp, it can be easily replicated using HttpClient or even HttpWebRequest.
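For instance, a rough HttpClient version of the same two-step flow might look like this (a sketch only; the URL is the placeholder from the answer above, and it assumes the SiteMinder endpoint accepts a pre-emptive Basic Authorization header):
private static async Task CallThroughSiteMinder()
{
    var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
    using (var client = new HttpClient(handler, true))
    {
        // First request: SiteMinder redirects us to its authentication URI
        // and plants its session cookies in the container.
        var first = await client.GetAsync("http://theResourceDomain/myApp/someProduct/orders");

        // Now add credentials, mirroring RestSharp's HttpBasicAuthenticator.
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("username:password")));

        // Request the URI we ended up on; upon authentication SiteMinder
        // redirects back to the originally requested resource.
        var second = await client.GetAsync(first.RequestMessage.RequestUri);
        second.EnsureSuccessStatusCode();
    }
}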
My app communicates with an internal web API that requires authentication.
When I send the request I get the 401 challenge as expected, the handshake occurs, the authenticated request is re-sent and everything continues fine.
However, I know that the auth is required. Why do I have to wait for the challenge? Can I force the request to send the credentials in the first request?
My request generation is like this:
private static HttpWebRequest BuildRequest(string url, string methodType)
{
    var request = HttpWebRequest.CreateHttp(url);
    request.PreAuthenticate = true;
    request.AuthenticationLevel = AuthenticationLevel.MutualAuthRequested;
    request.Credentials = CredentialCache.DefaultNetworkCredentials;
    request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
    request.ContentType = CONTENT_TYPE;
    request.Method = methodType;
    request.UserAgent = BuildUserAgent();
    return request;
}
Even with this code, the auth header isn't included in the initial request.
I know how to include the auth info with Basic... what I want to do is use the Windows auth of the user executing the app (so I can't store the password in a config file).
UPDATE I also tried using a HttpClient and its own .Credentials property with the same result: no auth header is added to the initial request.
The only way I could get the auth header in was to hack it in manually using basic authentication (which won't fly for this use-case)
NTLM is a challenge/response-based authentication protocol. You need to make the first request so that the server can issue the challenge; in the subsequent request the client sends the response to that challenge. The server then verifies this response with the domain controller by giving it both the challenge and the response that the client sent.
Without knowing the challenge you can't send the response, which is why two requests are needed.
Basic authentication is password based, so you can short-circuit this by sending the credentials with the first request, but in my experience this can be a problem for some servers to handle.
More details available here:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa378749(v=vs.85).aspx
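To illustrate the Basic short-circuit mentioned above (a sketch with a placeholder URL; it does not apply to NTLM, and it conflicts with the no-stored-password constraint in the question):
var request = (HttpWebRequest)WebRequest.Create("https://internal.example.com/api");
// Attach the Authorization header up front so the server never needs
// to issue a 401 challenge.
string token = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
request.Headers["Authorization"] = "Basic " + token;
using (var response = (HttpWebResponse)request.GetResponse())
{
    Console.WriteLine(response.StatusCode);
}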
I'm not 100% sure, but I suspect that there is no way around this; it's simply the way HttpWebRequest works.
In the online .NET source, function DoSubmitRequestProcessing which is here, you can see this comment just after the start of the function, line 1731:
// We have a response of some sort, see if we need to resubmit
// it do to authentication, redirection or something
// else, then handle clearing out state and draining out old response.
A little further down (line 1795) (some lines removed for brevity)
if (resubmit)
{
    if (CacheProtocol != null && _HttpResponse != null) CacheProtocol.Reset();
    ClearRequestForResubmit(ntlmFollowupRequest);
    ...
}
And in ClearRequestForResubmit line 5891:
// We're uploading and need to resubmit for Authentication or Redirect.
and then (Line 5923):
// The second NTLM request is required to use the same connection, don't close it
if (ntlmFollowupRequest) {....}
To my (admittedly n00bish) eyes these comments seem to indicate that the developers decided to follow the "standard" challenge/response protocol for NTLM/Kerberos and not include any way of sending authentication headers up front.
Setting PreAuthenticate is what you want, which you are doing. The very first request will still do the handshake but for subsequent requests it will automatically send the credentials (based on the URL being used). You can read up on it here: http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.preauthenticate(v=vs.110).aspx.
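A sketch of the behaviour that documentation describes (the endpoint URL is hypothetical): the first call pays the handshake cost, while later calls to the same URI space attach the credentials up front, for schemes that permit it.
private static string Fetch(string url)
{
    var request = HttpWebRequest.CreateHttp(url);
    request.PreAuthenticate = true;
    request.Credentials = CredentialCache.DefaultNetworkCredentials;
    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}

// First call triggers the 401 challenge and completes the handshake.
string first = Fetch("https://internal.example.com/api/items");
// Subsequent calls send the Authorization header pre-emptively.
string second = Fetch("https://internal.example.com/api/items");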
I need to make an HTTP request for JSON data on a SharePoint site. When accessing the data through the browser, I am first prompted for a username and password; after entering my credentials I am allowed to see the JSON. I am new to C# and am wondering the best way to go about forming my request, retrieving the response, and parsing the JSON. I have worked with JSON requests in Java before, but never had to deal with SharePoint credentials. Thanks in advance.
Here you go:
private void login_Click(object sender, EventArgs e)
{
    string username = uname.Text;
    string password = pword.Text;
    string url = "THE SITE URL HERE";

    var req = (HttpWebRequest)WebRequest.Create(url);
    req.Credentials = new NetworkCredential(username, password);

    using (var response = req.GetResponse())
    {
        // Do stuff with response
    }
}
You should be able to create a WebRequest and either pass in your credentials there or, if that doesn't work, check out this answer about setting your SharePoint credentials.
You should be fine making an HttpWebRequest and setting the ClientCertificates property to include your certificate from its .pfx file, unless you've got some other permissions issue going on. You don't need a ServerCertificateValidationCallback handler in order to establish an SSL connection to a remote server as a client; the handler is only necessary if you wish to intercept, inspect, and/or override the default validation of the remote machine's certificate.
https://stackoverflow.com/questions/5595049/servicepointmanager-servercertificatevalidationcallback-question
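A minimal sketch of that approach (the file path, password, and URL are placeholders):
// Load the client certificate from its .pfx file.
var cert = new X509Certificate2(@"C:\certs\client.pfx", "pfxPassword");

var request = (HttpWebRequest)WebRequest.Create("https://example.com/service");
request.ClientCertificates.Add(cert);

// The certificate is presented automatically during the TLS handshake.
using (var response = (HttpWebResponse)request.GetResponse())
{
    Console.WriteLine(response.StatusCode);
}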
I'm trying to call a web service from a c# application, with sessionID.
In order to do this I need to set the "Domain" header in a cookie.
In Fiddler it looks like - "ASP.NET_SessionId=izdtd4tbzczsa3nlt5ujrbf5" (no domain is specified in the cookie).
The web service is at - "http://[some ip goes here]:8989/MyAPI.asmx".
I've tried:
http://[ip] ,
http://[ip]:8989 ,
http://[ip]:8989/MyAPI.asmx
All of these cause a runtime error.
I've also tried the IP alone (e.g. 100.10.10.10), which doesn't cause a runtime error and sets the cookie, but the cookie is never sent when I invoke a web method.
Here's my code for setting the domain:
if (!string.IsNullOrEmpty(currentSessionID))
{
    req.CookieContainer = new CookieContainer();
    Cookie cookie = new Cookie("ASP.NET_SessionId", currentSessionID);
    cookie.Domain = GetCookieUrl(); // <- What should this be?
    req.CookieContainer.Add(cookie);
}
So what should the domain be?
Thanks.
I believe it should simply be [ip]. Drop the http:// part of what you've tried.
According to this page on MSDN, your code should be
cookie.Domain = "100.10.10.10";
Next, exactly what error are you getting? Also, are you confusing a compile error with a runtime error? I find it hard to believe you are getting a compilation error, as Domain is a String property, which means you can put pretty much anything into it.
Finally, why are you sending a cookie to a web service? The normal way is to pass everything in the form post or on the query string.
Update
BTW, if you absolutely must add a cookie to the header in order to pass it to a web service, the way you do this is (taken from here):
// The data you want to send to the web service
byte[] buffer = Encoding.ASCII.GetBytes("fareId=123456");

HttpWebRequest WebReq = (HttpWebRequest)WebRequest.Create(url);
WebReq.Method = "POST";
WebReq.ContentType = "application/x-www-form-urlencoded";
WebReq.ContentLength = buffer.Length;
WebReq.Headers["Cookie"] = "ASP.NET_SessionId=izdtd4tbzczsa3nlt5ujrbf5";

Stream PostData = WebReq.GetRequestStream();
PostData.Write(buffer, 0, buffer.Length);
PostData.Close();
Note that this sets the header inline with the request without instantiating a "cookie" object. The Domain property of a cookie is to help ensure the cookie is only sent to the domain listed. However, if you are initiating the request and trying to append a cookie to it, then the best way is to just add it as a string to the request headers.
The reason the cookie was not sent is that the request's content length should be set after adding the cookie, and not before.
The domain is the IP alone.
// Simple helper to get the cookie domain (the host portion) from a URI string
private string GetCookieDomain(string uri)
{
    Uri req_uri = new Uri(uri);
    return req_uri.GetComponents(UriComponents.Host, UriFormat.Unescaped);
}
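For example, with the service URL from the question:
cookie.Domain = GetCookieDomain("http://100.10.10.10:8989/MyAPI.asmx"); // "100.10.10.10"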