I have developed a C# desktop application which makes HTTPS requests to the customers' servers (usually HTTPS-based servers such as Documentum, SharePoint, Alfresco, NemakiWare, etc.).
Several customers have asked us to support their servers which are protected by CA SSO (the new name of SiteMinder).
QUESTION: What do I need to do to allow my application to send HTTPS requests (and receive responses) with CA SSO-protected servers?
I have developed NTLM-SSO support for our C# desktop application and it works well, but I am not sure how to proceed for CA SSO.
I have asked the same question on the CA forum, but like most questions there it remains unanswered.
To authenticate with CA SSO and then connect to the desired URL, the client needs to access a protected resource on a web server configured for CA SSO authentication:
The client requests a resource on the server using an HTTP request.
The request is received by the web server and is intercepted by the CA SSO web agent.
The web agent determines whether or not the resource is protected, and if so, gathers the user’s credentials and passes them to the Policy server.
The Policy server authenticates the user and verifies whether or not the authenticated user is authorized for the requested resource, based on rules and policies contained in the Policy store.
After the user is authenticated and authorized, the Policy server grants access to the protected resources.
This is accomplished with the following steps:
Open a connection (an HTTP request in this case) to the URI of the protected resource. Since the request has not yet been authenticated, the CA SSO agent will issue a redirect to a login page. In the code, AllowAutoRedirect is set to False. This is important, as the redirect URL will be required for the subsequent POST of login data in step 3 below. If AllowAutoRedirect were True, the response would not include a Location header, and the subsequent POST would be made to the original URL, which would then redirect to the login page again; because POST data is not preserved across that redirect, the payload of the request in step 3 would be lost.
' Requires Imports System.Net
Dim request As HttpWebRequest
Dim response As HttpWebResponse
Dim url As String = PROTECTED_URL

' Step 1: request the protected resource without following the redirect
request = CType(WebRequest.Create(url), HttpWebRequest)
request.AllowAutoRedirect = False
response = CType(request.GetResponse(), HttpWebResponse)

' Make sure the agent answered with a redirect (302 Found) to the login page
If response.StatusCode <> HttpStatusCode.Found Then
    Throw New InvalidProgramException()
End If

' Step 2: fetch the login page from the Location header
url = response.Headers("Location")
request = CType(WebRequest.Create(url), HttpWebRequest)
request.AllowAutoRedirect = False
response = CType(request.GetResponse(), HttpWebResponse)
The next step involves creating an HTTPS request that POSTs all the form data, including userid and password, back to the server. The purpose of an authentication agent is to verify a user's identity by validating their userid and password, so its URLs naturally use SSL (Secure Sockets Layer) and the traffic is already encrypted for us; we do not require further encryption in our program. However, the formatting of the POST data is interesting inasmuch as there are two alternatives. The sample program uses the simpler approach of setting the content type to application/x-www-form-urlencoded. Here the POST data is formatted much like a query string and sent in the body of the next request.
' Requires Imports System.IO and a reference to System.Web (for HttpUtility)
Dim postData As String = ""

' Copy the hidden SiteMinder form fields (names starting with "sm") from the login page
For Each inputName As String In tags.Keys
    If inputName.Substring(0, 2).ToLower() = "sm" Then
        postData &= inputName & "=" & _
            HttpUtility.UrlEncode(tags(inputName)) & "&"
    End If
Next
postData &= "postpreservationdata=&"
postData &= "USER=" & HttpUtility.UrlEncode(USERNAME) & "&"
postData &= "PASSWORD=" & HttpUtility.UrlEncode(PASSWORD)

' POST the credentials to the login URL and keep the session cookies
Dim cookies As New CookieContainer()
request = CType(WebRequest.Create(url), HttpWebRequest)
request.CookieContainer = cookies
request.ContentType = FORM_CONTENT_TYPE
request.ContentLength = postData.Length
request.Method = POST_METHOD
request.AllowAutoRedirect = False ' Important

Dim sw As New StreamWriter(request.GetRequestStream())
sw.Write(postData)
sw.Flush()
sw.Close()

response = CType(request.GetResponse(), HttpWebResponse)
Same idea as Mohit's answer, but it can be done with much simpler code:
//Make initial request for SM to give you some cookies and the authentication URI
RestClient client = new RestClient("http://theResourceDomain/myApp");
client.CookieContainer = new CookieContainer();
IRestResponse response = client.Get(new RestRequest("someProduct/orders"));
//Now add credentials.
client.Authenticator = new HttpBasicAuthenticator("username", "password");
//Get resource from the SiteMinder URI which will redirect back to the API URI upon authentication.
response = client.Get(new RestRequest(response.ResponseUri));
Although this uses RestSharp, it can be easily replicated using HttpClient or even HttpWebRequest.
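For example, here is a minimal HttpClient sketch of the same flow (the URL and credentials are placeholders, and it assumes, like the RestSharp code above, that the SiteMinder authentication URI accepts Basic credentials):

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class SiteMinderHttpClientSketch
{
    static async Task Main()
    {
        var cookies = new CookieContainer();
        var handler = new HttpClientHandler
        {
            CookieContainer = cookies,
            AllowAutoRedirect = true
        };

        using (var client = new HttpClient(handler))
        {
            // 1. Initial request: the CA SSO agent sets its cookies and redirects to the authentication URI.
            var first = await client.GetAsync("http://theResourceDomain/myApp/someProduct/orders");
            var authUri = first.RequestMessage.RequestUri; // final URI after the redirect chain

            // 2. Repeat the request against that URI with Basic credentials;
            //    on success the agent redirects back to the protected resource.
            var authRequest = new HttpRequestMessage(HttpMethod.Get, authUri);
            authRequest.Headers.Authorization = new AuthenticationHeaderValue(
                "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("username:password")));
            var second = await client.SendAsync(authRequest);
            Console.WriteLine(second.StatusCode);
        }
    }
}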
The company I work for has a limited internet connection and we use a proxy (example: 10.10.10.10:8080) to access some restricted sites.
I can use the API in Postman (putting the proxy in the Postman settings), but when I put it in the C# WebClient code it gives me a 403 Forbidden error.
I only need the sensorData variable, but I split it into data and data2 to understand where the problem was. The error occurs at the var data = ... line.
Uri uri = new Uri("https://XXXXXXXX/api/DatasourceData/DatasourceDataHistoryBySerialNumber/");
Token token = GetToken(tokenAPI);
using (WebClient client = new WebClient())
{
    try
    {
        client.Proxy = new WebProxy("10.10.10.10", 8080);
        client.Headers.Add("Authorization", "Bearer " + token.AccessToken);
        client.QueryString.Add("serialNumbersDatasource", "I2001258");
        client.QueryString.Add("startDate", string.Format("{0:s}", "2019-12-01"));
        client.QueryString.Add("endDate", string.Format("{0:s}", DateTime.Now));
        client.QueryString.Add("isFilterDatesByDataDate", "false");
        var data = client.DownloadData(uri);
        var data2 = Encoding.UTF8.GetString(data);
        sensorData = JsonConvert.DeserializeObject<List<Sensor>>(Encoding.UTF8.GetString(client.DownloadData(uri)))[0];
    }
    catch (WebException ex)
    {
        // the 403 Forbidden surfaces here as a WebException
        Console.WriteLine(ex.Message);
    }
}
The problem seems to be at this line:
client.Headers.Add("Authorization", "Bearer " + "tokenTest");
Here you will add the Authorization header with the literal value Bearer tokenTest,
so the 403 Forbidden is returned by the service you are addressing, not by the proxy.
Change it to
client.Headers.Add("Authorization", "Bearer " + tokenTest);
and check that tokenTest has a valid value.
Check to see if you need any additional properties on the proxy. You may need to enable:
UseDefaultCredentials (Boolean): true if the default credentials are used; otherwise, false. The default value is false.
Also, check the full URL and query string you are producing: look at the outgoing request (in the debugger or through Fiddler) and make sure it all lines up, URL, query string, headers, etc. A sketch of the proxy setup follows the property list below.
From the docs:
Address: Gets or sets the address of the proxy server.
BypassArrayList: Gets a list of addresses that do not use the proxy server.
BypassList: Gets or sets an array of addresses that do not use the proxy server.
BypassProxyOnLocal: Gets or sets a value that indicates whether to bypass the proxy server for local addresses.
Credentials: Gets or sets the credentials to submit to the proxy server for authentication.
UseDefaultCredentials: Gets or sets a Boolean value that controls whether the DefaultCredentials are sent with requests.
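A minimal sketch of how these might be applied to the WebClient from the question (the proxy address and credentials are placeholders; whether you need UseDefaultCredentials or explicit Credentials depends on your proxy):

using System;
using System.Net;

class ProxySample
{
    static void Main()
    {
        var proxy = new WebProxy("10.10.10.10", 8080)
        {
            BypassProxyOnLocal = true,          // skip the proxy for local addresses
            UseDefaultCredentials = true        // send the current Windows credentials to the proxy
            // or, for explicit proxy credentials:
            // Credentials = new NetworkCredential("proxyUser", "proxyPassword")
        };

        using (var client = new WebClient())
        {
            client.Proxy = proxy;
            // ... add the Authorization header, query string and DownloadData call as in the question
        }
    }
}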
Probably a problem with the authorization header.
Is the token valid? Does it work with the same token in Postman?
I bet the API can't validate the token and gives you no authorization to the resources. That is what a 403 would mean (but I don't know what the API programmer actually intended by returning 403).
Do you have access to the API's source code?
Is the token really the string "tokentest", and does that work in Postman?
I would suggest going for xNet.dll instead of WebClient, because the xNet library is considered better for proxies and web requests.
var request = new HttpRequest();
request.UserAgent = Http.ChromeUserAgent();
request.Proxy = Socks5ProxyClient.Parse("10.10.10.10:8080"); // SOCKS4/SOCKS5/HTTP proxy clients can be used
Based on this, try adding a User-Agent header:
client.Headers.Add("User-Agent", "PostmanRuntime/7.26.1");
In my case I did not specify the security protocol. Paste this line of code before running any WebClient requests.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
original answer: How to specify SSL protocol to use for WebClient class
My app communicates with an internal web API that requires authentication.
When I send the request I get the 401 challenge as expected, the handshake occurs, the authenticated request is re-sent and everything continues fine.
However, I know that the auth is required. Why do I have to wait for the challenge? Can I force the request to send the credentials in the first request?
My request generation is like this:
private static HttpWebRequest BuildRequest(string url, string methodType)
{
var request = HttpWebRequest.CreateHttp(url);
request.PreAuthenticate = true;
request.AuthenticationLevel = AuthenticationLevel.MutualAuthRequested;
request.Credentials = CredentialCache.DefaultNetworkCredentials;
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
request.ContentType = CONTENT_TYPE;
request.Method = methodType;
request.UserAgent = BuildUserAgent();
return request;
}
Even with this code, the auth header isn't included in the initial request.
I know how to include the auth info with basic.... what I want to do is to use Windows Auth of the user executing the app (so I can't store the password in a config file).
UPDATE: I also tried using an HttpClient and its own .Credentials property, with the same result: no auth header is added to the initial request.
The only way I could get the auth header in was to hack it in manually using basic authentication (which won't fly for this use case).
NTLM is a challenge/response-based authentication protocol. You need to make the first request so that the server can issue the challenge; then, in the subsequent request, the client sends the response to the challenge. The server then verifies this response with the domain controller by giving it the challenge and the response that the client sent.
Without knowing the challenge you can't send the response, which is why two requests are needed.
Basic authentication is password based, so you can short-circuit this by sending the credentials with the first request, but in my experience this can be a problem for some servers to handle.
More details available here:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa378749(v=vs.85).aspx
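If the endpoint does accept Basic authentication, a pre-emptive Authorization header can be attached to the very first request. A minimal sketch (the URL and credentials are placeholders, and it assumes the server allows Basic):

using System;
using System.Net;
using System.Text;

class PreemptiveBasicSample
{
    static void Main()
    {
        // Hypothetical endpoint; the Basic credentials are sent with the very first request,
        // so no 401 challenge round trip is needed (unlike NTLM, which always requires one).
        var request = (HttpWebRequest)WebRequest.Create("https://internal.example.com/api/values");
        string credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
        request.Headers["Authorization"] = "Basic " + credentials;

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}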
I'm not 100% sure, but I suspect that there is no way around this; it's simply the way HttpWebRequest works.
In the online .NET reference source, in the function DoSubmitRequestProcessing (which is here), you can see this comment just after the start of the function, at line 1731:
// We have a response of some sort, see if we need to resubmit
// it do to authentication, redirection or something
// else, then handle clearing out state and draining out old response.
A little further down (line 1795), with some lines removed for brevity:
if (resubmit)
{
if (CacheProtocol != null && _HttpResponse != null) CacheProtocol.Reset();
ClearRequestForResubmit(ntlmFollowupRequest);
...
}
And in ClearRequestForResubmit line 5891:
// We're uploading and need to resubmit for Authentication or Redirect.
and then (Line 5923):
// The second NTLM request is required to use the same connection, don't close it
if (ntlmFollowupRequest) {....}
To my (admittedly n00bish) eyes these comments seem to indicate that the developers decided to follow the "standard" challenge-response protocol for NTLM/Kerberos and not include any way of sending authentication headers up front.
Setting PreAuthenticate is what you want, which you are doing. The very first request will still do the handshake but for subsequent requests it will automatically send the credentials (based on the URL being used). You can read up on it here: http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.preauthenticate(v=vs.110).aspx.
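As an illustration of the behaviour described above, here is a minimal sketch using HttpClient with an HttpClientHandler (the URLs are placeholders): the first call still performs the challenge handshake, while later calls to the same URI space reuse the authentication.

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class PreAuthenticateSample
{
    static async Task Main()
    {
        var handler = new HttpClientHandler
        {
            PreAuthenticate = true,                                  // reuse credentials after the first handshake
            Credentials = CredentialCache.DefaultNetworkCredentials  // Windows credentials of the current user
        };

        using (var client = new HttpClient(handler))
        {
            // The first request goes through the 401 challenge/response handshake.
            var first = await client.GetAsync("https://internal.example.com/api/orders");

            // Subsequent requests to the same URI space are authenticated without waiting for a new challenge.
            var second = await client.GetAsync("https://internal.example.com/api/orders/42");

            Console.WriteLine($"{first.StatusCode}, {second.StatusCode}");
        }
    }
}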
I'm working on a project for a mobile version of an archaic Online Learning System at my campus. I've been trying for weeks to scrape something from this website, but I need to log in first in order to get it. I have searched everywhere, including HttpWebRequest, CookieAwareWebClient, etc.
My method until now is:
Find the "action" URL in the login form of the site
Send a POST request to that URL
Receive a response containing cookies in the Headers["Set-Cookie"]
Create a new HttpWebRequest with the URL to the content (that requires being logged in first).
Copy the Set-Cookie headers into that request.
Run it (but it fails)
I have also tried using a CookieCollection in CookieAwareWebClient but it didn't work either.
How do I do it properly? Are cookies in an HttpWebRequest located only in the Headers, or elsewhere in the HTTP packets? Where does the CookieCollection live, and is it included in the next request?
Thanks
You need to use a CookieContainer. That will process and hold the cookies for you between HttpWebRequest objects:
var cookieJar = new CookieContainer();
var loginWebRequest = WebRequest.Create(loginUrl) as HttpWebRequest;
loginWebRequest.CookieContainer = cookieJar;
// Execute the Web Request
var authRequiredWebRequest = WebRequest.Create(protectedUrl) as HttpWebRequest;
authRequiredWebRequest.CookieContainer = cookieJar;
// Execute the next request
// It will have the auth cookie set appropriately
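Putting that together with the login flow from the question, a minimal sketch might look like this (the URLs and form field names are placeholders; use the names from the site's actual login form):

using System;
using System.IO;
using System.Net;
using System.Text;

class CookieLoginSample
{
    static void Main()
    {
        var cookieJar = new CookieContainer();

        // 1. POST the login form to the form's "action" URL; the Set-Cookie headers land in the container automatically.
        var loginRequest = (HttpWebRequest)WebRequest.Create("https://lms.example.edu/login");
        loginRequest.Method = "POST";
        loginRequest.ContentType = "application/x-www-form-urlencoded";
        loginRequest.CookieContainer = cookieJar;
        byte[] body = Encoding.UTF8.GetBytes("username=myUser&password=myPassword");
        using (var stream = loginRequest.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }
        using (loginRequest.GetResponse()) { }

        // 2. Request the protected page with the same container; the session cookie is sent automatically.
        var pageRequest = (HttpWebRequest)WebRequest.Create("https://lms.example.edu/courses");
        pageRequest.CookieContainer = cookieJar;
        using (var response = (HttpWebResponse)pageRequest.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}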
I have an Office plugin that connects to a service using HttpWebRequest.
Inside a domain I pass CredentialCache.DefaultNetworkCredentials so all is fine.
Outside a domain the user needs to provide username, domain and password.
This doesn't work at the moment.
Part of the code:
CookieContainer cookies = new CookieContainer();
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = WebRequestMethods.Http.Post;
request.AllowAutoRedirect = true;
request.CookieContainer = cookies; // provide session cookie to handle redirects of login controller of the webservice
if (isWindowAuthentication) // isWindowAuthentication is set earlier by config
{
if (Common.UserName.Length > 0)
{
string[] domainuser;
string username;
string domain;
if (Common.UserName.Contains("#"))
{
domainuser = Common.UserName.Split('#');
username = domainuser[0];
domain = domainuser[1];
}
else
{
domainuser = Common.UserName.Split('\\');
username = domainuser[1];
domain = domainuser[0];
}
NetworkCredential nc = new NetworkCredential(username, Common.Password, domain);
CredentialCache cache = new CredentialCache();
cache.Add(request.RequestUri, "NTLM", nc);
request.Credentials = cache;
}
else
{
request.Credentials = CredentialCache.DefaultNetworkCredentials;
}
}
Later on I do the request with request.GetResponse().
If I use CredentialCache.DefaultNetworkCredentials then everything works fine.
The moment I switch to my own new NetworkCredential() the authentication fails.
I checked the Apache logs (it is Apache 2.2 using the SSPI module).
When it succeeds, the first request redirects to the login controller, then the login controller requests credentials; they are passed and it works (redirect to the target site).
Log 1 (works):
192.168.14.9 - - [25/Oct/2012:11:35:35 +0200] "POST /ror/ioi/start?document%5Bguid%5D=%7Be3d8f1de-10f2-4493-a0c0-97c2acb034e6%7D HTTP/1.1" 302 202
192.168.14.9 - - [25/Oct/2012:11:35:35 +0200] "GET /ror_auth/login?ror_referer=%2Fror%2Fioi%2Fstart%3Fdocument%255Bguid%255D%3D%257Be3d8f1de-10f2-4493-a0c0-97c2acb034e6%257D HTTP/1.1" 401 401
192.168.14.9 - - [25/Oct/2012:11:35:35 +0200] "GET /ror_auth/login?ror_referer=%2Fror%2Fioi%2Fstart%3Fdocument%255Bguid%255D%3D%257Be3d8f1de-10f2-4493-a0c0-97c2acb034e6%257D HTTP/1.1" 401 401
192.168.14.9 - rausch [25/Oct/2012:11:35:35 +0200] "GET /ror_auth/login?ror_referer=%2Fror%2Fioi%2Fstart%3Fdocument%255Bguid%255D%3D%257Be3d8f1de-10f2-4493-a0c0-97c2acb034e6%257D HTTP/1.1" 302 156
Using my own credentials results in Log 2 (does not work):
192.168.14.9 - - [25/Oct/2012:12:05:23 +0200] "POST /ror/ioi/start?document%5Bguid%5D=%7B6ac54e8a-19f1-4ccd-9684-8d864dd9ccf7%7D HTTP/1.1" 302 202
192.168.14.9 - - [25/Oct/2012:12:05:23 +0200] "GET /ror_auth/login?ror_referer=%2Fror%2Fioi%2Fstart%3Fdocument%255Bguid%255D%3D%257B6ac54e8a-19f1-4ccd-9684-8d864dd9ccf7%257D HTTP/1.1" 401 401
What I don't understand is that when I inspect e.g. CredentialCache.DefaultNetworkCredentials.UserName, it is empty.
Does anyone know what to do and how to set my own credentials correctly so that the authentication works as expected?
Finally, after a lot of testing and investigation and many resources on Stack Overflow, I found out what is going on.
The problem seems to be that HttpWebRequest doesn't handle the authentication when some parts of the website request credentials and some don't.
Background:
Our site has its own session management and redirects to a login controller when no valid session is available. Only this login controller is set to NTLM authentication.
We did this because we want a web site without NTLM auth everywhere (no 401/302 request loops in IE!) that only validates once (and we handle authentication on a different URL to prevent the problem that IE stops posting data to non-authenticated sites => see http://support.microsoft.com/?id=251404).
Solution:
I normally send a request to my target page and the web server redirects, authenticates and redirects back to the target. Since HttpWebRequest doesn't handle this for some reason when I have my own credentials set (see the code in my question above), I changed the code to authenticate once against my login controller and store the session in a cookie container.
For all following requests I don't authenticate at all anymore. I add the cookie container and my server gets a valid session, so I don't have to authenticate again. A side effect is better performance.
Another tricky thing was that I not only use HttpWebRequest, but also a WebBrowser control.
For that I found the solution to add my own session cookie here: Use cookies from CookieContainer in WebBrowser (thanks to Aaron, who saved me a lot of trouble as well).
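A minimal sketch of that approach (the host, credentials and domain are placeholders; the paths are taken from the logs above):

using System;
using System.Net;

class LoginControllerSessionSample
{
    static void Main()
    {
        var cookies = new CookieContainer();

        // 1. Authenticate once directly against the NTLM-protected login controller.
        var loginRequest = (HttpWebRequest)WebRequest.Create("https://server.example.com/ror_auth/login");
        loginRequest.CookieContainer = cookies;
        loginRequest.Credentials = new NetworkCredential("username", "password", "DOMAIN");
        using (loginRequest.GetResponse()) { }      // the session cookie is now stored in the container

        // 2. All later requests reuse the cookie container and need no credentials at all.
        var dataRequest = (HttpWebRequest)WebRequest.Create("https://server.example.com/ror/ioi/start");
        dataRequest.Method = "POST";
        dataRequest.CookieContainer = cookies;
        dataRequest.ContentLength = 0;
        using (var response = (HttpWebResponse)dataRequest.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}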
I need to make an HTTP request for JSON data on a SharePoint site. When accessing the data through the browser, I am first prompted for a username and password. After entering my credentials I am allowed to see the JSON. I am new to C# and am wondering the best way to go about forming my request, retrieving the response, and parsing the JSON. I have worked with JSON requests in Java before, but never had to deal with SharePoint credentials. Thanks in advance.
Here you go:
private void login_Click(object sender, EventArgs e)
{
string username = uname.Text;
string password = pword.Text;
string url = "THE SITE URL HERE";
var req = (HttpWebRequest)WebRequest.Create(url);
req.Credentials = new NetworkCredential(username, password);
var response = req.GetResponse();
//Do Stuff with response
}
You should be able to create a WebRequest and either pass in your credentials there, or if that doesn't work, check out this answer about setting your SharePoint credentials.
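To cover the parsing part of the question as well, here is a minimal sketch that reads the response body and deserializes it with Json.NET (the URL, credentials and the Item model are placeholders; the real JSON shape depends on the SharePoint endpoint):

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using Newtonsoft.Json;

class SharePointJsonSample
{
    // Hypothetical shape of the JSON returned by the endpoint.
    class Item
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://sharepoint.example.com/site/_api/web/lists");
        request.Credentials = new NetworkCredential("username", "password", "DOMAIN");
        request.Accept = "application/json";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string json = reader.ReadToEnd();
            var items = JsonConvert.DeserializeObject<List<Item>>(json);
            Console.WriteLine(items.Count);
        }
    }
}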
You should be fine making an HttpWebRequest and setting the ClientCertificates property to include your certificate from its .pfx file, unless you have some other permissions issue going on, because you don't need a ServerCertificateValidationCallback handler in order to establish an SSL connection to a remote server as a client. The handler is only necessary if you wish to intercept, inspect, and/or override the default validation of the remote machine's certificate.
https://stackoverflow.com/questions/5595049/servicepointmanager-servercertificatevalidationcallback-question
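A minimal sketch of that setup (the URL, the .pfx path and the password are placeholders):

using System;
using System.Net;
using System.Security.Cryptography.X509Certificates;

class ClientCertificateSample
{
    static void Main()
    {
        // Load the client certificate from its .pfx file.
        var certificate = new X509Certificate2(@"C:\certs\client.pfx", "pfxPassword");

        var request = (HttpWebRequest)WebRequest.Create("https://secure.example.com/api/resource");
        request.ClientCertificates.Add(certificate);    // presented during the TLS handshake

        // No ServerCertificateValidationCallback is needed just to act as an SSL client;
        // the default validation of the server's certificate applies.
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}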