C# WebProxy not working with HTTPS - c#

I have a client and a Windows service.
Both must talk to a web server through a proxy.
Without a proxy everything works fine.
With the system proxy settings (the proxy settings from IE, i.e. WebRequest.DefaultWebProxy) the client works fine, but the service does not see those proxy settings (netsh winhttp set proxy does not help either). So the proxy server itself works OK.
When I try to use manual settings:
var _proxy = new WebProxy(Server + ":" + Port, true) { Credentials = null };
var request = (HttpWebRequest)WebRequest.Create(target);
request.ContentType = Constants.ContentType; // Default content type
request.UserAgent = _userAgentHeader;
request.Method = "POST";
request.Proxy = _proxy;
As I can see in the proxy logs, the proxy is used for HTTP requests but not for HTTPS: all HTTPS requests go directly to the server.
How can I fix it?
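One way to narrow this down is to ask the WebProxy instance which proxy it would select for an HTTPS URI. This is only a diagnostic sketch; proxy.example.com, the port and the target URL are placeholder values, not taken from the question:
var httpsTarget = new Uri("https://example.com/api");
// Placeholder proxy address; note the explicit http:// scheme on the proxy URI.
var proxy = new WebProxy(new Uri("http://proxy.example.com:8080"))
{
    // The question passes 'true' as the second constructor argument,
    // which sets BypassProxyOnLocal and makes local addresses skip the proxy.
    BypassProxyOnLocal = false,
    Credentials = null
};
// If IsBypassed returns true or GetProxy returns the target itself,
// the request will go directly and never reach the proxy.
Console.WriteLine(proxy.IsBypassed(httpsTarget));
Console.WriteLine(proxy.GetProxy(httpsTarget));
If the proxy is selected here but the HTTPS traffic still goes direct in practice, it is worth confirming with a network capture which component is opening the direct connection.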

Related

403 Forbidden with WebClient

The company I work for has a limited internet connection and we use a proxy (example: 10.10.10.10:8080) to access some restricted resources.
I can use the API in Postman (putting the proxy in the Postman settings), but when I put it in the C# WebClient code it gives me a 403 Forbidden error.
I only need the sensorData field, but I split it into var data and var data2 to understand where the problem was. It gives me the error at the var data = ... line.
Uri uri = new Uri("https://XXXXXXXX/api/DatasourceData/DatasourceDataHistoryBySerialNumber/");
Token token = GetToken(tokenAPI);
using (WebClient client = new WebClient())
{
    try
    {
        client.Proxy = new WebProxy("10.10.10.10", 8080);
        client.Headers.Add("Authorization", "Bearer " + token.AccessToken);
        client.QueryString.Add("serialNumbersDatasource", "I2001258");
        client.QueryString.Add("startDate", string.Format("{0:s}", "2019-12-01"));
        client.QueryString.Add("endDate", string.Format("{0:s}", DateTime.Now));
        client.QueryString.Add("isFilterDatesByDataDate", "false");
        var data = client.DownloadData(uri);
        var data2 = Encoding.UTF8.GetString(data);
        sensorData = JsonConvert.DeserializeObject<List<Sensor>>(Encoding.UTF8.GetString(client.DownloadData(uri)))[0];
    }
    catch (WebException)
    {
        // the 403 surfaces here as a WebException
    }
}
The problem seems to be this line:
client.Headers.Add("Authorization", "Bearer " + "tokenTest");
Here you are adding the Authorization header with the literal value Bearer tokenTest,
so the 403 Forbidden is returned by the service you are addressing, not by the proxy.
change to
client.Headers.Add("Authorization", "Bearer " + tokenTest);
and check that tokenTest has a valid value.
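A quick sanity check (a sketch reusing the names from the question; GetToken and tokenAPI come from the original code) is to confirm that the token you put in the header is the same one that works in Postman:
Token token = GetToken(tokenAPI);
if (string.IsNullOrEmpty(token.AccessToken))
    throw new InvalidOperationException("No access token was returned.");
// Compare this output with the Authorization header that works in Postman.
Console.WriteLine("Authorization: Bearer " + token.AccessToken);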
Check to see if you need any additional properties on the proxy. You may possibly need to enable:
UseDefaultCredentials (Boolean): true if the default credentials are used; otherwise, false. The default value is false. (A short sketch follows the property list below.)
Also check the full URL and query string you are producing: look at the outgoing request (in the debugger or through Fiddler) and make sure it all lines up - URL, query string, headers, etc.
From the docs:
Address: Gets or sets the address of the proxy server.
BypassArrayList: Gets a list of addresses that do not use the proxy server.
BypassList: Gets or sets an array of addresses that do not use the proxy server.
BypassProxyOnLocal: Gets or sets a value that indicates whether to bypass the proxy server for local addresses.
Credentials: Gets or sets the credentials to submit to the proxy server for authentication.
UseDefaultCredentials: Gets or sets a Boolean value that controls whether the DefaultCredentials are sent with requests.
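For example, a minimal sketch of enabling UseDefaultCredentials on the proxy, using the proxy address from the question (the target URL is a placeholder):
var proxy = new WebProxy("10.10.10.10", 8080)
{
    // Send the current Windows credentials to the proxy if it requires authentication.
    UseDefaultCredentials = true
};
using (var client = new WebClient())
{
    client.Proxy = proxy;
    // Placeholder URL; substitute the real API endpoint.
    var payload = client.DownloadString("https://example.com/api/resource");
}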
Probably a problem with the authorization header.
Is the token valid? Does it work with the same token in Postman?
I bet the API can't validate the token and gives you no authorization to the resources. This is what a 403 would mean (but I don't know what the API programmer actually intended by returning 403).
Do you have access to the API's source code?
Is the token really the string "tokentest", and does that work in Postman?
I would suggest going with xNet.dll instead of WebClient, because the xNet library is considered better for proxies and web requests.
var request = new HttpRequest();
request.UserAgent = Http.ChromeUserAgent();
request.Proxy = Socks5ProxyClient.Parse("10.10.10.10:8080"); // can use SOCKS4/5 or HTTP proxy clients
Based on this
Try adding a User-Agent header:
client.Headers.Add("User-Agent", "PostmanRuntime/7.26.1");
In my case I did not specify the security protocol. Add this line of code before running any WebClient requests.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
original answer: How to specify SSL protocol to use for WebClient class
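If other endpoints still rely on older protocols, a variant (assuming you want to keep the protocols that are already enabled) is to add TLS 1.2 instead of replacing the whole value:
// SecurityProtocolType is a flags enum, so this adds TLS 1.2 to the protocols already enabled.
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls12;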

WebClient.DownloadData(uri) HTTP NTLM Authentication fails with 401 using correct credentials

I have the following code:
using (WebClient wcli = new WebClient())
{
    wcli.UseDefaultCredentials = true;
    wcli.Credentials = new NetworkCredential("RS_Username", "RS_Password", "RS_Domain");
    byte[] buff = wcli.DownloadData(www);
    HttpContext.Current.Response.ClearContent();
    HttpContext.Current.Response.AddHeader("Content-Disposition", "attachment; filename=\"" + reportName + ".pdf\"");
    HttpContext.Current.Response.ContentType = "application/pdf";
    HttpContext.Current.Response.BinaryWrite(buff);
    HttpContext.Current.Response.Flush();
    HttpContext.Current.Response.End();
}
I use it to get the result of a report in SSRS 2014 and serve it as a document download from my web application (.NET 3.5, hosted on Windows 8.1, IIS 8.5).
The problem is that I keep getting 401 Unauthorized when calling wcli.DownloadData(www). (Note that in any browser the reports work fine with the same credentials.)
I have done a TCP dump and found that the NTLM handshake is not occurring:
C -> S: GET Request
C <- S: '401 Unauthorized' response with header
'WWW-Authenticate: NTLM'
Nothing else
Another application hosted on the same machine but using .Net 4.5 uses the same code without any problem.
I believe it has to be due to a missing or wrong configuration, but I have not managed to figure out which one.
Any ideas?
UPDATE
What I forgot to mention is that both web applications (hosted on the same server and IIS) connect to the same Reporting Services server, but to different folders.
I have had the same issue; in my case (still .NET 4.5) what worked was to use:
WebClient wc = new WebClient();
wc.UseDefaultCredentials = false;
string host = "http://localhost"; //this comes from a function in my code
var myCredentialCache = new System.Net.CredentialCache();
myCredentialCache.Add(new Uri(host + "/"), "NTLM", new System.Net.NetworkCredential(accessUser, accessPassword, domain));
wc.Credentials = myCredentialCache;
byte[] result = wc.DownloadData(www);
The main difference for me was using a CredentialCache and setting the host URI, and also passing the domain in the NetworkCredential.
You're telling the WebClient to use the default credentials (wcli.UseDefaultCredentials = true;), but you're also passing in a NetworkCredential. Perhaps the other application that works, if they're both using the same code, works because its user has access.
using (WebClient wcli = new WebClient())
{
wcli.UseDefaultCredentials = true;
wcli.Credentials = new NetworkCredential("RS_Username", "RS_Password", "RS_Domain");
}
However, that may not be your bug. I came across this article: http://www.benjaminathawes.com/2010/10/14/ntlms-dependency-on-http-keep-alives-another-cause-of-the-dreaded-401-1-error/.
It mentions the need for HTTP keep-alive to keep the TCP connection open for the NTLM handshake. The fact that the handshake just dies leads me to think that this could also be the issue. I would verify that keep-alive is indeed on; a sketch of forcing it through WebClient follows below.
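WebClient does not expose KeepAlive directly, but you can override GetWebRequest to force it on the underlying HttpWebRequest. This is only a sketch of that idea (the class name and property choices are mine, not the poster's code):
using System;
using System.Net;

// Hypothetical helper: a WebClient that forces HTTP keep-alive so the
// multi-leg NTLM handshake can complete on a single TCP connection.
public class KeepAliveWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        HttpWebRequest httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            httpRequest.KeepAlive = true;
            httpRequest.PreAuthenticate = true;
        }
        return request;
    }
}
It can then be used in place of WebClient in the code above, with the same CredentialCache or NetworkCredential assignment.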

HTTP request from a C# desktop application to a Siteminder-protected server

I have developed a C# desktop application which makes HTTPS requests to the customers' servers (usually Documentum/SharePoint/Alfresco/NemakiWare/etc HTTPS-based servers).
Several customers have asked us to support their servers which are protected by CA SSO (new name of Siteminder).
QUESTION: What do I need to do to allow my application to send HTTPS requests (and receive responses) with CA SSO-protected servers?
I have developed NTLM-SSO support for our C# desktop application and it works well, but I am not sure about how to proceed for CA SSO.
I have asked the same question on the CA forum, but like most questions there it remains unanswered.
To authenticate with CA SSO and then connect to the desired URL we need to access a protected resource on a web server configured to use CA SSO authentication:
The user requests a resource on the server, using an HTTP request.
The request is received by the web server and is intercepted by the CA SSO web agent.
The web agent determines whether or not the resource is protected, and if so, gathers the user’s credentials and passes them to the Policy server.
The Policy server authenticates the user and verifies whether or not the authenticated user is authorized for the requested resource, based on rules and policies contained in the Policy store.
After the user is authenticated and authorized, the Policy server grants access to the protected resources.
This is accomplished with the following steps:
Open a connection (an HTTP request in this case) to the URI of the protected resource. Since the request has not yet been authenticated, the CA SSO agent will issue a redirect to a login page. In the code, AllowAutoRedirect is set to false. This is important, as the redirect URL will be required for the subsequent POST of login data in step 3 below. If AllowAutoRedirect were true, the response would not include a Location header, and the subsequent POST would be made to the original URL, which would then redirect to the login page again. However, because that redirect happens between the client and the server, any POST data carried in the payload of the request in step 3 would be lost during the redirect.
Dim request As HttpWebRequest
Dim response As HttpWebResponse
Dim url As String = PROTECTED_URL
request = WebRequest.Create(url)
request.AllowAutoRedirect = False
response = request.GetResponse
' make sure we have a valid response
If response.StatusCode <> HttpStatusCode.Found Then
    Throw New InvalidProgramException
End If
' get the login page
url = response.Headers("Location")
request = WebRequest.Create(url)
request.AllowAutoRedirect = False
response = request.GetResponse
The next step involves creating an HTTPS request that POSTs all the form data, including userid and password, back to the server. The purpose of an authentication agent is to verify a user’s identity by validating their userid and password. Thus, their URLs naturally use SSL (secure sockets layer) and are encrypted for us, so we do not require further encryption in our program. However, the formatting of the POST data is interesting inasmuch as there are two alternatives. The sample program uses the simpler approach of setting the content type to application/x-www-form-urlencoded. Here the POST data is formatted like a query string and sent as part of the next request.
Dim postData As String
postData = ""
For Each inputName As String In tags.Keys
    If inputName.Substring(0, 2).ToLower = "sm" Then
        postData &= inputName & "=" & _
            HttpUtility.UrlEncode(tags(inputName)) & "&"
    End If
Next
postData += "postpreservationdata=&"
postData += "USER=" + HttpUtility.UrlEncode(USERNAME) & "&"
postData += "PASSWORD=" + HttpUtility.UrlEncode(PASSWORD)
request = WebRequest.Create(url)
cookies = New CookieContainer
request.CookieContainer = cookies
request.ContentType = FORM_CONTENT_TYPE
request.ContentLength = postData.Length
request.Method = POST_METHOD
request.AllowAutoRedirect = False ' Important
Dim sw As StreamWriter = New StreamWriter(request.GetRequestStream())
sw.Write(postData)
sw.Flush()
sw.Close()
response = request.GetResponse
Same idea as Mohit's answer, but it can be done with much simpler code:
//Make initial request for SM to give you some cookies and the authentication URI
RestClient client = new RestClient("http://theResourceDomain/myApp");
client.CookieContainer = new CookieContainer();
IRestResponse response = client.Get(new RestRequest("someProduct/orders"));
//Now add credentials.
client.Authenticator = new HttpBasicAuthenticator("username", "password");
//Get resource from the SiteMinder URI which will redirect back to the API URI upon authentication.
response = client.Get(new RestRequest(response.ResponseUri));
Although this uses RestSharp, it can be easily replicated using HttpClient or even HttpWebRequest.
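For instance, a rough HttpClient equivalent might look like the sketch below. The URL and credentials are the placeholders from the RestSharp snippet, and the exact redirect behaviour depends on the SiteMinder configuration:
using System;
using System.Net;
using System.Net.Http;

class SiteMinderSketch
{
    static void Main()
    {
        var handler = new HttpClientHandler
        {
            CookieContainer = new CookieContainer(),
            // Basic credentials for the SiteMinder login, as in the RestSharp example.
            Credentials = new NetworkCredential("username", "password")
        };

        using (var client = new HttpClient(handler))
        {
            // First request: SiteMinder sets its cookies and redirects to the authentication URI.
            HttpResponseMessage first = client.GetAsync("http://theResourceDomain/myApp/someProduct/orders").Result;

            // Second request against the URI we ended up on; after authentication
            // SiteMinder redirects back to the API resource.
            HttpResponseMessage second = client.GetAsync(first.RequestMessage.RequestUri).Result;
            Console.WriteLine(second.StatusCode);
        }
    }
}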

WebRequest with Proxy to whatsmyip.net shows my real ip

I'm trying to proxy my requests, but it seems that the proxy setting is ignored.
I'm using the following code:
var req = (HttpWebRequest)WebRequest.Create("http://whatsmyip.net/");
req.Proxy = new WebProxy("195.128.253.243", 8080) { BypassProxyOnLocal = false };
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
var html = new StreamReader(req.GetResponse().GetResponseStream()).ReadToEnd();
The proxy is just a random free proxy from here.
The result always contains my real ip instead of the proxy ip.
When I'm surfing to that website using hidemyass or other alternatives, the ip changes as expected.
Does anyone have an idea of what I am doing wrong?
I just tried your code (without the HttpWebRequest cast) bouncing off my local tor server and it works as expected. Have you tried the proxy directly within IE?
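One thing worth checking is whether the free proxy is transparent: many free proxies do relay the request but append your real address in an X-Forwarded-For or Via header, so an IP-echo site still shows it. A quick check against a header-echo endpoint (httpbin.org is used here purely as an example):
var req = (HttpWebRequest)WebRequest.Create("http://httpbin.org/headers");
req.Proxy = new WebProxy("195.128.253.243", 8080) { BypassProxyOnLocal = false };
using (var resp = req.GetResponse())
using (var reader = new StreamReader(resp.GetResponseStream()))
{
    // The echoed request headers reveal whether the proxy added X-Forwarded-For / Via
    // with your real IP, i.e. whether it is a transparent proxy.
    Console.WriteLine(reader.ReadToEnd());
}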

C# WebRequest returning 401

There is a web file within my intranet that my computer is authorized to read and write. I can open up IE or Firefox and view the file by typing in the URL. I need to write a C# desktop app that reads/writes to that file. Even though my computer has access, all my attempts so far result in 401 Unauthorized errors. The program needs to work from any computer whose account has been authorized, so I cannot hard-code any username/password. I've never done anything like this, but I was able to scrounge the following from several sites:
WebRequest objRequest = HttpWebRequest.Create("https://site.com/file");
objRequest.Credentials = CredentialCache.DefaultNetworkCredentials;
objRequest.Proxy = WebRequest.DefaultWebProxy;
objRequest.Proxy.Credentials = CredentialCache.DefaultCredentials;
WebResponse objResponse = (WebResponse)objRequest.GetResponse();
using (StreamReader sr = new StreamReader(objResponse.GetResponseStream()))
{
    string str = sr.ReadToEnd();
    sr.Close();
    // ... do stuff with str
}
If it matters, I'm working in .NET 2.0
I just ran into the same problem; it all started working when I added:
objRequest.UseDefaultCredentials = true;
Did you try using Fiddler to inspect the actual request that was sent to the server?
You can also check if the server requires a client certificate to allow the connection.
Since you are accessing an intranet server, do you really need to set the proxy part? I mean most of the time, the proxy is configured to ignore local addresses anyway.
This won't work if NTLM credentials are required:
objRequest.Credentials = CredentialCache.DefaultNetworkCredentials;
You need to pass in the actual credentials like:
NetworkCredential networkCredential = new NetworkCredential(UserName, Password, Domain);
CredentialCache credCache = new CredentialCache();
credCache.Add(new Uri(url), "NTLM", networkCredential);
objRequest.Credentials = credCache;
