I'm working on a .NET app that calls 3rd party web services over the internet. The services do not use SOAP, so we manually construct an XML request document, send it to the service via HTTP, and retrieve an XML response.
Our code is a Windows service that is run in the context of a normal Windows domain account, and sits behind a proxy server (Microsoft ISA Server) configured to require NTLM authentication. The account running our service has permission to access the internet through the proxy server.
The code looks like this:
// Create the request object.
HttpWebRequest request = (HttpWebRequest) WebRequest.Create(url);
request.Method = "POST";
// Configure for authenticating proxy server requiring Windows domain credentials.
request.Proxy = new WebProxy(proxyAddress) { UseDefaultCredentials = true };
// Set other required headers.
request.Accept = acceptableMimeType;
request.Headers.Add(HttpRequestHeader.AcceptCharset, acceptableCharset);
request.Headers.Add(HttpRequestHeader.AcceptEncoding, "none");
request.Headers.Add(HttpRequestHeader.AcceptLanguage, "en-gb");
request.Headers.Add(HttpRequestHeader.CacheControl, "no-store");
request.Headers.Add(HttpRequestHeader.ContentEncoding, "none");
request.Headers.Add(HttpRequestHeader.ContentLanguage, "en-gb");
request.ContentType = requestMimeType;
request.ContentLength = requestBytes.Length;
// Make the method call.
using(Stream stream = request.GetRequestStream()) {
    stream.Write(requestBytes, 0, requestBytes.Length);
}
HttpWebResponse response = (HttpWebResponse) request.GetResponse();
// Extract the data from the response without relying on the HTTP Content-Length header
// (we cannot trust all providers to set it correctly).
const int bufferSize = 1024 * 64;
List<byte> responseBytes = new List<byte>();
using(Stream stream = new BufferedStream(response.GetResponseStream(), bufferSize)) {
    int value;
    while((value = stream.ReadByte()) != -1) {
        responseBytes.Add((byte) value);
    }
}
This works fine if the proxy server is turned off, or the URL has been whitelisted as not requiring authentication, but as soon as authentication is active, it always fails with an HTTP 407 error.
I put the above code in a test harness, and tried every method I could think of for configuring the request.Proxy property, without success.
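For what it's worth, the variants I tried look roughly like this (the credentials shown are placeholders, not our real account):

// Variant 1: rely on the default credentials of the account running the service.
request.Proxy = new WebProxy(proxyAddress) { UseDefaultCredentials = true };

// Variant 2: use the default credential cache explicitly.
request.Proxy = new WebProxy(proxyAddress) { Credentials = CredentialCache.DefaultCredentials };

// Variant 3: supply explicit domain credentials.
request.Proxy = new WebProxy(proxyAddress) { Credentials = new NetworkCredential("user", "password", "DOMAIN") };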
I then noticed that all the 3rd party web services that we have to call are HTTPS. When I tried accessing them as HTTP instead, the proxy authentication started working. Is there some extra hoop I have to jump through to get proxy authentication and HTTPS to play nicely?
PS: The same problems occur with the open source SmoothWall proxy server, so I can't just write it off as a bug in ISA Server.
PPS: I'm aware that you can configure proxy settings in app.config, but (a) doing it in code shouldn't make any difference, and (b) the application design requires that we read the proxy settings from a database at runtime.
Have you tried setting the proxy in the app.config?
To disable the proxy, add the following configuration to your App.config file:
<system.net>
  <defaultProxy enabled="false" useDefaultCredentials="false">
    <proxy/>
    <bypasslist/>
    <module/>
  </defaultProxy>
</system.net>
To enable the proxy and use the default proxy settings (as specified in IE), add this configuration to your App.config:
<system.net>
  <defaultProxy enabled="true" useDefaultCredentials="true">
    <proxy/>
    <bypasslist/>
    <module/>
  </defaultProxy>
</system.net>
I had a similar situation.
Did you notice that it worked when you had accessed the internet shortly before running the code, but that if you had not accessed the internet for ages (20 minutes in my case) you got the error?
Have you tried setting the proxy credentials directly?
//setup the proxy
request.Proxy = new WebProxy("proxyIp", 8080);
request.Proxy.Credentials = CredentialCache.DefaultCredentials;
I hope this fixes your issue too
I think I will have to write off this question. The code I originally posted does appear to work sometimes. Our proxy server is extremely unreliable; one minute it will block an internet connection from any software, and the next it will allow it. The IT guys seem powerless to do anything about it, and we (everyone outside the IT department) have no authority to make changes to the network infrastructure.
If anyone has any ideas on how to "harden" my code to compensate for an unreliable proxy server, then I'd be interested to hear them. :-)
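The best idea I've had so far is a simple retry wrapper around the call, sketched below (the attempt count and back-off are arbitrary; the factory delegate has to rebuild the request each time, because an HttpWebRequest cannot be reused once it has failed):

// Retry the whole request a few times before giving up, since the proxy
// fails intermittently rather than consistently. createRequest should build
// the request and write the request body.
private static HttpWebResponse GetResponseWithRetry(Func<HttpWebRequest> createRequest, int maxAttempts) {
    for(int attempt = 1; ; attempt++) {
        try {
            return (HttpWebResponse) createRequest().GetResponse();
        }
        catch(WebException) {
            if(attempt >= maxAttempts) {
                throw;
            }
            // Back off briefly before retrying.
            Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
        }
    }
}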
Is there something wrong with your proxy server's certificate? If your service can't establish an HTTPS connection, it will throw an error.
Hopefully someone can help with this problem. Recently our machines were updated with KB4344167, which includes security updates for .NET 4.7.1. Unfortunately this update has broken our code for a WebRequest. When we run the code below we get this error:
The request was aborted: Could not create SSL/TLS secure channel.
// Create a request for the URL.
WebRequest request = WebRequest.Create(url);
//specify to use TLS 1.2 as default connection
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
request.Timeout = int.Parse(configmanager.GetSetting("Webtimeout"));
// Set proxy
request.Proxy = WebRequest.DefaultWebProxy;
request.Proxy.Credentials = CredentialCache.DefaultCredentials;
// Define a cache policy for this request only.
HttpRequestCachePolicy noCachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);
request.CachePolicy = noCachePolicy;
ServicePointManager.ServerCertificateValidationCallback = (s, cert, chain, ssl) => true;
// Get the response.
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
When the security update is uninstalled from the machine, the code executes fine. Are we missing something in the code above? That's about the only thing I can think of.
Any help is greatly appreciated!
@Damien_The_Unbeliever had the correct answer. Ultimately the problem was the order of the ServicePointManager line and the WebRequest.Create call. Reversing those lines, so that the ServicePointManager is configured before the WebRequest.Create, fixed the issue. I still don't know why adding the ServicePointManager after the Create fixed our original issue when our server moved to TLS 1.2, but we're not going to worry about that now.
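For anyone else who hits this, the fix is just a reordering; a trimmed sketch of the same snippet:

// Set the protocol list first...
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;

// ...then create and configure the request.
WebRequest request = WebRequest.Create(url);
request.Timeout = int.Parse(configmanager.GetSetting("Webtimeout"));
request.Proxy = WebRequest.DefaultWebProxy;
request.Proxy.Credentials = CredentialCache.DefaultCredentials;

// Get the response.
HttpWebResponse response = (HttpWebResponse)request.GetResponse();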
I ran into something similar. It appears MS may have broken something in their attempt to only enable TLS 1.2. https://support.microsoft.com/en-us/help/4458166/applications-that-rely-on-tls-1-2-strong-encryption-experience-connect
So far, I've tried adding the suggested config to the app.config and it worked like a charm. No more SSL/TLS errors.
<runtime>
  <AppContextSwitchOverrides value="Switch.System.Net.DontEnableSchUseStrongCrypto=false" />
</runtime>
NOTE: we found this on servers that are selectively patched, i.e. they don't yet have the MS fix. Our development machines never saw the problem.
I have Fiddler running on port 8888
Web.config:
<system.net>
  <defaultProxy enabled="true" useDefaultCredentials="true">
    <proxy bypassonlocal="False" proxyaddress="http://localhost:8888" usesystemdefault="False" autoDetect="False" />
  </defaultProxy>
</system.net>
App code (specifies credentials for the API controller it's talking to):
var wc = new WebClient();
wc.Proxy = new WebProxy(new Uri("http://localhost:8888"), false);
wc.Credentials = new NetworkCredential("myuser", "mypass");
var response = wc.UploadString("http://localhost:11026/api/mycontroller/mymethod", "POST", request);
I cannot for the life of me get it to communicate via Fiddler so I can debug requests. Am I doing something wrong?
UPDATE
This information is supplied by the Fiddler help, but having used localhost.fiddler in both web.config and the app code, the traffic is still not captured by Fiddler (and when Fiddler is closed it doesn't cause a connection-failed error).
Solution 2: Use http://ipv4.fiddler
Use http://ipv4.fiddler to hit localhost on the IPv4 adapter. This works especially well with the Visual Studio test webserver (codename: Cassini) because the test server only listens on the IPv4 loopback adapter. Use http://ipv6.fiddler to hit localhost on the IPv6 adapter, or use http://localhost.fiddler to hit localhost using "localhost" in the Host header. This last option should work best with IIS Express.
Rather than trying to send it via the proxy, send it via one of the Fiddler loopback aliases, such as http://ipv4.fiddler:11026 ...
Details are given here:
http://docs.telerik.com/fiddler/observe-traffic/troubleshooting/notraffictolocalhost
Shows how to set it up and capture traffic. It is the method I've used when I want to see traffic being passed between local web APIs and my MVC project.
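As a rough sketch against the code in the question (the port and endpoint are the ones posted above; everything else stays the same), it just means swapping the hostname and dropping the explicit proxy:

// Point the request at the Fiddler loopback alias instead of routing it through a proxy.
var wc = new WebClient();
wc.Credentials = new NetworkCredential("myuser", "mypass");
var response = wc.UploadString("http://ipv4.fiddler:11026/api/mycontroller/mymethod", "POST", request);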
I am trying to write C# code which makes a web request against a REST service endpoint used for calculating sales tax within a web application. This is a third party service, and it is secured using SSL. There are two environments, UAT and production. The code that runs the webrequest looks like this:
...
var req = WebRequest.Create(url) as HttpWebRequest;
req.Method = "POST";
req.ContentType = "application/json";
...
using (var webresponse = req.GetResponse())
{
    using (var responseStream = new StreamReader(webresponse.GetResponseStream()))
    {
        var respJson = responseStream.ReadToEnd();
        calcResult = BuildResponse(calcRequest, respJson, consoleWriteRawReqResponse);
    }
}
return calcResult;
This works fine against the UAT environment. But when I run the same code against the production environment, I get the error:
"Could not create SSL/TLS secure channel"
I am able to execute both requests from Postman without issue, without any special modifications.
This led me down the path of investigating this error, and I found many helpful SO posts discussing the topic, including:
The request was aborted: Could not create SSL/TLS secure channel
Could not create SSL/TLS secure channel, despite setting ServerCertificateValidationCallback
These helped by pointing me in the right direction, which was to look at setting the ServicePointManager.SecurityProtocol setting to a different value, and using the ServicePointManager.ServerCertificateValidationCallback to investigate errors.
What I found after playing with these is the following:
The UAT environment call will work with the default setting of Ssl3 | Tls (default for .NET 4.5.2), while the production environment will not.
The production call will work ONLY when I set this setting explicitly to Ssl3.
That code looks like this:
...
var req = WebRequest.Create(url) as HttpWebRequest;
req.Method = "POST";
req.ContentType = "application/json";
...
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
ServicePointManager.ServerCertificateValidationCallback = new RemoteCertificateValidationCallback(CertValidationCallback);
using (var webresponse = req.GetResponse())
{
    using (var responseStream = new StreamReader(webresponse.GetResponseStream()))
    {
        var respJson = responseStream.ReadToEnd();
        calcResult = BuildResponse(calcRequest, respJson, consoleWriteRawReqResponse);
    }
}
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3 | SecurityProtocolType.Tls;
return calcResult;
This is particularly confusing because in looking at the endpoints in a web browser, I can see that they are both secured by the same wildcard certificate and are both using TLS 1.0.
So I would expect that setting ServicePointManager.SecurityProtocol to TLS would work, but it does not.
I really want to avoid setting ServicePointManager.SecurityProtocol explicitly to SSL3 because our application is a web application and has multiple other integration points that communicate over SSL. These are all working fine and I do not want to adversely affect their functionality. Even if I set this setting right before the call, and then change it back right after, I run the risk of hitting concurrency issues since ServicePointManager.SecurityProtocol is static.
I investigated that topic as well, and did not like what I read. There are mentions of using different app domains:
.NET https requests with different security protocols across threads
How to use SSL3 instead of TLS in a particular HttpWebRequest?
But that seems overly complex / hacky to me. Is dealing with creating an app domain really the only solution? Or is this something I simply should not be trying to solve and instead take it up with the owner of the service in question? It is very curious to me that it would work with TLS on one environment / server, but not the other.
EDIT
I did some more playing with this. I changed my client to use the approach outlined quite well in this blog post (using a different app domain to isolate the code that changes the ServicePointManager.SecurityProtocol):
https://bitlush.com/blog/executing-code-in-a-separate-application-domain-using-c-sharp
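In rough outline (my own sketch of that idea, not the blog's code; the class name is illustrative) it looks like this:

// Runs inside a separate AppDomain, so changing the static
// ServicePointManager.SecurityProtocol does not affect the main domain.
public class Ssl3Caller : MarshalByRefObject {
    public string Post(string url, string json) {
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
        var req = (HttpWebRequest)WebRequest.Create(url);
        req.Method = "POST";
        req.ContentType = "application/json";
        using (var writer = new StreamWriter(req.GetRequestStream())) {
            writer.Write(json);
        }
        using (var webresponse = req.GetResponse())
        using (var reader = new StreamReader(webresponse.GetResponseStream())) {
            return reader.ReadToEnd();
        }
    }
}

// Usage from the main AppDomain:
var domain = AppDomain.CreateDomain("Ssl3Domain");
try {
    var caller = (Ssl3Caller)domain.CreateInstanceAndUnwrap(
        typeof(Ssl3Caller).Assembly.FullName, typeof(Ssl3Caller).FullName);
    string respJson = caller.Post(url, requestJson);
} finally {
    AppDomain.Unload(domain);
}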
This actually worked quite well, and could be a fallback solution. But I also learned that the provider of the service in question has a different endpoint (same URL, different port) that is secured using TLS 1.2. Thankfully, by expanding my SecurityProtocol setting like so in the global.asax.cs application start event:
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
I am able to communicate with the service fine in all environments. It also does not affect my existing integrations with other services (CyberSource, for example).
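For context, that is the only change; a sketch of the relevant bit of global.asax.cs (assuming a standard ASP.NET application class):

public class Global : System.Web.HttpApplication {
    protected void Application_Start(object sender, EventArgs e) {
        // Add TLS 1.1/1.2 to whatever protocols are already enabled,
        // rather than replacing the existing set.
        ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
    }
}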
However - now there is a new but related question. Why is it that this call ONLY works if I expand SecurityProtocolType as above? My other integrations, like CyberSource, did not require this. Yet this one does. And they all appear to be secured using TLS 1.2 from what I saw in the browser.
If you run .NET 4.0 you can use it like this:
ServicePointManager.SecurityProtocol = (SecurityProtocolType)3072; // SecurityProtocolType.Tls12
This one worked for me:
ServicePointManager.Expect100Continue = true;
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls
    | SecurityProtocolType.Tls11
    | SecurityProtocolType.Tls12
    | SecurityProtocolType.Ssl3;
Some of the advanced TLS configurations on a web server are hidden. Your production server has likely been modified to protect against DROWN, logjam, FREAK, POODLE and BEAST attacks.
see
https://tecadmin.net/enable-tls-on-windows-server-and-iis/
https://support.microsoft.com/en-us/help/187498/how-to-disable-pct-1.0,-ssl-2.0,-ssl-3.0,-or-tls-1.0-in-internet-information-services
To make changes to these advanced TLS settings, it's not as simple as clicking some buttons in IIS. Well it could be that simple if you use a third-party tool like this: https://www.nartac.com/Products/IISCrypto
These configurations work fine for major recent web browsers, but .Net seems to struggle with such modern secure server configurations (unless you manually override defaults as you discovered).
Conclusion: it's not obvious, your UAT and Production environments seem the same, but they're not.
I would like to know how I can fix this issue, wherein a WebApp running on IIS 7/8 with Windows Authentication is throwing a 401 error while executing an HttpWebRequest to another site. This WebApp works fine if I run it locally, i.e. in debug mode.
Here is the code snippet
HttpWebRequest webReq;
webReq = (HttpWebRequest)WebRequest.Create("http://sharepoint_site/_vti_bin/listdata.svc/mylist");
webReq.Accept = "application/json";
webReq.UseDefaultCredentials = true;
webReq.Credentials = CredentialCache.DefaultNetworkCredentials;
//webReq.Credentials = new NetworkCredential("user","password","domain");
webReq.Method = "GET";
webReq.KeepAlive = true;
HttpWebResponse response = (HttpWebResponse)webReq.GetResponse();
Stream objStream = response.GetResponseStream();
StreamReader objReader = new StreamReader(objStream);
I was also able to make it work by adding a BackConnectionHostNames entry in the registry under
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
but then I need to pass in the credentials (commented out above), which I don't like because I don't want to use my own account or any service account.
I want the WebApp to use DefaultNetworkCredentials or DefaultCredentials. I enabled Windows Authentication and NTLM provider on the IIS of the machine hosting this WebApp.
Any help will be greatly appreciated, thanks and more power to this community.
CredentialCache.DefaultNetworkCredentials uses the network credentials that the process is running under. If it's running in IIS, it will be the application pool identity, which the web service won't accept.
You will either have to pass different credentials in code (what you said you didn't want to do) or update the application pool to run with network credentials (right-click the application pool in IIS -> Advanced Settings -> Identity)
I have written a WinForms app that uploads addresses from a spreadsheet and geocodes them using an external geocoding service. This all works fine on my local machine, but the time has come for it to be installed on other people's computers for testing. The app no longer works now though, generating the error below:
System.Net.WebException: The remote server returned an error: (407) Proxy Authentication Required.
Having read a lot and chatted briefly to our network guys, it seems I need to establish the security context for the user's account and work with this to correct the error.
Has anyone got any pointers about how I should be going about this?
Thanks in advance!
C
It depends on how you're uploading the data. If you're using an HTTP request (as it looks like you are) it will look something like this:
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create("https://test.example.com/");
req.Method = "POST";
req.ContentType = "text/xml";
req.Credentials = new NetworkCredential("TESTACCOUNT", "P#ssword");
StreamWriter writer = new StreamWriter(req.GetRequestStream());
writer.Write(input);
writer.Close();
var rsp = req.GetResponse().GetResponseStream();
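If the 407 is coming from the proxy rather than the remote service, the credentials may need to go on the proxy instead (a sketch; the proxy address is a placeholder):

// Authenticate against the proxy itself; use the account the app runs under,
// or swap in an explicit NetworkCredential if required.
req.Proxy = new WebProxy("http://proxyserver:8080");
req.Proxy.Credentials = CredentialCache.DefaultCredentials;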