We use a third-party service which, when accessed via a browser, yields this error:
OK, we should call them and probably ask them to fix this on their side.
But -
Question:
Looking at this simple C# code, why don't I see any exception about this warning? In other words, how can I make C# surface this warning or unsafe access?
NB: I already know that I can use a more advanced web request class instead, but that doesn't matter for this question (IMHO).
void Main()
{
Console.WriteLine(CreatePost("https://------", "dummy")); // No exception/warning here
}
private string CreatePost(string uri, string data)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
request.Method = "POST";
byte[] postBytes = Encoding.GetEncoding("UTF-8").GetBytes(data);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = postBytes.Length;
Stream requestStream = request.GetRequestStream();
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
// now send it
requestStream.Write(postBytes, 0, postBytes.Length);
requestStream.Close();
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
return new StreamReader(response.GetResponseStream(), Encoding.GetEncoding("UTF-8")).ReadToEnd();
}
Also, I know that the browser's URL access uses GET (unlike the C# POST verb), but I don't think they've redirected this action to a silenced warning.
You don't see any warning when accessing it via C# because Google Chrome is checking how SSL is set up and putting the warning in the way to protect you (and users of said service). When you access it from C#, the request never touches Chrome, so you don't get the warning.
You'll get a similar warning in a few other browsers, but it's not part of the response to the request you're making - just the browser trying to keep you safe.
You could manually check the signature algorithm in your code, and throw an exception if it's not what you deem "secure".
Edit: you can check the signature algorithm by adding a custom validation callback to ServicePointManager, something like this:
ServicePointManager.ServerCertificateValidationCallback =
new RemoteCertificateValidationCallback(
(sender, certificate, chain, errors) => {
// Note: SignatureAlgorithm.FriendlyName values look like "sha1RSA" / "md5RSA"
var insecureAlgorithms = new List<string> { "sha1RSA", "md5RSA" };
var sslCertificate = (X509Certificate2) certificate;
var signingAlgorithm = sslCertificate.SignatureAlgorithm;
if (insecureAlgorithms.Contains(signingAlgorithm.FriendlyName))
{
return false;
}
// do some other checks here...
return true;
}
);
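Note that once this callback is registered, the check is enforced for you: if it returns false, the TLS handshake is aborted and the request throws. A minimal sketch of what that looks like with the CreatePost method from the question:
try
{
    Console.WriteLine(CreatePost("https://------", "dummy"));
}
catch (WebException ex)
{
    // With the callback rejecting the certificate, this surfaces as something like
    // "The underlying connection was closed: Could not establish trust relationship
    // for the SSL/TLS secure channel." (inner exception: AuthenticationException).
    Console.WriteLine("Rejected by the certificate check: " + ex.Message);
}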
Related
After endless research and testing of different combinations, I'm clueless right now.
I receive a WebException: The request timed out, but only if my byteArray is filled with something other than System.Text.Encoding.UTF8.GetBytes("") (like "hello").
The server setup is an HTTPS request to a Google load balancer, which communicates with the backend via HTTP. The backend is Apache with PHP.
For testing purposes (self-signed SSL cert) I have this:
System.Net.ServicePointManager.ServerCertificateValidationCallback =
delegate (object s,
System.Security.Cryptography.X509Certificates.X509Certificate certificate,
System.Security.Cryptography.X509Certificates.X509Chain chain,
System.Net.Security.SslPolicyErrors sslPolicyErrors){
return true;
};
If I enter the URL in my web-browser (Chrome), I get a response.
If I use the HTTP-requester from Mozilla with or without content to send, I get the correct response data (after adding an SSL-Security exception)
If I run my code below with System.Text.Encoding.UTF8.GetBytes("") everything works (except that I cannot send any data and therefore don't receive what I want)
Here's the code I'm using.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("https://someurl.com/some.php");
webRequest.Proxy = null;
webRequest.Credentials = CredentialCache.DefaultCredentials;
webRequest.Method = "POST";
webRequest.Timeout = 3000;
byte[] byteArray = System.Text.Encoding.UTF8.GetBytes("someData"); //works if empty
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.ContentLength = byteArray.Length;
Stream postData = webRequest.GetRequestStream();
postData.Write(byteArray, 0, byteArray.Length);
postData.Close();
HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse(); //ERROR MESSAGE
Stream dataStream = webResponse.GetResponseStream();
StreamReader reader = new StreamReader(dataStream);
string data = reader.ReadToEnd(); //output data
reader.Close ();
dataStream.Close ();
webResponse.Close ();
The exact error (btw, all this happens in the Unity3D editor):
WebException: The request timed out
System.Net.HttpWebRequest.EndGetResponse (IAsyncResult asyncResult)
System.Net.HttpWebRequest.GetResponse ()
So why on earth does it stop working as soon as GetRequestStream has something to write?
Thanks and all the best,
Kruegbert
Addendum:
If I increase the timeout, it just takes longer until the same message appears.
If I write webRequest.ContentLength = byteArray.Length+1 I receive a response, but it's a WebException error: ProtocolError
If I write webRequest.ContentLength = byteArray.Length-1 I get the ProtocolViolationException
I already tried the same with try/catch/using, with the same behaviour.
I figured out why it was not working, though I still don't know why it behaves like this (maybe a Unity editor thing).
I added
webRequest.ProtocolVersion = HttpVersion.Version10;
and everything worked. No more timeout errors. And yes, webRequest.ProtocolVersion = HttpVersion.Version11; brings the timeout error back.
However, making an HTTP request from the web succeeds with any of these: HTTP/1.1, HTTP/1.0 (with Host header), HTTP/1.0 (without Host header).
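For completeness, here is a condensed sketch of the request above with the fix in place (same placeholder URL and data as before; the Expect: 100-continue remark is only my guess at the cause):
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("https://someurl.com/some.php");
webRequest.Method = "POST";
webRequest.Timeout = 3000;
// The fix: force HTTP/1.0 for this request. (My guess is that this avoids the
// Expect: 100-continue handshake HttpWebRequest uses for HTTP/1.1 POSTs, which
// would also explain why an empty body worked, but that is only a guess.)
webRequest.ProtocolVersion = HttpVersion.Version10;

byte[] byteArray = System.Text.Encoding.UTF8.GetBytes("someData");
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.ContentLength = byteArray.Length;

using (Stream postData = webRequest.GetRequestStream())
{
    postData.Write(byteArray, 0, byteArray.Length);
}

string data;
using (HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse())
using (StreamReader reader = new StreamReader(webResponse.GetResponseStream()))
{
    data = reader.ReadToEnd();
}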
I've been working on a project which makes use of an RTC API and forms authentication. I've hit a bit of bizarre behaviour and I just can't figure this one out.
The scenario that has played out to date is that I can successfully run this project locally end to end. That is, this specific piece of code can:
Contact the remote server and successfully authenticate
After authentication I'm able to pass XML to update a ticket in RTC
The problem starts when I publish to our IIS (7.5) server. All works fine right up until the last .GetResponse call, which uses a PUT method to pass my XML to update the ticket in RTC. I keep getting 'The operation has timed out'.
I've spent literally days trying to figure this one out, doing all manner of things, but nothing has proved useful.
As a test I changed the PUT method on the second call to a GET. And it works! If I use a PUT with .AllowAutoRedirect = false it works in that I get a response back, but then nothing happens on the RTC side, so the request is clearly being ignored. I also noticed that the status being returned is marked as 'Found' instead of 'OK'.
Some people thought at this stage that perhaps it was a lack of connectivity between the remote server and the web server. This wouldn't be the case, as authentication works and that happens against the same server. I have also manually passed the XML / PUT call using the RESTClient on the web server, which was accepted fine.
I just can't understand why it works end to end when running locally but plays up once deployed to IIS.
I tried using log tracing and I'm not entirely sure I'm getting anything useful from it. It might be totally unrelated, but I can see this in the log generated on the IIS server:
<EventData>
<Data Name="ContextId">{00000000-0000-0000-12AF-0080000000F8}</Data>
<Data Name="ModuleName">ManagedPipelineHandler</Data>
<Data Name="Notification">128</Data>
<Data Name="HttpStatus">500</Data>
<Data Name="HttpReason">Internal Server Error</Data>
<Data Name="HttpSubStatus">0</Data>
<Data Name="ErrorCode">0</Data>
<Data Name="ConfigExceptionInfo"></Data>
</EventData>
As I say, I'm not sure if this is even related to the problem I'm having, but rather than ignore it I thought I'd share.
Code that forms the call (excuse the standard of the coding; it's a work in progress and got messy while trying different things to fix this problem):
//Setup webrequest
CookieContainer _cookies = new CookieContainer();
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(getPath);
var test44 = test4.ToString();
request.CookieContainer = _cookies;
request.ContentType = "application/rdf+xml";
request.Accept = "application/rdf+xml";
request.Method = "PUT";
request.AllowAutoRedirect = true;
request.AllowWriteStreamBuffering = true;
request.Timeout = 40000;
byte[] bytes = Encoding.ASCII.GetBytes(test44);
request.ContentLength = bytes.Length;
Stream dataStream = request.GetRequestStream();
dataStream.Write(bytes, 0, bytes.Length);
dataStream.Close();
//Pass request
logger.Info("Made it up to start of RTC request for secure document.");
using (HttpWebResponse getrespn = requestSecureDocument(request, "https://myserver:9100/jazz", "username", "pass", test44))
{
//Stream ReceiveStream = getrespn.GetResponseStream();
// Encoding encode = System.Text.Encoding.GetEncoding("utf-8");
//StreamReader readStream = new StreamReader(ReceiveStream);
//response = readStream.ReadToEnd();
getrespn.Close();
}
The segment of code which interacts with the RTC server (based on the example from: https://nkumar83.wordpress.com/2013/06/13/consuming-rtc-rational-team-concert-oslc-apis-using-c-post-1-authentication/ with my own tweaks):
public static HttpWebResponse requestSecureDocument(HttpWebRequest _requestItem, string _rtcServerURL, string _userName, string _password, string passXml)
{
try
{
//FormBasedAuth Step 1: Request the resource
HttpWebRequest _request = (HttpWebRequest)WebRequest.Create(_requestItem.RequestUri);
_request.CookieContainer = _requestItem.CookieContainer;
//store the response in _docResponse variable
HttpWebResponse _docResponse = (HttpWebResponse)_request.GetResponse();
//HttpStatusCode.OK indicates that the request succeeded
if (_docResponse.StatusCode == HttpStatusCode.OK)
{
//X-com-ibm-team... header signifies form based authentication is being used
string _rtcAuthHeader = _docResponse.Headers["X-com-ibm-team-repository-web-auth-msg"];
if ((_rtcAuthHeader != null) && _rtcAuthHeader.Equals("authrequired"))
{
_docResponse.GetResponseStream().Flush();
_docResponse.Close();
//Prepare form for authentication
HttpWebRequest _formPost = (HttpWebRequest)WebRequest.Create(_rtcServerURL + "/j_security_check");
_formPost.Method = "POST";
_formPost.Timeout = 30000;
_formPost.CookieContainer = _request.CookieContainer;
_formPost.Accept = "text/xml";
_formPost.ContentType = "application/x-www-form-urlencoded";
string _authString = "j_username=" + _userName + "&j_password=" + _password;
Byte[] _outBuffer = Encoding.UTF8.GetBytes(_authString);
_formPost.ContentLength = _outBuffer.Length;
Stream _str = _formPost.GetRequestStream();
_str.Write(_outBuffer, 0, _outBuffer.Length);
_str.Close();
//FormBasedAuth Step 2: Submit the login form and get response
HttpWebResponse _formResponse = (HttpWebResponse)_formPost.GetResponse();
_rtcAuthHeader = _formResponse.Headers["X-com.ibm-team.repository-web-auth-msg"];
//Check if auth failed
if ((_rtcAuthHeader != null) && _rtcAuthHeader.Equals("authfailed"))
{
//auth failed
var fail = "";
}
else
{
//login successful
//FormBasedAuth Step 3: Resend the request for the protected resource
_formResponse.GetResponseStream().Flush();
_formResponse.Close();
using (HttpWebResponse getresp = (HttpWebResponse)_requestItem.GetResponse()) // *** THIS IS THE LINE WHICH THROWS THE EXCEPTION ***
{
return getresp;
}
}
}
}
return _docResponse;
}
catch (WebException e)
{
var filePath = AppDomain.CurrentDomain.GetData("DataDirectory") + @"/trapA.xml";
using (StreamWriter writer = new StreamWriter(filePath, true))
{
writer.WriteLine("Message: Failed to trigger getresponse successfully: " + e);
}
}
return null;
}
Hope someone out there can help :o)
Well, I'm pleased to say I've finally got to the bottom of this one. It turns out the problem wasn't anything to do with IIS, and it does actually work when published, if I'm not using the RTC client to make updates to a ticket.
The short story is that our RTC client uses a custom script to post out to our web API. However, the RTC client appears to put a record lock on the ticket you're trying to update, which persists until our API responds. Of course that can't happen, because part of the response is to confirm whether the update was successful, which in turn can't happen due to the lock held by the RTC client.
The solution was to get the incoming call from RTC closed as quickly as possible. So the segment of code which authenticates and calls back out to RTC to make updates is now wrapped in some new code that creates a new thread. This allows the connection to be closed in about 5 seconds, while our app continues to make the necessary calls to complete the transaction.
Thread t = new Thread(() =>
{
    //code here
});
t.Start();
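A slightly fuller sketch of that shape, reusing the request, test44 and requestSecureDocument pieces from the code earlier in this post; the RTC update runs on a background thread so the incoming call from the RTC client can be answered (and its record lock released) straight away:
Thread t = new Thread(() =>
{
    // requestSecureDocument may return null if the request failed; using() copes with that.
    using (HttpWebResponse getrespn = requestSecureDocument(
        request, "https://myserver:9100/jazz", "username", "pass", test44))
    {
        // Nothing more to do here; disposing the response releases the connection.
    }
});
t.IsBackground = true;
t.Start();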
I am trying to write a login script, but for some reason I am getting an internal server error (500).
I tried this with PHP and cURL; there I got a response once I set the option VERIFY_PEER = false.
Here's the C# code:
private void Login()
{
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("https://whatever.com");
ASCIIEncoding encoding = new ASCIIEncoding();
string postData = string.Format("user={0}&password={1}&submit=login", User, Password);
byte[] data = Encoding.ASCII.GetBytes(postData);
webRequest.Method = "POST";
ServicePointManager.ServerCertificateValidationCallback = (x,y,z,a) => true;
webRequest.ContentLength = data.Length;
webRequest.KeepAlive = false;
using (Stream stream = webRequest.GetRequestStream())
{
stream.Write(data, 0, data.Length);
}
using (HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse())
{
using (StreamReader responseStream = new StreamReader(response.GetResponseStream()))
{
Console.WriteLine(responseStream.ReadToEnd());
}
}
}
Does anybody know why I am not getting a response?
Thanks for your help.
I would suggest using Fiddler to inspect the HTTP request/responses. You can set it up to intercept HTTPS traffic too. That way you can see exactly what your system is requesting.
From your comments you're getting a 500 error; that normally means the URL/request is malformed if it works from one script/app and not the other. Fiddler will let you see what's being requested (and highlight any protocol errors for you).
I found the mistake... It seems as if the server requires a "valid" user agent. When setting Firefox as the user agent, everything works fine.
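For reference, setting the user agent on the request from the Login() method above is a one-liner (the exact UA string doesn't seem to matter, as long as it looks like a browser):
// Any browser-like User-Agent seems to satisfy the server; this Firefox string
// is only an example, not the exact value I used.
webRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/115.0";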
I am trying to write code that will authenticate to the website wallbase.cc. I've looked at what it does using Firebug/Chrome developer tools and it seems fairly easy:
Post "usrname=$USER&pass=$PASS&nopass_email=Type+in+your+e-mail+and+press+enter&nopass=0" to the webpage "http://wallbase.cc/user/login", store the returned cookies and use them on all future requests.
Here is my code:
private CookieContainer _cookies = new CookieContainer();
//......
HttpPost("http://wallbase.cc/user/login", string.Format("usrname={0}&pass={1}&nopass_email=Type+in+your+e-mail+and+press+enter&nopass=0", Username, assword));
//......
private string HttpPost(string url, string parameters)
{
try
{
System.Net.WebRequest req = System.Net.WebRequest.Create(url);
//Add these, as we're doing a POST
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
((HttpWebRequest)req).Referer = "http://wallbase.cc/home/";
((HttpWebRequest)req).CookieContainer = _cookies;
//We need to count how many bytes we're sending. Post'ed Faked Forms should be name=value&
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(parameters);
req.ContentLength = bytes.Length;
System.IO.Stream os = req.GetRequestStream();
os.Write(bytes, 0, bytes.Length); //Push it out there
os.Close();
//get response
using (System.Net.WebResponse resp = req.GetResponse())
{
if (resp == null) return null;
using (Stream st = resp.GetResponseStream())
{
System.IO.StreamReader sr = new System.IO.StreamReader(st);
return sr.ReadToEnd().Trim();
}
}
}
catch (Exception)
{
return null;
}
}
After calling HttpPost with my login parameters I would expect all future calls using this same method to be authenticated (assuming a valid username/password). I do get a session cookie in my cookie collection, but for some reason I'm not authenticated. I get a session cookie regardless of which page I visit, so I tried loading the home page first to get the initial session cookie and then logging in, but there was no change.
To my knowledge this Python version works: https://github.com/sevensins/Wallbase-Downloader/blob/master/wallbase.sh (line 336)
Any ideas on how to get authentication working?
Update #1
When using a correct user/password pair, the response automatically redirects to the referrer, but when an incorrect user/pass pair is sent it does not redirect and returns a bad user/pass error. Based on this it seems as though authentication is happening, but maybe not all the key pieces of information are being saved?
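One quick way to check what is actually being saved is to dump the cookie container after the login call; a small diagnostic sketch using the _cookies field from the code above:
// Diagnostic sketch: list every cookie stored for wallbase.cc after the login POST.
foreach (Cookie c in _cookies.GetCookies(new Uri("http://wallbase.cc")))
{
    Console.WriteLine("{0} = {1} (domain: {2}, path: {3})", c.Name, c.Value, c.Domain, c.Path);
}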
Update #2
I am using .NET 3.5. When I tried the above code in .NET 4, with the added line of System.Net.ServicePointManager.Expect100Continue = false (which was in my code, just not shown here), it works with no other changes needed. The problem seems to stem directly from some pre-.NET 4 issue.
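For reference, that setting is process-wide, so it only needs to run once before the first request is made; something like:
// Disable the Expect: 100-continue header before any requests are created.
System.Net.ServicePointManager.Expect100Continue = false;

HttpPost("http://wallbase.cc/user/login",
    string.Format("usrname={0}&pass={1}&nopass_email=Type+in+your+e-mail+and+press+enter&nopass=0",
        Username, Password));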
This is based on code from one of my projects, as well as code found from various answers here on stackoverflow.
First we need to set up a cookie-aware WebClient that is going to use HTTP 1.0.
public class CookieAwareWebClient : WebClient
{
private CookieContainer cookie = new CookieContainer();
protected override WebRequest GetWebRequest(Uri address)
{
var request = base.GetWebRequest(address);
var httpRequest = request as HttpWebRequest;
if (httpRequest != null)
{
httpRequest.ProtocolVersion = HttpVersion.Version10;
httpRequest.CookieContainer = cookie;
}
return request;
}
}
Next we set up the code that handles the Authentication and then finally loads the response.
var client = new CookieAwareWebClient();
client.UseDefaultCredentials = true;
client.BaseAddress = @"http://wallbase.cc";
var loginData = new NameValueCollection();
loginData.Add("usrname", "test");
loginData.Add("pass", "123");
loginData.Add("nopass_email", "Type in your e-mail and press enter");
loginData.Add("nopass", "0");
var result = client.UploadValues(@"http://wallbase.cc/user/login", "POST", loginData);
string response = System.Text.Encoding.UTF8.GetString(result);
We can try this out using the HTML Visualizer built into Visual Studio while in debug mode, and use it to confirm that we were able to authenticate and load the home page while staying authenticated.
The key here is to set up a CookieContainer and use HTTP 1.0, instead of 1.1. I am not entirely sure why forcing it to use 1.0 allows you to authenticate and load the page successfully, but part of the solution is based on this answer.
https://stackoverflow.com/a/10916014/408182
I used Fiddler to make sure that the request sent by the C# client was the same as the one from my web browser (Chrome). It also lets me confirm whether the C# client is being redirected correctly. In this case we can see that with HTTP 1.0 we get HTTP/1.0 302 Found, which then redirects us to the home page as intended. If we switch back to HTTP 1.1, we get an HTTP/1.1 417 Expectation Failed message instead.
There is some information on this error message available in this stackoverflow thread.
HTTP POST Returns Error: 417 "Expectation Failed."
Edit: Hack/Fix for .NET 3.5
I have spent a lot of time trying to figure out the difference between 3.5 and 4.0, but I seriously have no clue. It looks like 3.5 is creating a new cookie after the authentication, and the only way I found around it was to authenticate the user twice.
I also had to make some changes on the WebClient based on information from this post.
http://dot-net-expertise.blogspot.fr/2009/10/cookiecontainer-domain-handling-bug-fix.html
public class CookieAwareWebClient : WebClient
{
public CookieContainer cookies = new CookieContainer();
protected override WebRequest GetWebRequest(Uri address)
{
var request = base.GetWebRequest(address);
var httpRequest = request as HttpWebRequest;
if (httpRequest != null)
{
httpRequest.ProtocolVersion = HttpVersion.Version10;
httpRequest.CookieContainer = cookies;
var table = (Hashtable)cookies.GetType().InvokeMember("m_domainTable", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.GetField | System.Reflection.BindingFlags.Instance, null, cookies, new object[] { });
var keys = new ArrayList(table.Keys);
foreach (var key in keys)
{
var newKey = (key as string).Substring(1);
table[newKey] = table[key];
}
}
return request;
}
}
var client = new CookieAwareWebClient();
var loginData = new NameValueCollection();
loginData.Add("usrname", "test");
loginData.Add("pass", "123");
loginData.Add("nopass_email", "Type in your e-mail and press enter");
loginData.Add("nopass", "0");
// Hack: Authenticate the user twice!
client.UploadValues(@"http://wallbase.cc/user/login", "POST", loginData);
var result = client.UploadValues(@"http://wallbase.cc/user/login", "POST", loginData);
string response = System.Text.Encoding.UTF8.GetString(result);
You may need to add the following:
//get response
using (System.Net.WebResponse resp = req.GetResponse())
{
foreach (Cookie c in resp.Cookies)
_cookies.Add(c);
// Do other stuff with response....
}
Another thing you might have to deal with: if the server responds with a 302 (redirect), the .NET web request will automatically follow it, and in the process you might lose the cookie you're after. You can turn off this behavior with the following code:
req.AllowAutoRedirect = false;
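With redirects disabled you then have to pick up the cookies and follow the Location header yourself; a sketch of that, reusing the url parameter and _cookies container from your HttpPost method:
// Sketch: with redirects disabled, collect the cookies from the login response
// and then follow the Location header manually, reusing the same container.
((HttpWebRequest)req).AllowAutoRedirect = false;

using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
    foreach (Cookie c in resp.Cookies)
        _cookies.Add(c);

    if (resp.StatusCode == HttpStatusCode.Found)          // 302
    {
        string location = resp.Headers["Location"];
        HttpWebRequest follow =
            (HttpWebRequest)WebRequest.Create(new Uri(new Uri(url), location));
        follow.CookieContainer = _cookies;                 // same container, so the session carries over
        // ...read follow.GetResponse() as before...
    }
}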
The Python you reference uses a different referrer (http://wallbase.cc/start/). It is also followed by another POST, to http://wallbase.cc/user/adult_confirm/1. Try the other referrer and follow up with that POST.
I think you are authenticating correctly, but that the site needs more info/assertions from you before proceeding.
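Using the HttpPost helper from the question, that would look something like the sketch below (the Referer set inside HttpPost would need to change from /home/ to /start/):
// Sketch based on the referenced script: log in with the /start/ referrer,
// then send the extra confirmation POST with an empty body.
HttpPost("http://wallbase.cc/user/login",
    string.Format("usrname={0}&pass={1}&nopass_email=Type+in+your+e-mail+and+press+enter&nopass=0",
        Username, Password));
HttpPost("http://wallbase.cc/user/adult_confirm/1", "");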
I am having difficulty consuming the reCaptcha web service using C#/.NET 3.5, although I think the problem is with consuming web services in general.
String validate = String.Format("http://api-verify.recaptcha.net/verify?privatekey={0}&remoteip={1}&challenge={2}&response={3}", PrivateKey, UserIP, Challenge, Response);
WebClient serviceRequest = new WebClient();
serviceRequest.Headers.Add("ContentType", "application/x-www-form-urlencoded");
String response = serviceRequest.DownloadString(new Uri(validate));
It keeps telling me that the error is nverify-params-incorrect, which means:
The parameters to /verify were incorrect, make sure you are passing all the required parameters.
But they are correct: I am using the private key, the IP address (locally) is 127.0.0.1, and the challenge and response seem fine. However, the error keeps occurring.
I am pretty sure this is an issue with how I am requesting the service, as this is the first time I have actually used web services and .NET.
I also tried this, as it ensures the data is POSTed:
String queryString = String.Format("privatekey={0}&remoteip={1}&challenge={2}&response={3}",PrivateKey, UserIP, Challenge, Response);
String Validate = "http://api-verify.recaptcha.net/verify" + queryString;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(Validate));
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = Validate.Length;
HttpWebResponse captchaResponse = (HttpWebResponse)request.GetResponse(); // this is where it stalls
String response;
using (StreamReader reader = new StreamReader(captchaResponse.GetResponseStream()))
response = reader.ReadToEnd();
It seems to stall at the point where I get the response.
Any advice?
Thanks in advance
Haven't worked with the recaptcha service previously, but I have two troubleshooting recommendations:
Use Fiddler or Firebug and watch what you're sending outbound. Verifying your parameters would help you with basic troubleshooting, i.e. invalid characters, etc.
The Recaptcha Wiki has an entry about dealing with development on Vista. It doesn't have to be limited to Vista, though; if your system can handle IPv6, then your browser could be communicating in that format by default. It appears as if Recaptcha deals with IPv4. Having Fiddler/Firebug working would tell you about those other parameters that could be causing you grief.
This may not help solve your problem but it might provide you with better troubleshooting info.
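If the IPv6 point turns out to matter, one option is to normalise the remoteip value to IPv4 before building the query string. The ToIPv4 helper below is hypothetical (it is not part of any reCAPTCHA library), just a sketch of the idea:
using System.Net;
using System.Net.Sockets;

// Hypothetical helper: make sure the value passed as remoteip is an IPv4 address.
static string ToIPv4(string userIp)
{
    IPAddress parsed;
    if (IPAddress.TryParse(userIp, out parsed) &&
        parsed.AddressFamily == AddressFamily.InterNetwork)
    {
        return userIp;                      // already IPv4
    }
    if (userIp == "::1")
    {
        return "127.0.0.1";                 // IPv6 loopback -> IPv4 loopback
    }
    foreach (IPAddress addr in Dns.GetHostAddresses(userIp))
    {
        if (addr.AddressFamily == AddressFamily.InterNetwork)
        {
            return addr.ToString();         // first IPv4 address found
        }
    }
    return userIp;                          // nothing better found; keep the original value
}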
So I got this working; for some reason I needed to write the request data to the request stream, like so:
//Write data to request stream
using (Stream requestStream = request.GetRequestStream())
requestStream.Write(byteData, 0, byteData.Length);
Could anyone explain why this works? I didn't think I would need to do this and don't completely understand what's happening behind the scenes.
Damien's answer is correct of course, but just to be clear about the order of things (I was a little confused) and to have a complete code sample...
var uri = new Uri("http://api-verify.recaptcha.net/verify");
var queryString = string.Format(
"privatekey={0}&remoteip={1}&challenge={2}&response={3}",
privateKey,
userIP,
challenge,
response);
var request = (HttpWebRequest)HttpWebRequest.Create(uri);
request.Method = WebRequestMethods.Http.Post;
request.ContentLength = queryString.Length;
request.ContentType = "application/x-www-form-urlencoded";
using (var writer = new StreamWriter(request.GetRequestStream()))
{
writer.Write(queryString);
}
string result;
using (var webResponse = (HttpWebResponse)request.GetResponse())
{
var reader = new StreamReader(webResponse.GetResponseStream());
result = reader.ReadToEnd();
}
There's a slight difference in that I'm writing the post variables to the request, but the core of it is the same.
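As a side note, the WebClient from the first attempt can also be made to POST the parameters in the body rather than putting them in the URL; a sketch along the same lines (UploadString sends the body and returns the response text):
// Sketch: POST the form-encoded parameters with WebClient instead of the query string.
var serviceRequest = new WebClient();
serviceRequest.Headers.Add("Content-Type", "application/x-www-form-urlencoded");

string postBody = string.Format(
    "privatekey={0}&remoteip={1}&challenge={2}&response={3}",
    privateKey, userIP, challenge, response);

string result = serviceRequest.UploadString(
    "http://api-verify.recaptcha.net/verify", "POST", postBody);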