Sending the Connection header as keep-alive - C#

I'm trying to send the same information from my application as I send from the browser. Here is part of the data captured by Fiddler:
POST http://something/ HTTP/1.1
Host: something.com
Connection: keep-alive
I got stuck on this Connection header. If I set the KeepAlive property to true, in Fiddler I see this:
Proxy-Connection: Keep-Alive
If I try to set the Connection header to Keep-Alive directly, I get this error:
Keep-Alive and Close may not be set using this property.
How do I write the code so that in Fiddler I see this:
Connection: keep-alive
My full code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://myUrl");
request.Method = "POST";
request.ProtocolVersion = HttpVersion.Version11;
request.Accept = "*/*";
request.Headers.Add("Accept-Encoding", "myEncoding");
request.Headers.Add("Accept-Language", "myLang");
request.ContentType = "myContentType";
request.Referer = "myReferer";
request.UserAgent = "myUserAgent";
ASCIIEncoding encoding = new ASCIIEncoding();
string postData = "myData";
byte[] data = encoding.GetBytes(postData);
request.ContentLength = data.Length;
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(data, 0, data.Length);
}
request.GetResponse().Close();

To have your application send a Connection: Keep-Alive header, use the KeepAlive property on the HttpWebRequest object.
When a client knows that it is behind a proxy (like Fiddler), it may send a Proxy-Connection: Keep-Alive header instead of a Connection: Keep-Alive header. The expectation is that an HTTP/1.1 proxy (like Fiddler) will convert that header from Proxy-Connection to Connection before passing it to the upstream server.
This "proxy renames the header" pattern was introduced many years ago to work around hangs caused by HTTP/1.0 servers that didn't support Keep-Alive properly: an outdated proxy would pass the unknown Proxy-Connection header through untouched and the server would simply ignore it, while a proxy that understands the pattern renames the header to Connection by removing the Proxy- prefix.

Return Unauthorized (401) only when calling from code (c#)

UPDATE
As @Alexandru Clonțea suggested, I checked the Fiddler log and found:
In both the success and failure cases, there are actually 2 requests being sent. The first request is mostly the same in both cases; it's something like:
GET http://myservice.com/handler?param1=something&param2=somethingelse HTTP/1.1
Authorization: Basic xxxxxx
Accept: application/json, application/xml, text/json, text/x-json,
text/javascript, text/xml
User-Agent: RestSharp/100.0.0.0
Host: myservice.com
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
The response to it is the same in both cases, which is:
HTTP/1.1 301 Moved Permanently
Content-Type: text/html; charset=utf-8
Location: /handler/?param1=something&param2=somethingelse
Date: Sat, 08 Sep 2018 01:50:16 GMT
Content-Length: 115
Moved Permanently.
I have noticed that it always tries to redirect the call to /handler/?param1=something&param2=somethingelse, and that's because of how the server code is set up; it's actually working as expected. The difference is in the second request. The second request in the failure case (the C# code) doesn't have the Authorization header, and that's why it failed. Now, my question is: why does the second request miss the Authorization header, and how can I fix it? Below is an example of the failed request:
GET http://myservice.com/handler/?param1=something&param2=somethingelse HTTP/1.1
Accept: application/json, application/xml, text/json, text/x-json,
text/javascript, text/xml
User-Agent: RestSharp/100.0.0.0
Accept-Encoding: gzip, deflate
Host: myservice.com
Background:
I have a service written in Go deployed on a server. It requires basic authentication. For example, I can call it successfully with the following request:
GET /handler/?
param1=something&param2=somethingelse HTTP/1.1
> Host: myservice.com
> Authorization: Basic xxxxxx
> User-Agent: RestClient/5.16.6
> Accept: */*
The request above is made by a REST API client tool (like Postman), and it works fine. It also works fine if I call it from a browser.
Problem:
Now, I'm trying to make the same call to the same service using C# code, and I have it as:
// pass cert validation
ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, sslPolicyErrors) => true;
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3 | SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
HttpClient client = new HttpClient();
var byteArray = Encoding.ASCII.GetBytes(username + ":" + password);
var auth = new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteArray));
HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, url);
request.Headers.Authorization = auth;
var response = client.SendAsync(request).Result; // don't need async
But in this case, I am getting Unauthorized (401) back. I have checked the actual request that was sent by the code: it had exactly the same Authorization header as the one shown above (Authorization: Basic xxxxxx, with the same xxxxxx) and the same URI as well. In fact, everything it sent looks the same as when I used the REST API client tool, but it still failed from code.
When I check the log on the server side, I see the line below when it returns 401:
[GIN-debug] redirecting request 301: /handler --> /handler/?param1=something&param2=somethingelse
but I don't see this log line when the call comes from the REST API client tool (or browser).
As you can tell from the log, the server-side code uses the Go Gin framework. But since it works fine in the other cases, I don't think it's a problem with the server-side code.
Back to the C# code: I have tried using HttpWebRequest with NetworkCredential instead of HttpClient, and I also tried client.DefaultRequestHeaders.Authorization = auth, but I was still getting the same error.
I am wondering if someone has seen this before or could help. It would be really appreciated.
As a workaround, I can change the request URL to http://myservice.com/handler/?param1=something&param2=somethingelse so that no redirect is needed, and then it is correctly authorized.
But I still haven't figured out how to make the second (redirected) request carry the Authorization header.
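One possible direction (a sketch only, not verified against this service): the .NET handlers clear the Authorization header when they follow a redirect automatically, so an alternative to the trailing-slash workaround is to turn off auto-redirect and re-attach the header yourself. url, username and password are the same placeholders as in the code above.
// Turn off automatic redirects so the Authorization header can be re-attached
// by hand when the 301 comes back.
var handler = new HttpClientHandler { AllowAutoRedirect = false };
var client = new HttpClient(handler);
var byteArray = Encoding.ASCII.GetBytes(username + ":" + password);
var auth = new System.Net.Http.Headers.AuthenticationHeaderValue(
    "Basic", Convert.ToBase64String(byteArray));

var request = new HttpRequestMessage(HttpMethod.Get, url);
request.Headers.Authorization = auth;
var response = client.SendAsync(request).Result;

if (response.StatusCode == HttpStatusCode.MovedPermanently)
{
    // Follow the redirect ourselves and send the credentials again.
    var redirected = new HttpRequestMessage(HttpMethod.Get,
        new Uri(new Uri(url), response.Headers.Location));
    redirected.Headers.Authorization = auth;
    response = client.SendAsync(redirected).Result;
}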

RestSharp - Service Unavailable - Maximum number of active clients reached

I'm using RestSharp to communicate with a web service.
I use this code:
public static object GetTagValue(string url, string tagname, out string resp)
{
    object result = null;
    resp = string.Empty;
    string theReq = string.Format("tags/{0}", tagname);
    var client = new RestClient(url);
    var request = new RestRequest(theReq, Method.GET);
    request.RequestFormat = DataFormat.Json;
    IRestResponse response = client.Execute(request);
    resp = response.Content;
    if (!string.IsNullOrWhiteSpace(resp))
    {
        dynamic json = JValue.Parse(resp);
        if (null != json.value)
        {
            result = json.value;
        }
    }
    return result;
}
Call to the server
GET http://ame-hp/tags/int32 HTTP/1.1
Accept: application/json, application/xml, text/json, text/x-json,
text/javascript, text/xml
User-Agent: RestSharp/105.2.3.0
Host: ame-hp
Accept-Encoding: gzip, deflate
Response from the server for a working call:
HTTP/1.1 200 Ok
Server: Internet Pack HTTP Server
Connection: Close
Set-Cookie: SID=f11985564d;Expires=Fri, 27 Jan 2017 07:52:17 GMT;Path=/
Content-Type: application/json
Content-Length: 133
{"quality":"Good","description":"","name":"int32","value":0,"dataType":"int32","controllers":[],"initialValue":null,"readonly":false}
It's working, but after two calls the service answers with this:
{"Code":503,"Message":"Service Unavailable - Maximum number of active clients reached."}
Third call to server
GET http://ame-hp/tags/int32 HTTP/1.1
Accept: application/json, application/xml, text/json, text/x-json,
text/javascript, text/xml
User-Agent: RestSharp/105.2.3.0
Host: ame-hp
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Response from the server for all calls from now until the server is restarted:
HTTP/1.1 503 Service Unavailable - Maximum number of active clients reached.
Server: Internet Pack HTTP Server
Connection: Keep-Alive
Content-Type: application/json
Content-Length: 88
{"Code":503,"Message":"Service Unavailable - Maximum number of active
clients reached."}
So I assume that the service has a limit of two clients.
But why are there two active clients?
Either the server or RestSharp is not closing the connection, but which?
Is there something I can do in RestSharp to close the connection?
The problem was, as I assumed, that the server only allows 2 clients. On the first connection a session cookie is sent back, and it has to be used for the rest of the calls.
In RestSharp you only need to add one line to make this happen.
After creating the client (which, by the way, you need to reuse for all the calls), add this line:
client.CookieContainer = new System.Net.CookieContainer();
The initialization of the client would be:
client = new RestClient();
client.CookieContainer = new System.Net.CookieContainer();
Then you can use
client.BaseUrl = new Uri(url);
to set the URL you want to call.
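Putting it together, a minimal sketch under the same older RestSharp API used above (RestClient, Method.GET, IRestResponse); the GetTag method name is just illustrative:
private static readonly RestClient client = new RestClient
{
    CookieContainer = new System.Net.CookieContainer()
};

public static string GetTag(string url, string tagname)
{
    // Reusing the same client means the SID cookie from the first response
    // is replayed on every later call, so the server sees a single client.
    client.BaseUrl = new Uri(url);
    var request = new RestRequest(string.Format("tags/{0}", tagname), Method.GET);
    IRestResponse response = client.Execute(request);
    return response.Content;
}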

Proper form of HTTPS request

I need to send a POST request and receive an access token.
The HTTP request should look like this:
POST /oauth/token HTTP/1.1
Host: api.quizlet.com
Authorization: Basic c3ZWRUhNZVA0aDp3eS4yUXA0ZXNFY0xQUFl2WkRFTGpn
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
grant_type=authorization_code&code=GENERATED_CODE
I don't know how to send "grant_type" and "code" in my request, because (according to Fiddler, where I tested it) they should be in the request body.
The code I have looks like this:
client = new WebClient();
client.Headers[HttpRequestHeader.Authorization] = "Basic " + "MY_SECRET_CODE";
client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
client.Headers[HttpRequestHeader.Host] = "api.quizlet.com";
client.Headers[HttpRequestHeader.AcceptCharset] = "UTF-8";
client.UploadStringCompleted += ClientOnUploadStringCompleted;
client.UploadStringAsync(tokenUrl, "POST",string.Format("grant_type={0}&code={1}",
HttpUtility.HtmlEncode("authorization_code"),HttpUtility.HtmlEncode(code)));
Btw, this code runs on WP7. I have been messing with this single request for almost 2 days, and the values I provide in the request are 100% right, because I pasted a sample request into Fiddler and received a proper token.
EDIT:
I forgot the redirect_uri parameter in the data I tried to upload, so it didn't work. The proper data string should look like this:
string.Format("grant_type={0}&code={1}&redirect_uri={2}",
HttpUtility.HtmlEncode("authorization_code"),HttpUtility.HtmlEncode(code), HttpUtility.HtmlEncode("http://someurl.com"))
At least one issue seems to be that the character set expected by the target server should be set in the ContentType header and not in AcceptCharset:
client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded; charset=UTF-8";
client.Headers[HttpRequestHeader.Host] = "api.quizlet.com";
//client.Headers[HttpRequestHeader.AcceptCharset] = "UTF-8";
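Putting the pieces together, a minimal sketch of the corrected upload (tokenUrl, code and the ClientOnUploadStringCompleted handler are the ones from the question; as an aside, the form values are encoded with Uri.EscapeDataString rather than HtmlEncode, and the Host header is left out because WebClient derives it from the URL):
var client = new WebClient();
client.Headers[HttpRequestHeader.Authorization] = "Basic " + "MY_SECRET_CODE";
// The charset belongs in Content-Type, as noted above.
client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded; charset=UTF-8";
client.UploadStringCompleted += ClientOnUploadStringCompleted;
client.UploadStringAsync(tokenUrl, "POST", string.Format(
    "grant_type={0}&code={1}&redirect_uri={2}",
    Uri.EscapeDataString("authorization_code"),
    Uri.EscapeDataString(code),
    Uri.EscapeDataString("http://someurl.com")));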

How to set Accept and Accept-Language header fields?

I can set Request.Content-Type = ... and Request.Content-Length = ...
How do I set Accept and Accept-Language?
I want to upload a file (RFC 1867) and need to create a request like this:
POST /test-upload.php.xml HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (Windows NT 5.2; WOW64; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: tr-tr,tr;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-9,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Content-Type: multipart/form-data; boundary=---------------------------21724139663430
Content-Length: 56048
Take a look at the Accept property:
HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create(myUri);
myHttpWebRequest.Accept = "image/*";
HttpWebResponse myHttpWebResponse =
    (HttpWebResponse)myHttpWebRequest.GetResponse();
This MSDN article shows how to add custom headers to your request:
//Get the headers associated with the request.
WebHeaderCollection myWebHeaderCollection = myHttpWebRequest.Headers;
//Add the Accept-Language header (for Danish) in the request.
myWebHeaderCollection.Add("Accept-Language:da");
//Include English in the Accept-Language header.
myWebHeaderCollection.Add("Accept-Language","en;q=0.8");
When you want to set the Accept type and content type, just cast the WebRequest to HttpWebRequest:
var webreq = (HttpWebRequest)WebRequest.Create(requestUri);
webreq.Method = "POST";
webreq.Accept = "application/json";
webreq.ContentType = "application/json";
You need to be sure that you cast the request to HttpWebRequest, where the Accept property is available.
On the base WebRequest class, the Accept header is not accessible.
After several annoying attempts at using the headers, I can confirm that the
myWebHeaderCollection.Add("foo","bar"); solution works perfectly
if you want to set the language.
However,
myWebHeaderCollection.Add("AcceptCharset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
myWebHeaderCollection.Add("TransferEncoding", "gzip,deflate");
do not set the values, which may not be what you expect given that the first one works.
If you are using HttpRequestMessage, set the header using the Headers.Add method. In your case:
request.Headers.Add("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
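Alternatively (a sketch with the same HttpRequestMessage), the typed header collections can be used instead of raw strings; the URL is the one from the question:
var request = new HttpRequestMessage(HttpMethod.Post, "http://example.com/test-upload.php.xml");
// ParseAdd accepts the full comma-separated header value, including q-factors.
request.Headers.Accept.ParseAdd("text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
request.Headers.AcceptLanguage.ParseAdd("tr-tr,tr;q=0.8,en-us;q=0.5,en;q=0.3");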
In a case where I have the pleasure of maintaining 15-year-old VB.NET 3.5 code, this workaround was successful for me:
webReq = WebRequest.Create(apiHost)
CType(webReq, HttpWebRequest).Accept = "application/json"

HttpWebRequest with caching enabled throws exceptions

I'm working on a small C#/WPF application that interfaces with a web service implemented in Ruby on Rails, using handcrafted HttpWebRequest calls and JSON serialization. Without caching, everything works as it's supposed to, and I've got HTTP authentication and compression working as well.
Once I enable caching, by setting request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable);, things go awry - in the production environment. When connecting to a simple WEBrick instance, things work fine, I get HTTP/1.1 304 Not Modified as expected and HttpWebRequest delivers the cached content.
When I try the same against the production server, running nginx/0.8.53 + Phusion Passenger 3.0.0, the application breaks. The first request (uncached) is served properly, but on the second request, which results in the 304 response, I get a WebException stating "The request was aborted: The request was canceled." as soon as I invoke request.GetResponse().
I've run the connections through Fiddler, which hasn't helped a whole lot; both WEBrick and nginx return an empty entity body, albeit with different response headers. Intercepting the request and changing the response headers for nginx to match those of WEBrick didn't change anything, leading me to think that it could be a keep-alive issue; setting request.KeepAlive = false; changes nothing, though - it doesn't break stuff when connecting to WEBrick, and it doesn't fix stuff when connecting to nginx.
For what it's worth, the WebException.InnerException is a NullReferenceException with the following StackTrace:
at System.Net.HttpWebRequest.CheckCacheUpdateOnResponse()
at System.Net.HttpWebRequest.CheckResubmitForCache(Exception& e)
at System.Net.HttpWebRequest.DoSubmitRequestProcessing(Exception& exception)
at System.Net.HttpWebRequest.ProcessResponse()
at System.Net.HttpWebRequest.SetResponse(CoreResponseData coreResponseData)
Headers for the (working) WEBrick connection:
########## request
GET /users/current.json HTTP/1.1
Authorization: Basic *REDACTED*
Content-Type: application/json
Accept: application/json
Accept-Charset: utf-8
Host: testbox.local:3030
If-None-Match: "84a49062768e4ca619b1c081736da20f"
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
########## response
HTTP/1.1 304 Not Modified
X-Ua-Compatible: IE=Edge
Etag: "84a49062768e4ca619b1c081736da20f"
Date: Wed, 01 Dec 2010 18:18:59 GMT
Server: WEBrick/1.3.1 (Ruby/1.8.7/2010-08-16)
X-Runtime: 0.177545
Cache-Control: max-age=0, private, must-revalidate
Set-Cookie: *REDACTED*
Headers for the (exception-throwing) nginx connection:
########## request
GET /users/current.json HTTP/1.1
Authorization: Basic *REDACTED*
Content-Type: application/json
Accept: application/json
Accept-Charset: utf-8
Host: testsystem.local:8080
If-None-Match: "a64560553465e0270cc0a23cc4c33f9f"
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
########## response
HTTP/1.1 304 Not Modified
Connection: keep-alive
Status: 304
X-Powered-By: Phusion Passenger (mod_rails/mod_rack) 3.0.0
ETag: "a64560553465e0270cc0a23cc4c33f9f"
X-UA-Compatible: IE=Edge,chrome=1
X-Runtime: 0.240160
Set-Cookie: *REDACTED*
Cache-Control: max-age=0, private, must-revalidate
Server: nginx/0.8.53 + Phusion Passenger 3.0.0 (mod_rails/mod_rack)
UPDATE:
I tried doing a quick-and-dirty manual ETag cache, but it turns out that's a no-go: I get a WebException when invoking request.GetResponse(), telling me that "The remote server returned an error: (304) Not Modified." - yeah, .NET, I kinda knew that, and I'd like to (attempt to) handle it myself, grr.
UPDATE 2:
Getting closer to the root of the problem. The showstopper seems to be a difference in the response headers for the initial request. WEBrick includes a Date: Wed, 01 Dec 2010 21:30:01 GMT header, which isn't present in the nginx reply. There are other differences as well, but after intercepting the initial nginx reply with Fiddler and adding a Date header, the subsequent HttpWebRequests are able to process the (unmodified) nginx 304 replies.
Going to look for a workaround, as well as getting nginx to add the Date header.
UPDATE 3:
It seems that the server-side issue is with Phusion Passenger; they have an open issue about the lack of a Date header. I'd still say that HttpWebRequest's behavior is... suboptimal.
UPDATE 4:
Added a Microsoft Connect ticket for the bug.
I think the designers find it reasonable to throw an exception when the "expected behavior" (i.e., getting a response body) cannot be completed. You can handle this somewhat intelligently as follows:
catch (WebException ex)
{
    if (ex.Status == WebExceptionStatus.ProtocolError)
    {
        var statusCode = ((HttpWebResponse)ex.Response).StatusCode;
        // Test against the HttpStatusCode enumeration.
    }
    else
    {
        // Do something else, e.g. throw;
    }
}
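Applied to the 304 case from the question, a sketch might look like this (request is the HttpWebRequest already set up, and cachedBody stands in for whatever manual ETag cache is being kept):
string body;
try
{
    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        body = reader.ReadToEnd();      // 200 OK: refresh the cached copy here
    }
}
catch (WebException ex)
{
    var http = ex.Response as HttpWebResponse;
    if (ex.Status == WebExceptionStatus.ProtocolError &&
        http != null && http.StatusCode == HttpStatusCode.NotModified)
    {
        body = cachedBody;              // 304 Not Modified: fall back to the cached copy
    }
    else
    {
        throw;
    }
}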
So, it turns out to be Phusion Passenger (or nginx, depending on how you look at it - and Thin as well) that doesn't add a Date HTTP response header, combined with what I see as a bug in .NET's HttpWebRequest (in my situation there is no If-Modified-Since, so Date shouldn't be necessary), that leads to the problem.
The workaround for this particular case was to edit our Rails ApplicationController:
class ApplicationController < ActionController::Base
  # ...other stuff here
  before_filter :add_date_header

  # Bugfix for the .NET HttpWebRequest 304-handling bug and various
  # webservers' laziness in not adding the Date: response header.
  def add_date_header
    response.headers['Date'] = Time.now.to_s
  end
end
UPDATE:
Turns out it's a bit more complex than "just" setting HttpRequestCachePolicy - to repro, I also need to construct the HTTP Basic Auth header manually. So the involved components are the following:
an HTTP server that doesn't include a Date: response header,
manual construction of the HTTP Authorization request header,
use of HttpRequestCachePolicy.
Smallest repro I've been able to come up with:
namespace Repro
{
    using System;
    using System.IO;
    using System.Net;
    using System.Net.Cache;
    using System.Text;

    class ReproProg
    {
        const string requestUrl = "http://drivelog.miracle.local:3030/users/current.json";

        // Manual construction of HTTP basic auth so we don't get an unnecessary server
        // roundtrip telling us to auth, which is what we get if we simply use
        // HttpWebRequest.Credentials.
        private static void SetAuthorization(HttpWebRequest request, string _username, string _password)
        {
            string userAndPass = string.Format("{0}:{1}", _username, _password);
            byte[] authBytes = Encoding.UTF8.GetBytes(userAndPass.ToCharArray());
            request.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
        }

        static public void DoRequest()
        {
            var request = (HttpWebRequest) WebRequest.Create(requestUrl);
            request.Method = "GET";
            request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable);
            SetAuthorization(request, "user#domain.com", "12345678");

            using(var response = request.GetResponse())
            using(var stream = response.GetResponseStream())
            using(var reader = new StreamReader(stream))
            {
                string reply = reader.ReadToEnd();
                Console.WriteLine("########## Server reply: {0}", reply);
            }
        }

        static public void Main(string[] args)
        {
            DoRequest(); // works
            DoRequest(); // explodes
        }
    }
}
