Why is HttpClient so much faster with HTTPS over HTTP? - c#

I was investigating a strange bug the other day in which normalizing a URL would cause a massive 300% slowdown in my application:
if (!TryNormalize(uri, out uri))
    throw new ArgumentException("URL is not a valid YouTube URL!");

string pageSource;
using (var http = new HttpClient())
    pageSource = await http.GetStringAsync(uri);
When TryNormalize was commented out, GetStringAsync would take about 0.5s to complete. Yet when it was uncommented, downloading the string would take up to 2s. It turned out that TryNormalize was prefixing all the URLs it processed with "http://", and adding an extra "s" solved the problem.
So with that said, why does this happen? To my understanding, HTTPS should if anything be slower, because the response has to be encrypted before transmission from the server, a step HTTP skips entirely. And even if I'm no expert on HTTP, 300% seems like quite a dramatic slowdown. Am I missing something here?
Edit: Source code of TryNormalize:
public static bool TryNormalize(string videoUri, out string normalized)
{
    normalized = null;

    var builder = new StringBuilder(videoUri);
    videoUri = builder.Replace("youtu.be/", "youtube.com/watch?v=")
                      .Replace("youtube.com/embed/", "youtube.com/watch?v=")
                      .Replace("/v/", "/watch?v=")
                      .Replace("/watch#", "/watch?")
                      .ToString();

    string value;
    if (!Query.TryGetParamValue("v", videoUri, out value))
        return false;

    normalized = "http://youtube.com/watch?v=" + value; // replacing with HTTPS here results in 1.5s speedup
    return true;
}

This is because there are several redirections when you use the variations of the YouTube URL. For example, navigating to http://youtu.be/O3UBOOZw-FE results in two redirections (see the Location header in each response):
1.
HTTP/1.1 302 Found
Date: Fri, 21 Aug 2015 16:52:40 GMT
Server: gwiseguy/2.0
Location: http://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be
Content-Length: 0
Content-Type: text/html
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
2.
HTTP/1.1 301 Moved Permanently
Date: Fri, 21 Aug 2015 16:52:40 GMT
Server: gwiseguy/2.0
Content-Type: text/html; charset=utf-8
X-Content-Type-Options: nosniff
Expires: Tue, 27 Apr 1971 19:44:06 EST
Content-Length: 0
Cache-Control: no-cache
X-XSS-Protection: 1; mode=block; report=https://www.google.com/appserve/security-bugs/log/youtube
Location: https://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be
X-Frame-Options: SAMEORIGIN
until you finally get the URL https://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be.
Since those redirections are handled automatically by HttpClient, you only see the final result of the three requests.
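You can see the individual hops yourself by turning off HttpClient's automatic redirect handling. A minimal sketch (using the youtu.be URL from the example above):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class RedirectProbe
{
    static async Task Main()
    {
        // AllowAutoRedirect = false makes each 301/302 visible
        // instead of being followed silently.
        var handler = new HttpClientHandler { AllowAutoRedirect = false };
        using (var http = new HttpClient(handler))
        {
            var uri = new Uri("http://youtu.be/O3UBOOZw-FE");
            while (true)
            {
                var response = await http.GetAsync(uri);
                Console.WriteLine($"{(int)response.StatusCode} {uri}");
                if (response.Headers.Location == null)
                    break; // no more redirects; this is the final response
                // Location may be relative; resolve it against the current URI.
                uri = new Uri(uri, response.Headers.Location);
            }
        }
    }
}
```

Each printed line corresponds to one round trip, which is where the extra latency of the http:// form comes from.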

Related

(failed)net::ERR_HTTP2_PROTOCOL_ERROR for JWT

I'm using this code in my .NET API:
jwtOptions.Events = new JwtBearerEvents()
{
    OnAuthenticationFailed = c =>
    {
        c.NoResult();
        Logger.LogException(c.Exception);
        c.Response.StatusCode = 401;
        c.Response.ContentType = "text/plain";
        if (ApiConfig.IsDeveloping)
        {
            return c.Response.WriteAsync(c.Exception.ToString());
        }
        return c.Response.WriteAsync("An error occured processing your authentication.");
    },
    OnTokenValidated = c =>
    {
        if (c.Principal.IsInRole("SuperAdmin"))
        {
            c.HttpContext.Items["IsSuperAdmin"] = true;
        }
        return Task.CompletedTask;
    }
};
But this causes my React client to not get the errors if the authentication token is invalid.
When I look at the Chrome's Network tab, I see this error:
(failed)net::ERR_HTTP2_PROTOCOL_ERROR
What should I do? What have I done wrong in my .NET code?
Update: I realized that it depends on the token that I send. If I send a totally invalid token like lkjsdlkjf, then I get the 401 response. But if I use a well-formed JWT from the internet that is invalid for my current session, I get this error.
Update 2
This is the request/response I logged using Wireshark. The request and response seem fine, but Chrome and Postman cannot parse the response correctly:
GET /locale/data HTTP/1.1
X-Forwarded-Host: api.admin.example.local
X-Forwarded-Proto: https
Connection: Upgrade
Host: api.admin.example.local
Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c
User-Agent: PostmanRuntime/7.29.2
Accept: */*
Postman-Token: 02c8e0f2-0afb-4112-8abf-c4815ea66176
Accept-Encoding: gzip, deflate, br
HTTP/1.1 401 Unauthorized
Content-Type: text/plain
Date: Mon, 09 Jan 2023 07:47:32 GMT
Server: Kestrel
Cache-Control: no-cache, no-store, must-revalidate
Content-Encoding: br
Expires: 0
Pragma: no-cache
Transfer-Encoding: chunked
Vary: Accept-Encoding
Milliseconds: 2238
33
...An error occured processing your authentication.

How to fix Bad Request when writing AppProperties for the file

There is a problem writing AppProperties for a file using the Google Drive API. The request executes, but sometimes a 400 error is returned.
The request is sent via the .NET API:
public Resource ResourcePatch(string FileID, IDictionary<string, string> properties)
{
    GoogleAPI.File file = new GoogleAPI.File() { AppProperties = properties };
    var updateRequest = _driveService.Files.Update(file, FileID);
    updateRequest.Fields = "id, appProperties";
    file = updateRequest.Execute();
    ...
This is what is sent and what I get.
Request:
PATCH https://www.googleapis.com/drive/v3/files/1DUpjXjNro6LreMgyVbYW7k05eBes0A3o?fields=id%2C%20appProperties HTTP/1.1
User-Agent: Sberdisk google-api-dotnet-client/1.38.0.0 (gzip)
Authorization: Bearer ...
Content-Type: application/json; charset=utf-8
Host: www.googleapis.com
Content-Length: 49
Accept-Encoding: gzip, deflate
{"appProperties":{"SHARED_PROPERTY_NAME":"true"}}
Response, sometimes:
HTTP/1.0 200 This buggy server did not return headers
<html><title>Error 400 (Bad Request)!!1</title></html>
Sometimes:
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: Mon, 01 Jan 1990 00:00:00 GMT
Date: Thu, 07 Feb 2019 12:39:13 GMT
Vary: Origin
Vary: X-Origin
Content-Type: application/json; charset=UTF-8
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Server: GSE
Alt-Svc: quic=":443"; ma=2592000; v="44,43,39"
Content-Length: 183
{
"id": "1nWtqhBYKSl0YhtLMypEx5rmAuo43-3Ts",
"appProperties": {
"SHARED_PROPERTY_NAME": "true"
}
}
Sometimes the request succeeds, sometimes it returns Bad Request. I don't see any pattern in this. What could be the reason?

C# HttpClient PostAsJsonAsync -> Error reading string

Hi, I'm trying to call a web API from the server side with:
using (var client = new HttpClient())
{
    using (var rsp = client.PostAsJsonAsync<Request>(url, model).Result)
    {
        if (!rsp.IsSuccessStatusCode)
        {
            // throw an appropriate exception
        }
        var result = rsp.Content.ReadAsAsync<string>().Result;
    }
}
but I get the error:
Error reading string. Unexpected token: StartObject. Path '', line 1, position 1.
If I call the same URL from jQuery,
$.post('http://localhost/api/Test')
the server returns:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json
Expires: -1
Server: Microsoft-IIS/8.5
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Sun, 25 Oct 2015 12:15:56 GMT
Content-Length: 104
{
"Header": {
"Token": "Response",
"Timestamp": "2015-10-25T14:15:56.0092197+02:00"
}
}
The "model" arrives at the API controller, but I can't read the response from the request.
ReadAsAsync<T> attempts to deserialize the response to type T. In this case, you're asking it to deserialize a JSON object to a string, which doesn't make sense. Either use a type matching the response (i.e. a custom data structure containing Header, Token, etc.) or use ReadAsStringAsync() if you really want to get the raw string.
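For the JSON shown in the question, the matching types could look like this (the class names ApiResponse and ResponseHeader are made up for illustration; only the property names need to match the payload):

```csharp
using System;
using System.Net.Http;

// Shape mirrors the JSON body: { "Header": { "Token": ..., "Timestamp": ... } }
public class ResponseHeader
{
    public string Token { get; set; }
    public DateTimeOffset Timestamp { get; set; }
}

public class ApiResponse
{
    public ResponseHeader Header { get; set; }
}

// Then, instead of ReadAsAsync<string>():
//     var result = await rsp.Content.ReadAsAsync<ApiResponse>();
//     string token = result.Header.Token;
//
// Or, to get the raw JSON text:
//     var raw = await rsp.Content.ReadAsStringAsync();
```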

How to get the header values in a GET request?

System.Net.WebRequest req = System.Net.WebRequest.Create(URL);
req.Proxy = null;
System.Net.WebResponse resp = req.GetResponse();
System.IO.StreamReader sr = new System.IO.StreamReader(resp.GetResponseStream());
string result = sr.ReadToEnd().Trim();
I have this code which makes a GET request on my URL and returns the JSON data. However, I also need to get the header values from it.
Example output of the URL is the following:
Content-Type: application/json
Content-Language: en
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Vary: Cookie, Accept-Language
Pragma: no-cache
Cache-Control: private, no-cache, no-store, must-revalidate
Set-Cookie: csrftoken=66161e4f97cbf771199ff78cfeea835e; expires=Sat, 20-Feb-2016 06:49:03 GMT; Max-Age=31449600; Path=/
Set-Cookie: mid=VOgqXwABAAHBumwiwEqLc2ScukeD; expires=Fri, 16-Feb-2035 06:49:03 GMT; Max-Age=630720000; Path=/
Connection: close
Content-Length: 108
{"status":"ok","shift":18,"header":"638wprvx7lg5Um0dZzBAKfjIkML12ChQ","edges":100,"iterations":10,"size":42}
Using my code, I can get the last JSON data returned, but I also need the headers. How can I do that? Thanks.
You can use the Headers property of the response:
resp.Headers
to get the headers for the response.
The HttpWebRequest has a built-in property called Headers.
Please have a look at the MSDN page here:
https://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.headers(v=vs.110).aspx
At the bottom of the page you'll find a very simple code example that should get you started!
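Concretely, with the question's own code the headers can be read straight off the WebResponse object, before or after reading the body. A minimal sketch (URL is the same variable as in the question):

```csharp
using System;
using System.IO;
using System.Net;

var req = WebRequest.Create(URL);
req.Proxy = null;
using (var resp = req.GetResponse())
{
    // Dump all response headers, including Set-Cookie and Content-Type.
    foreach (string name in resp.Headers.AllKeys)
        Console.WriteLine($"{name}: {resp.Headers[name]}");

    using (var sr = new StreamReader(resp.GetResponseStream()))
    {
        string result = sr.ReadToEnd().Trim(); // the JSON body, as before
    }
}
```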

Http Header request returns 'ServerProtocolViolation'

I've got an interesting problem...
I'm messing around with a link checker program, here's the heart of it:
private static string CheckURL(string url)
{
    string status = string.Empty;
    string strProxyURL = "http://blah:1010";
    string strNetworkUserName = "blahblahblah";
    string strNetworkUserPassword = "blahblahblah";
    string strNetworkUserDomain = "blahblah";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    WebProxy proxy = new System.Net.WebProxy(strProxyURL, false);
    proxy.Credentials = new System.Net.NetworkCredential(strNetworkUserName, strNetworkUserPassword, strNetworkUserDomain);
    request.Method = "HEAD";
    request.Proxy = proxy;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            status = response.StatusCode.ToString();
        }
    }
    catch (WebException ex)
    {
        status = ex.Status.ToString();
    }

    return url + ";" + status;
}
... which I pinched from here.
The problem is that most URLs I feed it come back with an 'OK' status. But I have a page that acts as a PDF viewer which returns an OK when examined with Fiddler, yet shows 'ServerProtocolViolation' as the status within my checker.
I've noticed one oddity in the Fiddler result for this URL: it has 3 instances of x-xss-protection and x-frame-options. But that's not going to stop it from working, is it?
Here's the Fiddler data:
HTTP/1.0 200 OK
Cache-Control: private
Pragma: public
Content-Length: 230070
Content-Type: application/pdf
Expires: Tue, 27 Jan 2015 17:17:46 GMT
Server: Microsoft-IIS/7.5
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
shortcut icon: href='http://www.jameshay.co.uk/favicon.ico' type='image/x-icon'
Content-Disposition: inline;filename=JamesHayDocJHMP0016Doc2931.pdf
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
X-AspNet-Version: 4.0.30319
X-UA-Compatible: IE=edge
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
Date: Tue, 27 Jan 2015 17:17:45 GMT
X-Cache: MISS from proxy-3_10
X-Cache: MISS from ClientSiteProxy
X-Cache-Lookup: MISS from ClientSiteProxy:3128
Connection: close
Edit (28/01 09:10am):
When using Fiddler I replace the proxy with this...
WebProxy proxy = new System.Net.WebProxy("127.0.0.1", 8888);
The DocumentView page is the only one that still adds the x-xss-protection and x-frame-options via the code behind, as the web.config file has those settings too:
<httpProtocol>
  <customHeaders>
    <clear />
    <add name="X-UA-Compatible" value="IE=edge" />
    <add name="X-XSS-Protection" value="1; mode=block" />
    <add name="X-Frame-Options" value="DENY" />
  </customHeaders>
</httpProtocol>
I presume it's that which is causing the duplication... but is a duplication really going to mess with the response?
(End of edit)
So what can I do to get the HTTP request to come back with an 'OK' in my code, or is there an alternative way of checking that the URL exists?
Any help, as always, much appreciated :)
Here's an example URL for the PDF viewer
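One thing worth noting, as an assumption rather than a confirmed diagnosis: the "shortcut icon: href=..." line in the Fiddler dump is not a valid HTTP header (header names cannot contain spaces), and malformed headers are a classic trigger for ServerProtocolViolation in .NET. The framework can be told to tolerate such headers via the useUnsafeHeaderParsing setting in app.config:

```xml
<configuration>
  <system.net>
    <settings>
      <!-- Relaxes .NET's strict header validation; use with care,
           as it applies process-wide. -->
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>
```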
