HTTP HEAD request returns 'ServerProtocolViolation' - C#

I've got an interesting problem...
I'm messing around with a link-checker program; here's the heart of it:
private static string CheckURL(string url)
{
    string status = string.Empty;
    string strProxyURL = "http://blah:1010";
    string strNetworkUserName = "blahblahblah";
    string strNetworkUserPassword = "blahblahblah";
    string strNetworkUserDomain = "blahblah";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    WebProxy proxy = new System.Net.WebProxy(strProxyURL, false);
    proxy.Credentials = new System.Net.NetworkCredential(strNetworkUserName, strNetworkUserPassword, strNetworkUserDomain);
    request.Method = "HEAD";
    request.Proxy = proxy;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            status = response.StatusCode.ToString();
        }
    }
    catch (WebException ex)
    {
        status = ex.Status.ToString();
    }

    return url + ";" + status;
}
... which I pinched from here.
The problem is that for most URLs I feed it, I get an 'OK' status. But I have a page that acts as a PDF viewer that returns an OK when examined with Fiddler, yet shows 'ServerProtocolViolation' as the status within my checker.
I've noticed one oddity in the Fiddler result for this URL: it has three instances of X-XSS-Protection and X-Frame-Options. But that's not going to stop it from working, is it?
Here's the Fiddler data:
HTTP/1.0 200 OK
Cache-Control: private
Pragma: public
Content-Length: 230070
Content-Type: application/pdf
Expires: Tue, 27 Jan 2015 17:17:46 GMT
Server: Microsoft-IIS/7.5
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
shortcut icon: href='http://www.jameshay.co.uk/favicon.ico' type='image/x-icon'
Content-Disposition: inline;filename=JamesHayDocJHMP0016Doc2931.pdf
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
X-AspNet-Version: 4.0.30319
X-UA-Compatible: IE=edge
X-XSS-Protection: 1; mode=block
X-Frame-Options: DENY
Date: Tue, 27 Jan 2015 17:17:45 GMT
X-Cache: MISS from proxy-3_10
X-Cache: MISS from ClientSiteProxy
X-Cache-Lookup: MISS from ClientSiteProxy:3128
Connection: close
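One oddity in that dump besides the duplicates: the `shortcut icon: href=...` line is not a valid HTTP header at all, and the .NET HTTP stack validates response headers strictly, surfacing malformed ones as `ServerProtocolViolation` (Fiddler is far more forgiving, which would explain why it still shows 200 OK). If that header turns out to be the culprit, a commonly cited workaround (a config sketch, not a confirmed fix for this particular server) is to relax header parsing in the client's app.config:

```xml
<configuration>
  <system.net>
    <settings>
      <!-- Lets responses with malformed headers be parsed instead of
           failing with ServerProtocolViolation. -->
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>
```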
Edit (28/01 09:10am):
When using Fiddler I replace the proxy with this...
WebProxy proxy = new System.Net.WebProxy("127.0.0.1", 8888);
The DocumentView page is the only one that still adds the x-xss-protection and x-frame-options via the code behind, as the web.config file has those settings too:
<httpProtocol>
  <customHeaders>
    <clear />
    <add name="X-UA-Compatible" value="IE=edge" />
    <add name="X-XSS-Protection" value="1; mode=block" />
    <add name="X-Frame-Options" value="DENY" />
  </customHeaders>
</httpProtocol>
I presume it's that which is causing the duplication... but is a duplication really going to mess with the response?
(End of edit)
So what can I do to get the HTTP request to come back with an 'OK' within my code? Or is there an alternative way of checking that the URL exists that I could use instead?
Any help, as always, much appreciated :)
Here's an example URL for the PDF viewer

Related

Jira API Help C# HttpClient

Okay, so I'm very new to using APIs in code, and I've been able to use a few that were actually pretty easy, but none of them required authentication. I've been trying to use Jira's REST API via C#'s HttpClient class. See the code below:
public void UpdateJiraIssue(string issueValue)
{
    string url = $@"http://jira.mySite.com/rest/api/2/issue/{issueValue}/editmeta";
    string jsonString = @"myNeatJsonData";
    var content = new StringContent(jsonString, Encoding.UTF8, "application/json");

    // Initialize client
    HttpClient apiClient = new HttpClient();
    apiClient.BaseAddress = new System.Uri(url);
    apiClient.DefaultRequestHeaders.Accept.Clear();
    byte[] cred = UTF8Encoding.UTF8.GetBytes("username:password");
    apiClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(cred));
    apiClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

    async Task RunJiraAPI()
    {
        using (HttpResponseMessage resp = await apiClient.PostAsync("editmeta", content))
        {
            if (resp.IsSuccessStatusCode)
            {
                var responseJson = await resp.Content.ReadAsStringAsync();
            }
        }
    }

    RunJiraAPI();
    return;
}
The problem I run into is that I get a 401 error (Authentication). Here's what my 'resp' object contains when I run the code:
resp: {StatusCode: 401, ReasonPhrase: ' ', Version: 1.1, Content: System.Net.Http.StreamContent, Headers:
{
X-AREQUESTID: 400x1314x1
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: frame-ancestors 'self'
X-ASEN: SEN-11158344
X-AUSERNAME: anonymous
Cache-Control: no-store, no-transform, no-cache
Set-Cookie: atlassian.xsrf.token=B2ZY-C2JQ-1AGH-PBLW_5ccc79da5af8e6abcb9bff5250f3305af3b2877a_lout; Path=/; Secure
WWW-Authenticate: OAuth realm="https%3A%2F%2Fjira.mySite.com"
X-Powered-By: ARR/3.0
X-Powered-By: ASP.NET
Date: Wed, 15 Jan 2020 13:40:22 GMT
Content-Length: 109
Content-Type: application/json; charset=UTF-8
}}
Request Message: {Method: POST, RequestUri: 'https://jira.rhlan.com/rest/api/2/issue/RHD-1116/editmeta', Version: 1.1, Content: System.Net.Http.StringContent, Headers:
{
Authorization: Basic cWE6aGVjc29mdDEyMw==
Accept: application/json
Content-Type: Application/json; charset=utf-8
Content-Length: 70
}}
Status Code: Unauthorized
I need to work on my JSON string a bit to get it working right (which is why I didn't include what it actually contains), but once I get past the authentication error, I'll probably change things to do a GET of a Jira issue via the API so I can see all the JSON data returned that way. Then I'll edit my JSON string accordingly.
Any ideas what I'm doing wrong here?
You can pass in credentials, assuming you have a username and an API token:
string credentials = string.Format("{0}:{1}", username, apitoken);
byte[] byteCredentials = UTF8Encoding.UTF8.GetBytes(credentials);
And in your apiClient you can use it like this:
apiClient.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteCredentials));
You need a username and an API token; the token is used in place of your login password.
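Putting the answer's pieces together, here's a minimal compilable sketch (the `user`/`token` values and the `BuildBasicAuth` helper name are mine, purely for illustration):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public class JiraAuthExample
{
    // Builds the Basic-auth header value from a username and API token.
    public static AuthenticationHeaderValue BuildBasicAuth(string username, string apiToken)
    {
        byte[] byteCredentials = Encoding.UTF8.GetBytes($"{username}:{apiToken}");
        return new AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteCredentials));
    }

    public static void Main()
    {
        var apiClient = new HttpClient();
        apiClient.DefaultRequestHeaders.Authorization = BuildBasicAuth("user", "token");
        // Prints: Basic dXNlcjp0b2tlbg==  (base64 of "user:token")
        Console.WriteLine(apiClient.DefaultRequestHeaders.Authorization);
    }
}
```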

Why is HttpClient so much faster with HTTPS over HTTP?

I was investigating a strange bug the other day in which normalizing a URL would cause a massive 300% slowdown in my application:
if (!TryNormalize(uri, out uri))
    throw new ArgumentException("URL is not a valid YouTube URL!");

string pageSource;
using (var http = new HttpClient())
    pageSource = await http.GetStringAsync(uri);
When TryNormalize was commented out, GetStringAsync would take about 0.5s to complete. Yet when it was uncommented, downloading the string would take up to 2s. It turns out that TryNormalize was prefixing all the URLs it processed with "http://", and adding an extra 's' solved the problem.
So with that said, why does this happen? To my understanding, HTTPS should be slower, because the response has to be encrypted before transmission from the server, while HTTP involves no encryption. And even if I'm no expert on HTTP, 300% seems like quite a dramatic slowdown. Am I missing something here?
Edit: Source code of TryNormalize:
public static bool TryNormalize(string videoUri, out string normalized)
{
    normalized = null;

    var builder = new StringBuilder(videoUri);
    videoUri = builder.Replace("youtu.be/", "youtube.com/watch?v=")
        .Replace("youtube.com/embed/", "youtube.com/watch?v=")
        .Replace("/v/", "/watch?v=")
        .Replace("/watch#", "/watch?")
        .ToString();

    string value;
    if (!Query.TryGetParamValue("v", videoUri, out value))
        return false;

    normalized = "http://youtube.com/watch?v=" + value; // replacing with HTTPS here results in a 1.5s speedup
    return true;
}
This is because there are multiple redirections when you use one of the variations of a YouTube URL. For example, navigating to http://youtu.be/O3UBOOZw-FE results in two redirections (see the Location headers):
1.
HTTP/1.1 302 Found
Date: Fri, 21 Aug 2015 16:52:40 GMT
Server: gwiseguy/2.0
Location: http://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be
Content-Length: 0
Content-Type: text/html
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
2.
HTTP/1.1 301 Moved Permanently
Date: Fri, 21 Aug 2015 16:52:40 GMT
Server: gwiseguy/2.0
Content-Type: text/html; charset=utf-8
X-Content-Type-Options: nosniff
Expires: Tue, 27 Apr 1971 19:44:06 EST
Content-Length: 0
Cache-Control: no-cache
X-XSS-Protection: 1; mode=block; report=https://www.google.com/appserve/security-bugs/log/youtube
Location: https://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be
X-Frame-Options: SAMEORIGIN
until you finally get the url https://www.youtube.com/watch?v=O3UBOOZw-FE&feature=youtu.be
Since those redirections are handled automatically by HttpClient, you only see the final result of 3 requests.
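If you want to see those hops from C#, you can disable automatic redirect handling so each 301/302 is returned to the caller directly (a sketch; it needs network access, and the exact hops depend on what YouTube serves at the time, so no expected output is shown):

```csharp
using System;
using System.Net.Http;

public class RedirectProbe
{
    // Builds an HttpClient that returns 3xx responses to the caller
    // instead of silently following the Location header.
    public static HttpClient CreateNonRedirectingClient()
    {
        var handler = new HttpClientHandler { AllowAutoRedirect = false };
        return new HttpClient(handler);
    }

    public static void Main()
    {
        using (var client = CreateNonRedirectingClient())
        {
            var resp = client.GetAsync("http://youtu.be/O3UBOOZw-FE").Result;
            // For a redirect response, Headers.Location holds the next hop.
            Console.WriteLine($"{(int)resp.StatusCode} {resp.Headers.Location}");
        }
    }
}
```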

Request canceled when downloading feed

I have a worker role on Azure. Every two hours, I download products from 3 different affiliate companies.
The first time the job runs, it works perfectly. The next time the job runs, I get "The request was aborted: The request was canceled." from one affiliate; the other two work perfectly.
It's not just once; it happens every time. I have asked this affiliate company, and they say there is no problem on their side, so it must be my code. I have this to download the JSON doc:
using (var Client = new WebClient())
{
    Client.Headers.Add("X-API-KEY", Key);
    Data = Client.DownloadString(URL);
}
What have I missed?
UPDATE 1:
I have tried this:
HttpWebRequest Req = (HttpWebRequest)WebRequest.Create(URL);
Req.KeepAlive = false;
Req.Headers.Add("X-API-KEY", Key);
Req.Method = "GET";

using (var Resp = Req.GetResponse())
{
    using (var Reader = new StreamReader(Resp.GetResponseStream()))
    {
        Data = Reader.ReadToEnd();
    }
}
Same problem.
UPDATE 2
Request 1
GET https://se.#####.com/1/stores.json HTTP/1.1
X-API-KEY: x.............N
Host: se.#####.com
Connection: Close
Response 1
HTTP/1.1 200
Server: nginx
Date: Mon, 08 Jun 2015 14:56:39 GMT
Content-Type: application/json
Transfer-Encoding: chunked
Connection: close
Set-Cookie: ci_session=..............; expires=Mon, 08-Jun-2015 16:56:39 GMT; Max-Age=7200; path=/
e30
{"status":true,"data":[................]}
0
Request 2
GET https://se.#####.com/1/stores.json HTTP/1.1
X-API-KEY: x.............N
Host: se.#####.com
Connection: Close
Response 2
HTTP/1.1 200
Server: nginx
Date: Mon, 08 Jun 2015 15:06:29 GMT
Content-Type: application/json
Transfer-Encoding: chunked
Connection: close
Set-Cookie: ci_session=..................; expires=Mon, 08-Jun-2015 17:06:29 GMT; Max-Age=7200; path=/
e30
{"status":true,"data":[.....................]}
0
UPDATE 3
[TestMethod]
public void DownloadTest()
{
    Test();
    Test();
    Test();
}

private static void Test()
{
    const string merchantsUrl = "https://se.#####.com/1/stores.json";
    string Data;

    var Req = (HttpWebRequest)WebRequest.Create(merchantsUrl);
    Req.KeepAlive = false;
    Req.Headers.Add("X-API-KEY", ".....");
    Req.Method = "GET";

    using (var Resp = Req.GetResponse())
    {
        using (var Reader = new StreamReader(Resp.GetResponseStream()))
        {
            Data = Reader.ReadToEnd();
        }
    }
}
Change the order of the requests and see whether the request to the same company still fails.
By default, WebClient opens connections with Keep-Alive, which might be causing this issue, so try explicitly setting KeepAlive to false.
Capture the request details for the first and second runs in Fiddler or a similar tool; that could give you more detail for tracking down the issue.
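WebClient doesn't expose a KeepAlive property directly; the usual trick is to subclass it and override GetWebRequest. A minimal sketch (the class name and the public CreateRequest test hook are mine):

```csharp
using System;
using System.Net;

// A WebClient whose underlying HttpWebRequests send "Connection: close"
// instead of the default keep-alive behaviour.
public class NonKeepAliveWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        if (request is HttpWebRequest httpRequest)
            httpRequest.KeepAlive = false;
        return request;
    }

    // Public wrapper so the configuration can be inspected without a network call.
    public WebRequest CreateRequest(Uri address) => GetWebRequest(address);
}

public class Program
{
    public static void Main()
    {
        using (var client = new NonKeepAliveWebClient())
        {
            var req = (HttpWebRequest)client.CreateRequest(new Uri("http://example.com/"));
            Console.WriteLine(req.KeepAlive); // False
        }
    }
}
```

In the download loop, `new NonKeepAliveWebClient()` would then take the place of `new WebClient()`.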

HTTP response error 503

I'm trying to download a web page's data from a website that is hosted by CloudFlare. It uses HTTPS and obtains an ID for the connection before getting into the page.
I'm trying to get the ID using WebRequest and WebResponse, but I'm getting the following error:
An unhandled exception of type 'System.Net.WebException' occurred in System.dll
Additional information: The remote server returned an error: (503) Server Unavailable.
I tried to make the request from Fiddler and here is the response:
HTTP/1.1 503 Service Temporarily Unavailable
Date: Tue, 12 May 2015 13:38:17 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=d57a7d982035dad7ebafe63444d125e451431437897; expires=Wed, 11-May-16 13:38:17 GMT; path=/; domain=.hornystress.me; HttpOnly
X-Frame-Options: SAMEORIGIN
Refresh: 8;URL=/cdn-cgi/l/chk_jschl?pass=1431437901.815-Ym1g5qTodK
Cache-Control: no-cache
Server: cloudflare-nginx
CF-RAY: 1e5685ed559606ee-LHR
Here is my code :
public static string GetCookie(string link)
{
WebRequest request = WebRequest.Create("https://hornystress.me");
request.Proxy = WebProxy.GetDefaultProxy();
request.Timeout *= 100;
WebResponse response = request.GetResponse();
return response.Headers.Get("Set-Cookie");
}
Whatever you're doing may look like an attack against the site, and it is triggering a security feature. You might want to ask the site owner to whitelist the IP(s) the calls are being made from.
It's Cloudflare security, designed to stop the exact thing you're trying to do. You won't be able to access that site using a script.
The problem was that the website's normal response is error code 503, together with the cookie that I want, so the runtime throws a WebException, which I should catch:
public static string GetCookie()
{
    WebRequest request = WebRequest.Create("https://hornystress.me");
    request.Proxy = WebProxy.GetDefaultProxy();
    request.Timeout *= 100;

    string cookie;
    WebResponse response;
    try
    {
        response = request.GetResponse();
        cookie = response.Headers.Get("Set-Cookie");
    }
    catch (WebException we)
    {
        cookie = we.Response.Headers.Get("Set-Cookie");
    }
    return cookie;
}

Call a PHP-based web service

I'm working on an ASP.NET web application, built in C#. I have to implement a third-party web service that was created using PHP. It is a very simple service containing only one function. I added the service reference using the WSDL; so far so good.
When I call the web service function with the correct parameters, it always returns null. I started troubleshooting with SoapUI: I captured the SOAP message from the application, pasted it into SoapUI, executed it, and it returned the correct message. Using Fiddler, I discovered something weird in the response from the web service, as shown in the raw output:
HTTP/1.1 200 OK
Date: Wed, 21 Nov 2012 15:24:31 GMT
Server: Apache/2.2.16 (Unix) mod_ssl/2.2.16 OpenSSL/0.9.8o
X-Powered-By: PHP/5.2.13-pl1-gentoo
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: PHPSESSID=ddc342cfe7e56e77456fe31b758bf3de; path=/
Vary: Accept-Encoding,User-Agent
Content-Encoding: gzip
Content-Length: 812
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/xml; charset=utf-8
[... 812 bytes of gzip-compressed body, rendered here as garbage characters ...]
The headers are displayed correctly, but the response body is gzip-encoded and needs to be decompressed. Both SoapUI and Fiddler are able to decode the response, but the generated proxy class can't, and returns null.
How can I overcome this problem? Any help is greatly appreciated!
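Since the sticking point is the gzip body, one thing worth trying when driving the call through HttpWebRequest is opting in to transparent decompression via AutomaticDecompression (a sketch; the endpoint URL is a placeholder, not the real service address):

```csharp
using System;
using System.Net;

public class GzipRequestExample
{
    // Configures an HttpWebRequest so gzip/deflate response bodies
    // are decompressed by the framework before you read the stream.
    public static HttpWebRequest CreateDecompressingRequest(string endpoint)
    {
        var req = (HttpWebRequest)WebRequest.Create(endpoint);
        req.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        return req;
    }

    public static void Main()
    {
        var req = CreateDecompressingRequest("http://example.com/soap-endpoint");
        Console.WriteLine(req.AutomaticDecompression); // GZip, Deflate
    }
}
```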
EDIT:
The way the service is called:
LisenceServiceFR.ServiceRegistration_PortTypeClient client = new LisenceServiceFR.ServiceRegistration_PortTypeClient();
LisenceServiceFR.aVehicleInfo info = client.getVehicleInfo("xxx", "xxx", licensePlate, "localhost");
Edit 2:
The response XML from Fiddler.
<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns1="http://services.audaconfr.com/ServiceRegistration.wsdl">
  <SOAP-ENV:Body>
    <SOAP-ENV:getVehicleInfoResponse>
      <aVehicle>
        <ns1:errorCode>200</ns1:errorCode>
        <ns1:errorMessage>Success</ns1:errorMessage>
        <ns1:vehicleXml>
          <vehicule>
            <carr>MONOSPACE COMPACT</carr>
            <carr_cg>CI</carr_cg>
            <co2>152</co2>
            <!-- etc -->
          </vehicule>
        </ns1:vehicleXml>
      </aVehicle>
    </SOAP-ENV:getVehicleInfoResponse>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
I ended up using HttpWebRequest to call the web service:
System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
doc.InnerXml = xml;

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(endPoint);
req.Timeout = 100000000;
if (proxy != null)
    req.Proxy = new WebProxy(proxy, true);
req.Headers.Add("SOAPAction", "");
req.ContentType = "application/soap+xml;charset=\"utf-8\"";
req.Accept = "application/x-www-form-urlencoded";
req.Method = "POST";

Stream stm = req.GetRequestStream();
doc.Save(stm);
stm.Close();

WebResponse resp = req.GetResponse();
stm = resp.GetResponseStream();
StreamReader r = new StreamReader(stm);
string responseData = r.ReadToEnd();

XDocument response = XDocument.Parse(responseData);
/* extract data from response */
It was not the solution I was looking for, but it works like a charm.
