WebClient request timeout - C#

I'm trying to return some HTML from a URL via the WebClient class:
Response.Write(new System.Net.WebClient().DownloadString("http://www.partnersite.com/html/"));
Whenever this code runs I get a timeout.
The request was aborted: The operation has timed out.
If I browse directly to "http://www.partnersite.com/html/" I get an immediate response.
The timeout only seems to happen with this particular website; if I make a request to another site, e.g.
Response.Write(new System.Net.WebClient().DownloadString("http://www.google.com"));
the HTML is returned instantly.
Is this simply a case of something going on at the partner site which means it's not responding? Or is there something else I can try before making that call?
Thanks.

Is the site accessible from Internet Explorer?
If not, and the site is on the same network, check whether you have bypassed the proxy for the site in the Internet Explorer options.
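If a proxy (or WebClient's default 100-second timeout) turns out to be the culprit, a minimal sketch like this shows how to control both explicitly before making the call; the proxy address is a placeholder, and overriding GetWebRequest is a standard way to adjust the timeout:

using System;
using System.Net;

// A WebClient whose proxy and timeout can be set explicitly.
class ConfigurableWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        request.Timeout = 30000; // 30 seconds instead of the 100-second default
        return request;
    }
}

// Usage (e.g., in the page's code-behind):
var client = new ConfigurableWebClient();
client.Proxy = null; // bypass any system/IE proxy entirely
// ...or route through a known proxy (placeholder address):
// client.Proxy = new WebProxy("proxy.example.com", 8080);
Response.Write(client.DownloadString("http://www.partnersite.com/html/"));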

Related

504 error when requesting file from url in codebehind

I'm trying to read a file, using a URL, however I keep getting a 504 Gateway Timeout.
The user submits a form, and I need to grab some information from a rather large XML file (45 MB) using an XmlTextReader. However, each time the request is made it comes back with a 504 Gateway Timeout on one server, while it works fine on another. The 504 error is thrown after about 20 seconds; on the server where it works, the file is read much faster than that.
XmlTextReader reader = new XmlTextReader(localUrl);
The strange issue is that IIS is not even logging this request. I've gone through the logs and I can find the entry on the system that works, but on the system that doesn't work there is no request in the IIS logs, making it look like it's not even hitting IIS.
It seems the problem was that the user the AppPool runs under had its proxy settings set up incorrectly, so it was unable to make the call it needed to make.
Once I corrected the proxy settings for that user, it started to work.
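If fixing the user's settings isn't an option, a sketch like the following (the proxy address is a placeholder) pins the proxy in code so the request no longer depends on the AppPool user's per-user IE settings:

using System.Net;
using System.Xml;

// Pin the proxy for all outgoing WebRequests in this process, instead of
// relying on the AppPool user's per-user proxy settings.
// "proxy.example.com"/8080 is a placeholder; substitute your own,
// or assign null to go direct with no proxy at all.
WebRequest.DefaultWebProxy = new WebProxy("proxy.example.com", 8080);

// XmlTextReader resolves http:// URLs through WebRequest, so it will
// now use the proxy configured above.
XmlTextReader reader = new XmlTextReader(localUrl);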

FiddlerCore behavior when network is disconnected

I'm handling local requests by using FiddlerCore like this:
private static void FiddlerApplication_BeforeRequest(Session session)
{
    // Handle requests aimed at the local machine ourselves;
    // everything else passes through untouched.
    if (session.hostname.ToLower() == "localhost")
        ProcessRequest(session);
}
Everything works well, but when the network is down, I get the following message:
"[Fiddler] DNS Lookup for "www.google.com" failed. The system reports that no network connection is available. No such host is known"
My question is:
How should I configure FiddlerCore so that, when the network is down, I receive the regular 404 page?
You're noting that when a proxy is present, the browser doesn't show its default error pages. That is a correct statement, and it's not unique to Fiddler.
You're confusing folks by talking about a "regular 404 response". That's confusing because the page you're describing has nothing to do with an HTTP 404: it's the browser's DNS Lookup Failure or Server Unreachable error page, which it shows when a DNS lookup fails or a TCP/IP connection attempt fails. Neither of those is a 404, which is an error code that can be returned by a server only after the DNS lookup has succeeded and the TCP/IP connection has been established.
As for your actual question, how to make a request through a proxy result in the same error page that would be shown if the proxy weren't present: in general, you can't. You could do something goofy like copying the HTML out of the browser's error page and having the proxy return that, but because each browser (and version) may use different error pages, your ruse would be easily detectable.
One thing you could try is to make a given site bypass the proxy (such that the proxy is only used for the hosts you care about). To do that, you'd create a Proxy Auto-Configuration script (a PAC file) and have its FindProxyForURL function return DIRECT for everything except the site(s) you want to go through the proxy. See the Configuration Script section of this post.
But, stepping back a bit, do you even need a proxy at all? Can't you just run your web server on localhost? When you start up Fiddler/FiddlerCore, it listens on localhost:[port]. Just direct your HTTP request there without setting Fiddler as the system proxy.
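A minimal sketch of that last suggestion, assuming the usual FiddlerCore startup API and an arbitrary port, starts FiddlerCore without registering it as the system proxy and points one client at it explicitly:

using System.Net;
using Fiddler;

// Start FiddlerCore on a fixed port WITHOUT touching the system proxy
// (FiddlerCoreStartupFlags.None: no system-proxy registration).
FiddlerApplication.BeforeRequest += FiddlerApplication_BeforeRequest;
FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.None);

// Only this client goes through FiddlerCore; the rest of the system,
// and its DNS lookups, are untouched. The URL is a placeholder.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost/myapp");
request.Proxy = new WebProxy("127.0.0.1", 8877);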

HTTP POST with Partial URL in C#.NET

I have a web application (which I have no control over) that I need to send an HTTP POST to programmatically. Currently I'm using HttpWebRequest, like this:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://someserver.com/blah/blah.aspx");
However, the application returns an "Unknown Server Error" (not the IIS error, a custom application error page) when posting the data. Using Fiddler to compare my POST against IE's POST, I can see the only difference is in the POST line of the request:
In Internet Explorer, Fiddler (RAW view) shows the traffic as:
POST /blah/blah.aspx HTTP/1.1
In my C# program, Fiddler (RAW view) records the traffic as:
POST https://someserver.com/blah/blah.aspx HTTP/1.1
This is the only difference between the two requests.
From what I've researched so far, it seems there is no way to make HttpWebRequest.Create post the relative URL. Note: I see many posts on "how to use relative URLs", but these suggestions do not work, as the actual post is still done using an absolute URL (when you sniff the HTTP traffic).
What is the simplest way to accomplish this post with a relative URL?
(Traffic is NOT going through a proxy)
Update: For the time being I'm using IE automation to run scheduled perf tests instead of the method above. I may look at another scripting language, as I did want to test without any browser.
No, you can't do a POST without the server in the URL.
One possible reason your program fails is that it does not use the correct proxy and as a result can't resolve the server name.
Note: Fiddler shows path and host separately in the view you are talking about.
Configure your program to use Fiddler as a proxy (127.0.0.1:8888) and compare the requests you are making with the browser's. Don't forget to switch Fiddler to "show all processes".
Here is an article on configuring Fiddler for different types of environments, including C# code: Fiddler: Configuring clients
// Route this request through Fiddler so its RAW view can be
// compared against the browser's request side by side.
objRequest = (HttpWebRequest)WebRequest.Create(url);
objRequest.Proxy = new WebProxy("127.0.0.1", 8888);
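For comparison purposes, a complete, self-contained version of such a POST routed through Fiddler might look like this (the form fields are placeholders):

using System;
using System.IO;
using System.Net;
using System.Text;

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://someserver.com/blah/blah.aspx");
request.Proxy = new WebProxy("127.0.0.1", 8888); // Fiddler's default port
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

// Placeholder form body; substitute the real fields.
byte[] body = Encoding.UTF8.GetBytes("field1=value1&field2=value2");
request.ContentLength = body.Length;
using (Stream s = request.GetRequestStream())
    s.Write(body, 0, body.Length);

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    Console.WriteLine(reader.ReadToEnd());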

webserver forcing browser to only open one parallel connection

Hello, I'm trying to write a webserver in C#.
The server is going to dynamically create a website based on some templates I defined.
The problem I have is that the webpage can only be accessed if you enter a password.
So I decided to make the browser open a keep-alive connection and pass every request through it.
That way I can tell logged-in clients apart from clients that are not logged in. Now the problem is that when it comes to requesting the images on the website, Firefox and Google Chrome just open another connection from the same IP but a different port.
My webserver thinks it's another client and sends the login HTTP page instead of the requested image.
So every time the website loads, only 1-4 images actually get sent.
Now my question: is there any way to force the browser NOT to open parallel connections?
Or, if that's not possible, how should I deal with the problem?
For those who like to see some code, here is what the core of the server looks like, just to illustrate my problem:
TcpListener listener;                   // accepts incoming connections
List<Thread> thtt = new List<Thread>(); // keeps track of client threads

void ThreadStart()
{
    while (true)
    {
        // Every accepted TCP connection is handed off to its own thread...
        RunClient(listener.AcceptTcpClient());
    }
}

void RunClient(TcpClient c)
{
    Thread tht = new Thread(new ParameterizedThreadStart(RunIt));
    tht.IsBackground = true;
    tht.Start(c); // ...and RunIt sends the login page from there
    thtt.Add(tht);
}
Thanks in advance, Alex
Authenticating a HTTP connection rather than individual requests is wrong, wrong, wrong. Even if you could make the browser reuse a single connection (which you can't, because that's not how HTTP works), you wouldn't be able to count on this being respected by proxies or transparent web caches.
This is (some of) what cookies were invented for. Use them, or some kind of session identifier built into the URLs.
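A minimal sketch of the cookie approach for a hand-rolled server; the names here (sessions, IssueSessionCookie, IsLoggedIn) are illustrative, not from the question's code:

using System;
using System.Collections.Generic;

class SessionAuth
{
    // Illustrative in-memory session store: session ID -> logged-in flag.
    static Dictionary<string, bool> sessions = new Dictionary<string, bool>();

    // After a successful login, emit this header on the response.
    public static string IssueSessionCookie()
    {
        string id = Guid.NewGuid().ToString("N");
        sessions[id] = true;
        return "Set-Cookie: session=" + id + "; HttpOnly\r\n";
    }

    // Authenticate every request by its Cookie header, never by the
    // TCP connection it arrived on.
    public static bool IsLoggedIn(string cookieHeader)
    {
        // cookieHeader is the raw "Cookie:" header value, e.g. "session=abc123"
        if (cookieHeader == null) return false;
        foreach (string part in cookieHeader.Split(';'))
        {
            string p = part.Trim();
            if (p.StartsWith("session="))
                return sessions.ContainsKey(p.Substring("session=".Length));
        }
        return false;
    }
}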

HttpWebRequest returns 404s for 302s only in Internet Explorer

I have a Silverlight (v3) application that uses WebRequest to make an HTTP POST request to a webpage on the same website as the Silverlight app. This HTTP request gets back a 302 (a redirect) to another page on the same website, which HttpWebRequest is automatically supposed to follow (according to the documentation).
There's nothing particularly special about the code that makes the request (it uses the browser's HTTP stack, it is not configured to use the alternate inbuilt Silverlight HTTP stack):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(String.Format("{0}?name={1}&size={2}", _UploadUrl, Uri.EscapeUriString(Name), TotalBytes));
request.Method = "POST";
All this works fine in Firefox and Chrome; Silverlight makes the POST HTTP request, receives a 302 response and automatically does a GET HTTP request of the specified redirect URL and returns that to me (I know this because I used Fiddler to watch the HTTP requests going on).
However, in Internet Explorer (v8), Silverlight does the POST HTTP request and then throws a WebException with a 404 error code!
Using Fiddler, I can see that Silverlight/Internet Explorer was successfully returned the 302 status code for the request, and I assume the 404 status code (and associated WebException) I get in Silverlight arises because, as far as I know, HTTP requests made via the browser stack can only return 200 or 404 due to its limitations. The real question is: why does Internet Explorer not follow the redirect like the other browsers do?
Thanks in advance for any help!
EDIT: I would prefer not to use the Silverlight client HTTP stack because, to my knowledge, requests issued by it do not include cookies that are part of the browser's session, critically including the ASP.NET authentication cookie that I need attached to the HTTP requests made by the Silverlight control.
EDIT 2: I have discovered that Internet Explorer only exhibits this behaviour when you do a POST request. A GET request redirects successfully. This seems like pretty bad behaviour considering how many websites now do things in the Post-Redirect-Get style.
IE is closer to the specification, in that when responding to a 302 for a POST the user agent should send a POST (though it should not do so without user confirmation).
On the other hand, FF and Chrome are deliberately wrong, copying the ways in which user agents were frequently wrong some considerable time ago (the problem goes back to the early days of HTTP).
For this reason, 307 was introduced in HTTP/1.1 to make it clearer that the same HTTP method should be used (i.e., in this case, a POST), while 303 has always meant that one should use GET.
Therefore, instead of doing a Response.Redirect, which results in a 302 that different user agents will handle in different ways, send a 303. The following code does so (and includes a valid entity body just to stay within the letter of the spec). There is an overload so you can call it with either a Uri or a string:
private void SeeOther(Uri uri)
{
    // The Location header should be absolute; resolve against the current request.
    if (!uri.IsAbsoluteUri)
        uri = new Uri(Request.Url, uri);
    Response.StatusCode = 303;
    Response.AddHeader("Location", uri.AbsoluteUri);
    // A valid entity body, to stay within the letter of the spec.
    Response.ContentType = "text/uri-list";
    Response.Write(uri.AbsoluteUri);
    Context.ApplicationInstance.CompleteRequest();
}

private void SeeOther(string relUri)
{
    SeeOther(new Uri(Request.Url, relUri));
}
I believe this was a feature change in Internet Explorer 7, whereby they changed the expected 200 response to a 302 telling IE to be redirected. There is no smooth solution to this problem that I know of. A similar question was posed a while back here:
Change in behavior with Internet Explorer 7 and later in regard to CONNECT requests
