FiddlerCore behavior when network is disconnected - c#

I'm handling local requests by using FiddlerCore like this:
private static void FiddlerApplication_BeforeRequest(Session session)
{
    if (session.hostname.ToLower() == "localhost")
        ProcessRequest(session);
}
Everything works well, but when the network connection is down, I get the following message:
"[Fiddler] DNS Lookup for "www.google.com" failed. The system reports that no network connection is available. No such host is known"
My question is:
How should I configure FiddlerCore so that when the network is down, I receive the regular 404 page?

You're noting that when a proxy is present, the browser doesn't show its default error pages. That is a correct observation, and it's not unique to Fiddler.
You're confusing folks by talking about a "regular 404 response." That's confusing because the page you're describing has nothing to do with an HTTP 404: it's the browser's DNS Lookup Failure or Server Unreachable error page, which it shows when a DNS lookup fails or a TCP/IP connection attempt fails. Neither of those is a 404, which is an error code a server can return only after the DNS lookup succeeds and the TCP/IP connection is established.
To your question of how to make a request through a proxy result in the same error page that would be shown if the proxy weren't present: in general, you can't. You could do something goofy like copying the HTML out of the browser's error page and having the proxy return that, but because each browser (and version) may use a different error page, the ruse would be easily detectable.
One thing you could try is making a given site bypass the proxy, so the proxy is used only for the hosts you care about. To do that, create a Proxy Auto-Configuration (PAC) file and have its FindProxyForURL function return DIRECT for everything except the site(s) you want routed through the proxy; see the Configuration Script section of this post. A sketch follows.
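Here's a minimal PAC sketch; the proxy address 127.0.0.1:8877 is an assumption, so substitute whatever port your FiddlerCore instance actually listens on:

function FindProxyForURL(url, host) {
    // Only the host we care about goes through the local proxy; every
    // other site connects DIRECT, so the browser keeps its normal
    // error pages for those sites when the network is down.
    if (host == "localhost")
        return "PROXY 127.0.0.1:8877";
    return "DIRECT";
}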
But, stepping back a bit, do you even need a proxy at all? Can't you just run your web server on localhost? When you start up Fiddler/FiddlerCore, it listens on localhost:[port]. Just direct your HTTP requests there without setting Fiddler as the system proxy.
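For instance, here's a minimal sketch of starting FiddlerCore as a plain local listener rather than a system proxy; the port is arbitrary, and the exact startup flags available depend on your FiddlerCore version:

using System;
using Fiddler;

class Program
{
    static void Main()
    {
        FiddlerApplication.BeforeRequest += session =>
        {
            if (session.hostname.ToLower() == "localhost")
                Console.WriteLine("Handling: " + session.fullUrl); // stand-in for ProcessRequest
        };

        // Listen on localhost:8877 WITHOUT registering as the system proxy,
        // so normal browser traffic (and its error pages) is untouched.
        FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.None);

        Console.WriteLine("Listening on port 8877; press Enter to stop.");
        Console.ReadLine();
        FiddlerApplication.Shutdown();
    }
}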

Related

C# MVC: get exactly the same URL string as shown in the browser, from the server side

I know there are many easy ways to get the URL, but I'm hitting unexpected issues when the site is reached via a DNS name that maps to a non-default port.
There is a web app running on port 7000 which is accessible by server name or by DNS alias:
http://MyServer:7000
http://MyApp.intranet.net (maps to MyServer on port 7000)
This is how users see the addresses in the browser.
So: how do I get exactly the same URL the user sees in the browser?
HttpContext.Current.Request.Url.AbsoluteUri returns, respectively:
    http://MyServer:7000
    http://MyApp.intranet.net:7000 (the port was added; this address doesn't exist!)
HttpContext.Current.Request.Url.OriginalString:
    http://MyServer:7000
    http://MyApp.intranet.net:7000 (same as above)
HttpContext.Current.Request.Url.Authority:
    MyServer:7000
    MyApp.intranet.net:7000 (same as above)
HttpContext.Current.Request.Url.Host:
    MyServer (no port information, so this address returns an error)
    MyApp.intranet.net (that's fine)
I didn't find any property that returns both addresses exactly as the browser user sees them, and I don't want to hardcode anything or send the value from the client side.
Thanks for any advice.
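No answer is attached to this question here, but one commonly suggested approach (an assumption on my part, not from the thread) is to rebuild the URL from the Host request header, since the browser sends exactly the host it used, plus the port only when it's non-default:

// Sketch: reconstruct the URL as the user sees it, from the Host header.
// Headers["Host"] is "MyApp.intranet.net" via the alias, and
// "MyServer:7000" when the server name is used directly.
var request = System.Web.HttpContext.Current.Request;
string url = request.Url.Scheme + "://"
           + request.Headers["Host"]
           + request.Url.PathAndQuery;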

How to catch 404 errors and 301 them when appropriate?

We had a little mishap where an https binding was created for a website without a hostname (just an IP), and then another website was created with only an http binding to a hostname using the same IP as the first site.
The problem is that when you navigate to the 2nd site over https, instead of getting an error it just goes to the first website. As a result, Google was able to access the first site through the 2nd site's host name over https, and now we have lots of duplicate links out in Google land.
I've already stopped the bleeding, but now I need to 301 all the bad links that Google created for the 2nd site. My plan is that, going forward, any time a 404 error is encountered on the 2nd site, it will request just the headers for the same link on the 1st site. If the headers come back with an OK status, it will do a permanent redirect to the 1st site.
There's just one part of that plan I don't know how to do off the top of my head: what's the best way to intercept the 404s so I can run my code to decide whether the request should be 301'd or not?
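No answer is included here, but a minimal sketch of one way to do it (my assumption, presuming ASP.NET on .NET 4+; the first site's host name is a hypothetical placeholder) is to hook Application_EndRequest in the 2nd site's Global.asax:

using System;
using System.Net;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_EndRequest(object sender, EventArgs e)
    {
        if (Response.StatusCode != 404)
            return;

        // "www.first-site.example" is a hypothetical placeholder.
        string candidate = "https://www.first-site.example" + Request.Url.PathAndQuery;

        var head = (HttpWebRequest)WebRequest.Create(candidate);
        head.Method = "HEAD"; // fetch only the headers, per the plan above

        try
        {
            using (var resp = (HttpWebResponse)head.GetResponse())
            {
                if (resp.StatusCode == HttpStatusCode.OK)
                {
                    Response.Clear();
                    Response.RedirectPermanent(candidate, false); // 301
                }
            }
        }
        catch (WebException)
        {
            // The 1st site doesn't have the page either; keep the 404.
        }
    }
}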

504 error when requesting file from url in codebehind

I'm trying to read a file from a URL, but I keep getting a 504 Gateway Timeout.
The user submits a form, and I need to grab some information from a rather large XML file (45 MB) using an XmlTextReader. Each time the request is made, it comes back with a 504 Gateway Timeout on one server, yet it works fine on another server. The 504 error is thrown after about 20 seconds, while on the server where it does work, the file is read much faster than that.
XmlTextReader reader = new XmlTextReader(localUrl);
The strange thing is that IIS isn't even logging the request. I've gone through the logs: I can find the entry on the system that works, but on the system that doesn't, there is no request in the IIS logs at all, which makes it look like the request never reaches IIS.
It turned out the user the AppPool runs under had its proxy settings configured incorrectly, so it was unable to make the call it needed to make.
Once I corrected the proxy settings for that user, it started to work.
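A related sketch (my addition, not from the answer): instead of depending on the AppPool user's system proxy settings, you can fetch the XML with an HttpWebRequest whose proxy is controlled explicitly in code, and hand the response stream to the XmlTextReader:

using System.Net;
using System.Xml;

static void ReadRemoteXml(string localUrl)
{
    var request = (HttpWebRequest)WebRequest.Create(localUrl);
    request.Proxy = null;     // bypass any (mis)configured default proxy
    request.Timeout = 300000; // allow extra time for the 45 MB file

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new XmlTextReader(response.GetResponseStream()))
    {
        while (reader.Read())
        {
            // ... pull out the fields you need ...
        }
    }
}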

webserver forcing browser to only open one parallel connection

Hello, I'm trying to write a webserver in C#.
The server will dynamically create a website based on some templates I defined.
The catch is that the webpage may only be accessed after entering a password.
So I decided to have the browser open a keep-alive connection and pass every request through it.
That way I can tell logged-in clients from not-logged-in clients. The problem is that when Firefox and Google Chrome request the images on the website, they just open another connection from the same IP but a different port.
My webserver thinks it's another client and sends the login page instead of the requested image.
So every time the website loads, only 1-4 images actually get sent.
Now my question: is there any way to force the browser NOT to open parallel connections?
Or, if that's not possible, how should I deal with the problem?
For those who like to see some code, here is what the core of the server looks like, just to illustrate the problem:
// Fields assumed by this snippet, declared here for completeness:
TcpListener listener;
List<Thread> thtt = new List<Thread>();

void ThreadStart()
{
    while (true)
    {
        RunClient(listener.AcceptTcpClient());
    }
}

void RunClient(TcpClient c)
{
    Thread tht = new Thread(new ParameterizedThreadStart(RunIt));
    tht.IsBackground = true;
    tht.Start(c); // RunIt answers the request; this is where the login page gets sent
    thtt.Add(tht);
}
Thanks in advance, Alex
Authenticating an HTTP connection rather than individual requests is wrong, wrong, wrong. Even if you could make the browser reuse a single connection (which you can't, because that's not how HTTP works), you couldn't count on proxies or transparent web caches respecting it.
This is (some of) what cookies were invented for. Use them, or some kind of session identifier built into the URLs.
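As a minimal sketch of the cookie approach (the names and the naive Cookie-header parsing are illustrative, not part of the original answer):

using System;
using System.Collections.Generic;

class SessionDemo
{
    // Session ids of clients that have presented the correct password.
    static readonly HashSet<string> LoggedIn = new HashSet<string>();

    static string CreateSession()
    {
        string id = Guid.NewGuid().ToString("N");
        LoggedIn.Add(id);
        // Send this header with the login response; the browser returns
        // the cookie on every request, no matter which TCP connection
        // (or source port) that request arrives on.
        Console.WriteLine("Set-Cookie: session=" + id + "; HttpOnly; Path=/");
        return id;
    }

    static bool IsLoggedIn(string cookieHeader)
    {
        if (cookieHeader == null) return false;
        foreach (string part in cookieHeader.Split(';'))
        {
            string[] kv = part.Trim().Split(new[] { '=' }, 2);
            if (kv.Length == 2 && kv[0] == "session" && LoggedIn.Contains(kv[1]))
                return true;
        }
        return false;
    }

    static void Main()
    {
        string id = CreateSession();
        Console.WriteLine(IsLoggedIn("session=" + id)); // True
        Console.WriteLine(IsLoggedIn("session=bogus")); // False
    }
}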

How to simulate a Host file for the time of one request

I need to access multiple instances of a web service simultaneously at the following URL. The web service is hosted in IIS and has SSL enabled.
https://services.mysite.com/data/data.asmx
Usually, when we do this process manually, we go one by one and update the Windows hosts file (c:\Windows\System32\drivers\etc\hosts) like this:
192.1.1.100 services.mysite.com
I would like to automate the process and do it with some multithreading, so I cannot keep changing the hosts file. Is there a way to simulate a hosts file when making an HTTP request in C#?
Thanks!
If you know the IP address of the server's SSL endpoint (which isn't necessarily the same as the server's default IP address), then you could just aim your web-service request at that. Obviously the SSL certificate check will fail, but you can disable that through code...
ServicePointManager.ServerCertificateValidationCallback += delegate
{
    return true; // you might want to check some of the certificate details...
};
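A usage sketch of that suggestion (assuming .NET 4+, where HttpWebRequest.Host is settable; the IP and URL are taken from the question):

var request = (HttpWebRequest)WebRequest.Create("https://192.1.1.100/data/data.asmx");
request.Host = "services.mysite.com"; // so IIS routes the request to the right site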
I think you get the same effect by setting the proxy server of that specific request to the IP address of the actual Web server you want to send the request to.
You can change the URL that your request is hitting at runtime, something like this:
svc.Url = "http://firstServer.com";
So if you create a program that loops through each of your desired servers, just update the URL property directly (that example is taken from WSE 3 based web services).
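For example, a sketch of such a loop (svc and GetData are hypothetical placeholders for your generated proxy class and its web method):

string[] servers = { "https://192.1.1.100", "https://192.1.1.101" };
foreach (string server in servers)
{
    svc.Url = server + "/data/data.asmx"; // retarget the proxy per server
    var result = svc.GetData();           // hypothetical web method
}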
