Request Web Page in C# spoofing the Host

I need to create a request for a web page delivered by our web sites, but I need to be able to set the Host header too. I have tried this using HttpWebRequest, but the header collection is read-only (or at least the Host part of it is). I need to do this because we want to perform the initial request for a page before a user can. We have 10 web servers which are load balanced, so we need to request the file from each of the web servers.
I have tried the following:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://192.168.1.5/filename.htm");
request.Headers.Set("Host", "www.mywebsite.com");
WebResponse response = request.GetResponse();
Obviously this does not work, as I can't update the header, and I don't know if this is indeed the right way to do it.

Although this is a very late answer, maybe someone can benefit from it:
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(new Uri("http://192.168.1.1"));
request.Headers.GetType().InvokeMember(
    "ChangeInternal",
    BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.InvokeMethod,
    null, request.Headers, new object[] { "Host", "www.mysite.com" });
Reflection is your friend :)

I have managed to find a more long-winded route using sockets. I found the answer on the MSDN page for IPEndPoint:
string getString = "GET /path/mypage.htm HTTP/1.1\r\nHost: www.mysite.mobi\r\nConnection: Close\r\n\r\n";
Encoding ASCII = Encoding.ASCII;
Byte[] byteGetString = ASCII.GetBytes(getString);
Byte[] receiveByte = new Byte[256];
Socket socket = null;
String strPage = null;
try
{
    // connect straight to the server's IP; the Host header in the raw request
    // tells the server which site we want
    IPEndPoint ip = new IPEndPoint(IPAddress.Parse("10.23.1.93"), 80);
    socket = new Socket(ip.AddressFamily, SocketType.Stream, ProtocolType.Tcp);
    socket.Connect(ip);
}
catch (SocketException ex)
{
    Console.WriteLine("Source:" + ex.Source);
    Console.WriteLine("Message:" + ex.Message);
    return; // without a connection there is nothing more to do
}
socket.Send(byteGetString, byteGetString.Length, 0);
// read the raw HTTP response (headers + body) until the server closes the connection
Int32 bytes = socket.Receive(receiveByte, receiveByte.Length, 0);
strPage = strPage + ASCII.GetString(receiveByte, 0, bytes);
while (bytes > 0)
{
    bytes = socket.Receive(receiveByte, receiveByte.Length, 0);
    strPage = strPage + ASCII.GetString(receiveByte, 0, bytes);
}
socket.Close();

I had a problem where the DNS name in the URL I used resolved to several different IP addresses. I wanted to call each address separately while keeping the same DNS name in the Host header - the solution is to use a proxy:
string retVal = "";
// Can't change the 'Host' header property because .NET protects it
// HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// request.Headers.Set(HttpRequestHeader.Host, DEPLOYER_HOST);
// so we must use a workaround
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Proxy = new WebProxy(ip);
using (WebResponse response = request.GetResponse())
{
    using (TextReader reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
            retVal += line;
    }
}
return retVal;
The Host header is set from 'url' automatically by .NET, and 'ip' contains the actual address of the web server you want to contact (you can use a DNS name here too).
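If the goal is the original scenario - priming the page on every server behind the load balancer - the proxy trick can be wrapped in a small helper and called once per server. This is only a sketch; FetchViaServer and the IP list are made-up names for illustration:
// using System.Net; using System.IO;
static string FetchViaServer(string url, string serverIp)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    // route the request to one specific server; .NET still sends the Host taken from 'url'
    request.Proxy = new WebProxy("http://" + serverIp + ":80");
    using (WebResponse response = request.GetResponse())
    using (TextReader reader = new StreamReader(response.GetResponseStream()))
        return reader.ReadToEnd();
}
// prime the page on each load-balanced server
foreach (string serverIp in new[] { "192.168.1.5", "192.168.1.6" })
    FetchViaServer("http://www.mywebsite.com/filename.htm", serverIp);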

I know this is old, but I came across this exact same problem, and I found a better solution than using sockets or reflection...
What I did was create a new class that derives from WebHeaderCollection and bypasses validation of what you stick inside it:
public class MyHeaderCollection : WebHeaderCollection
{
    public new void Set(string name, string value)
    {
        AddWithoutValidate(name, value);
    }
    //or
    public new string this[string name]
    {
        get { return base[name]; }
        set { AddWithoutValidate(name, value); }
    }
}
and here is how you use it:
var http = WebRequest.Create("http://example.com/");
var headers = new MyHeaderCollection();
http.Headers = headers;
//Now you can add/override anything you like without validation:
headers.Set("Host", http.RequestUri.Host);
//or
headers["Host"] = http.RequestUri.Host;
Hope this helps anyone looking for this!

I know this is an old question, but these days you can do:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://192.168.1.5/filename.htm");
request.Host = "www.mywebsite.com";
WebResponse response = request.GetResponse();
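For the original question's setup (warming the page on all ten load-balanced servers), a minimal sketch on .NET 4.0+; the IP list is a placeholder:
// using System.Net;
string[] serverIps = { "192.168.1.5", "192.168.1.6" }; // one entry per web server
foreach (string ip in serverIps)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://" + ip + "/filename.htm");
    request.Host = "www.mywebsite.com"; // settable from .NET 4.0 onwards
    using (WebResponse response = request.GetResponse())
    {
        // the body is discarded; the point is just to prime the page on this server
    }
}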

The "Host" header is protected and cannot be modified programmatically. I suppose to work around this, you could try and bind via reflection to the private "InnerCollection" property of the WebRequest object and calling the "Set" ar "Add" method on it to modify the Host header. I haven't tried this, but from a quick look at the source code in Reflector, I think it's easily accomplished. But yeah, binding to private properties of framework objects is not the best way to accomplish things. :) Use only if you MUST.
edit: Or, like the other guy mentions in the linked question, just open up a socket and do a quick GET manually. Should be a no-brainer, if you don't need to tinker with other stuff like cookies or whatever other niceties HttpWebRequest provides.

Alright, a little bit of research turns up this:
https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=384456
Seems MS may do something about this at some point.

You can use my solution for this problem; it is posted here:
How to set custom "Host" header in HttpWebRequest?
This can help you edit the Host header and avoid using a proxy or raw socket requests.

Necromancing.
For those still on .NET 2.0
It is in fact quite easy, if you know how.
The problem is, you can't set the Host header, because the framework won't let you change the value at runtime (.NET Framework 4.0+ will let you override Host on an HttpWebRequest).
The next attempt will be to set the header with reflection - as demonstrated in the top-upvoted answer here - to get around it, which will let you change the header value. But at runtime, the framework overwrites this value with the host part of the URL, which means reflection buys you nothing, which is why I don't understand why people keep upvoting that answer.
If the DNS name doesn't exist - which is, quite frankly, the only case in which you want to do this in the first place - you can't set it, because .NET can't resolve it, and you can't override the .NET DNS resolver.
But what you can do is set a web proxy with the exact same IP as the destination server.
So, if your server IP is 28.14.88.71:
public class myweb : System.Net.WebClient
{
    protected override System.Net.WebRequest GetWebRequest(System.Uri address)
    {
        System.Net.WebRequest request = (System.Net.WebRequest)base.GetWebRequest(address);
        //string host = "redmine.nonexistantdomain.com";
        //request.Headers.GetType().InvokeMember("ChangeInternal",
        //    System.Reflection.BindingFlags.NonPublic |
        //    System.Reflection.BindingFlags.Instance |
        //    System.Reflection.BindingFlags.InvokeMethod, null,
        //    request.Headers, new object[] { "Host", host }
        //);

        // server IP and port
        request.Proxy = new System.Net.WebProxy("http://28.14.88.71:80");

        // .NET 4.0 only
        System.Net.HttpWebRequest foo = (System.Net.HttpWebRequest)request;
        //foo.Host = host;

        // The below reflection-based operation is not necessary,
        // if the server speaks HTTP 1.1 correctly
        // and the firewall doesn't interfere
        // https://yoursunny.com/t/2009/HttpWebRequest-IP/
        System.Reflection.FieldInfo horribleProxyServicePoint = typeof(System.Net.ServicePoint)
            .GetField("m_ProxyServicePoint", System.Reflection.BindingFlags.NonPublic |
                System.Reflection.BindingFlags.Instance);
        horribleProxyServicePoint.SetValue(foo.ServicePoint, false);

        return foo; // or return request; if you don't need this
    }
}
and voila, now
myweb wc = new myweb();
string str = wc.DownloadString("http://redmine.nonexistantdomain.com");
and you get the correct page back, if 28.14.88.71 is a web server with name-based virtual hosting (based on the HTTP Host header).
Now you have the correct answer to the original question, for both WebRequest and WebClient. I think using custom sockets to do this would be the wrong approach, particularly when SSL should be used, and when an actual solution is that simple...

Related

Recover from a NameResolutionFailure error during a HTTPWebRequest

In our Outlook COM add-in, we're making an API call to our server using the .NET HttpWebRequest class. One of our customers is running into a System.Net.WebException with the message "The remote name could not be resolved: 'mydomain.com'" and WebExceptionStatus.NameResolutionFailure as the status. All the users from this particular customer's company are using Outlook/the add-in from behind a VPN, so we are piggy-backing on the proxy configuration from IE in order to make the request.
Our API calls work for a period of time, but then they randomly disconnect, and future requests do not go through either. Once the users close and restart Outlook, though, it seems to work just fine again without changing any network configuration or reconnecting to wifi, etc.
Other posts like this suggested retrying with a sleep in between. We have added a retry mechanism with 3 attempts, each with a sleep in between, but that has not resolved the intermittent issue.
Our domain is hooked up to an AWS Classic Load Balancer, so mydomain.com actually resolves via a CNAME record to an AWS static domain pointing to the ELB. I'm not sure if that would have any impact on the request or routing.
The strange part is we also have a web browser component that loads a web page in a sidebar from the exact same domain as the API calls. It works perfectly and loads a URL from the same domain. The users can also load the URL in their browsers without any issues. It just appears that the HttpWebRequest is running into the domain resolution issue. We've checked that it's not just a matter of a weak wifi signal. Since they are able to use IE, which has the same proxy config, to access the site just fine, I don't think it's that.
We're at a loss for how to gracefully recover and have the request try again. I've looked into suggestions from this answer and this other answer; we'll be trying those next. Unfortunately, we are not able to make the requests use direct IP addresses as some of the other answers suggest. That also eliminates the ability to edit the hosts file to point straight to it. The reason is we can't assign a static IP on a Classic ELB.
We're considering trying to set the host to use the CNAME record from AWS directly, but this is going to cause SSL errors as it doesn't have a valid cert for the CNAME entry. Is there a way to get around that by masking it via a header, similar to the IP approach?
Feel free to ask for more information, I will do my best to provide it.
Any suggestions on what to try / troubleshoot are welcome!
Update: We’re targeting .NET v4.5
Here's the code
var result = string.Empty;
bool retrying = false;
int retries = 0;
HttpWebRequest webRequest = null;
try
{
    ServicePointManager.ServerCertificateValidationCallback = CertificateCheck;
    ServicePointManager.MaxServicePoints = 4;
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
retry:
    webRequest = (HttpWebRequest)WebRequest.Create(uriParam);
    webRequest.Timeout = 120000;
    webRequest.Method = "POST";
    webRequest.ContentType = "application/x-www-form-urlencoded";
    webRequest.Accept = acceptParam;
    webRequest.Headers.Add("Cookie", cookieParam);
    webRequest.UseDefaultCredentials = true;
    webRequest.Proxy = null;
    webRequest.KeepAlive = true; //default
    webRequest.ServicePoint.ConnectionLeaseTimeout = webRequest.Timeout;
    webRequest.ServicePoint.MaxIdleTime = webRequest.Timeout;
    webRequest.ContentLength = dataParam.Length;
    using (var reqStream = webRequest.GetRequestStream())
    {
        reqStream.Write(dataParam, 0, dataParam.Length);
        reqStream.Flush();
        reqStream.Close();
    }
    try
    {
        using (WebResponse webResponse = webRequest.GetResponse())
        {
            using (var responseStream = webResponse.GetResponseStream())
            {
                if (responseStream != null)
                {
                    using (var reader = new StreamReader(responseStream))
                    {
                        result = reader.ReadToEnd();
                    }
                }
            }
            webResponse.Close();
        }
    }
    catch (WebException)
    {
        if (retrying && retries == 3)
        {
            //don't retry any more
            return string.Empty;
        }
        retrying = true;
        retries++;
        webRequest.Abort();
        System.Threading.Thread.Sleep(2000);
        goto retry;
    }
}
catch (Exception ex)
{
    Log.Error(ex);
    result = string.Empty;
}
finally
{
    webRequest?.Abort();
}
return result;

How do I navigate to a website using a proxy?

I've looked through Google, and I saw things like WebRequest, WebProxy, etc. There were a lot of pages, but I don't get it. So let's say I have a TextBox with the URL in it, and another TextBox with the proxy in it. How would I make it so that the proxy is used when requesting the URL?
One option would be to create a request using the HttpWebRequest object detailed here:
http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.aspx
One of the properties of the HttpWebRequest object is 'Proxy':
http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.proxy.aspx
A good implementation example can be found here:
problem using proxy with HttpWebRequest in C#
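For completeness, here is a minimal sketch of the plain HttpWebRequest route, assuming urlTextBox and proxyTextBox are the two TextBoxes from the question (those names are made up) and the proxy box contains something like "http://1.2.3.4:8080":
// using System.Net; using System.IO;
var request = (HttpWebRequest)WebRequest.Create(urlTextBox.Text);
request.Proxy = new WebProxy(proxyTextBox.Text); // e.g. "http://1.2.3.4:8080"
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
    // display 'html' however you like, e.g. webBrowser1.DocumentText = html;
}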
You could use RestSharp's RestClient (https://www.nuget.org/packages/RestSharp) to pull data, and then render it in a WebBrowser object:
try {
    var response = new RestClient {
        BaseUrl = "https://theproxisright.com/",
        Proxy = new WebProxy("1.2.3.4", 8080),
        Timeout = 10000
    }.Execute(new RestRequest {
        Resource = "api/Proxy/Get?apiKey=ENTER_FREE_OR_UNLIMITED_API_KEY_HERE",
        Method = Method.GET,
    });
    if (response.ErrorException != null) {
        throw response.ErrorException;
    } else {
        Console.WriteLine(response.Content);
        var wb = new WebBrowser { Width = 200 };
        webBrowserStack.Children.Add(wb);
        wb.NavigateToString(response.Content);
    }
} catch (Exception ex) {
    Console.Error.WriteLine(ex.Message);
}

Get web page contents from Firefox in a C# program

I need to write a simple C# app that should receive the entire contents of a web page currently opened in Firefox. Is there any way to do it directly from C#? If not, is it possible to develop some kind of plug-in that would transfer the page contents? As I am a total newbie in Firefox plug-in programming, I'd really appreciate any info on getting me started quickly. Maybe there are some sources I can use as a reference? Doc links? Recommendations?
UPD: I actually need to communicate with a Firefox instance, not get contents of a web page from a given URL
It would help if you elaborate on what you are trying to achieve. Maybe plugins already out there, such as Firebug, can help.
Anyway, if you really want to develop both the plugin and the C# application:
Check out this tutorial on firefox extension:
http://robertnyman.com/2009/01/24/how-to-develop-a-firefox-extension/
Otherwise, you can use the WebRequest or HttpWebRequest class in .NET to get the HTML source of any URL.
I think you'd almost certainly need to write a Firefox plugin for that. However, there are certainly ways to request a web page and receive its HTML response within C#. It depends on what your requirements are.
If your requirement is simply to receive the source from any website, leave a comment and I'll point you towards the code.
// Note: _UserAgent, _RequestTimeout, _CookieContainer and doc are fields of the
// surrounding class in the original sample; they are not defined in this snippet.
Uri uri = new Uri(url);
System.Net.HttpWebRequest req = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(uri.AbsoluteUri);
req.AllowAutoRedirect = true;
req.MaximumAutomaticRedirections = 3;
//req.UserAgent = _UserAgent; //"Mozilla/6.0 (MSIE 6.0; Windows NT 5.1; Searcharoo.NET)";
req.KeepAlive = true;
req.Timeout = _RequestTimeout * 1000; //prefRequestTimeout
// SIMONJONES http://codeproject.com/aspnet/spideroo.asp?msg=1421158#xx1421158xx
req.CookieContainer = new System.Net.CookieContainer();
req.CookieContainer.Add(_CookieContainer.GetCookies(uri));
System.Net.HttpWebResponse webresponse = null;
try
{
    webresponse = (System.Net.HttpWebResponse)req.GetResponse();
}
catch (Exception ex)
{
    webresponse = null;
    Console.Write("request for url failed: {0} {1}", url, ex.Message);
}
if (webresponse != null)
{
    webresponse.Cookies = req.CookieContainer.GetCookies(req.RequestUri);
    // handle cookies (need to do this in case we have any session cookies)
    foreach (System.Net.Cookie retCookie in webresponse.Cookies)
    {
        bool cookieFound = false;
        foreach (System.Net.Cookie oldCookie in _CookieContainer.GetCookies(uri))
        {
            if (retCookie.Name.Equals(oldCookie.Name))
            {
                oldCookie.Value = retCookie.Value;
                cookieFound = true;
            }
        }
        if (!cookieFound)
        {
            _CookieContainer.Add(retCookie);
        }
    }
    string enc = "utf-8"; // default
    if (webresponse.ContentEncoding != String.Empty)
    {
        // Use the HttpHeader Content-Type in preference to the one set in META
        doc.Encoding = webresponse.ContentEncoding;
    }
    else if (doc.Encoding == String.Empty)
    {
        doc.Encoding = enc; // default
    }
    //http://www.c-sharpcorner.com/Code/2003/Dec/ReadingWebPageSources.asp
    System.IO.StreamReader stream = new System.IO.StreamReader(
        webresponse.GetResponseStream(), System.Text.Encoding.GetEncoding(doc.Encoding));
    // read the page content before closing the response
    string html = stream.ReadToEnd();
    webresponse.Close();
}
This does what you want.
using System.Net;
var cli = new WebClient();
string data = cli.DownloadString("http://www.heise.de");
Console.WriteLine(data);
Native messaging enables an extension to exchange messages with a native application installed on the user's computer.
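On the C# side of native messaging, the protocol itself is simple: each message is UTF-8 JSON preceded by a 32-bit length prefix, exchanged over stdin/stdout. A rough sketch of the host end reading one message (error handling omitted; assumes the little-endian byte order used on Windows):
// using System; using System.IO; using System.Text;
static string ReadNativeMessage()
{
    Stream stdin = Console.OpenStandardInput();
    byte[] lengthBytes = new byte[4];
    if (stdin.Read(lengthBytes, 0, 4) < 4)
        return null; // the extension closed the pipe
    int length = BitConverter.ToInt32(lengthBytes, 0);
    byte[] buffer = new byte[length];
    int read = 0;
    while (read < length)
        read += stdin.Read(buffer, read, length - read);
    return Encoding.UTF8.GetString(buffer); // the JSON payload sent by the extension
}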

Test if a URI is up

I'm trying to make a simple app that will "ping" a URI and tell me if it's responding or not.
I have the following code, but it only seems to check domains at the root level,
i.e. www.google.com and not www.google.com/voice.
private bool WebsiteUp(string path)
{
    bool status = false;
    try
    {
        Uri uri = new Uri(path);
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        WebResponse response;
        response = request.GetResponse();
        if (response.Headers != null)
        {
            status = true;
        }
    }
    catch (Exception loi)
    {
        return false;
    }
    return status;
}
Is there any existing code out there that would better fit this need?
Edit: Actually, I tell a lie - by default 404 should cause a web exception anyway, and I've just confirmed this in case I was misremembering. While the code given in the example is leaky, it should still work. Puzzling, but I'll leave this answer here for the better safety with the response object.
The problem with the code you have, is that while it is indeed checking the precise URI given, it considers 404, 500, 200 etc. as equally "successful". It also is a bit wasteful in using GET to do a job HEAD suffices for. It should really clean up that WebResponse too. And the term path is a silly parameter name for something that isn't just a path, while we're at it.
private bool WebsiteUp(string uri)
{
    try
    {
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        request.Method = "HEAD";
        using (WebResponse response = request.GetResponse())
        {
            HttpWebResponse hRes = response as HttpWebResponse;
            if (hRes == null)
                throw new ArgumentException("Not an HTTP or HTTPS request"); // you may want to have this specifically handle e.g. FTP, but I'm just throwing an exception for now.
            return (int)hRes.StatusCode / 100 == 2;
        }
    }
    catch (WebException)
    {
        return false;
    }
}
Of course there are poor websites out there that return a 200 all the time and so on, but this is the best one can do. It assumes that in the case of a redirect you care about the ultimate target of the redirect (do you finally end up on a successful page or an error page), but if you care about the specific URI you could turn off automatic redirect following, and consider 3xx codes successful too.
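If you do care about the specific URI itself rather than where a redirect ends up, a small variation of the method body above (just a sketch):
// using System.Net;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "HEAD";
request.Timeout = 3000;
request.AllowAutoRedirect = false; // judge the original URI's own response
using (var response = (HttpWebResponse)request.GetResponse())
{
    int code = (int)response.StatusCode;
    return code / 100 == 2 || code / 100 == 3; // count 2xx and 3xx as "up"
}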
There is a Ping class you can utilize for that, more details can be found here:
http://msdn.microsoft.com/en-us/library/system.net.networkinformation.ping.aspx
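Note that Ping only tells you the host is reachable (ICMP), not that a particular path serves a page. A minimal sketch:
// using System.Net.NetworkInformation;
using (var ping = new Ping())
{
    PingReply reply = ping.Send("www.google.com", 3000); // host only, no path
    bool hostUp = reply.Status == IPStatus.Success;
}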
I did something similar when I wrote a torrent client to check valid tracker URLs. I'm pretty sure I found the answer on SO but can't seem to find it anymore; here's the code sample I have from that post.
// "HeadOnlyWebClient" is a custom WebClient subclass (sketched below);
// the stock System.Net.WebClient has no HeadOnly property.
using (var client = new HeadOnlyWebClient())
{
    client.HeadOnly = true;
    // exists
    string Address1 = client.DownloadString("http://google.com");
    // doesn't exist - throws a WebException with a 404 error
    string Address2 = client.DownloadString("http://google.com/sdfsddsf");
}
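The subclass could look something like this (a sketch; the class name is made up):
// using System; using System.Net;
public class HeadOnlyWebClient : WebClient
{
    public bool HeadOnly { get; set; }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (HeadOnly && request.Method == "GET")
            request.Method = "HEAD"; // ask for headers only, skip the body
        return request;
    }
}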

C# Check url exist?

How can I check whether a page exists at a given URL?
I have this code:
private void check(string path)
{
    try
    {
        Uri uri = new Uri(path);
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        WebResponse response;
        response = request.GetResponse();
    }
    catch (Exception loi) { MessageBox.Show(loi.Message); }
}
But that gives an error message about the proxy. :(
First, you need to understand that your question is at least twofold:
first, check whether the server is responsive at all, using ping for example; while doing this, consider the timeout - after what timeout will you consider a page as not existing?
Second, try retrieving the page using one of the many methods available (Google has plenty of examples); again, you need to consider the timeout - if the server takes long to reply, the page might still "be there" but the server is just under a lot of pressure.
If the proxy needs to authenticate you with your Windows credentials (e.g. you are in a corporate network) use:
WebRequest request = WebRequest.Create(url);
request.UseDefaultCredentials = true;
request.Proxy.Credentials = request.Credentials;
try
{
    Uri uri = new Uri(path);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Timeout = 3000;
    HttpWebResponse response;
    response = (HttpWebResponse)request.GetResponse();
    if (response.StatusCode == HttpStatusCode.OK)
    {
        // great - something is there
    }
}
catch (Exception loi)
{
    MessageBox.Show(loi.Message);
}
You can check the content type and length; see HttpWebResponse on MSDN.
At a guess, without knowing the specific error message or path, you could try casting the WebRequest to a HttpWebRequest and then setting the WebProxy.
See MSDN: HttpWebRequest - Proxy Property
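A minimal sketch of that (the proxy address here is a placeholder):
// using System.Net;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(path);
request.Proxy = new WebProxy("http://your-proxy-host:8080"); // placeholder address
// or, if the default proxy just needs your Windows credentials:
// request.Proxy.Credentials = CredentialCache.DefaultCredentials;
using (var response = (HttpWebResponse)request.GetResponse())
{
    // reaching here without an exception means the page responded
}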
