I'm trying to make a simple app that will "ping" a URI and tell me whether it's responding or not.
I have the following code, but it only seems to check domains at the root level,
i.e. www.google.com and not www.google.com/voice.
private bool WebsiteUp(string path)
{
    bool status = false;
    try
    {
        Uri uri = new Uri(path);
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        WebResponse response;
        response = request.GetResponse();
        if (response.Headers != null)
        {
            status = true;
        }
    }
    catch (Exception loi)
    {
        return false;
    }
    return status;
}
Is there any existing code out there that would better fit this need?
Edit: Actually, I tell a lie - by default a 404 will cause a WebException anyway, and I've just confirmed this in case I was misremembering. So while the code given in the question is leaky, it should still work. Puzzling, but I'll leave this answer here since it handles the response object more safely.
The problem with the code you have is that while it does check the precise URI given, it treats 404, 500, 200, etc. as equally "successful". It's also a bit wasteful, using GET to do a job that HEAD suffices for, and it should really clean up that WebResponse. And while we're at it, path is a misleading parameter name for something that isn't just a path.
private bool WebsiteUp(string uri)
{
    try
    {
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        request.Method = "HEAD";
        using (WebResponse response = request.GetResponse())
        {
            HttpWebResponse hRes = response as HttpWebResponse;
            if (hRes == null)
                throw new ArgumentException("Not an HTTP or HTTPS request"); // you may want to handle e.g. FTP specifically, but I'm just throwing an exception for now.
            // Treat any 2xx status as "up"; cast the enum to int before the arithmetic.
            return (int)hRes.StatusCode / 100 == 2;
        }
    }
    catch (WebException)
    {
        return false;
    }
}
Of course there are poorly behaved websites out there that return a 200 all the time, and so on, but this is the best one can do. It assumes that in the case of a redirect you care about the ultimate target of the redirect (do you finally end up on a successful page or an error page); if you care about the specific URI instead, you can turn off automatic redirect following and consider 3xx codes successful too, as sketched below.
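For instance, a variant of the method above that stays on the exact URI and also counts redirects as success might look roughly like this (WebsiteUpNoRedirect is just an illustrative name):
private bool WebsiteUpNoRedirect(string uri)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        request.Timeout = 3000;
        request.Method = "HEAD";
        request.AllowAutoRedirect = false; // stay on the exact URI instead of following redirects
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            int code = (int)response.StatusCode;
            return code / 100 == 2 || code / 100 == 3; // 2xx or 3xx counts as "up"
        }
    }
    catch (WebException)
    {
        return false;
    }
}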
There is a Ping class you can use for that; more details can be found here:
http://msdn.microsoft.com/en-us/library/system.net.networkinformation.ping.aspx
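A minimal sketch of that approach (keep in mind that a ping only tells you the host answers ICMP, not that a particular page or the web server itself is healthy):
using System.Net.NetworkInformation;

// Sends an ICMP echo request to the host and checks the reply status.
var ping = new Ping();
PingReply reply = ping.Send("www.google.com", 3000); // 3 second timeout
bool hostUp = reply.Status == IPStatus.Success;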
I did something similar when I wrote a torrent client to check valid tracker URLs. I'm pretty sure I found the answer on SO but can't seem to find it anymore; here's the code sample I have from that post.
using (var client = new HeadOnlyWebClient())
{
    client.HeadOnly = true; // only issue a HEAD request, don't download the body
    // exists - completes without throwing
    string address1 = client.DownloadString("http://google.com");
    // doesn't exist - throws a WebException with a 404
    string address2 = client.DownloadString("http://google.com/sdfsddsf");
}
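Note that the stock WebClient has no HeadOnly property, so the snippet above presumably relied on a small WebClient subclass along these lines (HeadOnlyWebClient and its HeadOnly property are illustrative names):
class HeadOnlyWebClient : WebClient
{
    // When true, turn outgoing GET requests into HEAD requests so no body is downloaded.
    public bool HeadOnly { get; set; }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (HeadOnly && request.Method == "GET")
        {
            request.Method = "HEAD";
        }
        return request;
    }
}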
Hello, I am trying to launch an intent with a WebView from a user-entered URL. I have been looking everywhere online and I can't find a concrete answer as to how to make sure the website will actually connect before allowing the user to proceed to the next activity. I have found many tools to make sure the URL follows the correct format, but none that let me make sure it can actually connect.
You can use WebClient (here via the HeadOnlyClient subclass defined below) and check whether any exception is thrown:
using (var client = new HeadOnlyClient())
{
    try
    {
        client.DownloadString("http://google.com");
    }
    catch (Exception ex)
    {
        // URL is not accessible.
    }
}
You can catch more specific exceptions to make it more elegant (see the sketch below).
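For instance, a rough sketch that catches WebException specifically and reads the HTTP status code from the failed response:
using (var client = new HeadOnlyClient())
{
    try
    {
        client.DownloadString("http://google.com");
    }
    catch (WebException webEx)
    {
        // webEx.Response is null for timeouts and DNS failures; otherwise it carries the HTTP status.
        var errorResponse = webEx.Response as HttpWebResponse;
        if (errorResponse != null)
        {
            Console.WriteLine("Server answered with status {0}", errorResponse.StatusCode);
        }
    }
}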
You can also use a custom WebClient subclass that sends HEAD requests only, to decrease the amount of data downloaded:
class HeadOnlyClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        // Force every request issued by this client to use HEAD so no response body is transferred.
        WebRequest req = base.GetWebRequest(address);
        req.Method = "HEAD";
        return req;
    }
}
I would suggest using HttpHead for a simple request with AndroidHttpClient, but it is deprecated now. You could try implementing a HEAD request over sockets instead.
You can try to ping the address first.
See this SO question: How to Ping External IP from Java Android
Another option:
Connectivity Plugin for Xamarin and Windows
Task<bool> IsReachable(string host, int msTimeout = 5000);
But any pre-check that succeeds is no guarantee, as the very next request might still fail, so you should handle that failure anyway.
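If you go the Connectivity Plugin route, usage looks roughly like this; I'm assuming the plugin's CrossConnectivity.Current entry point here, so treat it as a sketch rather than verified code:
// Inside an async method: returns false if the host doesn't answer within 5 seconds.
bool reachable = await CrossConnectivity.Current.IsReachable("google.com", 5000);
if (!reachable)
{
    // Treat the site as down, but still handle failures on the real request.
}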
Here's what I ended up doing to check if a host name is reachable. I was connecting to a site with a self-signed certificate, which is why I have the delegate in the ServerCertificateValidationCallback.
private async Task<bool> CheckHostConnectionAsync (string serverName)
{
    string Message = string.Empty;
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(serverName);

    // Note: this callback is global and accepts any certificate; it's only here because
    // the target site uses a self-signed certificate.
    ServicePointManager.ServerCertificateValidationCallback += delegate
    {
        return true;
    };

    // Set the credentials to the current user account
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Method = "GET";
    request.Timeout = 1000 * 40;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse) await request.GetResponseAsync ())
        {
            // Do nothing; we're only testing to see if we can get the response
        }
    }
    catch (WebException ex)
    {
        Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
        return false;
    }

    if (Message.Length == 0)
    {
        goToMainActivity (serverName);
    }
    return true;
}
I am trying to swap my website over to consuming the new Twitter 1.1 API with uses OAuth 1.0a. I am able to get the correct response using a REST client and I am now trying to duplicate that on my website using c#.
I have constructed my headers the appropriate way and I have verified that they are in the correct format for what Twitter is looking for.
The issue I am having is that I do not think I am actually sending the request. I say this because my application returns almost instantly. The request should take at least a second or so to send, and my response is totally empty, with no 401 or 400 status code.
Below is the code that should send the request. Am I actually sending the request, and if so, why am I not getting any status code back?
Thanks in advance for the help.
//string url = "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=MYSCREENNAME&count=2";
string url = "https://api.twitter.com/1.1/statuses/user_timeline.json";
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.Method = "GET";
webRequest.Headers.Add("Authorization", authorizationHeaderParams);
try {
    var response = webRequest.GetResponse() as HttpWebResponse;
    if (response != null && response.StatusCode != HttpStatusCode.OK) {
        lblresponse.InnerText = "The request did not complete and returned status code: " + response.StatusCode;
    }
    if (response != null) {
        var reader = new StreamReader(response.GetResponseStream());
        reader.ReadToEnd();
        lblresponse.InnerText += "success";
    }
} catch {
    lblresponse.InnerText += "fail";
}
So this code goes straight to the catch block. My thought is that I am not actually sending the request, since it takes no time to happen. I know there are libraries designed to make this easier, but I would much rather learn how to do it myself (with your help).
Thanks.
The request is going to throw an exception in the case of a 400 or 401, so catch System.Net.WebException in the catch block and inspect its Response to see whether it was a 400 or 401.
catch (System.Net.WebException ex) {
    // ex.Response can be null (e.g. on a timeout or DNS failure), so check before casting.
    var errorResponse = ex.Response as HttpWebResponse;
    if (errorResponse != null) {
        var statusCode = errorResponse.StatusCode;
    }
    lblresponse.InnerText += "fail";
}
I want my program in C# to check whether a website is online prior to executing. How would I make my program ping the website and check for a response in C#?
A ping only tells you the host is reachable; it does not tell you whether there is really a web service running there.
My suggestion is to perform an HTTP HEAD request against the URL:
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("your url");
request.AllowAutoRedirect = false; // find out if this site is up and don't follow a redirector
request.Method = "HEAD";
try {
response = request.GetResponse();
// do something with response.Headers to find out information about the request
} catch (WebException wex)
{
//set flag if there was a timeout or some other issues
}
This will not actually fetch the HTML page, but it will help you find out the minimum of what you need to know.
You can use System.Net.NetworkInformation.Ping; see below.
var ping = new System.Net.NetworkInformation.Ping();
var result = ping.Send("www.google.com");
if (result.Status != System.Net.NetworkInformation.IPStatus.Success)
return;
A small remark on Digicoder's code, and a complete example of the Ping method:
private bool Ping(string url)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
        request.Timeout = 3000;
        request.AllowAutoRedirect = false; // find out if this site is up and don't follow a redirector
        request.Method = "HEAD";
        using (var response = request.GetResponse())
        {
            return true;
        }
    }
    catch
    {
        return false;
    }
}
if (!NetworkInterface.GetIsNetworkAvailable())
{
    // The network is not available.
    return;
}

Uri uri = new Uri("http://stackoverflow.com/any-uri");
Ping ping = new Ping();
PingReply pingReply = ping.Send(uri.Host);
if (pingReply.Status != IPStatus.Success)
{
    // The website is not available.
    return;
}
The simplest way I can think of is something like:
WebClient webClient = new WebClient();
byte[] result = webClient.DownloadData("http://site.com/x.html");
DownloadData will throw a WebException if the website is not online (or returns an error status).
There is probably a similar way to just ping the site, but it's unlikely that the difference will be noticeable unless you are checking many times a second.
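Wrapped up as a reusable check, that looks roughly like this (IsSiteUp is just an illustrative name):
// Minimal sketch: returns false if the download throws, true otherwise.
private static bool IsSiteUp(string url)
{
    try
    {
        using (var webClient = new WebClient())
        {
            webClient.DownloadData(url);
        }
        return true;
    }
    catch (WebException)
    {
        return false;
    }
}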
I have tested a few plugins for Firefox and Chrome which can identify the IP address of a given website, of course. But some of them can also show what server-side technology the website runs on.
How do they do this? I know about the client user-agent; is there something similar in the HTTP protocol where the server sends a "server-host-agent" kind of string?
And if so, how would the code for retrieving this look? I guess it's something with WebClient?
Anyone?
Using HttpWebRequest and setting the Method property to HEAD, you can do an HTTP HEAD request, which is very lightweight. It will return the HTTP headers (which may or may not be truthful); the Server and X-Powered-By headers are the closest thing to the "server-host-agent" string you're describing. The headers may also differ from server to server, as there is no standard for which headers a server should expose.
The code:
HttpWebRequest myReq = (HttpWebRequest)WebRequest.Create("http://www.contoso.com/");
myReq.Method = "HEAD";
WebResponse myRes = myReq.GetResponse();
for (int i = 0; i < myRes.Headers.Count; ++i) {
    Console.WriteLine(
        "\nHeader Name:{0}, Value :{1}",
        myRes.Headers.Keys[i], myRes.Headers[i]
    );
}
EDIT: if the server answers with an error status (e.g. a 500), GetResponse throws a WebException; you can still read the status code from the exception's Response:
var request = (HttpWebRequest)WebRequest.Create("http://www.http500.com");
try
{
    var response = request.GetResponse();
}
catch (WebException wex)
{
    // Safe cast to HttpWebResponse using 'as', will return null if unsuccessful
    var httpWebResponse = wex.Response as HttpWebResponse;
    if (httpWebResponse != null)
    {
        var httpStatusCode = httpWebResponse.StatusCode;
        // HttpStatusCode is an enum, cast it to int for its actual value
        var httpStatusCodeInt = (int)httpWebResponse.StatusCode;
    }
}
How can I check whether a page exists at a given URL?
I have this code:
private void check(string path)
{
    try
    {
        Uri uri = new Uri(path);
        WebRequest request = WebRequest.Create(uri);
        request.Timeout = 3000;
        WebResponse response;
        response = request.GetResponse();
    }
    catch(Exception loi) { MessageBox.Show(loi.Message); }
}
But that gives an error message about the proxy. :(
First, you need to understand that your question is at least twofold.
You must first check whether the server is responsive, using ping for example. While doing this, consider the timeout: after how long will you consider the page as not existing?
Second, try retrieving the page using one of the many methods you can find on Google. Again, you need to consider the timeout: if the server takes a long time to reply, the page might still "be there", but the server is simply under heavy load.
If the proxy needs to authenticate you with your Windows credentials (e.g. you are in a corporate network) use:
WebRequest request=WebRequest.Create(url);
request.UseDefaultCredentials=true;
request.Proxy.Credentials=request.Credentials;
try
{
    Uri uri = new Uri(path);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Timeout = 3000;
    HttpWebResponse response;
    response = (HttpWebResponse)request.GetResponse();
    if (response.StatusCode == HttpStatusCode.OK)
    {
        // great - something is there
    }
}
catch (Exception loi)
{
    MessageBox.Show(loi.Message);
}
You can also check the content type and length; see MSDN: HttpWebResponse.
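For instance, continuing from the response in the snippet above:
// The response object exposes the Content-Type header and the declared body length.
string contentType = response.ContentType;
long contentLength = response.ContentLength; // -1 if the server did not send a Content-Length header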
At a guess, without knowing the specific error message or path, you could try casting the WebRequest to an HttpWebRequest and then setting its Proxy property.
See MSDN: HttpWebRequest - Proxy Property
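A minimal sketch of that, assuming a proxy at http://proxy.example.com:8080 (substitute your real proxy address):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(path);
// Point the request at an explicit proxy instead of relying on the system default.
request.Proxy = new WebProxy("http://proxy.example.com:8080", true);
request.Proxy.Credentials = CredentialCache.DefaultCredentials;
WebResponse response = request.GetResponse();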