Is it possible to determine whether a website is online or offline other than by using the HttpWebRequest class?
Please do not suggest using the Ping method, because I want to check website availability, not just whether the host is reachable.
You should send an AJAX call to your own domain. Add a random number after the ? (this tests reachability even if the page is cached). This is how it looks in JavaScript:
function hasInternets() {
    console.log("hasInternets: " + window.location.href.split("?")[0] + "?" + Math.random());
    var s = $.ajax({
        type: "HEAD",
        url: window.location.href.split("?")[0] + "?" + Math.random(),
        async: false
    }).status;
    console.log("s: " + s);
    // thanks to http://www.louisremi.com/2011/04/22/navigator-online-alternative-serverreachable/
    return (s >= 200 && s < 300) || s === 304;
}
If you're looking for C# code for this, this JavaScript method should at least give you an idea of the approach.
Is it possible to determine whether a website is online or offline other than by using the HttpWebRequest class?
No, there isn't any other way. You need to send an HTTP request to the site and check the returned status code. In .NET this is usually done with the WebClient class, but you could use WebRequest as well if you prefer. To avoid wasting bandwidth you can use the HEAD verb: this instructs the web server not to send a response body, just the status code, which you can check for 200.
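For completeness, here is a minimal C# sketch of that idea (the URL and timeout value are placeholders); it sends a HEAD request and, mirroring the JavaScript above, treats a 2xx or 304 status as "online":
using System.Net;

public static class SiteChecker
{
    // Returns true if the site answers a HEAD request with a 2xx or 304 status.
    public static bool IsSiteOnline(string url)
    {
        try
        {
            var request = WebRequest.Create(url);
            request.Method = "HEAD";      // ask for headers only, no response body
            request.Timeout = 5000;       // fail fast if the site is unreachable

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                int status = (int)response.StatusCode;
                return (status >= 200 && status < 300) || status == 304;
            }
        }
        catch (WebException)
        {
            // DNS failure, timeout, connection refused, or a 4xx/5xx status
            return false;
        }
    }
}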
Another option is to load an image from the site with a JavaScript Image object and handle the events it fires:
onabort - loading of the image was interrupted
onerror - an error occurred while loading the image
onload - the image finished loading
For more info: http://www.w3schools.com/jsref/dom_obj_image.asp
Related
I'm pretty sure this isn't possible but I thought I'd ask...
I have a FileResult which returns a file; when it works there's no problem. When there's an error in the FileResult for some reason, I'd like to display an exception error message on the user's screen, preferably in a popup.
Can I do an Ajax post which returns a file when successful and displays a message if not?
I think it is not possible, because in order to handle the AJAX response you would have to write a JavaScript handler on the client side, and JavaScript cannot do file I/O on the client side.
However, what you can do is make an AJAX request to check whether the file exists and can be downloaded. If not, respond to that request negatively, which will pop up a dialog on the client side. If the check succeeds, make a regular file download request; a sketch follows.
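A minimal sketch of the server side of that two-step idea (the action and folder names are made up for illustration): a lightweight action tells the client via JSON whether the file is available, and the client only navigates to the real download URL when the check succeeds.
// Hypothetical pre-check action, called via AJAX before the download.
public ActionResult CheckFile(string fileUniqueName)
{
    var rootPath = Server.MapPath("~/UploadedFiles");
    var fullPath = System.IO.Path.Combine(rootPath, fileUniqueName);

    if (!System.IO.File.Exists(fullPath))
        return Json(new { ok = false, message = "File not found." },
                    JsonRequestBehavior.AllowGet);

    // File is there; the client can now request the real FileResult action.
    return Json(new { ok = true }, JsonRequestBehavior.AllowGet);
}
On the client, show the popup when ok is false, otherwise set window.location to the download action.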
Not specifically related to MVC but...
It can be done using XMLHttpRequest in conjunction with the new HTML5 File System API, which allows you to deal with binary data (fetched from the HTTP response in your case) and save it to the local file system.
See example here: http://www.html5rocks.com/en/tutorials/file/xhr2/#toc-example-savingimages
Controller (MyApiController) Code:
public ActionResult DownloadFile(String FileUniqueName)
{
var rootPath = Server.MapPath("~/UploadedFiles");
var fileFullPath = System.IO.Path.Combine(rootPath,FileUniqueName);
byte[] fileBytes = System.IO.File.ReadAllBytes(fileFullPath);
return File(fileBytes, System.Net.Mime.MediaTypeNames.Application.Octet, "MyDownloadFile");
}
jQuery code:
$(document).on("click", "a", function () { // on click of an anchor tag
    var funame = $(this).attr('uname'); // the anchor's "uname" attribute contains the file's unique name
    var url = "http://localhost:14211/MyApi/DownloadFile?FileUniqueName=" + funame;
    window.open(url);
});
I have an ASP.NET MVC web site. One of my routes is a URL that takes 5 parameters. For the sake of illustration, these parameters are named parameter1, parameter2, parameter3, parameter4, and parameter5. Currently, I'm constructing a URL in some C# code that will POST to the MVC action via a WebClient. That code looks like this:
WebClient myWebClient = new WebClient();
myWebClient.UploadStringCompleted += myWebClient_UploadStringCompleted;
string url = "http://www.example.com/customer/" + parameter1 + "/orders/" + parameter2 + "/" + parameter3 + "/" + parameter4 + "/" + parameter5;
myWebClient.UploadStringAsync(new Uri(url, UriKind.Absolute));
I'm confident that the UploadString method does a POST. I need to do a POST because my parameter values can be very long; in fact, I estimate that occasionally the total URL length may be 20,000 characters. Regardless, I get a 400 error when I attempt to post my data. In an effort to debug this, I'm trying to figure out how to simulate a POST in Fiddler.
Assuming that I am passing values via a query string as shown above, what values do I enter into Fiddler? From the Composer tab, I'm not sure what to enter for the Request Headers area. I'm also not entirely sure what to enter for the url. I'm not sure if I put the entire URL in there, including the parameter values, or if those belong in the Request Headers.
What do I need to enter into Fiddler so that I can debug my issue?
Basically, all your parameters are part of the URL, and this is the root of your problem. Here is what is going on: you are hitting the URL length limitation and receiving a "400 Bad Request" error. In the real world, most web browsers do not work with URLs more than about 2000 characters long, and web servers enforce their own URL length limits as well.
To resolve this problem I would suggest a bit of refactoring, so that the request is posted to the URL http://www.example.com/customer/parameter1/orders, or even http://www.example.com/customer/orders, with the parameters sent in the request body. Here is how to test such a request in Fiddler:
On the Composer tab, choose the POST verb.
Specify the URL as
http://www.example.com/customer/parameter1/orders
or
http://www.example.com/customer/orders
In the Request Headers section you can set the content type header, e.g.
Content-Type: application/x-www-form-urlencoded
or any other header you might require, or just leave it blank, which will also work in your case.
Finally, in the Request Body field list your parameters in query-string form:
parameter1name=parameter1value&parameter2name=parameter2value
In this new setup, here is how you can send such a request using WebClient:
WebClient myWebClient = new WebClient();
myWebClient.UploadStringCompleted += myWebClient_UploadStringCompleted;
// Tell the server the body is a form post so it can bind the parameters.
myWebClient.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
string url = "http://www.example.com/customer/orders";
string data = "parameter1name=parameter1value&parameter2name=parameter2value";
myWebClient.UploadStringAsync(new Uri(url, UriKind.Absolute), data);
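On the server side, the matching action would then read the values from the posted body rather than from the route. A rough sketch (the action and parameter names are illustrative):
[HttpPost]
public ActionResult Orders(string parameter1name, string parameter2name)
{
    // Model binding fills these from the form-urlencoded body,
    // so the URL stays short no matter how long the values are.
    // ... process the values ...
    return new HttpStatusCodeResult(200);
}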
I simply mimic the exact request that was sent.
This is how I do it:
Open Fiddler
Go to the page and repeat the buggy step, watching for the request to appear in Fiddler's session list
Select it from the list, right-click, and go to Replay > Reissue and Edit
This builds a replicated request but hits a breakpoint before it is sent (you will see the red bar on the right)
Above this you can edit the values that were sent by double-clicking on any of them in Headers, QueryString, etc.
Then hit Run to Completion
I have an application that contains a button; on click of this button, it will open a browser window using a URL with query string parameters (the URL of a page that I am coding).
Is there a way to ensure that the URL is coming from my application, and only from my application, and not just anyone typing the URL manually in a web browser?
If not, what is the best way to ensure that a specific URL is requested from a specific application and not just manually entered in the address bar of a web browser?
I'm using ASP.NET.
You can check if the request was made from one of the pages of your application using:
Request.UrlReferrer != null && Request.UrlReferrer.ToString().Contains("mywebsite.com")
That's the simple way.
The secure way is to put a cookie on the client containing a value encrypted using a secure key or hashed using a secure salt. If the cookie is set to expire when the page is closed it should be impossible for someone to forge.
Here's an example:
On the pages that would redirect to the page you are trying to protect:
HttpCookie cookie = new HttpCookie("SecureCheck");
//don't set the cookie's expiration so it's deleted when the browser is closed
cookie.Value = System.Web.Security.FormsAuthentication.HashPasswordForStoringInConfigFile(Session.SessionID, "SHA1");
Response.Cookies.Add(cookie);
On the page you are trying to protect:
//check that the cookie is there and has the correct value
HttpCookie check = Request.Cookies["SecureCheck"];
if (check == null || System.Web.Security.FormsAuthentication.HashPasswordForStoringInConfigFile(Session.SessionID, "SHA1") != check.Value)
    throw new Exception("Invalid request. Please access this page only from the application.");
//if we got this far the exception was not thrown and we are safe to continue
//insert whatever code here
There's no reliable way to do this for a GET request, nor is there any reason to try for a legitimate user. What you should do instead is ensure that, regardless of where the request comes from, the user has the proper permissions and access rights, and that the session is protected appropriately (HTTP-only cookies, SSL, etc.). If the request changes data, then it should be a POST, not a GET, and it should be accompanied by suitable cross-site request forgery prevention techniques (such as a cookie containing a nonce that is verified against a matching nonce on the form itself).
There is no way, other than rejecting the request if it doesn't contain a previously generated random one-time token in the parameters (that would be stored in the session, for example).
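A rough ASP.NET sketch of that token idea (the page and key names are made up): the page that issues the link stores a random token in session and appends it to the URL, and the target page consumes and verifies it.
// On the page that builds the link or redirect:
string token = Guid.NewGuid().ToString("N");
Session["OneTimeToken"] = token;
string targetUrl = "/Protected.aspx?token=" + token;

// On the protected page, before doing any work:
string expected = Session["OneTimeToken"] as string;
Session["OneTimeToken"] = null;   // one-time: consume the token immediately
if (expected == null || Request.QueryString["token"] != expected)
    throw new HttpException(403, "Invalid request.");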
While there is no 100% secure way to do this, what I am suggesting might at least take care of your basic needs.
This is what you can do:
Client: add an HTTP header containing an encoded string, such as a SHA-256 hash of some shared word. Then make your client always do a POST request instead of a GET.
Server: check the HTTP header for the encoded string, and also make sure the request is a POST.
This is not 100% secure since, of course, someone smart enough could figure it out and still generate such a request, but depending on your needs you may find it good enough; a rough sketch follows.
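A rough sketch of that header-plus-POST idea (the header name, secret, and URL are made up for illustration):
using System;
using System.Net;
using System.Security.Cryptography;
using System.Text;

static class AppToken
{
    // Hash of a secret both the client application and the server know.
    public static string Compute(string secret)
    {
        using (var sha = SHA256.Create())
            return Convert.ToBase64String(sha.ComputeHash(Encoding.UTF8.GetBytes(secret)));
    }
}

class ClientExample
{
    static void Main()
    {
        // Client side: always POST, and send the custom header.
        using (var client = new WebClient())
        {
            client.Headers["X-App-Token"] = AppToken.Compute("my-shared-secret");
            client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
            client.UploadString("http://www.example.com/protected", "data=value");
        }
    }
}

// Server side, e.g. at the top of the protected action or page:
//   if (Request.HttpMethod != "POST" ||
//       Request.Headers["X-App-Token"] != AppToken.Compute("my-shared-secret"))
//       throw new HttpException(403, "Forbidden");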
You can check the referrer and the user agent, add an additional header to the request, and always do POST requests to that URL. However, considering HTTP is transmitted in plain text, somebody can always run Wireshark or Fiddler, capture the HTTP packets, and recreate the requests with all your measures in place.
Pass parameters from your application so that you can verify them on the server side.
I suggest you use an encryption algorithm and generate random text using a password (key). Then decrypt the parameter on the server side and check whether it matches your expectation.
I am not being very clear, sorry about that, but if I had to do something like this, I would do something along the lines described above.
You can check the headers in the MVC controller, e.g. Request.Headers["Accept"], to tell whether the request is coming from your own AngularJS or jQuery code.
A sample AngularJS call looks like this:
var url = ServiceServerPath + urlSearchService + '/SearchCustomer?input=' + $scope.strInput;
$http({
method: 'GET',
url: url,
headers: {
'Content-Type': 'application/json'
},.....
And on the MVC [HttpGet] action method:
[HttpGet]
[PreventDirectAccess]//It is my custom filters
// ---> /Index/SearchCustomer?input={input}/
public string SearchCustomer(string input)
{
try
{
var acceptHeader = Request.Headers["Accept"]; // requests from the app's JavaScript include application/json; plain browser navigation does not
if (acceptHeader == null || !acceptHeader.Contains("application/json")) return "Error Request on server!";
var serialize = new JavaScriptSerializer();
ISearch customer = new SearchCustomer();
IEnumerable<ContactInfoResult> returnSearch = customer.GetCustomerDynamic(input);
return serialize.Serialize(returnSearch);
}
catch (Exception err)
{
throw;
}
}
I need to check that our visitors are using HTTPS. In BasePage I check if the request is coming via HTTPS. If it's not, I redirect back with HTTPS. However, when someone comes to the site and this function is used, I get the error:
System.Web.HttpException: Server cannot append header after HTTP headers have been sent.
   at System.Web.HttpResponse.AppendHeader(String name, String value)
   at System.Web.HttpResponse.AddHeader(String name, String value)
   at Premier.Payment.Website.Generic.BasePage..ctor()
Here is the code I started with:
// If page not currently SSL
if (HttpContext.Current.Request.ServerVariables["HTTPS"].Equals("off"))
{
    // If SSL is required
    if (GetConfigSetting("SSLRequired").ToUpper().Equals("TRUE"))
    {
        string redi = "https://" +
            HttpContext.Current.Request.ServerVariables["SERVER_NAME"].ToString() +
            HttpContext.Current.Request.ServerVariables["SCRIPT_NAME"].ToString() +
            "?" + HttpContext.Current.Request.ServerVariables["QUERY_STRING"].ToString();
        HttpContext.Current.Response.Redirect(redi.ToString());
    }
}
I also tried adding this above it (a bit I used in another site for a similar problem):
// Wait until the page is completely loaded before sending anything since we re-build
HttpContext.Current.Response.BufferOutput = true;
I am using c# in .NET 3.5 on IIS 6.
Chad,
Did you try ending the response when you redirect? Response.Redirect has a second parameter that you set to true to tell the output to stop when the redirect header is issued. Or, if you are buffering the output, maybe you need to clear the buffer before doing the redirect so the buffered content is not sent out along with the redirect header.
Brian
This error usually means that something has been written to the response stream before the redirection is initiated. So you should make sure that the test for HTTPS is done fairly high up in the page lifecycle, before anything is written to the response.
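Putting those two suggestions together with the original snippet, a minimal adjustment might look like this (still assuming the check runs early in the page lifecycle):
// If the page is not on SSL and SSL is required...
if (HttpContext.Current.Request.ServerVariables["HTTPS"].Equals("off") &&
    GetConfigSetting("SSLRequired").ToUpper().Equals("TRUE"))
{
    string redi = "https://" +
        HttpContext.Current.Request.ServerVariables["SERVER_NAME"] +
        HttpContext.Current.Request.ServerVariables["SCRIPT_NAME"] +
        "?" + HttpContext.Current.Request.ServerVariables["QUERY_STRING"];

    // Discard anything already buffered so no headers or content go out first,
    // then end the response as part of the redirect (second argument = true).
    HttpContext.Current.Response.Clear();
    HttpContext.Current.Response.Redirect(redi, true);
}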
As some of you may know, you are able to POST with C#. This means you can "push" buttons on a website with WebRequest/WebResponse. Now there are also buttons on sites which work with JavaScript; they start like this:
(function($j){
$j.data(document, 'maxPictureSize', 764327);
share_init();
})(jQuery.noConflict());
Is there any way you can make those function calls from C#, with HTTP requests or any other kind of library?
I'm assuming you have a program that wants to manipulate the server "back end" for a web page by making the server think that someone pushed a button that POSTs, and sending the data that the web page would include with its POST.
The first tool you need is Microsoft Network Monitor 3.3, or another network packet tracing tool. Use this to look at the POST from the real web page. NetMon (at least) decomposes the packet into the HTTP pieces and headers, so you can easily see what's going on.
Now you will know what data the real POST is sending, and the URL to which it is sending the data (with any possible "query string" - which is unusual for a POST).
Next you need to write C# to create the same sort of POST to the same URL. It seems that you already know about HttpWebRequest/HttpWebResponse, so I won't explain them in detail. You may have noticed in your NetMon trace that the Content-Type header was application/x-www-form-urlencoded. This is most often data from an HTML form which is URL-encoded (as the name suggests), so you need to URL-encode your data before POSTing it, and you need to know the size of the encoded data for the Content-Length. HttpUtility.UrlEncode() is one method to use for this encoding.
Once you think you have it, try it and use NetMon to inspect your POST request and the response from the server. Keep going until you have duplicated what the mystery web page is doing.
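A bare-bones sketch of replaying such a POST from C# (the URL and field names are placeholders that you would take from the trace):
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Web;   // HttpUtility lives in System.Web.dll

class PostReplay
{
    static void Main()
    {
        // Field names and values captured from the real page's POST (placeholders here).
        string body = "field1=" + HttpUtility.UrlEncode("value 1") +
                      "&field2=" + HttpUtility.UrlEncode("value 2");
        byte[] bytes = Encoding.UTF8.GetBytes(body);

        var request = (HttpWebRequest)WebRequest.Create("http://www.example.com/page.aspx");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = bytes.Length;

        using (Stream stream = request.GetRequestStream())
            stream.Write(bytes, 0, bytes.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());   // the server's reply, for comparison in NetMon
    }
}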
OK, use a WebBrowser control to load the page:
webBrowser1.Navigate(url);
Then save the contents of the web browser control to a file or a string:
File.WriteAllText(@"c:\test\ajax_test.txt", webBrowser1.Document.Body.Parent.OuterHtml, Encoding.GetEncoding(webBrowser1.Document.Encoding));
Now if you look at the .txt file it should contain the HTML tags you are looking for.
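One caveat: the page loads asynchronously, so grabbing OuterHtml right after Navigate will usually be too early. A rough sketch that waits for the DocumentCompleted event before saving (the file path is just an example, and script that runs after the load may still need extra time):
using System.IO;
using System.Text;
using System.Windows.Forms;

public partial class ScrapeForm : Form
{
    // Assumes the designer added a WebBrowser control named webBrowser1.
    public void DumpRenderedHtml(string url)
    {
        webBrowser1.DocumentCompleted += (sender, e) =>
        {
            // Fires once the document has finished loading.
            File.WriteAllText(@"c:\test\ajax_test.txt",
                webBrowser1.Document.Body.Parent.OuterHtml,
                Encoding.GetEncoding(webBrowser1.Document.Encoding));
        };
        webBrowser1.Navigate(url);
    }
}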
Even when using JavaScript to do a POST, there is a POST somewhere in the JS which works the same way as a button submit. You just have to dig down to the place where the JS code posts and see how it does it. Then craft the same POST in C#.
Take, for example, ASP.NET's own __doPostBack function:
var theForm = document.forms['aspnetForm'];
if (!theForm) {
    theForm = document.aspnetForm;
}
function __doPostBack(eventTarget, eventArgument) {
    if (!theForm.onsubmit || (theForm.onsubmit() != false)) {
        theForm.__EVENTTARGET.value = eventTarget;
        theForm.__EVENTARGUMENT.value = eventArgument;
        theForm.submit();
    }
}
You can see that it digs for the form, sets the values of a couple of hidden input fields, and then submits the form. Basically, you need to fill in the same values for those inputs and submit (POST) the same form yourself, and you have reproduced the JavaScript submit; see the sketch below.
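For example, to mimic that __doPostBack call from C# you would POST the same fields (__EVENTTARGET, __EVENTARGUMENT, plus the hidden __VIEWSTATE/__EVENTVALIDATION values scraped from the page you fetched first) back to the form's action URL. A rough sketch with WebClient (the URL, control name and hidden values are placeholders):
using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class DoPostBackReplay
{
    static void Main()
    {
        // The hidden-field values must be scraped from the page you GET first;
        // everything below is a placeholder.
        var fields = new NameValueCollection
        {
            { "__EVENTTARGET",     "ctl00$MainContent$myButton" },
            { "__EVENTARGUMENT",   "" },
            { "__VIEWSTATE",       "scraped-from-the-page" },
            { "__EVENTVALIDATION", "scraped-from-the-page" }
        };

        using (var client = new WebClient())
        {
            // UploadValues sends an application/x-www-form-urlencoded POST.
            byte[] response = client.UploadValues("http://www.example.com/page.aspx", fields);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}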
You need to capture the requests and headers those buttons are sending and simulate them with HttpWebRequest. You could also take a look at WatiN if you want to automate user actions on web sites.