I'm using an .aspx page to serve an image file from the file system according to the given parameters.
Server.Transfer(imageFilePath);
When this code runs, the image is served, but no Last-Modified HTTP header is sent, unlike when the same file is requested directly by its URL on the same server.
Therefore the browser doesn't issue an If-Modified-Since header on subsequent requests and doesn't cache the response.
Is there a way to make the server create the HTTP headers as it normally does for a direct request of a file (an image in this case), or do I have to create the headers manually?
When you make a transfer to the file, the server will return the same headers as it does for an .aspx file, because it's basically executed by the .NET engine.
You basically have two options:
Make a redirect to the file instead, so that the browser makes the request for it.
Set the headers you want, and use Response.BinaryWrite (or similar) to send the file data back in the response.
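For the second option, a minimal sketch might look like the following (assuming imageFilePath holds the virtual path of a JPEG image; adjust the content type to match the actual file):
// Requires: using System; using System.IO; using System.Web;
// Sketch of option 2: set the caching headers yourself, then write the bytes.
string physicalPath = Server.MapPath(imageFilePath);
DateTime lastModified = File.GetLastWriteTimeUtc(physicalPath);

Response.ContentType = "image/jpeg";                                  // assumed image type
Response.AddHeader("Last-Modified", lastModified.ToString("R"));      // RFC1123 date format
Response.Cache.SetCacheability(HttpCacheability.Public);              // allow browser caching

// Send the raw file bytes back in the response body.
Response.BinaryWrite(File.ReadAllBytes(physicalPath));
Response.End();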
I'll expand on @Guffa's answer and share my chosen solution.
When calling the Server.Transfer method, the .NET engine treats the target like an .aspx page, so it doesn't add the HTTP headers needed for caching when serving a static file.
There are three options:
Using Response.Redirect, so the browser makes the appropriate request
Setting the needed headers and using Response.BinaryWrite to serve the content
Setting the needed headers and calling Server.Transfer
I chose the third option; here is my code:
try
{
    // Truncate to whole seconds, because the HTTP date format has no milliseconds.
    DateTime fileLastModified = File.GetLastWriteTimeUtc(MapPath(fileVirtualPath));
    fileLastModified = new DateTime(fileLastModified.Year, fileLastModified.Month, fileLastModified.Day,
                                    fileLastModified.Hour, fileLastModified.Minute, fileLastModified.Second);

    if (Request.Headers["If-Modified-Since"] != null)
    {
        DateTime modifiedSince = DateTime.Parse(Request.Headers["If-Modified-Since"]);
        if (modifiedSince.ToUniversalTime() >= fileLastModified)
        {
            // The browser's copy is still current; send 304 and no body.
            Response.StatusCode = 304;
            Response.StatusDescription = "Not Modified";
            return;
        }
    }

    // Tell the browser when the file last changed, so it can send If-Modified-Since next time.
    Response.AddHeader("Last-Modified", fileLastModified.ToString("R"));
}
catch
{
    Response.StatusCode = 404;
    Response.StatusDescription = "Not found";
    return;
}
Server.Transfer(fileVirtualPath);
When I call Response.Redirect(someUrl) I get the following HttpException:
Cannot redirect after HTTP headers have been sent.
Why do I get this? And how can I fix this issue?
According to the MSDN documentation for Response.Redirect(string url), it will throw an HttpException when "a redirection is attempted after the HTTP headers have been sent". Since Response.Redirect(string url) uses the Http "Location" response header (http://en.wikipedia.org/wiki/HTTP_headers#Responses), calling it will cause the headers to be sent to the client. This means that if you call it a second time, or if you call it after you've caused the headers to be sent in some other way, you'll get the HttpException.
One way to guard against calling Response.Redirect() multiple times is to check the Response.IsRequestBeingRedirected property (bool) before calling it.
// Causes headers to be sent to the client (HTTP "Location" response header)
Response.Redirect("http://www.stackoverflow.com");

if (!Response.IsRequestBeingRedirected)
{
    // Will not be called
    Response.Redirect("http://www.google.com");
}
Once you send any content at all to the client, the HTTP headers have already been sent. A Response.Redirect() call works by sending special information in the headers that make the browser ask for a different URL.
Since the headers were already sent, ASP.NET can't do what you want (modify the headers).
You can get around this by either (a) doing the Redirect before you output anything else, or (b) setting Response.Buffer = true before you output anything, so that no output is sent to the client until the whole page has finished executing.
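As a rough sketch of option (b), assuming this runs inside Page_Load of a Web Forms page:
// Buffer the output so nothing is flushed to the client until the page finishes.
// (Buffering is on by default, but it may have been turned off elsewhere.)
Response.BufferOutput = true;

Response.Write("Some content written before we decide to redirect...");

// Because the output above is still sitting in the buffer, the headers
// have not been sent yet, so a redirect is still possible here.
Response.Redirect("Login.aspx");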
A Redirect can only happen if the first line in an HTTP message is "HTTP/1.x 3xx Redirect Reason".
If you have already called Response.Write() or flushed some output, it's too late for a redirect. You can try calling Response.Headers.Clear() before the Redirect to see if that helps.
Check whether you have set the buffering option to false (it is true by default). For Response.Redirect to work:
Buffering should be true.
You should not have sent more data with Response.Write than the default buffer size (in which case the buffer flushes itself, causing the headers to be sent), which would prevent the redirect.
Using return RedirectPermanent(myUrl) worked for me.
You can also use the code below (where redirectUrl holds the target URL):
Response.Write("<script type='text/javascript'>");
Response.Write("window.location = '" + redirectUrl + "';</script>");
Response.Flush();
There is one simple answer for this:
You have output something else, such as text or other page output, before sending your header. That is why you get this error.
Check your code for possible output, or move the redirect to the top of your method so it is sent first.
If you are trying to redirect after the headers have been sent (if, for instance, you are doing an error redirect from a partially-generated page), you can send some client Javascript (location.replace or location.href, etc.) to redirect to whatever URL you want. Of course, that depends on what HTML has already been sent down.
My issue got resolved by adding an exception handler for the "Cannot redirect after HTTP headers have been sent" error, as shown in the code below:
catch (System.Threading.ThreadAbortException)
{
    // To handle the HTTP exception "Cannot redirect after HTTP headers have been sent".
}
catch (Exception e)
{
    // Here you can put your context.Response.Redirect("page.aspx");
}
I solved the problem using:
Response.RedirectToRoute("CultureEnabled", RouteData.Values);
instead of Response.Redirect.
Be sure that you don't use Response methods like Response.Flush() before your redirecting part.
Error
Cannot redirect after HTTP headers have been sent.
System.Web.HttpException (0x80004005): Cannot redirect after HTTP headers have been sent.
Suggestion
If you are using ASP.NET MVC, working within the same controller, and redirecting to a different action, then you don't need to call Response.Redirect at all; it's better to just use return RedirectToAction("ActionName"); or return View("ViewName");.
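A minimal sketch of what that looks like inside a controller (the controller, action, and view names here are just placeholders):
public class AccountController : Controller
{
    public ActionResult Login()
    {
        // Let MVC build the redirect response; no Response.Redirect needed.
        return RedirectToAction("Dashboard");

        // Or render a view directly instead of redirecting:
        // return View("ViewName");
    }
}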
The redirect works by sending a 'Location' header (along with a 30x status code) at the start of the response. Once the headers have been sent to the client, there is no way for the server to append that redirect information; it's too late.
If you get "Cannot redirect after HTTP headers have been sent", then try the code below.
HttpContext.Current.Server.ClearError();
// Response.Headers.Clear();

// The second argument (false) tells ASP.NET not to end the response,
// which also avoids the ThreadAbortException that Response.Redirect normally throws.
HttpContext.Current.Response.Redirect("/Home/Login", false);
There are 2 ways to fix this:
Just add a return statement after your Response.Redirect(someUrl);
(if the method's return type is not void, you will, of course, have to return a value of that type)
like so:
Response.Redirect("Login.aspx");
return;
Note that the return allows the server to perform the redirect; without it, the server continues executing the rest of your code.
Make your Response.Redirect(someUrl) the LAST executed statement in the method that is throwing the exception. Replace your Response.Redirect(someUrl) with a string VARIABLE named "someUrl", and set it to the redirect location... as follows:
//......some code
string someUrl = String.Empty;

//.....some logic
if (x == y)
{
    // comment (original location of Response.Redirect("Login.aspx");)
    someUrl = "Login.aspx";
}

//......more code

// MOVE your Response.Redirect to HERE (the end of the method):
Response.Redirect(someUrl);
return;
In ASP.NET MVC, we can now respond with a 304 code to the browser, which means the content on the server has not changed and the browser can use its local cache for this URL.
public ActionResult Image(int id)
{
    var image = _imageRepository.Get(id);
    if (image == null)
        throw new HttpException(404, "Image not found");

    if (!String.IsNullOrEmpty(Request.Headers["If-Modified-Since"]))
    {
        CultureInfo provider = CultureInfo.InvariantCulture;
        var lastMod = DateTime.ParseExact(Request.Headers["If-Modified-Since"], "r", provider).ToLocalTime();

        // Compare against the stored timestamp with milliseconds stripped,
        // since the HTTP date format only has one-second resolution.
        if (lastMod == image.TimeStamp.AddMilliseconds(-image.TimeStamp.Millisecond))
        {
            Response.StatusCode = 304;
            Response.StatusDescription = "Not Modified";
            return Content(String.Empty);
        }
    }

    var stream = new MemoryStream(image.GetImage());
    Response.Cache.SetCacheability(HttpCacheability.Public);
    Response.Cache.SetLastModified(image.TimeStamp);
    return File(stream, image.MimeType);
}
But I am a little confused about the logic in the browser. For example, when we first ask for the page http://www.test.com/index.html, it will load a JavaScript file aaa.js. But when the browser asks for another page, http://www.test.com/index2.html, this page also contains aaa.js.
Here comes the question. We know the browser has logic for HTTP caching. I assume that when the browser asks for index2.html, it will see that it already has aaa.js locally, so it will not contact the server about this file at all. In that case no 304 is returned, because the browser has not requested anything about this file. Is that the right logic?
Or does it contact the server every time to check the version of the file? In that situation, if we don't write any C# code to return a 304 status, the server will return the whole file every time. So I guess that is not the logic.
What is the relationship between the browser cache and 304 status?
Depending on the server response to the first request to aaa.js the browser may or may not request the file again the second time.
If no specific caching headers are sent by the server with the file, on the second page load the browser will send a request for aaa.js again. If the browser doesn't have the JS file in its cache, it will send the request the same as it did the first time. If aaa.js is in the browser cache it will send a request to the server containing an If-Modified-Since header with the date the file was previously downloaded. The server then checks if the file has been modified: if so it sends the new file; otherwise it sends the 304 header.
Now let's spool back to the beginning. In the initial request to aaa.js the server could include a Cache-control header telling the browser how long to cache the file for. Let's say Cache-control: max-age=3600 which instructs to cache the file for one hour (3600 seconds).
If the user visits the second page within one hour, the browser won't even send a request to the server for aaa.js, it will just use the cached file without question.
Once the hour is up and a new page is loaded, the browser requests aaa.js again.
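For reference, a sketch of what the server side of that Cache-control: max-age=3600 response could look like in ASP.NET (assuming a handler that serves aaa.js; the ~/scripts/aaa.js path and the one-hour value are just the example from above):
// Allow the browser (and proxies) to cache this response for one hour
// without asking the server again.
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetMaxAge(TimeSpan.FromSeconds(3600));

// Also expose a Last-Modified date so that, once the hour is up,
// the browser can revalidate with If-Modified-Since and get a 304.
Response.Cache.SetLastModified(System.IO.File.GetLastWriteTime(Server.MapPath("~/scripts/aaa.js")));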
I am having some trouble adding a value to the Page.Request & Page.Response headers and have the key & value stay/persist through a redirect.
I have an enum tracking code that I want to place in the headers to trace how a user goes through my site prior to their checkout.
I am using this code to add the headers to response and request context.
var RequestSessionVariable = context.Request.Headers["SessionTrackingCode"];
if (RequestSessionVariable == null)
{
    context.Response.AddHeader("SessionTrackingCode", ((int)tracker).ToString());
    context.Request.Headers.Add("SessionTrackingCode", ((int)tracker).ToString());
}
else
{
    if (!RequestSessionVariable.Contains(((int)tracker).ToString()))
    {
        RequestSessionVariable += ("," + ((int)tracker).ToString());
        context.Request.Headers["SessionTrackingCode"] = RequestSessionVariable;
        context.Response.Headers["SessionTrackingCode"] = RequestSessionVariable;
    }
}
The method call that occurs in Page_Load of the necessary controls within the website:
trackingcodes.AddPageTrackingCode(TrackingCode.TrackingCodes.ShoppingCart, this.Context);
The header SessionTrackingCode is there, but after a Response.Redirect("~/value.aspx") the RequestSessionVariable is always null. Is there something that happens on the redirect that wipes out the headers that I add? Or what am I doing wrong when adding the header key and value?
Here, this refers to:
public partial class Cart : System.Web.UI.UserControl
Headers are sent by the client on every request, so any redirect requires the client to send the headers again.
Unless you are using some special client (not a browser), any custom headers will essentially be ignored/lost across requests. A browser will only send known headers (cookies, authentication, referrer) in requests and act on another set of known headers in responses (Set-Cookie). You are using a custom header that is not known to the browser, so the browser will not read it from the response nor send it in the next request.
Your options:
switch to use cookies for your tracking (same as everyone else)
use AJAX requests to send/receive custom headers (probably not what you are looking for as urls look like regular GET/POST ones)
build custom client that will pay attention to your headers (purely theoretical, unless you building some sort of sales terminal no one will install your client to visit your site)
Note: adding headers to the request in page code does not make much sense, as the request will not be sent anywhere (it is what came from the browser).
This looks like a job for cookies, rather than http headers. The browser will not return your custom headers to you, but it will return your cookies.
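A rough sketch of the cookie-based approach, reusing the context and tracker variables from the question (the cookie name is arbitrary):
// Read any existing tracking codes from the cookie the browser sent,
// or start a fresh cookie if this is the first tracked page.
HttpCookie trackingCookie = context.Request.Cookies["SessionTrackingCode"]
                            ?? new HttpCookie("SessionTrackingCode", "");

string code = ((int)tracker).ToString();
if (!trackingCookie.Value.Contains(code))
{
    // Append the new code; the browser sends cookies back on every request,
    // including the one that follows a Response.Redirect.
    trackingCookie.Value = String.IsNullOrEmpty(trackingCookie.Value)
        ? code
        : trackingCookie.Value + "," + code;
}

context.Response.Cookies.Add(trackingCookie);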
I'm pretty sure this isn't possible but I thought I'd ask...
I have a FileResult which returns a file, when it works there's no problem. When there's an error in the FileResult for some reason, I'd like to display an exception error message on the users screen, preferably in a popup.
Can I do an Ajax post which returns a file when successful and displays a message if not?
I think it is not possible because, in order to handle the AJAX post, you would have to write a JavaScript handler on the client side, and JavaScript cannot do file I/O on the client side.
However, what you can do is make an AJAX request to check whether the file exists and can be downloaded. If not, respond to that request negatively, which will pop up a dialog on the client side. If successful, make a file download request.
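For instance, the server side of that existence check could be a small MVC action like the sketch below (the ~/UploadedFiles folder and fileUniqueName parameter are assumptions, mirroring the DownloadFile action shown further down):
// Hypothetical check action: the client calls this via AJAX first,
// and only triggers the real download if "exists" comes back true.
public ActionResult CheckFile(string fileUniqueName)
{
    var fullPath = System.IO.Path.Combine(Server.MapPath("~/UploadedFiles"), fileUniqueName);
    return Json(new { exists = System.IO.File.Exists(fullPath) }, JsonRequestBehavior.AllowGet);
}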
Not specifically related to MVC but...
it can be done using XMLHttpRequest in conjunction with the new HTML5 File System API that allows you to deal with binary data (fetched from http response in your case) and save it to the local file system.
See example here: http://www.html5rocks.com/en/tutorials/file/xhr2/#toc-example-savingimages
Controller (MyApiController) Code:
public ActionResult DownloadFile(String FileUniqueName)
{
    var rootPath = Server.MapPath("~/UploadedFiles");
    var fileFullPath = System.IO.Path.Combine(rootPath, FileUniqueName);
    byte[] fileBytes = System.IO.File.ReadAllBytes(fileFullPath);

    return File(fileBytes, System.Net.Mime.MediaTypeNames.Application.Octet, "MyDownloadFile");
}
jQuery Code:
$(document).on("click", "a", function () { // on click of an anchor tag
    var funame = $(this).attr('uname'); // the anchor's "uname" attribute contains the file's unique name
    var url = "http://localhost:14211/MyApi/DownloadFile?FileUniqueName=" + funame;
    window.open(url);
});
We are using ExpertPDF to take URLs and turn them into PDFs. Everything we do is through memory, so we build up the request, read the stream into ExpertPDF, and then write the bits to file. All the files we have been requesting so far are just plain HTML documents. Our designers update CSS files or change the HTML and re-request the documents as PDFs, but often things are getting cached. For example, if I rename the only CSS file and view the HTML page through a web browser, the page looks broken because the CSS doesn't exist. But if I request that page through the PDF generator, it still looks OK, which means the CSS is cached somewhere. Here's the relevant PDF creation code:
// Create a request
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
request.UserAgent = "IE 8.0";
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "GET";

// Send the request
HttpWebResponse resp = (HttpWebResponse)request.GetResponse();
if (resp.IsFromCache) {
    System.Web.HttpContext.Current.Trace.Write("FROM THE CACHE!!!");
} else {
    System.Web.HttpContext.Current.Trace.Write("not from cache");
}

// Read the response
pdf.SavePdfFromHtmlStream(resp.GetResponseStream(), System.Text.Encoding.UTF8, "Output.pdf");
When I check the trace file, nothing is being loaded from cache. I checked the IIS log file and found a 200 response coming from the request, even after a file had been updated (I would expect a 302). We've tried putting the No-Cache attribute on all HTML pages, but still no luck. I even turned off all caching at the IIS level. Is there anything in ExpertPDF that might be caching somewhere or something I can do to the request object to do a hard refresh of all resources?
UPDATE
I put ?foo at the end of my style href links and this updates the CSS every time. Is there a setting someplace that can prevent stylesheets from being cached so I don't have to use this inelegant solution?
Actually this is a perfectly normal solution, though I would recommend attaching something like the current date and time to the PDF link/file name (as you did for the CSS sheet) rather than foo on your style sheet. Because the date and time will ALWAYS change, you will force the download each time:
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url + DateTime.Now.ToString().Replace(":", "").Replace("-", "").Replace(" ", ""));
I would venture to guess that it is not the CSS style sheet that is being cached, but rather the PDF being cached by the client. Adding the URL variable to your stylesheet is preventing it from being cached (I think you fixed the problem, though probably not, in my opinion, in the best way). Try the tip above, and you should not have any file-caching problems.
PS. I know you can use DateTime.Now.ToString(formathere), but I am too lazy to look it up right now ;)
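A slightly tidier sketch of that idea, appending the timestamp as a query-string parameter so the URL path itself is left intact (the parameter name nocache is arbitrary):
// Append a constantly-changing value so no cache ever sees the same URL twice.
string cacheBustedUrl = url
    + (url.Contains("?") ? "&" : "?")
    + "nocache=" + DateTime.Now.ToString("yyyyMMddHHmmssfff");

HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(cacheBustedUrl);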