I am trying to compress the pages of my website to increase its speed.
I am done with JS and CSS compression.
Now I want to compress my .aspx pages before the response goes out.
I am using this code in the global.asax file of my website:
void Application_Start(object sender, EventArgs e)
{
    HttpContext incoming = HttpContext.Current;
    string oldpath = incoming.Request.Path.ToLower();
    incoming.Response.Filter = new System.IO.Compression.GZipStream(
        incoming.Response.Filter, System.IO.Compression.CompressionMode.Compress);
    HttpContext.Current.Response.AppendHeader("Content-encoding", "gzip");
    HttpContext.Current.Response.Cache.VaryByHeaders["Accept-encoding"] = true;
}
It does not give an error in Visual Studio, but when I put this code on IIS, it gives an error/exception:
Exception Details: System.Web.HttpException: Request is not available in this context
Can anyone suggest or explain what I should do?
Application_Start is executed when your web application starts, and this start is not associated with any page request; no page request has happened yet at this point. Request is not available in Application_Start.
You can use Server.MapPath() instead.
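If the goal is still per-response compression, here is a minimal sketch of the same filter logic moved to Application_BeginRequest, which fires once per request and therefore does have a Request available. The Accept-Encoding check is an addition of this sketch, not part of the original code, so the filter is only applied when the client actually supports gzip:

void Application_BeginRequest(object sender, EventArgs e)
{
    HttpContext context = HttpContext.Current;
    string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";

    // Only compress when the client advertises gzip support.
    if (acceptEncoding.ToLowerInvariant().Contains("gzip"))
    {
        context.Response.Filter = new System.IO.Compression.GZipStream(
            context.Response.Filter, System.IO.Compression.CompressionMode.Compress);
        context.Response.AppendHeader("Content-Encoding", "gzip");
        context.Response.Cache.VaryByHeaders["Accept-Encoding"] = true;
    }
}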
The issue here is an elusive one: the built-in Visual Studio Cassini web server runs requests using the older pattern of firing up the application upon the first request, which is the same as Managed Pipeline Mode = Classic in IIS. This means that there is a request object for you to access straight away, as the request is what triggered the Application_Start.
However, when you put this onto an IIS 7 box with Managed Pipeline Mode = Integrated, it will fail. This is because an integrated pipeline means that the site is started as soon as the app pool fires up, so there is no request object for it to hook into.
To solve this problem I'd recommend letting IIS compress the content rather than doing it by hand; this link has the details to get you started, and here is a good outline of the difference it can make.
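If you go the IIS route, a minimal web.config sketch looks something like this (the Dynamic Content Compression feature must be installed on the server for doDynamicCompression to take effect):

<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>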
If you're really determined to do the compression within the application, I'd suggest implementing it as an HttpModule, similar to this example.
EDIT: Another implementation of a gzipping HttpModule here.
Update 2019-04-24
Problem TL;DR: one controller call was causing the next few calls to have a ~15s delay before they even reached the controllers.
I've narrowed the cause down to a large file write in the request that causes the further delay: File.WriteAllText(htmlFilePath, reportHTML);
Original post
The Problem
I have a long-running controller request for generating a report. After the report is generated, additional HTTP requests are made to fetch the resultant images.
However, the HTTP calls for the images take about 15s between the AJAX call in the browser and when the controller action is invoked. After that, the method runs speedily.
Evidence so far
Previously, we used WCF to run the report generation on a separate machine and there was no such delay.
I've tried running both the report generation and image retrieval methods as async calls on their own threads (but on the same machine). However, the delay is still there.
The delay also only happens on the first image request after generating the report. Afterwards, there is no delay.
There is also no session state, and disabling session state has no effect.
The Ask
Does anyone know what might cause this delay? How can I get better insight into blocking in ASP.NET code or IIS processes?
Other details:
Using CoreHtmlToImage for report generation and azure storage emulator for image storage.
ASP.NET MVC is version 5.2.3 (not core)
It turns out that writing to the website's directory causes the web server to restart the site.
Source: Creating temporary files in wwroot folder ASP.Net MVC3
So if you write files using the assembly's working directory, like
var uriAssemblyPath = System.Reflection.Assembly.GetExecutingAssembly().CodeBase;
var assemblyPath = new Uri(uriAssemblyPath).LocalPath;
var baseDirectory = System.IO.Path.GetDirectoryName(assemblyPath);
You'll run into this issue.
If you need a consistent directory outside the webroot, you can use the designated temp directory
Path.GetTempPath()
Source: Where can I write a temp file from ASP.NET?
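As a rough sketch of the workaround (the helper and its fileName parameter are names I've made up for illustration), the write from the original post could target the temp directory instead:

using System.IO;

static class ReportWriter
{
    // Write the generated report outside the web root so the file write
    // cannot touch the site's directory and trigger an app-domain restart.
    public static string WriteToTemp(string fileName, string reportHTML)
    {
        string htmlFilePath = Path.Combine(Path.GetTempPath(), fileName);
        File.WriteAllText(htmlFilePath, reportHTML);
        return htmlFilePath;
    }
}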
I have a minimal ASP.NET handler (.ashx) that returns a PDF file:
public class Handler1 : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/pdf";
        context.Response.BinaryWrite(
            File.ReadAllBytes(context.Server.MapPath("~/files/GettingStarted.pdf")));
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
When I run my web application on IIS Express, the app is hosted at localhost:45050. If I browse to localhost:45050/handler1.ashx on my main development machine, the PDF is downloaded as expected. If I use the DHC Chrome extension (an HTTP client) to perform an HTTP GET on localhost:45050/handler1.ashx, an HTTP 200 OK response code is returned along with the binary data:
HOWEVER, if I run the exact same ASP.NET project on a different machine, I run into bizarre issues. With the project running locally on localhost:45050, I'm still able to browse to localhost:45050/handler1.ashx in Chrome/Firefox/IE to download the file. But when I use the DHC extension to perform an HTTP GET on localhost:45050/handler1.ashx, there is no response!
I'm able to resolve localhost:45050 (the home page of the site) via DHC on this alternate machine. The server responds with 200 OK and yields the landing page.
But when dealing with the handler that returns binary content, I cannot get any response back from the server with any HTTP client aside from the browser's URL bar. How are browsers able to resolve the HTTP response when standalone HTTP clients cannot? Does anyone have any idea what may be happening here? What would cause behavior to change across machines? I'm trying to handle the response in a JavaScript client, but I'm not getting any data back.
Any help would be greatly appreciated. Thanks!
The top answer here...
Best way to stream files in ASP.NET
...resolved the problem. It seems that writing large files in a single call is a no-no on certain servers; you have to chunk the response manually.
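For reference, a minimal sketch of what a chunked version of the handler above might look like (the 64 KB buffer size is an arbitrary choice of this sketch):

using System.IO;
using System.Web;

public class ChunkedPdfHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/pdf";
        context.Response.BufferOutput = false;

        byte[] buffer = new byte[64 * 1024];
        using (FileStream fs = File.OpenRead(
            context.Server.MapPath("~/files/GettingStarted.pdf")))
        {
            int read;
            // Stream the file in small pieces, flushing each chunk,
            // instead of a single BinaryWrite of the whole file.
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                if (!context.Response.IsClientConnected)
                    break;
                context.Response.OutputStream.Write(buffer, 0, read);
                context.Response.Flush();
            }
        }
    }

    public bool IsReusable { get { return false; } }
}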
I'm having some difficulties with my custom URL-rewriting setup. I'm not using any 3rd-party tool to manage this, just global.asax, looking at every request and processing it.
The way I am using it is like:
www.mydomain.com/site/google.com
This page contains information about the site "google.com"; I'm using the actual domain within the URL. This all works fine.
I have simplified the code that I'm using just to show you an example of how it works:
Dim myContext As HttpContext = HttpContext.Current
Dim URL As String = myContext.Request.ServerVariables("URL")

If URL.ToLower().Contains("/site/") Then
    URL = URL.Trim("/"c)
    Dim strURL As String = URL.ToLower().Split("/"c)(1)
    Redirect301("/site.aspx?url=" & strURL)
    Exit Sub
End If
The issue I'm having is that for certain domain extensions, the page just loads the custom 404 Not Found page, and I have no idea what the cause is.
Example of pages that don't load:
/site/google.ad
/site/google.cd
I'm guessing that the system thinks .cd and .ad files are actual physical files, and when it doesn't find them, it shows the custom 404 error. It doesn't actually look like the request is getting through to global.asax. In the local environment these pages load fine; the issue only occurs on the live server, which is why it has been a nightmare trying to figure it out.
Another issue I found was loading the following URL:
/site/prn.com
This shows a 404 error again, but this time not the custom one I created; it is the raw, hard-looking .NET 404 error page. This also works fine in the local environment.
There must be some IIS setting or code change I could make to get this resolved.
Thank you for your time :)
Aki
Check this link: "Error message when you try to browse a Web page that is hosted on IIS 7.0: HTTP Error 404.7 – FILE_EXTENSION_DENIED".
This is why the .ad and .cd extensions are blocked.
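If those URLs must be served, a possible fix is to re-allow the extensions in request filtering so the requests can reach your global.asax code. A sketch for an IIS 7 web.config, assuming the server-level defaults are what's denying them:

<system.webServer>
  <security>
    <requestFiltering>
      <fileExtensions>
        <remove fileExtension=".ad" />
        <add fileExtension=".ad" allowed="true" />
        <remove fileExtension=".cd" />
        <add fileExtension=".cd" allowed="true" />
      </fileExtensions>
    </requestFiltering>
  </security>
</system.webServer>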
I want to run my personal web sites via an HttpHandler (I have a web server and a static IP at home).
Eventually, I will incorporate a data access layer and domain router into the handler, but for now, I am just trying to use it to return static web content.
I have the handler mapped to all verbs and paths with no access restrictions in IIS 7 on Windows 7.
I have added a little file logging at the beginning of ProcessRequest. As it is the first thing in the handler, I use the logging to tell me when the handler is hit.
At the moment, the handler just returns a single web page that I have already written.
The handler itself is mostly just this:
using (FileStream fs = new FileStream(Request.PhysicalApplicationPath + "index.htm",
                                      FileMode.Open))
{
    fs.CopyTo(Response.OutputStream);
}
I understand that this won't work for anything but the one file.
So my issue is this: the HTML file has links to some images in it. I would expect the browser to come back to the server to get those images as new requests. I would expect those requests to fail (because they'd be mapped to index.htm). But I would expect to see the logging hit at least twice (and potentially hit recursively). However, I only see a single request. The web page comes up and the images are 'X's.
When I refresh the browser, I see another request come through, but only for the root page again. The page is basic HTML; I do not have an ASP.NET application (nor do I want one, I like HTML/CSS/JS).
What do I have to do to get more than just the first request sent from the browser? I assume I'm just totally off the mark, because I wrote an HTTP module first and, strangely, got the exact same behavior. I'm thinking I need to specify some response headers, but I don't see that in any example.
OK, this might sound a bit confusing and complicated, so bear with me.
We've written a framework that allows us to define friendly URLs. If you surf to any arbitrary URL, IIS tries to display a 404 error (or, in some cases, 403;14 or 405). However, IIS is set up so that anything directed to those specific errors is sent to an .aspx file. This allows us to implement an HttpHandler to handle the request and do stuff, which involves finding an associated template and then executing whatever's associated with it.
Now, this all works in IIS 5 and 6 and, to an extent, on IIS 7, but for one catch, which happens when you post a form.
See, when you post a form to a non-existent URL, IIS says "ah, but that URL doesn't exist" and throws a 405 "Method Not Allowed" error. Since we're telling IIS to redirect those errors to our .aspx page and therefore handling them with our HttpHandler, this normally isn't a problem. But as of IIS7, all POST information goes missing after being redirected to the 405, so you can no longer do the most trivial of things involving forms.
To solve this we've tried using an HttpModule, which preserves POST data but appears not to have an initialized Session at the time it's needed. We also tried using an HttpModule for all requests, not just the missing requests that hit 404/403;14/405, but that means things like images, CSS and JS are handled by .NET code, which is terribly inefficient.
Which brings me to the actual question: has anyone ever encountered this, and does anyone have any advice or know what to do to get things working again? So far someone has suggested using Microsoft's own URL Rewriting module. Would this help solve our problem?
Thanks.
Microsoft released a hotfix for this:
http://support.microsoft.com/default.aspx/kb/956578
Since IIS7 uses .NET from the top down, there would not be any performance overhead to using an HttpModule; in fact, several managed HttpModules are already run on every request.
When the BeginRequest event fires, the SessionStateModule may not have been added to the Modules collection yet, so if you try to handle the request during this event, no session state info will be available. Setting the HttpContext.Handler property will initialize session state if the requested handler needs it, so you can simply set the handler to your fancy 404 page that implements IRequiresSessionState. The code below should do the trick, though you may need to write a different implementation for the IsMissing() method:
using System.Web;
using System.Web.UI;

class Smart404Module : IHttpModule
{
    public void Dispose() { }

    public void Init(HttpApplication context)
    {
        context.BeginRequest += new System.EventHandler(DoMapping);
    }

    void DoMapping(object sender, System.EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        if (IsMissing(app.Context))
            app.Context.Handler = PageParser.GetCompiledPageInstance(
                "~/404.aspx", app.Request.MapPath("~/404.aspx"), app.Context);
    }

    // The request is missing unless it maps to an existing file, or to a
    // directory that contains a default.aspx.
    bool IsMissing(HttpContext context)
    {
        string path = context.Request.MapPath(context.Request.Url.AbsolutePath);
        if (System.IO.File.Exists(path) || (System.IO.Directory.Exists(path)
            && System.IO.File.Exists(System.IO.Path.Combine(path, "default.aspx"))))
            return false;
        return true;
    }
}
Edit: I added an implementation of IsMissing()
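To wire the module up under the integrated pipeline, registration would look something like this (the type string assumes the class above is compiled into the web application; adjust it if the module lives in a named assembly):

<system.webServer>
  <modules>
    <add name="Smart404Module" type="Smart404Module" />
  </modules>
</system.webServer>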
Note: On IIS7, the session state module does not run globally by default. There are two options: enable the session state module for all requests (see my comment above regarding running managed modules for all request types), or use reflection to access internal members inside System.Web.dll.
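A sketch of the first option, re-registering the session state module without its managedHandler precondition so it runs on every request (with the performance caveat noted above):

<system.webServer>
  <modules>
    <remove name="Session" />
    <add name="Session" type="System.Web.SessionState.SessionStateModule" preCondition="" />
  </modules>
</system.webServer>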
The problem in IIS 7 of POST variables not being passed through to custom error handlers is fixed in Service Pack 2 for Vista. I haven't tried it on Windows Server, but I'm sure it will be fixed there too.
Just a guess: the handler specified in IIS7's %windir%\system32\inetsrv\config\applicationHost.config that is handling your request is not allowing the POST verb through at all, and it is evaluating that rule before determining whether the URL exists.
Yes, I would definitely recommend URL rewriting (using Microsoft's IIS7 module or one of the many alternatives). It is specifically designed for providing friendly URLs, whereas error documents are a last-ditch backstop for failures, which tends to munge the incoming data so it may not be what you expect.
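For illustration, a rule for Microsoft's URL Rewrite module might look something like this (the URL pattern and target page are placeholders, not your framework's actual names):

<system.webServer>
  <rewrite>
    <rules>
      <rule name="FriendlyUrls" stopProcessing="true">
        <match url="^pages/(.+)$" />
        <action type="Rewrite" url="template.aspx?path={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>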