I am trying to create an HttpModule in C# which will redirect arbitrary URLs and missing files, and which will perform canonicalization on all URLs that come in. Part of my canonicalization process is to redirect from default documents (such as http://www.contoso.com/default.aspx) to a bare directory. (like http://www.contoso.com/)
I have discovered that when an IIS server receives a request for a bare directory, it processes this request normally, and then it creates a child request for the selected default document. This is producing a redirect loop in my module - the first request goes through just fine, but when it sees the child request it removes the default document from the url and redirects back to the bare directory, starting the process over again.
Obviously, all I need to solve this problem is for my module to know when it's seeing a child request, so that it can ignore it. But I cannot find anything online describing how to tell the two requests apart. I found that request headers persist between the two requests, so I tried adding a value to the request headers and then looking for that value. This worked in IIS 7, but apparently IIS 6 won't let you alter request headers, and my code needs to run in both.
These child requests can also be triggered by any Server.Transfer or Server.Execute calls in the code. One trick that works for detecting a child request is to add a custom request header during the first request and then check for it later (in the child request). Example:
private bool IsChildRequest(HttpRequest request)
{
    var childRequestHeader = request.Headers["x-parent-breadcrumb"];
    if (childRequestHeader != null)
    {
        return true;
    }

    request.Headers["x-parent-breadcrumb"] = "1"; // arbitrary value
    return false;
}
This works because the request headers are passed to the child request. I initially tried this with HttpContext.Current.Items, but that seemed to get reset for the child request.
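For completeness, here is a rough sketch of how that check could be wired into the module's BeginRequest handler. The module name and the placement of the canonicalization code are placeholders, not a definitive implementation:

public class CanonicalizationModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var context = ((HttpApplication)sender).Context;

            // Ignore the child request that IIS spawns for the default document,
            // otherwise the module would redirect back to the bare directory forever.
            if (IsChildRequest(context.Request))
                return;

            // ... canonicalization / redirect logic goes here ...
        };
    }

    public void Dispose() { }
}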
What's happening with your module is exactly how it should work. If your default page is Default.aspx, then IIS is bound to hand the request off to Default.aspx, which causes your module to redo its work. One thing I don't understand, though, is why you would want http://www.contoso.com/default.aspx to be redirected to http://www.contoso.com in the first place; perhaps you need to rethink that requirement. Alternatively, if possible, you could use a different default page (like http://www.contoso.com/Home.aspx) and have IIS forward bare requests to that URL.
I have PUT and POST methods in a controller that, when they succeed, redirect to the GET method. But I've noticed that the redirect process in MVC is slower than just returning the result of the GET method directly.
Sample from MS code:
return RedirectToRoute("someRoute", routeVarWithId);
What I found that takes less time:
return Get(Id);
Since my PUT, POST, and GET all return IHttpActionResult, I don't see why I should use the redirect if the call stays within my one controller and the security rights are the same.
Am I missing anything obvious here?
But I noticed that using the redirect process in MVC is slower than just returning the call of the get method.
Of course it is; there are now two requests.
Am I missing anything obvious here?
Imagine I submit a form as a POST request to order a new computer from your website. Instead of returning a redirect to my order page, it just renders it out. Then my cat jumps on the keyboard and hits CTRL+R (refresh). What happens? My browser resubmits the last request, which was the POST. Now I've ordered two computers!
Instead, after successfully processing the POST request, you should return a redirect to the order page, which my browser will fetch with a GET. Now I can refresh to my heart's content and nothing bad will ever happen.
This also gives me the ability to bookmark the page or email it to my wife. You can't email a link that performs a POST request.
For some light reading on the topic, refer to the HTTP/1.1 standard, specifically section 9.5 and following:
If a resource has been created on the origin server, the response
SHOULD be 201 (Created) and contain an entity which describes the
status of the request and refers to the new resource, and a Location
header (see section 14.30).
So when a POST creates a new resource, like an Order, it should return 201 Created with a Location header pointing at the URL where the new resource (i.e. the order) can be retrieved.
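For instance, in a Web API 2 controller the pattern might look roughly like this. The controller, DTO, helper methods, and the "someRoute" route name are all placeholders for illustration, not the asker's actual code:

// Hypothetical Web API 2 controller illustrating POST-redirect-GET with a
// 201 Created response rather than returning the GET result in-place.
public class OrdersController : ApiController
{
    // POST api/orders
    public IHttpActionResult Post(OrderDto newOrder)
    {
        int id = SaveOrder(newOrder); // hypothetical persistence call

        // 201 Created: the Location header points at the GET url for the new
        // resource, so the client follows up with a safe, repeatable GET.
        return CreatedAtRoute("someRoute", new { id = id }, newOrder);
    }

    // GET api/orders/{id}  (registered under the route name "someRoute")
    public IHttpActionResult Get(int id)
    {
        return Ok(LoadOrder(id)); // hypothetical lookup
    }
}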
I want to run my personal web sites via an httphandler (I have a web server and static ip at home.)
Eventually, I will incorporate a data access layer and domain router into the handler, but for now, I am just trying to use it to return static web content.
I have the handler mapped to all verbs and paths with no access restrictions in IIS 7 on Windows 7.
I have added a little file logging at the beginning of ProcessRequest. Since it is the first thing in the handler, the logging tells me when the handler is hit.
At the moment, the handler just returns a single web page that I have already written.
The handler itself is mostly just this:
// inside ProcessRequest(HttpContext context)
using (FileStream fs = new FileStream(context.Request.PhysicalApplicationPath + "index.htm",
    FileMode.Open))
{
    fs.CopyTo(context.Response.OutputStream);
}
I understand that this won't work for anything but the one file.
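Presumably a more general version would need to map each requested path back to a physical file, roughly like the sketch below (the class name is invented, and MIME types and path validation are ignored):

using System.IO;
using System.Web;

public class StaticSiteHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Map the requested app-relative path to a physical file,
        // falling back to index.htm for the root request.
        string relative = context.Request.AppRelativeCurrentExecutionFilePath.TrimStart('~', '/');
        if (relative.Length == 0)
            relative = "index.htm";

        string physical = Path.Combine(context.Request.PhysicalApplicationPath, relative);

        using (FileStream fs = File.OpenRead(physical))
        {
            fs.CopyTo(context.Response.OutputStream);
        }
    }
}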
So my issue is this: the HTML file has links to some images in it. I would expect that the browser would come back to the server to get those images as new requests. I would expect those requests to fail (because they'd be mapped to index.htm). But I would expect to see the logging hit at least twice (and potentially hit recursively). However, I only see a single request. The web page comes up and the images are 'X's.
When I refresh the browser, I see another request come through, but only for the root page again. The page is basic HTML, I do not have an asp.net application (nor do I want one, I like HTML/CSS/JS).
What do I have to do to get the browser to send more than just that first request? I assume I'm just totally off the mark here; I wrote an HttpModule first and, strangely, got the exact same behavior. I'm thinking I need to specify some response headers, but I don't see that in any example.
I'm seeing some very strange behaviour when debugging my web application in VS2010 locally. The same user journey/sequence of pages happens in Production.
Debugging, I'm seeing this:
1. A request for MyPage.aspx comes in and is handled by thread_1.
2. There is something on that page that IIS/ASP.NET doesn't like, it seems. I'm slowly removing sections to pinpoint exactly what, but there's no JS or anything fancy there, just HTML content, user controls, etc.
3. Either way, a separate thread_2 is started and begins processing the Page_Load of my default document, i.e. home.aspx. There is logic in home.aspx.cs that clears some data.
4. So when thread_1 continues processing, its checks against the data above fail, resulting in the user being redirected to an error page.
Can anyone shed any light on why the second thread is created and why it starts to process my default document?
Please note:
I've checked the global methods for errors (e.g. Session_End, Application_Error, etc.) but found nothing.
I do intermittently see a 401 error with Failed Request Tracing logging enabled, but I don't understand how that would start the processing of my default home page.
Just to sanity check, I placed a new document, test.aspx, at the beginning of my defaultDocument list in web.config and it did get called.
It seems as though something within IIS/ASP.NET is configured to begin processing the default page on an error, but this is new behaviour to me.
I've tried researching this, but the only thing that seems like it could be related is thread agility, and I'm not too sure about that.
It seems like there are two HTTP requests running concurrently. As each request (generally) executes on its own thread, this would make sense.
HTTP requests by default do not share state. They operate on different data. For that reason this is not a thread-safety issue.
An exception to this rule is if you explicitly share state, e.g. using static variables; you shouldn't do this, for various reasons.
To debug the problem, launch Fiddler and examine the HTTP requests being executed. Also examine HttpContext.Current.Request.RawUrl on each of the two concurrent threads.
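For example, a throwaway diagnostic along these lines in Global.asax.cs would show which URL each of the two threads is actually serving (purely illustrative, not production code):

// Global.asax.cs: log the thread id and raw URL of every incoming request,
// so concurrent requests (e.g. MyPage.aspx vs. home.aspx) can be told apart.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine(string.Format(
        "Thread {0}: {1}",
        System.Threading.Thread.CurrentThread.ManagedThreadId,
        HttpContext.Current.Request.RawUrl));
}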
After removing a lot of content within the faulty MyPage.aspx, I came across the guilty line of code: btnShowPost.ImageUrl = SitePath + "post.png"; (it sat behind an if statement and was never executed), and therefore the image <asp:Image ID="btnShowPost" runat="server" /> never had its ImageUrl set.
Without it, apparently this is standard browser behaviour: any img, script, CSS reference, etc. with a missing src falls back to the page's own URL as the request target, and IIS will usually hand that request to default.aspx (or whatever the default document is).
See the full explanation at this link.
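Presumably the simplest fix is to make sure the ImageUrl always gets assigned (or the control hidden), something along these lines; the ShouldShowPost flag here is just a stand-in for my original if condition:

// MyPage.aspx.cs: always set the ImageUrl so the control never renders
// an empty src, which would make the browser request the page itself.
protected void Page_Load(object sender, EventArgs e)
{
    btnShowPost.ImageUrl = SitePath + "post.png";
    btnShowPost.Visible = ShouldShowPost; // stand-in for the original if condition
}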
I have a custom Sharepoint 2010 web part that runs the user through a series of steps in a registration process. At each step, when whatever required input is completed, the user clicks the Continue button which is a standard server side button control. The code behind does some validation and DB updates before calling Response.Redirect which refreshes the same page with updated session data.
(Note: the session data is kept in the URL as an encrypted query string parameter, not by the conventional Session object)
This solution works fine in my single server test environment, but as soon as I deploy it to a load balanced stage or production environment some requests simply time out without receiving a response after clicking Continue (ERR_TIMED_OUT).
The web part log shows that the web part is in fact calling Response.Redirect with a valid URL.
This is not a server resource issue; the timeout can be set to a minute or more and still no response is received.
It only happens when deployed to the load balanced servers.
Everything works fine when I complete a registration from one of the load balanced servers themselves, which leads me to believe there is a problem with load balancing and server sessions. I know that when interacting with a load balanced web application from one of the server nodes in the NLB, all requests go to that particular server.
I know I have faced a similar issue before, but it is several years ago and I cannot remember what the solution was.
try
{
    // get clean URL without query string parameters
    string url;
    if (string.IsNullOrEmpty(Request.Url.Query))
        url = Request.Url.AbsoluteUri;
    else
        url = Request.Url.AbsoluteUri.Replace(Request.Url.Query, "");

    // add encrypted serialized session object
    url += "?" + Constants.QueryStringParameterData + "=" + SessionData.Serialize(true);

    _log.Info("Redirecting to url '" + url + "'..");
    Response.Redirect(url);
}
catch (Exception) { }
OK, the problem has been resolved.
It turned out to be UAG that was doing something in the background, and the way I discovered it was that the links that triggered the postbacks got changed from
http://some_url.com/sites/work/al2343/page.aspx
to
http://some_other_url.domain.com/uniquesigfed6a45cdc95e5fa9aa451d1a37451068d36e625ec2be5d4bc00f965ebc6a721/uniquesig1/sites/work/al2343/page.aspx
(Take note of the "uniquesig" in there)
This was the URL the browser actually tried to redirect to, but because of whatever the issue was with UAG the navigation froze.
I don't know how they fixed it, but at least the problem was not in my component.
One possibility is that Request.Url is how the particular server sees the URL (something like http://internalServer44/myUrl) instead of the externally visible load-balanced URL (like http://NlbFarmUrl/myUrl).
In the case of SharePoint it would be better to use the SPContext.Current.Site/Web properties to get the base portion of the URL, since those URLs should already be in the externally visible form.
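Something along these lines might work in the web part. This is only a sketch; it ignores library/folder paths and reuses the names from the original snippet:

// Build the redirect target from the SharePoint web URL rather than Request.Url,
// so the host name is the externally visible (load-balanced) one.
string baseUrl = SPContext.Current.Web.Url;                    // e.g. http://NlbFarmUrl/sites/work/al2343
string pageName = Path.GetFileName(Request.Url.AbsolutePath);  // current page, e.g. page.aspx (Path is System.IO.Path)

string url = baseUrl + "/" + pageName
    + "?" + Constants.QueryStringParameterData + "=" + SessionData.Serialize(true);

Response.Redirect(url);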
I'm on IIS 6 and I have an ASP.Net 4.0 site that's a single page to serve as a SOAP reverse proxy. I have to modify the return content in order to delete a trouble node from the response and add a tracking node.
In order to facilitate its function as a reverse proxy for all addresses, I have the 404 on the server set to a custom "URL" of "/default.aspx" (the page for my app)
For requests without a payload, it works perfectly - such as for ?WSDL Urls. It requests the proper URL from the target system, gets the response and sends it back - it's pretty utterly transparent in this regard.
However, when a SOAP request is being made with an input payload, the Request.InputStream in the code is always empty. Empty - with one exception - using SOAPUI, I can override the end point and send the request directly to /default.aspx and it will receive the input payload. Thus, I have determined that the custom 404 handler is - when server-side transferring the request - stripping the payload. I know the payload is being sent - I have even wiresharked it on the server to be sure. But then when I add code to log the contents of Request.InputStream it's blank - even though Request.ContentLength shows the right content length for the original request.
I've also been looking for a good way to use ASP.NET to intercept the requests directly rather than letting the normal IIS 404 handler take care of them, but even with a wildcard mapping I can't seem to get the settings right, nor am I fully confident that it would help. (But I'm hoping it would?)
Finally, I don't have corporate permission to install MVC framework.
Thus, I need either some configuration for IIS I am missing to make this work properly or some other method of ensuring that I get the request payload to my web page.
Thanks!
What about using an HTTP Handler mapped to all requests?
You'll need to add a wildcard application mapping as detailed here and correctly configure your HTTP Handler.
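A bare-bones sketch of such a handler is below. The class name is made up, and the IIS 6 wildcard application mapping plus the handler registration in web.config still need to be set up as described in the linked article:

using System.IO;
using System.Web;

// A catch-all handler: with a wildcard application mapping in place, every
// request (including POSTed SOAP payloads) reaches ProcessRequest directly,
// so Request.InputStream is not emptied by the custom-404 transfer.
public class ReverseProxyHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string soapPayload;
        using (var reader = new StreamReader(context.Request.InputStream))
        {
            soapPayload = reader.ReadToEnd(); // the original SOAP envelope, if any
        }

        // ... forward the request to the target system, strip/add the nodes,
        //     and write the modified response back to context.Response here ...
    }
}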