Cache-Control Headers in ASP.NET - c#

I am trying to set the cache-control headers for a web application (and it appears that I'm able to do it), but I am getting what I think are odd entries in the response headers. My implementation is as follows:
protected override void OnLoad(EventArgs e)
{
    // Set Cacheability...
    DateTime dt = DateTime.Now.AddMinutes(30);
    Response.Cache.SetExpires(dt);
    Response.Cache.SetMaxAge(new TimeSpan(dt.ToFileTime()));

    // Complete OnLoad...
    base.OnLoad(e);
}
And this is what the request and response headers show:
-----
GET /Pages/Login.aspx HTTP/1.1
Host: localhost:1974
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.10) Gecko/2009042316 Firefox/3.0.10 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
X-lori-time-1: 1244048076221
Cache-Control: max-age=0
HTTP/1.x 200 OK
Server: ASP.NET Development Server/8.0.0.0
Date: Wed, 03 Jun 2009 16:54:36 GMT
X-AspNet-Version: 2.0.50727
Content-Encoding: gzip
Cache-Control: private, max-age=31536000
Expires: Wed, 03 Jun 2009 17:24:36 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 6385
Connection: Close
-----
Why does the "Cache-Control" property show up twice?
Do I need both "Cache-Control" and the "Expires" properties?
Is "Page_Load" the best place to put this code?
Thanks!

You might also want to add this line if you are setting the max age that far out:
// Summary:
// Sets Cache-Control: public to specify that the response is cacheable
// by clients and shared (proxy) caches.
Response.Cache.SetCacheability(HttpCacheability.Public);
I do a lot of response header manipulation with documents and images from a file handler that processes requests for files that are saved in the DB.
Depending on your goal, you can really force the browser to cache almost all of your page locally for days (if that's what you want/need).
edit:
I also think you might be setting the max age wrong...
Response.Cache.SetMaxAge(new TimeSpan(dt.Ticks - DateTime.Now.Ticks ));
This line sets it to a 30-minute cache time in the local browser [max-age=1800].
As for the two Cache-Control lines... you might want to check whether IIS has been configured to add the header automatically.

I don't see Cache-Control appearing twice. One is in the request, one is in the response. The one in the request is probably because you hit Shift+F5 in the browser or something similar.
To your second question: that depends on what you want to achieve with the cache headers.
I don't know what you wanted to achieve with the max-age. The value is way too high since you converted the DateTime incorrectly to a TimeSpan. Why don't you just use TimeSpan.FromMinutes instead?
Page load is okay. I usually mess around with HTTP headers there myself.
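For reference, a minimal sketch of what the corrected OnLoad might look like with TimeSpan.FromMinutes, as suggested above (the 30-minute window and public cacheability are illustrative choices, not requirements):
protected override void OnLoad(EventArgs e)
{
    // Cache for 30 minutes on clients and, with Public, on shared (proxy) caches.
    Response.Cache.SetCacheability(HttpCacheability.Public);
    Response.Cache.SetExpires(DateTime.Now.AddMinutes(30));
    Response.Cache.SetMaxAge(TimeSpan.FromMinutes(30));

    base.OnLoad(e);
}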

Related

Is it possible to change the order of the Headers in a HttpWebResponseMessage using the ApiController?

I have a legacy project which uses .NET Framework 4.5.2 and NancyModule.
When I get the result of a GET request, the headers come back in the following order:
Content-Length: 206
Content-Type: application/json; charset=utf-8
Vary: Accept
Server: Microsoft-HTTPAPI/2.0
Link: </Servicename.xml>; rel="application/xml"
x-powered-by: ...
Date: Tue, 31 Jan 2023 13:25:07 GMT
I am porting this project to .NET 6 and Microsoft.AspNetCore.Mvc.
When I get the result of a GET request, the header keys are arranged alphabetically.
This leads me to the following question:
Is it possible to change the order of the headers?
I tried to remove and add several values in the HttpContext.Response.Headers dictionary, but it has no entries. When I added a custom header, it also appeared in alphabetical order.
If you are using Kestrel (the default web server in ASP.NET Core) you might want to remove the Server header in order to try to have the Date header last.
But that would be very fragile; you can't really control the order of the headers (see the source code for how it's done).
For simple HTTP responses that don't set any special headers, this might work and you might end up with something like that. Note the many conditionals used in the previous sentences. 😉
HTTP/1.1 200 OK
Content-Length: 4536
Content-Type: text/plain
Date: Tue, 31 Jan 2023 14:58:31 GMT
And here's how to disable the Server header for Kestrel:
var builder = WebApplication.CreateBuilder(args);
builder.WebHost.ConfigureKestrel(serverOptions => serverOptions.AddServerHeader = false);
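For context, a minimal sketch of where that option sits in a .NET 6 minimal-API Program.cs (everything apart from the ConfigureKestrel line is just placeholder scaffolding):
var builder = WebApplication.CreateBuilder(args);
builder.WebHost.ConfigureKestrel(serverOptions => serverOptions.AddServerHeader = false);

var app = builder.Build();
app.MapGet("/", () => "Hello"); // hypothetical endpoint, just to produce a response
app.Run();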
If you need to use HTTP.sys instead of Kestrel then you'll be out of luck since the Content-Length header is added after the Date header and there's nothing you can do about it.
HTTP/1.1 200 OK
Content-Type: text/plain
Server: Microsoft-HTTPAPI/2.0
Date: Tue, 31 Jan 2023 15:11:23 GMT
Content-Length: 4536

(Instagram API) Trouble Reaching Tag Endpoints

I'm trying to gather a list of recent posts that contain a certain hashtag. The API Documentation states that I should be using the following GET call:
https://api.instagram.com/v1/tags/{tag-name}/media/recent?access_token=ACCESS-TOKEN
When I load the page where I want this information displayed, I perform the following:
using (HttpClient Client = new HttpClient())
{
    var uri = "https://api.instagram.com/v1/tags/" + tagToLookFor + "/media/recent?access_token=" + Session["instagramaccesstoken"].ToString();
    var results = Client.GetAsync(uri).Result;
    // Result handling below here.
}
For reference, tagToLookFor is a constant string defined at the top of the class (e.g. foo), and I store the access token returned from the OAuth process in the Session object with a key of 'instagramaccesstoken'.
While debugging this, I checked to make sure the URI was being formed correctly, and it does contain both the tag name and the just-created access_token. Using Apigee with the same URI (save for a different access_token) returns the valid results I would expect. However, attempting to GET using the URI on my website returns:
{
StatusCode: 400,
ReasonPhrase: 'BAD REQUEST',
Version: 1.1,
Content: System.Net.Http.StreamContent,
Headers:{
X-Ratelimit-Remaining: 499
Vary: Cookie
Vary: Accept-Language
X-Ratelimit-Limit: 500
Pragma: no-cache
Connection: keep-alive
Cache-Control: no-store, must-revalidate, no-cache, private
Date: Fri, 27 Nov 2015 21:39:56 GMT
Set-Cookie: csrftoken=97cc443e4aaf11dbc44b6c1fb9113378; expires=Fri, 25-Nov-2016 21:39:56 GMT; Max-Age=31449600; Path=/
Content-Length: 283
Content-Language: en
Content-Type: application/json; charset=utf-8
Expires: Sat, 01 Jan 2000 00:00:00 GMT
}
}
I'm trying to determine what the difference between the two could be; the only thing that I can think of is that access_token is somehow being invalidated when I switch between pages. The last thing I do on the Login/Auth page is store the access_token using Session.Add, then call Server.Transfer to move to the page that I'm calling this on.
Any ideas on what the issue could be? Thanks.
Attach the token to the header when making the request.
Client.DefaultRequestHeaders.Add("access_token", "Bearer " + token);
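A slightly fuller sketch of that suggestion, reusing tagToLookFor and the session token from the question (the header name here simply follows the line above; many APIs expect the standard Authorization header instead, so check what the API documents):
using (HttpClient client = new HttpClient())
{
    string token = Session["instagramaccesstoken"].ToString();
    client.DefaultRequestHeaders.Add("access_token", "Bearer " + token);

    var uri = "https://api.instagram.com/v1/tags/" + tagToLookFor + "/media/recent";
    var response = client.GetAsync(uri).Result;
    string body = response.Content.ReadAsStringAsync().Result;
    // Inspect response.StatusCode and body here.
}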
The problem ended up being one regarding Sandbox Mode. I had registered an app after the switch, and I was the only user in my sandbox. As a result, it had no problem finding my posts/info, but Sandbox Mode acts as if the Sandbox users are the only users on Instagram, so naturally it would not find anything else.
It turns out there was an existing registered application in my organization (made before the switch date) that does not have any such limitations, so I have been testing using that AppID/secret.
tl;dr: If you're the only user in your app's sandbox, work on getting users into your sandbox. See their article about it for more info.

Send multiple of items to WCF Dataservice at once?

I'm using WCF Data Services as a way to allow other web services in the project to connect to the database. My problem is that our project has a crawler that adds tens of items to the database every hour.
Using the AddToItems method (which is auto-generated by ADO.NET) leads to a timeout exception, or at least forces the crawler to wait a long time, considering that the AddTo method handles each item independently.
Notes:
1. I've added an interceptor on adding items, to perform some actions when a new item is added.
2. WCF Data Services service operations don't allow parameters of a user-defined data type, which prevented me from creating a service operation that takes a list of items as a parameter, so that I could handle multiple items per call and still let the client call it asynchronously.
When I tried to serialize the list so it could be passed as a string, an exception occurred because of the URL length limit, even when POST is used instead of GET.
Update: Saving changes via BeginSaveChanges and EndSaveChanges solved the problem to some extent, but I'm still looking for a better solution.
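For reference, a rough sketch of that Begin/End pattern (assuming a generated DataServiceContext subclass named Container, an Items entity set with the auto-generated AddToItems helper, and a hypothetical service URL):
var context = new Container(new Uri("http://example.com/DataService.svc")); // hypothetical service URL
foreach (var item in newItems)
{
    context.AddToItems(item); // auto-generated AddTo helper
}
// Queue all pending inserts and save them asynchronously in one batch.
context.BeginSaveChanges(SaveChangesOptions.Batch, asyncResult =>
{
    var ctx = (Container)asyncResult.AsyncState;
    DataServiceResponse response = ctx.EndSaveChanges(asyncResult);
}, context);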
Maybe the OData feature Batch can address your requirement:
var client = new Container(serviceUrl);
client.Format.UseJson();
DefaultBatchCustomer customer0ToAdd = new DefaultBatchCustomer { Id = 10, Name = "Customer 10" };
DefaultBatchCustomer customer1ToAdd = new DefaultBatchCustomer { Id = 11, Name = "Customer 11" };
client.AddToDefaultBatchCustomer(customer0ToAdd);
client.AddToDefaultBatchCustomer(customer1ToAdd);
var response = await client.SaveChangesAsync(SaveChangesOptions.BatchWithSingleChangeset);
and the request looks like:
POST http://jinfutanwebapi1:9123/DefaultBatch/$batch HTTP/1.1
OData-Version: 4.0;NetFx
OData-MaxVersion: 4.0;NetFx
Content-Type: multipart/mixed; boundary=batch_8ce61768-e8bb-4117-954b-9bc43e05baef
Accept: multipart/mixed
Accept-Charset: UTF-8
User-Agent: Microsoft ADO.NET Data Services
Host: jinfutanwebapi1:9123
Content-Length: 1772
Expect: 100-continue
--batch_8ce61768-e8bb-4117-954b-9bc43e05baef
Content-Type: multipart/mixed; boundary=changeset_b36bec94-fc3b-4d89-99cc-0610fcec8148
--changeset_b36bec94-fc3b-4d89-99cc-0610fcec8148
Content-Type: application/http
Content-Transfer-Encoding: binary
POST http://jinfutanwebapi1:9123/DefaultBatch/DefaultBatchCustomer HTTP/1.1
Content-ID: 13
OData-Version: 4.0;NetFx
OData-MaxVersion: 4.0;NetFx
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
User-Agent: Microsoft ADO.NET Data Services
{"#odata.type":"#WebStack.QA.Test.OData.Batch.Tests.DataServicesClient.DefaultBatchCustomer","Id":10,"Name":"Customer 10"}
--changeset_b36bec94-fc3b-4d89-99cc-0610fcec8148
Content-Type: application/http
Content-Transfer-Encoding: binary
POST http://jinfutanwebapi1:9123/DefaultBatch/DefaultBatchCustomer HTTP/1.1
Content-ID: 14
OData-Version: 4.0;NetFx
OData-MaxVersion: 4.0;NetFx
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
User-Agent: Microsoft ADO.NET Data Services
{"#odata.type":"#WebStack.QA.Test.OData.Batch.Tests.DataServicesClient.DefaultBatchCustomer","Id":11,"Name":"Customer 11"}
--changeset_b36bec94-fc3b-4d89-99cc-0610fcec8148--
--batch_8ce61768-e8bb-4117-954b-9bc43e05baef--

Would like to get http response results like Fiddler

I'm trying to get the same type of results that Fiddler gets when I launch a webpage from my app.
Below is the code I'm using and the results I'm getting. I've used google.com only as an example.
What do I need to modify in my code to get the results I want or do I need an entirely different approach?
Thanks for your help.
My code:
// create the HttpWebRequest object
HttpWebRequest objRequest = (HttpWebRequest)WebRequest.Create("http://www.google.com");

// get the response object, which has the header info, using the GetResponse method
var objResults = objRequest.GetResponse();

// get the header count
int intCount = objResults.Headers.Count;

// loop through the results object
for (int i = 0; i < intCount; i++)
{
    string strKey = objResults.Headers.GetKey(i);
    string strValue = objResults.Headers.Get(i);
    lblResults.Text += strKey + "<br />" + strValue + "<br /><br />";
}
My results:
Cache-Control
private, max-age=0
Content-Type
text/html; charset=ISO-8859-1
Date
Tue, 05 Jun 2012 17:40:38 GMT
Expires
-1
Set-Cookie
PREF=ID=526197b0260fd361:FF=0:TM=1338918038:LM=1338918038:S=gefqgwkuzuPJlO3G; expires=Thu, 05-Jun-2014 17:40:38 GMT; path=/; domain=.google.com,NID=60=CJbpzMe6uTKf58ty7rysqUFTW6GnsQHZ-Uat_cFf1AuayffFtJoFQSIwT5oSQKqQp5PSIYoYtBf_8oSGh_Xsk1YtE7Z834Qwn0A4Sw3ruVCA9v3f_UDYH4b4fAloFJbW; expires=Wed, 05-Dec-2012 17:40:38 GMT; path=/; domain=.google.com; HttpOnly
P3P
CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657 for more info."
Server
gws
X-XSS-Protection
1; mode=block
X-Frame-Options
SAMEORIGIN
Transfer-Encoding
chunked
=========================
Fiddler results:
Result Protocol Host URL Body Caching Content-Type Process Comments Custom
1 304 HTTP www.rolandgarros.com /images/misc/weather/P8.gif 0 max-age=700 Expires: Tue, 05 Jun 2012 17:53:40 GMT image/gif firefox:5456
2 200 HTTP www.google.com / 23,697 private, max-age=0 Expires: -1 text/html; charset=UTF-8 chrome:2324
3 304 HTTP www.rolandgarros.com /images/misc/weather/P9.gif 0 max-age=700 Expires: Tue, 05 Jun 2012 17:53:57 GMT image/gif firefox:5456
4 200 HTTP Tunnel to translate.googleapis.com:443 0 chrome:2324
5 200 HTTP www.google.com
The difference is Fiddler is actually recording an entire session, not just a single HTTP request.
If a user loads Google.com, the response is typically an HTML document which contains images, script files, CSS files, etc. Your browser will then initiate a new HTTP request for each one of those resources. With Fiddler running, it tracks each of those HTTP requests and spits out the result code and other information about the session.
With your C# code above, you're only initiating a single HTTP request, thus you only have information about a single result.
You'd probably be better off writing a browser plugin. Otherwise, you'd have to parse the HTML response and load other resources from that document as well.
If you do need to do this with C# code, you could probably parse the document with the HTML Agility Pack and then look for other resources within the HTML to simulate a browser. There's also embedded browsers, such as Awesomium, that might be helpful.
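For instance, a rough sketch of the Html Agility Pack idea (assumes the HtmlAgilityPack package is installed; //img[@src] covers only images, so script and link tags would need similar handling):
var web = new HtmlAgilityPack.HtmlWeb();
HtmlAgilityPack.HtmlDocument doc = web.Load("http://www.google.com");

var imgNodes = doc.DocumentNode.SelectNodes("//img[@src]");
if (imgNodes != null)
{
    foreach (var img in imgNodes)
    {
        string src = img.GetAttributeValue("src", null);
        // Resolve src against the page URL, request it, and record the
        // status code / content type, similar to Fiddler's columns.
    }
}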
You are not asking for the same information that Fiddler is displaying. Fiddler shows the HTTP Status code, the host and URI and (it appears, from your example) the Content Length, Content Type and Cache status.
For many of these you will have to peek into the response headers.
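For example, a minimal sketch of pulling those Fiddler-style fields out of a single response, building on the code in the question:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.google.com");
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    int statusCode = (int)response.StatusCode;              // e.g. 200
    string contentType = response.ContentType;              // e.g. text/html; charset=ISO-8859-1
    long contentLength = response.ContentLength;            // -1 when the response is chunked
    string caching = response.Headers["Cache-Control"];     // e.g. private, max-age=0
    lblResults.Text += statusCode + " " + contentType + " " + contentLength + " " + caching + "<br />";
}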

C# - Connection: keep-alive Header is Not Being Sent During HttpWebRequest

I'm trying to send the following header with my HttpWebRequest:
Connection: keep-alive
However, the header is never sent. Fiddler2 shows that whenever I request the page in Google Chrome, the header is sent, but my application refuses to send this header for some reason.
I have set the KeepAlive property to true (it's true by default anyway), yet the header still does not get sent.
I am trying to send this header with multiple HttpWebRequests, but they all basically look like this:
HttpWebRequest logIn6 = (HttpWebRequest)WebRequest.Create(new Uri(responseFromLogIn5));
logIn6.CookieContainer = cookies;
logIn6.KeepAlive = true;
logIn6.Referer = "https://login.yahoo.com/config/login?.src=spt&.intl=us&.lang=en-US&.done=http://football.fantasysports.yahoo.com/";
logIn6.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1";
logIn6.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
logIn6.Headers.Add("Accept-Encoding:gzip,deflate,sdch");
logIn6.Headers.Add("Accept-Language:en-US,en;q=0.8");
logIn6.Headers.Add("Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3");
logIn6.AllowAutoRedirect = false;
HttpWebResponse logIn6Response = (HttpWebResponse)logIn6.GetResponse();
string responseFromLogIn6 = logIn6Response.GetResponseHeader("Location");
cookies.Add(logIn6Response.Cookies);
logIn6Response.Close();
Does anyone know what I have to do to make sure this header is sent?
Fiddler2 Raw From Chrome:
GET xxx HTTP/1.1
Host: accounts.google.com
Connection: keep-alive
Referer: https://login.yahoo.com/config/login?.src=spt&.intl=us&.lang=en-US&.done=http://football.fantasysports.yahoo.com/
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie: xxx
HTTP/1.1 302 Moved Temporarily
Set-Cookie: xxx
Set-Cookie: xxx
Location: xxx
Content-Type: text/html; charset=UTF-8
P3P: CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657 for more info."
Date: Sat, 17 Sep 2011 22:27:09 GMT
Expires: Sat, 17 Sep 2011 22:27:09 GMT
Cache-Control: private, max-age=0
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Content-Length: 2176
Server: GSE
Fiddler2 Raw From My Application:
GET xxx HTTP/1.1
Referer: https://login.yahoo.com/config/login?.src=spt&.intl=us&.lang=en-US&.done=http://football.fantasysports.yahoo.com/
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.220 Safari/535.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Host: accounts.google.com
HTTP/1.1 302 Moved Temporarily
Location: xxx
Content-Type: text/html; charset=UTF-8
Date: Sun, 18 Sep 2011 00:05:40 GMT
Expires: Sun, 18 Sep 2011 00:05:40 GMT
Cache-Control: private, max-age=0
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Content-Length: 573
Server: GSE
I'm trying to get the second Fiddler2 raw information to look like the first Fiddler2 raw information.
I've had the same issue: the Connection: Keep-Alive header is not sent except on the first request, and the server I accessed won't give me the correct response if it is missing. So, here are my workarounds for this issue:
The first is to set the ProtocolVersion property of the HttpWebRequest instance to HttpVersion.Version10. Apart from the request line becoming GET xxx HTTP/1.0, it works and uses only the public API.
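A quick sketch of that first workaround (someUrl is a placeholder, as in the snippet below):
var req = (HttpWebRequest)WebRequest.Create(someUrl);
req.KeepAlive = true;
req.ProtocolVersion = HttpVersion.Version10; // makes .NET emit an explicit Connection: Keep-Alive
req.GetResponse().Close();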
The second way uses reflection to modify the internal ServicePoint.HttpBehaviour property of the HttpWebRequest instance, like this:
var req = (HttpWebRequest)WebRequest.Create(someUrl);
var sp = req.ServicePoint;
// HttpBehaviour is internal, so reflect on it (requires using System.Reflection).
var prop = sp.GetType().GetProperty("HttpBehaviour",
    BindingFlags.Instance | BindingFlags.NonPublic);
prop.SetValue(sp, (byte)0, null);
req.GetResponse().Close();
Hope this helps.
I struggled with this problem for half a day! And dear old Fiddler (my guardian angel) was inadvertently part of the problem:
Whenever I tested my HTTP POSTs using Fiddler monitoring ON - the problem DIDN'T appear
Whenever I tested my HTTP POSTs with Fiddler monitoring OFF - the problem DID appear
My POSTs were sent with protocol 1.1, and the Keep-Alive header was ignored/redundant after the initial connection; i.e. I could see it in the header of the first POST (via Fiddler!), but not in subsequent POSTs despite using the same code. Hey ho ...
But the remote server would only respond if Keep-Alive was sent. Now I can't prove this, but I suspect that Fiddler monitoring the connection caused the remote server to believe the connection was still active (despite no Keep-Alives being sent after my first POST) and to respond correctly. As I said, the second I turned Fiddler off, the absence of Keep-Alives caused the remote server to time out on me.
I implemented the HTTP/1.0 solution described above and my POSTs now work, whether Fiddler is on or off. Hope this helps somebody else stuck somewhere ...
You're doing it right. The code should result in the following header being added:
Connection: Keep-Alive
Post the code that you use to send the request, and the raw output from Fiddler, if you don't see this header. You may also be able to ignore this, because HTTP/1.1 connections are keep-alive by default.
Update: it looks like .NET only sets Keep-Alive explicitly for the first (!) request. Further requests to the same host/URL will not have this header, presumably because the underlying TCP connection is already being reused.
After downloading the HttpWebRequest source code, I noticed that every property checks against some known headers in the header collection. To get around that, a bit of reflection on that collection makes it work:
var webRequest = (HttpWebRequest)WebRequest.Create(url);
// ChangeInternal bypasses the restricted-header check; name/value are the header
// name and value to set (here, presumably "Connection" and "keep-alive").
// Requires using System.Reflection.
webRequest.Headers.GetType().InvokeMember("ChangeInternal",
    BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.InvokeMethod,
    Type.DefaultBinder, webRequest.Headers, new object[] { name, value }
);
I know the answer to this as I had the same problem and managed to solve it by inheriting from WebClient and overriding its GetWebRequest method.
See the code below:
public class CookieAwareWebClient : WebClient
{
    public CookieContainer CookieContainer { get; set; }

    public CookieAwareWebClient()
        : this(new CookieContainer())
    { }

    public CookieAwareWebClient(CookieContainer c)
    {
        this.CookieContainer = c;
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        var castRequest = request as HttpWebRequest;
        if (castRequest != null)
        {
            castRequest.KeepAlive = true; // <-- this is what you want! The rest you don't need.
            castRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
            castRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.104 Safari/537.36";
            castRequest.Referer = "https://www.jobserve.com/gb/en/Candidate/Login.aspx?url=48BB4C724EA6A1F2CADF4243A0D73C13225717A29AE8DAD6913D";
            castRequest.Headers.Add("Accept-Encoding", "gzip,deflate,sdch");
            castRequest.Headers.Add("Accept-Language", "en-GB,en-US;q=0.8,en;q=0.6");
            castRequest.CookieContainer = this.CookieContainer;
        }
        return request;
    }
}
As you can see, I am not only enabling keep-alive but also utilizing cookies and other headers!
I hope that helps!
Kiran
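A quick usage sketch of the class above (the URLs are placeholders):
var client = new CookieAwareWebClient();
// KeepAlive is set inside GetWebRequest, so every call through this client
// sends Connection: Keep-Alive and shares the same cookie container.
string loginPage = client.DownloadString("https://example.com/login");
string accountPage = client.DownloadString("https://example.com/account");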
