Is it possible to set Max-Age for cookies in a Web Forms application? I know that it's possible to set Expires, but is there a way to set Max-Age?
ASP.NET doesn't provide this property on HttpCookie, probably because the framework is very Microsoft-centric and IE doesn't support max-age (at least, as of IE11).
However, you can still do it. Here's some code demonstrating an approach that doesn't work and one that does for setting a cookie with max-age:
// doesn't work:
var mytestcookie = new HttpCookie("regular_httpcookie", "01");
mytestcookie.Values.Add("max-age", "300");
Response.Cookies.Add(mytestcookie);
// *does* work:
Response.Headers.Add("set-cookie", "testingmaxage=01;max-age=300; path=/");
And it renders like this in the HTTP response:
Set-Cookie testingmaxage=01;max-age=300; path=/
X-AspNet-Version 4.0.30319
Set-Cookie regular_httpcookie=01&max-age=300; expires=Fri, 10-Jun-2016 15:02:15 GMT; path=/
As you can see above, if you are also setting cookies using HttpCookie, this will create a second Set-Cookie header on the response, but the browser won't mind; it will just add it to the list of cookies.
I tested on IE11 and Chrome and this is a non-issue - the cookies will all go in, as long as they have differing names. If the cookie name conflicts with one already set via HttpCookie, the last one in wins. Check the text of your HTTP response to see which one goes in last. (Best to simply make sure they don't conflict, though.)
As I mentioned at the beginning, when testing on IE11, I noted that it's ignoring the max-age property of the cookie. Here's a link to a way to settle that issue:
Set-Cookie: Expire property, clock skew and Internet Explorer issue
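If you want a single header that covers both camps, one option is to send max-age and expires together: browsers that understand max-age give it precedence, and IE falls back to expires. A rough sketch, meant to sit in the same place as the Response.Headers.Add call above (the cookie name and the five-minute lifetime are just the example values from this answer):
// Sketch: one Set-Cookie header carrying both max-age and expires.
int maxAgeSeconds = 300;
string expires = DateTime.UtcNow.AddSeconds(maxAgeSeconds).ToString("r"); // RFC1123, e.g. Fri, 10 Jun 2016 15:02:15 GMT
Response.Headers.Add(
    "set-cookie",
    string.Format("testingmaxage=01; max-age={0}; expires={1}; path=/", maxAgeSeconds, expires));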
Related
I'm using HttpClient 0.6.0 from NuGet.
I have the following C# code:
var client = new HttpClient(new WebRequestHandler() {
CachePolicy =
new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable)
});
client.GetAsync("http://myservice/asdf");
The service (this time CouchDB) returns an ETag value and status code 200 OK. The response also includes a Cache-Control header with the value must-revalidate.
Update, here are the response headers from couchdb (taken from the visual studio debugger):
Server: CouchDB/1.1.1 (Erlang OTP/R14B04)
Etag: "1-27964df653cea4316d0acbab10fd9c04"
Date: Fri, 09 Dec 2011 11:56:07 GMT
Cache-Control: must-revalidate
Next time I do the exact same request, HttpClient does a conditional request and gets back 304 Not Modified. Which is right.
However, if I use the low-level HttpWebRequest class with the same CachePolicy, the request isn't even made the second time. This is how I would want HttpClient to behave as well.
Is it the must-revalidate header value, or why is HttpClient behaving differently? I would like to make only one request and then serve the rest from cache without the conditional request.
(Also, as a side-note, when debugging, the Response status code is shown as 200 OK, even though the service returns 304 Not Modified)
Both clients behave correctly.
must-revalidate only applies to stale responses.
When the must-revalidate directive is present in a response received by a cache, that cache MUST NOT use the entry after it becomes stale to respond to a
subsequent request without first revalidating it with the origin server. (I.e., the cache MUST do an end-to-end revalidation every time, if, based solely on the origin server's Expires or max-age value, the cached response is stale.)
Since you do not provide explicit expiration, caches are allowed to use heuristics to determine freshness.
Since you do not provide Last-Modified, caches do not need to warn the client that heuristics were used.
If none of Expires, Cache-Control: max-age, or Cache-Control: s-maxage (see section 14.9.3) appears in the response, and the response does not include other restrictions on caching, the cache MAY compute a freshness lifetime using a heuristic. The cache MUST attach Warning 113 to any response whose age is more than 24 hours if such warning has not already been added.
The response age is calculated based on Date header since Age is not present.
If the response is still fresh according to heuristic expiration, caches may use the stored response.
One explanation is that HttpWebRequest uses heuristics and that there was a stored response with status code 200 that was still fresh.
Answering my own question..
According to http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.4 I would say that
a "Cache-Control: must-revalidate" without expiration states that the resource should be validated on every request.
In this case it means a conditional GET should be done every time the resource is requested. So in this case System.Net.Http.HttpClient is behaving correctly and the legacy (Http)WebRequest is behaving incorrectly.
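For completeness, if you ever want HttpWebRequest to do the conditional GET on every call (the behavior HttpClient shows here), the Revalidate cache level asks the WinINet cache to check with the server each time. A hedged sketch, reusing the placeholder URL from the question:
using System;
using System.Net;
using System.Net.Cache;

class RevalidateExample
{
    static void Main()
    {
        // Revalidate asks the cache to check with the server on every request
        // instead of trusting heuristic freshness.
        var request = (HttpWebRequest)WebRequest.Create("http://myservice/asdf");
        request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.Revalidate);

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // Note: this still reports 200 even when the wire answer was a 304
            // and the body came from the cache.
            Console.WriteLine(response.StatusCode);
        }
    }
}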
I'm doing some close-to-the-metal HTTP work with OWIN.
I have an OWIN middleware that outputs JavaScript. It looks like this (relevant parts):
public override Task Invoke(IOwinContext context)
{
var response = context.Response;
response.ContentType = "application/javascript";
response.StatusCode = 200;
if (ClientCached(context.Request, scriptBuildDate))
{
response.StatusCode = 304;
response.Headers["Content-Length"] = "0";
response.Body.Close();
response.Body = Stream.Null;
return Task.FromResult<Object>(null);
}
response.Headers["Last-Modified"] = scriptBuildDate.ToUniversalTime().ToString("r");
return response.WriteAsync(js);
}
private bool ClientCached(IOwinRequest request, DateTime contentModified)
{
string header = request.Headers["If-Modified-Since"];
if (header != null)
{
DateTime isModifiedSince;
if (DateTime.TryParse(header, out isModifiedSince))
{
return isModifiedSince >= contentModified;
}
}
return false;
}
It will output 200 if it's not client-cached and add a Last-Modified date to the header; if it's client-cached it will output 304 Not Modified.
The problem is that the client will not call the URL again unless they do a hard F5 in the browser. My understanding of Last-Modified caching is that the client should call each time to check whether the content has been modified?
Update:
Cache-Control: must-revalidate
Chrome
F5 and Ctrl+F5 will call the server; opening the site in a new tab or restarting the browser will call the server; typing the address in the same tab will not call the server. If-Modified-Since is only cleared when doing Ctrl+F5, which means it can be used to return 304 correctly when the content is not modified.
IE10
F5 and Ctrl+F5 will call the server; opening the site in a new tab will not call the server; typing the address in the same tab will not call the server. If-Modified-Since is cleared when doing Ctrl+F5 OR when restarting the browser.
Cache-Control: no-cache and Pragma: no-cache
Chrome
Will call the server for every action. If-Modified-Since is only cleared when doing Ctrl+F5.
IE10
Will call the server for every action. If-Modified-Since is cleared both when restarting the browser and on Ctrl+F5.
Conclusion
Looks like no-cache might be better if you want to be sure the client calls back to check for a 304 each time.
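For reference, the no-cache variant above was produced with roughly the following lines inside the Invoke method from the question, before the body is written (a sketch, not the exact code):
// Inside Invoke, before WriteAsync (sketch):
response.Headers["Cache-Control"] = "no-cache";
response.Headers["Pragma"] = "no-cache";
response.Headers["Last-Modified"] = scriptBuildDate.ToUniversalTime().ToString("r");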
From the HTTP/1.1 spec (RFC2616, my emphasis):
13.2.2 Heuristic Expiration
Since origin servers do not always provide explicit expiration times,
HTTP caches typically assign heuristic expiration times, employing
algorithms that use other header values (such as the Last-Modified
time) to estimate a plausible expiration time. The HTTP/1.1
specification does not provide specific algorithms, but does impose
worst-case constraints on their results. Since heuristic expiration
times might compromise semantic transparency, they ought to used
cautiously, and we encourage origin servers to provide explicit
expiration times as much as possible.
Providing a Last-Modified header is not equivalent to asking user agents to check for updates every time they need a resource from your server.
Ideally, you should add an Expires header whenever possible. However, adding the header Cache-Control: must-revalidate should help.
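Applied to the middleware in the question, that suggestion would look roughly like this; the one-hour lifetime is an arbitrary example, not a recommendation:
// Sketch: explicit expiration plus must-revalidate, set alongside Last-Modified
// in the Invoke method before WriteAsync. The one-hour window is illustrative.
response.Headers["Last-Modified"] = scriptBuildDate.ToUniversalTime().ToString("r");
response.Headers["Expires"] = DateTime.UtcNow.AddHours(1).ToString("r");
response.Headers["Cache-Control"] = "must-revalidate";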
I've recently run into some problems with the CookieContainer. Either I'm doing something seriously wrong or there is some kind of bug w/ the CookieContainer object. It doesn't seem to update the cookie collection with certain Set-Cookie headers.
This might be a lengthy post and I apologize, but I want to be as thorough as possible, so I'm going to list my HTTP sniffing logs as well as my actual implementation code.
public bool SendRequest(HttpWebRequest request, IDictionary<string, string> data, int retries)
{
// copy request in case request instance already failed
HttpWebRequest newRequest = (HttpWebRequest)HttpWebRequest.Create(request.RequestUri);
newRequest.Method = request.Method;
// if POST data was provided, write it to the stream
if (data != null && data.Count != 0)
{
StreamWriter writer = new StreamWriter(newRequest.GetRequestStream());
writer.Write(createPostString(data));
writer.Close();
}
// set request with global cookie container
newRequest.CookieContainer = this.cookieJar;
try
{
using (HttpWebResponse resp = (HttpWebResponse)newRequest.GetResponse())
{
//CookieCollection newCooks = getCookies(resp.Headers);
//updateCookies(newCooks);
this.cookieJar = newRequest.CookieContainer;
this.Html = getResponseString(resp);
/* remainder snipped */
So that's the code; here are two request/response pairs I sniffed in Fiddler:
Request 1
POST /login/ HTTP/1.1
Host: www.site.com
Content-Length: 47
Expect: 100-continue
Connection: Keep-Alive
Response 1
HTTP/1.1 200 OK
Date: Wed, 02 Dec 2009 17:03:35 GMT
Server: Apache
Set-Cookie: tcc=one; path=/
Set-Cookie: cust_id=2702585226; domain=.site.com; path=/; expires=Mon, 01-Jan-2011 00:00:00 GMT
Set-Cookie: cust_session=12%2F2%2F2009%20%2012%3A3%3A35; domain=.site.com; path=/; expires=Wed 2-Dec-2009 17:33:35
Set-Cookie: refer_id_persistent=0000; domain=.site.com; path=/; expires=Fri 2-Dec-2011 17:3:35
Set-Cookie: refer_id=0000; domain=.site.com; path=/
Set-Cookie: private_browsing_mode=off; domain=.site.com; path=/; expires=Fri, 01-Jan-2010 17:03:35 GMT
Set-Cookie: member_session=UmFuZG9tSVYL%5BS%5D%5BP%5DfhH77bYaVoS9j9Yd8ySRkyHHz%5BS%5Dk0S8MVsQ6AyraNlcdcCRC0RkB%5BP%5DfBYVM4vn6JQ3HlJxT3GlJi1RZiMGQaITg7HN9dpu9oRbZgMjhJlXXa%5BP%5D7pFSjqDIZWRr3LAfnhh3btv4E3rvVH42CeOP%5BS%5Dx6kDyvrokQEHyIHPGi7zswZbuHrUdx2XKEKKJzw1unDWfw0LZWjoehAs0QgSOz6Nzp8P4Hp8hqrULdIMch6acPT%5BS%5DbKV8zwugBIcjr5dI3rVR%5BP%5Dv42rsTtQB7dyb%5BP%5DRKb8Y83cGqhHM33hP%5BP%5DUtmbDC1PPfr%5BS%5DPC23lAO%5BS%5DmQ3mOy9x4pgQSOfp40XSfzgVg3EavITaxHBeI5nO3%5BP%5D%5BS%5D2rSDthDfuEm4sT9i6UF3sYd1vlOL0IC9ZsVatV1yhhpQ%5BE%5D%5BE%5D; domain=.site.com; path=/; expires=Fri, 01-Jan-2010 17:03:35 GMT
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=UTF-8
Request 2
GET /test?search=jjkjf HTTP/1.1
Host: www.site.com
Cookie: tcc=one; cust_id=2702585226; private_browsing_mode=off; member_session=UmFuZG9tSVYL%5BS%5D%5BP%5DfhH77bYaVoS9j9Yd8ySRkyHHz%5BS%5Dk0S8MVsQ6AyraNlcdcCRC0RkB%5BP%5DfBYVM4vn6JQ3HlJxT3GlJi1RZiMGQaITg7HN9dpu9oRbZgMjhJlXXa%5BP%5D7pFSjqDIZWRr3LAfnhh3btv4E3rvVH42CeOP%5BS%5Dx6kDyvrokQEHyIHPGi7zswZbuHrUdx2XKEKKJzw1unDWfw0LZWjoehAs0QgSOz6Nzp8P4Hp8hqrULdIMch6acPT%5BS%5DbKV8zwugBIcjr5dI3rVR%5BP%5Dv42rsTtQB7dyb%5BP%5DRKb8Y83cGqhHM33hP%5BP%5DUtmbDC1PPfr%5BS%5DPC23lAO%5BS%5DmQ3mOy9x4pgQSOfp40XSfzgVg3EavITaxHBeI5nO3%5BP%5D%5BS%5D2rSDthDfuEm4sT9i6UF3sYd1vlOL0IC9ZsVatV1yhhpQ%5BE%5D%5BE%5D
So as you can see, the CookieContainer (this.cookieJar) which is used for every request is not picking up the Set-Cookie header for refer_id, cust_session, refer_id_persistent. However it does pick up cust_id, private_browsing_mode, tcc, and member_session... Any ideas why this might be?
Just wanted to update this post in case someone else comes across this. The issue is that .NET complies with the RFC specification for cookie headers, but not all sites do. So, ultimately, the issue is not Microsoft, or .NET for that matter. (Although IE manages the cookies fine, so it would be better if the .NET cookie parsing methods were rewritten to use the same parsing logic.) The issue is the sites that do not follow the RFC specifications.
Nonetheless, an issue I've often encountered is that sites will use commas in the expiration dates in their cookies. .NET interprets these as separators between different cookie fields and strips the ending and everything thereafter off of the cookie.
RFC spec: "Cookie:, followed by a comma-separated list of one or more cookies." An easy solution to this problem would be for the web server to enclose values with commas in quotation marks, per the RFC document. However, there is no RFC police, so we can only hope that people follow the rules.
MSDN SetCookies:
SetCookies pulls all the HTTP cookies out of the HTTP cookie header, builds a Cookie for each one, and then adds each Cookie to the internal CookieCollection that is associated with the URI. The HTTP cookies in the cookieHeader string must be delimited by commas.
MSDN GetCookieHeader
GetCookieHeader returns a string that holds the HTTP cookie header for the Cookie instances specified by uri. The HTTP header is built by adding a string representation of each Cookie associated with uri. Note that the exact format of the string depends on the RFC that the Cookie conforms to. The strings for all the Cookie instances that are associated with uri are combined and delimited by semicolons.
This string is not in the correct format for use as the second parameter of the SetCookies method.
This is only from a quick scan through your code, but it seems that you are sending POST data before you attach the cookies to the request.
if (data != null && data.Count != 0)
{
StreamWriter writer = new StreamWriter(newRequest.GetRequestStream());
writer.Write(createPostString(data));
writer.Close();
}
// set request with global cookie container
newRequest.CookieContainer = this.cookieJar;
What might be happening is that when you write your POST data to the stream, it is sent to the remote server. However, for cookies to be set they must be sent to the server before any POST data. The simple solution is to swap this around like so:
// set request with global cookie container
newRequest.CookieContainer = this.cookieJar;
if (data != null && data.Count != 0)
{
StreamWriter writer = new StreamWriter(newRequest.GetRequestStream());
writer.Write(createPostString(data));
writer.Close();
}
CookieContainer has 2 major issues that I have come across, whether it is by design or bug I don't know.
1) Cookies set on a 302 post are not picked up.
Example
Post to site
302 redirect response
Load New page which sets cookie
Solution
Set autoredirect to false and manually follow the redirects and set the cookies yourself
2) .NET is VERY fussy about incorrectly formed cookie strings that have a comma in them. This is actually correct behavior, but occasionally cookies have an expiration date that includes a comma, which stops all cookies from being set.
Solution
Manually parse the cookie strings and add them yourself (a simplified sketch follows below). A horrible task. I have a sprawling mess of a hack function, all loops and ifs, but the end result is it works for all cases I have thrown at it so far. It isn't pretty but it gets the job done.
Not sure the above is your issue, but maybe. If not, some food for thought anyway.
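The manual-parsing idea boils down to something like the sketch below (simplified, not the full hack function): split the raw Set-Cookie header only on commas that begin a new name=value pair, so the comma inside an expires date survives, and then build Cookie objects yourself instead of going through CookieContainer.SetCookies. The regex and the attribute handling are simplified, so treat it as a starting point, not a complete parser:
using System;
using System.Net;
using System.Text.RegularExpressions;

static class CookieHeaderHelper
{
    // Sketch: only a comma followed by something that looks like "name=" counts
    // as a separator between cookies. Attributes other than name/value are
    // dropped and the cookie is scoped to the request host.
    public static void AddSetCookieHeader(CookieContainer jar, Uri uri, string setCookieHeader)
    {
        foreach (string one in Regex.Split(setCookieHeader, @",(?=\s*[^=;,\s]+=)"))
        {
            string[] nameValue = one.Split(';')[0].Split(new[] { '=' }, 2);
            if (nameValue.Length != 2)
                continue;

            jar.Add(new Cookie(nameValue[0].Trim(), nameValue[1].Trim())
            {
                Domain = uri.Host,
                Path = "/"
            });
        }
    }
}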
My solution: replace " UTC" with " GMT".
Try using CookieContainer.GetCookieHeader and CookieContainer.SetCookies
YourCookieContainer.GetCookieHeader(new Uri("your url"));
YourCookieContainer.SetCookies(new Uri("your url"), "string from GetCookieHeader");
I'm creating a client to visit a website and log in + do some tasks automatically, however they recently updated their cookies to (for whatever reason...) contain a comma inside their identification cookie.
So for example, the Cookie will have a value similar to this:
a,bcdefghijklmnop
The problem is, according to MSDN you can't use a comma nor a period inside a cookie's value. What I'm looking for is a way around this limitation, some way to make .NET's Cookie work nicely with the commas. I've found that the server does send a 'SET-COOKIE' header to the client, and I'm guessing that's what is being parsed, but that also seems to give special meaning to commas and semicolons as well (hence the limitation in the class inside .NET itself).
But then how does a browser such as IE, Firefox, etc... handle the cookie properly (as they clearly do, since the website works fine in any browsers I've tested it with.) Is there maybe a way to force this behaviour in .NET?
Any help would be appreciated, thanks.
--- EDIT ---
some additional information:
My code looks something like this:
request = (HttpWebRequest)WebRequest.Create(URI);
request.CookieContainer = Program.client.cookieJar;
Where cookieJar is defined in Program.client as:
CookieContainer cookieJar = new CookieContainer();
When I loop through and print out all the cookies in the CookieContainer, I get something like this: (cookies, in the format: "name" -> "value")
"normal_cookie" -> "i am the value"
"messedup_cookie" -> "a"
"bcdefghijklmnop" -> ""
// What I should get is this:
"normal_cookie" -> "i am the value"
"messedup_cookie" -> "a,bcdefghijklmnop"
The core of the problem seems to be that commas and semicolons are reserved characters in the SET-COOKIE header string... but then how do browsers handle this? I can probably parse the string myself, but I don't know how to get around situations such as this one: (HTTP header, in the format: "name" -> "value")
"Set-Cookie" -> "messedup_cookie=a,bcdefghijklmnop; path=/; domain=.domain.com; expires=Sat, 15-Aug-2009 09:14:24 GMT,anothervariable=i am the value;"
As you can see, the expires section looks for a comma instead of a semicolon to differentiate itself from the next variable. As far as I can tell, it's in the format:
cookie1_var1=value; cookie1_var2=value,cookie2_var1=value; cookie2_var2=value
But if that's true, is there an elegant way to deal with commas that may occur inside one of the values?
According to the following article, you should consider UrlEncode and UrlDecode for storing values in cookies.
private void SetCookie()
{
HttpCookie cookie = new HttpCookie("cookiename");
cookie.Expires = DateTime.Now.AddMonths(24);
cookie.Values.Add("name", Server.UrlEncode(txtName.Text));
Response.Cookies.Add(cookie);
}
private void GetCookie()
{
HttpCookie cookie = Request.Cookies["cookiename"];
if (cookie != null)
{
txtName.Text = Server.UrlDecode(cookie.Values["name"]);
}
}
Don't ever HTML encode a cookie in ASP.NET! It results in a yellow screen of death and an exception stating that the cookie contains dangerous characters.
MSDN also has an article on the subject.
ASP.NET does not encode or unencode cookies in UrlEncode format by default. As a result, you may encounter unexpected behavior in ASP.NET applications.
To handle the cookie, since a cookie is essentially a string, you may like to try URL Encoding the cookie value before setting it, and then Decoding it when you pull out the value.
i.e.:
Response.Cookies["Value1"].Value = Server.UrlEncode(sCookieValue);
and similarly:
string sCookieValue = Server.UrlDecode(Request.Cookies["Value1"].Value);
While writing a custom IHttpHandler I came across a behavior that I didn't expect concerning the HttpCachePolicy object.
My handler calculates and sets an entity-tag (using the SetETag method on the HttpCachePolicy associated with the current response object). If I set the cache-control to public using the SetCacheability method everything works like a charm and the server sends along the e-tag header. If I set it to private the e-tag header will be suppressed.
Maybe I just haven't looked hard enough but I haven't seen anything in the HTTP/1.1 spec that would justify this behavior. Why wouldn't you want to send E-Tag to browsers while still prohibiting proxies from storing the data?
using System;
using System.Web;
public class Handler : IHttpHandler {
public void ProcessRequest (HttpContext ctx) {
ctx.Response.Cache.SetCacheability(HttpCacheability.Private);
ctx.Response.Cache.SetETag("\"static\"");
ctx.Response.ContentType = "text/plain";
ctx.Response.Write("Hello World");
}
public bool IsReusable { get { return true; } }
}
Will return
Cache-Control: private
Content-Type: text/plain; charset=utf-8
Content-Length: 11
But if we change it to public it'll return
Cache-Control: public
Content-Type: text/plain; charset=utf-8
Content-Length: 11
Etag: "static"
I've run this on the ASP.NET development server and IIS6 so far with the same results. Also I'm unable to explicitly set the ETag using
Response.AppendHeader("ETag", "static")
Update: It's possible to append the ETag header manually when running in IIS7, I suspect this is caused by the tight integration between ASP.NET and the IIS7 pipeline.
Clarification: It's a long question but the core question is this: why does ASP.NET do this, how can I get around it and should I?
Update: I'm going to accept Tony's answer since it's essentially correct (go Tony!). I found that if you want to emulate HttpCacheability.Private fully, you can set the cacheability to ServerAndPrivate, but you also have to call cache.SetOmitVaryStar(true), otherwise the cache will add the Vary: * header to the output and you don't want that. I'll edit that into the answer when I get edit permissions (or if you see this Tony, perhaps you could edit your answer to include that call?)
I think you need to use HttpCacheability.ServerAndPrivate
That should give you cache-control: private in the headers and let you set an ETag.
The documentation on that needs to be a bit better.
Edit: Markus found that you also have to call cache.SetOmitVaryStar(true), otherwise the cache will add the Vary: * header to the output and you don't want that.
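Putting this answer and Markus's update together, the handler from the question would become something like this (sketch):
using System.Web;

public class Handler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        // ServerAndPrivate still emits Cache-Control: private but lets the ETag
        // through; SetOmitVaryStar(true) keeps ASP.NET from adding Vary: *.
        ctx.Response.Cache.SetCacheability(HttpCacheability.ServerAndPrivate);
        ctx.Response.Cache.SetOmitVaryStar(true);
        ctx.Response.Cache.SetETag("\"static\"");
        ctx.Response.ContentType = "text/plain";
        ctx.Response.Write("Hello World");
    }

    public bool IsReusable { get { return true; } }
}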
Unfortunately if you look at System.Web.HttpCachePolicy.UpdateCachedHeaders() in .NET Reflector you see that there's an if statement specifically checking that the Cacheability is not Private before doing any ETag stuff. In any case, I've always found that Last-Modified/If-Modified-Since works well for our data and is a bit easier to monitor in Fiddler anyway.
If like me you're unhappy with the workaround mentioned here of using Cacheability.ServerAndPrivate, and you really want to use Private instead - perhaps because you are customising pages individually for users and it makes no sense to cache on the server - then at least in .NET 3.5 you can set ETag through Response.Headers.Add and this works fine.
N.B. if you do this you have to implement the comparison of the client headers yourself and the HTTP 304 response handling - not sure if .NET takes care of this for you under normal circumstances.
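To make that concrete, here is a sketch of doing the comparison yourself in the handler from the question. It assumes the IIS7 integrated pipeline (Response.Headers is not writable under IIS6/classic mode), and the tag value is just the static one from the original example:
using System.Web;

public class PrivateETagHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        const string etag = "\"static\"";

        ctx.Response.Cache.SetCacheability(HttpCacheability.Private);
        // Write the header directly; requires the IIS7 integrated pipeline.
        ctx.Response.Headers.Add("ETag", etag);

        // Manual If-None-Match handling: answer with 304 and no body when the
        // client already has this version.
        if (ctx.Request.Headers["If-None-Match"] == etag)
        {
            ctx.Response.StatusCode = 304;
            ctx.Response.SuppressContent = true;
            return;
        }

        ctx.Response.ContentType = "text/plain";
        ctx.Response.Write("Hello World");
    }

    public bool IsReusable { get { return true; } }
}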