I'm using Windows Auth in an intranet setting to cache information pulled from Active Directory. The purpose of the cache is to speed up the page, as reading from AD is not particularly fast and doesn't need to be done every single time (the data doesn't change all that often).
To do this, I'm setting a custom key in HttpContext.Application.
This is the code located in Global.asax to handle VaryByCustom:
public override string GetVaryByCustomString(HttpContext context, string arg)
{
    // Null-safe trace: on the very first request the key may not have been set yet.
    System.Diagnostics.Debug.Print("GetVaryByCustomString : " + (context.Application["BrowsingSession_Key"] ?? "(null)"));
    if (arg == "BrowsingSession_Key")
    {
        object o = context.Application["BrowsingSession_Key"];
        if (o == null)
        {
            o = Guid.NewGuid();
            context.Application["BrowsingSession_Key"] = o;
        }
        return o.ToString();
    }
    return base.GetVaryByCustomString(context, arg);
}
In my BaseController (Inherited by all my controllers):
protected override void Initialize(RequestContext requestContext)
{
    base.Initialize(requestContext);

    // Custom cache initiation variable
    if (HttpContext.Application["BrowsingSession_Key"] == null)
    {
        HttpContext.Application["BrowsingSession_Key"] = Guid.NewGuid();
        System.Diagnostics.Debug.Print("BaseController.Initialize : " + HttpContext.Application["BrowsingSession_Key"].ToString());
    }
}
And finally, in my action method inside a controller:
[OutputCache(Duration = 300, VaryByCustom = "BrowsingSession_Key", Location = OutputCacheLocation.Server)]
public ActionResult Index(HomeViewModel model)
{
    //...
    return View("index", model);
}
The issue is simple: the first person to view the page has their info cached, and the Guid for BrowsingSession_Key is set.
However, the next user to visit the page within the 5-minute window receives the previous user's cached content.
As you can see, I'm attempting to give each user a unique BrowsingSession_Key so that they get their own cached content.
I'm using VaryByCustom so that I can quickly invalidate the cache by assigning a new BrowsingSession_Key (Guid) to that user and pull a non-cached copy of the page for them.
Can you see what's going wrong here?
From my testing, Initialize and GetVaryByCustomString are both called in the places you'd expect. However, I can't debug as multiple users, so I can't see why they're all getting the same Guid and the same output cache.
As it turns out, Application-level variables are not a good place to store per-user information, even temporarily.
In the end, I swapped over to using a cookie with a stored GUID, and I invalidate the cookie when needed (reset the GUID, or delete and re-create it). This meant multiple users were then able to use the site, with GetVaryByCustomString handling the information stored in the cookie instead.
This works for me, but I need to take into account the possibility of users swapping cookies, so I will look further into encryption options.
For now, the answer is: don't use Application-level variables this way; they're not suited to such tasks.
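A minimal sketch of the cookie-based GetVaryByCustomString (illustrative only - the cookie name and HttpOnly flag here are assumptions, not my exact code):

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    if (arg == "BrowsingSession_Key")
    {
        // The per-user key now lives in a cookie instead of Application state.
        HttpCookie cookie = context.Request.Cookies["BrowsingSession_Key"];
        if (cookie == null || string.IsNullOrEmpty(cookie.Value))
        {
            // No key yet - issue a new one; subsequent requests vary on it.
            cookie = new HttpCookie("BrowsingSession_Key", Guid.NewGuid().ToString()) { HttpOnly = true };
            context.Response.Cookies.Add(cookie);
        }
        return cookie.Value;
    }
    return base.GetVaryByCustomString(context, arg);
}

Invalidating a user's cached copy is then just a matter of resetting that cookie to a new Guid.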
Related
I have a class that keeps track of Property Changes
public class Property
{
    object _OriginalValue;
    object _ProposedValue;
    DateTime _ProposedDateTime;
    List<Property> _History = new List<Property>();

    public object OriginalValue
    {
        get { return _OriginalValue; }
        set { _OriginalValue = value; }
    }

    public object ProposedValue
    {
        get { return _ProposedValue; }
        set
        {
            _ProposedDateTime = DateTime.Now;
            _ProposedValue = value;
        }
    }

    public bool IsDirty
    {
        get
        {
            // Equals compares the boxed values rather than object references.
            return !Equals(OriginalValue, ProposedValue);
        }
    }
}
This Property class can then be used by classes like:
public class Customer
{
    protected Property _FirstName = new Property();

    public string FirstName
    {
        get { return (string)_FirstName.ProposedValue; }
        set { _FirstName.ProposedValue = value; }
    }

    public object GetOriginalValue(Property property)
    {
        return property.OriginalValue;
    }
}
The question is, is there a way to secure the original value when passing this to a client in an N-Tier architecture?
When a client passes a Customer back into the Service Boundary - by default you can't trust the client. You need to either reload the original values from the database or validate that the original values are untampered. Of course I'm assuming we're going to use business logic based on the current values in the customer to reject or allow an update operation.
Example:
User inserts record with Name Bob.
User fetches record with Name Bob and changes name to Ted. Original Value is Bob, proposed Value is Ted.
User sends Customer to Service to Update Customer.
Everything is good.
A business rule is now coded into the service that says: if the customer's name is Ted, allow the update; otherwise throw an "unable to update" exception.
User fetches record with name Ted.
User changes name to Darren.
User changes name back to Ted - system throws exception.
User fetches Ted. User cheats and uses a tool to change the OriginalPropertyValue on the client.
The server doesn't refetch the OriginalValue from the database and simply reads the OriginalValue coming from the client.
User bypasses business rule.
Actually, there are more issues with your approach than just checking whether the original value has been tampered with. For example, I suspect this is a multi-user environment where more than one user can edit the same object. That is, the original value might not have been tampered with, yet it may have changed because another user already saved a new value to the database.
I guess you're already applying some kind of optimistic or pessimistic locking on your data...
About your actual concern: you probably need to sign your original value, and whenever you're going to store those objects back in the database, your application layer should check that the original value hasn't been tampered with (from Wikipedia):
Digital signatures are a standard element of most cryptographic protocol suites, and are commonly used for software distribution, financial transactions, contract management software, and in other cases where it is important to detect forgery or tampering.
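As a rough sketch (my own illustration, not part of the quoted material), the service could compute an HMAC over the original value with a server-side secret before handing the object to the client, and verify it when the object comes back. An HMAC is a keyed hash rather than a full public-key signature, but it is enough here because only the server needs to verify:

using System;
using System.Security.Cryptography;
using System.Text;

public static class OriginalValueSigner
{
    // Server-side secret; in practice load it from protected configuration and never send it to the client.
    static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-with-a-real-secret");

    public static string Sign(string originalValue)
    {
        using (var hmac = new HMACSHA256(Key))
        {
            return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(originalValue)));
        }
    }

    public static bool Verify(string originalValue, string signature)
    {
        // Recompute and compare; if the client changed OriginalValue, the signatures won't match.
        return Sign(originalValue) == signature;
    }
}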
Following is my ASP.NET Web API controller code. As you can see, there's a private class-level object, BL, that is used by both Get methods. For the first method, FetchAllDashboardByUserId(int userId), I pass the user id so that the BL object can be initialized. Within the same browser session, if the second Get method is called, I do not want to pass the user id again, since BL should already be initialized, but currently that's not the case: BL is null for the second method, so I have to add the user id to that call as well - GetCardDataUI(int userId, int dashBoardID, int cardID). My question is how to avoid this. Is my thinking incorrect that consecutive calls to the following URLs from a single open browser belong to a single session?
webapi/ViewR?userId=1
webapi/ViewR?userId=1&dashBoardID=1&cardID=3
I don't want to pass the userId in the second URL. Note that if I declare the class object as static it works as expected, but that's not what I want; it has to be tied to a user:
public class ViewRController : ApiController
{
    // BL object for a user
    private static BL accessBL = null;

    // HTTP GET for Webapi/ViewR (Webapi - name of API, ViewR - controller with implementation)
    [AcceptVerbs("Get")]
    public List<DashboardUI> FetchAllDashboardByUserId(int userId)
    {
        if (accessBL == null)
            accessBL = new BL(userId);
        // Use BL object for entity processing
    }

    [AcceptVerbs("Get")]
    public CardDataGetUI GetCardDataUI(int userId, int dashBoardID, int cardID)
    {
        if (accessBL == null)
            accessBL = new BL(userId);
        // Use BL object for entity processing
    }
}
How I want the second method implementation to be:
[AcceptVerbs("Get")]
public CardDataGetUI GetCardDataUI(int dashBoardID, int cardID)
{
    // Use BL class object created in last call for entity processing
    // Should not pass userid again
}
You can easily store data in Session:
// ... first request:
Session["userID"] = userID;

// ... next request:
int userID = (int)Session["userID"]; // should check for null first, but you get the idea...
But keep the following points in mind:
Session variables are stored as objects, so you'll need to cast and/or type-check
Session variables can be null
Session expires after a configurable (in web.config) amount of time - see the snippet after this list
Default session state is in-memory, meaning that if the app pool is restarted, session state is gone - you can store session state in files or a database to keep it longer
Session doesn't scale out unless you use persistent storage (file, database)
Objects stored in persistent storage must be serializable
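For example, the timeout and storage mode are set in web.config; the values below are only illustrative:

<system.web>
  <!-- mode can be InProc, StateServer, SQLServer or Custom; timeout is in minutes -->
  <sessionState mode="InProc" timeout="20" />
</system.web>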
I have an API method that gets a lot of data at application start-up.
90% of the data is relevant for all application users, and the other 10% needs to change based on the user id, environment and app version.
To avoid calling the first method every time a user connects and making start-up slower, I added the AspNetCacheProfile attribute.
But in this situation I can't use the user id, environment and version in that method, because it holds the data of the first user who called it.
So I added a new method (the second one), set the AspNetCacheProfile attribute on it, and called it from the first one.
This way I can cache the general data and update the other 10% on each call.
I just wanted to know... will it work?
I'm not sure, because the second method is not called directly; it is called from the first one.
[WebGet(UriTemplate = "/GetData")]
public APIResponse GetData()
{
    try
    {
        // Fetch the (hopefully cached) general data
        var data = GetMyCachedResponse();
        foreach (var info in data.info)
        {
            if (Something)
            {
                // Update some specific values inside
            }
        }
        AppState.CurrentResponse.Data = data;
    }
    catch
    {
        // error handling omitted
    }
    return AppState.CurrentResponse;
}

[AspNetCacheProfile("OneMinuteCaching")]
private MyCachedResponse GetMyCachedResponse()
{
    return new MyCachedResponse(categories);
}
I'm trying to implement functionality to cache certain pages depending on the host. This is because I can have multiple versions of a page which have the same parameters, and where the only difference in terms of a request is the host that is being requested.
So, for example these two URLs will request the same page, but they are styled differently:
http://www.a.com/something/specific
and
http://www.b.com/something/specific
I'm going through the example outlined here:
http://msdn.microsoft.com/en-us/library/5ecf4420%28v=VS.90%29.aspx
but it's not making sense to me.
I've added this to my global.asax:
public override string GetVaryByCustomString(HttpContext context, string arg)
{
    if (arg == "host")
    {
        return "host=" + context.Request.Url.Host;
    }
    return base.GetVaryByCustomString(context, arg);
}
and the example states "To set the custom string programmatically, call the SetVaryByCustom method and pass it the custom string to use", with code similar to the following:
Response.Cache.SetVaryByCustom("host");
The problem is I'm not sure what to do with this. I've added the previous line to MvcApplication_EndRequest because it seems like it makes sense, but I don't think this is right because when I set breakpoints in GetVaryByCustomString they never get hit.
Can somebody please tell me what I'm missing here? Or if I need to do this differently?
Edit: RE Darin's answer below, I'm already decorating my actions with:
[CustomOutputCache(CacheProfile = "FundScreener")] // or similar depending on the action
where CustomOutputCacheAttribute is defined as:
[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)]
public class CustomOutputCacheAttribute : OutputCacheAttribute
{
    public override void OnResultExecuted(ResultExecutedContext filterContext)
    {
        AddLabelFilesDependency(filterContext);
        base.OnResultExecuted(filterContext);
    }

    private static void AddLabelFilesDependency(ControllerContext filterContext)
    {
        IConfigurationManager configurationManager = ObjectFactory.TryGetInstance<IConfigurationManager>();
        if (configurationManager == null
            || filterContext == null
            || filterContext.RequestContext == null
            || filterContext.RequestContext.HttpContext == null
            || filterContext.RequestContext.HttpContext.Response == null)
        {
            return;
        }

        string[] files = Directory.GetFiles(configurationManager.LabelsDirectoryPath, "*.xml");
        foreach (var file in files)
        {
            filterContext.RequestContext.HttpContext.Response.AddFileDependency(file);
        }
    }
}
where the profile is defined as:
<add name="FundScreener"
location="Server"
enabled="true"
varyByParam="*"
duration="1200"
sqlDependency="mmftms:offline.ScreenerData"/>
Do I need to change this?
You don't need to call SetVaryByCustom in MVC. You could use the OutputCache attribute instead. Check out the following blog post.
If you want to have a different cache for different hosts, you can use:
VaryByHeader="host"
That makes the cache vary on the value of the "host" header in the request. You can add this to the OutputCache attribute on your controllers/actions, or you can probably specify it globally in your web.config.
A host header will always be present if you use host bindings, which seems to be the case for you.
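For example, on an action it could look like the following (the duration and other settings here are just placeholders):

[OutputCache(Duration = 1200, VaryByParam = "*", VaryByHeader = "host", Location = OutputCacheLocation.Server)]
public ActionResult Index()
{
    return View();
}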
GetVaryByCustomString(...) is called by the caching layer per request and you have an opportunity to inspect the request and the passed-in argument to decide how to "categorize" this request. So if you set the VaryByCustom property/attribute to "host", you would then write code inside GetVaryByCustomString function which returns the host (as in your example, above). If the caching layer finds that it has already cached the argument "host" with the value you've returned then it will return the cached response, otherwise it executes the request and adds it to the cache.
Based on your edit, add VaryByCustom="host" to your FundScreener output cache profile.
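In other words, something along these lines (your existing profile with varyByCustom added):

<add name="FundScreener"
     location="Server"
     enabled="true"
     varyByParam="*"
     varyByCustom="host"
     duration="1200"
     sqlDependency="mmftms:offline.ScreenerData"/>

With that in place, the caching layer will call GetVaryByCustomString with "host" for actions that use the FundScreener profile.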
I'm building a JavaScript application and each user has an individual UserSession. The application makes a bunch of Ajax calls, and each Ajax call needs access to a single UserSession object for the user.
Each Ajax call needs a UserSession object.
Data in the UserSession object is unique to each user.
Originally, during each Ajax call I would create a new UserSession object, and its data members were stored in the ASP.NET Session. However, I found that the UserSession object was being instantiated a lot. To minimize the construction of the UserSession object, I wrapped it in a Singleton pattern and synchronized access to it.
I believe that the synchronization is happening application-wide, but I only need it to happen per user. I saw a post here that says the ASP.NET cache is synchronized; however, in the time between creating the object and inserting it into the cache, another thread could start constructing its own object and insert it into the cache.
Here is the way I'm currently synchronizing access to the object. Is there a better way than using "lock"... should we be locking on the HttpContext.Session object?
private static object SessionLock = new object();

public static WebSession GetSession
{
    get
    {
        lock (SessionLock)
        {
            try
            {
                var context = HttpContext.Current;
                WebSession result = null;
                if (context.Session["MySession"] == null)
                {
                    result = new WebSession(context);
                    context.Session["MySession"] = result;
                }
                else
                {
                    result = (WebSession)context.Session["MySession"];
                }
                return result;
            }
            catch (Exception ex)
            {
                ex.Handle();
                return null;
            }
        }
    }
}
You don't need to lock Session state access.
The physical values of a session state are locked for the time needed to complete a request. The lock is managed internally by the HTTP module and used to synchronize access to the session state.
http://msdn.microsoft.com/en-us/library/aa479041.aspx
In general, you don't need this kind of code for asp.net session access, since access to each session is limited to a single user. The only reason I can think of for locking access to your session object is if you expect to have multiple simultaneous ajax requests, and even so, I think asp.net would synchronize the access for you.
If you do decide to lock, you only really need to do it if your session object is null:
if (context.Session["MySession"] == null) {
lock(SessionLock) {
if (context.Session["MySession"] == null) {
context.Session["MySession"] = new WebSession(context); // try-catch block removed for clarity (and my laziness)
}
}
}
return (WebSession)context.Session["MySession"];