I am looking at options for caching data at the service layer of my web application (the service layer gets data from other systems, and at the web front end I don't want a round trip for that data on every request). I would like to cache it for, say, 20 minutes: if the cached value is not null, load it from the cache; otherwise go and retrieve it.
I have looked at Dynacache, which looks as if it should do exactly what I want, but I have been having problems getting it to work with SimpleInjector, my DI framework. Has anyone used a similar NuGet package, or does anyone have an example of doing something similar?
I typically set up my web service layer with as little caching as possible and leave the caching up to the client. If a website only needs to cache a set of data, then that's its own responsibility. If another web application needs real-time access, I don't want to hinder that.
If I DO need to cache, say, a static list that hardly changes, then I typically use something like MemoryCache with a sliding expiration. For this, I usually write a wrapper whose Get() method takes a lambda (a Func) as the source for that key, invoked whenever the cached value happens to be null.
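A minimal sketch of that wrapper idea, with made-up names. A real implementation would sit on top of System.Runtime.Caching.MemoryCache with a sliding expiration policy; here a ConcurrentDictionary with expiry timestamps keeps the example self-contained:

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical cache-aside wrapper: the Func supplied by the caller is the
// source of the value for that key whenever the cache has nothing fresh.
public class CachingService
{
    private readonly ConcurrentDictionary<string, (object Value, DateTime Expires)> _cache = new();
    private readonly TimeSpan _ttl;

    public CachingService(TimeSpan ttl) => _ttl = ttl;

    // If the key is cached and fresh, return it; otherwise invoke the
    // factory, cache the result, and return it.
    public T Get<T>(string key, Func<T> factory)
    {
        if (_cache.TryGetValue(key, out var entry) && entry.Expires > DateTime.UtcNow)
            return (T)entry.Value;

        T value = factory();
        _cache[key] = (value, DateTime.UtcNow + _ttl);
        return value;
    }
}
```

A caller would then write something like cache.Get("products", () => LoadProductsFromBackend()), and the backend is only hit when the entry is missing or expired.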
Related
Newbie to the whole ASP.NET conundrum!
So I have a set of Web API actions in a controller which will be called in succession from the client. All of the actions depend on a data model object fetched from the database. Right now I have a DAO layer which fetches this data and transparently caches it, so there is no immediate issue: there is no round trip to the database for each API call. But the DAO layer is maintained by a different team and makes no guarantee that the cache will continue to exist or that its behaviour won't change.
Now, the properties or attributes of the model object do change, but not often. So if I refer to the API calls made in succession from a client as a bundle, I can safely assume that the bundle can query this data once and use it without worrying about the value changing. How can I achieve this? Is there a design pattern somewhere in the ASP.NET world that I can use? What I would like is to fetch this value at a periodic interval and refresh it in case one of the API calls fails, indicating the underlying values have changed.
There are a few techniques that might be used. First of all, is there a reason you need a second cache, given that your data access layer already has one?
You can place a cache at the Web API response level by using a third-party library called Strathweb.CacheOutput:
CacheOutput will take care of server side caching and set the appropriate client side (response) headers for you.
You can also cache the data from your data access layer with a more manual approach, using MemoryCache from System.Runtime.Caching.
Depending on what infrastructure is available, a distributed cache or store such as Redis or Cassandra may be the best choice.
What I want to do is this: create an ASP.NET MVC web app that periodically looks for new or updated data from an external API (e.g., a new order created or modified in an inventory management system) and then sends this data to another system. It's a similar sort of thing to this - I have tried using it, but I've found it's a bit too limited for some of the things I wanted to do.
I am currently trying to figure out the best way to go about this. I imagine that it will work like this:
Periodically (using a timer?) pull the data from the first system's API, and cache the Id and DateUpdated fields (in a local database, I assume) for each item.
If any items have changed relative to the cached data, convert them to the appropriate format if needed and post them to the second API.
I would be connecting to a few different third-party systems, which for the most part do not support webhooks etc., so I assume polling and caching is the only option.
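The change-detection part of the steps above is roughly what I have in mind; a sketch with made-up names (in a real app the source would be the external API and the snapshot would live in a local database, not in memory):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical poll-and-diff loop: compare each item's Id/DateUpdated
// against the cached snapshot and return only the new or updated items.
public record OrderStamp(int Id, DateTime DateUpdated);

public class ChangePoller
{
    // Cached snapshot: Id -> last seen DateUpdated.
    private readonly Dictionary<int, DateTime> _seen = new();

    // Returns items that are new or updated since the last poll and
    // refreshes the snapshot; the caller would then convert and post
    // each returned item to the second API.
    public List<OrderStamp> Poll(IEnumerable<OrderStamp> current)
    {
        var changed = current
            .Where(o => !_seen.TryGetValue(o.Id, out var last) || o.DateUpdated > last)
            .ToList();

        foreach (var o in changed)
            _seen[o.Id] = o.DateUpdated;

        return changed;
    }
}
```

A timer (System.Threading.Timer, or a scheduler like Quartz.NET) would then call Poll on an interval.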
Am I on the right track? I haven't been able to find many resources on this sort of thing, so I would appreciate any advice.
What I want is pretty simple conceptually but I can't figure out how it would be best to implement such a thing.
In my web application I have services that access repositories, which access EF, which interacts with the SQL Server database. All of these are instantiated once per web request.
I want to have an extra layer between the repositories and EF (or the services and the repositories?) which statically keeps track of objects being pulled from and pushed to the database.
The goal, assuming DB access only happens through the application, is to know for a fact that unless some repository accesses EF and commits a change, the object set hasn't actually changed.
An example would be:
Repository invokes method GetAllCarrots();
GetAllCarrots() performs a query on SQL Server and retrieves a List&lt;Carrot&gt;. If nothing else happens in between, I would like to prevent this query from actually hitting SQL Server each time (even across different web requests; I want to be able to handle that scenario).
Now, if a call to BuyCarrot() adds a Carrot to the table, that should invalidate the static cache for Carrots, so that GetAllCarrots() requires a query to the database once again.
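Conceptually it would be something like this (all names made up; the static cache only works because I'm assuming all writes go through the application, as stated above):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical static cache layer between the repositories and EF:
// reads are served from the cache until some repository commits a
// change and invalidates the corresponding entity set.
public static class EntityCache
{
    private static readonly object Gate = new();
    private static readonly Dictionary<string, object> Sets = new();

    public static List<T> GetAll<T>(string set, Func<List<T>> query)
    {
        lock (Gate)
        {
            if (Sets.TryGetValue(set, out var cached))
                return (List<T>)cached;   // no round trip to SQL Server

            var result = query();          // the actual EF/SQL query
            Sets[set] = result;
            return result;
        }
    }

    // Called by any repository method that commits a change,
    // e.g. BuyCarrot() after SaveChanges().
    public static void Invalidate(string set)
    {
        lock (Gate) Sets.Remove(set);
    }
}
```

GetAllCarrots() would become EntityCache.GetAll("Carrots", () => ctx.Carrots.ToList()), and BuyCarrot() would call EntityCache.Invalidate("Carrots") after committing.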
What are some good resources on database caching?
You can use LinqToCache for this.
It allows you to use the following code inside your repository:
var queryTags = from t in ctx.Tags select t;
var tags = queryTags.AsCached("Tags");
foreach (Tag t in tags)
{
    ...
}
The idea is that you use SqlDependency to be notified when the result of a query changes. As long as the result doesn't change, you can cache it.
LinqToCache keeps track of your queries and returns the cached data when queried. When a notification is received from SQL Server, the cache is reset.
I recommend reading http://rusanu.com/2010/08/04/sqldependency-based-caching-of-linq-queries/ .
I had a similar challenge, and because of EF's usage and restrictions, I decided to implement the cache as an additional service between the client and the server's service, using an IoC container, and to monitor all service methods that could affect the cached data.
Of course, this is not a perfect solution when you have a farm of servers running the services; if the goal is to support multiple servers, I would implement it using SqlDependency.
Ok, this is a very "generic" question. We currently have a SQL Server database, and we need to develop an ASP.NET application for it which will contain all the business logic in C# web services.
The thing is that, architecturally speaking, I'm not sure how to design the web service and the data management. There are many things to consider:
We need very rapid access to data. Right now, we have over a million "sales" and "purchases" records, from which we often need to calculate and load the current stock for a given day according to a series of parameters. I'm not sure how we should preload the data and keep it in the web service. Doing the stock calculation within a SQL query would be very slow. They currently have a stock calculation application that preloads all sales and purchases for the day and then calculates the stock on the code side.
We want to develop powerful reporting tools. We want to implement a "pivot table" but aren't sure how to implement it with good performance.
For the reasons above, I'm not sure how to design the data model.
How would you manage the display of the current stock, considering that "stock" is actually purchases minus sales, and that you have to consider all rows to calculate it? Would you cache "stock" data in the database to optimize performance, even though it's redundant data?
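To make the arithmetic concrete, the calculation itself is just a signed sum over all movement rows up to the day in question (types and column names below are made up), which is exactly why it gets expensive over a million rows and why caching a precomputed total is tempting:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Made-up movement row: a purchase adds to stock, a sale subtracts.
public record Movement(DateTime Date, int Quantity, bool IsPurchase);

public static class StockCalc
{
    // Stock on a given day = sum of purchase quantities minus sum of
    // sale quantities over all rows up to and including that day.
    public static int OnDay(IEnumerable<Movement> rows, DateTime day) =>
        rows.Where(m => m.Date <= day)
            .Sum(m => m.IsPurchase ? m.Quantity : -m.Quantity);
}
```

Recomputing this means scanning every row each time; caching a running total per day in a summary table trades redundancy for a constant-time read.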
Can anybody give me guidelines on how to start, or share their personal experiences (what have you done in the past?)?
I'm not sure if it's possible to add a bounty while the question is this new (I'd put 300 rep on it, since I really need something). If you know how, let me know.
Thanks
The first suggestion is to not use legacy ASMX web services. Use WCF, which Microsoft says should be used for all new web service development.
Second, are you sure you can't optimize the database, or else place it on faster hardware, or nearer to the web server?
I don't know that you're going to get that much data in memory at once. If you could, then you could load it into a DataSet and use LINQ to DataSet for queries against it.
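As a rough illustration of querying an in-memory DataTable with LINQ to DataSet (the table and column names here are invented; AsEnumerable() and Field&lt;T&gt;() come from System.Data.DataSetExtensions):

```csharp
using System;
using System.Data;
using System.Linq;

public static class InMemoryQueryDemo
{
    // Sums a made-up "Qty" column over rows whose "Type" is "Purchase",
    // entirely in memory, without another round trip to the database.
    public static int TotalPurchased(DataTable movements) =>
        movements.AsEnumerable()
            .Where(r => r.Field<string>("Type") == "Purchase")
            .Sum(r => r.Field<int>("Qty"));
}
```

The trade-off is exactly the one above: this is only viable if the working set actually fits in the web server's memory.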
I hope I'm misunderstanding what you wrote, but if by
contain all the business logic in C# Web Services
you mean something like this, then you're already headed in an anti-pattern direction. Accessing your data from an ASP.NET application over web services just incurs the serialization/deserialization penalty for practically no gain.
A better approach would be to organize the services you want to make available into a common layer that your applications are built on, access them directly from your ASP.NET application, and perhaps also expose them as web services to allow external consumers to use this data.
You could also look into exposing data that is expensive to compute through a data warehouse that is updated at regular intervals (once or a few times per day). This would help you get better read performance (as long as you're willing to accept data being a bit stale).
Is that the kind of information you're looking for?
I need to access a web service, run a bunch of queries, and save the data to a store as part of an analysis. On top of this, there is a website that queries the data store and shows this data.
We have features like this getting added every month. How can I reduce the amount of boilerplate code that gets written?
Add a web service reference
Wrap methods in a provider layer to handle exceptions
Prepare the request
Send the request
Store the data locally
Retrieve and show the data through an .aspx page
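The repeated middle steps above (wrap, prepare, send, store) could be factored into one generic helper so each new feature only supplies its specifics; a sketch with invented names:

```csharp
using System;

// Hypothetical pipeline that centralizes the boilerplate: prepare the
// request, send it, store the result, with exception handling in one place.
public static class ProviderPipeline
{
    public static TResult Execute<TRequest, TResult>(
        Func<TRequest> prepare,
        Func<TRequest, TResult> send,
        Action<TResult> store)
    {
        try
        {
            var request = prepare();
            var result = send(request);
            store(result);
            return result;
        }
        catch (Exception ex)
        {
            // One central place for logging and translating service faults.
            throw new InvalidOperationException("Service call failed", ex);
        }
    }
}
```

Each monthly feature then contributes three small lambdas instead of its own try/catch and persistence plumbing, and the .aspx page reads from the local store as before.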
This is such a pain.
I have found two things useful for reducing the tedious coding in this scenario:
- WCF Line of Business Adapter SDK: this provides a very powerful base for building a WCF adapter (basically like a BizTalk adapter). It is a bit tough the first time, but adding to it later is much nicer.
- patterns &amp; practices Web Service Software Factory: this is nice for the database work, especially since it provides some great wizards for automatic generation. Not to say you can't use it for other things.