I'm going to be doing a progress bar for a donations page on a website that I run. I only want the progress bar to update once a day, rather than on every page load as I would normally do.
What options do I have for grabbing the current SUM of donations and putting it in, say, a flat text file for the aspx page to read, rather than querying the database every time?
Hope this makes sense.
Another option is to use caching and set the cache to only expire once every 24 hours. Then the data is pulled and placed in cache and the cached version is served all day.
Why don't you implement caching and make it expire in 24 hours? It's a better solution and aids performance too :-)
HTH
I would just run a SUM query on the database each time. Unless you're expecting millions of row inserts per day, this will be a negligible performance hit (provided your database table is indexed).
First: this type of query is very fast; unless you have some reason that you haven't mentioned, just have the page query the database each time it loads. A lot of people these days seem to advocate avoiding round-trips to the database. However, SQL Server is very fast, and this sort of thing will have virtually no impact on your application performance. Also, I'm a fan of having accurate data displayed whenever possible - not something that's potentially 24 hours out-of-date.
If you insist on doing this, you have a couple of options.
Create a VBScript or (preferred) PowerShell script that queries the database and dumps the result to, say, an aspx, ascx or HTML file in a virtual folder attached to your website. Set the script to run as a scheduled task on your web server (or a scripts server, if you have one).
Do the same with a Windows service. That may be overkill for this. Services are great when you need something to listen for remote connections, etc, but for running a task periodically, the built-in scheduler works just fine.
The file you generate should probably already contain the formatted HTML. Your ASP.NET page would include this as a user control or partial view. If you're using MVC, you could use Razor from your script to format the output.
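A rough sketch of the scheduled-task approach as a small console app. The connection string, the "Donations" table, the "Amount" column, and the output path are all assumptions; substitute your own schema and paths:

```csharp
// Run by Task Scheduler once a day: query the total and dump it
// as a pre-formatted HTML fragment for the site to include.
using System;
using System.Data.SqlClient;
using System.IO;

class DumpDonationTotal
{
    static void Main()
    {
        decimal total;
        using (var conn = new SqlConnection("Server=.;Database=MySite;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT COALESCE(SUM(Amount), 0) FROM Donations", conn))
        {
            conn.Open();
            total = (decimal)cmd.ExecuteScalar();
        }

        // The page includes this fragment verbatim, so no DB hit at request time.
        string html = string.Format("<div class=\"donation-total\">{0:C}</div>", total);
        File.WriteAllText(@"C:\inetpub\wwwroot\includes\donation-total.html", html);
    }
}
```

Writing the file atomically (write to a temp file, then move it into place) avoids serving a half-written fragment if a request arrives mid-write.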
Again, I think this is a bad idea, and creating unnecessary work. Unless you have a very good reason for avoiding hitting the database, don't go down this path.
If you want something to run exactly once a day, the Task Scheduler is your best bet. Have it run a query and stick the result somewhere your website can read from.
The next option is to use a cache. Have your code check if the value is in the cache. If it is, use that. If it isn't, run the query and populate the cache. Set the cache to expire in 24 hours. The problem with this is that the query will probably run more than once on a busy website (multiple requests come in after the cache has expired but before the first one repopulates it), but that likely won't matter for a query that doesn't take long.
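If those duplicate queries bother you, a lock around the cache population closes that window. A sketch, assuming this lives in a page or helper class with access to `Cache`, and with `GetTotalFromDb()` standing in for your real data access:

```csharp
private static readonly object _cacheLock = new object();

private int GetTotal()
{
    int? total = Cache["DonationTotal"] as int?;
    if (total == null)
    {
        lock (_cacheLock)
        {
            // Re-check inside the lock: another request may have
            // populated the cache while we were waiting.
            total = Cache["DonationTotal"] as int?;
            if (total == null)
            {
                total = GetTotalFromDb(); // assumed data-access helper
                Cache.Insert("DonationTotal", total, null,
                    DateTime.Now.AddHours(24),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }
        }
    }
    return (int)total;
}
```

The double check means only the first request after expiry runs the query; everyone else either gets the cached value or briefly waits on the lock.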
Try searching for ASP.NET global cache examples. Set two variables, one for your query result, another for the DateTime of the query execution. In code, check whether DateTime.Now minus the cached date is greater than 24 hours; if so, update again (or something along those lines).
If you have a static variable in some class, it is visible across the entire web app.
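A minimal sketch of that idea, with the database call stubbed out (`LoadFromDb` is a placeholder for your real query):

```csharp
using System;

static class TotalCache
{
    private static readonly object _lock = new object();
    private static int _value;
    private static DateTime _loadedAt = DateTime.MinValue;

    // Placeholder for the real database query.
    public static Func<int> LoadFromDb = () => 0;

    public static int Get()
    {
        lock (_lock)
        {
            // Refresh only when the cached copy is older than 24 hours.
            if (DateTime.Now - _loadedAt > TimeSpan.FromHours(24))
            {
                _value = LoadFromDb();
                _loadedAt = DateTime.Now;
            }
            return _value;
        }
    }
}
```

The lock keeps concurrent requests from running the query twice; being static, the value survives across requests for the lifetime of the app domain.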
I would say to use the cache. As an enhancement on the other caching answers, you could implement some code so that the cache expires at the same time every day, rather than 24 hours after the first access of the day. This protects against erratic update times in a scenario where traffic is sporadic. The first request after the cutoff will look up the value and persist it until the next cutoff time, but the cutoff stays constant regardless of when the first request occurs.
Something along the lines of this:
int ProgressValue
{
    get
    {
        int? value = Cache["ProgressValue"] as int?;
        if (value == null)
        {
            // Expire at the next 6 AM, whether that is later today or tomorrow.
            DateTime expTime = DateTime.Now.Date.AddHours(6);
            if (expTime <= DateTime.Now)
                expTime = expTime.AddDays(1);

            value = GetValueFromDB();
            Cache.Insert("ProgressValue", value, null, expTime,
                System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return (int)value;
    }
}
Related
I have a large enterprise web application that is starting to be heavily used. Recently I've noticed that we are making many database calls for things like user permissions, access, general bits of profile information.
From what I can see on Azure we are looking at an average of 50,000 db queries per hour.
We are using Linq to query via the DevExpress XPO ORM. Now some of these are joins, but the majority are simple 1 table queries.
Is constantly hitting the database the best way to be accessing this kind of information? Are there ways for us to offload the database work as some of this information will never change?
Thanks in advance.
Let's start by putting this into perspective: with 3,600 seconds in an hour, you have fewer than 20 operations per second. Pathetically low by any measure.
That said, there is nothing wrong with, for example, caching user permissions for 30 seconds or a minute.
Generally, try to cache not in your code but in front of it - the ASP.NET output cache and donut caching are mostly ignored concepts, yet still the most efficient.
http://www.dotnettricks.com/learn/mvc/donut-caching-and-donut-hole-caching-with-aspnet-mvc-4
has more information. Then ignore all the large numbers and run a profiler - see what your real heavy hitters are (likely around permissions, as those are used on every page). Put that into a subsystem and cache it. Given that you can preload that into the user identity object in the ASP.NET pipeline, your code should not hit the database in the pages anyway, so the cache is isolated in some filter in ASP.NET.
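For reference, output caching a whole MVC action is a one-liner; the 60-second duration and the `BuildDashboardModel` helper here are arbitrary placeholders:

```csharp
// Caches the rendered output of this action for 60 seconds;
// cached hits never run the controller or database code at all.
[OutputCache(Duration = 60, VaryByParam = "none")]
public ActionResult Dashboard()
{
    return View(BuildDashboardModel()); // assumed helper that queries the DB
}
```

Use `VaryByParam` (or `VaryByCustom` for per-user variation, e.g. permissions) so the cache doesn't serve one user's output to another.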
Measure. Make sure your SQL is smart - EF and LINQ can produce extremely bad SQL when people are too lazy to check. Avoid instantiating complete objects just to throw them away; ask only for the fields you need. Make sure your indices are efficient. Come back when you start having a real (measured) problem.
But the old rule is: cache early. LINQ optimization comes quite far behind that.
For user-specific information like profile and access rights, instead of fetching it from the database on every request it is better to fetch it once at login and keep it in session. This should reduce your transactions with the database.
I have three tables in my SQL database, say Specials, Businesses, Comments. In my master page I have a prompt area where I need to display alternating data from these three tables, based on certain conditions, on each page refresh (these tables have more than 1,000 records). In that case, what is the best option for retrieving data from these tables?
Accessing the database on every request is not a good idea, I know. Is there another good way to do this, like caching or some other technique to manage it effectively? Right now the page takes too long to load after each refresh.
Please give your suggestions.
At present I'm planning to create a stored procedure for data retrieval and keep the returned value in Session,
so that we can read the data from the session rather than going to the DB on every page refresh. But I don't know whether there is a more effective way to accomplish the same thing.
Accessing data each time from database is not a good idea
It's not always true; it depends on how frequently the data changes. If you choose to cache the data, you will have to revalidate it every time the data changes. I am assuming you do not want to display a static count or something that, once displayed, will not change. If that's not the case, you can simply store it in cookies and display from there.
Now it takes too much time to load the page after each page refresh.
Do you know what takes too much time? Is it client-side code or server-side code (use Glimpse to find out)? If server-side, is it the code that hits the DB and the query execution time, or server-side in-memory manipulation?
Generally first step to improve performance is to measure it precisely and in order for you to solve such issues you ought to know where the problem is.
Based on your first statement, if I were you I would display each count in a separate div which is refreshed asynchronously. You could update the data periodically using a timer or, even better, push it from the server (use SignalR). The update happens transparently, so no page reload is required.
Hope this helps.
I agree that 1,000 records doesn't seem like a lot, but if you really aren't concerned about a slight delay you could try the HttpContext.Cache object. It's very much like a dictionary with string keys and object values, with the addition that you can set expirations etc.
Excuse typos, on mobile so no compile check:
var tableA = HttpContext.Cache.Get("TableA");
if (tableA == null)
{
    // If it's null there was no copy in the cache, so create your
    // object using your database call.
    tableA = LoadTableA(); // array, list - however you store your data
    // Add the item to the cache with a sliding expiration of 1 minute.
    HttpContext.Cache.Insert("TableA", tableA, null,
        System.Web.Caching.Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(1));
}
Now, no matter how many requests go through, you only hit the database once a minute, or once for however long you think is reasonable considering your needs. You can also trigger a removal of the item from cache, if some particular condition happens.
One suggestion is to think of your database as a mere repository to persist state. Your application tier could cache collections of your business objects, persist them when they change, and immediately return state to your presentation tier (the web page).
This assumes all updates to the data are coming from your page. If the database is being populated from different places, you'll need to either tie everything into a common application tier, or poll the database to update your cache.
Basically I'm writing an ASP.NET MVC application in which JavaScript sends a GET request every 30 seconds, checking whether a certain row in a database table has changed.
I've been looking at the OutputCache attribute but it doesn't seem like it would work since it would just cache the content and not really check if an update was made.
What would be the "cheapest" way to do this? I mean the way that burdens the server the least?
A HEAD request may be faster, though that's not guaranteed; it is worth investigating.
If you can't use something to stream the change to you, the cheapest way is an API that takes a date and returns a boolean flag or an integer stating whether a change occurred. Essentially it's still polling, but it's minimal because it's the smallest possible response back and forth - assuming SignalR or some other server-push mechanism isn't possible.
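A sketch of such an endpoint; `GetLastModified` is a placeholder for however you track changes (e.g. a rowversion or last-updated timestamp column):

```csharp
// Returns only a boolean, so the response body stays tiny.
public JsonResult HasChanged(DateTime since)
{
    DateTime lastModified = GetLastModified(); // assumed: reads a timestamp column
    return Json(lastModified > since, JsonRequestBehavior.AllowGet);
}
```

The client only issues the full data request when this returns true, so the common no-change case costs one indexed single-row lookup and a few bytes on the wire.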
Depends what you want it to do - have you considered long-polling? E.g. make the GET/POST request using JavaScript and let the server withhold the reply until your 'event' happens.
OutputCache works perfectly, but its expiration time should be a divisor of your polling interval - e.g. 10 seconds for the 30-second client-side polling in this case.
I'm not an expert on EF, but if your database supports triggers, that would be an option: you could cache the result for a longer period (say 1 hour) unless a trigger fires.
But if your record is being updated very frequently, the trigger would be costly.
In that case I would go with caching plus a timestamp mechanism (like versions in a NoSQL db, or timestamps in Oracle).
And remember that you are fetching the record every 30 seconds, not on every change to the record. That's a good thing, because it makes your solution much simpler.
Probably SignalR with push notification when there's a change in the database (and that could be either tracked manually or by SqlDependency, depending on the database)...
I used Wyatt Barnett's suggestion and it works great.
Thanks for the answers - appreciate it.
Btw I'm answering it since I can't mark his comment as answer.
I would like some advice on how to best go about what I'm trying to achieve.
I'd like to provide a user with a screen that will display one or more "icons" (per se) with a total displayed next to each (a bit like the iPhone does). Don't worry about the UI; the question is not about that, it is more about how to handle the back end.
Let's say for argument sake, I want to provide the following:
Total number of unread records
Total number of waiting for approval
Total number of pre-approved
Total number of approved
etc...
I suppose the easiest way to describe the above would be "MS Outlook". Whenever emails arrive in your inbox, you see the unread count update immediately. I know that's local, so it's a bit different, but now imagine the same principle applied to the queries above.
This could vary from user to user, and while dynamic stored procedures are not ideal, I don't think I could write one SP for each scenario - but again, that's not the issue here.
Now the recommendation part:
Should I create a timer that polls the database every minute (for example) and runs all my relevant SQL queries, which would then provide me with the relevant information?
Is there a way to do this in real time without a polling mechanism, i.e. whenever a query's result changes, the total/count is updated and pushed out to the relevant client(s)?
Should I have some sort of table storing these "totals" for each query, keep them updated immediately via SQL triggers, and then, when queried by a user, only read the "total" rather than calculating it?
The problem with triggers is that they would have to be defined individually, and I'm really trying to keep this as generic as possible... Again, I'm not 100% clear on how to handle this, to be honest, so let me know what you think is best or how you would go about it.
Ideally, when a specific query is created, I'd like to provide two choices: a) general, where anyone can use the query, and b) specific, where the username is used as part of the query and the returned count applies only to that user - but that's another issue.
The important part is really the notification part. While the polling is easy, I'm not sure I like it.
Imagine if I had 50 queries to execute and 500 users (unlikely, but still!) looking at the screen with these icons. 500 users polling the database every minute, each triggering 50 queries, could potentially mean 25,000 queries per minute... That just doesn't sound right.
As mentioned, ideally, a) I'd love to have the data changes in real-time rather than having to wait a minute to be notified of a new "count" and b) I want to reduce the amount of queries to a minimum. Maybe I won't have a choice.
The idea behind this, is that they will have a small icon for each of these queries, and a little number will be displayed indicating how many records apply to the relevant query. When they click on this, it will bring them the relevant result data rather than the actual count and then can deal with it accordingly.
I don't know if I've explained this correctly; if anything is unclear, please ask. Hopefully I have, and I'll be able to get some feedback on this.
Looking forward to your feedback.
Thanks.
I am not sure if this is the ideal solution, but it may be a decent one.
The following are the assumptions I have made:
Your front end is a web application, i.e. ASP.NET.
The data which needs to be fetched on a regular basis is not huge.
The data which needs to be fetched does not change very frequently.
If I were in this situation, I would take the following approach:
Implement SQL caching using the SqlCacheDependency class. It fetches the data from the database and stores it in the application cache. The cache is invalidated whenever the data in the table on which the dependency is created changes, at which point the new data is fetched and cached again. You just read the data from the cache; everything else (polling the database, etc.) is done by ASP.NET itself. Here is a link describing the steps to implement SQL caching - believe me, it is not that difficult.
Use AJAX to update the counts on the UI so that the user does not feel the pinch of a PostBack.
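The SqlCacheDependency part of that setup looks roughly like this in code. The database entry name, table name, and cache key are placeholders, and the `<sqlCacheDependency>` section in web.config plus the aspnet_regsql registration of the table are assumed to already be in place:

```csharp
using System.Web.Caching;

// "MySiteDb" must match an <add name="..."> entry under
// <sqlCacheDependency> in web.config; "Orders" is the watched table.
var dependency = new SqlCacheDependency("MySiteDb", "Orders");
Cache.Insert("PendingApprovalCount", GetPendingApprovalCount(), dependency);
// The cached entry is evicted automatically when rows in Orders change,
// so the next read repopulates it with a fresh count.
```

`GetPendingApprovalCount()` here stands in for whichever count query backs a given icon.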
What about "Improving Performance with SQL Server 2008 Indexed Views"?
"This is often particularly effective for aggregate views in decision
support or data warehouse environments"
I have an application that uses data from a table which, in many parts of it, almost never changes.
This seems like the right place for a cache. So: I need to make a cached list of that data to work with, but with some expiration timeout after which my cached list should refresh itself from the database (that's why a global static list won't work here).
PS: I'm sure it's not that difficult, but I'm new to caching and help will save me time, thank you. At worst I could create a static list updated after some timeout by a timer on another thread, but I think such a solution is too ugly.
Lots of hints here on how to achieve that:
http://msdn.microsoft.com/en-us/library/dd997357.aspx
The cache can be given an expiry date, so it will re-fetch the data after a set amount of time without you dealing with timers etc.
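With System.Runtime.Caching (what that MSDN walkthrough covers), a sketch looks like this; the 10-minute timeout is an arbitrary choice, and `LoadFromDb` is a stub for your real query:

```csharp
using System;
using System.Runtime.Caching;

static class ReferenceData
{
    // Placeholder for the real database query.
    public static Func<string[]> LoadFromDb = () => new[] { "a", "b" };

    public static string[] Get()
    {
        var cache = MemoryCache.Default;
        var items = cache["ReferenceData"] as string[];
        if (items == null)
        {
            items = LoadFromDb();
            // Re-fetched lazily on first access after expiry -
            // no timers or background threads needed.
            cache.Set("ReferenceData", items, new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10)
            });
        }
        return items;
    }
}
```

Because expiry is handled by the cache itself, the "ugly" timer-on-another-thread approach is unnecessary.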