Updating GridView after insert - C#

We are using Visual Studio 2010 and .NET 4.0 for an ASP.NET website and MVC for the general architecture.
There are two parts to this question. Neither is about how to do it, but about what the right way is as per good design and industry standards.
If I have a GridView (AJAX enabled) with 1000+ records (a lot of data) and show 100 records at a time, do I go back to the database for the next 100 records, or store the data in session and just rebind the GridView with new data taken from session?
In the case of an insert, I have two choices. One, insert the record in the database, then reload the GridView and rebind. Two, insert the record in both session and database and update the GridView based on the session data, so I need not download new data from the database.
Can you please point me in the right direction?

Forget the 1000 records at a time... If you're only showing 100 at a time, that's all you need to worry about. In my opinion, requery the database. 100 records at a time isn't much. Don't rely on session for this sort of thing, embrace the 'stateless' nature of the web.
I don't think using session gives you much benefit. Either way, the info needs to be sent to the user's browser. Querying the database for 100 records is probably a trivial operation (in terms of latency). From a development standpoint, the added complexity of introducing session state here isn't worth it. (Ask yourself, exactly what benefit does session give you here?)
That's just my opinion, others may differ. But I can't imagine running into too many problems querying for 100 records at a time in this scenario.

You can store the data in the cache object at the server to avoid having to read the database each time. http://msdn.microsoft.com/en-us/library/aa478965.aspx

Read this article from ScottGu. Page your recordsets at the database. In most cases this is the correct approach. The article shows some benchmarks. If your site becomes high traffic, storing in session can be resource intensive. If you were to scale to a web farm, you'd likely end up storing session state in SQL Server anyway.
Session data is also volatile. What if the user opens the same page in 2 browser windows? You'll end up stepping on the toes of each page. What if the user walks away from the page, letting session or cache expire? You'll have the same issues with Cache.
Paging at the server scales the best.
https://web.archive.org/web/20211020140032/https://www.4guysfromrolla.com/articles/031506-1.aspx#postadlink
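Paging at the database with plain ADO.NET might look roughly like this (a sketch only; the table, column names and connection string are assumptions, and ROW_NUMBER() is used because it works on SQL Server 2005 and later):

using System.Data;
using System.Data.SqlClient;

public static class OrderRepository
{
    // Returns one page of rows; only pageSize rows ever leave the database.
    public static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT OrderId, CustomerName, OrderDate
            FROM (SELECT OrderId, CustomerName, OrderDate,
                         ROW_NUMBER() OVER (ORDER BY OrderId) AS RowNum
                  FROM dbo.Orders) AS Numbered
            WHERE RowNum BETWEEN @First AND @Last
            ORDER BY RowNum";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
            command.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);

            var page = new DataTable();
            new SqlDataAdapter(command).Fill(page);
            return page;
        }
    }
}

The GridView's paging event then just calls GetPage with the new page index and rebinds; nothing needs to be held in session.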

Related

Large Application - How to handle data access

I have a large enterprise web application that is starting to be heavily used. Recently I've noticed that we are making many database calls for things like user permissions, access, general bits of profile information.
From what I can see on Azure we are looking at an average of 50,000 db queries per hour.
We are using LINQ to query via the DevExpress XPO ORM. Now some of these are joins, but the majority are simple single-table queries.
Is constantly hitting the database the best way to be accessing this kind of information? Are there ways for us to offload the database work as some of this information will never change?
Thanks in advance.
Let's start putting this into perspective. With 3600 seconds in an hour you have less than 20 operations per second. Pathetically low by any measure.
That said, there is nothing wrong with for example caching user permissions for let's say 30 seconds or a minute.
Generally, try to cache not in your code but in front of it - the ASP.NET output cache and donut caching are concepts that are mostly ignored but still the most efficient.
http://www.dotnettricks.com/learn/mvc/donut-caching-and-donut-hole-caching-with-aspnet-mvc-4
has more information. Then ignore all the large numbers and run a profiler - see what your real heavy hitters are (likely the permission checks, as those run on every page). Put that into a subsystem and cache it. Given that you can preload it into the user identity object in the ASP.NET pipeline, your page code should not hit the database anyway, so the cache stays isolated in some filter in ASP.NET.
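For illustration, caching the expensive fragment as a child action ("donut hole" caching) might look roughly like this in MVC (a sketch; the controller, view and helper names are assumptions, and VaryByCustom = "user" assumes GetVaryByCustomString is overridden in Global.asax to return the current user name):

using System.Web.Mvc;

public class NavigationController : Controller
{
    // The rendered partial is cached for 60 seconds per user,
    // so the permission lookup does not hit the database on every request.
    [ChildActionOnly]
    [OutputCache(Duration = 60, VaryByCustom = "user")]
    public ActionResult Menu()
    {
        var permissions = LoadPermissionsForCurrentUser();   // the isolated database call
        return PartialView("_Menu", permissions);
    }

    private object LoadPermissionsForCurrentUser()
    {
        // ... subsystem that queries (or caches) the permissions ...
        return null;
    }
}

The layout renders it with @Html.Action("Menu", "Navigation"); the rest of the page stays fully dynamic.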
Measure. Make sure your SQL is smart - EF and LINQ lead to extremely idiotic SQL because people are too lazy. Avoid instantiating complete objects just to throw them away, ask only for the fields you need. Make sure your indices are efficient. Come back when you start having a real problem (measured).
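For instance, a projection that asks only for the fields the page needs (the names below are illustrative, for whatever your LINQ provider's context object is) avoids materialising complete objects:

// Only two columns travel over the wire and get materialised.
var userSummaries = context.Users
    .Where(u => u.IsActive)
    .Select(u => new { u.Id, u.DisplayName })
    .ToList();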
But the old rule is: cache early. And LINQ optimization comes quite far down the list.
For user-specific information like profile, access, etc., instead of fetching it from the database for every request, it is better to fetch it once at login time and keep it in session. This should reduce your transactions with the database.
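A minimal sketch of that approach (the repository, type and key names are placeholders):

// On successful login: fetch the profile once and keep it in session.
var profile = userRepository.GetProfile(userName);   // single database round trip
Session["UserProfile"] = profile;

// On subsequent requests:
var cachedProfile = (UserProfile)Session["UserProfile"];
if (cachedProfile == null)
{
    // Session expired or was recycled - reload from the database or redirect to login.
}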

How to manage frequent data access in .net application?

I have three tables in my SQL database, say Specials, Businesses, Comments. And in my master page I have a prompt area where I need to display alternating data from these 3 tables based on certain conditions during each page refresh (these tables have more than 1000 records). So in that case, what will be the best option for retrieving data from these tables?
Accessing the database each time is not a good idea, I know; is there any other good way to do this, like caching or any other technique to manage this effectively? Right now it takes too much time to load the page after each refresh.
Please give your suggestions.
At present what I was planning is to create a stored procedure for data retrieval and to keep the returned value in session.
That way we can read the data from session rather than going to the database on each page refresh. But I do not know whether there is any other, more effective way to accomplish the same thing.
Accessing data each time from database is not a good idea
That is not always true; it depends on how frequently the data changes. If you choose to cache the data, you will have to invalidate it every time the data changes. I am assuming you do not want to display a static count or something that, once displayed, will not change. If that's not the case, you can simply store it in cookies and display from there.
Now it takes too much time to load the page after each page refresh.
Do you know what takes too much time? Is it client-side code or server-side code (use Glimpse to find out)? If server-side, is it the code that hits the DB and the query execution time, or is it server-side in-memory manipulation?
Generally, the first step in improving performance is to measure it precisely; in order to solve such issues you ought to know where the problem is.
Based on your first statement, if I were you, I would display each piece of data in a separate div that is refreshed asynchronously. You could choose to update the data periodically using a timer, or even better push it from the server (use SignalR). The update happens transparently, so no page reload is required.
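A bare-bones SignalR 2 sketch of the push side (the hub and method names are placeholders, and an OWIN Startup class calling app.MapSignalR() is assumed):

using Microsoft.AspNet.SignalR;

// The hub the page connects to; the client subscribes to "updatePrompt".
public class PromptHub : Hub
{
}

public static class PromptNotifier
{
    // Call this whenever the Specials/Businesses/Comments data changes.
    public static void Push(object promptData)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<PromptHub>();
        hub.Clients.All.updatePrompt(promptData);   // invokes the client-side handler
    }
}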
Hope this helps.
I agree that 1000 records doesn't seem like a lot, but if you really aren't concerned about there being a slight delay, you may try the HttpContext.Cache object. It's very much like a dictionary with string keys and object values, with the addition that you can set expirations, etc.
Excuse typos, on mobile so no compile check:
var tableA = HttpContext.Current.Cache.Get("TableA");
if (tableA == null)
{
    // If it's null, there was no copy in the cache, so create your
    // object using your database call
    tableA = LoadTableAFromDatabase();   // an array, a List, however you store your data
    // Add the item to the cache with an absolute expiration of 1 minute
    HttpContext.Current.Cache.Insert("TableA", tableA, null,
        DateTime.UtcNow.AddMinutes(1), System.Web.Caching.Cache.NoSlidingExpiration);
}
Now, no matter how many requests go through, you only hit the database once a minute, or once for however long you think is reasonable considering your needs. You can also trigger a removal of the item from cache, if some particular condition happens.
One suggestion is to think of your database as a mere repository to persist state. Your application tier could cache collections of your business objects, persist them when they change, and immediately return state to your presentation tier (the web page).
This assumes all updates to the data are coming from your page. If the database is being populated from different places, you'll need to either tie everything into a common application tier, or poll the database to update your cache.

Store a database table in memory in a C# website application?

I have noticed that our web application queries a particular table an enormous amount of times. The table is relatively small, with only about a hundred rows that are used.
I'm wondering if there is a way to store this table once every 15 minutes or so in memory in the website application, so the system doesn't have to make so many queries to get the same information over and over again. This would be available across many different users.
The table is the Client table, so users login from many different clients. The data is pretty static, probably getting updated perhaps once a day.
Update: SQL Profiler shows that the query is run quite a bit, so that's what concerns me. The website is not notably slow; I just thought this could help make it even faster.
If the table is small and frequently queried, there is an outstanding chance that the data and any indices are entirely in SQL Server's memory, that the query plan is cached, and that the query will be extremely fast.
Measure the actual performance impact before making any changes.
If you see there is a performance impact, there are many caching strategies that you can use to reduce trips to the database. More information about access patterns to the table and the need for information consistency would be needed to recommend a specific caching strategy.
You state
to get the same information over and over again
but also state
once every 15 minutes
If the information really is the same over and over, you can load it once into the ASP.Net cache at application start. If it might change every so often, but it is OK for the data to be a little out-of-date for a given user, you can use a time-based cache expiration policy. If the data changes only every so often but must be up-to-date immediately after it changes, you can consider a SQL Dependency for cache expiration.
For more information on ASP.Net caching see
http://msdn.microsoft.com/en-us/library/xsbfdd8c(v=vs.100).aspx
and specifically
http://msdn.microsoft.com/en-us/library/6hbbsfk6(v=vs.100).aspx
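As a rough sketch of the SQL Dependency option (table, column and key names are placeholders; it assumes SqlDependency.Start(connectionString) has been called at application startup and that query notifications / Service Broker are enabled on the database):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class ClientCache
{
    // Returns the cached Client table; the cache entry is evicted automatically
    // when rows in dbo.Clients change, so the next call reloads fresh data.
    public static DataTable GetClients(string connectionString)
    {
        var clients = (DataTable)HttpRuntime.Cache["Clients"];
        if (clients == null)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT ClientId, Name FROM dbo.Clients", connection))
            {
                // The dependency must be created before the command executes.
                var dependency = new SqlCacheDependency(command);

                clients = new DataTable();
                new SqlDataAdapter(command).Fill(clients);

                HttpRuntime.Cache.Insert("Clients", clients, dependency);
            }
        }
        return clients;
    }
}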
My suggestion would be to create a WCF Windows service - using REST you could easily cache the results of the SqlDataReader (or other DataReader) and implement a TTL metric to re-query at an interval.
Well, there are a few solutions.
If you want to load the data into memory every 15 minutes, you should use one of the .NET caching libraries, for example System.Runtime.Caching, where you can set expiration policies and other options.
You could try to optimize your query with nonclustered indexes.
You could use AppFabric caching, or something similar.
And last, try adding more memory to the SQL Server machine.
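A rough sketch of the first option using System.Runtime.Caching (the key and loader delegate are placeholders):

using System;
using System.Runtime.Caching;

public static class ClientTableCache
{
    // Returns the cached value, reloading from the database at most every 15 minutes.
    public static T Get<T>(string key, Func<T> loadFromDatabase) where T : class
    {
        var cache = MemoryCache.Default;
        var value = cache.Get(key) as T;
        if (value == null)
        {
            value = loadFromDatabase();
            cache.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(15));
        }
        return value;
    }
}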

Best Practice for Web App Wizard -- Session Variables or In-Line SQL?

I am working on a web application page which consists of a form that is three pages long (I'm constrained to keep it that way for the users, even though it is not the most efficient way of doing it).
My question is: What is the best practice for keeping track of the information from one page of the form to the next? Generally, we store everything in session variables until the last form where we make a stored procedure call or in-line SQL to update the database with the results of the form. The other option would be to use in-line SQL page-by-page to store the data before going from one page to the next.
TL;DR - session variable storage of data and SQL after 3 pages, or in-line SQL at each page?
Thanks!
I would suggest saving the data entered on each page in a database. The data could be saved into one (temporary) table, keyed by session ID. When the user clicks the "Finish" or "Submit" button, the data is "activated" by copying it from the temporary table into the normalised tables.
However, this solution requires you to deal with dead sessions that never end up being copied to their final place. Therefore, a clean-up task needs to be set up. This could be a SQL Server Agent job, or any SQL query that checks the last clean-up time and performs the clean-up if the configured interval has been reached.
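The clean-up itself can be a simple delete of stale rows, run on a schedule (table, column and connection-string names below are placeholders):

using System;
using System.Data.SqlClient;

public static class WizardCleanup
{
    // Deletes abandoned wizard rows older than 24 hours.
    public static void Run(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            @"DELETE FROM dbo.WizardTemp
              WHERE LastUpdatedUtc < DATEADD(HOUR, -24, GETUTCDATE())", connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}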
Storing everything in session is not a good way, especially if it holds larger data sets or there are many concurrent users. Session state consumes server resources - memory for in-process sessions, or serialization and I/O for out-of-process stores - which makes it slower and less scalable than an RDBMS.

DAL, Session, Cache architecture

I'm attempting to create a Data Access Layer for my web application. Currently, all datatables are stored in the session. When I am finished, the DAL will populate and return datatables. Is it a good idea to store the returned datatables in the session? A distributed/shared cache? Or just ping the database each time? Note: generally the number of rows in a datatable will be small (< 2000).
Additional info:
Almost none of the data is shared. The parameters that are sent to the SQL queries are chosen by the user. The parameter values available to the user are based on who the user is. In most cases it is impossible for two users to run the same sql queries. However, the same user can run the same query more than once.
More info:
Number of concurrent users ~50,000
Important info:
In 99% of the cases no two users will have the same data/queries, however, the same user may run the same query/get the same data multiple times.
Thanks
Storing the data in session is not a good idea because:
Every user gets a separate copy of the same data - enormous waste of server memory.
IIS will recycle a session if you fill it with too much data.
I recommend storing the data tables in Cache, and also populating each table only when first requested rather than all at once. That way, if IIS starts reclaiming space in the cache, your code won't be affected.
Very simple example of fetching on demand:
T GetCached<T>(string cacheKey, Func<T> getDirect)
{
    object value = HttpContext.Current.Cache[cacheKey];
    if (value == null)
    {
        value = getDirect();
        HttpContext.Current.Cache.Insert(cacheKey, value);
    }
    return (T)value;
}
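Used, for example, like this (the key and loader are placeholders):

var clients = GetCached("Clients:" + userId,
    () => LoadClientsFromDatabase(userId));   // the delegate only runs on a cache miss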
EDIT: - Question Update
Cache vs local Session - Local session state is all-or-nothing. If it gets too full, IIS will recycle everything in it. By contrast, cache items are dropped individually when memory gets too low, so it's much less of a problem.
Cache vs Session state server - I don't have any data to back this up, so please say so if I've got this wrong, but I would have thought that caching the data independently in memory in each physical server AppDomain would scale better than storing it in a shared session state service.
The first thing I would say is: cache is not mandatory everywhere. You should use it wisely, especially on bottlenecks related to data access.
I don't think it's a good idea to store 1000 different datatables with 2000 records anywhere. If queries are so dynamic that repeating the same query within a short period of time is the exception, then caching doesn't seem like a good option.
And in relation to a distributed cache option, I suggest you check http://memcached.org, a distributed cache used by many big projects around the world.
I know Velocity is near, but as far as I know it needs Windows Server 2008 and it is still very, very new. Normally Microsoft products are good from version 2.0 :-)
Store lookups/dictionaries - and items that your app requires very frequently - in the Application or Cache object; query the database for data that depends upon the user role.
--EDIT--
This is in response to your comment.
Usually, in any data-oriented system, the queries revolve around the fact tables (or tables that you inevitably have to query); assuming you do have such a set of unavoidable tables, you can use Cache.Insert() (a rough sketch follows the list):
Load the unavoidable tables on app startup;
Load the most-queried tables into the Cache the first time each one is requested;
Query the database for the least-queried tables.
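A sketch of the first point, preloading the unavoidable lookup tables when the application starts (LookupRepository and the keys are placeholders):

// In Global.asax.cs
protected void Application_Start(object sender, EventArgs e)
{
    // Loaded once per AppDomain and shared by all users.
    Application["Countries"] = LookupRepository.LoadCountries();

    // Use the Cache instead if the data should be able to expire or be evicted.
    HttpRuntime.Cache.Insert("Roles", LookupRepository.LoadRoles());
}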
If you do not have any performance issues then let SQL handle everything.
Storing that amount of data in the Session is a very bad idea. Each user will get their own version!
If this is shared data (same for all users), consider moving it to the Application object.
