Good morning,
I am working on a new MVC4 app which consists of a huge search form (more than 50 parameters).
After the form is submitted, a complex query is built and I obtain an IList of a view model:
IList<ResultViewModel> results = session
    .CreateSQLQuery(ComplexQuery)
    .SetResultTransformer(Transformers.AliasToBean<ResultViewModel>())
    .List<ResultViewModel>();
InterfaceVM.QueryResults = results;
InterfaceVM is then the model used by my output views to display grid results and Leaflet maps.
Everything works fine except that I can get result sets of up to 1 million records. I need to implement paging, using PagedList.Mvc for example, but without having to pass the search parameters in the URL. I would like to avoid rebuilding the query and the IList object again and again.
InterfaceVM.QueryResults = results.ToPagedList(pageNumber, 20);
Moreover, this IList of results will also be used several times in my output views to dynamically generate complex GIS outputs.
I have spent several hours reading around the web for the best strategy to keep my result object alive after the form has been submitted, so that I can manipulate it easily and avoid rebuilding it for each paging/mapping request.
I have read about temp tables in SQL Server, the Session object, MemoryCache, etc., but I don't know which would best fit my situation.
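For what it's worth, here is the kind of thing I have in mind with MemoryCache (the cache key scheme and searchId are just illustrations, not working code):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;
using PagedList;

// Run the expensive query once, then cache the materialized list so
// paging/mapping requests don't need the 50 search parameters again.
string cacheKey = "SearchResults_" + searchId; // searchId: hypothetical per-search token
ObjectCache cache = MemoryCache.Default;

IList<ResultViewModel> results = cache.Get(cacheKey) as IList<ResultViewModel>;
if (results == null)
{
    results = session.CreateSQLQuery(ComplexQuery)
        .SetResultTransformer(Transformers.AliasToBean<ResultViewModel>())
        .List<ResultViewModel>();

    // Keep the list alive for 20 minutes of inactivity, then let it expire.
    cache.Set(cacheKey, results,
        new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(20) });
}

// Subsequent requests only need a pageNumber, not the whole form.
InterfaceVM.QueryResults = results.ToPagedList(pageNumber, 20);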
Your input on this would be really helpful. I am using Fluent NHibernate, MVC4 and SQL Server 2008 R2.
Thanks in advance for your help
Sylvain
Related
I have some SQL Server stored procs that generate statistical data for charting in a C# web application.
Right now the user in the web app has to wait about 5 minutes to see these charts with updated data, which is a pain in the neck for the user and for me.
Some of the stored procs take more than 5 minutes to generate the data, but the web user doesn't need to see the info on the fly. Updating the chart every 2-3 hours would be fine.
So I don't know the best practice to solve this.
I was thinking of creating a Windows service that calls the SPs every 2-3 hours and then stores the data in different tables, roughly like the sketch below.
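The rough shape of what I have in mind (the proc name and connection details are made up):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

// In the service's OnStart: keep the timer in a field so it isn't
// garbage collected, and fire every 2 hours starting immediately.
_refreshTimer = new Timer(_ =>
{
    using (var conn = new SqlConnection(_connectionString))
    using (var cmd = new SqlCommand("dbo.RebuildChartStats", conn)) // hypothetical proc
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandTimeout = 600; // some of the procs run for several minutes
        conn.Open();
        cmd.ExecuteNonQuery(); // the proc writes into the pre-aggregated tables
    }
}, null, TimeSpan.Zero, TimeSpan.FromHours(2));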
Any clue on how to deal with this?
Appreciate the help
As I said in the comments, indexed views (kind of like materialized views) can increase the performance of certain common queries without having to create temporary tables and things like that.
The benefits of indexed views are performance and that they don't require much extra coding or effort. When you create an indexed view, as opposed to a temp table, the query optimizer will (should) know when to take advantage of the view, without the end user needing to reference a temp or aggregate table explicitly.
Examples of the benefits of indexed views and how to implement them can be found here: http://msdn.microsoft.com/en-us/library/dd171921(v=sql.100).aspx
Here are some links on indexed views. As the comments said, indexed views let you get information quickly rather than always running a full select through a stored proc. Read the second link for a very good explanation of views.
MSDN
http://msdn.microsoft.com/en-ca/library/ms187864%28v=sql.105%29.aspx
Very well explained here
http://www.codeproject.com/Articles/199058/SQL-Server-Indexed-Views-Speed-Up-Your-Select-Quer
I am having an issue with an internal site during testing (50+ users).
The pages work fine with 1 or 2 users, but when a bunch of people hit the site I get errors from a lot of my data bindings: "System.Web.HttpException: DataBinding: 'System.Data.DataRowView' does not contain a property with the name ".
All these property names exist in the results I return from the database, but for some reason the error happens with many users at the same time.
I am using ASP.NET 4.0 and WCF.
The pages use data repeaters to bind data. I also checked the database, and the responses from the database server are good, no issues there, so it's purely an application problem.
Any help is much appreciated.
It seems there is a performance issue.
You can:
1. Use a simpler data source.
2. Use output caching or partial caching.
3. Use data caching in your business logic layer (e.g. the ASP.NET internal cache or the Application Block Cache Helper); see the sketch below.
4. Review the SQL generated by your ORM (e.g. Entity Framework) and optimize it.
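For point 3, a minimal sketch using the built-in ASP.NET cache (the key and the loader method are illustrative):

using System;
using System.Data;
using System.Web;
using System.Web.Caching;

// Business logic layer: serve the DataTable from cache when possible,
// so concurrent page loads don't all hit the database at once.
public static DataTable GetReportData()
{
    const string cacheKey = "ReportData"; // hypothetical key
    DataTable table = HttpRuntime.Cache[cacheKey] as DataTable;
    if (table == null)
    {
        table = LoadReportDataFromDatabase(); // hypothetical: your existing WCF/DB call
        HttpRuntime.Cache.Insert(cacheKey, table, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
    }
    return table;
}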
Use a really basic data source, like a SqlDataReader. It seems to me that you have some resource issues. Also make sure you close all your connections.
I have a scenario and I want to know best practices to reduce database hits. The scenario is that I have a dictionary table in my application where I put all the words/keywords for translation purposes, because my system is multilingual.
Keywords are placed all over the page, 10 to 20 of them per page, and for each word the translation is fetched from the database if the user is not viewing the English version of the website.
My application is on ASP.NET MVC 2 with C# and LINQ2SQL.
Caching is a good way to reduce database queries. There are 2 levels of cache you could use:
Cache objects (for example results of database queries)
Cache HTML output of entire controller actions or partials
Translations typically don't change very often and the amount of data is limited. Read all translated strings when the web app starts and put them in a globally accessible Dictionary. Whenever you need a translated string, look it up in the dictionary.
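A minimal sketch of that startup load, assuming a LINQ2SQL DataContext with a Translations table (all names here are made up):

using System.Collections.Generic;
using System.Linq;

public static class TranslationCache
{
    // Loaded once at startup; reads afterwards never touch the database.
    private static Dictionary<string, string> _translations;

    // Call from Application_Start in Global.asax.
    public static void Load()
    {
        using (var db = new AppDataContext()) // hypothetical LINQ2SQL context
        {
            _translations = db.Translations
                .ToDictionary(t => t.LanguageCode + ":" + t.Keyword,
                              t => t.TranslatedText);
        }
    }

    public static string Translate(string languageCode, string keyword)
    {
        string value;
        return _translations.TryGetValue(languageCode + ":" + keyword, out value)
            ? value
            : keyword; // fall back to the keyword itself
    }
}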
LINQ queries are lazily executed, which means a query won't hit the database until you access the results it returns, so make sure you avoid accessing them before they are really needed.
You could also try to combine LINQ queries into one, and have a look at your loops to make sure there isn't a better way to cycle through your queries.
You should also be able to remove database access altogether and use translation files in XML rather than a database; see the sketch below.
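A minimal sketch of the XML-file approach (the file layout is hypothetical):

using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

// translations.fr.xml (example layout):
// <translations>
//   <entry keyword="Search" text="Rechercher" />
// </translations>
public static Dictionary<string, string> LoadTranslations(string path)
{
    return XDocument.Load(path)
        .Descendants("entry")
        .ToDictionary(e => (string)e.Attribute("keyword"),
                      e => (string)e.Attribute("text"));
}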
Before you can do things like caching and lazy loading, etc., it's best to figure out WHAT is going wrong.
Enter LinqToSql Profiler. Yes, it's a commercial product, but it's worth it. Also, it has a demo period.
It can show you the crap-performing queries, and which queries are doing N+1 selects, etc.
I'm facing a potential performance problem using the XtraReports tool with a web service, in a Windows Forms app.
I know XtraReport loads a large data set (I understand a large data set as 10,000+ rows) by loading the first pages and then continuing to load the rest of the pages in the background, but all of this is done with a data source in hand. So what happens if this data source has to pass through a web service, which will need to serialize the data in order to send it to the client?
The scenario is the following:
I have a thin client in Windows Forms that makes calls to a web service, which takes that call and by reflection instantiates the corresponding class and calls the required method (please note that this architecture is inherited; I have almost no choice on this, I have to use it). So I'll have a class that gets the data from the database and sends it to the client through the web service interface. This data can be a DataSet, SqlDataReader (also note that we're using SQL Server 2000, but it could be 2008 by the end of the year), DataTable, XML, etc.
If the result data set is large, the serialization + transfer time can be considerable, and then rendering the report can add some more time, degrading the overall performance.
I know there is the possibility of using something like video streaming, but for streaming data through a web service; however, I have no leads on how to try something along those lines.
What do you think about this? Please let me know any questions you may have or if I need to write more info for better statement of the problem.
Thanks!
I'll give you an answer you probably don't want to hear.
Set expectations.
Reports are typically slow because they have to churn through a lot of data. There just isn't a good way to get around it. But barring that, I'd do the following:
Serialize the data to a binary representation, convert it to something transferable via SOAP (Base64, for instance) and transfer that. This way you'll avoid a lot of useless angle brackets; see the sketch after this list.
Precache as much data on the client as possible.
Focus on the perceived performance of the application. For instance, throw the report data gathering onto a background thread, so the user can go and do other work, then show the user a notification when the report is available.
Sometimes, it is possible to generate the report for the most often used criteria ahead of time and provide that when asked.
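A minimal sketch of the binary + Base64 idea, assuming the payload is a DataSet (BinaryFormatter and DataSet.RemotingFormat are standard .NET 2.0+; the method names are illustrative):

using System;
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// Server side: pack the DataSet as binary, then Base64 for the SOAP payload.
public static string PackDataSet(DataSet data)
{
    data.RemotingFormat = SerializationFormat.Binary; // much smaller than the XML form
    using (var stream = new MemoryStream())
    {
        new BinaryFormatter().Serialize(stream, data);
        return Convert.ToBase64String(stream.ToArray());
    }
}

// Client side: reverse the process before handing the data to the report.
public static DataSet UnpackDataSet(string payload)
{
    using (var stream = new MemoryStream(Convert.FromBase64String(payload)))
    {
        return (DataSet)new BinaryFormatter().Deserialize(stream);
    }
}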
Is there a way you can partition the data, i.e. return a small subset of it?
If there's no absolute requirement to return all the data at the same time, then dividing the data into smaller pieces will make a huge difference when reading from the database, serializing, and transporting through the web service.
Edit: I had deleted this answer since you already mentioned partitioning. I'm undeleting it now; maybe it will serve some use as the starting point of a discussion...
As for your point about how to work with paging: I think you're on the right track with the "Show next 100 results" approach. I don't know how XtraReport works, or what its requirements for a data source are. There are 3 issues I see:
Server support for partitioned data (e.g. your web service should support returning just "page 3"'s data); see the sketch after this list
Summary data - does your report have a row of totals or averages? Are those calculated by the XtraReport control? Does it require a complete dataset to display those results? Can you provide the summaries to the control on your own (and find a more efficient way to calculate them without returning the entire data set)?
XtraReport support for a datasource that uses paging.
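For the first point, the service method might look something like this (a sketch only; the method and type names are hypothetical, and the helpers stand in for your data access code):

using System.Data;
using System.Web.Services;

// Return one page of rows plus the total count, so the client can build
// paging controls without pulling the whole data set across the wire.
[WebMethod]
public PagedResult GetReportPage(int pageNumber, int pageSize)
{
    var result = new PagedResult();
    result.TotalRows = CountReportRows();               // hypothetical: a cheap COUNT(*) query
    result.Rows = LoadReportRows(pageNumber, pageSize); // hypothetical: TOP-based paging (SQL 2000 has no ROW_NUMBER)
    return result;
}

public class PagedResult
{
    public int TotalRows;
    public DataTable Rows;
}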
Transferring DataSets is a bad idea. A DataSet carries a lot of useless data. Use simple objects, and use an ORM on the server side.
Also, you can precalculate some data. Reference data can be cached on the client and then joined with server data.
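For instance, a flat DTO like this (a hypothetical shape) serializes to a fraction of the size of the equivalent DataSet:

using System;

// A flat, serializable object: no schema, no change tracking,
// just the fields the report actually needs.
[Serializable]
public class ReportRowDto
{
    public int Id;
    public string Name;
    public decimal Amount;
    public DateTime CreatedOn;
}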
I am using ASP.NET MVC as a basic web service for several search-engine-type ASP/jQuery sites. Database searching is straightforward:
Model - Sql Server FullText Sproc returning XML
View - N/a
Controller - Authorise user/ Parse input return Content (Model.XML)
The returned XML contains four result sets - item list, category breakdown, related/ads items & paging numbers. The item, category & related lists all comprise several elements and attributes.
I am now looking for the best method to display the same info in an MVC view - both full views and partials for jQuery use - but am struggling to find the best solution. The only two I have come up with so far are to parse the XML using LINQ (should this be done in the view or the controller?) or to have a sproc return result sets and use the NextResult method to fill multiple lists (not that I've worked out how to do that yet...).
All suggestions appreciated, thanks!
Decided to run with XDocument (LINQ to XML), with the PasteXmlAsXLinq Visual Studio add-in taking the pain out of it!
I'll leave the ORM until ASP.NET MVC RTMs. By then I should have a better idea (Entity/LINQ & view engine) of the most suitable route for a full MVC 'conversion'.
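The parsing ends up roughly this shape (the element and class names below are placeholders, not my actual schema):

using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

public class Item { public int Id; public string Title; }

// Pull one of the four result sets out of the returned XML.
XDocument doc = XDocument.Parse(Model.XML);
List<Item> items = doc.Descendants("item")
    .Select(x => new Item
    {
        Id = (int)x.Attribute("id"),
        Title = (string)x.Attribute("title")
    })
    .ToList();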
For parsing the XML - how about using XmlSerializer to deserialize it into an object graph?
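Something along these lines (the element names are guesses at your schema):

using System.IO;
using System.Xml.Serialization;

[XmlRoot("results")]
public class SearchResults
{
    [XmlElement("item")]
    public Item[] Items;
}

public class Item
{
    [XmlAttribute("id")]
    public int Id;

    [XmlAttribute("title")]
    public string Title;
}

// Deserialize the service's XML straight into typed objects.
var serializer = new XmlSerializer(typeof(SearchResults));
SearchResults results;
using (var reader = new StringReader(xml))
{
    results = (SearchResults)serializer.Deserialize(reader);
}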
Does it need to return XML? How about returning regular grids that you can read over directly? For example, one option is to use an ORM such as a LINQ-to-SQL DataContext - then you can just execute the generated method, which gives you the data as objects.
Alternatively - perhaps render the XML via an XSL transform? See the sketch below.
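A minimal sketch of the transform side (the stylesheet path is a placeholder):

using System.IO;
using System.Xml;
using System.Xml.Xsl;

// Compile the stylesheet once (it's expensive) and reuse it per request.
var transform = new XslCompiledTransform();
transform.Load(Server.MapPath("~/Xslt/Results.xslt")); // hypothetical stylesheet

// Render the service's XML result to HTML for the view.
string html;
using (var writer = new StringWriter())
{
    transform.Transform(XmlReader.Create(new StringReader(xml)), null, writer);
    html = writer.ToString();
}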