How to manage data access / preloading efficiently using web services in C#?

Ok, this is a very "generic" question. We currently have a SQL Server database for which we need to develop an ASP.NET application, with all the business logic contained in C# web services.
The thing is that, architecturally speaking, I'm not sure how to design the web service and the data management. There are many things to consider:
We need very rapid access to the data. Right now, we have over a million "sales" and "purchases" records from which we often need to calculate and load the current stock for a given day according to a series of parameters. I'm not sure how we should preload the data and keep it in the web service. Doing the stock calculation within a SQL query would be very lengthy. They currently have a stock calculation application that preloads all sales and purchases for the day and afterwards calculates the stock on the code side.
We want to develop powerful reporting tools. We want to implement a "pivot table", but we're not sure how to implement it with good performance.
For the reasons above, I'm not sure how to design the data model.
How would you manage the display of the current stock, considering that "stock" is actually purchases - sales and that you have to consider all rows to calculate it? Would you cache "stock" data in the database to optimize performance, even though it's redundant data?
Can anybody give me guidelines on how to start, or share their personal experiences (what have you done in the past)?
I'm not sure if it's possible to make a bounty even though the question is new (I'd put 300 rep on it, since I really need something). If you know how, let me know.
Thanks

The first suggestion is to not use legacy ASMX web services. Use WCF, which Microsoft says should be used for all new web service development.
Second, are you sure you can't optimize the database, or else place it on faster hardware, or nearer to the web server?
I don't know that you're going to get that much data in memory at once. If you could, then you could use a DataSet and use LINQ to DataSets for queries against it.
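If the day's rows do fit in memory, a minimal sketch of that idea could look like the following. This is only an illustration under my own assumptions: the "Purchases"/"Sales" table names and the "ItemId"/"Quantity" columns are invented, and stock is computed as purchases minus sales, as the question describes (AsEnumerable() requires a reference to System.Data.DataSetExtensions).

    using System;
    using System.Data;
    using System.Linq;

    static class StockCalculator
    {
        // Stock = purchases - sales, computed over the preloaded rows for the day.
        public static decimal CurrentStock(DataSet day, int itemId)
        {
            decimal purchased = day.Tables["Purchases"].AsEnumerable()
                .Where(r => r.Field<int>("ItemId") == itemId)
                .Sum(r => r.Field<decimal>("Quantity"));

            decimal sold = day.Tables["Sales"].AsEnumerable()
                .Where(r => r.Field<int>("ItemId") == itemId)
                .Sum(r => r.Field<decimal>("Quantity"));

            return purchased - sold;
        }
    }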

I hope I'm misunderstanding what you wrote, but if by
contain all the business logic in C# Web Services
you mean something like this then you're already headed in the anti-pattern direction. Accessing your data from an ASP.NET application over web-services would just cause you to incur the serialization/deserialization penalty for pretty much no gain.
A better approach would be to organize the services you want to make available into a common layer that your applications are built on, access that layer directly from your ASP.NET application, and perhaps also expose it as web services so that external sources can consume the data.
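A minimal sketch of that layering, with purely illustrative names (none of these types come from the question): the ASP.NET application references the shared layer directly, while the WCF contract just wraps it for external consumers.

    using System;
    using System.ServiceModel;

    // Shared service layer, referenced directly by the ASP.NET application.
    public interface IStockService
    {
        decimal GetCurrentStock(int itemId, DateTime day);
    }

    public class StockService : IStockService
    {
        public decimal GetCurrentStock(int itemId, DateTime day)
        {
            // ...load purchases/sales for the day and compute the stock here...
            return 0m;
        }
    }

    // Thin WCF facade exposed only to external consumers.
    [ServiceContract]
    public interface IStockWebService
    {
        [OperationContract]
        decimal GetCurrentStock(int itemId, DateTime day);
    }

    public class StockWebService : IStockWebService
    {
        private readonly IStockService _inner = new StockService();

        public decimal GetCurrentStock(int itemId, DateTime day)
        {
            return _inner.GetCurrentStock(itemId, day);
        }
    }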
You could also look into exposing data that is expensive to compute through a data warehouse that is updated at regular intervals (once or a couple of times per day). This would give you better read performance, as long as you're willing to accept slightly stale data.
Is that the kind of information you're looking for?

Related

Retrieve huge data from rest api

I am calling a third-party REST API. This REST API returns more than 2,000 records, which takes a long time to retrieve. I want to show these records on an ASP.NET page.
The records display on the page successfully, but the problem is that it takes a long time to retrieve and display the information.
I call this API on a dropdown change, so it may be called again and again; it is not a one-time call. I also can't cache the information, because the data on the server side may change.
So how do I reduce the download time, or is there any trick (paging is not supported)? Any suggestions?
Thanks in advance
Well, there's always a workaround for any given problem, even when it doesn't come to mind at first sight.
I had one situation like this, and this is what we did:
First, keep in mind that real-time consistency is just something that cannot be achieved without writing extensive code. For example, if a user requests data from your service and that data is then rendered on the page, the user could potentially be seeing out-of-date data if the source is updated at that precise moment.
So we need to accept that the data will only be eventually consistent.
Then consider the following: periodically replicate the data into your own database using an external service or scheduled job.
Expose this local data with paging support and consume it from your client code. A rough sketch of both pieces follows below.
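Purely as an illustration of those two steps, under my own assumptions (the Record type, FetchAllFromThirdParty, and SaveToLocalDatabase are placeholders, not real APIs):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading;

    public class Record
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ReplicationJob
    {
        private readonly Timer _timer;

        public ReplicationJob(TimeSpan interval)
        {
            // Refresh the local copy immediately, then on every interval.
            _timer = new Timer(_ => Refresh(), null, TimeSpan.Zero, interval);
        }

        private void Refresh()
        {
            // The slow third-party call happens here, in the background,
            // not while a user is waiting for the page.
            IEnumerable<Record> records = FetchAllFromThirdParty();
            SaveToLocalDatabase(records);
        }

        private IEnumerable<Record> FetchAllFromThirdParty()
        {
            return Enumerable.Empty<Record>(); // placeholder for the real HTTP call
        }

        private void SaveToLocalDatabase(IEnumerable<Record> records)
        {
            // placeholder: upsert the rows into a local table that supports paging
        }
    }

    public static class LocalRecordStore
    {
        // Paged read against the local copy; fast because it never touches the remote API.
        public static IList<Record> GetPage(IQueryable<Record> localRecords, int page, int pageSize)
        {
            return localRecords
                .OrderBy(r => r.Id)
                .Skip((page - 1) * pageSize)
                .Take(pageSize)
                .ToList();
        }
    }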
As with any design decision, there are trade-offs that you should evaluate before making your final choice. These were the problems we had with a design like this:
Potentially, the data will not be consistent when the user requests it. You can poll the server every X minutes, depending on how often the data changes. (Note: usually we as developers are not happy with this kind of data inconsistency; we would like to see everything in real time. But the truth is that end users can, most of the time, live with this kind of latency, because that's the way they are used to working in real life. Consult your end users to gather their point of view on this. You can read more about this from Eric Evans in the Blue Book.)
You have to write extra code to maintain the data locally
Sadly, when you are dealing with a third-party service under these conditions, there are few options to choose from. I recommend evaluating them all with your end users, explaining the trade-offs, to choose the solution that best fits your needs.
Alternatives:
Since, as I already said, the data will not be consistent all the time, perhaps you could consider caching the call to the service =)
Usually the easiest way is the best way to face something like this; since you didn't mention it in your post, I will ask here: have you tried simply requesting that the third-party organization support paging? Get in touch with your third-party partner and talk about this.
If you can, evaluate another source (other third party companies providing a similar service)
So how do I reduce the download time, or is there any trick (paging is not supported)? Any suggestions?
If the API doesn't support paging, there's not much you can do from a consumer perspective other than caching the data to avoid the expensive HTTP call.
There's no miracle. There's no client.MaxDownloadTimeMs property that you could set to 5 for example and get your huge data in less than 5 milliseconds.
Your only option would be to cache the data for some amount of time. If the third-party API has a call you can make that is more generic than your dropdown change, you could call it before the page loads to get all possible dropdown values and store the results for a certain period of time. Then just return the appropriate data each time; a sketch of that idea follows below.
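For example, something along these lines, assuming System.Runtime.Caching (.NET 4+) is available; CallThirdPartyApi and the ten-minute expiry are placeholders I made up, not anything from the question:

    using System;
    using System.Runtime.Caching;

    public static class ApiCache
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;

        public static string GetRecords(string dropdownValue)
        {
            string key = "thirdPartyApi:" + dropdownValue;

            string cached = Cache.Get(key) as string;
            if (cached != null)
                return cached; // served from memory, no HTTP call

            string result = CallThirdPartyApi(dropdownValue); // the slow call
            Cache.Set(key, result, DateTimeOffset.Now.AddMinutes(10)); // accept 10 minutes of staleness
            return result;
        }

        private static string CallThirdPartyApi(string value)
        {
            // placeholder for the real HTTP request
            return "[]";
        }
    }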
Also, look to see whether the third-party API has any option for gzipping (compressing) the data before sending it down to you. I have created several APIs and have offered this kind of option for large datasets.
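If the API does support gzip and you are on .NET 4.5 or later, here is a small sketch of the consumer side: the handler advertises gzip/deflate support and decompresses the response transparently (the URL is obviously a placeholder).

    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class CompressedClient
    {
        public static async Task<string> GetAsync(string url)
        {
            var handler = new HttpClientHandler
            {
                // Sends "Accept-Encoding: gzip, deflate" and inflates the response for you.
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };

            using (var client = new HttpClient(handler))
            {
                return await client.GetStringAsync(url);
            }
        }
    }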

What is the best way in ASP.NET MVC / SQL Server to store expensive calculated data and serve it up fast, like Stack Overflow?

I have a requirement similar to Stack Overflow: to show a number of metrics on a page in my ASP.NET MVC site that are very expensive to calculate. Stack Overflow has a lot of metrics on the page (like user accept rate, etc.) which clearly are not calculated on the fly per page request, given that that would be too slow.
What is a recommended practice for serving up calculated data really fast without the performance penalty (assuming we can accept that this data may be a little out of date)?
Is this stored in some caching layer, or in some other "results" database table, with a daily job that calculates the data and stores the results so they can be queried directly?
Assuming that I am happy to deal with the delay of having this data as a snapshot, what is the best solution for this type of problem?
They are probably relying on the Redis data store for such calculations and caching. This post from marcgravell may help.
Yes, the answer is caching; how you do it is (or can be) the complicated part. If you are using NHibernate, adding caching is really easy: it is part of your configuration, and on the queries you just add .Cacheable() and it manages it for you. Caching also depends on the type of environment: if you're using a single worker process, a web farm, or a web garden, you would have to build a caching layer to accommodate your scenario.
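A minimal sketch of what that looks like, assuming the NHibernate second-level cache and a cache provider are already configured; the UserMetric entity and its properties are illustrative only:

    using System.Linq;
    using NHibernate;
    using NHibernate.Linq;

    public class MetricsRepository
    {
        private readonly ISession _session;

        public MetricsRepository(ISession session)
        {
            _session = session;
        }

        public decimal GetAcceptRate(int userId)
        {
            // .Cacheable() marks the query for the query cache, so repeated calls with
            // the same parameters are served without hitting the database.
            return _session.Query<UserMetric>()
                .Where(m => m.UserId == userId)
                .Select(m => m.AcceptRate)
                .Cacheable()
                .FirstOrDefault();
        }
    }

    public class UserMetric
    {
        public virtual int UserId { get; set; }
        public virtual decimal AcceptRate { get; set; }
    }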
Although this is a somewhat recent technique, one really great way to structure your system to make stuff like this possible is to use Command and Query Responsibility Segregation, more often referred to as CQRS.

Querying Web Services with SQL

I had a bit of a shock recently when thinking about combining a service oriented architecture with a brilliant UI which leverages SQL to optimize performance when querying data.
The DevExpress grid view for ASP.NET, for example, is so cool that it delegates all filtering, sorting and paging logic to the database server. But this presumes that the data is retrieved from a SQL-able database server.
What if I want to introduce a web service layer between the database and UI layers, and to have the UI use the web services to query the data?
How can I design the web services and the UI such that I can pass filtering requests from the UI via the web services to the database?
Do I need to provide a List QueryData(string sqlQuery)-style web service and parse the SQL string on my own to guarantee security/access restrictions?
Or is there any good framework or design guideline that takes this burden from me?
This must be a very common problem, and I am sure that it has been solved relatively adequately already, has it?
I am mainly interested in a .NET/C#-based or -compatible solution.
Edit: I've found OData and Microsoft WCF Data Services. If I got it right, an OData-based application could look as follows:
User ---/Give me Page 1 (records 1..10)/---> ASP.NET Server Control (of course, via HTTP)
ASP.NET Server Control ---/LINQ Query/---> Data service client
Data service client ---/OData Query/---> WCF Data Service
WCF Data Service ---/LINQ Query/---> Entity Framework
Entity Framework ---/SQL Query/---> Database
If I got this right, my DevExpress server control should be able to delegate a filtering request (e.g. give me the top 10 only) through all these layers down to the database which then applies its indexes etc. in order to perform that query.
Is that right?
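Purely to illustrate the client end of that chain (the names and the service URL below are made up): a LINQ query against a WCF Data Services context is translated into an OData URL with $filter/$orderby/$top options, which the service then turns into a SQL query via Entity Framework.

    using System;
    using System.Data.Services.Client;
    using System.Linq;

    public class ProductsClient
    {
        private readonly DataServiceContext _context =
            new DataServiceContext(new Uri("http://example.com/DataService.svc"));

        public Product[] FirstPage()
        {
            // Translates roughly to: /Products?$filter=Price gt 100&$orderby=Name&$top=10
            return _context.CreateQuery<Product>("Products")
                .Where(p => p.Price > 100)
                .OrderBy(p => p.Name)
                .Take(10)
                .ToArray();
        }
    }

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }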
Edit: It is a joy to see this thread coming to life :-) It is hard to decide which answer to accept because they all seem equally good to me...
Really interesting question! I don't think there's a right or wrong answer, but I think you can establish some architectural principles.
Firstly, "Service Oriented Architecture" is an architectural style that requires you to expose business services for consumption by other applications. Running a database query is not a service - in my opinion at least. In fact, providing a web service to execute arbitrary SQL is probably an anti-pattern - you would bypass the security model most database servers provide, you'd have no control over the queries - it's relatively easy to write a syntactically correct "select" query which cripples your database (Cartesian joins are my favourite), and the overhead of the web service protocol would make this approach several times slower than just querying the database through normal access routes - LINQ or whatever.
So, let's assume you accept that point of view - what is the solution to the problem?
Firstly, if you want the productivity of using the DevExpress grid, you probably should work in the way DevExpress want you to work - if that means querying the database directly, that's by far the best way to go. If you want to move to a SOA, and the DevExpress grid doesn't support that, it's time to find a new grid control, rather than tailor your entire enterprise architecture to a relatively minor component.
Secondly - structurally, where should you do your sorting, filtering etc? This is an easy concept in SQL, but rather unpleasant when trying to translate it to a web service specification - you quickly end up with an incomprehensible method signature ("getAccountDataForUser(userID, bool sortByDate, bool sortByValue, bool filterZeros, bool filterTransfers)").
On the other hand, performing the filtering and sorting on the client is messy and slow.
My recommendation would be to look at the Specification Pattern - this allows you to have clean method signatures while still specifying the desired sorting and filtering in a consistent way.
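A rough sketch of what that could look like over a WCF boundary; the specification type and its members are hypothetical, not part of any existing framework:

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // The client describes what it wants in a single specification object instead of
    // passing SQL or an ever-growing list of boolean parameters.
    [DataContract]
    public class AccountQuerySpecification
    {
        [DataMember] public int UserId { get; set; }
        [DataMember] public string SortBy { get; set; }          // e.g. "Date" or "Value"
        [DataMember] public bool ExcludeZeroAmounts { get; set; }
        [DataMember] public bool ExcludeTransfers { get; set; }
        [DataMember] public int PageNumber { get; set; }
        [DataMember] public int PageSize { get; set; }
    }

    [DataContract]
    public class AccountEntry
    {
        [DataMember] public DateTime Date { get; set; }
        [DataMember] public decimal Amount { get; set; }
    }

    [ServiceContract]
    public interface IAccountService
    {
        // One clean method; the server translates the specification into SQL / LINQ.
        [OperationContract]
        List<AccountEntry> GetAccountData(AccountQuerySpecification spec);
    }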
Implementing the List QueryData(string sqlQuery) will open you up to a near infinite number of security problems.
If you need to filter based on security access, then the OData implementation will not be trivial either; you need to set up proper authorization/authentication on the WCF service so that you can further filter the OData query based on the authenticated user's data.
The easiest way to implement server side data operations when the data is retrieved from a WCF service would be to intercept the Grid's sort/filter operations in the code behind, and then call a specialized method on the WCF service based on what the user is doing.
"This must be a very common problem, and I am sure that it has been
solved relatively adequately already, has it?"
Given the number of skinned cats lying around the developer world, I'd have to say no.
WCF Data Services offers the best solution I've found so far, but authentication and authorization there can be tricky. There is a decent post covering the server-side issues around this at http://blogs.msdn.com/b/astoriateam/archive/2010/07/19/odata-and-authentication-part-4-server-side-hooks.aspx. Setting this up isn't easy, but it does work well.

C# Sync two identical DataSets over a Web Service

What is the most effective way to sync two identically structured DataSets using a Web Service?
The design I use at the moment is simple, without a web service. All data is cached on the client, and it only updates data from the MySQL database if an update has been made; this is determined by a timestamp.
If possible I want to keep the same simple design, but add a Web Service in the middle, to allow easier access to the database over our limited VPN.
Best Regards, John
That's one heck of a question, but something I'm doing myself too. Easiest way I guess would be to add a "saved version" property. If it really is a simple design then you could just re-write only the DAL code to get things working for a web service. In fact, if the WSDL is done right, you may only need to make very minor adjustments (especially if the DB was previously designed using EF).
You say you want to sync two datasets simultaneously, but the question I guess is why? And which two? Do you have two web services, or are you wanting to sync data to both the local cache and the online web service (MSSQL db?) simultaneously?
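To illustrate the "saved version"/timestamp idea with a web service in the middle, here is a hedged sketch under my own assumptions (the contract name and the idea of returning a delta DataSet are mine, not from the question):

    using System;
    using System.Data;
    using System.ServiceModel;

    // The client sends the timestamp of its last successful sync and
    // receives only the rows that have changed since then.
    [ServiceContract]
    public interface ISyncService
    {
        [OperationContract]
        DataSet GetChangesSince(DateTime lastSyncUtc);
    }

    // Client side: merge the delta into the locally cached DataSet.
    public class ClientCache
    {
        private readonly DataSet _cache = new DataSet();
        private DateTime _lastSyncUtc = DateTime.MinValue;

        public void Sync(ISyncService service)
        {
            DataSet changes = service.GetChangesSince(_lastSyncUtc);
            _cache.Merge(changes);          // identical schemas on both sides, so Merge applies the delta
            _lastSyncUtc = DateTime.UtcNow; // remember when we last synced
        }
    }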

ADO.NET data services their place in overall design

ADO.NET Data service is the next generation of data access layer within applications. I have seen a lot of examples using it directly from a UI layer such as Silverlight or Ajax to get data. This is almost like having a two-tiered system, with the business layer completely removed. Should the DAL be accessed by the business layer, and not directly from the UI?
ADO.NET Data Services is one more tool to be evaluated in order to move data.
.NET RIA Services is another one. Much better I would say.
I see ADO.NET Data Services as a low-level service to be used by some higher-level framework. I would not let my UI talk directly to it.
The main problem I see with ADO.NET Data Services has more to do with security than with anything else.
For simple/quick tasks, on an intranet, and if you are not too picky about your design, it can be useful (IMO).
It can be quite handy when you need to quickly expose data from an existing database.
I say handy, but it would not be my first choice, as I avoid "quick and dirty" solutions as much as I can.
Those solutions are like ghosts: they always come back to haunt you.
ADO.NET Data service is the next generation of data access layer within applications
I have no idea where you got that from! Perhaps you're confusing ADO.NET Data Services with ADO.NET Entity Framework?
One shouldn't assume that everything Microsoft produces is of value to every developer. In my opinion, ADO.NET Data Services is a quick way to create CRUD services, which maybe have a few other operations defined on the entity, but the operations are all stored procedures. If all you need is a database-oriented service, then this may be what you want. Certainly, there's relatively little reason to do any coding for a service like this, except in the database.
But that doesn't mean that ADO.NET Data Services "has a place in the overall design" of every project. It's something that fills a need of enough customers that Microsoft thought it worthwhile to spend money developing and maintaining it.
For that matter, they also thought ASP.NET MVC was a good idea...
:-)
In my opinion, the other answers underestimate the importance of ADO.NET Data Services. Though using it directly in your application brings some similarity to a two-tiered system, other Microsoft products such as .NET RIA Services and Windows Azure Storage Services are based on it. Contrary to the phrase in one of the answers, "For simple/quick tasks, on an intranet, and if you are not too picky about your design, it can be useful", it may be useful for public websites, including websites in ASP.NET MVC.
Dino Esposito describes the driving force behind ADO.NET Data Services in his blog:
http://weblogs.asp.net/despos/archive/2008/04/21/the-quot-driving-force-quot-pattern-part-1-of-n.aspx
"ADO.NET Data Services (aka, Astoria)
Driving force: the need of building richly interactive Web systems.
What's that in abstract: New set of tools for building a middle-tier or, better yet, the service layer on top of a middle-tier in any sort of application, including enterprise class applications.
What's that in concrete: provides you with URLs to invoke from hyperlinks to bring data down to the client. Better for scenarios where a client needs a direct|partially filtered access to data. Not ideal for querying data from IE, but ideal for building a new generation of Web controls that breath AJAX. And just that."
