I have a project in which we are thinking of optimizing some parts of the program.
It is basically a 3-tier hierarchy, structured roughly as below:
UI - aspx and aspx.cs
Business Logic - Web Services
Data Access - a separate class residing in the web service project.
For example, the practice now is that when we retrieve data, we retrieve a DataSet from the web service and return it to the UI, which handles displaying the data. To make changes to the database, the changed DataSet is passed back to the web service. For a small number of results this does not pose much of an issue, but when the result set is huge, the web service also ends up passing larger XML back and forth, which greatly decreases performance.
So does anyone happen to be using this type of structure and have a better way of processing database results? Right now this practice is used for all CRUD operations, so the idea that came to me is that for create and delete, or maybe even update, we could skip passing the dataset and issue a direct command instead, but I am not sure whether this is a good method.
If by dataset you mean DataSet, then this is something that should never be passed to and from a web service. I would recommend using POCO classes when communicating with the service. A DataSet already does XML serialization/deserialization internally, and when you send it to a web service this serialization/deserialization will occur twice. Another tip is not to resend the entire dataset back to the web service, but only what has changed between the calls, to reduce the network traffic.
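A minimal sketch of what that could look like; the ProductDto class, the ProductService name and both methods are hypothetical, just to illustrate the shape of the contract:

    using System.Collections.Generic;
    using System.Web.Services;

    public class ProductDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    [WebService(Namespace = "http://example.com/")]
    public class ProductService : WebService
    {
        // Return lightweight POCOs instead of a DataSet.
        [WebMethod]
        public List<ProductDto> GetProducts(int maxRows)
        {
            // ...load from the data access layer and map rows to ProductDto...
            return new List<ProductDto>();
        }

        // Accept only the rows that actually changed, not the whole dataset.
        [WebMethod]
        public void SaveChanges(List<ProductDto> changedProducts)
        {
            // ...persist just the changed rows...
        }
    }

The UI then binds the List<ProductDto> to the grid and sends back only the modified items.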
Rule of thumb: in web applications you should exchange only the data between client and server that is actually needed. That is, all filtering etc. should be done server side.
Some ideas: employ paging, avoid datasets, and ask your WS for only the information you intend to show!
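A rough paging sketch, assuming SQL Server 2005 or later (the ProductRow type and the table/column/method names are all invented for illustration):

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public class ProductRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class ProductPaging
    {
        // Only one page of rows ever crosses the wire to the web service caller.
        public static List<ProductRow> GetProductsPage(string connectionString, int pageIndex, int pageSize)
        {
            const string sql = @"
                SELECT Id, Name FROM
                (
                    SELECT Id, Name, ROW_NUMBER() OVER (ORDER BY Id) AS RowNum
                    FROM dbo.Products
                ) AS Paged
                WHERE RowNum BETWEEN (@PageIndex * @PageSize) + 1
                                 AND (@PageIndex + 1) * @PageSize;";

            var results = new List<ProductRow>();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@PageIndex", pageIndex);
                command.Parameters.AddWithValue("@PageSize", pageSize);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        results.Add(new ProductRow { Id = reader.GetInt32(0), Name = reader.GetString(1) });
                    }
                }
            }
            return results;
        }
    }

The web service then exposes something like GetProductsPage(pageIndex, pageSize), and the UI requests one page at a time.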
Related
I had a bit of a shock recently when thinking about combining a service oriented architecture with a brilliant UI which leverages SQL to optimize performance when querying data.
The DevExpress grid view for ASP.NET, for example, is so cool that it delegates all filtering, sorting and paging logic to the database server. But this presumes that the data is retrieved from a SQL-able database server.
What if I want to introduce a web service layer between the database and UI layers, and to have the UI use the web services to query the data?
How can I design the web services and the UI such that I can pass filtering requests from the UI via the web services to the database?
Do I need to provide a List QueryData(string sqlQuery) style web service and parse the SQL string on my own to guarantee security/access restrictions?
Or is there any good framework or design guideline that takes this burden from me?
This must be a very common problem, and I am sure that it has been solved relatively adequately already, hasn't it?
I am mainly interested in a .NET/C#-based or -compatible solution.
Edit: I've found OData and Microsoft WCF Data Services. If I got it right, an OData-based application could look as follows:
User ---/Give me Page 1 (records 1..10)/---> ASP.NET Server Control (of course, via HTTP)
ASP.NET Server Control ---/LINQ Query/---> Data service client
Data service client ---/OData Query/---> WCF Data Service
WCF Data Service ---/LINQ Query/---> Entity Framework
Entity Framework ---/SQL Query/---> Database
If I got this right, my DevExpress server control should be able to delegate a filtering request (e.g. give me the top 10 only) through all these layers down to the database which then applies its indexes etc. in order to perform that query.
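For example, I imagine the client-side query would look roughly like the sketch below (the Product type, the "Products" entity set name and the service URI are placeholders, not anything from a real project):

    using System;
    using System.Data.Services.Client;
    using System.Data.Services.Common;
    using System.Linq;

    [DataServiceKey("Id")]
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Category { get; set; }
    }

    public static class ProductQuery
    {
        public static void LoadFirstPage()
        {
            var context = new DataServiceContext(new Uri("http://example.com/ProductData.svc"));

            // Where/OrderBy/Skip/Take are translated into an OData URL
            // ($filter, $orderby, $skip, $top) and executed by the service,
            // which lets Entity Framework push the work down to SQL.
            var page = context.CreateQuery<Product>("Products")
                              .Where(p => p.Category == "Books")
                              .OrderBy(p => p.Name)
                              .Skip(0)
                              .Take(10)
                              .ToList();
        }
    }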
Is that right?
Edit: It is a joy to see this thread coming to life :-) It is hard to decide on what answer to accept because all seem equally good to me...
Really interesting question! I don't think there's a right or wrong answer, but I think you can establish some architectural principles.
Firstly, "Service Oriented Architecture" is an architectural style that requires you to expose business services for consumption by other applications. Running a database query is not a service - in my opinion at least. In fact, providing a web service to execute arbitrary SQL is probably an anti-pattern: you would bypass the security model most database servers provide, and you'd have no control over the queries - it's relatively easy to write a syntactically correct "select" query which cripples your database (Cartesian joins are my favourite). On top of that, the overhead of the web service protocol would make this approach several times slower than just querying the database through normal access routes - LINQ or whatever.
So, let's assume you accept that point of view - what is the solution to the problem?
Firstly, if you want the productivity of using the DevExpress grid, you probably should work in the way DevExpress want you to work - if that means querying the database directly, that's by far the best way to go. If you want to move to a SOA, and the DevExpress grid doesn't support that, it's time to find a new grid control, rather than tailor your entire enterprise architecture to a relatively minor component.
Secondly - structurally, where should you do your sorting, filtering etc? This is an easy concept in SQL, but rather unpleasant when trying to translate it to a web service specification - you quickly end up with an incomprehensible method signature ("getAccountDataForUser(userID, bool sortByDate, bool sortByValue, bool filterZeros, bool filterTransfers)").
On the other hand, performing the filtering and sorting on the client is messy and slow.
My recommendation would be to look at the Specification Pattern - this allows you to have clean method signatures, but specify the desired sorting and ordering in a consistent way.
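As a rough sketch of a query-specification object in that spirit (all names here are invented for illustration):

    using System;
    using System.Collections.Generic;

    public class AccountEntry
    {
        public DateTime Date { get; set; }
        public decimal Value { get; set; }
        public bool IsTransfer { get; set; }
    }

    // The client builds a specification describing what it wants; the service
    // interprets it, instead of growing an endless list of bool parameters.
    public class AccountQuerySpecification
    {
        public int UserId { get; set; }
        public string SortBy { get; set; }          // e.g. "Date" or "Value"
        public bool SortDescending { get; set; }
        public bool ExcludeZeroValues { get; set; }
        public bool ExcludeTransfers { get; set; }
        public int PageIndex { get; set; }
        public int PageSize { get; set; }
    }

    public interface IAccountService
    {
        // One clean method signature; the specification carries the filtering/sorting intent.
        List<AccountEntry> GetAccountData(AccountQuerySpecification specification);
    }

The service validates the specification and translates it into the appropriate query, so filtering and sorting stay server side.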
Implementing the List QueryData(string sqlQuery) will open you up to a near infinite number of security problems.
If you need to filter based on security access, then the OData implementation will not be trivial either; you need to set up proper authentication/authorization on the WCF service so that you can further filter the OData query based on the authenticated user's data.
The easiest way to implement server side data operations when the data is retrieved from a WCF service would be to intercept the Grid's sort/filter operations in the code behind, and then call a specialized method on the WCF service based on what the user is doing.
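A sketch of what that specialized contract might look like (the ReportRow type, the contract and the operation name are assumptions for illustration, not an actual DevExpress or WCF API):

    using System.Collections.Generic;
    using System.ServiceModel;

    [ServiceContract]
    public interface IReportDataService
    {
        // Specialized operation: the service does the sorting/filtering/paging.
        [OperationContract]
        List<ReportRow> GetRows(string sortColumn, bool descending, string filterText,
                                int pageIndex, int pageSize);
    }

    public class ReportRow
    {
        public int Id { get; set; }
        public string Description { get; set; }
    }

The grid's sort/filter event handler in the code-behind then calls GetRows with the current sort column, filter text and page, and rebinds the result.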
"This must be a very common problem, and I am sure that it has been
solved relatively adequately already, hasn't it?"
Given the number of skinned cats lying around the developer world, I'd have to say no.
WCF Data Services offers the best solution I've found so far, but authentication and authorization can be tricky there. There is a decent post covering the server-side issues around this at http://blogs.msdn.com/b/astoriateam/archive/2010/07/19/odata-and-authentication-part-4-server-side-hooks.aspx. Setting this up isn't easy, but it does work well.
What is the most effective way to sync two identically structured DataSets using a Web Service?
The design I use at the moment is simple and has no Web Service. All data is cached on the client, and it only updates data from the MySQL database if an update has been made; this is determined by a timestamp.
If possible I want to keep the same simple design, but add a Web Service in the middle, to allow easier access to the database over our limited VPN.
Best Regards, John
That's one heck of a question, but something I'm doing myself too. Easiest way I guess would be to add a "saved version" property. If it really is a simple design then you could just re-write only the DAL code to get things working for a web service. In fact, if the WSDL is done right, you may only need to make very minor adjustments (especially if the DB was previously designed using EF).
You say you want to sync two datasets simultaneously, but the question I guess is why? And which two? Do you have two web services, or are you wanting to sync data to both the local cache and the online web service (MSSQL db?) simultaneously?
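Either way, a rough sketch of how the timestamp check could surface through a web method (the SyncService class, the table and column names are made up; this keeps the DataSet-based design, just moving the delta query behind the service):

    using System;
    using System.Data;
    using System.Web.Services;
    using MySql.Data.MySqlClient;

    public class SyncService : WebService
    {
        private const string ConnectionString = "..."; // your MySQL connection string

        // Return only the rows modified after the client's last sync,
        // so the full dataset never has to travel over the VPN.
        [WebMethod]
        public DataSet GetChangesSince(DateTime lastSyncUtc)
        {
            const string sql = "SELECT * FROM Orders WHERE LastModifiedUtc > @LastSyncUtc";

            using (var connection = new MySqlConnection(ConnectionString))
            using (var adapter = new MySqlDataAdapter(sql, connection))
            {
                adapter.SelectCommand.Parameters.AddWithValue("@LastSyncUtc", lastSyncUtc);
                var changes = new DataSet();
                adapter.Fill(changes, "Orders");
                return changes;
            }
        }
    }

The client merges the returned changes into its cached DataSet and records the new sync timestamp.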
What are the pros and cons of the following 2 cases:
Case I:
Traditional way:
Add a service reference to the project, create an object, get the data from the service on the server side, and bind it to an ASP.NET grid.
Case II:
Update the service for JSON behavior, add a service reference to the project, call the service from JavaScript to get the data, and bind it to a jQuery grid.
Which one is the better approach, and why? (Not from a developer's point of view.)
If there is another approach that is more optimized, please explain it, keeping large amounts of data in mind.
It depends on whether end-clients (browsers) are allowed to have access to the WCF data service, or just the app service. For simple security modes, having JSON allows a lot of very simple jQuery etc. scenarios.
Of course, jQuery etc demands a compatible browser; these days that means "most", but by no means "all". So if you want to provide the same data to dumb browsers you'll need a way to get the data at the server.
If the intent is to provide server-to-server (B2B etc) access, JSON is generally a second choice; XML (SOAP etc) would be the de facto standard, but it isn't the only option. For example, if you have high bandwidth needs you might choose a more compact binary transmission format (there are many).
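For reference, the Case II service side could look roughly like this sketch (the contract and type names are invented; it also needs a webHttpBinding endpoint with the webHttp behavior in config):

    using System.Collections.Generic;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    [ServiceContract]
    public interface IProductService
    {
        // Returns JSON that the jQuery grid (or any other client) can consume directly.
        [OperationContract]
        [WebGet(UriTemplate = "products?page={pageIndex}&size={pageSize}",
                ResponseFormat = WebMessageFormat.Json)]
        List<ProductDto> GetProducts(int pageIndex, int pageSize);
    }

    public class ProductDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

The jQuery grid calls the URI directly; a server-to-server consumer could still use the same contract over a SOAP endpoint if you add one.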
The second approach: any client can now consume that data, whether it's a jQuery grid or even an iPhone client.
I'm new to Flex, and I'm wondering about the best practices when it comes to getting data from a database and displaying it in a Flex (Flash) swf. Currently I have some C# code that gets the data from the DB and saves it to an XML file on my site. Then the .swf reads that xml file.
Is that the best way to do it, or is there a better or more standardized way? Thanks
protected void Page_Load(object sender, EventArgs e)
{
    // Fetch the product list and dump it to an XML file that the Flex .swf reads.
    DataTable _dt = ProductList.GetProductssForAdmin(10);
    _dt.TableName = "Products";
    _dt.WriteXml(Server.MapPath("xml/Flex.xml"), false);
}
You really should use WebORB or FluorineFX to achieve this. It sends binary data (AMF) and is much more performant. You can map Flex classes to .NET classes, which makes it all very easy!
If you install WebORB or FluorineFX, you get a load of examples + very clear documentation.
I really wouldn't recommend using XML.
I think the answer really depends on a lot of factors. Although using a three-tiered approach (Database -> Web Service -> Frontend) is standard operating procedure, consider the following:
1) The quantity of data: If your application is using a large amount of data, or a varying subset of a larger database, then using a three-tiered approach is going to be best. However, if it's a small amount of data, then a flat file is a simple and straightforward solution.
2) How often does the data change: If it changes a lot, then the three-tiered approach is best. You might even consider a solution outside of ASP.NET using LiveCycle or BlazeDS if the data is changing very often and you want to push changes to the Flex frontend. However, if the data you are using is updated infrequently, then your approach is, again, simple and straightforward.
3) Security: Using an XML file is a pretty secure solution. It is disconnected from your database, and short of someone gaining write access to your web server the flow of data is going to be one way. However, if you create a web service and connect it to your database, you have to take steps to ensure the security of your data from SQL injection and other attacks. If this is sensitive data (as if there is any other kind) and you aren't experienced in creating a secure database service, you may want to stick with your current system.
That said my work almost always requires a three tiered approach, and it is the most powerful way to go. Just be sure that if you go that route you've considered all the factors, number 3 in particular.
EDIT:
There should really be another item in that list, which is closely related to number 1:
4) What are you doing with the data: If you are just rendering the same set of data as a table in the application or something along those lines then an XML file is a decent solution. If you are doing a lot of XPath type queries on the returned XML before it is being used then the three tiered approach has the advantage.
Another perfectly viable option that preserves the separation of concerns is to use JSON rather than XML. It's pretty well supported on both sides of the connection (web service & client), and it simplifies the marshaling and un-marshaling process in Flex quite a bit.
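If you do go the JSON route on the .NET side, a minimal sketch using an ASHX handler and JavaScriptSerializer could look like this (the Product type and the inline sample data are purely illustrative, standing in for your real data access call):

    using System.Collections.Generic;
    using System.Web;
    using System.Web.Script.Serialization;

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ProductsJsonHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // In practice this list would come from your data access layer.
            var products = new List<Product>
            {
                new Product { Id = 1, Name = "Sample product" }
            };

            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(products));
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }

The .swf then loads the handler's URL instead of a static XML file.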
The best way to do it is to create a web service. In Visual Studio, create a new Web project, and select ASP.NET Web Service Application.
The following tutorial should help you get started in writing a web service using ASP.NET.
http://www.15seconds.com/Issue/010430.htm
Also, here's how to consume a web service with Flex.
http://www.adobe.com/devnet/flex/articles/flexbuilder_ws.html
I'm facing a potential performance problem using the XtraReports tool and a web service in a Windows Forms app.
I know XtraReport loads large data sets (I understand a large data set to be 10,000+ rows) by loading the first pages and then continuing to load the rest of the pages in the background, but all of this is done with a data source in hand. So what happens if this data source has to pass through a web service, which will need to serialize the data in order to send it to the client?
The scenario is the following:
I have a thin client in Windows Forms that makes calls to a web service, which takes that call and by reflection instantiates the corresponding class and calls the required method (please note that this architecture is inherited; I have almost no choice on this, I have to use it). So I'll have a class that gets the data from the database and sends it to the client through the web service interface. This data can be a DataSet, SqlDataReader (also note that we're using SQL Server 2000, but it could be 2008 by the end of the year), DataTable, XML, etc.
If the result data set is large, the serialization + transfer time can be considerable, and then rendering the report can add some more time, degrading the overall performance.
I know there is the possibility of using something like streaming (as with video), but for streaming data through a web service; however, I have no leads on how to try something along those lines.
What do you think about this? Please let me know any questions you may have or if I need to write more info for better statement of the problem.
Thanks!
I'll give you an answer you probably don't want to hear.
Set expectations.
Reports are typically slow because they have to churn through a lot of data. There just isn't a good way to get around it. But barring that, I'd do the following:
Serialize the data load to a binary state, convert it to something transferable via SOAP (Base64, for instance) and transfer that. This way you'll avoid a lot of useless angle brackets (a rough sketch of this follows below).
Precache as much data on the client as possible.
Focus on the perceived performance of the application. For instance, throw the report data gathering onto a background thread, so the user can go and do other work, then show the user a notification when the report is available.
Sometimes, it is possible to generate the report for the most often used criteria ahead of time and provide that when asked.
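On the first point, a rough sketch of the binary-plus-Base64 packing, assuming the report data sits in a DataSet (the ReportDataPacker class and method names are made up):

    using System;
    using System.Data;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    public static class ReportDataPacker
    {
        // Serialize the DataSet in its binary remoting format and Base64-encode it,
        // so the SOAP payload carries one compact string instead of nested XML.
        public static string Pack(DataSet reportData)
        {
            reportData.RemotingFormat = SerializationFormat.Binary;
            using (var stream = new MemoryStream())
            {
                new BinaryFormatter().Serialize(stream, reportData);
                return Convert.ToBase64String(stream.ToArray());
            }
        }

        public static DataSet Unpack(string packed)
        {
            using (var stream = new MemoryStream(Convert.FromBase64String(packed)))
            {
                return (DataSet)new BinaryFormatter().Deserialize(stream);
            }
        }
    }

The web service returns the packed string, and the client unpacks it before handing the DataSet to XtraReports.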
Is there a way you can partition the data, i.e. return a small subset of it?
If there's no absolute requirement to return all the data at the same time, then dividing up the data into smaller pieces will make a huge difference when reading from the database, serializing, and transporting through the webservice.
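As a client-side sketch of that, pulling the data in pages from the web service (the ReportRow type, the IReportService contract and its GetReportPage operation are placeholders):

    using System.Collections.Generic;

    public class ReportRow
    {
        public int Id { get; set; }
        public decimal Amount { get; set; }
    }

    public interface IReportService
    {
        // Hypothetical paged operation exposed by the web service.
        List<ReportRow> GetReportPage(int pageIndex, int pageSize);
    }

    public static class ReportLoader
    {
        // Pull the report data in small pieces instead of one huge call.
        public static List<ReportRow> LoadAll(IReportService service, int pageSize)
        {
            var allRows = new List<ReportRow>();
            int pageIndex = 0;

            while (true)
            {
                List<ReportRow> page = service.GetReportPage(pageIndex, pageSize);
                allRows.AddRange(page);

                if (page.Count < pageSize)
                    break; // last (possibly partial) page reached

                pageIndex++;
            }
            return allRows;
        }
    }

Loading page by page also gives you a natural hook for showing progress while the report data arrives.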
Edit: I had deleted this answer since you already mentioned partitioning. I'm undeleting it now, maybe it will serve some use as the starting point of a discussion...
As for your point about how to work with paging: I think you're on the right track with the "Show next 100 results" approach. I don't know how XtraReport works, and what its requirements for a datasource are. There are 3 issues I see:
Server support for partitioned data (e.g. your web service should support returning just "page 3"'s data)
Summary Data - does your report have a row of totals or averages? Are those calculated by the XtraReport control? Does it require a complete dataset to display those results? Can you provide the summaries to the control on your own (and find a more efficient way to calculate them without returning the entire data set?)
XtraReport support for a datasource that uses paging.
Transferring DataSets is a bad idea. A DataSet carries a lot of useless data. Use simple objects. Use an ORM on the server side.
Also, you can precalculate some data. Reference data can be cached on the client and then joined with server data.