ADO.NET Data Services and their place in the overall design - c#

ADO.NET Data Services is the next generation of data access layer within applications. I have seen a lot of examples using it directly from a UI layer such as Silverlight or Ajax to get data. This is almost like having a two-tiered system, with the business layer completely removed. Shouldn't the DAL be accessed by the business layer, and not directly from the UI?

ADO.NET Data Services is one more tool to be evaluated when you need to move data around. .NET RIA Services is another one, and a much better one, I would say.
I see ADO.NET Data Services as a low-level service to be consumed by some higher-level framework; I would not let my UI talk to it directly.
The main problem I see with ADO.NET Data Services has more to do with security than with anything else.
For simple, quick tasks on an intranet, and if you are not too picky about your design, it can be useful (IMO). It can be quite handy when you need to quickly expose data from an existing database.
I say handy, but it would not be my first choice, as I avoid "quick and dirty" solutions as much as I can. Those solutions are like ghosts: they always come back to haunt you.

"ADO.NET Data Services is the next generation of data access layer within applications."
I have no idea where you got that from! Perhaps you're confusing ADO.NET Data Services with the ADO.NET Entity Framework?
One shouldn't assume that everything Microsoft produces is of value to every developer. In my opinion, ADO.NET Data Services is a quick way to create CRUD services, which may also have a few other operations defined on the entities, but those operations are all stored procedures. If all you need is a database-oriented service, then this may be what you want. Certainly, there's relatively little reason to do any coding for a service like this, except in the database.
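For a sense of how little code such a service takes, here is a minimal sketch of exposing an Entity Framework model through ADO.NET Data Services; the service name, the "NorthwindEntities" context, and the access rules are assumptions for illustration only:

    using System.Data.Services;
    using System.Data.Services.Common;

    // Hypothetical service exposing an existing EF context
    // ("NorthwindEntities") as an ADO.NET Data Service.
    public class NorthwindDataService : DataService<NorthwindEntities>
    {
        // Called once, to configure service-wide access rules.
        public static void InitializeService(DataServiceConfiguration config)
        {
            // Expose every entity set read-only; tighten per set in production.
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            config.DataServiceBehavior.MaxProtocolVersion =
                DataServiceProtocolVersion.V2;
        }
    }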
But that doesn't mean that ADO.NET Data Services "has a place in the overall design" of every project. It's something that fills a need of enough customers that Microsoft thought it worthwhile to spend money developing and maintaining it.
For that matter, they also thought ASP.NET MVC was a good idea...
:-)

In my opinion, the other answers underestimate the importance of ADO.NET Data Services. Though using it directly in your application bears some similarity to a two-tiered system, other Microsoft products such as .NET RIA Services and Windows Azure storage services are based on it. Contrary to the phrase in one of the answers ("For simple, quick tasks on an intranet, and if you are not too picky about your design, it can be useful"), it can be useful for public websites, including websites in ASP.NET MVC.
Dino Esposito describes the driving force behind ADO.NET Data Services on his blog:
http://weblogs.asp.net/despos/archive/2008/04/21/the-quot-driving-force-quot-pattern-part-1-of-n.aspx
"ADO.NET Data Services (aka, Astoria)
Driving force: the need of building richly interactive Web systems.
What's that in abstract: New set of tools for building a middle-tier or, better yet, the service layer on top of a middle-tier in any sort of application, including enterprise class applications.
What's that in concrete: provides you with URLs to invoke from hyperlinks to bring data down to the client. Better for scenarios where a client needs a direct|partially filtered access to data. Not ideal for querying data from IE, but ideal for building a new generation of Web controls that breathe AJAX. And just that."
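To make the "URLs to invoke" point concrete, here is a hedged sketch of querying such a service with the ADO.NET Data Services client library; the service URI and the Customer entity are assumptions:

    using System;
    using System.Data.Services.Client;
    using System.Data.Services.Common;
    using System.Linq;

    // Hypothetical client-side entity; normally generated by DataSvcUtil.exe.
    [DataServiceKey("CustomerID")]
    public class Customer
    {
        public string CustomerID { get; set; }
        public string Country { get; set; }
    }

    public class Client
    {
        public static void Main()
        {
            var context = new DataServiceContext(
                new Uri("http://localhost/NorthwindDataService.svc"));

            // The LINQ query below is translated into an OData-style URL,
            // roughly: /Customers?$filter=Country eq 'Germany'&$top=10
            var customers = context.CreateQuery<Customer>("Customers")
                .Where(c => c.Country == "Germany")
                .Take(10)
                .ToList();
        }
    }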

Related

Best approach for migrating database schema of shared database

I am currently struggling to find a way to migrate to a new database schema for a database shared by multiple applications, while keeping applications working with the old schema intact.
There are multiple applications performing CRUD operations on the shared database, using a self-written ORM-like library. The main problems I see with this architecture are:
Each application implements its own business logic, with a lot of redundant code, or code that should do the same thing in every application but is implemented differently and is therefore hard to maintain
Since each application works directly with the ORM library, the other applications cannot know when data was changed by another application without monitoring/polling the database for changes
The ORM library implements only limited concurrency, offers no transactions, and is relatively slow
To solve the redundancy/inconsistency problems I am thinking about implementing a layered architecture.
Service Layer
Business Layer
Data Access Layer
Database
The applications then communicate with a SOAP web service on the service layer.
The service layer uses the business layer to perform validation and apply business logic. The business layer uses the data access layer's repositories.
I am hoping to also be able to use the business layer in the client applications, with another repository implementation that does not access the database directly but goes through the SOAP web service.
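A minimal sketch of that idea, with hypothetical names throughout: the business layer depends only on a repository interface, and the client plugs in an implementation that forwards the calls to the SOAP service instead of the database.

    // Shared entity and repository contract used by the business layer.
    public class Machine
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface IMachineRepository
    {
        Machine GetById(int id);
        void Save(Machine machine);
    }

    // Hypothetical WCF contract mirroring the repository operations.
    [System.ServiceModel.ServiceContract]
    public interface IMachineService
    {
        [System.ServiceModel.OperationContract]
        Machine GetMachine(int id);

        [System.ServiceModel.OperationContract]
        void SaveMachine(Machine machine);
    }

    // Client-side implementation: same interface the business layer already
    // uses, but every call goes through the SOAP web service.
    public class SoapMachineRepository : IMachineRepository
    {
        private readonly IMachineService service;

        public SoapMachineRepository(IMachineService service)
        {
            this.service = service;
        }

        public Machine GetById(int id) { return service.GetMachine(id); }
        public void Save(Machine machine) { service.SaveMachine(machine); }
    }

The server-side implementation would implement the same IMachineRepository against the database, so the business layer code stays identical on both sides.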
To solve the other problems, I was hoping to use Entity Framework instead of the self-made ORM library. But the schema of the database is designed in a kind of generic way: for each machine added to the database (the database stores facility data), several machine-specific tables are added. This results in redundant tables named [machinename]_[tablename]. As far as I know, Entity Framework, or any other ORM, cannot deal with that (it's poor design anyway, probably meant to speed up queries).
The plan would be to migrate to another database schema, but the problem with that is that all the applications using the database would need to be changed to use the new schema/SOAP web service. This cannot happen overnight, so it would be best if I could keep some of the applications unchanged while they still work on the one database, and then deal with reimplementing the other applications against the web service later.
I already thought about using views to simulate the old schema, so that the old applications can still work with the changed schema, but unfortunately the self-made ORM does not support working with views.
I don't expect anyone to present me with a complete solution, but rather some basic approaches and/or ideas for improving the overall architecture of the system.

Good architecture to shift a WPF desktop application to Client Server technology

I have a working WPF application that runs on a single PC. I have used a SQL Server database and Entity Framework to communicate with the database, and RDLC reporting in the application. Now the requirement has arrived to make this application work on the local company network, where multiple users (normally around 25 at most) will access the application depending on their roles and permissions. I did some R&D on this, primarily following the architecture described at http://www.codeproject.com/Articles/434282/A-N-Tier-Architecture-Sample-with-ASP-NET-MVC-WCF, and after doing so I have made a paper design/architecture of the application that looks like this:
A WCF service running on a high-end server within the company network:
GPC.Service - defines the protocol for connecting to the service and all other necessary information
GPC.Algorithm - the main business logic layer that contains the logic and is the interface through which clients call the database layer methods
GPC.Persistance - has the actual database interaction methods, like fetching/storing/updating/deleting records in the database
GPC.Data - contains the EDMX schema for the Entity Framework
GPC.Entites - contains the entities of the database schema and additional partial classes
Clients:
The client will be a WPF application based on the MVVM pattern for now (maybe in the future we will need to move to a web application, but that is not required for now). The main components of the application are:
Import from Excel: currently all data is in Excel files, and all of it needs to be imported into the system
Edit/Update/Delete: once data is imported, provide an interface for the user to edit/update/delete records
Generate reports (using RDLC for this)
Users/roles management, etc.
Shared:
This is a library that contains different miscellaneous classes, like code to read Excel files, error handling, collections that will be bound to the UI, etc.
DB context: will be created in a using statement inside the Persistance layer for each method, to ensure that no stale information is left behind.
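For illustration, a sketch of that per-method context usage, assuming the Entity Framework DbContext API; the context and entity names are hypothetical:

    using System.Data.Entity;

    // Hypothetical EF context living in the Persistance layer.
    public class GpcContext : DbContext
    {
        public DbSet<Record> Records { get; set; }
    }

    public class Record
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class RecordRepository
    {
        // Each method creates and disposes its own context, so no stale
        // change-tracking state survives between calls.
        public Record GetRecord(int id)
        {
            using (var context = new GpcContext())
            {
                return context.Records.Find(id);
            }
        }
    }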
Does this architecture follow n-tier principles, and is it flexible? What improvements are required, and how should I address whatever issues there are? I want to make sure this is a good architecture before I go ahead and change my existing application.
It seems like you are on the correct path; however, you may be over-engineering in some areas.
I think, to a large degree, Entity Framework deals with the Entities, Data and Persistence layers for you. Implementing them yourself may be overkill unless you are looking to ultimately replace Entity Framework with some other ORM system.
You are alluding to SOA (Service Oriented Architecture) here with your GPC.Service library. You need to look at how you can break your service layer down into one or more atomic services which will serve the client application. There are a number of ways of going about this, and it depends largely on how you plan to use the service layer going forward. Take a look at RESTful services, which break the service layer down nicely and will guide you into building neat atomic services. Check out the ASP.NET Web API for this.
I think what you are looking for in your GPC.Algorithm library is really a domain model. A domain model encapsulates all your business logic and allows you to perform state changes on your objects via the public functions you expose. With this in mind, the layers of the system would appear as follows:
Persistence (EF) -> Domain Model -> Service Layer -> DTO (Data Transfer Objects) -> Client
The DTOs mentioned above would be a set of POCOs (Plain Old C# Objects) responsible for carrying data to and from your client. You need these because serializing and deserializing your domain objects will become problematic due to back references and other encapsulation issues. Putting DTOs in place enforces a context boundary, which is one of the tenets of SOA ("boundaries are explicit"); see this for more info on SOA.
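As an illustration of that boundary, a hypothetical DTO and mapping: the service serializes flat POCOs rather than the domain entities themselves.

    // Flat, serialization-friendly object that crosses the service boundary.
    public class MachineDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Hypothetical domain entity whose behaviour the client never sees.
    public class MachineEntity
    {
        public int Id { get; private set; }
        public string Name { get; private set; }

        public MachineEntity(int id, string name)
        {
            Id = id;
            Name = name;
        }

        public void Rename(string newName) { Name = newName; }
    }

    public static class MachineMapper
    {
        // Copy only what the client needs: no back references,
        // no lazy-loading proxies, no leaked encapsulation.
        public static MachineDto ToDto(MachineEntity entity)
        {
            return new MachineDto { Id = entity.Id, Name = entity.Name };
        }
    }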
With respect to the client side, it seems like you are on track. What you may want to do is refactor your current client application so that all data queries are consolidated into a single layer; when the time comes, you can then just replace that layer with the service implementation.
This makes perfect sense (try to build it TDD-style).
In order to make your life a bit easier with client version management, consider using a ClickOnce installer to enforce installation of the latest version on your users' computers (this headache will be gone once you move it to a web app).

Architecture of an ASP.NET MVC application

I'm in the process of doing the analysis of a potentially big web site, and I have a number of questions.
The web site is going to be written in ASP.NET MVC 3 with the Razor view engine. In most examples I find, controllers use the underlying database directly (via the domain/repository pattern), so there's no WCF service in between. My first question is: is this architecture suitable for a big site with a lot of traffic? It's always possible to load-balance the site, but is this a good approach? Or should I make the site use WCF services that interact with the data?
Question 2: I would like to adopt CQS principles, meaning that I want to separate the querying part from the commanding part. So the querying part will have a different model (optimized for the views) than the commanding part (optimized for business intent, and only containing the properties needed to complete each command), but both act on the same database. Do you think this is a good idea?
Thanks for the advice!
For scalability, it helps to separate back-end code from front-end code. So if you put UI code in the MVC project and as much processing code as possible in one or more separate WCF and business logic projects, not only will your code be clearer but you will also be able to scale the layers/tiers independently of each other.
CQRS is great for high-traffic websites. I think CQRS, properly combined with a good base library for DDD, is good even for low-traffic sites because it makes business logic easier to implement. The separation of data into a read-optimized model and a write-optimized model makes sense from an architectural point of view also because it makes changes easier to do (maybe some more work, but it's definitely easier to make changes without breaking something).
However, if both act on the same database, I would make sure that the read model consists entirely of views, so that you can modify entities as needed without breaking the read code. This has the advantage that you'll need to write less code, but your write model will still consist of a full-fledged entity model rather than just an event store.
EDIT to answer your extra questions:
What I like to do is use a WCF Data Service for the Read model. This technology (specific to .NET 4.0) builds an OData (= REST + Atom with LINQ support) web service on top of a data model, such as an Entity Framework EDMX.
So, I build a read model in SQL Server (views), then build an Entity Framework model from that, then build a WCF Data Service on top of that, in read-only mode. That sounds a lot more complicated than it is; it only takes a few minutes. You don't need to create yet another model, just expose the EDMX as read-only. See also http://msdn.microsoft.com/en-us/library/cc668794.aspx.
The Command service is then just a one-way regular WCF service, the Read service is the WCF Data Service, and your MVC application consumes them both.
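A minimal sketch of such a one-way command contract; the operation and type names are assumptions for illustration:

    using System.ServiceModel;

    public class PlaceOrderCommand
    {
        public int CustomerId { get; set; }
        public int ProductId { get; set; }
        public int Quantity { get; set; }
    }

    [ServiceContract]
    public interface ICommandService
    {
        // IsOneWay = true: the client fires the command and does not wait
        // for a reply, which fits the command side of the CQS split.
        [OperationContract(IsOneWay = true)]
        void PlaceOrder(PlaceOrderCommand command);
    }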

Querying Web Services with SQL

I had a bit of a shock recently when thinking about combining a service oriented architecture with a brilliant UI which leverages SQL to optimize performance when querying data.
The DevExpress grid view for ASP.NET, for example, is so cool that it delegates all filtering, sorting and paging logic to the database server. But this presumes that the data is retrieved from a SQL-able database server.
What if I want to introduce a web service layer between the database and UI layers, and to have the UI use the web services to query the data?
How can I design the web services and the UI such that I can pass filtering requests from the UI via the web services to the database?
Do I need to provide a List QueryData(string sqlQuery)-style web service and parse the SQL string on my own to guarantee security/access restrictions?
Or is there any good framework or design guideline that takes this burden from me?
This must be a very common problem, and I am sure that it has been solved relatively adequately already, hasn't it?
I am mainly interested in a .NET/C#-based or -compatible solution.
Edit: I've found OData and Microsoft WCF Data Services. If I got it right, an OData-based application could look as follows:
User ---/Give me Page 1 (records 1..10)/---> ASP.NET Server Control (of course, via HTTP)
ASP.NET Server Control ---/LINQ Query/---> Data service client
Data service client ---/OData Query/---> WCF Data Service
WCF Data Service ---/LINQ Query/---> Entity Framework
Entity Framework ---/SQL Query/---> Database
If I got this right, my DevExpress server control should be able to delegate a filtering request (e.g. give me the top 10 only) through all these layers down to the database which then applies its indexes etc. in order to perform that query.
Is that right?
Edit: It is a joy to see this thread coming to life :-) It is hard to decide which answer to accept because they all seem equally good to me...
Really interesting question! I don't think there's a right or wrong answer, but I think you can establish some architectural principles.
Firstly, "Service Oriented Architecture" is an architectural style that requires you to expose business services for consumption by other applications. Running a database query is not a service - in my opinion at least. In fact, providing a web service to execute arbitrary SQL is probably an anti-pattern - you would bypass the security model most database servers provide, you'd have no control over the queries - it's relatively easy to write a syntactically correct "select" query which cripples your database (Cartesian joins are my favourite), and the overhead of the web service protocol would make this approach several times slower than just querying the database through normal access routes - LINQ or whatever.
So, let's assume you accept that point of view - what is the solution to the problem?
Firstly, if you want the productivity of using the DevExpress grid, you should probably work the way DevExpress wants you to work; if that means querying the database directly, that's by far the best way to go. If you want to move to an SOA and the DevExpress grid doesn't support that, it's time to find a new grid control, rather than tailoring your entire enterprise architecture to a relatively minor component.
Secondly - structurally, where should you do your sorting, filtering etc? This is an easy concept in SQL, but rather unpleasant when trying to translate it to a web service specification - you quickly end up with an incomprehensible method signature ("getAccountDataForUser(userID, bool sortByDate, bool sortByValue, bool filterZeros, bool filterTransfers)").
On the other hand, performing the filtering and sorting on the client is messy and slow.
My recommendation would be to look at the Specification pattern - this allows you to keep clean method signatures while still specifying the desired filtering and sorting in a consistent way.
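A rough sketch of what that could look like, with hypothetical names: one stable method signature, where new filters extend the specification object rather than the service contract.

    using System;
    using System.Collections.Generic;

    // One object describes the whole query: filters, sort order, paging.
    public class AccountQuerySpecification
    {
        public int UserId { get; set; }
        public string SortBy { get; set; }          // e.g. "Date" or "Value"
        public bool ExcludeZeroAmounts { get; set; }
        public bool ExcludeTransfers { get; set; }
        public int PageIndex { get; set; }
        public int PageSize { get; set; }
    }

    public class AccountData
    {
        public DateTime Date { get; set; }
        public decimal Value { get; set; }
    }

    public interface IAccountService
    {
        // The signature stays stable as the specification grows.
        IList<AccountData> GetAccountData(AccountQuerySpecification spec);
    }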
Implementing the List QueryData(string sqlQuery) service will open you up to a near-infinite number of security problems.
If you need to filter based on security access, then the OData implementation will not be trivial either; you need to set up proper authorization/authentication on the WCF service so that you can further filter the OData query based on the authenticated user's data.
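WCF Data Services does provide a hook for that kind of per-user filtering; a sketch, with hypothetical entity and property names (the DataService<T> declaration would live in the other half of the partial class):

    using System;
    using System.Data.Services;
    using System.Linq.Expressions;

    // Hypothetical entity with an owner column to filter on.
    public class Order
    {
        public int OrderID { get; set; }
        public string OwnerName { get; set; }
    }

    public partial class MyDataService
    {
        // Applied automatically to every query against the Orders set,
        // narrowing the results to the authenticated user's own rows.
        [QueryInterceptor("Orders")]
        public Expression<Func<Order, bool>> OnQueryOrders()
        {
            string user = System.Threading.Thread.CurrentPrincipal.Identity.Name;
            return o => o.OwnerName == user;
        }
    }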
The easiest way to implement server-side data operations when the data is retrieved from a WCF service would be to intercept the grid's sort/filter operations in the code-behind, and then call a specialized method on the WCF service based on what the user is doing.
"This must be a very common problem, and I am sure that it has been
solved relatively adequately already, has it?"
Given the number of skinned cats laying around the developer world, I'd have to say no.
WCF Data Services offers the best solution I've found so far, but authentication and authorization can be tricky there. There is a decent post covering the server-side issues around this at http://blogs.msdn.com/b/astoriateam/archive/2010/07/19/odata-and-authentication-part-4-server-side-hooks.aspx. Setting this up isn't easy, but it does work well.

Using data services with Silverlight 3?

Any thoughts on ADO.NET Data Services and Entity Framework with Silverlight 3?
Is it a good practice?
Yes, it's a good practice.
You get a lot of the functionality needed for building a Silverlight application (e.g. bindings, client-side entity classes, context change tracking, etc.) for free. One note: you are not limited to Entity Framework as your data provider for ADO.NET Data Services; you can easily plug in other data providers as well (LLBLGen, for example, has a template that allows you to use the LLBLGen framework with ADO.NET Data Services).
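For context, a hedged sketch of what the client-side code looks like in Silverlight, where every service call must be asynchronous; the entity, entity set, and URI are assumptions:

    using System;
    using System.Collections.Generic;
    using System.Data.Services.Client;
    using System.Data.Services.Common;
    using System.Linq;

    // Hypothetical client-side entity; normally generated for you.
    [DataServiceKey("Id")]
    public class Product
    {
        public int Id { get; set; }
        public bool IsActive { get; set; }
    }

    public class ProductLoader
    {
        private readonly DataServiceContext context =
            new DataServiceContext(new Uri("http://localhost/MyDataService.svc"));

        public void LoadActiveProducts(Action<IEnumerable<Product>> onLoaded)
        {
            var query = (DataServiceQuery<Product>)context
                .CreateQuery<Product>("Products")
                .Where(p => p.IsActive);

            // Silverlight only allows asynchronous calls; the callback runs on
            // a background thread, so marshal to the UI thread before binding.
            query.BeginExecute(ar => onLoaded(query.EndExecute(ar)), null);
        }
    }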
Also, it seems that the plan is to make more and more data sources accessible via ADO.NET Data Services (just recently the Astoria team announced that they'd be adding SharePoint). Thus, choosing ADO.NET Data Services as the layer your Silverlight client talks to makes very good strategic sense, as you'll be able to easily reuse whatever libraries, patterns, and approaches you discover across all the different types of backends (be that a database, the Azure cloud, Reporting Services, SharePoint, etc.). It's clearly invaluable for portal types of applications, whose whole purpose is to aggregate data from multiple data sources.
That is one approach you can take to get data into Silverlight. Another is RIA Services, which I think is a better approach and seems to have a higher adoption rate. Whether it is good practice will depend on your implementation of it, not on the technology itself.
