Any thoughts on ADO.NET Data Services and Entity Framework with Silverlight 3?
Is it good practice?
Yes, it's a good practice.
You get a lot of the functionality needed for building a Silverlight application (e.g. bindings, client-side entity classes, context change tracking, etc.) for free. One note: you are not limited to Entity Framework as your data provider for ADO.NET Data Services; you can easily plug in other data providers as well (LLBLGen, for example, has a template that allows the LLBLGen framework to be used with ADO.NET Data Services).
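To make the "plug in other providers" point concrete, here is a minimal sketch (not LLBLGen-specific; all type names are invented): the ADO.NET Data Services reflection provider will expose any class whose public IQueryable<T> properties act as entity sets, so any queryable data layer can stand in for Entity Framework.

```csharp
// Hedged sketch: exposing a non-EF data layer through ADO.NET Data Services.
// All type names here are invented for illustration.
using System.Collections.Generic;
using System.Data.Services;   // ADO.NET Data Services ("Astoria")
using System.Linq;

// The reflection provider infers the entity key from the ID property by convention.
public class Customer
{
    public int ID { get; set; }
    public string Name { get; set; }
}

// Any class whose public IQueryable<T> properties act as entity sets can be
// the data source; it does not have to be an Entity Framework context.
public class CustomDataContext
{
    private static readonly List<Customer> Store = new List<Customer>
    {
        new Customer { ID = 1, Name = "Contoso" }
    };

    public IQueryable<Customer> Customers
    {
        get { return Store.AsQueryable(); }
    }
}

public class CustomerDataService : DataService<CustomDataContext>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        // Read-only access to every entity set; tighten per set in real code.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}
```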
Also, it seems the plan is to make more and more data sources accessible via ADO.NET Data Services (just recently the Astoria team announced that they'd be adding SharePoint). Thus choosing ADO.NET Data Services as the layer your Silverlight client talks to makes very good strategic sense, as you'll be able to easily reuse whatever libraries, patterns and approaches you discover across all the different types of back ends (be that a database, the Azure cloud, Reporting Services, SharePoint, etc.). It's clearly invaluable for portal-type applications whose whole purpose is to aggregate data from multiple data sources.
That is one approach you can take to get data into Silverlight. Another approach is RIA Services, which I think is a better approach and seems to have a higher adoption rate. Whether it is good practice will depend on your implementation of it, not on the technology itself.
Related
I've recently heard about a company using a surprising architecture pattern: a standalone/independent API (access layer) between a very poorly structured SQL Server database and their front end, such that NO Entity Framework is used anywhere, and the front end has no direct interaction with the DB.
I've never seen this before in a .NET Core/Framework environment, and it comes across as a sticking-plaster situation where they are trying to abstract away the poor DB structure and hide it from the consumer via the API, instead of fixing the core issue, which is the poor DB.
Is this considered an actual architecture pattern or best practice (perhaps even in this situation?), or is this just a mess? The development team seems adamant about this new API pattern...
It is standard practice to abstract the front end from the database. Layers of abstraction can render data transfer objects that differ wildly from the entity models in the database. This insulating layer provides a standard interface for actors attempting to access the data and encapsulates business logic in a central location. This is a smart decision: as the database standards are improved, the API's calls to the database can be updated without affecting the front end, so front-end developers need not be bothered with database schema changes.

Entity Framework is not a prerequisite or corequisite for C# projects communicating with databases. There are many ORM libraries out there, and some stacks don't even leverage one. While EF is powerful, if the database is a mess it may be prudent to delay implementation of any ORM until the data and schema are sufficiently curated.
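As a hedged sketch of what that insulating layer buys you (the table and column names below are invented stand-ins for a messy schema), the API maps ugly rows to clean DTOs in one place:

```csharp
// Invented stand-ins for a poorly structured schema and the clean API model.
public class LegacyCustomerRow            // mirrors the messy table as-is
{
    public int cst_id { get; set; }
    public string cst_nm_1 { get; set; }
    public string cst_nm_2 { get; set; }
}

public class CustomerDto                  // what the front end actually sees
{
    public int Id { get; set; }
    public string FullName { get; set; }
}

public static class CustomerMapper
{
    // The mapping hides the schema in one place; when the database is
    // cleaned up, only this layer has to change, not the front end.
    public static CustomerDto ToDto(LegacyCustomerRow row)
    {
        return new CustomerDto
        {
            Id = row.cst_id,
            FullName = (row.cst_nm_1 + " " + row.cst_nm_2).Trim()
        };
    }
}
```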
I am currently struggling to find a way to migrate a database shared by multiple applications to a new schema, while keeping the applications that work with the old schema intact.
There are multiple applications performing CRUD operations on the shared database, using a self-written ORM-like library. The main problems I see with this architecture are:
Each application implements its own business logic, with a lot of redundant code, or code that should behave the same in every application but is implemented differently and is therefore hard to maintain
Since each application works directly with the ORM library, the other applications cannot know when data was changed by another application without monitoring/polling the database for changes
The ORM library implements only limited concurrency, supports no transactions, and is relatively slow
To solve the redundancy/inconsistency problems, I am thinking about implementing a layered architecture:
Service Layer
Business Layer
Data Access Layer
Database
The applications then communicate with a SOAP web service on the service layer.
The service layer uses the business layer to perform validation and apply business logic. The business layer uses the data access layer's repositories.
I am hoping to be able to also use the business layer in the client applications, with another repository implementation which does not access the database directly but goes via the SOAP web service.
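A minimal sketch of that repository split, assuming invented names: the business layer depends only on the interface, so the same logic can run against the database on the server and against the SOAP service on the clients.

```csharp
// Invented names throughout; the stubs mark where real code would go.
using System;

public class Machine
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IMachineRepository
{
    Machine GetById(int id);
    void Save(Machine machine);
}

// Server-side implementation: talks to the database directly.
public class DbMachineRepository : IMachineRepository
{
    public Machine GetById(int id) { throw new NotImplementedException(); /* query the DB */ }
    public void Save(Machine machine) { throw new NotImplementedException(); /* persist to the DB */ }
}

// Client-side implementation: same contract, but calls the SOAP web service,
// so the business-layer code is reused unchanged on the clients.
public class SoapMachineRepository : IMachineRepository
{
    public Machine GetById(int id) { throw new NotImplementedException(); /* call the service */ }
    public void Save(Machine machine) { throw new NotImplementedException(); /* call the service */ }
}
```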
To solve the other problems, I was hoping to use Entity Framework instead of the self-made ORM library. But the schema of the database is built in a kind of generic way: for each machine added to the database (the database stores facility data), several machine-specific tables are added. This results in redundant tables named [machinename]_[tablename]. As far as I know, Entity Framework, or any other ORM, cannot deal with that (it's poor design anyway, probably meant to speed queries up).
The plan would be to migrate to another database schema, but the problem with that is that all the applications using the database would need to be changed to use the new schema/SOAP web service. This cannot happen from one day to the next, so it would be best if I could keep some of the applications unchanged but still working against the single database, and then later deal with reimplementing the other applications against the web service.
I have already thought about using views to simulate the old schema, so that the old applications can still work with the changed schema, but unfortunately the self-made ORM does not support working with views.
I don't expect anyone to present me with a solution, but rather some basic approaches and/or ideas to improve the overall architecture of the system.
I have a working WPF application that runs on a single PC. I have used a SQL Server database, Entity Framework to communicate with the database, and RDLC reporting in the application. Now the requirement has arrived to make this application work on the local company network, where multiple users (normally around 25 at max) will access the application depending on their roles and the permissions set. I did some R&D on this, primarily using the architecture mentioned here: http://www.codeproject.com/Articles/434282/A-N-Tier-Architecture-Sample-with-ASP-NET-MVC-WCF, and after doing so I have made a paper design/architecture of the application that will look like this:
A WCF service running on a high end server within the company network
GPC.Service itself - defines the protocol to connect to the service and all other necessary information
GPC.Algorithm - will be the main business logic layer that contains the logic and acts as the interface through which clients call the database-layer methods
GPC.Persistance - will have the actual database interaction methods, like fetching/storing/updating/deleting records in the database
GPC.Data - will contain the edmx schema for the Entity Framework
GPC.Entites - will contain the entities of the database schema and additional partial classes
Clients:
The client will be a WPF application based on the MVVM pattern for now (maybe in the future we will need to move to a web application, but that is not required for now). The main components of the application are:
Import from Excel: currently all data is in Excel files, and all that data needs to be imported into the system.
Edit/Update/Delete: once data is imported, provide an interface for the user to edit/update/delete records
Generate reports (using RDLC for this)
Users/Roles management etc.
Shared:
This is a library that contains different miscellaneous classes, like code to read Excel files, error handling, collections that will be bound to the UI, etc.
DB context: will be created in a using statement inside the Persistance layer for each method, to ensure no stale information is left behind.
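For illustration, a sketch of that per-method context pattern using the DbContext API (GpcDbContext, Project and ProjectRepository are assumed names, not your actual GPC.* types):

```csharp
// Assumed names (GpcDbContext, Project); the pattern is what matters.
using System.Data.Entity;

public class Project
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class GpcDbContext : DbContext
{
    public DbSet<Project> Projects { get; set; }
}

public class ProjectRepository
{
    public Project GetProject(int id)
    {
        // A fresh context per method: disposed at the end of the using
        // block, so no stale tracked entities survive between calls.
        using (var context = new GpcDbContext())
        {
            return context.Projects.Find(id);
        }
    }

    public void UpdateProject(Project project)
    {
        using (var context = new GpcDbContext())
        {
            context.Entry(project).State = EntityState.Modified;
            context.SaveChanges();
        }
    }
}
```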
Does this architecture follow n-tier architecture, and is it flexible? What improvements are required, and please guide me on how to address whatever issues there are. I want to make sure this is a good architecture before I go ahead and change my existing application.
It seems like you are on the correct path; however, you may be over-engineering in some areas.
I think to a large degree Entity Framework deals with the Entities, Data and Persistence layers for you. Implementing them yourself may be overkill, unless you are looking to ultimately replace Entity Framework with some other ORM system.
You are alluding to SOA (Service Oriented Architecture) here with your GPC.Service library. You need to look at how you can break your service layer down into one or more atomic services which will serve the client application. There are a number of ways of going about this, and the right one depends largely on how you plan to use the service layer going forward. Take a look at RESTful services, which break the service layer down nicely and will guide you into building neat atomic services. Check out the ASP.NET Web API for this.
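For example, one such atomic, RESTful service in ASP.NET Web API might look roughly like this (ProductsController and Product are invented names, and the in-memory data is a placeholder):

```csharp
// Invented controller and model; the in-memory list is a placeholder.
using System.Collections.Generic;
using System.Web.Http;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductsController : ApiController
{
    // GET api/products -> the whole collection
    public IEnumerable<Product> Get()
    {
        return new List<Product> { new Product { Id = 1, Name = "Sample" } };
    }

    // GET api/products/5 -> a single resource
    public Product Get(int id)
    {
        return new Product { Id = id, Name = "Sample" };
    }
}
```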
I think what you are looking for in your GPC.Algorithm library is really a domain model. A domain model encapsulates all your business logic and allows you to perform state changes on your objects via the public methods you expose. With this in mind, the layers of the system would appear as follows:
Persistence (EF) -> Domain Model -> Service Layer -> DTO (Data Transfer Objects) -> Client
The DTOs mentioned above would be a set of POCOs (Plain Old C# Objects) responsible for delivering data to and from your client. You need these because serializing and deserializing your domain objects will become problematic due to back references and other encapsulation issues. Putting DTOs in place enforces a context boundary, which is one of the tenets of SOA ("boundaries are explicit"); see this for more info on SOA.
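A short sketch of the split (names invented): the domain object changes state only through its public methods, while the DTO is a flat, serialization-friendly POCO.

```csharp
// Names invented for illustration.
using System;

public class Order                        // domain model: state changes via methods
{
    public int Id { get; private set; }
    public decimal Total { get; private set; }

    public void ApplyDiscount(decimal percent)
    {
        if (percent < 0 || percent > 100)
            throw new ArgumentOutOfRangeException("percent");
        Total -= Total * percent / 100m;
    }
}

public class OrderDto                     // flat POCO that crosses the service boundary
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}
```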
With respect to the client side, it seems like you are on track. What you may want to do is refactor your current client application so that all data queries are consolidated into a single layer. When the time comes, you can then just replace that layer with the service implementation.
This makes perfect sense. (Try to build it TDD-style.)
In order to make your life a bit easier with managing client versions, consider using a ClickOnce installer to enforce installation of the latest version on your users' computers (this headache will be gone once you move it to a web app).
I've been wading through all the new EF and WCF stuff in .NET 4 for a major project in its early stages, and I think my brain's now officially turned to sludge. It's the first large-scale development work I've done in .NET since the 1.1 days. As usual everything needs to be done yesterday, so I'm playing catch-up.
This is what I need to hammer together - any sanity checks or guidance would be greatly appreciated. The project itself can be thought of as essentially a posh e-commerce system, with multiple clients, both web and Windows-based, connecting to central servers with live data.
On the server side:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
On the client side:
WCF client
Underlying classes the same as the server side, but with some of the customisations not present
When an object is updated on the client, preferably only the modified properties should be sent to the server
The client-side WCF API details will probably end up being published publicly, so sensitive server-side implementation hints should not be leaked through the API unless absolutely unavoidable - this includes EF attributes in properties and classes
General requirements:
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it. Somehow they need to be made suitable for both EF and WCF on both client and server side, and have various layers of customisation, but appear as if custom-written for each scenario
Sorry this is so open-ended, but as I said, my brain's completely turned to sludge and I've confused myself to the point where I'm frozen.
Could anyone point me in the general direction of how to build the classes to do all this? Honestly, thanks very, very much.
A few hints in no particular order:
POCOs would be the way to go, to avoid dependencies on EF classes in your data objects.
Consider adding an intermediate layer based on data transfer objects to cope with your "only modified properties are passed" requirement (this will be the tricky part). These DTOs will be passed between the service and clients to exchange modifications.
Use a stateless communication model (no WCF session) to be able to implement load balancing and fail-over very easily.
Share the POCOs between the client and the services, and use subclassing on the server to add the internal customized information (see the sketch after the project list below).
You would end up on the server side with at least:
A project for the service contracts and the DTOs (shared)
A project for the POCOs (shared)
A project for the WCF service layer
A project for the business logic (called by the WCF layer)
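Here is a rough sketch of the shared-POCO-plus-server-subclass idea from the hints above (all names are invented):

```csharp
// Shared assembly, referenced by both the client and the service:
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Server-only assembly: adds internal concerns without leaking them to clients.
public class ServerCustomer : Customer
{
    public string AuditTrail { get; set; }   // never exposed through the contract

    public void ValidateForSave()
    {
        // server-side validation/security/auditing hooks go here
    }
}
```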
I have a few notes on your requirements:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Are you going to build only a data access layer exposed as a set of WCF services, or heavy business logic exposed as WCF services? This strongly affects the rest of your requirements. If you want the former, check WCF Data Services. In the latter case, check my other notes.
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
Divide your data classes into two sets. Internally, your services will use POCO classes implemented as domain objects. The domain objects will be materialized/persisted by EF (you need .NET 4.0), and they will also contain custom logic. If you want to build a heavy business layer, you should also think about domain-driven design: repositories, aggregate roots, etc.
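As a sketch of that first set of classes (names are illustrative, and EF 4's POCO support is assumed), a domain object carries its own logic and sits behind a repository abstraction:

```csharp
// Illustrative names; setters kept public so EF can materialize the object.
using System;

public class Invoice
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
    public bool IsPaid { get; set; }     // mutate via MarkPaid by convention

    public void MarkPaid()
    {
        if (IsPaid)
            throw new InvalidOperationException("Invoice is already paid.");
        IsPaid = true;
    }
}

// DDD flavouring: a repository abstraction over EF for the aggregate root.
public interface IInvoiceRepository
{
    Invoice GetById(int id);
    void Add(Invoice invoice);
}
```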
Underlying classes the same as the server side, but with some of the customisations not present
The second set of data classes will be data transfer objects, which will be exposed by the WCF services and shared between the server and clients. Your domain objects will be converted to DTOs when sending data to the client, and DTOs will be converted back to domain objects when they return from the client.
Your WCF services should be built on top of the business logic - domain objects/domain services. WCF services should expose chunky interfaces (instead of chatty CRUD interfaces), where a DTO carries data for several domain operations. This will also help you improve performance by reducing the number of round trips between client and service.
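To illustrate the chunky-versus-chatty distinction, a hedged WCF sketch (all names invented):

```csharp
// Invented contract and DTO.
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderSubmissionDto
{
    [DataMember] public int CustomerId { get; set; }
    [DataMember] public int[] ProductIds { get; set; }
    [DataMember] public string DeliveryAddress { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    // Chunky: one call carries everything several domain operations need...
    [OperationContract]
    int SubmitOrder(OrderSubmissionDto order);

    // ...instead of a chatty CRUD conversation like:
    // CreateOrder(); AddOrderLine(); AddOrderLine(); SetAddress(); Confirm();
}
```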
When an object is updated on the client, preferably only the modified properties should be sent to the server
I think this is achievable only by a careful definition of the DTOs, or perhaps by some custom serialization.
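One hand-rolled possibility, purely illustrative: the DTO carries the names of the changed properties, and the server applies only those to the loaded entity.

```csharp
// Entirely illustrative; a custom serializer would be the alternative.
using System.Collections.Generic;

public class CustomerPatchDto
{
    public int Id { get; set; }

    // Only the properties the user actually edited, keyed by name.
    public Dictionary<string, object> ChangedProperties { get; set; }
}

// Client side: record just what changed.
// var patch = new CustomerPatchDto
// {
//     Id = 42,
//     ChangedProperties = new Dictionary<string, object> { { "Name", "New Co" } }
// };
// The server then applies only the listed properties to the loaded entity,
// e.g. via reflection or an explicit switch on the property name.
```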
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
As already mentioned, you have to design your services to be ready for load balancing, and you should also think about (distributed) caching - check AppFabric. A good idea is to use stateless services.
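In WCF terms, statelessness can be expressed roughly like this (names invented): per-call instancing and no session, so any server in a load-balanced farm can handle any request.

```csharp
// Invented contract; the attributes are the point.
using System.ServiceModel;

[ServiceContract(SessionMode = SessionMode.NotAllowed)]
public interface ICatalogService
{
    [OperationContract]
    string GetProductName(int id);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
public class CatalogService : ICatalogService
{
    public string GetProductName(int id)
    {
        return "product-" + id;   // no per-client state kept between calls
    }
}
```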
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it.
This seems like a simple requirement, but you can easily model a database that will be hard to use with Entity Framework.
The main advice:
Your project looks big and complex, so the first thing you should do is hire some developers with experience in WCF, EF, etc. Each of these technologies has some pitfalls, so it is a really big risk to use them at such a scale without experienced people.
ADO.NET Data Services is the next generation of data access layer within applications. I have seen a lot of examples using it directly from a UI layer such as Silverlight or Ajax to get data. This is almost like having a two-tier system, with the business layer completely removed. Should the DAL be accessed by the business layer, and not directly from the UI?
ADO.NET Data Services is one more tool to be evaluated in order to move data. .NET RIA Services is another one - much better, I would say.
I see ADO.NET Data Services as a low-level service to be used by some high-level framework. I would not let my UI talk directly to it. The main problem I see with ADO.NET Data Services has more to do with security than with anything else.
For simple/quick tasks, on an intranet, and if you are not too picky with your design, it can be useful (IMO). It can be quite handy when you need to quickly expose data from an existing database. I say handy, but it would not be my first choice, as I avoid "quick and dirty" solutions as much as I can. Those solutions are like ghosts: they always come back to haunt you.
ADO.NET Data Services is the next generation of data access layer within applications
I have no idea where you got that from! Perhaps you're confusing ADO.NET Data Services with ADO.NET Entity Framework?
One shouldn't assume that everything Microsoft produces is of value to every developer. In my opinion, ADO.NET Data Services is a quick way to create CRUD services, which may have a few other operations defined on the entity, but those operations are all stored procedures. If all you need is a database-oriented service, then this may be what you want. Certainly, there's relatively little reason to do any coding for a service like this, except in the database.
But that doesn't mean that ADO.NET Data Services "has a place in the overall design" of every project. It's something that fills a need for enough customers that Microsoft thought it worthwhile to spend money developing and maintaining it.
For that matter, they also thought ASP.NET MVC was a good idea...
:-)
In my opinion, the other answers underestimate the importance of ADO.NET Data Services. Though using it directly in your application brings some similarity to a two-tier system, other Microsoft products such as .NET RIA Services and Windows Azure Storage Services are based on it. Contrary to the phrase in one of the answers ("For simple/quick tasks, on an intranet, and if you are not too picky with your design, it can be useful"), it may be useful for public websites, including websites in ASP.NET MVC.
Dino Esposito describes the driving force behind ADO.NET Data Services in his blog:
http://weblogs.asp.net/despos/archive/2008/04/21/the-quot-driving-force-quot-pattern-part-1-of-n.aspx
"ADO.NET Data Services (aka, Astoria)
Driving force: the need of building richly interactive Web systems.
What's that in abstract: New set of tools for building a middle-tier or, better yet, the service layer on top of a middle-tier in any sort of application, including enterprise class applications.
What's that in concrete: provides you with URLs to invoke from hyperlinks to bring data down to the client. Better for scenarios where a client needs direct|partially filtered access to data. Not ideal for querying data from IE, but ideal for building a new generation of Web controls that breathe AJAX. And just that."