Architecture of independent API access layer between DB and front end - C#

I've recently heard about a company using a surprising architecture pattern: a stand-alone/independent API (access layer) between a very poorly structured SQL Server database and their front end, such that no Entity Framework is used anywhere and the front end has no direct interaction with the DB at all.
I've never seen this before in a .NET Core/Framework environment, and it comes across as a plaster-type situation where they are trying to abstract away the poor DB structure and hide it from the consumer via the API, instead of fixing the core issue, which is the poor DB.
Is this considered an actual architecture pattern or best practice (perhaps even in this situation?), or is this just a mess? The development team seems adamant on this new API pattern...

It is standard practice to abstract the front end from the database. Layers of abstraction render data transfer objects that can differ wildly from the entity models in the database. This insulating layer provides a standard contract for actors attempting to access the data and encapsulates business logic in a central location. That is a smart decision: as the database structure is improved, the API's calls to the database can be updated without affecting the front end, so front-end developers need not be bothered with schema changes. Entity Framework is not a prerequisite for C# projects communicating with databases; there are many ORM libraries out there, and some stacks don't leverage one at all. While EF is powerful, if the database is a mess, it may be prudent to delay adopting any ORM until the data and schema are sufficiently curated.
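To make the insulating-layer idea concrete, here is a minimal sketch (all type and column names are hypothetical, not the company's actual schema): the API maps an awkwardly named entity onto a clean DTO, so the front end only ever depends on the DTO contract.

```csharp
// Hypothetical record mirroring a poorly structured table; the front end never sees this.
public class CustRec
{
    public int cst_id { get; set; }
    public string cst_fnm { get; set; }
    public string cst_lnm { get; set; }
}

// The clean DTO the API exposes - the only contract the front end depends on.
public class CustomerDto
{
    public int Id { get; set; }
    public string FullName { get; set; }
}

public static class CustomerMapper
{
    // When the schema is eventually fixed, only this mapping changes;
    // the front end keeps consuming the same DTO.
    public static CustomerDto ToDto(CustRec rec) => new CustomerDto
    {
        Id = rec.cst_id,
        FullName = $"{rec.cst_fnm} {rec.cst_lnm}"
    };
}
```

If the team later renames `cst_fnm` to a sane `FirstName` column, every API consumer stays untouched; only the mapper moves.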

Related

Best approach for migrating database schema of shared database

I am currently struggling to find a way to migrate to a new database schema for a database shared by multiple applications, while keeping applications working with the old schema intact.
There are multiple applications performing CRUD operations on the shared database, using a self-written ORM-like library. The main problems I see with this architecture are:
Each application implements its own business logic, with a lot of code being redundant, or code which should do the same thing in every application but is implemented differently and is therefore hard to maintain
Since each application works directly with the ORM library, the other applications cannot know when data was changed by another application without monitoring/polling the database for changes
The ORM library implements only limited concurrency, offers no transactions, and is relatively slow
To solve the redundancy/inconsistency problems I am thinking about implementing a layered architecture.
Service Layer
Business Layer
Data Access Layer
Database
The applications then communicate with a SOAP web service on the service layer.
The service layer uses the business layer to perform validation and apply business logic. The business layer uses the data access layer's repositories.
I am hoping to be able to also use the business layer in the client applications, with another repository implementation, which does not access the database directly but via the SOAP web service.
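A hedged sketch of that idea (interface and class names are hypothetical, bodies are stubs): the business layer programs against a repository interface, and the client applications swap in an implementation that calls the SOAP service instead of the database.

```csharp
public class Machine
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The business layer depends only on this abstraction.
public interface IMachineRepository
{
    Machine GetById(int id);
    void Save(Machine machine);
}

// Server side: talks to the database directly (bodies elided to stubs here).
public class DbMachineRepository : IMachineRepository
{
    public Machine GetById(int id) { /* SELECT ... via the data access layer */ return null; }
    public void Save(Machine machine) { /* INSERT/UPDATE ... */ }
}

// Client side: same interface, but each call goes through the SOAP web service.
public class SoapMachineRepository : IMachineRepository
{
    public Machine GetById(int id) { /* service proxy call */ return null; }
    public void Save(Machine machine) { /* service proxy call */ }
}
```

Because both implementations satisfy the same interface, the shared business layer never knows which transport is underneath it.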
To solve the other problems I was hoping to use Entity Framework instead of the self-made ORM library. But the schema of the database is made in a kind of generic way, meaning that for each machine added to the database (the database stores facility data), several machine-specific tables are added. This results in redundant tables, named [machinename]_[tablename]. As far as I know, Entity Framework, or any other ORM, cannot deal with that (it's poor design anyway, probably meant to speed queries up).
The plan would be to migrate to another database schema, but the problem is that all the applications using the database would need to be changed to use the new schema/SOAP web service. This cannot happen from one day to the next, so it would be best if I could keep some of the applications unchanged while they still work against the single database, and then later deal with reimplementing the other applications to use the web service.
I already thought about using views to simulate the old schema, so that the old applications can still work with the changed schema, but unfortunately the self-made ORM does not support working with views.
I don't expect anyone to present me a solution but rather some basic approaches and/or ideas to improve the overall architecture of the system.

Design of Data Access Layer with multiple databases

I have read many posts concerning the issue of having several databases and how to design a DAL efficiently in this case. In many cases, the forum suggests to apply the repository pattern, which works fine in most cases.
However, I find myself in a different situation. I have 3 different databases: Oracle, OLE DB and SQL Server. Currently, there exists a unique DAL with many different classes sending SQL queries down to a layer below to be executed in the corresponding database. The way the system works is that two of the databases are only used to read information, and the other one is used to either store this same information or read it later on. I have to propose a better design to the current implementation, but it seems as if a common interface for all three databases is not plausible from an architectural point of view.
Is there any design pattern that solves this situation? Should I have three different DALs? Or perhaps it is possible (and advisable) to apply the repository pattern to this problem?
Answers to your question will probably be very subjective, but here are some thoughts.
You could apply command-query separation. The query side integrates directly with your data layer, bypassing any business or domain layer; the entities it returns are optimized for reads and projected from your databases. This layer could also be responsible for merging results from different database calls.
The command side consists of command handlers using domain or business entities, which are mapped from your read/write database.
By doing this, the interface that you expose will be clearer and more business-oriented.
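A minimal sketch of that separation (all names hypothetical): the query side returns flat read models shaped for the caller, while state changes go through command handlers that work against the read/write database.

```csharp
// Query side: a read model shaped for the client, possibly merged from several databases.
public class OrderSummary
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

public interface IOrderQueries
{
    OrderSummary GetSummary(int orderId);
}

// Command side: each state change is a command with its own handler.
public class ShipOrder
{
    public int OrderId { get; set; }
}

public interface ICommandHandler<TCommand>
{
    void Handle(TCommand command);
}

public class ShipOrderHandler : ICommandHandler<ShipOrder>
{
    public void Handle(ShipOrder command)
    {
        // Load the domain entity, apply business rules, persist.
        // The query side never sees any of this.
    }
}
```

The payoff is that the read path can be denormalized and fast while the write path stays business-oriented, which matches the "clearer, business-oriented interface" point above.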
I'm not sure that completely abstracting the data access layer with custom units of work and repositories is really needed: do the advantages outweigh the disadvantages? They rarely do, because will you ever actually change database technology? And if you do, that probably means a rewrite anyway. Also, if you use Entity Framework Code First, you already have a unit of work and an abstraction on top of your database, plus the flexibility of LINQ.
Bottom line: try not to over-engineer/over-abstract things, or make things super-generic.
Your core/business code should never be dependent on any contract/interface/class that is placed in the DAL layer of the application.
Accessing data is something the business/core layer of your application needs to be able to do, and it should be able to do so without any dependency on SQL statements and without any knowledge of the underlying data access technology.
I think you need to remove any SQL statements from the core part of the application. SQL is vendor-dependent, and any dependency on a specific database engine needs to be cleaned out of your core and moved to the DAL where it belongs. Then you need to create interfaces that reside outside of the DAL(s), for which you then create implementation classes in one or many DAL modules/classes. Your DAL can depend on your core, but not the other way around.
I don't see why the repository pattern can't be used in this case. When I have a database which I can only read from, I usually let the name of the repository interface indicate this, e.g. ICompanyRepositoryRead.
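For illustration (an assumed shape, not the poster's actual code): a read-only contract can serve the two read-only databases, while the database the application owns gets a full read/write contract that reuses the read side.

```csharp
public class Company
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Contract for the two databases the system only reads from.
public interface ICompanyRepositoryRead
{
    Company GetById(int id);
}

// The read/write database gets the full contract, inheriting the read side.
public interface ICompanyRepository : ICompanyRepositoryRead
{
    void Save(Company company);
}

// Simple in-memory implementation to show both contracts satisfied at once.
public class InMemoryCompanyRepository : ICompanyRepository
{
    private readonly System.Collections.Generic.Dictionary<int, Company> _store =
        new System.Collections.Generic.Dictionary<int, Company>();

    public Company GetById(int id) => _store[id];
    public void Save(Company company) => _store[company.Id] = company;
}
```

Consumers that only need reads take an `ICompanyRepositoryRead` dependency, which documents the intent in the type system rather than in comments.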

Good architecture to shift a WPF desktop application to Client Server technology

I have a working WPF application that runs on a single PC. I have used a SQL Server database, Entity Framework to communicate with the database, and RDLC reporting in the application. Now the requirement has arrived to make this application work on the local company network, where multiple users (normally around 25 at most) will access the application depending upon their roles and the permissions set. I did some R&D on this, primarily following the architecture mentioned here: http://www.codeproject.com/Articles/434282/A-N-Tier-Architecture-Sample-with-ASP-NET-MVC-WCF. After doing so, I have made a paper design/architecture of the application that will look like this:
A WCF service running on a high-end server within the company network:
GPC.Service itself - defines the protocol to connect to the service and all other necessary information
GPC.Algorithm - will be the main business logic layer that will contain the logic and will be the interface for clients to call the database layer methods
GPC.Persistance - will have the actual database interaction methods, like fetching/storing/updating/deleting records in the database
GPC.Data - this will contain the edmx schema for the Entity Framework
GPC.Entites - this will contain the entities of the database schema and additional partial classes
Clients:
The client will be a WPF application based on the MVVM pattern for now (maybe in the future we will need to move to a web application, but that is not required for now). The main components of the application are:
Import from Excel: currently all data is in Excel files; all that data needs to be imported into the system.
Edit/Update/Delete: once data is imported, provide an interface for the user to edit/update/delete records.
Generate reports (using RDLC for this).
Users/roles management, etc.
Shared:
This is a library that contains different miscellaneous classes, like code to read Excel files, error handling, collections that will be bound to the UI, etc.
DB context: will be created in a using statement inside the Persistance layer for each method, to ensure no stale information is left.
Does this architecture follow the n-tier architecture, and is it flexible? What improvements are required, and please guide me on how to fix whatever issues there are. I want to make sure this is a good architecture before I go ahead and change my existing application.
It seems like you are on the correct path; however, you may be over-engineering in some areas.
I think to a large degree Entity Framework deals with the Entities, Data and Persistence layers for you. Implementing them yourself may be overkill unless you are looking to ultimately replace Entity Framework with some other ORM system.
You are alluding to SOA (Service Oriented Architecture) here with your GPC.Service library. You need to look at how you can break down your service layer into one or more atomic services which will serve the client application. There are a number of ways of going about this, and the right one depends largely on how you plan to use the service layer going forward. Take a look at RESTful services, which break the service layer down nicely and will guide you into building neat atomic services. Check out the ASP.NET Web API for this.
I think what you are looking for in your GPC.Algorithm library is really a domain model. A domain model encapsulates all your business logic and allows you to perform state changes on your objects via the public functions which you expose. With this in mind, the layers of the system would appear as follows:
Persistence (EF) -> Domain Model -> Service Layer -> DTO (Data Transfer Objects) -> Client
The DTO objects mentioned above would be a set of POCOs (Plain Old C# Objects) responsible for carrying data to and from your client. You need these since serializing and deserializing your domain objects will become problematic due to back references and other encapsulation issues. Putting DTOs in place enforces a context boundary, which is one of the tenets of SOA: "boundaries are explicit".
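To illustrate the back-reference problem with hypothetical types: a domain model with a parent-child cycle will trip up naive serializers, while the flattened DTO serializes cleanly.

```csharp
using System.Collections.Generic;
using System.Linq;

// Domain model: Order and OrderLine reference each other - a cycle that
// makes naive serialization recurse forever or fail.
public class Order
{
    public int Id { get; set; }
    public List<OrderLine> Lines { get; } = new List<OrderLine>();
}

public class OrderLine
{
    public Order Parent { get; set; }   // back reference
    public decimal Amount { get; set; }
}

// DTO: flat, cycle-free, safe to put on the wire.
public class OrderDto
{
    public int Id { get; set; }
    public List<decimal> LineAmounts { get; set; }
}

public static class OrderDtoMapper
{
    public static OrderDto ToDto(Order order) => new OrderDto
    {
        Id = order.Id,
        LineAmounts = order.Lines.Select(l => l.Amount).ToList()
    };
}
```

The DTO is also where you enforce the explicit boundary: it carries exactly what the service contract promises, no more.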
With respect to the client side, it seems like you are on track. What you may want to do is refactor your current client application so that all data queries are consolidated into a single layer; when the time comes, you can just replace that layer with the service implementation.
This makes perfect sense (try to build it TDD-style).
In order to make your life a bit easier with client version management, consider using a ClickOnce installer to enforce installation of the latest version on your users' computers (this headache will be gone once you move to a web app).

Entity Framework - Interact with Oracle and SQL Server

I am working on a .NET Web API service (with OData support) to support a mobile client. The service should support both Oracle and SQL Server databases, but only one database type will be used at a time, according to whichever database technology the client is using.
How do I create a database-agnostic data access layer? I don't want to write the code twice - once for SQL Server and once for Oracle.
It also seems that in order to support Oracle in EF, third-party Oracle drivers are required - either from Devart or Oracle's ODP.NET.
I am debating whether I should use old-style ADO.NET or EF for building the data access layer.
I will appreciate any help on this.
Thanks!
Your question seems to revolve around multiple concerns; I'll give answers based on my views on them:
1. How can you create a database-engine-agnostic DAL?
A: One approach is to follow the Repository pattern and/or use interfaces to decouple the code that manipulates the data from the code that retrieves/inserts it. The actual implementation of the interfaces your code uses to get the data can also be tailored to be engine-agnostic; if you're going to use ADO.NET, check out the Enterprise Library for some very useful code that is engine-agnostic. Entity Framework is also compatible with different DB engines, but, as you mentioned, it can only interact with one DB at a time, so whenever you generate the model you tie it to the specifics of the DB engine your DB is hosted in. This is related to another concern in your question:
2. Should you use plain old ADO.NET or EF?
A: This is a very good question which I'm sure has been asked many times before. Given that both approaches give you the same practical result - being able to retrieve and manipulate data - the real question is: what is your personal preference for coding, and what are the time/resource constraints of the project?
IMO, Entity Framework is best suited for Code-First projects and for cases where your business logic doesn't require complex logging, transactions, or other security or performance constraints on the DB side - not because EF is incapable of meeting these requirements, but because it becomes rather convoluted and impractical to do so, and I personally believe that defeats the purpose of EF: to provide you with a tool that allows for rapid development.
So, if the people involved in the project are not very comfortable writing stored procedures in SQL, and the data manipulation will revolve mostly around your service without the need for very complex operations on the DB side, then EF is a suitable approach, and you can leverage the Repository pattern as well as interfaces to implement context objects that will give you a DB-agnostic DAL.
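One hedged sketch of engine-agnostic ADO.NET (the factory names in the comment are the common ones, but treat them as assumptions for your environment): code written against the abstract `System.Data.Common` types runs unchanged whether the configured factory is SQL Server's or Oracle's.

```csharp
using System.Data.Common;

public static class AgnosticDb
{
    // The factory decides the concrete engine; the calling code never does.
    public static DbConnection CreateConnection(DbProviderFactory factory, string connectionString)
    {
        DbConnection connection = factory.CreateConnection();
        connection.ConnectionString = connectionString;
        return connection;
    }

    public static DbCommand CreateCommand(DbConnection connection, string commandText)
    {
        DbCommand command = connection.CreateCommand();
        command.CommandText = commandText;
        return command;
    }
}

// The factory itself comes from configuration, e.g.
//   SqlClientFactory.Instance              (SQL Server)
//   OracleClientFactory.Instance (ODP.NET) (Oracle)
// or DbProviderFactories.GetFactory(providerInvariantName) where providers are registered.
```

Keep vendor-specific SQL out of the shared code path, though: the abstraction only buys you engine independence if the statements you issue are portable too.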
However, if you are required to implement transactions, security, or extensive logging, and are more comfortable writing SQL stored procedures, Entity Framework will often prove to be a burden, simply because it is not yet suited for advanced tasks. For example:
Imagine you have a User table with multiple fields (address, phone, etc.) that are not always necessary for all user-related operations (such as authentication). Trying to map an entity to the results of a stored procedure that does not return all of the fields the entity contains will result in an error, so you will either need to create different models with more or fewer members, or return additional columns from the SP that you might not need for a particular operation, increasing bandwidth consumption unnecessarily.
Another situation is taking advantage of features such as Table-Valued Parameters in SQL Server to optimize sending multiple records to the DB at once. Entity Framework does not include anything that automatically optimizes operations on multiple records, so in order to use TVPs you will need to define that operation manually, much as you would if you had gone the ADO.NET route.
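For reference, a sketch of what that manual TVP plumbing looks like in plain ADO.NET. The table type `dbo.IntList` and stored procedure `dbo.ArchiveOrders` are assumed to exist on the server, and `System.Data.SqlClient` is assumed available; adjust both for your setup.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class OrderArchiver
{
    // Builds a command that sends many ids in one round trip via a Table-Valued Parameter.
    public static SqlCommand BuildCommand(SqlConnection connection, IEnumerable<int> orderIds)
    {
        // Shape must match the server-side table type (here: a single int column).
        var rows = new DataTable();
        rows.Columns.Add("Id", typeof(int));
        foreach (var id in orderIds)
            rows.Rows.Add(id);

        var command = new SqlCommand("dbo.ArchiveOrders", connection)
        {
            CommandType = CommandType.StoredProcedure
        };
        var parameter = command.Parameters.Add("@OrderIds", SqlDbType.Structured);
        parameter.TypeName = "dbo.IntList"; // the user-defined table type on the server
        parameter.Value = rows;
        return command;
    }
}
```

One `ExecuteNonQuery` on this command replaces N separate calls, which is exactly the batching EF won't generate for you automatically.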
Eventually, you will have to weigh the considerations of your project against what the alternatives provide. ADO.NET gives you the best performance and customization for your DB operations; it is highly scalable and allows optimizations, but it takes more time to code. EF is very straightforward and practical for object manipulation, and though it is constantly evolving and improving, its performance and capabilities are not quite on par with ADO.NET yet.
And regarding the driver issue, it shouldn't weigh too much in the matter, since even Oracle encourages you to use their driver instead of the default one provided by Microsoft.

ADO.NET Data Services: their place in overall design

ADO.NET Data Services is the next generation of data access layer within applications. I have seen a lot of examples using it directly from a UI layer such as Silverlight or AJAX to get data. This is almost like having a two-tier system, with the business layer completely removed. Shouldn't the DAL be accessed by the business layer, and not directly from the UI?
ADO.NET Data Services is one more tool to be evaluated in order to move data. .NET RIA Services is another one - much better, I would say.
I see ADO.NET Data Services as a low-level service to be used by some high-level framework. I would not let my UI talk directly to it. The main problem I see with ADO.NET Data Services has more to do with security than with anything else.
For simple/quick tasks on an intranet, and if you are not too picky about your design, it can be useful (IMO). It can be quite handy when you need to quickly expose data from an existing database. I say handy, but it would not be my first choice, as I avoid "quick and dirty" solutions as much as I can. Those solutions are like ghosts: they always come back to haunt you.
ADO.NET Data Services is the next generation of data access layer within applications
I have no idea where you got that from! Perhaps you're confusing ADO.NET Data Services with ADO.NET Entity Framework?
One shouldn't assume that everything Microsoft produces is of value to every developer. In my opinion, ADO.NET Data Services is a quick way to create CRUD services, which may have a few other operations defined on the entity, but those operations are all stored procedures. If all you need is a database-oriented service, then this may be what you want. Certainly, there's relatively little reason to do any coding for a service like this, except in the database.
But that doesn't mean that ADO.NET Data Services "has a place in the overall design" of every project. It's something that fills a need of enough customers that Microsoft thought it worthwhile to spend money developing and maintaining it.
For that matter, they also thought ASP.NET MVC was a good idea...
:-)
In my opinion, the other answers underestimate the importance of ADO.NET Data Services. Though using it directly in your application brings some similarity to a two-tier system, other Microsoft products, such as .NET RIA Services and Windows Azure Storage Services, are based on it. Contrary to the phrase in one of the answers - "For simple/quick tasks on an intranet, and if you are not too picky about your design, it can be useful" - it may be useful for public websites, including websites in ASP.NET MVC.
Dino Esposito describes the driving force for ADO.NET Data Services in his blog:
http://weblogs.asp.net/despos/archive/2008/04/21/the-quot-driving-force-quot-pattern-part-1-of-n.aspx
"ADO.NET Data Services (aka, Astoria)
Driving force: the need of building richly interactive Web systems.
What's that in abstract: New set of tools for building a middle-tier or, better yet, the service layer on top of a middle-tier in any sort of application, including enterprise class applications.
What's that in concrete: provides you with URLs to invoke from hyperlinks to bring data down to the client. Better for scenarios where a client needs direct, partially filtered access to data. Not ideal for querying data from IE, but ideal for building a new generation of Web controls that breathe AJAX. And just that."
