I have been playing around with the EF to see what it can handle. Many articles and posts explain the various scenarios in which the EF can be used, but they somehow miss the "con" side. So my question is: in what kind of scenarios should I stay away from the Entity Framework?
If you have some experience in this field, tell me what scenarios don't play well with the EF. Tell me about downsides you experienced where you wished you had chosen a different technology.
The Vote of No Confidence lists several missteps and/or missing bits of functionality in the eyes of those who believe they know what features, and their implementations, are proper for ORM/Datamapper frameworks.
If none of those issues are a big deal to you, then I don't see why you shouldn't use it. I have yet to hear that it is a buggy mess that blows up left and right. All cautions against it are philosophical. I happen to agree with the vote of no confidence, but that doesn't mean you should. If you happen to like the way EF works, then go for it. At the same time I'd advise you to at least read the vote of no confidence and try to get a rudimentary understanding of each of the issues in order to make an informed decision.
Outside of that issue, and to the heart of your question: you need to keep an eye on the SQL that is being generated so you can make tweaks before a performance problem gets into production. Even if you are using procs on the backend, I'd still look for scenarios where you may be hitting the database too many times, and then rework your mappings or fetching scenarios accordingly.
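If you're on EF v1, the query objects themselves can show you that SQL. A minimal sketch, assuming a hypothetical EF-generated ObjectContext called MyEntities with a Customers set and a Customer entity:

```csharp
// Minimal sketch: dumping the store SQL that EF generates for a LINQ query.
// MyEntities and Customer are hypothetical EF-generated types.
using System;
using System.Data.Objects;
using System.Linq;

class SqlInspection
{
    static void Main()
    {
        using (var context = new MyEntities())
        {
            var query = context.Customers.Where(c => c.City == "London");

            // LINQ to Entities queries are ObjectQuery<T> under the covers;
            // ToTraceString() returns the store SQL without executing it.
            var objectQuery = (ObjectQuery<Customer>)query;
            Console.WriteLine(objectQuery.ToTraceString());
        }
    }
}
```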
One potentially big issue: Entity Framework 1.0 does not support persistence ignorance. This means that your business layer has a dependency on your data access layer.
If your entire application will be hosted in the same process (like a website on IIS) then this is no problem.
If, however, you have a need to remote your entities (to a Silverlight or Windows Mobile client for example), then your entities will not easily serialize across the wire. You will have to create separate data transfer classes to send your entities across the wire, and additional logic to marshal data between your entity classes and the DTOs.
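For illustration, a minimal sketch of that DTO approach; Employee stands in for an EF-generated entity (stubbed here) and EmployeeDto is a hypothetical transfer class:

```csharp
using System.Runtime.Serialization;

// Stub standing in for the EF-generated entity class.
public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// A plain, serializable transfer class that crosses the wire instead.
[DataContract]
public class EmployeeDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public static class EmployeeMarshaller
{
    // Copy data out of the entity into the DTO before sending...
    public static EmployeeDto ToDto(Employee entity)
    {
        return new EmployeeDto { Id = entity.Id, Name = entity.Name };
    }

    // ...and back into the tracked entity when changes return.
    public static void ApplyChanges(EmployeeDto dto, Employee entity)
    {
        entity.Name = dto.Name;
    }
}
```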
I'm also just at the 'playing around' stage, and although I was worried about the lack of built-in persistence ignorance, I was sure there would be a work-around.
In fact, in an n-tier architecture it turns out you don't even need a work-around.
WCF + EF
If I've read the article correctly, then I don't see any problem serializing entities across the wire (using WCF) and also the persistence ignorance isn't a problem.
This is because I'd use PI mainly for unit-testing.
Unit Testing is possible! (I think)
In this system, we could simply use a mock service (by wrapping the call to the service in ANOTHER interface-based class, which could be produced from a factory, for example). This would test OUR presenter code (there's no need to unit-test the EF/DAL - that's Microsoft's job!). Of course, integration tests would still be required to achieve full confidence.
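A minimal sketch of that idea; all the names here (IEmployeeService, EmployeePresenter, FakeEmployeeService) are invented for illustration:

```csharp
// The presenter depends only on this interface, never on WCF or EF directly.
public interface IEmployeeService
{
    string GetEmployeeName(int id);
}

public class EmployeePresenter
{
    private readonly IEmployeeService _service;

    public EmployeePresenter(IEmployeeService service)
    {
        _service = service;
    }

    // The presenter logic we actually want to unit-test.
    public string GetDisplayName(int id)
    {
        return _service.GetEmployeeName(id).ToUpper();
    }
}

// In a unit test, substitute a hand-rolled fake; no service or database needed.
public class FakeEmployeeService : IEmployeeService
{
    public string GetEmployeeName(int id) { return "test name"; }
}
```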
If you wanted to write to a separate database, this would be done in the DAL layer, easily achieved via the config file.
My Tuppence Worth
My opinion: make up your own mind about the EF and don't be put off by all the doom and gloom that's doing the rounds about it. I'd guess that it's going to be around for a while and MS will iron out the faults in the next year or so. PI is definitely coming, according to Dan Simmons.
EDIT: I've just realised I jumped the gun and like a good politician didn't actually answer the question that was asked. Oops. But I'll leave this in in case anyone else finds it useful.
Not all data models map nicely to application Entities. If the mapping isn't relatively straightforward, I'd skip the Entity Framework. You'll find yourself doing handstands to make it work without any clear benefit.
Anders Hejlsberg had some interesting comments about object/relational mapping here.
Since EF does not support POCO, it can be difficult to write good unit tests against. That was one of the knocks against it in the Vote Of No Confidence.
If you're wanting to write good tests, EF will raise obstacles. You can work around them, but it is non-trivial.
Although SQL CE 3.5 SP1 and Entity Framework 4.0 Beta 1 each support identity columns, identity columns are not supported when the two products are used together (at least up to the versions listed). You will be required to set primary keys on your own.
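For illustration, a minimal sketch of the workaround, assuming a hypothetical EF v1-style generated context (MyEntities, with a generated AddToOrders method) and an Order entity keyed by a Guid:

```csharp
using System;

public static class OrderInserter
{
    public static void InsertOrder()
    {
        using (var context = new MyEntities()) // hypothetical generated context
        {
            var order = new Order();
            // No identity-column support in this combination, so generate the
            // key ourselves; a GUID avoids querying for the current max value.
            order.Id = Guid.NewGuid();
            context.AddToOrders(order); // EF v1-style generated AddTo method
            context.SaveChanges();
        }
    }
}
```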
Other than that, I've been enjoying EF with SQL CE.
I am working on a web application that follows DDD principles, using ASP.NET Web API and Entity Framework.
I know that my question is weird, but I am looking for guidance from people who have been through this.
In the past I mapped between DTOs, domain models and data models by hand, but lately I have found it harder and harder to keep doing that.
Is it reliable to depend on an existing automapper library to handle the mapping in our applications, and which automappers are available for use with a C# web application and Entity Framework?
I would say that using AutoMapper is quite useful; it saves a lot of time and lets me focus more on the domain logic rather than on mapping issues.
I have used http://automapper.org/ in many projects and it is very powerful, especially once you understand how to configure the mappings the right way.
There is certainly a performance cost to using an automapper rather than mapping manually (in most cases), but considering the development time it saves, it is worth a try.
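For illustration, a minimal sketch of a typical AutoMapper setup using its instance API; EmployeeEntity and EmployeeDto are invented types:

```csharp
using AutoMapper;

public class EmployeeEntity { public int Id { get; set; } public string Name { get; set; } }
public class EmployeeDto { public int Id { get; set; } public string Name { get; set; } }

public static class MappingExample
{
    public static EmployeeDto Map(EmployeeEntity entity)
    {
        // In a real application, configure once at startup and reuse.
        var config = new MapperConfiguration(cfg =>
            cfg.CreateMap<EmployeeEntity, EmployeeDto>());
        config.AssertConfigurationIsValid(); // catches unmapped members early

        var mapper = config.CreateMapper();
        return mapper.Map<EmployeeDto>(entity);
    }
}
```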
There are many articles and questions that discuss this subject in more detail, with comparisons of different mapping methods and tools:
http://geekswithblogs.net/mrsteve/archive/2016/11/29/object-mapper-performance-comparison-revisited.aspx
Which is faster: Automapper, Valuinjector, or manual mapping? To what degree is each one faster?
https://softwareengineering.stackexchange.com/questions/167250/am-i-wrong-in-thinking-that-needing-something-like-automapper-is-an-indication-o
and much more, just google it...
I am designing this HR system (desktop-based) for a mid-size organization. I have all the tables designed and was planning on using the O/RM in VS2008 to generate the entity classes (this is the first time I've worked with an O/RM; in fact, this is my first "big" project). I wanted to build the app with 3 layers (one of the programmers at the company suggested not 3 but 4 or 5 layers), but after reading quite a lot of blog entries and a lot of questions here, I've realized that this is not easy to do with LINQ to SQL, because of how the DataContext works and how difficult it is to pass objects between layers using LINQ to SQL.
Probably I'll just use the entity classes generated by the VS2008 O/RM and add any validation and business logic in partial classes. But wouldn't that be just 2 layers? The app will be used by around 10 users, so I don't think the 2-layer approach is a big issue for now.
In the future, a web-based front-end will be developed so candidates can apply for jobs online. I want to make it as scalable as possible. But the truth is I don't have a lot of time to make this decision; time's running out, hehe.
Having said all that, should I just use the entities generated by the VS2008 ORM?
So any suggestion or idea would be greatly appreciated. Thanks.
You're chewing over quite a lot with your line of questioning here. (Is there a concrete question hidden in there somewhere?)
With layers, I assume you mean physical boundaries, i.e. application, app/SOA/WCF server, data layer that lives on the SOA server, and a database somewhere.
Designing for the future might seem like a good idea, but DO make sure that there WILL be a need for all those layers somewhere down the line. Essentially, you do not need a WCF/SOA based approach if you're not exposing your application over the internet at some point. A web frontend can solve the same problem in many cases.
I'm not saying you will not need those layers at all, but you might not. If you really do, seams are your friend. You need to make "cut points" where you can define your boundaries. I commonly use the repository pattern to diversify data access methodologies, with plain objects (POCOs) and interfaces that are persisted via technologies such as NHibernate. Using POCOs also makes it MUCH easier to transfer those objects over the wire at a later point, either standalone or as part of messages.
Creating service interfaces to call through can solidify your boundaries. When you are ready to move across machine/physical boundaries, you simply create your boundaries in the service implementations.
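A minimal sketch of those seams; all names are invented, and the NHibernate-backed class is just one possible implementation behind the interface:

```csharp
using NHibernate;

// Plain POCO with no persistence baggage.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// The "cut point" the rest of the application codes against.
public interface ICustomerRepository
{
    Customer GetById(int id);
    void Save(Customer customer);
}

// One implementation persists via NHibernate; swapping in a WCF-backed
// implementation later only touches this seam.
public class NHibernateCustomerRepository : ICustomerRepository
{
    private readonly ISession _session;

    public NHibernateCustomerRepository(ISession session)
    {
        _session = session;
    }

    public Customer GetById(int id)
    {
        return _session.Get<Customer>(id);
    }

    public void Save(Customer customer)
    {
        _session.SaveOrUpdate(customer);
    }
}
```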
It sure sounds like a dangerous way to go - creating the tables first, then the domain and finally the GUI.
I must admit I am no ORM expert, but the generated classes I've seen look more like data objects than classes. I would say you need another layer to stop all the logic from ending up in the GUI.
We are looking into using an ORM and I wanted some opinions/comparisons.
The basic criteria we have for an ORM are: easy to use/configure (short learning curve), flexible, the ability to abstract it away, and easy to maintain.
Here is a list of the ORMs we are looking at and our initial impressions:
Open Access - seems really easy for simple stuff, but doesn't seem to have a lot of flexibility. Cost isn't an issue - we already own it.
LINQ to SQL - looks very simple to use and configure, but is missing some functionality
Active Record - NHibernate made simple
SubSonic - looks very feature rich, but haven't really played with it much
Here are the ORMs we have looked at and ruled out:
Entity Framework - still in beta
NHibernate - far too much of a learning curve (we don't have 3 weeks to dedicate to learning it)
I'd say you should take a look at DataObjects.NET (http://www.x-tensive.com). It's feature rich and pretty easy to use. It does, though, absolutely tie you to your object model, as it decides what the database structure should be based on what your object model looks like. That being said, if you want to be able to disregard the existence of the database, it's quite nice. We've used it for years and have had great success.
We currently use SubSonic (2.0.3) and it has been an absolute lifesaver. I cannot stress enough how awesome it is. HOWEVER, we are now looking at switching away from it for various reasons (probably to NHibernate or Entity). Here are my Pros and Cons of it:
Pros:
Very simple to setup and use.
Lots of great and useful tools and features
Uses the "convention over configuration" philosophy, so very little configuration. It "just works". (As long as you do things the way it wants... :) )
Cons:
Your database design is very tightly coupled to your domain design. Make a change in your DB, and you need to change your code/domain design.
By default, SubSonic uses the ActiveRecord pattern for all data access instead of the Repository pattern, which makes it more difficult to "abstract it away". (Although I believe with v3.0 that you can swap out the default ActiveRecord templates to use the Repository pattern).
Lots of pessimistic rumours flying around about the future of SubSonic. But rumours are just that: Rumours.
For all the paired pennies it may be worth:
If you don't have 3 weeks now to learn your ORM of choice (whichever you choose), you may have to find 3 weeks to learn it later when it doesn't map something exactly how you thought it would.
If you have a model that's moderately complicated, ORM is non-trivial. You'll wind up needing to know how your ORM works so you can tell it to map things the way you want.
Which is all another way of saying "Know thy tools", of course. :)
Most folks will have a smattering of experience with one or two of those, but few will have exposure to all. I recommend a proof-of-concept effort with each of your favorites. Get each one set up, spending no more than n hours per ORM tool (n = however much time you decide is reasonable.) You don't have to implement your entire object model, a functional subset will do.
By the time you're done, you will have worked through the setup and some usage of all of them. You can then write up a post-mortem and the team can decide which has the best pain-to-feature ratio.
Use T4 templates to create your own. There are several established patterns available on the internet, especially for T4 templates.
Knowledge of T4 will also allow you to script out items that have a larger scope than macros, but a smaller scope than writing a custom app to generate the script you need.
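For a flavour of T4, a minimal, hypothetical template that emits one partial class per table; a real template would read the list from the database schema rather than hard-coding it:

```
<#@ template language="C#" #>
<#@ output extension=".cs" #>
<#
    // Hypothetical table list; a real template would query the schema.
    var tables = new[] { "Customer", "Order" };
    foreach (var table in tables)
    {
#>
public partial class <#= table #>
{
    // Generated members for the <#= table #> table would go here.
}
<#
    }
#>
```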
Hope this helps!
I very much agree with BryCoBat (upvoted). I wanted to add that if you already own Open Access, then the company very likely has people somewhere internally who are already very comfortable with it, including code examples you can look at in your own domain for both trivial and non-trivial tasks. In other words: use what you know.
If you're not using Telerik controls, LINQ to SQL should be the one to pick for fast learning - there is a huge amount of tutorials, videos and books on the web.
Assuming that writing NHibernate mapping files is not a big issue... or that polluting your domain objects with attributes is not a big issue either...
What are the pros and cons?
Are there any fundamental technical issues? What tends to influence people's choice?
I'm not quite sure what all the tradeoffs are.
The biggest pro of AR is that it gives you a ready-made repository and takes care of session management for you. ActiveRecordBase<T> and ActiveRecordMediator<T> are each a gift that you would otherwise have ended up assembling yourself under NHibernate. Avoiding the XML mapping is another plus. The AR mapping attributes are simple to use, yet flexible enough to map even fairly 'legacy' databases.
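For illustration, a minimal sketch of an ActiveRecord-mapped class (Blog is an invented example):

```csharp
using Castle.ActiveRecord;

[ActiveRecord("Blogs")]
public class Blog : ActiveRecordBase<Blog>
{
    [PrimaryKey(PrimaryKeyType.Native)]
    public int Id { get; set; }

    [Property]
    public string Name { get; set; }
}

// Usage - the base class supplies the repository-style operations:
//   var blog = new Blog { Name = "my blog" };
//   blog.Save();
//   Blog[] all = Blog.FindAll();
```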
The biggest con of AR is that it actively encourages you to think incorrectly about NHibernate. That is, because the default session management is session-per-call, you get used to the idea that persisted objects are disconnected and have to be Save()d when changes happen. This is not how NHibernate is supposed to work - normally you'd have session-per-unit-of-work or request or thread, and objects would remain connected for the lifecycle of the session, so changes get persisted automatically. If you start off using AR and then figure out you need to switch to session-per-request to make lazy loading work - which is not well explained in the docs - you'll get a nasty surprise when an object you weren't expecting to get saved does when the session flushes.
Bear in mind that the Castle team wrote AR as a complementary product for Castle MonoRail, which is a Rails-like framework for .NET. It was designed with that sort of use in mind. It doesn't adapt well to a more layered, decoupled design.
Use it for what it is, but don't think of it as a shortcut to NHibernate. If you want to use NH but avoid mapping files, use NHibernate Attributes or, better, Fluent NHibernate.
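For comparison, a minimal sketch of what a Fluent NHibernate mapping looks like (Post is an invented entity):

```csharp
using FluentNHibernate.Mapping;

public class Post
{
    public virtual int Id { get; set; }
    public virtual string Title { get; set; }
}

public class PostMap : ClassMap<Post>
{
    public PostMap()
    {
        // No XML file: the mapping lives in refactor-friendly C#.
        Id(x => x.Id);
        Map(x => x.Title);
    }
}
```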
I found ActiveRecord to be a good piece of kit, and very suitable for the small/medium projects I've used it on. Like Rails, it makes many important decisions for you, which has the effect of keeping you focused on the meat of the problem.
In my opinion, the pros and cons are:
Pros
Lets you focus on the problem at hand, because many decisions are made for you.
Includes mature, very usable infrastructure classes (Repository, Validations etc)
Writing AR attributes is faster than writing XML or NHibernate.Mapping.Attributes, IMHO.
Good documentation and community support
It's fairly easy to use other NHibernate features with it.
A safe start. You have a get-out clause. You can slowly back into a bespoke NHibernate solution if you hit walls with AR.
Great for domain-first development (generating the db).
You might also want to look up the benefits and drawbacks of the ActiveRecord pattern
Cons
You can't pretend NHibernate isn't there - you still need to learn it.
Might not be so productive if you already have a legacy database to work with.
No transparent persistence.
The built-in mappings are comprehensive, but for some projects you might need to fall back to NHibernate mappings in places. I haven't had this problem myself, but it's worth bearing in mind.
In general, I really like ActiveRecord and it has always been a time-saver, mainly because I seem to happily accept the decisions and tools baked into the library, and subsequently spend more time focusing on the problem at hand.
I'd give it a try on a few projects and see what you think.
When I started using NHibernate, I didn't learn about Castle ActiveRecord until I had already written my mapping files and made my classes. At that point, I couldn't see what Castle ActiveRecord would give me, so I didn't use it.
The second time I used NHibernate, I simply used myGeneration to make the mapping files and the classes just by having it look at my database. That saved a lot of time by itself, and allowed me to (once again) not worry about Castle Active Record.
In reality, most of your time is going to be spent writing the custom queries, and Castle ActiveRecord won't necessarily help with that -- if you use myGeneration with NHibernate, you bypass most of the work you'd need to do anyway.
Edit: I don't want to seem like a cheerleader for either myGeneration or NHibernate. I just use the tool that allows me to get my work done quickly and easily. The less time I have to spend writing data access code, the better. It doesn't mean I can't do it -- but there's little sense in re-inventing the wheel each time you write a new application. Write SQL queries and stored procedures where needed, and nowhere else. If you're doing CRUD operations, an ORM is the way to go.
Edit #2: Castle Active Record may bring more to the table than I realize -- I don't know much other than what's on their website, but if it does bring more to the table, then it would help potential adopters to be able to readily see that on their site.
I'm currently working on putting together a fairly simple ORM tool to serve as a framework for various web projects for a client. Most of the projects are internal and will not require massive amounts of concurrency and all will go against SQL Server. I've suggested that they go with ORM tools like SubSonic, NHibernate, and a number of other open source projects out there, but for maintainability and flexibility reasons they want to create something custom. So my question is this: What are some features that I should make sure to include in this ORM tool? BTW, I'll be using MyGeneration to do the code generation templates.
For the love of all that's holy (and the women and the children), do everything possible to convince them not to go with a custom O/RM solution. Why are people wanting to re-invent the wheel when there are perfectly-good, open-source wheels already in existence?!?!
If your client isn't interested in OSS because of (real or imagined) perceptions about support, have you considered any of the top-quality commercial third-party ORMs, such as LightSpeed, which comes with a nice GUI designer tool? (Screenshot of the designer omitted; source: mindscape.co.nz.)
Mindscape (the company that sells LightSpeed) is a New Zealand company based near where I live, I have met some of the devs there, and I know they are famous for having incredible customer support. And they give you the source code when you buy the software, so you can tweak it any way you like.
You probably don't want to have to roll your own ORM unless you have to and your client is willing to hand over a stupid amount of cash for you to do so.
IMO writing your own OR/M is one the worst design decisions you could ever make. "maintainability and flexibility" are reasons exactly NOT to write your own OR/M.
Please read 25 Reasons Not To Write Your Own Object Relational Mapper, and see if your client really wants to pay what it costs to build something like NHibernate ($7.6M) or SubSonic ($1.5M). Because, like ChanChan said above, you will end up with something similar to that.
There's a series of posts by Davy Brion (an NHibernate committer), who was for some reason also forced to write a custom ORM for a client.
Some of the things he covers are:
Mapping Classes To Tables
Out Of The Box CRUD Functionality
Hydrating Entities
Session Level Cache
Executing Custom Queries
Definitely worth checking out if you MUST go down this path: Build Your Own Data Access Layer Series
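For a flavour of what the first item in that list involves, here is a minimal, entirely hypothetical sketch of attribute-based class-to-table mapping; every name is invented, and a production mapper needs far more than this:

```csharp
using System;

[AttributeUsage(AttributeTargets.Class)]
public class TableAttribute : Attribute
{
    public string Name { get; private set; }
    public TableAttribute(string name) { Name = name; }
}

[Table("tbl_Employee")]
public class Employee { /* properties elided */ }

public static class MappingReader
{
    // Resolve the table name for an entity type via reflection.
    public static string TableFor(Type entityType)
    {
        var attrs = entityType.GetCustomAttributes(typeof(TableAttribute), false);
        if (attrs.Length == 0)
            return entityType.Name; // convention: class name == table name
        return ((TableAttribute)attrs[0]).Name;
    }
}
```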
You need to go the NHibernate route, in my experience, and have some kind of map between your objects and the database. This allows your objects to have things that are hard to represent in a database but easier to represent in POCOs.
Generation gets you started by giving you classes that match your schema, but if you plan on maintaining or testing anything, mapping is pain now for pleasure later.
SubSonic is a great model, and it's open source; if you must go the generation route, use their templates in myGeneration to get a leg up.
BTW: I've done what you are doing, and I ended up with something very similar to SubSonic; I now advise my clients to take the SubSonic source and fork it for themselves.
Maybe, just maybe, you badly need some "features" that do not yet exist in the existing solutions. Maybe you need something simpler. $1.5M for SubSonic is simply outrageous.
Maybe you want to use POCOs. Maybe you want to use the stuff easily in a 3-tier scenario.
Maybe you don't want to support ALL the RDBMSes on the planet, so you can hardcode and optimize the code just for your target. Maybe you want to implement smarter object tracking. Maybe some design decisions made by the existing ORMs drive you crazy...
I myself am using a custom ORM developed by me, myself and I, and I am satisfied that I did it. There is no hidden dragon under the carpet, no surprise scenario. My ORM does exactly what I want it to do, nothing less, nothing more.
Second-level cache
Allows you to maintain entity instances in memory.
Automatic dirty-checking
Allows changes to an already-loaded object to be persisted without an explicit update call (see the sketch after this list).
Powerful query language
Powerful cascade operations
Powerful primary key generation strategies
The ORM framework should pick the best primary key generation strategy for the target database.
Support for composite elements
Support for events
onSave, onUpdate and so on.
Good documentation and reference books
Support for conversational state
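As promised above, a minimal sketch of one common way to implement automatic dirty-checking, by snapshotting state when an entity is loaded; all names are invented, and a real ORM would do this far more efficiently:

```csharp
using System.Collections.Generic;

public class DirtyTracker
{
    private readonly Dictionary<object, string> _snapshots =
        new Dictionary<object, string>();

    // Take a snapshot when an entity is loaded from the database.
    public void Track(object entity)
    {
        _snapshots[entity] = Snapshot(entity);
    }

    // At flush time, only entities whose state changed need an UPDATE.
    public bool IsDirty(object entity)
    {
        string original;
        return _snapshots.TryGetValue(entity, out original)
            && original != Snapshot(entity);
    }

    // Crude snapshot: concatenate property values via reflection.
    private static string Snapshot(object entity)
    {
        var parts = new List<string>();
        foreach (var prop in entity.GetType().GetProperties())
        {
            if (prop.GetIndexParameters().Length > 0) continue; // skip indexers
            parts.Add(prop.Name + "=" + (prop.GetValue(entity, null) ?? ""));
        }
        return string.Join("|", parts.ToArray());
    }
}
```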
No-one has mentioned it yet, but go with LLBLGen. You can customise the template as you like, and you can also, obviously, write your own custom code in the generated classes. Buy it. You will never look back, and you will be saying "Thank you, silky!" when it consistently works beautifully. (I didn't write it, but I love it.) If it doesn't work out for you, you may also say "Damn you, silky!" - but that's unlikely. Either way, I offer it as an option.
The only bad thing I've noticed about LLBLGen is its support for switching between databases/servers on the fly. It doesn't support a feature that I'd like: the ability to detect that a given entity you retrieved doesn't "exist" in the new database you've switched to. But this is a rare case.
I suggest LLBLGen, because I was in the process of writing my own OR/M when I came across it. Never looked back.
Your job as a consultant (sounds like that's what you are) is to leverage your expertise in implementing for your clients a solution that fits their desires with a minimum cost and time investment.
If they want to build and sell an O/RM, then go to town making one. If they want anything else, use one that already exists to get the job done.
If they insist on spending money, buy an existing one (I won't name any, but there exist some good ones that are not free).
Try Devart LinqConnect - all of the LINQ to SQL features plus wide support for the most popular database servers: Oracle, MySQL, PostgreSQL, SQL Server, and SQLite. It has an incredible visual modeling tool, an advanced monitoring tool, and high-quality support - as a result, I learned it in only three weeks during my project.