I'm currently involved in a project where we will present data from an external data source to visitors, but we will also provide metadata for the entities and rewrite some of the original data.
The external data source is a SQL Server database from which I've created an .edmx file, and I've created an additional, controllable SQL Server database with its own .edmx file. But I'm not comfortable with using two entity classes for what, in my eyes, is one type of data.
Somehow I would like to merge the two data sources into one and use only one entity class that I could query. Inheritance in LINQ to Entities would be perfect, but I would prefer not to change the .edmx files manually.
As it is now, I have to create wrapper classes and populate them manually from the entity classes, or use multiple database queries to fetch the required data, which is a big turn-off performance-wise.
It feels like there has to be some sort of workaround for the problems I'm facing.
You have two options here.
First, you can extend the Entity Framework classes by using partial classes. This helps you avoid changing the generated classes.
Second, you can use Entity Framework Code First, which I would recommend, as you will have more control over your entities.
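The partial-class approach can be sketched like this; the Conference type and its members are invented for illustration, with the first half standing in for what the designer generates from the .edmx and the second half being your own file that the generator never touches:

```csharp
using System;

// Pretend this half was generated from the .edmx by the designer.
public partial class Conference
{
    public string Name { get; set; }
    public DateTime Start { get; set; }
    public DateTime End { get; set; }
}

// Your own file, e.g. Conference.Extensions.cs, safe from regeneration.
public partial class Conference
{
    // A computed property added purely in code, no .edmx change needed.
    public TimeSpan Duration => End - Start;
}
```

Because both halves compile into one class, callers see a single Conference type with the extra members.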
I'm in the process of creating a piece of software that processes data on an existing SQL Server database (which I can't change).
I'm using visual studio to develop my application in C# using WPF and the MVVM framework.
Essentially, the main purpose of my program is to gather data from the several tables of the database and present it to the user in a meaningful way. I don't want to simply display the data in the tables, but instead to gather information spread over several tables and aggregate in a certain way.
For that purpose, I've already created several views in the database (using SSMS), some of which are rather complicated.
My question is: should I map those views in Entity Framework and use the generated POCO classes as the source of a DataGrid, or should I recreate those views through LINQ queries? Secondly, can I use a list of an anonymous type as the source of a DataGrid, taking into account that it would, naturally, be read-only?
Thanks
First, if your views have no performance issues, you should map the views to POCOs. You do not need to reinvent the wheel.
Secondly, you can bind a collection of an anonymous type as the source of a DataGrid, but anonymous-type binding has limitations with automatic column generation.
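Recreating an aggregating view as a LINQ query might look like the sketch below. The Order/CustomerTotal shapes are invented for illustration; against EF the same query would run on the context's entity sets and be translated to SQL. Projecting into a named type such as CustomerTotal, rather than an anonymous type, also keeps DataGrid column generation predictable:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Order(int CustomerId, decimal Amount);
public record CustomerTotal(int CustomerId, decimal Total);

public static class ViewQueries
{
    // Equivalent of a view like:
    //   SELECT CustomerId, SUM(Amount) AS Total FROM Orders GROUP BY CustomerId
    public static List<CustomerTotal> TotalsByCustomer(IEnumerable<Order> orders) =>
        orders.GroupBy(o => o.CustomerId)
              .Select(g => new CustomerTotal(g.Key, g.Sum(o => o.Amount)))
              .OrderBy(t => t.CustomerId)
              .ToList();
}
```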
Let's say I have a project where I use Entity Framework, but I want to use my own classes instead of the EF classes.
Reasons for using my own classes:
Easy to add properties in code
Easy to derive and inherit
Less binding to the database
Now, my database has table names like User and Conference.
However, in my domain project I also call my files User.cs and Conference.cs.
That means I suddenly have two objects with the same name, which is usually very annoying to work with, because you have to use namespaces all the time to tell them apart.
My question is how to solve this problem?
My ideas:
Prefix all database tables with 'db'. I usually do this, but in this case, I cannot change the database
Prefix or postfix all C# classes with "Poco" or something similar
I just don't like any of my ideas.
How do you usually do this?
It's difficult to tell without more background, but it sounds like you are using the Entity Framework designer to generate EF classes. This is known as the "Model First" workflow. Have you considered using the Code First / Code Only workflow? When doing Code First you can have POCO classes that have no knowledge of the database, EF, or data annotations. The mapping between the database and your POCOs can be done externally in the DbContext or in EntityTypeConfiguration classes.
You should be able to achieve your goal of decoupling from EF with just one set of objects via code first.
To extend the above answer: the database table name User (or Users, as many DB designers prefer) is the identifier for the persistence store of the User object defined in your code file User.cs. None of these identifiers share the same space, so there should be no confusion. Indeed, they are named similarly to create a loose coupling across spaces (data store, code, development environment) so you can maintain sanity and others can read your code.
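When two same-named classes do have to meet in one file, a namespace alias keeps them apart without renaming either. A minimal sketch, with all namespaces and members invented for illustration:

```csharp
using System;

namespace Company.App.Data
{
    // Stands in for the EF-generated class mapped to the User table.
    public class User { public string Name { get; set; } }
}

namespace Company.App.Domain
{
    // The hand-written domain class, deliberately sharing the name.
    public class User { public string DisplayName { get; set; } }
}

namespace Company.App
{
    // Aliases make the intent explicit where both types are in play.
    using DbUser = Company.App.Data.User;
    using DomainUser = Company.App.Domain.User;

    public static class UserMapper
    {
        public static DomainUser ToDomain(DbUser row) =>
            new DomainUser { DisplayName = row.Name };
    }
}
```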
I am very new to Entity Framework and I am trying to do something like the following scenario.
I am creating an ASP.NET Web Forms application. That website needs to connect to two databases whose schemas are completely different.
I have no idea what the basic structure should be.
I thought of having the EF models in a class library. Please guide me with instructions, since I have little knowledge of this.
Since you are using two different databases, the only viable option is to create two separate conceptual models. Even if you were able to merge two different databases into a single conceptual model, it would be a pain to maintain if the databases are of mentionable size.
The two models could reside within the same project, in separate folders, to get different namespaces.
E.g.:
Company.MyApp.DataModel
Company.MyApp.DataModel.Model1
Company.MyApp.DataModel.Model2
Then you could put a new layer on top of these two models which does all the heavy lifting; it could even make them look like one database if you want, or merge data from entities in both models into a DTO or something similar.
Check out the Repository pattern.
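A repository sitting on top of the two models might be sketched like this. The interfaces stand in for the two EF contexts, and every type and member name here is invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Customer(int Id, string Name);            // lives in database 1
public record Invoice(int CustomerId, decimal Amount);  // lives in database 2
public record CustomerDto(string Name, decimal Invoiced);

// Thin abstractions over the two conceptual models.
public interface IModel1Source { IEnumerable<Customer> Customers { get; } }
public interface IModel2Source { IEnumerable<Invoice> Invoices { get; } }

public class CustomerRepository
{
    private readonly IModel1Source _db1;
    private readonly IModel2Source _db2;

    public CustomerRepository(IModel1Source db1, IModel2Source db2)
    {
        _db1 = db1;
        _db2 = db2;
    }

    // Callers get one merged DTO and never see that two databases are involved.
    public CustomerDto Find(int id)
    {
        var customer = _db1.Customers.FirstOrDefault(c => c.Id == id);
        if (customer is null) return null;
        var total = _db2.Invoices.Where(i => i.CustomerId == id).Sum(i => i.Amount);
        return new CustomerDto(customer.Name, total);
    }
}
```

The interfaces also make the merging layer testable with in-memory fakes, independent of either database.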
If you think about it, when you create an EDM model with Visual Studio, it asks you for an existing database, and when it has finished creating the model, it generates an EF connection string that internally wraps the underlying database connection string.
E.g.: metadata=res://*/EFTestModel.csdl|res://*/EFTestModel.ssdl|res://*/EFTestModel.msl;provider=System.Data.SqlClient;provider connection string="Data Source=.\;Initial Catalog=EFTest;Integrated Security=True;MultipleActiveResultSets=True"
So each model matches only one database, only one connection string.
EF4 still does not support creating one conceptual model that works with N storage models. At least, this is not supported with any built-in provider. Perhaps in the future this could be done through a new provider that combines support for many storages (from the same provider or different ones).
I haven't done enough research on it, but perhaps Windows Server AppFabric (codename "Velocity") could be the bridge across this gap.
Note: I have even tried manually editing the XML for the EDM (.edmx) to insert a second Schema element, but it does not match the EDM XML schema, so VS warns about it: Error 10021: Duplicated Schema element encountered.
You are going to use the database-first approach, as the database already exists.
You will need to create two .edmx models for the two databases.
You can create one model project, keep the connection strings for both models in the app.config file, and then create your .edmx files.
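The resulting app.config might look roughly like the sketch below, with one EF connection string per model; the model names, catalogs, and server are invented for illustration:

```xml
<!-- Sketch: one EntityClient connection string per .edmx model.
     Model, catalog, and server names are illustrative. -->
<configuration>
  <connectionStrings>
    <add name="Model1Entities"
         connectionString="metadata=res://*/Model1.csdl|res://*/Model1.ssdl|res://*/Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=Db1;Integrated Security=True&quot;"
         providerName="System.Data.EntityClient" />
    <add name="Model2Entities"
         connectionString="metadata=res://*/Model2.csdl|res://*/Model2.ssdl|res://*/Model2.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=Db2;Integrated Security=True&quot;"
         providerName="System.Data.EntityClient" />
  </connectionStrings>
</configuration>
```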
I am working on a project where we need to fetch data from a WCF service. The service code queries the database through Entity Framework. However, in order to avoid sending the EF-generated classes across the wire into the client's generated proxy, we have decided to map values from the EF classes into custom-built DTO classes, where a mapper class is responsible for picking values out of the EF-generated classes and putting them into the DTO class. We then use those DTO classes for the service methods' requests and responses.
The EF builds classes from tables that are related to each other. I get various classes with properties that look something like these:
public global::System.Data.Objects.DataClasses.EntityCollection<SubAttachment> Attachments
{}
public global::System.Data.Objects.DataClasses.EntityReference<Gl> GlCodeReference
{}
A few of the properties have the word Reference appended to their names, which I am guessing provides some way for EF to look up the related table for that field.
Is there a better or different approach than this mapping approach if I don't want to send heavy EF classes across? If not, is there some reference material that will help me understand how the classes are built by Entity Framework?
Thanks for your time
Since you need to fetch data from a WCF service that is backed by Entity Framework, have you considered using OData to expose the EF objects? Check out the links below:
http://www.odata.org/
http://www.hanselman.com/blog/ODataBasicsAtTheAZGroupsDayOfNETWithScottGu.aspx
Link
When you create classes in EF, they have [DataMember] attributes on their fields, and that's the only data that gets sent across the wire. So it's not as heavy as it seems...
But since you're passing them through WCF, the entities should be generated as self-tracking entities, so that when they get back to the service you know what has changed and don't have to refetch every entity from the database to do the comparison.
If you still want the DTOs, you can generate them as well. If you're using EF 4.0, you have the option of extracting a T4 (.tt) file that practically does the code generation; use that, alter it to suit your needs, and generate DTOs as well as mapper classes...
To get a .tt file from an .edmx (EF4 only): right-click your model, choose "Add Code Generation Items", and pick the EntityObject generator, or the other one if you want objects transferred through WCF. This creates a .tt file that you can run by issuing a save command (you'll get a prompt asking whether to allow it to run). When saved, the EntityObject generator produces a file that's exactly the same as the one generated by the .edmx model; with the other generator you'll get two .tt files...
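Whether generated or hand-written, the DTO-plus-mapper shape the question describes is simple. A minimal sketch, where Attachment stands in for the EF-generated entity (its shapes are invented here; the real entity would carry EntityCollection and EntityReference properties):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Attachment            // stands in for the EF entity
{
    public int Id { get; set; }
    public string FileName { get; set; }
}

public class AttachmentDto         // what actually crosses the wire
{
    public int Id { get; set; }
    public string FileName { get; set; }
}

public static class AttachmentMapper
{
    // Copies only the scalar values; navigation properties are deliberately
    // left behind so no entity graph is dragged across the service boundary.
    public static AttachmentDto ToDto(Attachment entity) =>
        new AttachmentDto { Id = entity.Id, FileName = entity.FileName };

    public static List<AttachmentDto> ToDtos(IEnumerable<Attachment> entities) =>
        entities.Select(ToDto).ToList();
}
```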
I've used something very similar to the approach in the link below along with some custom partial classes and it worked quite nicely.
Link
I want a strongly-typed DataSet along with designer TableAdapters, but Visual Studio's DataSet designer generates provider-specific (e.g. SQL Server vs. MySql) code and I don't want to commit to just one provider. An ORM would help, but:
Entity Framework is 3.5 only and doesn't play nice with DataSets, and
NHibernate doesn't support SQLite.
Here's what I've come up with:
"DataSets.Masters" contains a completely designed DataSet bound to some particular provider (e.g. SqlClient), including:
a CustomTableAdapter component, subclassed by each designer TableAdapter,
an ITableAdapterManager interface, implemented by designer's TableAdapterManager for hierarchical updates.
Everything except the DataSets.MyDataSetTableAdapters namespace is copied into the "DataSets" project, where all the TableAdapter code (along with xs:annotation) is removed.
The DataSets.MyDataSetTableAdapters namespace, along with MyDataSet.xsd etc., is copied and customized into each of "DataSets.SqlClient", "DataSets.SQLite", etc. each of which references the "DataSets" assembly.
Now I just have to choose the right assembly to load my ITableAdapterManager implementation from, based on any given connection string. When the table schema changes, I modify the Masters assembly, copy code to the production assemblies, and run a few tests.
So my question: am I making this too difficult? DataSets are so standard, and the need to support multiple database engines via the data access layer is so common, is there a method that doesn't involve copy, paste, and search & replace? What do you do?
It might be easier to simply ignore the autogenerated TableAdapter commands and use the ADO.NET data access factory objects when it's time for your CRUD operations. That way you can use DbProviderFactory.CreateCommandBuilder to correctly format the parameters in the CRUD operations. Note that this assumes you aren't doing any tricky property mapping and your schema will remain consistent across data providers.
An additional option, if you use this technique, is to create a class that you can set as the BaseClass property on your TableAdapters. Add an "init"-type method that overrides the connection and the insert, delete, select, and update commands with ones from the factory (based on the auto-generated select command, which should be compatible across most providers).
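The factory-based approach can be sketched as follows; on .NET Framework the factory would come from DbProviderFactories.GetFactory with the providerName stored next to the connection string, and the table and method names here are invented for illustration:

```csharp
using System;
using System.Data.Common;

public static class ProviderNeutralCrud
{
    // The plain SELECT that every provider understands; a DbCommandBuilder
    // created from the same factory can derive the matching INSERT, UPDATE,
    // and DELETE commands in the provider's own parameter syntax.
    public static string BuildSelectText(string table) => "SELECT * FROM " + table;

    // Builds a command without hard-coding any SqlClient (or other
    // provider-specific) type anywhere.
    public static DbCommand CreateSelect(DbProviderFactory factory,
                                         string connectionString,
                                         string table)
    {
        DbConnection connection = factory.CreateConnection()
            ?? throw new InvalidOperationException("provider has no connection type");
        connection.ConnectionString = connectionString;

        DbCommand command = factory.CreateCommand()
            ?? throw new InvalidOperationException("provider has no command type");
        command.Connection = connection;
        command.CommandText = BuildSelectText(table);
        return command;
    }
}
```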
NHibernate does support SQLite; see http://www.hibernate.org/361.html.
I recommend using NHibernate combined with Fluent NHibernate. Fluent NHibernate is a library that allows you to use NHibernate without having to deal with any XML yourself, which in my opinion is NHibernate's greatest drawback.
Fluent NHibernate also supports an auto-persistence model: if your domain objects are close to your database schema, you can automap your entire business domain without writing mapping code for every single object. The further your business objects diverge from your database, the more complex the automapping features of Fluent NHibernate become, and it's worth using static mappings instead.