Is the Entity designer in VS 2008 just plain broken? - c#

I was working in L2E with 3 simple tables whose purposes are fairly straightforward: Users, Users_Roles, Roles.
Users had a 1:many relationship with Users_Roles, referenced on the column UserID (a uniqueidentifier). At the time, Roles had no relation to any tables. I brought all 3 tables into the designer, which reflected the mapping of Users to Users_Roles, while Roles sat all by itself.
Realizing Roles should be mapped to Users_Roles in a many:many relationship, I jumped into Management Studio. I slapped a relationship from Roles to Users_Roles and saved it. I then jumped back into VS and took the next logical step in my mind - I tried to update the Entity model by right-clicking and choosing "Update Model from Database". It showed all three tables in its update list. After updating, the visual relationship didn't change. I tried to recompile to see if any changes were made, but received two errors differing only in line numbers:
Error 3 Error 3034: Problem in Mapping Fragments starting at lines 177, 192:
Two entities with possibly different keys are mapped to the
same row. Ensure these two mapping fragments map both ends
of the AssociationSet to the corresponding columns.
C:*\Model1.edmx 178 15 MVCTestApp
Tried to debug the error to no avail. Eventually I got frustrated, deleted the model and rebuilt it. The relationships were recognized, the designer was updated properly, life was good.
This isn't the first time I've wanted to take the designer outside and bury it, either: a week or so earlier I had the nerve to delete a table, and learned very quickly that deleting a table in the designer only deletes it from the designer; its mappings stay.
So, a few questions:
1) is this behavior limited to VS 2008? I'll have 2010 shortly, and am hoping that the designer functions as expected,
2) are there other tools that can replace the built-in designer and actually work,
3) is there a way to make "Update Model from Database" actually update - perhaps some trick I'm not aware of besides deleting the entire model, thus losing all the other relationships I've set up manually?

Broken? It's more correct to say that it only recognizes certain DB schema patterns.
The EF supports other patterns, but you need to write the EDMX yourself. If you want the GUI designer to get everything right for you, you need to follow the patterns it expects.
If Users_Roles has only two columns, the UserId and the RoleId and if those are both FKs to their respective tables, and if the two columns together are the PK of the table, then mapping the table will come out right with no work on your part.
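Concretely, the pure join-table shape the designer recognizes looks something like this (table and column names match the question; the RoleID type and the constraint names are illustrative assumptions):

```sql
-- A pure join table: exactly two columns, each an FK to its parent
-- table, and the two together forming the composite primary key.
CREATE TABLE Users_Roles (
    UserID uniqueidentifier NOT NULL,
    RoleID int NOT NULL,
    CONSTRAINT PK_Users_Roles PRIMARY KEY (UserID, RoleID),
    CONSTRAINT FK_Users_Roles_Users FOREIGN KEY (UserID)
        REFERENCES Users (UserID),
    CONSTRAINT FK_Users_Roles_Roles FOREIGN KEY (RoleID)
        REFERENCES Roles (RoleID)
);
```

With this shape the designer collapses Users_Roles into a many-to-many association between Users and Roles; any extra column on the join table forces it to be mapped as a separate entity instead.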
Otherwise, you have to study the EDMX format and get the mapping right yourself. As you've discovered, that's trickier.

ADO.NET Entity Data Model refresh after incremental Db changes

I am using VS 2013 Express for Web with ADO.NET Entity Data Model.
When updating the entity data model from the database using the 'Refresh' tab option (it seems you can only select one item, though the heading says select objects, plural), the usage seems unclear and I have noticed some issues.
Just two examples:
I changed a stored procedure so it returned the same number of fields, but one field was of a slightly different type; the complex type, however, never changed. I realise there can be an impact on client code, but refresh simply did not change the complex type - everything stayed the same. However, removing the relevant elements from the model browser and then re-adding the elements from the database back into the model did exactly what I expected.
I made some significant changes to two or three tables, attributes and one relationship, but did not change the table names. Here again refresh had some very odd results, so I simply created a fresh model.
I am planning some more changes; the first change, specifically, is adding a FK relationship that I forgot.
Is there any way to be sure of what is supported and what is not in terms of refresh?
Also, I am concerned that if refresh fails and I therefore delete the two tables with the relationship, what impact will that have on the temporarily orphaned tables and their relationships, and whether, when I regenerate the two tables, their connections with the other tables will still work. I guess it depends how the generated code works underneath.
I want to make these kinds of changes but avoid having to recreate the entire model.
Any advice appreciated.
The most guaranteed way of ensuring you always have the latest version is to select all (Ctrl+A), delete, and then re-add everything from the model page.
I know it sounds like a pain but it's guaranteed to work as long as you haven't made any changes to the model from within Visual Studio.
The refresh doesn't always work.

View using same type as Table

I have a table that is used throughout an app by Entity Framework. I have a view that returns an identical column set, but is actually a union on itself to work around some bad normalization (the app is large and partially out of my hands; this part is unavoidable).
Is it possible to have Entity Framework 4 treat a view that is exactly like a table as the same type, so that I can use this view to populate a collection of that type? This question seems to indicate it is possible in NHibernate, but I can't find anything like it for Entity Framework. It would be an extra bonus if the navigation properties could still be used with Include(), but this is not necessary (I can always join manually).
Since EF works on mappings from objects to database entities this is not directly possible. What you need is something like changing the queried database entity dynamically, and AFAIK this is not possible without manually changing the object context.
For sure the EF runtime won't care as long as it can treat the view as if it were a completely separate table. The two possible challenges that I foresee are:
Tooling: our wizard does allow you to select views when doing reverse engineering (i.e. database-first). Definitely, if you can use 'Code First against an existing database' you can just pretend that the view is a table, but you won't get any help scripting the database creation or migrations.
Updates: in general you can perform updates for a view by setting up stored procedure mapping (which is available in the EF Designer from v1, or in Code First starting in EF6). You might also be able to make your view updatable directly or by using INSTEAD OF triggers (see "Updatable Views" here for more details). If I remember correctly, the SQL generated by EF to retrieve database-generated values (e.g. for identity columns) is not compatible in some cases with INSTEAD OF triggers. Yet another alternative is to have your application treat the view as read-only and perform all updates through the actual table, which you would map as a separate entity. Keep in mind that in-memory entities for the view and the original table will not be kept in sync.
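As a rough sketch of the trigger option, a self-union view with an INSTEAD OF trigger might look like the following (the view, table, and column names are all hypothetical, not taken from the question; the trigger simply redirects inserts to the base table and is only a starting point):

```sql
-- Hypothetical view that unions a denormalized table with itself,
-- similar in spirit to the view described in the question.
CREATE VIEW dbo.AllContacts AS
    SELECT ContactId, PrimaryEmail AS Email FROM dbo.Contacts
    UNION ALL
    SELECT ContactId, SecondaryEmail FROM dbo.Contacts
    WHERE SecondaryEmail IS NOT NULL;
GO
-- An INSTEAD OF trigger can make such a view insertable by
-- redirecting writes to the underlying base table.
CREATE TRIGGER dbo.AllContacts_Insert ON dbo.AllContacts
INSTEAD OF INSERT AS
BEGIN
    INSERT INTO dbo.Contacts (ContactId, PrimaryEmail)
    SELECT ContactId, Email FROM inserted;
END;
```

A union view like this is not updatable on its own, which is why the INSTEAD OF trigger (or stored procedure mapping, or a read-only mapping) is needed before EF can write through it.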
Hope this helps!

Creating snapshot of application data - best practice

We have a text processing application developed in C# using .NET FW 4.0 where the Administrator can define various settings. All this 'settings' data reside in about 50 tables with foreign key relations and Identity primary keys (this one will make it tricky, I think). The entire database is no more than 100K records, with the average table having about 6 short columns. The system is based on MS SQL 2008 R2 Express database.
We face a requirement to create a snapshot of all this data so that the administrator of the system can roll back to one of the snapshots anytime he screws something up. We need to keep only the last 5 snapshots. Creation of a snapshot must be initiated from the application GUI, and so must the rollback to any of the snapshots if needed (using SSMS will not be allowed, as direct access to the DB is denied). The system is still in development (are we ever really finished?), which means that new tables and columns are added all the time. Thus we need a robust method that can take care of changes automatically (digging through code after inserting/changing columns is something we want to avoid unless there's no other way). The ideal would be to say "I want to create a snapshot of all tables whose names begin with 'Admin'". Obviously this is quite a DB-intensive task, but since it will be used in emergency situations only, I do not mind. I also do not mind if table locks happen, as nothing will try to use these tables while the creation of or rollback to a snapshot is in progress.
The problem can be divided into 2 parts:
creating the snapshot
rolling back to the snapshot
Regarding problem #1. we may have two options:
export the data into XML (file or database column)
duplicate the data inside SQL into the same or different tables (like creating the same table structure again with the same names as the original tables prefixed with "Backup").
Regarding problem #2. the biggest issue I see is how to re-import all data into foreign key related tables which use IDENTITY columns for PK generation. I need to delete all data from all affected tables then re-import everything while temporarily relaxing FK constraints and switching off Identity generation. Once data is loaded I should check if FK constraints are still OK.
Or perhaps I should find a logical way to load tables so that constraint checking can remain in place while loading (as we do not have an unmanageable number of tables this could be a viable solution). Of course I need to do all deletion and re-loading in a single transaction, for obvious reasons.
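For the creation side, the "all tables beginning with 'Admin'" idea can be sketched with dynamic SQL driven by the catalog views, so new tables are picked up automatically (the Backup1_ prefix and the assumption that the target tables do not yet exist are mine, purely for illustration):

```sql
-- Sketch: copy every table whose name starts with 'Admin' into a
-- Backup1_* twin using SELECT INTO. SELECT INTO creates the target
-- table and preserves column types and the IDENTITY property, but
-- not constraints or indexes.
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql
    + N'SELECT * INTO Backup1_' + t.name
    + N' FROM ' + QUOTENAME(t.name) + N';'
FROM sys.tables AS t
WHERE t.name LIKE N'Admin%';
BEGIN TRANSACTION;
EXEC sys.sp_executesql @sql;
COMMIT;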
I suspect there may be no pure SQL-based solution for this, although SQL CLR might be of help to avoid moving data out of SQL Server.
Is there anyone out there who faces the same problem we do? Maybe someone who has successfully solved such a problem?
I do not expect a step by step instruction. Any help on where to start, which routes to take (export to RAW XML or keep snapshot inside the DB or both), pros/cons would be really helpful.
Thank you for your help and your time.
Daniel
We don't have this exact problem, but we have a very similar problem in which we provide our customers with a baseline set of configuration data (fairly complex, mostly identity PKs) that needs to be updated when we provide a new release.
Our mechanism is probably overkill for your situation, but I am sure there is a subset of it that is applicable.
The basic approach is this:
First, we execute a script that drops all of the FK constraints and changes the nullability of those FK columns that are currently NOT NULL to NULL. This script also drops all triggers to ensure that any logical constraints implemented in them will not be executed.
Next, we perform the data import, setting IDENTITY_INSERT ON before loading a table, then setting it back OFF after the data in the table is loaded.
Next, we execute a script that checks the data integrity of the newly added items with respect to the foreign keys. In our case, we know that items that do not have a corresponding parent record can safely be deleted, but you may choose to take a different approach (report the error and let someone manually handle the issue).
Finally, once we have verified the data, we execute another script that restores the nullability, adds the FKs back, and reinstalls the triggers.
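A lighter-weight variant of the steps above, disabling constraints with NOCHECK rather than actually dropping them, might look like this for one table (all object and column names here are hypothetical; repeat per table inside a single transaction):

```sql
-- Relax FK checking on this table instead of dropping the constraints.
ALTER TABLE dbo.AdminSettings NOCHECK CONSTRAINT ALL;
DELETE FROM dbo.AdminSettings;
-- IDENTITY_INSERT ON lets us re-insert the original PK values.
SET IDENTITY_INSERT dbo.AdminSettings ON;
INSERT INTO dbo.AdminSettings (SettingId, Name, Value)
    SELECT SettingId, Name, Value FROM dbo.Backup1_AdminSettings;
SET IDENTITY_INSERT dbo.AdminSettings OFF;
-- Re-enable the constraints AND re-validate existing rows, so the
-- constraints stay trusted by the optimizer.
ALTER TABLE dbo.AdminSettings WITH CHECK CHECK CONSTRAINT ALL;
```

Note that IDENTITY_INSERT can be ON for only one table at a time per session, and that the explicit column list in the INSERT is required when it is ON.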
If you have the budget for it, I would strongly recommend that you take a look at the tools that Red Gate provides, specifically SQL Packager and SQL Data Compare (I suspect there may be other tools out there as well, we just don't have any experience with them). These tools have been critical in the successful implementation of our strategy.
Update
We provide the baseline configuration through an SQL Script that is generated by RedGate's SQL Packager.
Because our end-users can modify the database between updates, which will cause the identity values in their database to differ from ours, we actually store the baseline primary and foreign keys in separate fields within each record.
When we update the customer database and we need to link new records to known configuration information, we can use the baseline fields to find out what the database-specific FKs should be.
In other words, there is always a known set of field IDs for well-known configuration records, regardless of what other data is modified in the database, and we can use this to link records together.
For example, if I have Table1 linked to Table2, Table1 will have a baseline PK and Table2 will have a baseline PK and a baseline FKey containing Table1's baseline PK. When we update records, if we add a new Table2 record, all we have to do is find the Table1 record with the specified baseline PK, then update the actual FKey in Table2 with the actual PK in Table1.
A kind of versioning by date ranges is a common method for records in Enterprise applications. As an example we have a table for business entities (us) or companies (uk) and we keep the current official name in another table as follows:
CompanyID | Name         | ValidFrom  | ValidTo
----------|--------------|------------|-----------
12        | Business Ltd | 2000-01-01 | 2008-09-23
12        | Business Inc | 2008-09-23 | NULL
The NULL in the last record means that it is the current one. You may use the above logic and possibly add more columns to gain more control. This way there are no duplicates, you can keep the history to any level, and you can synchronize the current values across tables easily. Finally, the performance will be great.
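Queries against such a table are straightforward (the table name CompanyNames is assumed for the example):

```sql
-- Current official name per company: the open-ended row.
SELECT CompanyID, Name
FROM CompanyNames
WHERE ValidTo IS NULL;

-- Name as it was on a given date, using the date range.
SELECT CompanyID, Name
FROM CompanyNames
WHERE '2005-06-01' >= ValidFrom
  AND ('2005-06-01' < ValidTo OR ValidTo IS NULL);
```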

Large entity framework split into many EDMX. How to operate between those?

I'm working on a rather large database. There will be about 200-300 tables in total. First of all, has anybody had such a database in a single EDMX? In my experience the designer gets unusable when you have more than 50 or so entities.
In my case database splits itself rather nicely into subject areas like Membership, System, Assets, etc..
This is what I did with the EDMX files, and everything works more or less OK.
Here is an issue I bet everyone has: all my main entities reference the User table with fields like CreatedBy and UpdatedBy. And the User table lives in a separate EDMX.
For CUD operations I can deal with this fine, because I have the current user hanging around. But for retrieval I often need to display the ID of who created a record, and for that I need a join. But I can't join to a table in a different EDMX.
I see 2 solutions to this:
I cache the list of users (it can't be huge) and look up IDs on the client
I add the User table to all of my models and join every time I request data
?
I realize that for what I'm doing, #1 is probably OK. But I can see how I may need to join entities from different models in the future, and I wonder how you guys handle this?
An EDMX only defines a conceptual model of your database, so it is OK to add the User table to multiple EDMX files.
Option #2 seems alright.
You would need to have different names for the entities that repeat in multiple EDMX files to avoid conflicts - something like SystemUser and AssetUser, all actually pointing to the same User table.
Thanks.

Mapping composite foreign keys in a many-many relationship in Entity Framework

I have a Page table and a View table. There is a many-many relationship between these two via a PageView table. Unfortunately all of these tables need to have composite keys (for business reasons).
Page has a primary key of (PageCode, Version),
View has a primary key of (ViewCode, Version).
PageView obviously enough has PageCode, ViewCode, and Version.
The FK to Page is (PageCode, Version) and the FK to View is (ViewCode, Version)
Makes sense and works, but when I try to map this in Entity Framework I get:
Error 3021: Problem in mapping fragments...: Each of the following columns in table PageView is mapped to multiple conceptual side properties: PageView.Version is mapped to (PageView_Association.View.Version, PageView_Association.Page.Version)
So clearly enough, EF is complaining about the Version column being a common component of the two foreign keys.
Obviously I could create a PageVersion and ViewVersion column in the join table, but that kind of defeats the point of the constraint, i.e. the Page and View must have the same Version value.
Has anyone encountered this, and is there anything I can do get around it? Thanks!
I'm not aware of a solution in Entity Framework for this problem, but a workaround could be to add surrogate primary key columns to your tables and add unique constraints on the fields you wanted to act as a composite key. This way you ensure the uniqueness of your data, but still have a single primary key column. Pro/con arguments can be found under this topic: stackoverflow question
Cheers
After much reading and messing about, this is just a limitation of the EF designer and validator when working with many-many relationships.
I was going to write that you should use a surrogate key, but I don't think this will actually help you. The join table is enforcing a business rule based on the logical attributes of the entities - these same attributes would be stored in the join table even if Page and View were augmented with surrogate keys.
If you are executing on a server that supports constraints, you could separate the Version into PageVersion and ViewVersion and add a constraint that the two are equal, or use an INSERT/UPDATE trigger to enforce this.
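The constraint route could be sketched as follows (the DDL is illustrative - column types are assumed, and `View` is bracketed because it is a reserved word). With two physically distinct columns, each FK maps to its own column and the 3021 mapping error should no longer apply:

```sql
CREATE TABLE PageView (
    PageCode    int NOT NULL,
    PageVersion int NOT NULL,
    ViewCode    int NOT NULL,
    ViewVersion int NOT NULL,
    CONSTRAINT PK_PageView
        PRIMARY KEY (PageCode, PageVersion, ViewCode, ViewVersion),
    CONSTRAINT FK_PageView_Page FOREIGN KEY (PageCode, PageVersion)
        REFERENCES Page (PageCode, Version),
    CONSTRAINT FK_PageView_View FOREIGN KEY (ViewCode, ViewVersion)
        REFERENCES [View] (ViewCode, Version),
    -- Business rule from the question: the two versions must match.
    CONSTRAINT CK_PageView_SameVersion CHECK (PageVersion = ViewVersion)
);
```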
I may have simply misunderstood the intent, but I feel there is something not quite right with this design. I can't imagine how the versioning will work as pages and views are changed and new versions created. If changing a page means it gets a new version, then it will also cause new versions of all its views to be made, even for views that haven't changed in that version. Equivalently, if one view in a page changes, the view's version changes, which means the page's version must also change, and so must all other views in that page, since page and view versions must match. Does this seem right?
Consider using NHibernate? :) - or at least for anything more than simple joins in your DB. I'm working with EF4, and it doesn't seem mature enough for complex data graphs IMO at the moment. Hopefully it will get there though!
