LLBLGen: How can I soft-delete an entry - c#

I have inherited a project that uses LLBLGen Pro for the DB layer. The DB model requires that when an entry is deleted, a flag (DeletedDate) is set to the current time. The last programmer ignored this requirement and used regular deletes throughout the entire application.
Is there a way to set the code generator to do this automatically, or do I have to override the delete operation for each entity that requires it?

I implemented this in SQL Server 2005 using INSTEAD OF DELETE triggers on every soft-delete table. The triggers set the delete flag and perform clean-up. The beauty of this solution is that it correctly handles deletes issued by any system that accesses the database. INSTEAD OF triggers are relatively new in SQL Server; I know there's an Oracle equivalent.
This solution also plays nicely with our O/R mapper -- I created views that filter out soft-deleted records and mapped those. The views are also used for all reporting.
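For reference, a minimal sketch of the pattern; the table, trigger, and view names here are hypothetical, not from the original setup:

    using System.Data.SqlClient;

    static class SoftDeleteDeployment
    {
        // Sketch: deploys an INSTEAD OF DELETE trigger plus a filtering view
        // for a hypothetical dbo.Orders(OrderId, DeletedDate, ...) table.
        public static void Deploy(string connectionString)
        {
            const string trigger = @"
                CREATE TRIGGER dbo.trg_Orders_SoftDelete ON dbo.Orders
                INSTEAD OF DELETE AS
                BEGIN
                    SET NOCOUNT ON;
                    UPDATE o SET DeletedDate = GETDATE()       -- set the flag
                    FROM dbo.Orders o
                    JOIN deleted d ON d.OrderId = o.OrderId;   -- only the targeted rows
                END;";
            const string view = @"
                CREATE VIEW dbo.Orders_Active AS
                SELECT * FROM dbo.Orders WHERE DeletedDate IS NULL;";

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                new SqlCommand(trigger, conn).ExecuteNonQuery(); // intercepts all DELETEs
                new SqlCommand(view, conn).ExecuteNonQuery();    // map the O/R layer to this
            }
        }
    }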

You could create a custom task in LLBLGen that would override those for you when you generate entities. Check out their template studio and template examples on the website.

It depends on whether you are using self-servicing or adapter. If self-servicing, you will need to modify the template so that it sets the flag for you rather than deleting the entity.
If adapter, you can inherit from DataAccessAdapter and override the delete methods to set the flag for you rather than deleting the entities.
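A rough sketch of the adapter route; this assumes the two-argument DeleteEntity overload is virtual in your runtime library version and that the soft-delete entities expose a DeletedDate field, so adjust to whichever delete methods your version actually declares:

    using System;
    using SD.LLBLGen.Pro.ORMSupportClasses;

    // Sketch only - verify the overridable methods against your LLBLGen runtime.
    public class SoftDeleteAdapter : DataAccessAdapter
    {
        public override bool DeleteEntity(IEntity2 entityToDelete,
                                          IPredicateExpression deleteRestriction)
        {
            // Instead of issuing a DELETE, stamp the flag and persist as an UPDATE.
            entityToDelete.SetNewFieldValue("DeletedDate", DateTime.Now);
            return SaveEntity(entityToDelete);
        }
    }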
It's generally a crappy solution for performance though, as every query then needs to filter out "deleted" entities - and because the selectivity on the "deleted" column won't be very high (DeletedDate is NULL for all of your undeleted records - I'm guessing they'll be the majority), indexing it doesn't gain you a great deal - you will end up with a lot of table scans.

Related

Adding Entity Objects to LINQ-to-SQL Data Context at Runtime - SQL, C#(WPF)

I've hit a wall when it comes to adding a new entity object (a regular SQL table) to the Data Context using LINQ-to-SQL. This isn't regarding the drag-and-drop method that is cited regularly across many other threads; that method has worked repeatedly without issue.
The end goal is relatively simple. I need to find a way to add a table that gets created during runtime via stored procedure to the current Data Context of the LINQ-to-SQL dbml file. I'll then need to be able to use the regular LINQ query methods/extension methods (InsertOnSubmit(), DeleteOnSubmit(), Where(), Contains(), FirstOrDefault(), etc...) on this new table object through the existing Data Context. Essentially, I need to find a way to procedurally create the code that would otherwise be generated automatically when you use the drag-and-drop method during development (when the application isn't running), but have it generate this same code while the application is running, via command and/or event trigger.
More Detail
There's one table that gets used a lot and, over the course of an entire year, collects many thousands of rows. Each row contains a timestamp and this table needs to be divided into multiple tables based on the year that the row was added.
Current Solution (using one table)
Single table with tens of thousands of rows which are constantly queried against.
Table is added to Data Context during development using drag-and-drop, so there are no additional coding issues
Significant performance decrease over time
Goals (using multiple tables)
(Complete) While the application is running, use C# code to check if a table for the current year already exists. If it does, no action is taken. If not, a new table gets created using a stored procedure with the current year as a prefix on the table name (2017_TableName, 2018_TableName, 2019_TableName, and so on...).
(Incomplete) While the application is still running, add the newly created table to the active LINQ-to-SQL Data Context (the same code that would otherwise be added using drag-and-drop during development).
(Incomplete) Run regular LINQ queries against the newly added table.
Final Thoughts
Other than the above, my only other concern is how to write the C# code that references a table that may or may not already exist. Is it possible to use a variable in place of the standard 'DB_DataContext.2019_TableName' methodology in order to actually get the table's data into a UI control? Is there a way to simply create an Enumerable of all the tables where the name is prefixed with a year and then select the most current table?
From what I've read so far, the most likely solution seems to involve the use of a SQL add-on like SQLMetal or Huagati which (based solely on what I've read) will generate the code I need during runtime and update the corresponding dbml file. I have no experience using these types of add-ons, so any additional insight into these would be appreciated.
Lastly, I've seen some references to LINQ-to-Entities and/or LINQ-to-Objects. Would these be the components I'm looking for?
Thanks for reading through a rather lengthy first post. Any comments/criticisms are welcome.
The simplest way to achieve what you want is to redirect in SQL Server and leave your client code alone. At design time, create your L2S Data Context or EF DbContext referencing a database with only a single table. Then at run time, substitute a view or synonym for that table that points to the "current year" table.
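A minimal sketch of the run-time synonym swap (all object names are hypothetical):

    using System.Data.SqlClient;

    static class YearTableRedirect
    {
        // Sketch: repoints a synonym at the current year's table so the model,
        // built once against dbo.TableName, never has to change.
        public static void Repoint(string connectionString, int year)
        {
            string sql = $@"
                IF OBJECT_ID('dbo.TableName', 'SN') IS NOT NULL
                    DROP SYNONYM dbo.TableName;
                CREATE SYNONYM dbo.TableName FOR dbo.[{year}_TableName];";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                cmd.ExecuteNonQuery(); // client code keeps querying dbo.TableName
            }
        }
    }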
HOWEVER, this should not be necessary in the first place. SQL Server supports partitioning, so you can store all the data in physically separate data structures but have a single logical table. And SQL Server supports columnstore indexes, which can compress and store many millions of rows with excellent performance.
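For illustration, yearly range partitioning looks roughly like this; the names and boundary dates are hypothetical, and note that partitioning requires Enterprise Edition before SQL Server 2016 SP1:

    using System.Data.SqlClient;

    static class YearPartitioning
    {
        // Sketch: one partition per year on the single logical table.
        public static void Create(string connectionString)
        {
            const string ddl = @"
                CREATE PARTITION FUNCTION pfYear (datetime)
                    AS RANGE RIGHT FOR VALUES ('2017-01-01', '2018-01-01', '2019-01-01');
                CREATE PARTITION SCHEME psYear
                    AS PARTITION pfYear ALL TO ([PRIMARY]);
                -- then create (or rebuild) the table and its clustered index
                -- ON psYear(TimestampColumn)";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(ddl, conn))
            {
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }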

View using same type as Table

I have a table that is used throughout an app by Entity Framework. I have a view that returns an identical column set, but is actually a union on itself to work around some bad normalization (the app is large and partially out of my hands; this part is unavoidable).
Is it possible to have Entity Framework 4 treat a view that is exactly like a table as the same type, so that I can use this view to populate a collection of that type? This question seems to indicate it is possible in NHibernate, but I can't find anything like it for Entity Framework. It would be an extra bonus if the navigation properties could still be used with Include(), but this is not necessary (I can always join manually).
Since EF works on mappings from objects to database entities this is not directly possible. What you need is something like changing the queried database entity dynamically, and AFAIK this is not possible without manually changing the object context.
For sure the EF runtime won't care, as long as it can treat the view as if it were a completely separate table. The two possible challenges that I foresee are:
Tooling: our wizard does allow you to select views when doing reverse engineering (i.e. database-first). Definitely, if you can use 'code first against an existing database' you can just pretend that the view is a table (see the sketch below), but you won't get any help scripting the database creation or migrations.
Updates: in general you can perform updates for a view by setting up stored procedure mapping (which is available in the EF Designer from v1, or in Code First starting in EF6). You might also be able to make your view updatable directly or using INSTEAD OF triggers (see "Updatable Views" here for more details). If I remember correctly, the SQL generated by EF to retrieve database-generated values (e.g. for identity columns) is in some cases not compatible with INSTEAD OF triggers. Yet another alternative is to have your application treat the view as read-only and perform all updates through the actual table, which you would map as a separate entity. Keep in mind that in-memory entities for the view and the original table will not be kept in sync.
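For the code-first route from the first point, the mapping is a one-liner; a minimal sketch (the entity type and view name are hypothetical):

    using System.Data.Entity;

    // Hypothetical entity matching the view's column set.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // EF happily treats the union view as if it were a table for querying.
            modelBuilder.Entity<Customer>().ToTable("vw_CustomersUnion");
        }
    }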
Hope this helps!

Creating snapshot of application data - best practice

We have a text processing application developed in C# using .NET FW 4.0 where the administrator can define various settings. All this 'settings' data resides in about 50 tables with foreign key relations and IDENTITY primary keys (this one will make it tricky, I think). The entire database is no more than 100K records, with the average table having about 6 short columns. The system is based on an MS SQL 2008 R2 Express database.
We face a requirement to create a snapshot of all this data so that the administrator of the system can roll back to one of the snapshots anytime he screws something up. We need to keep the last 5 snapshots only. Creation of a snapshot must be initiated from the application GUI, and so must the rollback to any of the snapshots if needed (using SSMS will not be allowed, as direct access to the DB is denied). The system is still in development (are we ever really finished?), which means that new tables and columns are added many times. Thus we need a robust method that can take care of changes automatically (digging through code after inserting/changing columns is something we want to avoid unless there's no other way). The best way would be to be able to say "I want to create a snapshot of all tables whose names begin with 'Admin'". Obviously, this is quite a DB-intensive task, but since it will be used in emergency situations only, that is something I do not mind. I also do not mind if table locks happen, as nothing will try to use these tables while the creation or rollback of a snapshot is in progress.
The problem can be divided into 2 parts:
creating the snapshot
rolling back to the snapshot
Regarding problem #1, we may have two options:
export the data into XML (file or database column)
duplicate the data inside SQL into the same or different tables (like creating the same table structure again with the same names as the original tables prefixed with "Backup").
Regarding problem #2, the biggest issue I see is how to re-import all data into foreign-key-related tables which use IDENTITY columns for PK generation. I need to delete all data from all affected tables, then re-import everything while temporarily relaxing FK constraints and switching off identity generation. Once the data is loaded, I should check whether the FK constraints are still OK.
Or perhaps I should find a logical order in which to load the tables so that constraint checking can remain in place while loading (as we do not have an unmanageable number of tables, this could be a viable solution). Of course, I need to do all deletion and re-loading in a single transaction, for obvious reasons.
I suspect there may be no pure SQL-based solution for this, although SQL CLR might be of help to avoid moving data out of SQL Server.
Is there anyone out there who has faced the same problem? Maybe someone who successfully solved such a problem?
I do not expect a step-by-step instruction. Any help on where to start, which routes to take (export to raw XML, keep the snapshot inside the DB, or both), and the pros/cons would be really helpful.
Thank you for your help and your time.
Daniel
We don't have this exact problem, but we have a very similar problem in which we provide our customers with a baseline set of configuration data (fairly complex, mostly identity PKs) that needs to be updated when we provide a new release.
Our mechanism is probably overkill for your situation, but I am sure there is a subset of it that is applicable.
The basic approach is this:
First, we execute a script that drops all of the FK constraints and changes the nullability of those FK columns that are currently NOT NULL to NULL. This script also drops all triggers to ensure that any logical constraints implemented in them will not be executed.
Next, we perform the data import, setting IDENTITY_INSERT ON before loading a table, then setting it back OFF after the data in the table is loaded.
Next, we execute a script that checks the data integrity of the newly added items with respect to the foreign keys. In our case, we know that items that do not have a corresponding parent record can safely be deleted, but you may choose to take a different approach (report the error and let someone manually handle the issue).
Finally, once we have verified the data, we execute another script that restores the nullability, adds the FKs back, and reinstalls the triggers.
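A condensed sketch of the per-table part of step 2; it assumes the constraint- and trigger-dropping scripts from steps 1 and 4 run around it and that the whole run is wrapped in one transaction, and the table, column, and schema names are hypothetical:

    using System.Data.SqlClient;

    static class SnapshotLoader
    {
        // Sketch: wipe a table and reload it from its snapshot copy, preserving
        // the original IDENTITY values.
        public static void ReloadTable(SqlConnection conn, SqlTransaction tx, string table)
        {
            Exec(conn, tx, $"DELETE FROM dbo.[{table}];");
            Exec(conn, tx, $"SET IDENTITY_INSERT dbo.[{table}] ON;");     // keep original PKs
            Exec(conn, tx, $@"INSERT INTO dbo.[{table}] (Id, Name)
                              SELECT Id, Name FROM snapshot.[{table}];"); // snapshot schema
            Exec(conn, tx, $"SET IDENTITY_INSERT dbo.[{table}] OFF;");
        }

        static void Exec(SqlConnection conn, SqlTransaction tx, string sql)
        {
            using (var cmd = new SqlCommand(sql, conn, tx))
                cmd.ExecuteNonQuery();
        }
    }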
If you have the budget for it, I would strongly recommend that you take a look at the tools that Red Gate provides, specifically SQL Packager and SQL Data Compare (I suspect there are other tools out there as well; we just don't have any experience with them). These tools have been critical to the successful implementation of our strategy.
Update
We provide the baseline configuration through an SQL script generated by Red Gate's SQL Packager.
Because our end-users can modify the database between updates, which will cause the identity values in their database to differ from ours, we actually store the baseline primary and foreign keys in separate fields within each record.
When we update the customer database and we need to link new records to known configuration information, we can use the baseline fields to find out what the database-specific FKs should be.
In other words, there is always a known set of field IDs for well-known configuration records, regardless of what other data is modified in the database, and we can use this to link records together.
For example, if I have Table1 linked to Table2, Table1 will have a baseline PK and Table2 will have a baseline PK and a baseline FKey containing Table1's baseline PK. When we update records, if we add a new Table2 record, all we have to do is find the Table1 record with the specified baseline PK, then update the actual FKey in Table2 with the actual PK in Table1.
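In code, the re-linking step amounts to a lookup on the baseline key. A small in-memory sketch of the idea (all types and names are hypothetical):

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical shapes: each record carries its actual keys plus baseline keys.
    class Table1Row { public int Id; public int BaselinePK; }
    class Table2Row { public int Id; public int Table1Id; public int BaselineFKey; }

    static class BaselineLinker
    {
        // Translate the stable baseline FK carried by an incoming row into this
        // database's actual FK before inserting it.
        public static void Link(List<Table1Row> table1, Table2Row newRow)
        {
            var parent = table1.Single(t => t.BaselinePK == newRow.BaselineFKey);
            newRow.Table1Id = parent.Id;
        }
    }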
A kind of versioning by date ranges is a common method for records in enterprise applications. As an example, we have a table for business entities (US) or companies (UK), and we keep the current official name in another table as follows:
CompanyID  Name          ValidFrom   ValidTo
12         Business Ltd  2000-01-01  2008-09-23
12         Business Inc  2008-09-23  NULL
The NULL in the last record means that it is the current one. You may use the above logic and possibly add more columns to gain more control. This way there are no duplicates, you can keep the history to any depth, and you can synchronize the current values across tables easily. Finally, performance will be great.
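Reading the current value is then just a filter on the open-ended row; a small sketch (types and names are hypothetical):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class CompanyName
    {
        public int CompanyID;
        public string Name;
        public DateTime ValidFrom;
        public DateTime? ValidTo;   // NULL marks the current record
    }

    static class CurrentNameLookup
    {
        public static string For(IEnumerable<CompanyName> names, int companyId) =>
            names.Single(n => n.CompanyID == companyId && n.ValidTo == null).Name;
    }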

EF Code First - SqlServerCE Trigger

I use Entity Framework Code First for my application and I need a trigger.
My Application should support different database engines like
Sql Server
SqlServerCE
SqlExpress
Triggers and stored procs are not supported in SqlServerCE. What would you do to get this functionality?
I could do something after "SaveChanges", or something similar - what is a good way?
Yes, you can do something inside SaveChanges (by overriding it) or after SaveChanges, and call SaveChanges again to persist the new data, but it is not exactly the same as a trigger. Simply put, if your requirement is to use triggers for some functionality, SqlServerCE is not the choice for you. If you move the trigger logic into your application you will need two versions anyway - one for the big SQL Server using triggers and one for SQL CE not using triggers.
Code First, although it allows you to send in some raw queries and hence perform your own native database/server manipulations, is basically designed only for building a model and querying against that model.
Now as for your question: you can build 'stored procedure alternatives' directly by adding methods to your DbContext (or extension methods, for separation of concerns).
Triggers are a little more complex. You can override the SaveChanges method, in which you can track each change either before it is persisted to the database or after.
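A minimal sketch of that override (EF 4.1+ DbContext API, EF6-style namespaces; the loop body is a placeholder for whatever the trigger would have done):

    using System.Data.Entity;
    using System.Linq;

    public class AppDbContext : DbContext
    {
        public override int SaveChanges()
        {
            // Inspect pending changes before they reach the database - roughly
            // where the trigger's logic would live.
            foreach (var entry in ChangeTracker.Entries()
                                               .Where(e => e.State == EntityState.Added))
            {
                // ... apply the logic the trigger would have performed ...
            }
            return base.SaveChanges();
        }
    }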
Both problems, however, can also be solved by introducing a repository. See: http://huyrua.wordpress.com/2011/04/13/entity-framework-4-poco-repository-and-specification-pattern-upgraded-to-ef-4-1 This allows you to launch a script (trigger) when adding, updating or removing a certain entity.

Is EF or SQL the better choice to audit data changes?

The requirement seems simple: when data changes, audit the changes.
Here are some important pieces to the equation:
The data in my application spans multiple tables (some cross-reference tables).
My DTO is deep, with Navigation Properties conditionally populated.
When loaded, I copy the original DTO with its "original values".
When a save is requested, the original DTO contains the changes.
Ideally, foreign keys will read as useful text, not ID numbers.
Unlike TFS' cool history feature, mine seems more complicated because of the many related tables and conditional child entities.
I see three possibilities (so far):
I could use C# to reflect the objects and create a before/after record.
I could use triggers in SQL 2008R2 to catch changes and coalesce a before/after record.
I could store the raw before/after objects and let SQL 2008R2 parse them.
Please note: right now, it seems to me that SQL 2008 R2's CDC is far too heavy an option. I am really looking for something I can build, but I admit my mind is open to anything right now.
My question
Before I get started building this:
How does everybody else handle auditing a complex EF DTO?
Is there a low(ish)-tech solution available?
Thank you in advance.
Related, but not completely related, questions already on StackOverflow: Implementing Audit Log / Change History with MVC & Entity Framework, Create Data Audit in SQL Server, https://stackoverflow.com/questions/5773419/how-to-audit-many-to-many-relationship-in-entity-framework, Maintaining audit log for entities split across multiple tables, and Linq to SQL Audit Trail / Audit Log: should I use triggers or doddleaudit? None of them provide an answer.
IF audit is a real requirement, I would opt for the trigger solution... since the other methods have several "shortcomings":
"blind" to any changes happening through other means than your application
if you make some code changes and forget to add the audit code, the audit trail gets "blind spots"
The trigger-based solution can be secured so that only special users can even see the audited data...
I usually work with Oracle, but from my experience in such situations: allow the app only SELECT rights via views; any insert/delete/update should be done via stored procedures, and the audit trail should be handled via triggers...
I've recently implemented an audit log manager on top of Entity Framework. When I instantiate my audit manager, I reflect all of the entity classes and store the property information. Then, within the object context's SavingChanges event, I audit all of the changes. It works great. In the case of foreign keys, I just store their IDs before and after the change.
The nice thing about this solution is that it doesn't require any extra coding. Once you create a log manager of sorts, you don't have to worry about adding new triggers, or modifying triggers when new columns are added. Any changes to your entity classes will automatically be picked up when reflecting the classes.
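A condensed sketch of that approach (EF4-era ObjectContext API; Log() is a placeholder for your audit sink):

    using System.Data;           // EntityState
    using System.Data.Objects;   // ObjectContext (EF4/EF5 namespaces)

    static class AuditHook
    {
        public static void Attach(ObjectContext context)
        {
            context.SavingChanges += (sender, e) =>
            {
                foreach (var entry in context.ObjectStateManager
                             .GetObjectStateEntries(EntityState.Modified))
                {
                    foreach (string prop in entry.GetModifiedProperties())
                    {
                        Log(entry.Entity.GetType().Name, prop,
                            entry.OriginalValues[prop],  // value as loaded
                            entry.CurrentValues[prop]);  // value about to be saved
                    }
                }
            };
        }

        static void Log(string entity, string property, object before, object after)
        {
            // Placeholder: write to your audit table here.
        }
    }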
Well, let's see. SQL Server auditing already exists, comes with tools, is probably already known by your DBAs, doesn't slow down your app, and can trace events that the application itself will never even see.
On the other hand, rolling your own in EF will allow you to audit non-SQL Server data sources. It also doesn't require Enterprise Edition.
Trigger Solution, Pros:
Cannot bypass the audit
Trigger Solution, Cons:
Cannot audit non SQL data
Cannot audit complex objects on insert
Entity Framework, Pros:
Can audit everything
Can audit complex objects in any state
Entity Framework, Cons:
Can be bypassed (like direct-to-SQL)
Requires a copy of original values
My choice is Entity Framework. Using self-tracking entities (STEs) makes it easier.
Either way you have to roll your own.
