EF Code First - SqlServerCE Trigger - c#

I use Entity Framework Code First for my application, and I need a trigger.
My application should support different database engines like
Sql Server
SqlServerCE
SqlExpress
Triggers and stored procs are not supported in SqlServerCE; what would you do to get this
functionality?
I could do something after "SaveChanges", perhaps; what is a good way?

Yes, you can do something inside SaveChanges (by overriding it), or after SaveChanges and then call SaveChanges again to persist the new data, but that is not exactly the same as a trigger. Simply put, if your requirement is to use triggers for some functionality, SqlServerCE is not the right choice for you. If you move the trigger logic into your application, you will need two versions anyway: one for full SQL Server using triggers and one for SQL CE without them.
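A minimal sketch of the SaveChanges-override approach, assuming a hypothetical Order entity and an AuditEntry table that stands in for whatever your trigger would write (both names are illustrative, not from the question):

```csharp
using System;
using System.Data;          // EntityState lives here in EF 5 and earlier
using System.Data.Entity;
using System.Linq;

// Hypothetical entities for the sketch.
public class Order
{
    public int Id { get; set; }
    public string Status { get; set; }
}

public class AuditEntry
{
    public int Id { get; set; }
    public string TableName { get; set; }
    public string Action { get; set; }
    public DateTime ChangedAt { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<AuditEntry> AuditEntries { get; set; }

    public override int SaveChanges()
    {
        // Emulate an AFTER INSERT/UPDATE trigger: inspect the change
        // tracker before persisting and queue audit rows into the same
        // save, so one SaveChanges call works on SQL Server and SQL CE.
        var changed = ChangeTracker.Entries<Order>()
            .Where(e => e.State == EntityState.Added ||
                        e.State == EntityState.Modified)
            .ToList();

        foreach (var entry in changed)
        {
            AuditEntries.Add(new AuditEntry
            {
                TableName = "Orders",
                Action = entry.State.ToString(),
                ChangedAt = DateTime.UtcNow
            });
        }

        // Entities and audit rows are persisted in a single save.
        return base.SaveChanges();
    }
}
```

Because everything happens in one SaveChanges call, you avoid the "save twice" problem mentioned above, though unlike a real trigger this only fires for changes made through this context.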

Code First, although it allows you to send raw queries and hence perform your own native database/server manipulations, is basically designed only for building a model and querying against that model.
Now as for your question: you can build 'stored procedure alternatives' directly by adding methods to your DbContext (or extension methods, for separation of concerns).
Triggers are a little more complex. You can override the SaveChanges method, where you can track each change either before it is persisted to the database or after.
Both problems can also be solved by introducing a repository. See: http://huyrua.wordpress.com/2011/04/13/entity-framework-4-poco-repository-and-specification-pattern-upgraded-to-ef-4-1 This allows you to run a script (the trigger logic) when adding, updating or removing a certain entity.
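A 'stored procedure alternative' of the kind described above might look like this. The context, entity and method names are all hypothetical; the point is only that the set-based logic lives in application code, so it runs identically on SQL CE and full SQL Server:

```csharp
using System.Data.Entity;
using System.Linq;

// Hypothetical context and entity for the sketch.
public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public bool IsShipped { get; set; }
    public bool IsArchived { get; set; }
}

// The 'stored procedure alternative': a batch operation exposed as an
// extension method on the context instead of a proc in the database.
public static class ShopContextExtensions
{
    public static int ArchiveShippedOrders(this ShopContext context)
    {
        // Flag every shipped, not-yet-archived order in one unit of work.
        foreach (var order in context.Orders
                                     .Where(o => o.IsShipped && !o.IsArchived))
        {
            order.IsArchived = true;
        }
        return context.SaveChanges();
    }
}
```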

Related

How to abstract DBMS requests in a .NET program, while being able to alter the database at runtime

I am posting this thread because I didn't find any easy way to abstract my DB requests the way LINQ does, while allowing my program to alter the database dynamically, creating tables or fields.
I am using .NET Framework 4.0 and SQL Server 2012, on Windows.
I have seen a lot of topics on ORMs such as Entity Framework that allow running migrations on the DB at runtime; only, the migrations can't be generated at runtime.
For now, my project creates tables at runtime by executing hard-coded SQL Server scripts.
Only, I don't want to target SQL Server specifically; I want some generic language that generates a script for the right DBMS based on my C# code.
Example :
I want to alter my data design at runtime because my program is actually running on several machines that have their own database.
When I update the program, I would like it to create the new tables used by the new functionality.
Let's say I am adding... a multiple-choice quiz (QCM) for the user.
I have a WinForm that lets the user see the questions and answer them.
Now, to keep track of the answers, I want my program to create a new table using LINQ and then fill it with the answers.
If I understand correctly, when using Entity Framework with the code-first approach:
I would have to use the Package Manager Console to add migrations on every machine, before running:
var migrator = new DbMigrator(configuration);
migrator.Update();
Or is there a way to send the machine a migration file that will then be applied with migrator.Update()?
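Expanding the two lines above into a runnable sketch: migrations are generated with Add-Migration at development time and compiled into the assembly, and DbMigrator then applies them at startup on each machine. The Configuration class shown is normally generated for you; it is written out here only so the sketch is self-contained:

```csharp
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Placeholder context; in a real project this is your application's DbContext.
public class AppContext : DbContext { }

// Normally generated by Enable-Migrations; reproduced here for completeness.
public class Configuration : DbMigrationsConfiguration<AppContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
    }
}

public static class DatabaseBootstrapper
{
    public static void UpgradeDatabase()
    {
        // Applies any migrations compiled into the assembly that are not
        // yet recorded in the __MigrationHistory table. The migrations
        // themselves still have to be generated at development time.
        var migrator = new DbMigrator(new Configuration());
        migrator.Update();
    }
}
```

Calling DatabaseBootstrapper.UpgradeDatabase() at application startup means you only deploy the new binaries; each machine upgrades its own database on first run.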
It sounds like you want to do something that is not a good fit for EF (or virtually any ORM, for that matter). ORMs are generally designed to map a static data model to a static object model.
While it's not impossible to do what you want, it would require a very deep understanding of EF and its implementation details, to use the lower-level functions as well as Dynamic LINQ. Then you'd have to map this to dynamic C# objects that can change at runtime.
Frankly, you're probably better off writing a custom data layer yourself using standard SQLCommand statements.
As far as migrations go, this is generally a development/deployment tool, not a runtime tool.
The SqlCommand class in .NET gives you direct access to the database, including the execution of arbitrary scripts. This lets you send any valid instruction script to the database, including schema definition scripts, so you can create, define, back up, restore and destroy databases.
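A sketch of that approach applied to the quiz example from the question. The connection string, table and column names are placeholders:

```csharp
using System.Data.SqlClient;

public static class SchemaRunner
{
    // Executes an arbitrary DDL script against the database.
    public static void CreateAnswersTable(string connectionString)
    {
        // Idempotent: only creates the table if it does not exist yet.
        const string ddl = @"
IF OBJECT_ID('dbo.QcmAnswers') IS NULL
CREATE TABLE dbo.QcmAnswers (
    Id         INT IDENTITY PRIMARY KEY,
    QuestionId INT NOT NULL,
    AnswerText NVARCHAR(400) NOT NULL
);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(ddl, connection))
        {
            connection.Open();
            // DDL statements report no affected rows.
            command.ExecuteNonQuery();
        }
    }
}
```

The trade-off, as noted above, is that the script is SQL Server specific; supporting another DBMS means writing (or generating) an equivalent script for it.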

c# update single db field or whole object?

This might seem like an odd question, but it's been bugging me for a while now. Given that I'm not a hugely experienced programmer, and I'm the sole application/C# developer in the company, I felt the need to sanity-check this with you guys.
We have created an application that handles shipping information internally within our company; this application works with a central DB at our IT office.
We've recently switched the DB from MySQL to MSSQL, and during the transition we decided to forgo the web services previously used and connect directly to the DB using an application role. For added security we only allow access to stored procedures, and all CRUD operations are handled via these.
However, we currently have stored procedures for updating every single field in one of our objects, which is quite a few stored procedures, and as such quite a bit of work on the client for the DataRepository (needing separate code to call each procedure and pass the right params).
So I'm wondering: would it be better to simply update the entire object (in this case an object represents a table, for example shipments), given that a lot of that data is changed one field at a time after the initial insert, and that we are trying to keep network usage down, as some of the clients run with limited internet?
What's the standard practice for this kind of thing? Or is there a method that I've overlooked?
I would say that updating all the columns for the entire row is a much more common practice.
If you have a proc for each field, and you change multiple fields in one update, you will have to wrap all the stored procedure calls into a single transaction to avoid the database getting into an inconsistent state. You also have to detect which field changed (which means you need to compare the old row to the new row).
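For comparison, a single whole-row update through one stored procedure looks like this (the proc name, parameters and connection string are hypothetical). One call, one atomic statement, no multi-proc transaction to coordinate:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ShipmentRepository
{
    // Updates the whole row via one stored procedure call, replacing
    // one call per column and the transaction wrapping that requires.
    public static void UpdateShipment(string connectionString,
                                      int id, string status, decimal weight)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.Shipment_Update", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Id", id);
            command.Parameters.AddWithValue("@Status", status);
            command.Parameters.AddWithValue("@Weight", weight);

            connection.Open();
            // Single round trip; the statement is atomic on its own.
            command.ExecuteNonQuery();
        }
    }
}
```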
Look into using an Object-Relational Mapper (ORM) like Entity Framework for these kinds of operations. There is no general consensus on whether ORMs are a great solution for all data access needs, but it's hard to deny that they solve the CRUD problem quite comprehensively.
Connecting directly to the DB over the internet isn't something I'd switch to in a hurry.
"we decided to forgo the webservices previously used and connect directly to the DB"
What made you decide this?
If you are intent on this model, then a single SPROC to update an entire row would be advantageous over one per column. I have a similar application which uses SPROCs in this way, however the data from the client comes in via XML, then a middleware application on our server end deals with updating the DB.
The standard practice is not to connect to DB over the internet.
Even for small app, this should be the overall model:
Client app -> over internet -> server-side app (WCF web service) -> LAN/localhost -> SQL DB
Benefits:
your client app would not even know that you have switched DB implementations.
It would not know anything about DB security, etc.
you, as a programmer, would not be thinking in terms of "rows" and "columns" on client side. Those would be objects and fields.
you would be able to use different protocols: send only single field updates between client app and server app, but update entire rows between server app and DB.
Now, given your situation, updating entire row (the entire object) is definitely more of a standard practice than updating a single column.
It's better to only update what you change if you know what changed (if you're using an ORM like Entity Framework, for example), but if you're going down the stored-proc route then yes, definitely update everything in a row at once; that's granular enough.
Since you're already in the middle of a big change, you should take the switch as an opportunity to move over to LINQ to Entities and ditch stored procedures in the process wherever possible.

Entity Framework and ADO.NET with Unit of Work pattern

We have a system built using Entity Framework 5 for creating, editing and deleting data, but the problem we have is that sometimes EF is too slow, or it simply isn't possible to use Entity Framework (views which build data for tables based on users participating in certain groups in the database, etc.), and we have to use a stored procedure to update the data.
However, we have gotten ourselves into a situation where we have to save the changes through EF in order to have the data in the database and then call the stored procedures; we can't use TransactionScope as it always escalates to a distributed transaction and/or locks the table(s) for selects during the transaction.
We are also trying to introduce a DomainEvents pattern which will queue events and raise them after the save changes so we have the data we need in the DB but then we may end up with the first part succeeding and the second part failing.
Are there any good ways to handle this or do we need to move away from EF entirely for this scenario?
I had a similar scenario. Later I broke the process into smaller ones, used EF only, and kept each small process short. Even though the overall time is longer, the system is easier to maintain and scale. I also minimized joins, updated only the entity itself, and disabled EF's AutoDetectChangesEnabled and ValidateOnSaveEnabled.
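The two EF settings mentioned are per-context switches; a sketch, assuming a hypothetical EF 5 context named ReportContext:

```csharp
using System.Data.Entity;

// Hypothetical context tuned for bulk work.
public class ReportContext : DbContext
{
    public ReportContext()
    {
        // Skip the full change-detection scan on every Add/Attach/query;
        // call ChangeTracker.DetectChanges() manually before SaveChanges
        // if entities were modified after being attached.
        Configuration.AutoDetectChangesEnabled = false;

        // Skip entity validation during SaveChanges.
        Configuration.ValidateOnSaveEnabled = false;
    }
}
```

Both switches trade safety checks for speed, which is why they make most sense in short, narrowly scoped processes like the ones described above.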
Sometimes if you look your problem in different ways, you may have better solution.
Good luck!

How can I run SQL upon Database creation with EF 4.1 CodeFirst?

I want to utilize Elmah in my MVC application to store error messages, and I want to store the exceptions in my application's database. To do that I need to run the included DDL to create the Elmah tables and stored procs.
However, since my development database is recreated whenever my model changes (Via EF CodeFirst) I need the DDL to be run any time the database is recreated.
How would I go about doing this? The only place I could think to put this would be the overridden Seed() method in my DbInitializer, but that doesn't seem completely appropriate, since I am not seeding Elmah data; I am creating the schema structure itself.
What is the most appropriate way to apply the DDL upon database recreation?
Using the Seed method is the usual way to run custom SQL after the database is created. Its main purpose is to insert some initial data, but developers also use it for creating indexes, constraints, etc., so you can put anything you need there.
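A sketch of that approach for the Elmah case. The context and script path are placeholders, and note one caveat: the real Elmah DDL script contains GO batch separators, which Database.ExecuteSqlCommand cannot process, so the script must be split into individual batches first:

```csharp
using System;
using System.Data.Entity;
using System.IO;

// Placeholder context; substitute your application's DbContext.
public class AppContext : DbContext { }

// Runs the Elmah DDL every time Code First recreates the database.
public class ElmahAwareInitializer : DropCreateDatabaseIfModelChanges<AppContext>
{
    protected override void Seed(AppContext context)
    {
        // Naive batch split on GO lines; adequate for the Elmah script
        // but not a general T-SQL parser.
        var script = File.ReadAllText("Elmah.SqlServer.sql");
        foreach (var batch in script.Split(new[] { "\r\nGO" },
                                           StringSplitOptions.RemoveEmptyEntries))
        {
            context.Database.ExecuteSqlCommand(batch);
        }

        base.Seed(context);
    }
}
```

Register it with Database.SetInitializer(new ElmahAwareInitializer()) at startup so the DDL runs on every recreation, not just the first.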

LLBLGen: How can I soft-delete an entry

I have inherited a project that uses LLBLGen Pro for the DB layer. The DB model requires that when an entry is deleted, a flag is set instead (DeletedDate is set to the current time). The last programmer ignored this requirement and used regular deletes throughout the entire application.
Is there a way to configure the code generator to do this automatically, or do I have to override each delete operation for the entities that require it?
I implemented this in SQL Server 2005 using INSTEAD OF triggers on delete for any soft-delete table. The triggers set the delete flag and perform clean-up. The beauty of this solution is that it correctly handles deletes issued by any system that accesses the database. INSTEAD OF is relatively new in SQL Server; I know there's an Oracle equivalent.
This solution also plays nicely with our O/R mapper -- I created views that filter out soft deleted records and mapped those. The views are also used for all reporting.
You could create custom task in LLBLGen that would override those for you when you are generating entities. Check out their template studio and template examples on the website.
It depends if you are using self-servicing or adapter. If SS you will need to modify the template so that it sets the flag for you rather than deleting the entity.
If adapter, you can inherit from DataAccessAdapter and override the delete methods to set the flag for you rather than deleting the entities.
It's generally a poor solution for performance though, as every query then needs to filter out "deleted" entities. And because the selectivity of the "deleted" column won't be very high (DeletedDate is null for all of your "undeleted" records, which I'm guessing will be the majority of them), indexing it doesn't gain you a great deal; you will end up with a lot of table scans.
