I'm making an application that uses a legacy database, using EF6 database first (.NET, C#).
The database has two versions: an old one and a new one. In the new one some tables were modified and renamed, e.g. the old one has tables like work, order and item, while the new one has work_t, order_t and item_t.
The content of the corresponding tables is very similar; in the new ones some columns were added and some were removed. My application is supposed to work with both kinds of databases, since I only use the columns that are present in both versions.
I was wondering if there is a decent way to hide those table pairs behind some interface or similar, to avoid writing two implementations of the LINQ code.
This is not exactly creating one entity out of two tables, because only one of the tables is present in the database at a time. I want a single piece of code that addresses either one of the similar tables.
Here's some pseudo code for what I'm after:
public workDTO GetWork(int workId)
{
    MyEntities db = new MyEntities();

    // for the old version it would go like
    var work = db.work.Where(a => a.id == workId);

    // for the new version it would go like
    var work = db.work_t.Where(a => a.id == workId);

    return Mapper.Map(work, workDTO);
}
So the idea is to have just one method and one LINQ implementation for both tables.
Yes, you can do it by putting a Table attribute on the entity in Entity Framework.
Update:
You can also use the .ToTable() method:
modelBuilder.Entity<Department>().ToTable("t_Department");
Source: MSDN: http://msdn.microsoft.com/en-us/data/jj591617.aspx
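If the database version is known when the application starts, a single entity can be switched between the two physical tables in the fluent mapping. Below is a minimal, untested sketch that assumes you can use a code-first style mapping for the shared entity instead of the EDMX-generated one; the Work class, the Works DbSet and the useNewSchema flag are illustrative, not part of the original question:

using System.Data.Entity;
using System.Linq;

public class MyEntities : DbContext
{
    // How the schema version is detected is up to you (a config setting,
    // probing the schema, etc.); here it is simply passed in.
    private readonly bool _useNewSchema;

    public MyEntities(bool useNewSchema) : base("name=MyEntities")
    {
        _useNewSchema = useNewSchema;
    }

    public DbSet<Work> Works { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the single Work entity to whichever physical table exists.
        // EF6 caches the built model per context type, so the flag should
        // stay constant for the lifetime of the process.
        modelBuilder.Entity<Work>().ToTable(_useNewSchema ? "work_t" : "work");
    }
}

// Usage: one LINQ implementation for both database versions.
// var db = new MyEntities(useNewSchema: true);
// var work = db.Works.FirstOrDefault(a => a.Id == workId);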
Here is my problem...
I've got one entity class, and I want to keep historical values from this table. Every time a record is changed, I want to insert a snapshot of that record into another table (with the same fields). I could do this field by field, but I'm sure there must be a simpler way of doing this.
I've tried something like this, but it doesn't work:
var t1 = context.TABLE1.Find(id);
var t2 = new TABLE2();
context.Entry(t2).CurrentValues.SetValues(t1);
context.SaveChanges();
I've found this: How to "transfer" the data from one table to another with EF?, but it doesn't work for me, because my tables can't do what is suggested in that post:
t2.CurrentValues.SetValues(t1);
Any ideas?
You could use a package like AutoMapper to create a mapping between the Table1 and Table2 classes. They have good documentation on how to get that set up. You could also override the SaveChanges() method to handle the creation and saving of the historical record.
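A minimal sketch of that idea, assuming an AutoMapper version with the MapperConfiguration API and that TABLE1 and TABLE2 share property names (the configuration would normally live at application startup):

using AutoMapper;

// One-time AutoMapper configuration.
// If TABLE2 has its own identity key, ignore or reset it in the mapping.
var config = new MapperConfiguration(cfg => cfg.CreateMap<TABLE1, TABLE2>());
var mapper = config.CreateMapper();

// Load the current row and copy all matching properties into a new history row.
var t1 = context.TABLE1.Find(id);
var t2 = mapper.Map<TABLE2>(t1);

// The snapshot must be added to the context before saving; calling
// SetValues on an entity the context isn't tracking has no effect.
context.TABLE2.Add(t2);
context.SaveChanges();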
Just a bit of an outline of what I am trying to accomplish.
We keep a local copy of a remote (3rd party) database within our application. To download the information we use an API.
We currently download the information on a schedule, which then either inserts new records into the local database or updates the existing records.
Here is how it currently works:
public void ProcessApiData(List<Account> apiData)
{
    // get the existing accounts from the local database
    List<Account> existingAccounts = _accountRepository.GetAllList();

    foreach (var account in apiData)
    {
        // check if it already exists in the local database
        var existingAccount = existingAccounts.SingleOrDefault(a => a.AccountId == account.AccountId);

        // if it's null then it's a new record
        if (existingAccount == null)
        {
            _accountRepository.Insert(account);
            continue;
        }

        // otherwise it's an existing record, so it needs updating
        existingAccount.AccountName = account.AccountName;
        // ... continue updating the rest of the properties
    }

    CurrentUnitOfWork.SaveChanges();
}
This works fine, however it feels like it could be improved.
There is one of these methods per entity, and they all do the same thing (just updating different properties, or inserting a different entity). Would there be any way to make this more generic?
It also seems like a lot of database calls; would there be any way to "bulk" do this? I've had a look at this package, which I have seen mentioned in a few other posts: https://github.com/loresoft/EntityFramework.Extended
But it seems to focus on bulk updating a single property with the same value, as far as I can tell.
Any suggestions on how I can improve this would be brilliant. I'm still fairly new to C#, so I'm still searching for the best way to do things.
I'm using .NET 4.5.2 and Entity Framework 6.1.3, with MSSQL 2014 as the backend database.
For EFCore you can use this library:
https://github.com/borisdj/EFCore.BulkExtensions
Note: I'm the author of this one.
And for EF 6 this one:
https://github.com/TomaszMierzejowski/EntityFramework.BulkExtensions
Both extend DbContext with bulk operations and have the same call syntax:
context.BulkInsert(entitiesList);
context.BulkUpdate(entitiesList);
context.BulkDelete(entitiesList);
The EF Core version additionally has a BulkInsertOrUpdate method.
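Applied to the synchronisation scenario in the question, the whole insert-or-update loop could collapse into a single call. A rough sketch with the EF Core library (YourDbContext is a placeholder; with the EF 6 library you would use separate BulkInsert and BulkUpdate calls instead):

public void ProcessApiData(List<Account> apiData)
{
    using (var context = new YourDbContext())
    {
        // Inserts rows whose primary key doesn't exist yet and updates
        // the ones that do, in a single bulk merge operation.
        context.BulkInsertOrUpdate(apiData);
    }
}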
Assuming that the classes in apiData are the same as your entities, you should be able to use Attach(newAccount, originalAccount) to update an existing entity.
For bulk inserts I use AddRange(listOfNewEntities). If you have a lot of entities to insert it is advisable to batch them. Also you may want to dispose and recreate the DbContext on each batch so that it's not using too much memory.
var accounts = new List<Account>();
var context = new YourDbContext();
context.Configuration.AutoDetectChangesEnabled = false;

foreach (var account in apiData)
{
    accounts.Add(account);

    // Play with the batch size to see what works best
    if (accounts.Count % 1000 == 0)
    {
        context.Set<Account>().AddRange(accounts);
        accounts = new List<Account>();
        context.ChangeTracker.DetectChanges();
        context.SaveChanges();
        context.Dispose();

        // recreate the context so the change tracker doesn't grow too large
        context = new YourDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
    }
}

// save whatever is left in the final partial batch
context.Set<Account>().AddRange(accounts);
context.ChangeTracker.DetectChanges();
context.SaveChanges();
context.Dispose();
For bulk updates, there's nothing built in. There are however libraries and solutions to address this. See e.g. here for a solution using expression trees.
List vs. Dictionary
Every time you check whether an entity already exists you search through a list, which is slow. You should create a dictionary instead to improve performance.
var existingAccounts = _accountRepository.GetAllList().ToDictionary(x => x.AccountId);

Account existingAccount;
if (existingAccounts.TryGetValue(account.AccountId, out existingAccount))
{
    // ...code....
}
Add vs. AddRange
You should be aware of Add vs. AddRange performance when you add multiple records.
Add: DetectChanges is called after every record is added
AddRange: DetectChanges is called once, after all records are added
So at 10,000 entities, the Add method can take around 875x more time simply to add the entities to the context.
To fix it:
CREATE a list
ADD entity to the list
USE AddRange with the list
SaveChanges
Done!
In your case, you will need to add an InsertRange method to your repository, along the lines of the sketch below.
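A minimal sketch of such a method and its use, assuming the repository wraps a DbContext (the _context field and the exact repository shape are illustrative):

// Inside the repository; _context is assumed to be the underlying DbContext.
public void InsertRange(IEnumerable<Account> accounts)
{
    // AddRange defers change detection until the whole collection has been
    // added, instead of running DetectChanges once per entity like Add does.
    _context.Set<Account>().AddRange(accounts);
}

// In ProcessApiData: collect the new records first, then insert them in one go.
var newAccounts = new List<Account>();
foreach (var account in apiData)
{
    if (!existingAccounts.ContainsKey(account.AccountId))
        newAccounts.Add(account);
    // ...otherwise update the existing record as before...
}
_accountRepository.InsertRange(newAccounts);
CurrentUnitOfWork.SaveChanges();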
EF Extended
You are right. This library updates all data with the same value. That is not what you are looking for.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library may be a perfect fit for your project if you want to improve performance dramatically.
You can easily perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
Example:
public void ProcessApiData(List<Account> apiData)
{
    // Insert or Update using the primary key (AccountId)
    CurrentUnitOfWork.BulkMerge(apiData);
}
I have a class, suppose it's called EntityModel, and I want to create three different tables with the same columns, as defined in EntityModel. Let's call the tables tbPast, tbPresent and tbFuture. I also want to access them separately in the Entity Framework DbContext:
using (var db = new MyContext())
{
    var element = db.Past.Find(id);
    db.Past.Remove(element);
    db.Present.Add(element);
    db.SaveChanges();
}
The main purpose of having three tables is performance: the tables will hold millions of rows, but the most important one is Present, with only dozens of rows, and most queries will be made against the Present table.
What is the best way to do this? Implementing three models with the same properties doesn't seem right to me.
I'm using Entity Framework, with the Code First approach, along with ASP.NET MVC 3.
You can't use the same model to generate separate tables with EF code-first. If you need to have some sort of grouping, use a discriminator field and assign it one of the values Past, Present or Future, as in the sketch below.
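That could look roughly like this (the Period property name and the Entities DbSet are illustrative). Everything then lives in one table, so an index on the discriminator column helps keep the small "Present" queries fast:

public class EntityModel
{
    public int Id { get; set; }

    // Acts as the discriminator: "Past", "Present" or "Future".
    public string Period { get; set; }

    // ... the rest of the shared columns ...
}

// Moving a row from Present to Past is then just an update, not a delete/insert:
using (var db = new MyContext())
{
    var element = db.Entities.Find(id);   // db.Entities is a DbSet<EntityModel>
    element.Period = "Past";
    db.SaveChanges();
}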
Edit:
A similar effect can be achieved through table-per-concrete-type inheritance. That way each type gets its own table and can share most (if not all) of the fields.
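A rough, untested sketch of the table-per-concrete-type mapping in code-first (class names follow the question; the exact fluent calls may need adjusting for your EF version):

using System.Data.Entity;

public abstract class EntityModel
{
    public int Id { get; set; }
    // ... shared columns ...
}

public class PastEntity : EntityModel { }
public class PresentEntity : EntityModel { }
public class FutureEntity : EntityModel { }

public class MyContext : DbContext
{
    public DbSet<PastEntity> Past { get; set; }
    public DbSet<PresentEntity> Present { get; set; }
    public DbSet<FutureEntity> Future { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Table-per-concrete-type: each derived class gets its own table
        // containing all of the inherited columns.
        modelBuilder.Entity<PastEntity>()
            .Map(m => { m.MapInheritedProperties(); m.ToTable("tbPast"); });
        modelBuilder.Entity<PresentEntity>()
            .Map(m => { m.MapInheritedProperties(); m.ToTable("tbPresent"); });
        modelBuilder.Entity<FutureEntity>()
            .Map(m => { m.MapInheritedProperties(); m.ToTable("tbFuture"); });
    }
}

Be aware that with this mapping the Id values need to stay unique across all three tables (per-table identity columns can collide), and "moving" a row from Present to Past means creating a PastEntity from the PresentEntity and removing the original.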
How to use two different databases, with a relation between them, in one ASP.NET MVC C# application?
One of the benefits of using Entity Framework 4.0 is that it can handle data from multiple tables, or as in your case, multiple databases. Here is one how-to article. There is somewhat of a learning curve, but many people like this approach, and Microsoft seems to be dedicated to it for the future.
Basically, using EF allows you to do the data mapping in its model, abstracting all the database and table joins from you. You get business objects with class and property names that you can understand, and that are easier to code against.
static table1DataContext context1 = new table1DataContext("ConnectionString1");
static table2DataContext context2 = new table2DataContext("ConnectionString2");

// LINQ statement in C#
// Note: a single LINQ to SQL query can't span two DataContexts, so at least
// one side has to be brought into memory (e.g. ToList) before joining.
var query = from a in context1.table1.ToList()
            from b in context2.table2.ToList()
            where a.ID == b.ID
            select new { a, b };
I have the following scenario: there is a database that generates a new log table every year. It started in 2001 and now has 11 tables. They all have the same structure, thus the same fields, indexes, PKs, etc.
I have some classes called managers that, as the name says, manage every operation on this DB. For each different table I have a manager, except for these log tables, for which I have only one manager.
I've read a lot and tried different things, like using ITable to get tables dynamically, or an interface that all my log tables implement. Unfortunately, I lose the strongly-typed properties, and with that I can't do any searches or updates or anything, since I can't use logTable.Where(q => q.ID == paramId).
Considering that those tables have the same structure, a query that searches logs from 2010 can be exactly the same one that searches logs from 2011 onwards.
I'm only asking this because I wouldn't like to rewrite the same code for each table, since they are equal in structure.
EDIT
I'm using LINQ to SQL as my ORM, and these tables are used in all DB operations, not just selects.
Consider putting all your logs in one table and using partitioning to maintain performance. If that is not feasible, you could create a view that unions all the log tables together and use that when selecting log data. That way, when you add a new log table, you just update the view to include the new table.
EDIT Further to the most recent comment:
Sounds like you need a new DBA if he won't let you create new SPs. Yes, I think you could define an ILogTable interface and then make your log table classes implement it, but that would not allow you to do GetTable<ILogTable>(). You would have to have some kind of DAL class with a method that creates a union query, e.g.:
public IEnumerable<ILogTable> GetLogs()
{
    // the table/property names are whatever the generated context exposes
    var logs2010 = from log in DBContext.Logs2010
                   select (ILogTable)log;

    var logs2011 = from log in DBContext.Logs2011
                   select (ILogTable)log;

    return logs2010.Concat(logs2011);
}
Above code is completely untested and may fail horribly ;-)
Edited to keep #AS-CII happy ;-)
You might want to look into the CodePlex Fluent LINQ to SQL project. I've never used it, but I'm familiar with the ideas from using similar mapping techniques in EF4. You could create a single object and map it dynamically to different tables using syntax such as:
public class LogMapping : Mapping<Log> {
    public LogMapping(int year) {
        Named("Logs" + year);
        // Column mappings...
    }
}
As long as each of your queries returns the same shape, you can use ExecuteQuery<Log>("SELECT cols FROM LogTable" + instance). Just be aware that ExecuteQuery is one case where LINQ to SQL allows for SQL injection. I discuss how to parameterize ExecuteQuery at http://www.thinqlinq.com/Post.aspx/Title/Does-LINQ-to-SQL-eliminate-the-possibility-of-SQL-Injection.
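A small illustration of that point (db is the DataContext; the Log class, column names and table names are placeholders): values go through {0}-style placeholders, which LINQ to SQL turns into real SQL parameters, while the table name can only be concatenated and therefore should be validated against a known list first.

// The table name can't be a SQL parameter, so restrict it to known values.
var validTables = new HashSet<string> { "LogTable2010", "LogTable2011" };
if (!validTables.Contains(tableName))
    throw new ArgumentException("Unknown log table", "tableName");

// {0} is translated into a real SQL parameter, not string concatenation,
// so the date value cannot inject SQL; only the table name is concatenated.
var logs = db.ExecuteQuery<Log>(
    "SELECT Id, LogDate, Message FROM " + tableName + " WHERE LogDate >= {0}",
    fromDate);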