I have a legacy system with three databases
Vendor
CustomCode
LogData
Vendor contains control and log data from our Vendors app.
CustomCode contains lots of views and stored procedures that join to Vendor and LogData.
LogData contains results from our CustomCode processes, e.g. daily/weekly/monthly summaries and results.
I'm writing a website that will plot data on a map. The list of units is from a view in CustomCode. The Summary record is from LogData, and the individual log points are retrieved from Vendor by a stored proc in CustomCode.
I started with a DbContext for CustomCode, but I can't seem to navigate to properties in a second DbContext pointing at LogData.
Can I link navigation properties between objects in different contexts?
Can I have one context with multiple databases connected?
Please note, this is nothing to do with multi-tenant or multi-schema
Can I link navigation properties between objects in different contexts?
No.
Can I have one context with multiple databases connected?
No.
Suggestion:
If the databases can communicate with each other (i.e. they are on the same server), which appears to already be the case since
"CustomCode contains lots of views and stored procedures that join to Vendor and LogData"
then create a stored procedure to perform the desired queries (which can join tables from separate databases).
From there you should be able to expose and execute the procedure from Entity Framework to perform the desired functionality.
This would avoid having multiple contexts and trying to join the data in memory, which can have adverse effects if the data set is large.
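As a rough illustration of exposing such a procedure through EF, here is a minimal sketch assuming EF Core and a hypothetical dbo.GetUnitMapData procedure and UnitMapPoint result type (none of these names are from the original system); the cross-database join happens entirely inside the procedure:
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

// Hypothetical keyless result type for the rows the procedure returns.
public class UnitMapPoint
{
    public int UnitId { get; set; }
    public string UnitName { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}

public class CustomCodeContext : DbContext
{
    public CustomCodeContext(DbContextOptions<CustomCodeContext> options) : base(options) { }

    public DbSet<UnitMapPoint> UnitMapPoints => Set<UnitMapPoint>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Keyless: only used to materialise stored procedure results.
        modelBuilder.Entity<UnitMapPoint>().HasNoKey();
    }
}

public static class MapDataQueries
{
    public static List<UnitMapPoint> GetUnitMapData(CustomCodeContext context, int unitId)
    {
        // The procedure performs the cross-database join on the server;
        // EF only materialises the flattened result set.
        return context.UnitMapPoints
            .FromSqlInterpolated($"EXEC dbo.GetUnitMapData @UnitId = {unitId}")
            .ToList();
    }
}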
Also answered elsewhere (https://stackoverflow.com/a/54347237/861352), but here's the gist:
This actually appears to be a known issue, with a solution in the pipeline (although it hasn't been prioritised yet):
https://github.com/aspnet/EntityFrameworkCore/issues/4019
I did however find an interim solution to this problem, and it's based on two sources:
https://stackoverflow.com/a/26922902/861352 (EF6 solution)
https://weblogs.asp.net/ricardoperes/interception-in-entity-framework-core
And here it is:
How To Do (Same Server) Cross DB Joins With One EF Core DbContext
You'll need to install the Microsoft.Extensions.DiagnosticAdapter NuGet package.
using System;
using System.Data.Common;
using Microsoft.EntityFrameworkCore.Diagnostics;
using Microsoft.Extensions.DiagnosticAdapter;

namespace Example
{
    public class CommandInterceptor
    {
        [DiagnosticName("Microsoft.EntityFrameworkCore.Database.Command.CommandExecuting")]
        public void OnCommandExecuting(DbCommand command, DbCommandMethod executeMethod, Guid commandId, Guid connectionId, bool async, DateTimeOffset startTime)
        {
            var secondaryDatabaseName = "MyOtherDatabase";
            var schemaName = "dbo";
            var tableName = "Users";

            // Qualify the table name with its schema, then prefix it with the other
            // database's name so the generated SQL becomes a cross-database reference.
            command.CommandText = command.CommandText
                .Replace($" [{tableName}]", $" [{schemaName}].[{tableName}]")
                .Replace($" [{schemaName}].[{tableName}]", $" [{secondaryDatabaseName}].[{schemaName}].[{tableName}]");
        }
    }
}
Replace 'MyOtherDatabase', 'dbo' and 'Users' with your database name, table schema and table name, perhaps loaded from configuration.
Then attach that interceptor to your context.
using System.Diagnostics;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.DiagnosticAdapter;

// optionsBuilder is assumed to be configured elsewhere with your primary connection string.
var context = new MultipleDatabasesExampleDbContext(optionsBuilder.Options);

// Add interceptor to switch between databases
var listener = context.GetService<DiagnosticSource>();
(listener as DiagnosticListener).SubscribeWithAdapter(new CommandInterceptor());
In my case I put the above in a MultipleDatabasesExampleDbContextFactory method.
Now you can just use the context as if you were referencing one database.
context.Customers // Default database defined in connection string
context.Users // MyOtherDatabase (a different database on the same server)
Among the new features in EF Core 5.0, it is now easier to create a DbContext instance without any connection or connection string. The connection or connection string can also be mutated on the context instance, which allows the same context instance to dynamically connect to different databases.
Reference: https://learn.microsoft.com/en-us/ef/core/what-is-new/ef-core-5.0/whatsnew#use-a-c-attribute-to-indicate-that-an-entity-has-no-key
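A minimal sketch of that, assuming EF Core 5.0 with the SQL Server provider; the ReportingContext, Summary type and connection strings below are illustrative only:
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Summary { public int Id { get; set; } }

public class ReportingContext : DbContext
{
    public ReportingContext(DbContextOptions<ReportingContext> options) : base(options) { }
    public DbSet<Summary> Summaries => Set<Summary>();
}

public static class Program
{
    public static void Main()
    {
        // Provider is chosen up front, but no connection string yet (new in EF Core 5.0).
        var options = new DbContextOptionsBuilder<ReportingContext>()
            .UseSqlServer()
            .Options;

        using var context = new ReportingContext(options);

        // Point the same context instance at one database...
        context.Database.SetConnectionString("Server=.;Database=LogData;Trusted_Connection=True");
        var summaries = context.Summaries.ToList();

        // ...then switch it to another database on the same server.
        context.Database.SetConnectionString("Server=.;Database=Vendor;Trusted_Connection=True");
    }
}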
No, you cannot link navigation properties between objects in different contexts. A context represents a particular connection or database. You can try getting data from multiple contexts (databases) and then join it in memory.
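For completeness, a rough sketch of that in-memory join, assuming hypothetical Units and Summaries sets on two separate contexts (customCodeContext and logDataContext):
// Load each side from its own DbContext...
var units = customCodeContext.Units.AsNoTracking().ToList();
var summaries = logDataContext.Summaries.AsNoTracking().ToList();

// ...then join the two lists in memory with LINQ to Objects.
var unitSummaries = (from unit in units
                     join summary in summaries on unit.UnitId equals summary.UnitId
                     select new { unit.UnitId, unit.Name, summary.Total }).ToList();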
Related
I apologise if this has been asked already; I am struggling greatly with the terminology of what I am trying to find out about, as it conflicts with functionality in Entity Framework.
What I am trying to do:
I would like to create an application that, on setup, gives the user the option to use one database as a "trial"/"startup" database, i.e. a non-production database. This would allow a user to trial the application, but it would not have backups etc.; in no way would this be a "production" database. This could be SQLite, for example.
When the user is then ready, they could click "convert to production" (or similar) and point it at the new database machine/database. This would be considered the "production" environment. This could be something like MySQL, SQL Server or whatever else EF connects to these days.
The question:
Does EF support this type of migration/data transfer live? Would it need another app where you could configure the EF source and EF destination for it to then run through the process of conversion/seeding/population of the data source to another data source?
Why I have asked here:
I have tried to search for things around this topic, but transferring/migration brings up subjects totally non-related, so any help would be much appreciated.
From what you describe I don't think there is anything out of the box to support that. You can map a DbContext to either database, then it would be a matter of fetching and detaching entities from the evaluation DbContext and attaching them to the production one.
For a relatively simple schema / object graph this would be fairly straight-forward to implement.
ICollection<Customer> customers = new List<Customer>();
using (var context = new AppDbContext(evalConnectionString))
{
    customers = context.Customers.AsNoTracking().ToList();
}

using (var context = new AppDbContext(productionConnectionString))
{
    // Assuming an empty database...
    context.Customers.AddRange(customers);
    context.SaveChanges();
}
Though for more complex models this could take some work, especially when dealing with things like existing lookups/references. Where you want to move objects that might share the same reference to another object you would need to query the destination DbContext for existing relatives and substitute them before saving the "parent" entity.
ICollection<Order> orders = new List<Order>();
using (var context = new AppDbContext(evalConnectionString))
{
    orders = context.Orders
        .Include(x => x.Customer)
        .AsNoTracking()
        .ToList();
}

using (var context = new AppDbContext(productionConnectionString))
{
    var customerIds = orders.Select(x => x.Customer.CustomerId)
        .Distinct().ToList();
    var existingCustomers = context.Customers
        .Where(x => customerIds.Contains(x.CustomerId))
        .ToList();

    foreach (var order in orders)
    {
        // Substitute any customer that already exists in the target database
        // so we don't insert a duplicate row. (Assuming all customers were loaded.)
        var existingCustomer = existingCustomers.SingleOrDefault(x => x.CustomerId == order.Customer.CustomerId);
        if (existingCustomer != null)
            order.Customer = existingCustomer;
        else
            existingCustomers.Add(order.Customer);

        context.Orders.Add(order);
    }
    context.SaveChanges();
}
This is a very simple example to outline how to handle scenarios where you may be inserting data with references that may, or may not exist in the target DbContext. If we are copying across Orders and want to deal with their respective Customers we first need to check if any tracked customer reference exists and use that reference to avoid a duplicate row being inserted or throwing an exception.
Normally, loading the orders and related references from one DbContext ensures that multiple orders referencing the same Customer entity all share the same entity reference. However, to get detached entities that we can associate with the new DbContext we use AsNoTracking(), and detached references to the same record will not be the same object reference, so we need to treat them with care.
For example where there are 2 orders for the same customer:
var ordersA = context.Orders.Include(x => x.Customer).ToList();
Assert.AreSame(ordersA[0].Customer, ordersA[1].Customer); // Passes

var ordersB = context.Orders.Include(x => x.Customer).AsNoTracking().ToList();
Assert.AreSame(ordersB[0].Customer, ordersB[1].Customer); // Fails
Even though in the second example both orders are for the same customer, each will have a Customer reference with the same ID but a different object reference, because the DbContext is not tracking the references used. This is one of the several "gotchas" with detached entities and efforts to boost performance etc. Using tracked references isn't ideal, since those entities will still think they are associated with another DbContext. We can detach them, but that means diving through the object graph and detaching all references. (Doable, but messy compared to just loading them detached.)
Where it can also get complicated is when possibly migrating data in batches (disposing of a DbContext regularly to avoid performance pitfalls for larger data volumes) or synchronizing data over time. It is generally advisable to first check the destination DbContext for matching records and use those to avoid duplicate data being inserted. (or throwing exceptions)
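A rough sketch of that batching idea, reusing the AppDbContext and Customer names from the examples above (the batch size and duplicate check are illustrative): dispose and recreate the destination context for every batch so the change tracker stays small.
const int batchSize = 500;

List<Customer> allCustomers;
using (var source = new AppDbContext(evalConnectionString))
{
    allCustomers = source.Customers.AsNoTracking().ToList();
}

for (int i = 0; i < allCustomers.Count; i += batchSize)
{
    var batch = allCustomers.Skip(i).Take(batchSize).ToList();

    // A fresh context per batch keeps the change tracker small.
    using (var destination = new AppDbContext(productionConnectionString))
    {
        // Skip rows that already exist in the destination to avoid duplicates.
        var batchIds = batch.Select(c => c.CustomerId).ToList();
        var existingIds = destination.Customers
            .Where(c => batchIds.Contains(c.CustomerId))
            .Select(c => c.CustomerId)
            .ToList();

        destination.Customers.AddRange(batch.Where(c => !existingIds.Contains(c.CustomerId)));
        destination.SaveChanges();
    }
}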
So for simple data models this is fairly straightforward. For more complex ones, where there is more data to bring across and more relationships between that data, it's more complicated. For those systems I'd probably look at generating a database-to-database migration, such as creating INSERT statements for the desired target DB from the data in the source database. There it is just a matter of inserting the data in relational order to comply with the data constraints. (Either using a tool or rolling your own script generation.)
As I've mentioned in a couple other questions, I'm currently trying to replace a home-grown ORM with the Entity Framework, now that our database can support it.
Currently, we have certain objects set up such that they are mapped to a table in our internal database and a table in the database that runs our website (which is not even in the same state, let alone on the same server). So, for example:
Part p = new Part(12345);
p.Name = "Renamed part";
p.Update();
will update both the internal and the web databases simultaneously to reflect that the part with ID 12345 is now named "Renamed part". This logic only needs to go one direction (internal -> web) for the time being. We access the web database through a LINQ-to-SQL DBML and its objects.
I think my question has two parts, although it's possible I'm not asking the right question in the first place.
Is there any kind of "OnUpdate()" event/method that I can use to trigger validation of "Should this be pushed to the web?" and then do the pushing? If there isn't anything by default, is there any other way I can insert logic between .SaveChanges() and when it hits the database?
Is there any way that I can specify for each object which DBML object it maps to, and for each EF auto-generated property which property on the L2S object to map to? The names often match up, but not always so I can't rely on that. Alternatively, can I modify the L2S objects in a generic way so that they can populate themselves from the EF object?
Sounds like a job for Sql Server replication.
You don't need to inter-connect the two together as it seems you're saying with question 2.
Just have the two separate databases with their own EF or L2S models and abstract them away using repositories with domain objects.
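A very rough sketch of that abstraction, where the interface, domain type and service names are hypothetical rather than taken from the question: the caller works with a domain Part and one repository per database, so neither ORM model leaks into the other.
// Domain object shared by both sides.
public class Part
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// One repository contract; the internal (EF) database and the web
// (LINQ to SQL) database each get their own implementation.
public interface IPartRepository
{
    Part Get(int id);
    void Update(Part part);
}

// A small service can then update both databases without either
// ORM model leaking out of its repository.
public class PartService
{
    private readonly IPartRepository _internalParts;
    private readonly IPartRepository _webParts;

    public PartService(IPartRepository internalParts, IPartRepository webParts)
    {
        _internalParts = internalParts;
        _webParts = webParts;
    }

    public void Rename(int id, string newName)
    {
        var part = _internalParts.Get(id);
        part.Name = newName;
        _internalParts.Update(part);
        _webParts.Update(part); // push the same change to the web database
    }
}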
This is the solution I ended up going with. Note that the implementation of IAdvantageWebTable is inherited from the existing base class, so nothing special needed to be done for EF-based classes, once the T4 template was modified to inherit correctly.
public partial class EntityContext
{
    public override int SaveChanges(System.Data.Objects.SaveOptions options)
    {
        var modified = this.ObjectStateManager.GetObjectStateEntries(EntityState.Modified | EntityState.Added); // Get the list of things to update
        var result = base.SaveChanges(options); // Call the base SaveChanges, which clears that list.
        using (var context = new WebDataContext()) // This is the second database context.
        {
            foreach (var obj in modified)
            {
                var table = obj.Entity as IAdvantageWebTable;
                if (table != null)
                {
                    table.UpdateWeb(context); // This is IAdvantageWebTable.UpdateWeb(), which calls all the existing logic I've had in place for years.
                }
            }
            context.SubmitChanges();
        }
        return result;
    }
}
I have the following scenario: there is a database that generates a new logTable every year. It started in 2001 and now has 11 tables. They all have the same structure, and thus the same fields, indexes, PKs, etc.
I have some classes called managers that, as the name says, manage every operation on this DB. For each different table I have a manager, except for these logTables, for which I have only one manager.
I've read a lot and tried different things, like using ITable to get tables dynamically or an interface that all my tables implement. Unfortunately, I lose strongly-typed properties, and with that I can't do any searches or updates or anything, since I can't use logTable.Where(q => q.ID == paramId).
Considering that those tables have the same structure, a query that searches logs from 2010 can be exactly the one that searches logs from 2011 onwards.
I'm only asking this because I wouldn't like to rewrite the same code for each table, since they are identical in structure.
EDIT
I'm using LINQ to SQL as my ORM. And these tables use all DB operations, not just select.
Consider putting all your logs in one table and using partitioning to maintain performance. If that is not feasible, you could create a view that unions all the log tables together and use that when selecting log data. That way, when you add a new log table, you just update the view to include the new table.
EDIT Further to the most recent comment:
Sounds like you need a new DBA if he won't let you create new SPs. Yes, I think you could define an ILogTable interface and then make your log table classes implement it, but that would not allow you to do GetTable<ILogTable>(). You would have to have some kind of DAL class with a method that creates a union query, e.g.
public IEnumerable<ILogTable> GetLogs()
{
    // Assumes the generated table properties are named Logs2010, Logs2011, etc.
    // (C# identifiers cannot start with a digit.)
    var log2010 = from log in DBContext.Logs2010
                  select (ILogTable)log;
    var log2011 = from log in DBContext.Logs2011
                  select (ILogTable)log;
    return log2010.Concat(log2011);
}
Above code is completely untested and may fail horribly ;-)
Edited to keep #AS-CII happy ;-)
You might want to look into the CodePlex Fluent LINQ to SQL project. I've never used it, but I'm familiar with the ideas from using similar mapping techniques in EF4. You could create a single object and map it dynamically to different tables using syntax such as:
public class LogMapping : Mapping<Log> {
    public LogMapping(int year) {
        Named("Logs" + year);
        //Column mappings...
    }
}
As long as each of your queries returns the same shape, you can use ExecuteQuery<Log>("SELECT cols FROM LogTable" + instance). Just be aware that ExecuteQuery is one case where LINQ to SQL allows for SQL injection. I discuss how to parameterize ExecuteQuery at http://www.thinqlinq.com/Post.aspx/Title/Does-LINQ-to-SQL-eliminate-the-possibility-of-SQL-Injection.
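A hedged sketch of that ExecuteQuery usage (names are illustrative): the table name itself cannot be a SQL parameter, so the year is kept as an int before being concatenated, while the actual filter value goes through the parameters array and becomes @p0.
public IEnumerable<Log> GetLogs(System.Data.Linq.DataContext context, int year, long id)
{
    // year is an int, so concatenating it cannot inject arbitrary SQL;
    // the id filter is passed as a real parameter ({0} becomes @p0).
    string sql = "SELECT * FROM LogTable" + year.ToString() + " WHERE ID = {0}";
    return context.ExecuteQuery<Log>(sql, id);
}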
I am using Entity Framework with WCF Data Services and I have the following table in my database :
Table Contract
Id (int)
Name (varchar)
byUser (varchar)
isDeleted (bit)
Entity Framework class
Id (int)
Name (string)
byUser (string)
isDeleted (boolean)
Whenever the user inserts/updates/deletes a contract (through a client app), I need to log who performed the action.
So I created stored procedures for insert/update/delete that receive the username from the client when an insertion/deletion/update is performed.
The issue is that the delete operation does not send over who is performing the operation:
var ctx = Context;
var contractToDelete = ctx.Contracts.Where(c => c.ContractId == 1).First();
contractToDelete.ByUser = username;
ctx.DeleteObject(contractToDelete);
ctx.SaveChanges();
On the server side, byUser is always null.
Questions:
1) How do I make it so that the byUser parameter is sent to the server?
2) Is there a better way to handle this kind of scenario (logging/authentication/authorization) with Entity Framework?
It doesn't always send null; it always sends the old value. That is some internal logic in Entity Framework. For each tracked object, EF keeps both the original and the current values. When you are deleting an object, EF doesn't use the current values; it uses the original values (don't ask me why, this is simply how it works).
So you need to cheat EF:
var ctx = Context;
var contractToDelete = ctx.Contracts.Where(c => c.ContractId == 1).First();
contractToDelete.ByUser = username;
ctx.Contracts.ApplyOriginalValues(contractToDelete);
ctx.DeleteObject(contractToDelete);
ctx.SaveChanges();
Calling ApplyOriginalValues forces EF to overwrite the original values with the values passed in the parameter, i.e. you overwrite the original values with the current values.
In my opinion, the better way is storing deleted records in a separate table, because it avoids a lot of problems with passing isDeleted = false to every query; otherwise both eager and lazy loading will load deleted records as well. The only way to avoid problems with isDeleted is using conditional mapping, but in that case you will not be able to load deleted records even if you want to, unless you use stored procedures or direct SQL queries.
The way I managed this is, when my user logs in, I store basic information about them in the session. I then have a class that sits on top of my operations against the context.
Whenever I commit changes back, I go through the same routine, which checks what changed. I developed the ability to trigger actions based upon the entity being worked with (so I can keep an eye on something such as contracts). Then the user can be logged.
[Edit]
This is tougher to clarify than I realised, but I'll try.
I'm creating a web application. Heavily using Ninject.
When the user logs in, I store their information in an IUserSession object (this is really held in Session, but a custom Ninject scope makes this neat for me and prevents me from having to expose my data layer to Web Session). This user session object contains username, user id etc.
I created a class that contains the context and wraps all the SELECT, CREATE, DELETE and COMMIT calls, e.g. SELECT:
public IQueryable<TEntity> All<TEntity>() where TEntity : class {
    return Context.Set<TEntity>();
}
This class also has a Commit method, which makes the call to SaveChanges.
Before calling SaveChanges, you have access to the changes via Context.ChangeTracker.Entries().
For each entity that has changed, you can test whether it was added, deleted or modified. To get the type of the element being modified:
Type baseEntityType = ObjectContext.GetObjectType( entity.Entity.GetType( ) );
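A short sketch of what that Commit method might look like under this approach; the Contract entity, userSession and auditLog members are assumptions for illustration:
public int Commit()
{
    // Inspect pending changes before they are written.
    var pending = Context.ChangeTracker.Entries()
        .Where(e => e.State == EntityState.Added
                 || e.State == EntityState.Modified
                 || e.State == EntityState.Deleted);

    foreach (var entry in pending)
    {
        // Strip off any dynamic proxy to get the mapped entity type.
        Type baseEntityType = ObjectContext.GetObjectType(entry.Entity.GetType());

        // e.g. only audit contracts, recording who made the change.
        if (baseEntityType == typeof(Contract))
        {
            auditLog.Write(userSession.UserId, entry.State.ToString(), baseEntityType.Name);
        }
    }

    return Context.SaveChanges();
}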
I do plan on writing up a tutorial soon, based upon my personal experience with doing this (not that that helps you right now).
In order to access my DB and use my stored procedures I made this very simple data access layer (if you can even call it a "layer"). I have 8 files, where each file looks like:
using System;
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;
public class TasksDBHandler
{
    private static Database db = DatabaseFactory.CreateDatabase("DBNAME");

    public static void SetTaskDepreciationData(long taskId, long fieldId, string value)
    {
        DbCommand command = db.GetStoredProcCommand("dbo.P_CUS_TSK_SetTaskDepreciationData");
        db.AddInParameter(command, "@task_id", DbType.Int64, taskId);
        db.AddInParameter(command, "@field_id", DbType.Int64, fieldId);
        db.AddInParameter(command, "@value", DbType.String, value);
        db.ExecuteNonQuery(command);
    }

    //Many more stored procedure calls
}
I want to build a new and better data access layer, but I don't know what it should look like. I want the ability to use stored procedures without needing to write a static method for each stored procedure, better connection management, and so on.
Does anyone have any clue how to do this?
I am using .NET and SQL Server.
Have you looked at any of the ORM products out there? There's Linq2Sql, Entity Framework, NHibernate, and others. Unless what you need to do is very basic, you'll probably have better results learning to use an existing framework than trying to write your own.
In an ORM like Entity Framework, you typically don't manage your connection manually. It defines an object (or entity) model from your database and a "context" which is responsible for retrieving the data from your database and mapping it to the correct properties on the classes in your entity model. So you request something from the context, it loads the data necessary to fulfill your request into memory, you work with it like any other classes, and then you tell the context to save your changes back to the database.
There are several ways to interact with your entity model in Entity Framework, but the example I'll use is LINQ to Entities. You write a LINQ query, and the context is responsible for turning that into a query against the database. (Disclaimer: I haven't tried to run this code; it's just meant to serve as an example.)
using (MyEntitiesContext context = new MyEntitiesContext())
{
    // Compute the cutoff outside the query so it translates cleanly to SQL.
    DateTime cutoff = DateTime.Now.AddMinutes(-30);

    // Users with no activity in the last 30 minutes are considered idle.
    var idleUsers = from u in context.User
                    where u.LoggedIn && u.LastActivity < cutoff
                    select u;

    foreach (User u in idleUsers)
    {
        u.Status = UserStatus.Idle;
    }

    context.SaveChanges();
}
Obviously there's a lot going on behind the scenes:
There's the whole object model that gets generated from your database (you select the tables you want to be included in your model, and you can create multiple models in the same project).
There's the context, which manages the database connection and turns your Linq expression into a database query
There's the connection string that has to be defined in your .config file so Entity Framework knows how to connect to your database
You should be able to find plenty of information on Entity Framework, but the easiest way I've found to learn is to jump in and start trying to do something, and then find answers to questions as they come up. I wouldn't try to use it right away on something highly critical or time sensitive, as there's definitely a learning curve, and you'll learn better ways of working with it once you've experienced some of the pitfalls.
Here's a link to Microsoft's Entity Framework 4 Quickstart which should give you something fairly straightforward to try out. Have fun!