How to fix slow EF Core 7 performance - C#

I have code more or less like this:
// The Parent entity
public class Parent
{
    public int Id { get; set; }
    public string SomeProperty { get; set; }
    public List<Child> Children { get; set; } // Navigation to Children
}

// The Child entity
public class Child
{
    public int Id { get; set; }
    public string SomeProperty { get; set; }
    public int ParentId { get; set; } // FK to the Parent
    public Parent Parent { get; set; }
}
// The code that saves data to the database
public class MyClass
{
    public async Task DoSomething(List<Parent> newItems)
    {
        var iterationNumber = 0;
        // newItems contains 100,000 Parent/Child objects!
        // Each newItem consists of a Parent and one Child.
        foreach (var newItem in newItems)
        {
            // Commit the changes every 500 iterations
            if (iterationNumber++ % 500 == 0)
            {
                await _db.SaveChangesAsync();
            }
            _db.Set<Parent>().Add(newItem);
            existingItems.Add(newItem); // existingItems is a collection defined elsewhere
        }
        await _db.SaveChangesAsync();
    }
}
So, I'm adding 100,000 new entities to my database. I'm committing the changes every 500 iterations.
I've noticed that the performance of my loop degrades significantly as it proceeds. I'm looking for suggestions for how to improve the performance of this code.
EDIT:
I had assumed the performance degrades because EF is tracking more and more of the objects from newItems. I tried adding _db.ChangeTracker.Clear() after both _db.SaveChangesAsync() calls, but that had no obvious effect on the poor performance.

I think it's good practice to avoid database calls inside a loop if you can.
You can use AddRange, but you'll have to write custom code for batch-wise processing:
// Simplest form: a single AddRange and one SaveChanges
context.Parents.AddRange(newItems);
context.SaveChanges();

// Or batch-wise processing:
const int batchSize = 5000;
var totalCount = newItems.Count;
var batches = (int)Math.Ceiling(totalCount / (double)batchSize);
for (var i = 0; i < batches; i++)
{
    var batch = newItems.Skip(i * batchSize).Take(batchSize);
    context.Parents.AddRange(batch);
    context.SaveChanges();
    // Note: QueryTrackingBehavior.NoTracking only affects query results, not
    // added entities, so clear the change tracker to keep it from growing.
    context.ChangeTracker.Clear();
}
Or use Microsoft.EntityFrameworkCore.BulkExtensions. With BulkExtensions you can perform the batch-wise insertion without writing custom code:
context.BulkInsert(newItems, options =>
{
    options.BatchSize = 5000;
    // disable tracking
    options.TrackingBehavior = TrackingBehavior.NoTracking;
});
BulkInsert has a default timeout of 30 seconds; you can increase it via options.Timeout.
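For example, raising the timeout might look like this (a sketch only; the option name follows the answer's mention of options.Timeout, so verify the exact property and type in your version of the library):

context.BulkInsert(newItems, options =>
{
    options.BatchSize = 5000;
    // Assumption: option name taken from the answer's text; check the library docs.
    options.Timeout = TimeSpan.FromMinutes(5);
});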

I had a similar issue in the past, and what I did was create a new db context instance every few iterations. I used a using () {} scope so the old db context instances are disposed automatically.
I assume that you are using dependency injection to instantiate the db context and that your db context class is DatabaseContext. You need to get an IServiceScopeFactory instance from the constructor and use it to instantiate the db context. Explanations are given in the comments. You can update your code like below.
Note: If you are not using dependency injection to retrieve the db context, you can simply use using (var _db = new DatabaseContext()) { } and remove IServiceScopeFactory and the respective using blocks from the code below.
public class MyClass
{
    // serviceScopeFactory is needed to instantiate a db context from dependency injection.
    private readonly IServiceScopeFactory serviceScopeFactory;

    // retrieve serviceScopeFactory from the constructor
    public MyClass(IServiceScopeFactory serviceScopeFactory)
    {
        this.serviceScopeFactory = serviceScopeFactory;
    }

    public async Task DoSomething(List<Parent> newItems)
    {
        // batch size to save in a single go
        var batchCount = 500;
        // loop over the list and save batch by batch
        for (var i = 0; i < newItems.Count; i += batchCount)
        {
            // create a scope which can instantiate a db context
            using (var scope = this.serviceScopeFactory.CreateScope())
            {
                // instantiate a db context from the scope
                using (var _db = scope.ServiceProvider.GetService<DatabaseContext>())
                {
                    // skip the already-processed items, take the next batch and add them all using AddRange
                    _db.Set<Parent>().AddRange(newItems.Skip(i).Take(batchCount));
                    // save all newly added objects
                    await _db.SaveChangesAsync();
                }
            }
        }
    }
}
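For reference, the non-DI variant mentioned in the note above might look like this (a minimal sketch, assuming DatabaseContext has a parameterless constructor):

public async Task DoSomething(List<Parent> newItems)
{
    var batchCount = 500;
    for (var i = 0; i < newItems.Count; i += batchCount)
    {
        // A fresh context per batch keeps the change tracker small,
        // and it is disposed automatically at the end of the using block.
        using (var _db = new DatabaseContext())
        {
            _db.Set<Parent>().AddRange(newItems.Skip(i).Take(batchCount));
            await _db.SaveChangesAsync();
        }
    }
}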

Related

Calculate NotMapped property when loading from EF Core

We have an entity class defined as below:
[Table("Users", Schema = "Mstr")]
[Audited]
public class User
{
public virtual string FamilyName { get; set; }
public virtual string SurName { get; set; }
[NotMapped]
public virtual string DisplayName
{
get => SurName + " " + FamilyName;
private set { }
}
}
This is working just fine. Now we would like to extract the logic part SurName + " " + FamilyName to a helper class, which is usually injected via dependency injection. Unfortunately DI does not work for an entity class.
Therefore my question: is there any way to intercept the creation of new User objects? Is there a method from EF which I could override to execute some additional logic after a User object was created by EF?
Actually (at least in EF Core 6) you can use DI when constructing entities. The solution is a little bit hacky and relies on the EF Core capability to inject "native" services like the context itself into entity constructors:
Currently, only services known by EF Core can be injected. Support for injecting application services is being considered for a future release.
And on the AccessorExtensions.GetService<TService> extension method, which seems to support resolving services from DI.
So basically just introduce a ctor on the entity accepting your DbContext as a parameter, call GetService on it, and use the service:
public class MyEntity
{
    public MyEntity()
    {
    }

    public MyEntity(SomeContext context)
    {
        var valueProvider = context.GetService<IValueProvider>();
        NotMapped = valueProvider.GetValue();
    }

    public int Id { get; set; }

    [NotMapped]
    public string NotMapped { get; set; }
}

// Example value provider:
public interface IValueProvider
{
    string GetValue();
}

class ValueProvider : IValueProvider
{
    public string GetValue() => "From DI";
}
Example context:
public class SomeContext : DbContext
{
    public SomeContext(DbContextOptions<SomeContext> options) : base(options)
    {
    }

    public DbSet<MyEntity> Entities { get; set; }
}
And example:
var serviceCollection = new ServiceCollection();
serviceCollection.AddTransient<IValueProvider, ValueProvider>();
serviceCollection.AddDbContext<SomeContext>(builder =>
    builder.UseSqlite($"Filename={nameof(SomeContext)}.db"));
var serviceProvider = serviceCollection.BuildServiceProvider();

// init db and add one item
using (var scope = serviceProvider.CreateScope())
{
    var someContext = scope.ServiceProvider.GetRequiredService<SomeContext>();
    someContext.Database.EnsureDeleted();
    someContext.Database.EnsureCreated();
    someContext.Add(new MyEntity());
    someContext.SaveChanges();
}

// check that the value provider is used
using (var scope = serviceProvider.CreateScope())
{
    var someContext = scope.ServiceProvider.GetRequiredService<SomeContext>();
    var myEntities = someContext.Entities.ToList();
    Console.WriteLine(myEntities.First().NotMapped); // prints "From DI"
}
Note that var valueProvider = context.GetService<IValueProvider>(); will throw if the service is not registered, so the following implementation is possibly better:
public MyEntity(SomeContext context)
{
    var serviceProvider = context.GetService<IServiceProvider>();
    var valueProvider = serviceProvider.GetService<IValueProvider>();
    NotMapped = valueProvider?.GetValue() ?? "No Provider";
}
You can also consider removing the not-mapped property and instead creating a separate model plus a service which performs the mapping, as sketched below.
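For instance, a minimal sketch of that separate-model approach (all names here are illustrative, not from the original code):

// A plain read model that carries the computed value.
public class MyEntityModel
{
    public int Id { get; set; }
    public string NotMapped { get; set; }
}

// A mapper registered in DI computes the value outside the entity.
public class MyEntityMapper
{
    private readonly IValueProvider _valueProvider;

    public MyEntityMapper(IValueProvider valueProvider) => _valueProvider = valueProvider;

    public MyEntityModel Map(MyEntity entity) => new MyEntityModel
    {
        Id = entity.Id,
        NotMapped = _valueProvider.GetValue()
    };
}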
Also, EF Core 7 should add a new hook for exactly this case. See this GitHub issue.
UPD. EF Core 7 approach.
EF 7 adds IMaterializationInterceptor (and a bunch of others; see the docs), which can be used for exactly this goal. The updated code can look like the following.
There is no need for a ctor accepting the context on the entity:
public class MyEntity
{
    public int Id { get; set; }

    [NotMapped]
    public string NotMapped { get; set; }
}
Create an interceptor and implement one of its methods (I went with InitializedInstance):
class NotMappedValueGeneratingInterceptor : IMaterializationInterceptor
{
    public static NotMappedValueGeneratingInterceptor Instance = new();

    public object InitializedInstance(MaterializationInterceptionData materializationData, object entity)
    {
        if (entity is MyEntity my)
        {
            var valueProvider = materializationData.Context.GetService<IValueProvider>();
            my.NotMapped = valueProvider.GetValue();
        }
        return entity;
    }
}
And add the interceptor to the context setup. With our DI approach, AddDbContext changes to:
serviceCollection.AddDbContext<SomeContext>(builder =>
    builder.UseSqlite($"Filename={nameof(SomeContext)}.db")
        .AddInterceptors(NotMappedValueGeneratingInterceptor.Instance));
In your DbContext (or whatever your context class is called) you can intercept the SaveChanges() method and override it with your own logic. In my example I override SaveChanges() to set my audit fields automatically, so I don't have to duplicate that code in a million places.
Here is my example. When a new object is being created, you can intercept it; I handle both newly added and modified records, which show up as EntityState.Added and EntityState.Modified.
Here is the code:
public override int SaveChanges()
{
    foreach (var entry in this.ChangeTracker.Entries())
    {
        if (entry.Entity is AuditFields auditFields)
        {
            if (entry.State == EntityState.Added)
            {
                // New record changes
                auditFields.CreateDateTimeUtc = DateTime.UtcNow;
                auditFields.ModifiedDateTimeUtc = DateTime.UtcNow;
                auditFields.Active = true;
            }
            else if (entry.State == EntityState.Modified)
            {
                // Modified record changes
                auditFields.ModifiedDateTimeUtc = DateTime.UtcNow;
            }
        }
    }
    return base.SaveChanges();
}
Since you said:
is there any way to intercept the creation of new User objects?
you would want to put your logic in the EntityState.Added area of the code above. That lets you intercept the creation of your new User and do whatever you want before it is saved to the database, as sketched below.
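For instance, inside the EntityState.Added branch above, a User-specific hook might look like this (illustrative only; substitute whatever logic your User needs):

var newUsers = this.ChangeTracker.Entries()
    .Where(e => e.State == EntityState.Added)
    .Select(e => e.Entity)
    .OfType<User>();

foreach (var user in newUsers)
{
    // Run any additional logic for a newly created User before it is saved.
    Console.WriteLine($"About to insert user: {user.SurName} {user.FamilyName}");
}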

Reactive services. Grouping and caching streams

New: Entire source code with tests is now at https://github.com/bboyle1234/ReactiveTest
Let's imagine we have a view state object that is able to be updated by small partial view change events. Here are some example models of the total view, the incremental view update events and the accumulator function Update that builds the total view:
interface IDeviceView : ICloneable {
    Guid DeviceId { get; }
}

class DeviceTotalView : IDeviceView {
    public Guid DeviceId { get; set; }
    public int Voltage { get; set; }
    public int Currents { get; set; }
    public object Clone() => this.MemberwiseClone();
}

class DeviceVoltagesUpdateView : IDeviceView {
    public Guid DeviceId { get; set; }
    public int Voltage { get; set; }
    public object Clone() => this.MemberwiseClone();
}

class DeviceCurrentsUpdateView : IDeviceView {
    public Guid DeviceId { get; set; }
    public int Current { get; set; }
    public object Clone() => this.MemberwiseClone();
}

class DeviceUpdateEvent {
    public DeviceTotalView View;
    public IDeviceView LastUpdate;
}

static DeviceUpdateEvent Update(DeviceUpdateEvent previousUpdate, IDeviceView update) {
    if (update.DeviceId != previousUpdate.View.DeviceId)
        throw new InvalidOperationException("Device ids do not match (numskull exception).");
    var view = (DeviceTotalView)previousUpdate.View.Clone();
    switch (update) {
        case DeviceVoltagesUpdateView x: {
            view.Voltage = x.Voltage;
            break;
        }
        case DeviceCurrentsUpdateView x: {
            view.Currents = x.Current;
            break;
        }
    }
    return new DeviceUpdateEvent { View = view, LastUpdate = update };
}
Next, let's imagine we already have an injectable service that is able to produce an observable stream of the small update events for all devices, and that we want to create a service that can produce an aggregated view stream for individual devices.
Here is the interface of the service we want to create:
interface IDeviceService {
    /// <summary>
    /// Gets an observable that produces aggregated update events for the device with the given deviceId.
    /// On subscription, the most recent event is immediately pushed to the subscriber.
    /// There can be multiple subscribers.
    /// </summary>
    IObservable<DeviceUpdateEvent> GetDeviceStream(Guid deviceId);
}
How can I implement this interface and its requirements using the Reactive Extensions in the System.Reactive v4 library, targeting .NET Standard 2.0? Here's my boilerplate code with comments; that's as far as I've been able to get.
class DeviceService : IDeviceService {
    readonly IObservable<IDeviceView> Source;

    public DeviceService(IObservable<IDeviceView> source) { // injected parameter
        /// When injected here, "source" is cold in the sense that it won't produce events until the first time it is subscribed.
        /// "source" will throw an exception if its "Subscribe" method is called more than once, as it is intended to have only one
        /// observer and be read all the way from the beginning.
        Source = source;
        /// Callers of the "Subscribe" method below will expect data to be preloaded and will expect to be immediately delivered the
        /// most recent event. So we need to immediately subscribe to "source" and start preloading the aggregate streams.
        /// I'm assuming there is going to need to be a GroupBy to split the stream by device id.
        var groups = source.GroupBy(x => x.DeviceId);
        /// Now somehow we need to perform the aggregate function on each grouping,
        /// and create an observable that immediately delivers the most recent aggregated event when "Subscribe" is called below.
    }

    public IObservable<DeviceUpdateEvent> GetDeviceStream(Guid deviceId) {
        /// How do we implement this? The observable that we return must be pre-loaded with the latest update.
        throw new NotImplementedException();
    }
}
You have some weird code in that gist. Here's what I got working:
public class DeviceService : IDeviceService, IDisposable
{
    readonly IObservable<IDeviceView> Source;
    private readonly Dictionary<Guid, IObservable<DeviceUpdateEvent>> _updateStreams = new Dictionary<Guid, IObservable<DeviceUpdateEvent>>();
    private readonly IObservable<(Guid, IObservable<DeviceUpdateEvent>)> _groupedStream;
    private readonly CompositeDisposable _disposable = new CompositeDisposable();

    public DeviceService(IObservable<IDeviceView> source)
    {
        Source = source;
        _groupedStream = source
            .GroupBy(v => v.DeviceId)
            .Select(o => (o.Key, o
                .Scan(new DeviceUpdateEvent { View = DeviceTotalView.GetInitialView(o.Key), LastUpdate = null },
                      (lastTotalView, newView) => lastTotalView.Update(newView))
                .Replay(1)
                .RefCount()
            ));
        var groupSubscription = _groupedStream.Subscribe(t =>
        {
            _updateStreams[t.Item1] = t.Item2;
            _disposable.Add(t.Item2.Subscribe());
        });
        _disposable.Add(groupSubscription);
    }

    public void Dispose()
    {
        _disposable.Dispose();
    }

    public IObservable<DeviceUpdateEvent> GetDeviceStream(Guid deviceId)
    {
        /// The observable that we return must be pre-loaded with the latest update.
        if (this._updateStreams.ContainsKey(deviceId))
            return this._updateStreams[deviceId];
        return _groupedStream
            .Where(t => t.Item1 == deviceId)
            .Select(t => t.Item2)
            .Switch();
    }
}
The meat here is the _groupedStream piece. You group by DeviceId, as you said, then you use Scan to update state. I also moved Update to a static class and made it an extension method. You'll need an initial state, so I modified your DeviceTotalView class to get that. Modify accordingly:
public class DeviceTotalView : IDeviceView
{
    public Guid DeviceId { get; set; }
    public int Voltage { get; set; }
    public int Currents { get; set; }
    public object Clone() => this.MemberwiseClone();

    public static DeviceTotalView GetInitialView(Guid deviceId)
    {
        return new DeviceTotalView
        {
            DeviceId = deviceId,
            Voltage = 0,
            Currents = 0
        };
    }
}
Next, the .Replay(1).RefCount() serves to remember the most recent update and provide it on subscription. We then stuff all of these child observables into a dictionary for easy retrieval in the method call. The dummy subscriptions (_disposable.Add(t.Item2.Subscribe())) are necessary for Replay to work.
In the event that there's an early request for a DeviceId that doesn't yet have an update, we subscribe to the _groupedStream which will wait for the first update, producing that Id's observable, then .Switch subscribes to that child observable.
However, all of this failed against your test code, I'm guessing because of the ConnectableObservableForAsyncProducerConsumerQueue class. I didn't want to debug that, because I wouldn't recommend doing something like that. In general it's not recommended to mix TPL and Rx code: the problems they solve largely overlap, and they get in each other's way. So I modified your test code, replacing that connectable observable queue thing with a Replay subject.
I also added a test case for an early request (before any updates for that device have arrived):
DeviceUpdateEvent deviceView1 = null;
DeviceUpdateEvent deviceView2 = null;
DeviceUpdateEvent deviceView3 = null;
var subject = new ReplaySubject<IDeviceView>();
var id1 = Guid.NewGuid();
var id2 = Guid.NewGuid();
var id3 = Guid.NewGuid();
subject.OnNext(new DeviceVoltagesUpdateView { DeviceId = id1, Voltage = 1 });
subject.OnNext(new DeviceVoltagesUpdateView { DeviceId = id1, Voltage = 2 });
subject.OnNext(new DeviceVoltagesUpdateView { DeviceId = id2, Voltage = 100 });
var service = new DeviceService(subject);
service.GetDeviceStream(id1).Subscribe(x => deviceView1 = x);
service.GetDeviceStream(id2).Subscribe(x => deviceView2 = x);
service.GetDeviceStream(id3).Subscribe(x => deviceView3 = x);
/// I believe there is no need to pause here because the Subscribe method calls above
/// block until the events have all been pushed into the subscribers above.
Assert.AreEqual(deviceView1.View.DeviceId, id1);
Assert.AreEqual(deviceView2.View.DeviceId, id2);
Assert.AreEqual(deviceView1.View.Voltage, 2);
Assert.AreEqual(deviceView2.View.Voltage, 100);
Assert.IsNull(deviceView3);
subject.OnNext(new DeviceVoltagesUpdateView { DeviceId = id2, Voltage = 101 });
Assert.AreEqual(deviceView2.View.Voltage, 101);
subject.OnNext(new DeviceVoltagesUpdateView { DeviceId = id3, Voltage = 101 });
Assert.AreEqual(deviceView3.View.DeviceId, id3);
Assert.AreEqual(deviceView3.View.Voltage, 101);
That passes fine and can be run without async.
Also, as a general tip, I would recommend writing unit tests for Rx code with the Microsoft.Reactive.Testing package rather than time-gapping things.
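A minimal sketch of that style, reusing the device types from the question (this is not one of the gist's tests; TestScheduler drives virtual time, so no real waiting is needed):

var scheduler = new TestScheduler();
var id1 = Guid.NewGuid();

// Two updates arrive at virtual times 100 and 200 ticks.
var source = scheduler.CreateColdObservable(
    ReactiveTest.OnNext<IDeviceView>(100, new DeviceVoltagesUpdateView { DeviceId = id1, Voltage = 1 }),
    ReactiveTest.OnNext<IDeviceView>(200, new DeviceVoltagesUpdateView { DeviceId = id1, Voltage = 2 }));

var service = new DeviceService(source);
var observer = scheduler.CreateObserver<DeviceUpdateEvent>();
service.GetDeviceStream(id1).Subscribe(observer);

scheduler.Start(); // advances virtual time past both updates

Assert.AreEqual(2, observer.Messages.Last().Value.Value.View.Voltage);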
A huge thanks to @Shlomo for the answer above.
The implementation given in the accepted answer, whilst a magical education for me, had a couple of issues that also needed to be solved in turn. The first was a thread race problem, and the second was performance when a large number of devices were in the system. I ended up solving the thread race AND dramatically improving performance with this modified implementation:
In the constructor, the grouped and scanned device stream is subscribed directly to a BehaviorSubject, which provides the Replay(1).RefCount() functionality required to immediately notify new subscribers of the latest value in the stream.
In the GetDeviceStream method, we continue to use a dictionary lookup to find the device stream, creating a preloaded BehaviorSubject if it doesn't already exist in the dictionary. We have removed the Where search that existed in the implementation above. The Where search caused a thread race, which was solved by making the grouped stream replayable, which in turn caused an exponential performance problem. Replacing the Where with FirstOrDefault cut the time taken in half, and removing it completely in favor of the get-or-create dictionary technique gave perfect performance: O(1) instead of O(n²).
GetCreateSubject uses a Lazy as the dictionary value because ConcurrentDictionary can sometimes call the create function more than once for a single key. Supplying a Lazy to the dictionary ensures that the Value property is only read on one of the lazies, and therefore only one BehaviorSubject is created per device.
class DeviceService : IDeviceService, IDisposable {
    readonly CompositeDisposable _disposable = new CompositeDisposable();
    readonly ConcurrentDictionary<Guid, Lazy<BehaviorSubject<DeviceUpdateEvent>>> _streams = new ConcurrentDictionary<Guid, Lazy<BehaviorSubject<DeviceUpdateEvent>>>();

    BehaviorSubject<DeviceUpdateEvent> GetCreateSubject(Guid deviceId) {
        return _streams.GetOrAdd(deviceId, Create).Value;

        Lazy<BehaviorSubject<DeviceUpdateEvent>> Create(Guid id) {
            return new Lazy<BehaviorSubject<DeviceUpdateEvent>>(() => {
                var subject = new BehaviorSubject<DeviceUpdateEvent>(DeviceUpdateEvent.GetInitialView(deviceId));
                _disposable.Add(subject);
                return subject;
            });
        }
    }

    public DeviceService(IConnectableObservable<IDeviceView> source) {
        _disposable.Add(source
            .GroupBy(x => x.DeviceId)
            .Subscribe(deviceStream => {
                _disposable.Add(deviceStream
                    .Scan(DeviceUpdateEvent.GetInitialView(deviceStream.Key), DeviceUtils.Update)
                    .Subscribe(GetCreateSubject(deviceStream.Key)));
            }));
        _disposable.Add(source.Connect());
    }

    public void Dispose() {
        _disposable.Dispose();
    }

    public IObservable<DeviceUpdateEvent> GetDeviceStream(Guid deviceId) {
        return GetCreateSubject(deviceId).AsObservable();
    }
}
[TestMethod]
public async Task Test2() {
    var input = new AsyncProducerConsumerQueue<IDeviceView>();
    var source = new ConnectableObservableForAsyncProducerConsumerQueue<IDeviceView>(input);
    var service = new DeviceService(source);
    var ids = Enumerable.Range(0, 100000).Select(i => Guid.NewGuid()).ToArray();
    var idsRemaining = ids.ToHashSet();
    var t1 = Task.Run(async () => {
        foreach (var id in ids) {
            await input.EnqueueAsync(new DeviceVoltagesUpdateView { DeviceId = id, Voltage = 1 });
        }
    });
    var t2 = Task.Run(() => {
        foreach (var id in ids) {
            service.GetDeviceStream(id).Subscribe(x => idsRemaining.Remove(x.View.DeviceId));
        }
    });
    await Task.WhenAll(t1, t2);
    var sw = Stopwatch.StartNew();
    while (idsRemaining.Count > 0) {
        if (sw.Elapsed.TotalSeconds > 600) throw new Exception("Failed");
        await Task.Delay(100);
    }
}
See entire problem source code and test code here: https://github.com/bboyle1234/ReactiveTest

Does a foreach of an IQueryable-as-IEnumerable allow previously iterated objects to be GC'ed / reclaimed?

Given, for example:
IEnumerable<LargeObject> Read(int x) {
    // Implicit IQueryable<LargeObject> -> IEnumerable<LargeObject>
    return ef6Context.LargeObjects.Where(o => o.ObjectType == x);
}

var largeObjects = Read(21281); // Returns "many" objects

// Only use/iteration of the IEnumerable result
foreach (var o in largeObjects) {
    // When processing the second item (and so on),
    // can the first (previous) "o" object be GC'ed?
    Process(o);
}
Are the 'large objects' that have been processed on previous iterations eligible for garbage collection?
The answer should cover any internal EF caching, as appropriate.
Entity Framework keeps all entities retrieved from the database in the context, so they will not be collected after iteration. This is done in order to track changes. You can disable this feature by calling AsNoTracking on your query before execution (docs). Doing so means SaveChanges will not persist any changes made to these entities.
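Applied to the Read method from the question, that looks like:

IEnumerable<LargeObject> Read(int x) {
    // Entities from this query are not added to the context's cache.
    return ef6Context.LargeObjects
        .Where(o => o.ObjectType == x)
        .AsNoTracking();
}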
As for the question of whether the object is released after every iteration, the answer is yes. I created a simple demo that illustrates this by adding logging code to the constructor and destructor and adding an explicit GC call. The destructor was called after every iteration.
public class Program
{
    public static void Main(string[] args)
    {
        using (var db = new MyContext())
        {
            //for (int i = 0; i < 1000; i++)
            //{
            //    db.Enta.Add(new Entity());
            //}
            //db.SaveChanges();
            foreach (var e in db.Enta.AsNoTracking())
            {
                Console.WriteLine(e.Id);
                GC.Collect();
            }
            GC.Collect();
        }
    }
}

public class Entity
{
    public Entity()
    {
        Console.WriteLine("ctor");
    }

    ~Entity()
    {
        Console.WriteLine("destructor");
    }

    public int Id { get; set; }
}

public class MyContext : DbContext
{
    public DbSet<Entity> Enta { get; set; }
}

LINQ: transparent record inserting and updating

What is the customary way to carry out record inserting/updating?
I have this Log table in the MS SQL Server database, and a C# class (the example is simplified):
[Table(Name = "dbo.Sys_Log")]
public class Sys_Log
{
// Read-only, db-generated primary key ID
private int _logID;
[Column(IsPrimaryKey=true, Storage="_logID", IsDbGenerated=true)]
public int logID
{
get
{
return this._logID;
}
}
// Read-only db-generated datetime field
private System.DateTime _logTime;
[Column(Storage="_logTime", IsDbGenerated=true)]
public System.DateTime logTime
{
get
{
return this._logTime;
}
}
// Read-write string field
private string _logEvent;
[Column(Storage="_logEvent")]
public string logEvent
{
get
{
return this._logEvent;
}
set
{
this._logEvent = value;
}
}
public Sys_Log() {}
public Sys_Log(string logEvent)
{
this.logEvent = logEvent;
}
}
And this is how I add a log entry:
Table<Sys_Log> linqLog = db.GetTable<Sys_Log>();
Sys_Log l = new Sys_Log("event");
linqLog.InsertOnSubmit(l);
db.SubmitChanges();
I am not particularly happy about this code. I'd like something like this instead:
Sys_Log.Log("event");
I have an idea of how this can be achieved, but I'd like to know if I am following the LINQ philosophy. With this code added to the Sys_Log class:
private static DataContext db;

public static void Connect(DataContext db)
{
    Sys_Log.db = db;
}

public static void Log(string logEvent)
{
    Table<Sys_Log> linqLog = db.GetTable<Sys_Log>();
    Sys_Log l = new Sys_Log(logEvent);
    linqLog.InsertOnSubmit(l);
    db.SubmitChanges();
}
I can now do this:
Sys_Log.Connect(db); // Only once, at init
Sys_Log.Log("event1");
Sys_Log.Log("event2");
Are there any pitfalls, apart from the fact that the database is updated several times, which could be considered inefficient?
Update:
Following the advice of @usr not to reuse the DataContext object, I have made these changes to the Sys_Log class:
private static SqlConnection db;

public static void Connect(SqlConnection db)
{
    Sys_Log.db = db;
}

public static void Log(string logEvent)
{
    DataContext ctx = new DataContext(db);
    ctx.CommandTimeout = 240;
    Table<Sys_Log> linqLog = ctx.GetTable<Sys_Log>();
    Sys_Log l = new Sys_Log(logEvent);
    linqLog.InsertOnSubmit(l);
    ctx.SubmitChanges();
}
Use a fresh data context each time. Reusing the same context has two catastrophic consequences:
No entity memory is ever released.
When an invalid entity enters the context (due to a bug), it is stuck and will forever prevent SubmitChanges from succeeding. The application will never recover.
Also note that L2S is deprecated and EF has superseded it.
You can share a SqlConnection and use it long-term if you really want. That requires, though, that you deal with broken connections. Thanks to connection pooling, there is little performance incentive to do this.
It is usually easiest and clearest to use throw-away connections. Inject a factory, for example:
Func<SqlConnection> myFactory = () => new SqlConnection(myConnStr);
That's all there is to it. Use it, as always, with using:
using(var conn = myFactory()) { ... }

Exception deleting using Entity Framework (C#)

I have a problem with some simple code. I'm refactoring some existing code from LINQ to SQL to the Entity Framework. I'm testing my saves and deletes, and the delete is really bugging me:
[TestMethod]
public void TestSaveDelete()
{
    ObjectFactory.Initialize(x =>
    {
        x.For<IArticleCommentRepository>().Use<ArticleCommentRepository>();
    });

    PLArticleComment plac = new PLArticleComment();
    plac.Created = DateTime.Now;
    plac.Email = "myemail";
    plac.Name = "myName";
    plac.Text = "myText";
    plac.Title = "myTitle";

    IArticleCommentRepository acrep = ObjectFactory.GetInstance<IArticleCommentRepository>();
    try
    {
        PortalLandEntities ple = new PortalLandEntities();
        int count = ple.PLArticleComment.Count();
        acrep.Save(plac);
        Assert.AreEqual(ple.PLArticleComment.Count(), count + 1);
        //PLArticleComment newPlac = ple.PLArticleComment.First(m => m.Id == plac.Id);
        //ple.Attach(newPlac);
        acrep.Delete(plac);
        Assert.AreEqual(ple.PLArticleComment.Count(), count + 1);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
Every time I try to run this code, I get an exception in the delete statement telling me that it's not contained within the current ObjectStateManager. Please note that both my Save and Delete look like this:
public void Delete(PLCore.Model.PLArticleComment comment)
{
    using (PortalLandEntities ple = Connection.GetEntityConnection())
    {
        ple.DeleteObject(comment);
        ple.SaveChanges();
    }
}

public void Save(PLCore.Model.PLArticleComment comment)
{
    using (PortalLandEntities ple = Connection.GetEntityConnection())
    {
        ple.AddToPLArticleComment(comment);
        ple.SaveChanges();
    }
}
and the connection thingy:
public class Connection
{
    public static PortalLandEntities GetEntityConnection()
    {
        return new PortalLandEntities();
    }
}
Any ideas on what I could do to make it work?
You cannot load an entity from one ObjectContext (in your case, an ObjectContext is an instance of PortalLandEntities) and then delete it from another ObjectContext, unless you detach it from the first and attach it to the second. Your life will be much, much simpler if you use only one ObjectContext at a time. If you cannot do that, you must manually Detach and then Attach first, all the while keeping track of which entities are connected to which ObjectContext.
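For example, a Delete that attaches first might look like this (a sketch only; it assumes the entity set is named "PLArticleComment" and that the entity carries valid key values):

public void Delete(PLCore.Model.PLArticleComment comment)
{
    using (PortalLandEntities ple = Connection.GetEntityConnection())
    {
        // Attach the detached entity to this context before deleting it.
        ple.AttachTo("PLArticleComment", comment);
        ple.DeleteObject(comment);
        ple.SaveChanges();
    }
}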
How to use DI with your Connection: make it non-static.
public class Connection
{
    private PortalLandEntities _entities;

    public PortalLandEntities GetEntityConnection()
    {
        return _entities;
    }

    public Connection(PortalLandEntities entities)
    {
        this._entities = entities;
    }
}
Then use a DI container per request. Most people do this via a controller factory.
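For example, with the StructureMap-style ObjectFactory from the question, a per-request registration might look roughly like this (a sketch; the exact scoping API depends on your StructureMap version):

ObjectFactory.Initialize(x =>
{
    // Assumption: HttpContextScoped is available in your StructureMap version.
    x.For<PortalLandEntities>().HttpContextScoped().Use(() => new PortalLandEntities());
    x.For<IArticleCommentRepository>().Use<ArticleCommentRepository>();
});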
