Entity Framework 6 memory usage reaches 2GB - c#

I am building an application using Entity Framework 6, but I am running into problems with memory usage. No matter what I try, I sooner or later run into an out-of-memory error. So far I have tried the following:
Wrapping the context in a using block.
Saving changes in batches and disposing of the context.
Manually calling GC.Collect().
But none of these prevent Entity Framework from using more memory with every SaveChanges I do, eventually hitting the 2GB limit and crashing my program.
Is there any way I am unaware of to make the Entity Framework release all memory?
Edit
using (var sqlite = new myEntities())
{
    sqlite.Configuration.AutoDetectChangesEnabled = false;
    sqlite.Configuration.ValidateOnSaveEnabled = false;

    foreach (var someItem in someList)
    {
        var newItem = new Item
        {
            ...
        };
        sqlite.tableName.Add(newItem);

        if (++countRecords % 1000 == 0)
        {
            sqlite.SaveChanges();
        }
    }

    sqlite.SaveChanges();
}
As described above, I have also tried creating the context without the using block and disposing of it after the SaveChanges:
if (++countRecords % 1000 == 0)
{
    sqlite.SaveChanges();
    sqlite.Dispose();
    sqlite = new myEntities();
}

If it is indeed a batch issue, try something like this:
int batchSize = 10;
for (int i = 0; i <= someList.Count / batchSize; i++)
{
    var batch = someList.Skip(batchSize * i).Take(batchSize);

    using (var sqllite = new myEntities())
    {
        foreach (var item in batch)
        {
            var newItem = new Item() { ... };
            sqllite.tableName.Add(newItem);
        }

        sqllite.SaveChanges();
    }
}
This moves the using statement inside the loop so the context is disposed after each batch. Disposing the context discards its change tracker, which is what keeps holding a reference to every entity you add, so each batch starts fresh.
This code was written in Notepad++, so be careful to clean it up if you try it out.
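If it helps, here is a combined sketch that merges the settings from the question with the batch-scoped context above (same names as the question; untested, so treat it as a starting point rather than a drop-in fix):
int batchSize = 1000;
for (int i = 0; i <= someList.Count / batchSize; i++)
{
    var batch = someList.Skip(batchSize * i).Take(batchSize);

    using (var sqlite = new myEntities())
    {
        // Turning these off speeds up bulk inserts; the tracked entities are
        // released when the context is disposed at the end of each batch.
        sqlite.Configuration.AutoDetectChangesEnabled = false;
        sqlite.Configuration.ValidateOnSaveEnabled = false;

        foreach (var someItem in batch)
        {
            sqlite.tableName.Add(new Item { /* ... */ });
        }

        sqlite.SaveChanges();
    }
}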

Related

Unity crashes when reading a large csv file

I need to find a way to read information out of a very big CSV file with Unity. The file has approx. 15000*4000 entries, is almost 200MB, and could get even bigger.
Just using ReadAllLines on the file does kind of work, but as soon as I try to do any operation on it, it crashes. Here is the code I am using, which just counts all non-zero values and already crashes it. It's okay if the code needs some loading time, but it shouldn't crash. I assume it's because I keep everything in memory and therefore flood my RAM? Any ideas how to fix this so it won't crash?
private void readCSV()
{
    string[] lines = File.ReadAllLines("Assets/Datasets/testCsv.csv");

    foreach (string line in lines)
    {
        List<string> values = new List<string>();
        values = line.Split(',').ToList();

        int i = 0;
        foreach (string val in values)
        {
            if (val != "0")
            {
                i++;
            }
        }
    }
}
As I already stated in your other question, you should rather go with a streamed solution in order to not load the entire thing into memory at all.
Also, both file IO and string.Split are slow, especially for so many entries! Rather use a background thread / async Task for this!
The next possible future issue: in your case 15000*4000 entries means a total of 60,000,000 cells, which is still fine. However, the maximum value of int is 2,147,483,647, so if your file grows further a plain int counter might break / behave unexpectedly => rather use e.g. uint or directly ulong to avoid that issue.
private async Task<ulong> CountNonZeroEntries()
{
    ulong count = 0;

    // Using a stream reader you can load the content into memory one line at a time
    using (var sr = new StreamReader("Assets/Datasets/testCsv.csv"))
    {
        while (true)
        {
            var line = await sr.ReadLineAsync();
            if (line == null) break;

            var values = line.Split(',');
            foreach (var v in values)
            {
                if (v != "0") count++;
            }
        }
    }

    return count;
}
And then of course you would need to wait for the result, e.g. using:
// If you declare Start as async, Unity automatically calls it asynchronously
private async void Start()
{
    var count = await CountNonZeroEntries();
    Debug.Log($"{count} cells are != \"0\".");
}
The same can be done using LINQ, which is a bit easier to write in my eyes:
using System.Linq;
...
private Task<ulong> CountNonZeroEntries()
{
    // File.ReadLines is lazy, so this still streams the file line by line
    return Task.Run(() => (ulong)File.ReadLines("Assets/Datasets/testCsv.csv")
        .SelectMany(line => line.Split(','))
        .LongCount(v => v != "0"));
}
Also, File.ReadLines doesn't load the entire content at once but rather returns a lazy enumerable, so you can run LINQ queries over the lines one by one.

Single ScopeTransaction with multiple DbContexts

I have searched a lot on the Internet without finding a similar case. I have one TransactionScope with several DbContexts.
I want to commit changes to the database only if all contexts have saved their changes successfully.
The problem I'm facing is that I had to call generalContext.SaveChanges in the middle of the code, because the changes apply to data that was retrieved by generalContext some time earlier, and I noticed the changes are committed right away after calling generalContext.SaveChanges().
What am I doing wrong?
I have tried TransactionScopeOption.Required and TransactionScopeOption.RequiresNew, without being able to solve the problem:
var transactionOptions = new TransactionOptions();
transactionOptions.Timeout = TimeSpan.FromMinutes(30);

using (var scope = new TransactionScope(TransactionScopeOption.Required, transactionOptions))
using (var generalContext = new CoreEntities())
{
    try
    {
        using (var subContext = new CoreEntities())
        {
            // the problem is here, that the changes are committed!!!
            generalContext.SaveChanges();
            subContext.SaveChanges();
        }

        scope.Complete();
    }
    catch
    {
        scope.Dispose();
    }
}
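One alternative worth sketching (not a verified fix; it assumes a SQL Server backend and a CoreEntities constructor that accepts an existing connection, neither of which is shown in the question) is to share a single connection and a single DbTransaction across both contexts via EF6's Database.UseTransaction, so nothing becomes permanent until the one transaction commits:
// connectionString is a placeholder; SqlConnection comes from System.Data.SqlClient.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    using (var transaction = connection.BeginTransaction())
    using (var generalContext = new CoreEntities(connection, contextOwnsConnection: false))
    using (var subContext = new CoreEntities(connection, contextOwnsConnection: false))
    {
        generalContext.Database.UseTransaction(transaction);
        subContext.Database.UseTransaction(transaction);

        generalContext.SaveChanges(); // written to the database, but not yet committed
        subContext.SaveChanges();

        transaction.Commit();         // both sets of changes become visible together
    }
}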

Pooling MySQL Connections with Microsoft Enterprise Library

My setup is MySql.Data.MySqlClient v6.9.8.0 and Microsoft.Practices.EnterpriseLibrary.Data v6.0.0.
The program is a long-running program that runs continuously, listening for tasks and then performing the job with some form of database action (depending on what the request was). Sometimes the requests will be one after the other, sometimes there will be several hours between them.
I've tried using Pooling=true in the connection string, but it causes me a lot of problems (not all the time - these are intermittent problems).
Here is an example:
[MySqlException (0x80004005): Authentication to host 'localhost' for user 'root' using method 'mysql_native_password' failed with message: Reading from the stream has failed.]
Turning off pooling fixes the problem, but at the same time it makes the queries slower because we can't reuse connections. I've searched online and a lot of people have this same issue, and the only fix/workaround I've found is Pooling=false, which I'd rather avoid if possible.
Here is an example of my query code:
Database db = this.GetDatabase(databaseName);
List<dynamic> results = new List<dynamic>();

// Run the sql query
using (DbCommand dbCommand = db.GetSqlStringCommand(query))
{
    foreach (var parameter in inParameters)
    {
        db.AddInParameter(dbCommand, parameter.Key, parameter.Value.Item1, parameter.Value.Item2);
    }

    foreach (var parameter in outParameters)
    {
        db.AddOutParameter(dbCommand, parameter.Key, parameter.Value.Item1, parameter.Value.Item2);
    }

    using (IDataReader dataReader = db.ExecuteReader(dbCommand))
    {
        IDictionary<string, object> instance;
        do
        {
            // Read each row
            while (dataReader.Read())
            {
                instance = new ExpandoObject() as IDictionary<string, object>;

                // Populate the object on the fly with the data
                for (int i = 0; i < dataReader.FieldCount; i++)
                {
                    instance.Add(dataReader.GetName(i), dataReader[i]);
                }

                // Add the object to the results list
                results.Add(instance);
            }
        } while (dataReader.NextResult());
    }

    return results;
}
Any ideas?
Can you try this? I know, I know - using "using" should mean I don't have to call the dataReader.Close() method... but I still do it. I also slightly altered the dataReader.Read block.
This guy talks about it:
http://www.joseguay.com/uncategorized/ensure-proper-closure-disposal-of-a-datareader
I know, I know. You shouldn't have to. Even when using the Enterprise Library, I do an extra .Close step to try and make sure.
Database db = this.GetDatabase(databaseName);
List<dynamic> results = new List<dynamic>();

// Run the sql query
using (DbCommand dbCommand = db.GetSqlStringCommand(query))
{
    foreach (var parameter in inParameters)
    {
        db.AddInParameter(dbCommand, parameter.Key, parameter.Value.Item1, parameter.Value.Item2);
    }

    foreach (var parameter in outParameters)
    {
        db.AddOutParameter(dbCommand, parameter.Key, parameter.Value.Item1, parameter.Value.Item2);
    }

    using (IDataReader dataReader = db.ExecuteReader(dbCommand))
    {
        IDictionary<string, object> instance;
        while (dataReader.Read())
        {
            instance = new ExpandoObject() as IDictionary<string, object>;

            // Populate the object on the fly with the data
            for (int i = 0; i < dataReader.FieldCount; i++)
            {
                instance.Add(dataReader.GetName(i), dataReader[i]);
            }

            // Add the object to the results list
            results.Add(instance);
        }

        if (dataReader != null)
        {
            try
            {
                dataReader.Close();
            }
            catch { }
        }
    }

    return results;
}
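If closing the reader does not cure the intermittent drops, another commonly suggested mitigation (an assumption here, not something verified against this exact setup) is to keep pooling enabled but limit how long pooled connections live, so that connections the server has already dropped (for example via wait_timeout) are not reused. A minimal connection-string sketch; the option names are standard MySql.Data keys, but the server, credentials and values are placeholders:
// Placeholder connection string; adjust server, database and credentials.
// "Connection LifeTime" (seconds) retires pooled connections before the
// server can kill them; tune the value to your server configuration.
var connectionString =
    "Server=localhost;Database=mydb;Uid=root;Pwd=secret;" +
    "Pooling=true;Connection LifeTime=300;";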

NHibernate transaction locks a table

I have developed a WCF API which is using NHibernate. I am new to this. I have used session.Update to take care of the transaction. I have a for loop in which, based on a select condition, I am updating a record, i.e. if A is present in table1 then I am updating the table, else inserting a new entry.
I am getting "could not execute query." when trying to execute a select query on a table which was previously updated by adding a new entry.
What I think is: because I am using session.Save(table1) and then trying to select entries from that table, I am getting an error. Since session.Save temporarily locks the table, I am not able to execute a select query on it.
What could be the solution to this?
Update:
This is the for loop I am using to check the database for some field:
using (ITransaction tranx = session.BeginTransaction())
{
    savefunction();
    tranx.Commit();
}
Save function:
public void savefunction()
{
    for (int i = 0; i < dictionary.Count; i++)
    {
        ICandidateAttachmentManager candidateAttach = new ManagerFactory().GetCandidateAttachmentManager();
        CandidateAttachment attach = new CandidateAttachment();
        attach = checkCV();

        if (attach == null)
        {
            // insert new entry into table attach
            session.Save(attach);
        }
    }
}
checkCV function:
public void checkCV()
{
    using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
    {
        IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);

        if (lstCandidateAttachment.Count > 0)
        {
            CandidateAttachment attach = lstCandidateAttachment.Where(x => x.CandidateAttachementType.Id.Equals(FileType)).FirstOrDefault();

            if (attach != null)
            {
                return null;
            }
            else
            {
                return "some string";
            }
        }
    }
}
What is happening here: in the for loop, if say for i=2 the attach value comes back null, I am entering a new entry into the attach table. Then for i=3, when it enters the checkCV function, I get an error at this line:
IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
I think it is because I am using session.Save and then trying to read the table contents, so I am unable to execute the query and the table is locked until I commit my session. Between BeginTransaction and Commit, the table associated with the object is locked. How can I achieve what I want? Any ideas?
Update:
I read up on some of the posts. It looks like I need to set the isolation level for the transaction, but even after adding it, it doesn't seem to work. Here is how I tried to implement it:
using (ITransaction tranx = session.BeginTransaction(IsolationLevel.ReadUncommitted))
{
    saveDocument();
}
Something I don't understand in your code is where you get your NHibernate session.
Indeed you use
new ManagerFactory().GetCandidateAttachmentManager();
and
using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
So your ManagerFactory class provides you with the ISession?
Then you do:
CandidateAttachment attach = new CandidateAttachment();
attach = checkCV();
but checkCV() returns either a null or a string?
Finally you should never do
Save()
but instead
SaveOrUpdate()
Hope that helps you resolve your issue.
Feel free to give more details.
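For illustration only, a minimal sketch of the pattern suggested above, assuming savefunction and the lookup share the same ISession (BuildAttachmentIfMissing is a hypothetical helper standing in for the checkCV logic):
using (ITransaction tranx = session.BeginTransaction())
{
    for (int i = 0; i < dictionary.Count; i++)
    {
        CandidateAttachment attach = BuildAttachmentIfMissing(); // hypothetical helper
        if (attach != null)
        {
            session.SaveOrUpdate(attach); // insert or update, as suggested above
            session.Flush();              // push the insert so later lookups in this
                                          // same session/transaction can see the row
        }
    }

    tranx.Commit();
}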

Dynamics CRM SDK: Execute Multiple Requests for Bulk Update of around 5000 Records

I have written a function to update the Default Price List for all the Active Products on CRM 2013 Online.
// The method takes an IOrganizationService and the total number of records to be updated as input
private void UpdateMultipleProducts(IOrganizationService service, int batchSize, EntityCollection UpdateProductsCollection, Guid PriceListGuid)
{
    // To execute the request we have to add the Microsoft.Xrm.Sdk of the latest SDK as reference
    ExecuteMultipleRequest req = new ExecuteMultipleRequest();
    req.Requests = new OrganizationRequestCollection();
    req.Settings = new ExecuteMultipleSettings();
    req.Settings.ContinueOnError = true;
    req.Settings.ReturnResponses = true;

    try
    {
        foreach (var entity in UpdateProductsCollection.Entities)
        {
            UpdateRequest updateRequest = new UpdateRequest { Target = entity };
            entity.Attributes["pricelevelid"] = new EntityReference("pricelevel", PriceListGuid);
            req.Requests.Add(updateRequest);
        }

        var res = service.Execute(req) as ExecuteMultipleResponse; // Execute the collection of requests
    }
    // If the batch size exceeds 1000, a fault will be thrown. In the catch block, divide the records into allowed batch sizes and retry.
    catch (FaultException<OrganizationServiceFault> fault)
    {
        if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
        {
            var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);

            int remainingCreates = batchSize;
            while (remainingCreates > 0)
            {
                var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
                UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
                remainingCreates -= recordsToCreate;
            }
        }
    }
}
Code description: There are around 5000 active product records in the system, so I am updating the Default Price List for all of them using the above code.
But I am missing something here, because it has updated only 438 records. It loops through the while statement correctly, but it is not updating all of them.
What should the batch size be when we run this function for the first time?
Can anyone help me here?
Thank you,
Mittal.
You pass remainingCreates as the batchSize parameter, but your code never uses batchSize to limit how many entities go into the request (the foreach always adds the whole UpdateProductsCollection), so every recursive call hits the same MaxBatchSize fault and you just keep re-entering that while loop.
Also, I'm not sure how you are doing the rest of your error handling, but you need to update your catch block so that it doesn't silently swallow FaultExceptions that don't contain a MaxBatchSize value. Right now, if you get a FaultException about something other than batch size, it is simply ignored.
{
    if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
    {
        var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);

        int remainingCreates = batchSize;
        while (remainingCreates > 0)
        {
            var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
            UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
            remainingCreates -= recordsToCreate;
        }
    }
    else throw;
}
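To act on the first point, the foreach also has to honour the batch size. A minimal sketch of that idea (the skip offset and the LINQ Skip/Take calls are additions for illustration, not part of the original code):
// Requires using System.Linq for Skip/Take.
// Only add one batch worth of entities to this request; 'skip' is a
// hypothetical offset tracking how many records earlier calls handled.
foreach (var entity in UpdateProductsCollection.Entities.Skip(skip).Take(batchSize))
{
    entity.Attributes["pricelevelid"] = new EntityReference("pricelevel", PriceListGuid);
    req.Requests.Add(new UpdateRequest { Target = entity });
}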
Instead of reactive handling, I prefer proactive handling of the MaxBatchSize; this works when you already know what the MaxBatchSize is.
Following is sample code: while adding each OrgRequest to the collection I keep count of the batch, and when it reaches the limit I call Execute and reset the collection to take a fresh batch.
// Assumes req is an ExecuteMultipleRequest prepared as in the question
// (req.Requests = new OrganizationRequestCollection() and suitable Settings).
ExecuteMultipleResponse responseWithResults = null;

foreach (DataRow dr in statusTable.Rows)
{
    Entity updEntity = new Entity("ABZ_NBA");
    updEntity["ABZ_NBAid"] = query.ToList()
        .Where(a => a.NotificationNumber == dr["QNMUM"].ToString())
        .FirstOrDefault().TroubleTicketId;
    //updEntity["ABZ_makerfccall"] = false;
    updEntity["ABZ_rfccall"] = null;
    updEntity[cNBAttribute.Key] = dr["test"];
    req.Requests.Add(new UpdateRequest() { Target = updEntity });

    // Send a full batch and start a fresh one.
    if (req.Requests.Count == 1000)
    {
        responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
        req.Requests = new OrganizationRequestCollection();
    }
}

// Send whatever is left over in the final, partial batch.
if (req.Requests.Count > 0)
{
    responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
}
