I'm looking at a problem where I wish to get a collection from an expensive service call and then store it in cache so it can be used for subsequent operations on the UI. The code I'm using is as follows:
List<OrganisationVO> organisations = (List<OrganisationVO>)MemoryCache.Default["OrganisationVOs"];
List<Organisation> orgs = new List<Organisation>();

if (organisations == null)
{
    organisations = new List<OrganisationVO>();
    orgs = pmService.GetOrganisationsByName("", 0, 4000, ref totalCount);
    foreach (Organisation org in orgs)
    {
        OrganisationVO orgVO = Mapper.ToViewObject(org);
        organisations.Add(orgVO);
    }
    MemoryCache.Default.AddOrGetExisting("OrganisationVOs", organisations, DateTime.Now.AddMinutes(10));
}

List<OrganisationVO> data = new List<OrganisationVO>();
data = organisations;

if (!string.IsNullOrEmpty(filter) && filter != "*")
{
    data.RemoveAll(filterOrg => !filterOrg.DisplayName.ToLower().StartsWith(filter.ToLower()));
}
The issue I'm facing is that the data.RemoveAll operation also affects the cached version. I want the cached version to always reflect the full dataset returned by the service call; whenever the filter is set I want to retrieve this collection from cache and apply the filter to it, but without changing the cached data, so that subsequent filters operate on the full dataset. What is the best way to do this?
You need to make a copy of the list if you want to use the RemoveAll operation (a ToList() call would be enough). Also, instead of modifying the list, consider non-mutating LINQ operations like Where/Select.
I would either:
apply the filter dynamically and replace it when needed - so you cache the complete data but only ever return cachedData.Where(currentFilter) (see the sketch below this list)
keep two cache entries - one for the complete data and one for the filtered data - in this case the first one should only contain the data returned from the service; there is no need to cache the VO data as well
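A minimal sketch of the first option, assuming the same cache key and VO type as in the question (currentFilter stands in for whatever filter string is active):

// The cache always holds the full list; every caller filters a copy.
List<OrganisationVO> cached = (List<OrganisationVO>)MemoryCache.Default["OrganisationVOs"];

List<OrganisationVO> data = string.IsNullOrEmpty(currentFilter) || currentFilter == "*"
    ? cached.ToList() // shallow copy, so a later RemoveAll cannot touch the cached list
    : cached.Where(o => o.DisplayName.StartsWith(currentFilter, StringComparison.OrdinalIgnoreCase))
            .ToList();

Where and ToList both build a new list, so the cached instance is never mutated no matter what the caller does with data afterwards.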
I am using the Seed() method of the Configuration.cs class to fill the database with data when running the Update-Database command.
Among other things, I am creating a list of EventCategory objects like this:
private IList<EventCategory> CreateEventCategoriesTestDefinition()
{
    eventCategories = new List<EventCategory>();

    var eventCategoryRecruitment = new EventCategory("Recruitment");
    eventCategories.Add(eventCategoryRecruitment);

    var eventCategoryInternship = new EventCategory("Internship");
    eventCategories.Add(eventCategoryInternship);

    var eventCategoryTrainingPrograms = new EventCategory("Training Programs");
    eventCategoryTrainingPrograms.Events
        .Add(new Event("Managerial Training Program 2012-2014", eventCategoryTrainingPrograms));
    eventCategories.Add(eventCategoryTrainingPrograms);

    var eventCategoryEmployee = new EventCategory("Employee & Team Potential");
    eventCategories.Add(eventCategoryEmployee);

    return eventCategories;
}
Adding elements one by one. eventCategories is just a private field:
private IList<EventCategory> eventCategories;
From the Seed() method I am calling CreateEventCategoriesTestDefinition().
Almost everything is good, but when I go to the database to check the data I have noticed that the rows in the EventCategory table are not in the correct order:
As you can see in the picture, Internship and Training Programs switched positions compared to the order of adding in the CreateEventCategoriesTestDefinition() method.
Does anybody know what is happening here? Why is the order of adding not preserved? I know it should be preserved in a List<>, but is that not the same for IList<>?
Or maybe this has something to do with Entity Framework?
If you are relying upon the database for your sorting order, then either:
Turn auto-id incrementation off and specify your own ID
EventCategory(int id, string name)
If you have to use database identity then instead try using a sort order (int) column for your objects
EventCategory(string name, int sortOrder)
Either way, you cannot guarantee that rows will come back from the database in insertion order. I can't say for certain what your use case is, but you shouldn't rely on SQL to sort your objects for you; when querying the database, use LINQ to order them before binding to a view.
e.g.
var mySortedCategories = dbContext.EventCategories.OrderBy(x => x.Name).ToList();
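A minimal sketch of the second option above, assuming an EF code-first entity (the SortOrder property and the constructor shown are hypothetical additions):

public class EventCategory
{
    public int Id { get; set; }          // database identity; insert order is not guaranteed
    public string Name { get; set; }
    public int SortOrder { get; set; }   // explicit ordering column

    protected EventCategory() { }        // parameterless constructor for EF materialization

    public EventCategory(string name, int sortOrder)
    {
        Name = name;
        SortOrder = sortOrder;
    }
}

// Later, order by the explicit column instead of relying on insert order:
var categories = dbContext.EventCategories.OrderBy(x => x.SortOrder).ToList();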
I have a collection of Employees that I stored inside a MemoryCache. Now, after updating a particular employee's details, I want to update the same cached object with the updated details. Something like this:
var AllEmployees = cache.Get("AllEmployees");
if (AllEmployees != null)
{
    var empToUpdate = AllEmployees.FirstOrDefault(i => i.EmpId == Employee.EmpId);
    if (empToUpdate != null)
    {
        AllEmployees.Remove(empToUpdate);
        AllEmployees.Add(empToUpdate);
    }
}
but since the cache is a MemoryCache and it has cached an IEnumerable<Employee>, I am not able to manipulate the cached object directly. I am not able to call the FirstOrDefault, Remove, or Add methods.
As mentioned in my comment, I can't see a reason not to use FirstOrDefault.
IEnumerables are not meant to be modified. So you have to go a heavyweight, performance-unfriendly way: create a new IEnumerable without the specific item (IEnumerable.Except(array of items to exclude)) and then concatenate it with another sequence that contains the new element to add.
var empToUpdate = AllEmployees.FirstOrDefault(i => i.EmpId == Employee.EmpId);
if (empToUpdate != null)
{
    AllEmployees = AllEmployees.Except(new[] { empToUpdate }).Concat(new[] { empToUpdate });
}
Anyway, I don't see the sense in this code, because you are removing the object and immediately adding it back again.
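If you control what goes into the cache, a simpler route is to cache a List<Employee> instead of a bare IEnumerable<Employee>. A sketch, assuming the same cache key as the question and a hypothetical updatedEmployee variable holding the new details:

var allEmployees = cache.Get("AllEmployees") as List<Employee>;
if (allEmployees != null)
{
    // Swap the stale copy for the updated one in place.
    int index = allEmployees.FindIndex(e => e.EmpId == updatedEmployee.EmpId);
    if (index >= 0)
    {
        allEmployees[index] = updatedEmployee;
    }

    // Re-insert so the cache entry (and its expiration policy) is refreshed.
    cache.Set("AllEmployees", allEmployees, DateTimeOffset.Now.AddMinutes(10));
}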
I need to be able to write a logic that helps me export/import the whole database. Of course, the ids should be ignored when doing this so if I export the data, and then import it - the whole graph should be cloned.
The idea was to use simple binary serialization without any custom code, so I could serialize any object graph I want. But I ran into an NHibernate problem.
The thing is that this graph contains many objects that are actually different objects (different references) of the single persistent object. It is very difficult to fix that as the graph is very complex and I need to redo the whole application for that. So I have to live with this.
If I just save the whole graph to a file, then deserialize it and try to save it to the DB as-is, these objects will have some ids assigned, so NHibernate will probably fail. I need to clear the ids. But if I do this, NHibernate stops knowing about the identity of each object, so every object is transient and, of course, not equal to any other.
Example:
I have User {Id = 3}
and Mail {Id = 2, with User(Id = 3)}
The two User instances here have the same Id, so they are equal, but not reference-equal.
When I clear the Ids in this graph, both users become different objects, as they are not reference-equal.
I was thinking: can I somehow tell NHibernate that even though the objects have ids (!= 0), they are transient and should be inserted into the DB and receive new ids? Or maybe you know another way of solving my problem.
P.S. All the objects are detached - when I say they are persistent, I mean that they have Id != 0 and that they had copies in the DB before exporting (possibly in another DB).
Update
I have added an example of the code I want to work. The SaveOrUpdate calls at the end should insert exactly one object each run. The actual code is a bit more complicated, but the point is that I have a single hierarchy with s1 and s2 in it (two different objects which represent a single persistent object: their Equals() == true, but ReferenceEquals() == false) and I need to clone it, save it, and ensure that the resulting object is single in the database.
User s1;
User s2;

using (var session = DBHandler.GetSessionFactory().OpenSession())
{
    s1 = session.Get<User>(1);
}

using (var session = DBHandler.GetSessionFactory().OpenSession())
{
    s2 = session.Get<User>(1);
}

var c1 = (User)DBHandler.DeepClone(s1);
var c2 = (User)DBHandler.DeepClone(s2);

// These updates should insert only one object, because it is actually one object.
using (var session = DBHandler.GetSessionFactory().OpenSession())
{
    session.SaveOrUpdate(c1);
}

using (var session = DBHandler.GetSessionFactory().OpenSession())
{
    session.SaveOrUpdate(c2);
}
Same as here; a proof of concept:
Configuration config;
ISessionFactory factory;

public object DeepClone(object original)
{
    var metadata = factory.GetClassMetadata(original.GetType());

    // Instantiate the copy with the unsaved-value id (0 here, or extract the
    // unsaved value from the configuration) so NHibernate treats it as transient.
    var clone = metadata.Instantiate(0, EntityMode.Poco);

    var values = metadata.GetPropertyValues(original, EntityMode.Poco);
    for (int i = 0; i < metadata.PropertyTypes.Length; i++)
    {
        if (metadata.PropertyTypes[i].IsAssociationType && values[i] != null)
        {
            // Recursively clone associated entities as well.
            values[i] = DeepClone(values[i]);
        }
        if (metadata.PropertyTypes[i].IsCollectionType)
        {
            // TODO: Copy Collection
        }
    }

    metadata.SetPropertyValues(clone, values, EntityMode.Poco);
    return clone;
}
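For the collection TODO, one possible direction - a sketch only, not tested, assuming the mapped collections are IList-based and contain entities:

// Inside the IsCollectionType branch above:
var originalList = values[i] as System.Collections.IList;
if (originalList != null)
{
    var clonedList = new System.Collections.ArrayList();
    foreach (var element in originalList)
    {
        clonedList.Add(DeepClone(element)); // each element gets the same transient treatment
    }
    values[i] = clonedList;
}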
I have a console application with a few methods that:
insert data1 (customers) from db 1 to db 2
update data1 from db 1 to db 2
insert data2 (contacts) from db 1 to db 2
update data2 from db 1 to db 2
and then move some data from db 2 (accessed via web services) to db 1 (MySQL). The methods are run when the application executes.
With these inserts and updates I need to compare a field (country state) with a value in a list I get from a web service. To get the states I have to do:
GetAllRecord getAllStates = new GetAllRecord();
getAllStates.recordType = GetAllRecordType.state;
getAllStates.recordTypeSpecified = true;
GetAllResult stateResult = _service.getAll(getAllStates);
Record[] stateRecords = stateResult.recordList;
and I can then loop through the array and look up the shortname by fullName with:
if (stateResult.status.isSuccess)
{
    foreach (State state in stateRecords)
    {
        if (addressState.ToUpper() == state.fullName.ToUpper())
        {
            addressState = state.shortname;
        }
    }
}
As it is now, I have the code above in all my methods, but it takes a lot of time to fetch the state data and I have to do it many times (about 40k records, and the web service only lets me get 1k at a time, so I have to use a "searchNext" method 39 times, meaning that I query the web service 40 times for the states in each method).
I guess I could try to come up with something, but I'm just checking what best practice would be. If I create a separate method or class, how can I access this list with all its values many times without having to download them again?
Edit: should I do something like this:
GetAllRecord getAllStates = new GetAllRecord();
getAllStates.recordType = GetAllRecordType.state;
getAllStates.recordTypeSpecified = true;

GetAllResult stateResult = _service.getAll(getAllStates);
Record[] stateRecords = stateResult.recordList;

Dictionary<string, string> allStates = new Dictionary<string, string>();
foreach (State state in stateRecords)
{
    allStates.Add(state.shortname, state.fullName);
}
I am not sure where to put it, though, or how to access it from my methods.
One thing first: you should add a break to your code when you get a match. There is no need to keep looping through the foreach after you have found it:
addressState = state.shortname;
break;
40 thousand records isn't necessarily that much with today's computers, and I would definitely implement a cache of all the fullName <-> shortName pairs.
If the data doesn't change very often, this is a perfectly good approach.
Create a Dictionary with fullName as the key and shortName as the value. Then you can just do a lookup in the methods which need to translate the full name to the short name. You could either store this dictionary as a static variable accessible from other classes, or keep it in an instance class which you pass to your other objects as a reference.
If the data changes, you could refresh your cache every so often.
This way you only call the web service 40 times to get all the data, and all other lookups are in memory.
Code sample (not tested):
class MyCache
{
    public static Dictionary<string, string> Cache = new Dictionary<string, string>();

    public static void FillCache()
    {
        GetAllRecord getAllStates = new GetAllRecord();
        getAllStates.recordType = GetAllRecordType.state;
        getAllStates.recordTypeSpecified = true;

        GetAllResult stateResult = _service.getAll(getAllStates);
        Record[] stateRecords = stateResult.recordList;
        if (stateResult.status.isSuccess)
        {
            foreach (State state in stateRecords)
            {
                Cache[state.fullName.ToUpper()] = state.shortname;
            }
        }
        // ... and some code to do the rest of the web service calls ("searchNext")
        // until you have all the results.
    }
}
void Main()
{
    // initialize the cache
    MyCache.FillCache();
}
and in some method using it
...
string stateName = "something";
string shortName = MyCache.Cache[stateName.ToUpper()];
An easy way (and you really should do this) would be to cache the data locally. If I understand you correctly, you do the web service call every time something changes, which is likely unnecessary.
An easy implementation (if you can't or don't want to change your original data structures) would be to use a Dictionary somewhat like:
Dictionary<String, String> cache;
cache[addressState] = state.shortname;
BTW: you REALLY should not be using ToUpper for case-insensitive comparisons. Use String.Compare(a, b, StringComparison.OrdinalIgnoreCase) instead.
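In the same spirit, the dictionary itself can be made case-insensitive up front, which removes the need for ToUpper anywhere (a small sketch; the sample entry is made up):

// The comparer makes every key lookup case-insensitive.
var cache = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
cache["Western Australia"] = "WA"; // hypothetical sample entry

string shortName;
if (cache.TryGetValue("WESTERN AUSTRALIA", out shortName))
{
    // shortName == "WA" - no ToUpper needed on either side
}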
From what I gather, the first bit of code is inside some form of loop, because of which the following line (which internally calls the web service) is executed 40 times:
GetAllResult stateResult = _service.getAll(getAllStates);
Perhaps you should try moving the stateResult variable to class-level scope: make it a private field or something, so it will at least be there for the lifetime of the object. In the constructor of the class, or in a dedicated method, make the call to the web service. If you go with a method, make sure it is called once before you execute your loop logic.
That way you won't have to call the web service every time, just once.
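A small sketch of that idea; the StateTranslator class and the IStatesService parameter type are illustrative stand-ins for whatever class and service proxy type you already have:

class StateTranslator
{
    private readonly Record[] _stateRecords; // fetched once, reused by every method

    public StateTranslator(IStatesService service) // whatever concrete type _service is
    {
        var getAllStates = new GetAllRecord();
        getAllStates.recordType = GetAllRecordType.state;
        getAllStates.recordTypeSpecified = true;

        // One web service round-trip for the lifetime of this object.
        _stateRecords = service.getAll(getAllStates).recordList;
    }
}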
Is there a "best practice" way of handling bulk inserts (via LINQ) but discard records that may already be in the table? Or I am going to have to either do a bulk insert into an import table then delete duplicates, or insert one record at a time?
08/26/2010 - EDIT #1:
I am looking at the Intersect and Except methods right now. I am gathering up data from separate sources, converting it into a List, and I want to "compare" it to the target DB, then INSERT just the NEW records.
List<DTO.GatherACH> allACHes = new List<DTO.GatherACH>();
State.IState myState = null;
State.Factory factory = State.Factory.Instance;

foreach (DTO.Rule rule in Helpers.Config.Rules)
{
    myState = factory.CreateState(rule.StateName);
    List<DTO.GatherACH> stateACHes = myState.GatherACH();
    allACHes.AddRange(stateACHes);
}

List<Model.ACH> newRecords = new List<Model.ACH>(); // Create a disconnected "record set"...
foreach (DTO.GatherACH record in allACHes)
{
    var storeInfo = dbZach.StoreInfoes.Where(a => a.StoreCode == record.StoreCode && (a.TypeID == 2 || a.TypeID == 4)).FirstOrDefault();

    Model.ACH insertACH = new Model.ACH
    {
        StoreInfoID = storeInfo.ID,
        SourceDatabaseID = (byte)sourceDB.ID,
        LoanID = (long)record.LoanID,
        PaymentID = (long)record.PaymentID,
        LastName = record.LastName,
        FirstName = record.FirstName,
        MICR = record.MICR,
        Amount = (decimal)record.Amount,
        CheckDate = record.CheckDate
    };
    newRecords.Add(insertACH);
}
The above code builds the newRecords list. Now, I am trying to get the records from this list that are not in the DB by comparing on the 3-field unique index:
AchExceptComparer myComparer = new AchExceptComparer();
var validRecords = dbZach.ACHes.Intersect(newRecords, myComparer).ToList();
The comparer looks like:
class AchExceptComparer : IEqualityComparer<Model.ACH>
{
    public bool Equals(Model.ACH x, Model.ACH y)
    {
        return (x.LoanID == y.LoanID && x.PaymentID == y.PaymentID && x.SourceDatabaseID == y.SourceDatabaseID);
    }

    public int GetHashCode(Model.ACH obj)
    {
        // The hash must agree with Equals, so combine the same three key fields.
        // (base.GetHashCode() would hash the comparer itself and break set operations.)
        return obj.LoanID.GetHashCode() ^ obj.PaymentID.GetHashCode() ^ obj.SourceDatabaseID.GetHashCode();
    }
}
However, I am getting this error:
LINQ to Entities does not recognize the method 'System.Linq.IQueryable`1[MisterMoney.LARS.ZACH.Model.ACH] Intersect[ACH](System.Linq.IQueryable`1[MisterMoney.LARS.ZACH.Model.ACH], System.Collections.Generic.IEnumerable`1[MisterMoney.LARS.ZACH.Model.ACH], System.Collections.Generic.IEqualityComparer`1[MisterMoney.LARS.ZACH.Model.ACH])' method, and this method cannot be translated into a store expression.
Any ideas? And yes, this is completely in line with the original question. :)
You can't do bulk inserts with LINQ to SQL (I presume you were referring to LINQ to SQL when you said "LINQ"). However, based on what you're describing, I'd recommend checking out the new MERGE operator of SQL Server 2008.
Inserting, Updating, and Deleting Data by Using MERGE
Another example here.
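A sketch of what that MERGE could look like, assuming the rows were first bulk-loaded into a hypothetical ACH_Import staging table with the same columns as ACH, and that dbZach is a LINQ to SQL DataContext (an Entity Framework ObjectContext would use ExecuteStoreCommand instead):

// Insert only the rows whose 3-field key is not already in the target table.
dbZach.ExecuteCommand(@"
    MERGE INTO ACH AS target
    USING ACH_Import AS source
        ON  target.LoanID = source.LoanID
        AND target.PaymentID = source.PaymentID
        AND target.SourceDatabaseID = source.SourceDatabaseID
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (StoreInfoID, SourceDatabaseID, LoanID, PaymentID,
                LastName, FirstName, MICR, Amount, CheckDate)
        VALUES (source.StoreInfoID, source.SourceDatabaseID, source.LoanID, source.PaymentID,
                source.LastName, source.FirstName, source.MICR, source.Amount, source.CheckDate);");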
I recommend you just write the SQL yourself to do the inserting; I find it is a lot faster and you can get it to work exactly how you want. When I did something similar to this (just a one-off program), I used a Dictionary to hold the IDs I had already inserted, to avoid duplicates.
I find LINQ to SQL is good for a single record, or for a small set that spends its entire lifespan inside LINQ to SQL.
Or you can try using SQL Server 2008's BULK INSERT.
One thing to watch out for: if you queue more than 2000 or so records without calling SubmitChanges(), you can hit trouble. T-SQL has a limit on the number of statements per execution, so you cannot simply queue up every record and then call SubmitChanges() once, as this will throw a SqlException; you need to periodically flush the queue to avoid this.
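A sketch of that periodic flush, assuming the LINQ to SQL framing of this answer (the batch size is illustrative, and newRecords/dbZach are the names from the question):

const int batchSize = 1000; // stay well under the per-execution limit

for (int i = 0; i < newRecords.Count; i++)
{
    dbZach.ACHes.InsertOnSubmit(newRecords[i]);

    // Flush every batchSize records instead of queueing everything at once.
    if ((i + 1) % batchSize == 0)
    {
        dbZach.SubmitChanges();
    }
}
dbZach.SubmitChanges(); // flush the remainder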