I have a collection of Employees that I stored inside MemoryCache. Now, after updating a particular employee's details, I want to update the same cached object with the new details. Something like this:
var AllEmployees= cache.Get("AllEmployees");
if(AllEmployees != null)
{
var empToUpdate = AllEmployees.FirstOrDefault(i => i.EmpId== Employee.EmpId);
if (empToUpdate != null)
{
AllEmployees.Remove(empToUpdate);
AllEmployees.Add(empToUpdate);
}
}
But since the cache is a MemoryCache and what it stores is an IEnumerable<Employee>, I am not able to manipulate the cached object directly. I cannot call FirstOrDefault, Remove, or Add on it.
As mentioned in my comment, I can't see a reason not to use FirstOrDefault.
IEnumerables are not meant to be modified. So you have to take a heavier, less performance-friendly route: create a new IEnumerable without the specific item (Enumerable.Except with an array of items to exclude) and then Concat it with another sequence that contains the element you want to add.
var empToUpdate = AllEmployees.FirstOrDefault(i => i.EmpId== Employee.EmpId);
if (empToUpdate != null)
{
AllEmployees = AllEmployees.Except(new[]{empToUpdate}).Concat(new[] { empToUpdate});
}
Anyway, I don't see the point of this code, because you remove the object and then immediately add it back again.
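If mutating the cached collection really is the goal, an alternative is to cache a List<Employee> rather than a bare IEnumerable<Employee>, update the item, and write the list back. A minimal sketch, assuming System.Runtime.Caching.MemoryCache and an Employee class with an EmpId property as in the question; updatedEmployee stands in for the employee you just modified:

var cache = MemoryCache.Default;
var allEmployees = cache.Get("AllEmployees") as List<Employee>;
if (allEmployees != null)
{
    // Replace the stale entry with the updated one (updatedEmployee is an assumed name)
    var index = allEmployees.FindIndex(e => e.EmpId == updatedEmployee.EmpId);
    if (index >= 0)
    {
        allEmployees[index] = updatedEmployee;
    }
    // Write the list back so the expiration is refreshed (optional)
    cache.Set("AllEmployees", allEmployees, DateTimeOffset.Now.AddMinutes(10));
}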
Please excuse my limited knowledge of Entity Framework/C#. I'm sure there is an easy way to do what I wish to do.
I want to iterate through an object that I received from the database in order to check which fields are different from my update DTO. I wish to do this so only the changed fields are updated in the objFromDb. I am aware that if I only change the fields in the objFromDb that are modified, EF will only update the changed fields. I'm not sure if this is necessary or if there is a better way to do it.
Here is my update code (works but not the way I want it to) with the individual properties as well as the commented code that I was trying to accomplish. I don't like hard-coding the individual properties as this will require maintenance in the event that the object is changed.
public T_IFS_EmployeeDTO Update(T_IFS_EmployeeDTO ojbDTO)
{
var objFromDb = _db.T_IFS_Employee.FirstOrDefault(u => u.EmployeeID == ojbDTO.EmployeeID);
if (objFromDb != null)
{
//foreach (var prop in objFromDb.GetType().GetProperties(BindingFlags.Instance))
//{
// if (prop.GetValue(objFromDb) != prop.GetValue(ojbDTO))
// {
// objFromDb.prop.GetValue(objFromDb) = prop.GetValue(ojbDTO);
// }
//}
if (objFromDb.FirstName != ojbDTO.FirstName) objFromDb.FirstName = ojbDTO.FirstName;
if (objFromDb.LastName != ojbDTO.LastName) objFromDb.LastName = ojbDTO.LastName;
if (objFromDb.UserName != ojbDTO.UserName) objFromDb.UserName = ojbDTO.UserName;
if (objFromDb.Password != ojbDTO.Password) objFromDb.Password = ojbDTO.Password;
if (objFromDb.AccessLevel != ojbDTO.AccessLevel) objFromDb.AccessLevel = ojbDTO.AccessLevel;
_db.T_IFS_Employee.Update(objFromDb);
_db.SaveChanges();
return _mapper.Map<T_IFS_Employee, T_IFS_EmployeeDTO>(objFromDb);
}
return ojbDTO;
}
I'm sure there is an easy way to do this but I haven't been able to figure it out. I do appreciate your time.
-Edit-
I think the following will work, but will EF know when a field has not been modified? And is it possible that it is as simple as this:
var objFromDb = _db.T_IFS_Employee.FirstOrDefault(u => u.EmployeeID == ojbDTO.EmployeeID);
var objFromCall = _mapper.Map<T_IFS_EmployeeDTO, T_IFS_Employee>(ojbDTO);
if (objFromDb != null)
{
objFromDb = objFromCall;
}
Entity Framework Core compares the current values of your entity against a snapshot of what they were when you loaded it. See this for the details.
So you should be able to do:
var objFromDb = _db.T_IFS_Employee.FirstOrDefault(u => u.EmployeeID == ojbDTO.EmployeeID);
if (objFromDb != null)
{
_mapper.Map<T_IFS_EmployeeDTO, T_IFS_Employee>(ojbDTO, objFromDb);
//This overload of .Map sets properties in an existing object, as
//opposed to creating a new one
}
This will overwrite all properties in objFromDb with values from ojbDTO, but only the ones that are different will be written to the database when you call SaveChanges().
Setting objFromDb to objFromCall just replaces your local reference; the new object is not the entity loaded from the database, so it won't be tracked at all.
And there's no need to call .Update() if you received the object from the DbContext and you haven't disabled change tracking.
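Putting that together, the whole update method can collapse to something like this. It is only a sketch reusing the names from the question (_db, _mapper, T_IFS_Employee, T_IFS_EmployeeDTO); the key points are mapping onto the tracked entity and skipping the .Update() call:

public T_IFS_EmployeeDTO Update(T_IFS_EmployeeDTO ojbDTO)
{
    var objFromDb = _db.T_IFS_Employee.FirstOrDefault(u => u.EmployeeID == ojbDTO.EmployeeID);
    if (objFromDb != null)
    {
        // Map onto the tracked entity; the change tracker works out which columns actually changed
        _mapper.Map<T_IFS_EmployeeDTO, T_IFS_Employee>(ojbDTO, objFromDb);
        _db.SaveChanges(); // no .Update() needed for an entity the context is already tracking
        return _mapper.Map<T_IFS_Employee, T_IFS_EmployeeDTO>(objFromDb);
    }
    return ojbDTO;
}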
I think what I want to do is not possible.
However, I would like to make sure that it is indeed not possible.
I use Audit.Net as a framework to add auditing to a system that has already been completed.
There are many different objects that may be sent for auditing. I have to get hold of the properties of those objects and send them to the database. In some cases I want both the old values and the new values, in which case I use the Target property of AuditEvent; otherwise, if I just need the new values, I use the CustomField property.
Is there any way to make the following more generic, so that I don't have to repeat these lines for each type of object like SimpleResult, LeaveRequest, Incident etc?
There is unfortunately no commonality between the objects being audited.
SimpleResult objOld = JsonConvert.DeserializeObject<SimpleResult>(auditEvent.Target.SerializedOld.ToString());
SimpleResult objNew = JsonConvert.DeserializeObject<SimpleResult>(auditEvent.Target.SerializedNew.ToString());
if (auditEvent.Target.Type.ToString() == "SimpleResult")
{
InsertTargetObjectFields<SimpleResult>(objOld, objNew, auditControlTableID, auditEvent);
}
Here is where I get hold of the properties and send them off to the database:
public void InsertTargetObjectFields<T>(T objOld, T objNew, int? auditControlTableID, AuditEvent auditEvent)
{
using (ESSDataContext ess_context = new ESSDataContext())
{
try
{
foreach (var property in objOld.GetType().GetProperties().Where(property => !property.GetGetMethod().GetParameters().Any()))
{
//Check for null values and get hold of oldValue and newValue
var sqlResult = ess_context.InsertAuditTable(resourceTag, dbObjectName, username, property.Name, oldValue,
newValue, auditEvent.StartDate, auditControlTableID.ToString(),
auditEvent.Environment.CallingMethodName);
}
}
}
}
I've tried using dynamic, but then I don't get the properties correctly.
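For what it's worth, one way to avoid repeating the block per type is to resolve the audited type at runtime and drop the generic parameter, since InsertTargetObjectFields only uses reflection anyway. A rough sketch under the question's assumptions (Newtonsoft.Json, auditEvent.Target holding the type name and the serialized old/new values); the Type.GetType lookup is an assumption here and may need assembly-qualified names or a name-to-Type dictionary:

// Resolve the audited type from its name instead of hard-coding SimpleResult, LeaveRequest, Incident, ...
Type auditedType = Type.GetType(auditEvent.Target.Type.ToString());

object objOld = JsonConvert.DeserializeObject(auditEvent.Target.SerializedOld.ToString(), auditedType);
object objNew = JsonConvert.DeserializeObject(auditEvent.Target.SerializedNew.ToString(), auditedType);

InsertTargetObjectFields(objOld, objNew, auditControlTableID, auditEvent);

InsertTargetObjectFields can then be declared with object parameters instead of <T>; its body (objOld.GetType().GetProperties() and so on) does not need to change.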
I'm looking at a problem where I wish to get a collection from an expensive service call and then store it in cache so it can be used for subsequent operations on the UI. The code I'm using is as follows:
List<OrganisationVO> organisations = (List<OrganisationVO>)MemoryCache.Default["OrganisationVOs"];
List<Organisation> orgs = new List<Organisation>();
if (organisations == null)
{
organisations = new List<OrganisationVO>();
orgs = pmService.GetOrganisationsByName("", 0, 4000, ref totalCount);
foreach (Organisation org in orgs)
{
OrganisationVO orgVO = new OrganisationVO();
orgVO = Mapper.ToViewObject(org);
organisations.Add(orgVO);
}
MemoryCache.Default.AddOrGetExisting("OrganisationVOs", organisations, DateTime.Now.AddMinutes(10));
}
List<OrganisationVO> data = new List<OrganisationVO>();
data = organisations;
if (!string.IsNullOrEmpty(filter) && filter != "*")
{
data.RemoveAll(filterOrg => !filterOrg.DisplayName.ToLower().StartsWith(filter.ToLower()));
}
The issue I'm facing is that the data.RemoveAll operation affects the cached version, but I want the cached version to always reflect the full dataset returned by the service call. I then want to retrieve this collection from the cache whenever the filter is set and apply the filter, without changing the cached data; subsequent filters should run against the full dataset. What is the best way to do this?
You need to make a copy of the list if you want to use the RemoveAll operation (ToList would be enough).
Also, instead of modifying the list, consider using LINQ operations like Where/Select.
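A minimal sketch of that, using the names from the question; ToList copies the cached list, so RemoveAll no longer touches the cached version:

List<OrganisationVO> data = organisations.ToList(); // copy; the cached list stays complete
if (!string.IsNullOrEmpty(filter) && filter != "*")
{
    data.RemoveAll(o => !o.DisplayName.ToLower().StartsWith(filter.ToLower()));
}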
I would either:
apply the filter dynamically and replace the filter if needed (so you cache the complete data but only return cachedData.Where(currentFilter)); see the sketch after this list
make two caches - one for the complete data and one for the filtered data - in this case the first one should only consist of the data returned from the service - no need to cache the VO-data as well
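A rough sketch of the first option, with currentFilter as an assumed name: keep the complete list in the cache and apply the current filter only when reading it.

// The cached list is never modified; only the read-side view changes
Func<OrganisationVO, bool> currentFilter =
    o => string.IsNullOrEmpty(filter) || filter == "*"
         || o.DisplayName.StartsWith(filter, StringComparison.OrdinalIgnoreCase);

var cachedData = (List<OrganisationVO>)MemoryCache.Default["OrganisationVOs"];
List<OrganisationVO> view = cachedData.Where(currentFilter).ToList();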
The following code works, but only because I end up creating a deep clone of the suppliers. If I do not perform a deep clone, I get an error suggesting that the supplier objects have changed and the attempt to amend the supplier table has failed. This only happens if the loop foreach (Supplier supplier in exceptions) is run. Oddly, this occurs irrespective of whether the Delete() method is executed. Why does this happen? I have posted the working code below for your inspection. As I say, if you loop without deep cloning then it does not work... Any ideas?
public void DeleteSuppliers(IList<Supplier> suppliers, Int32 parentID)
{
// If a supplier has been deleted on the form we need to delete from the database.
// Get the suppliers from the database
List<Supplier> dbSuppliers = Supplier.FindAllByParentID(parentID);
// So return any suppliers that are in the database that are not now on this form
IEnumerable<Supplier> results = dbSuppliers.Where(f => !suppliers.Any(d => d.Id == f.Id));
IList<Supplier> exceptions = null;
// code guard
if (results != null)
{
// cast as a list
IList<Supplier> tempList = (IList<Supplier>)results.ToList();
// deep clone otherwise there would be an error
exceptions = (IList<Supplier>)ObjectHelper.DeepClone(tempList);
// explicit clean up
tempList = null;
}
// Delete the exceptions from the database
if (exceptions != null)
{
// walk the suppliers that were deleted from the form
foreach (Supplier supplier in exceptions)
{
// delete the supplier from the database
supplier.Delete();
}
}
}
I think the error is about the collection being enumerated having changed. You're not allowed to change the collection being enumerated by a foreach statement (or anything that enumerates an IEnumerable, if I recall correctly).
But if you make a clone then the collection you're enumerating is separate from the collection being affected by the Delete.
Have you tried a shallow copy? I would think that would work just as well. A shallow copy could be created with ToArray.
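For example, a minimal sketch of that suggestion: enumerate a copy, delete from the originals.

// Enumerate a shallow copy, so the deletes don't touch the sequence being iterated
foreach (Supplier supplier in exceptions.ToArray())
{
    supplier.Delete();
}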
I resolved the issue by reordering the execution flow. Originally, this piece of code was execute last. The error went away when I executed it first.
I need to loop through the properties of a custom object type that I'm getting back from the database and only show the columns that contain data.
This means I cannot simply bind the list of objects to the datagrid.
I don't want to loop through each object, check whether the column is empty/null, and decide in the UI whether to display it.
What I'm thinking is that in my business layer, before I send the object back, I would send back an IEnumerable with only those columns that should be visible. Thus I was thinking of using LINQ to Objects to do this, but I'm not sure that would be very pretty.
Does anyone know of a solution that avoids a ton of if statements for checking a large object (30 or so columns) to determine what should be shown?
foreach (CustomerData customerdata in Customers)
{
    if (customerdata.address.Equals(""))
    {
        dgvCustomerData.Columns["Address"].Visible = false;
    }
    //Continue checking other data columns...
}
I wish to avoid all of this in the UI and all the IFs...
I'm having a brain fart on this one can anyone help me?
Thanks
You could do the following to simplify it a bit
Action<string, string> del = (value, name) => {
    if (string.IsNullOrEmpty(value)) {
        dgvCustomerData.Columns[name].Visible = false;
    }
};

foreach (var data in Customers) {
    del(data.address, "Address");
    del(data.name, "Name");
    ...
}
Take a look at the .NET reflection libraries. You can use reflection to get hold of all of an object's properties and loop through them to find out whether they are null or not. Then you could return a collection of KeyValuePair objects where Key = property name and Value = true/false. You'd then use the KeyValuePairs to set column visibility...
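A rough sketch of that reflection idea, with illustrative names (PropertyHasData is not from the question) and the assumption that grid column names match property names:

// For each readable, non-indexed property, record whether any item has a non-empty value for it
static Dictionary<string, bool> PropertyHasData<T>(IEnumerable<T> items)
{
    var result = new Dictionary<string, bool>();
    foreach (var prop in typeof(T).GetProperties().Where(p => p.CanRead && p.GetIndexParameters().Length == 0))
    {
        result[prop.Name] = items.Any(item =>
        {
            var value = prop.GetValue(item, null);
            return value != null && value.ToString() != "";
        });
    }
    return result;
}

// Usage: hide the columns whose property is empty for every customer
foreach (var pair in PropertyHasData(Customers))
{
    if (dgvCustomerData.Columns.Contains(pair.Key))
        dgvCustomerData.Columns[pair.Key].Visible = pair.Value;
}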