When I select objects using LINQ, I seem to get cached results at first.
I use the following code to fetch an applicant from the DB on GET requests; assume this data is STATE_1:
using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // Fetch the user
    ApplicationData applicant = (from a in db.ApplicationDatas
                                 where a.userGUID == userGUID
                                 select a).SingleOrDefault();
}
and the following code to save changes to this record on POST requests; after SaveChanges is called, this record should be in STATE_2:
using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // Fetch the user
    var applicant = (from a in db.ApplicationDatas
                     where a.userGUID == userGUID
                     select a).SingleOrDefault();
    if (applicant != null)
    {
        // Save page 1 data
        ...
        applicant.lastUpdate = DateTime.Now;
        db.Entry(applicant).State = EntityState.Modified;
        DataBag.result = db.Entry(applicant).GetValidationResult();
        if (DataBag.result.ValidationErrors.Count == 0)
        {
            db.SaveChanges();
        }
    }
}
if (DataBag.result.ValidationErrors.Count == 0)
{
    return RedirectToAction("PageTwo");
}
The database properly saves STATE_2 (I can see it in the DB if I use a SQL inspection tool), but on subsequent page loads, STATE_1 is retrieved.
I see tons of results where people are having this issue, but no ideas on how to fix it.
UPDATE 1
I moved the RedirectToAction calls (all my returns) outside the using block to make sure that each DbContext's Dispose method gets called. That did not appear to solve the problem.
People are actually having this issue, so I'm going to leave this up; my issue ended up being Orchard CMS caching my module's pages.
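For anyone who hits genuine DbContext-level staleness rather than page-level caching, a minimal sketch of two ways to force a fresh read (assuming EF 4.1+ and the context and entity names from the question):

```csharp
using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // Option 1: bypass the context's identity map entirely, so no
    // previously materialized copy of the entity can be returned
    ApplicationData applicant = db.ApplicationDatas
        .AsNoTracking()
        .SingleOrDefault(a => a.userGUID == userGUID);

    // Option 2: if the entity is already tracked by this context,
    // force a round trip to overwrite it with current database values
    // db.Entry(applicant).Reload();
}
```

Neither of these would have helped here, since the stale data was served by the CMS output cache before the controller ran at all, but they are the usual first checks when EF appears to return cached results.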
Related
I am looking for the best way to cache the DB lookup tables, which consist of about 75 tables.
I want to cache these tables' data to use in my application so I won't open a connection to the DB each time I need them.
Here is what I am doing:
I have created a static class called MyApplicationCache, with a static property for each lookup table.
In each property getter I fill it with the intended data from the DB.
I cache the data using HttpRuntime.Cache["PropertyName"].
Each time I GET a lookup table's data I check whether HttpRuntime.Cache["PropertyName"] != null.
If it is not null I get the data from the cache; otherwise I get it from the DB.
Finally, I invoke all the properties in the application start event in global.asax.
Until now everything was good, but recently I've faced a performance issue that I can't solve. If I want a cached object (Payer) to be refreshed from the DB, I do this:
MyApplicationCache.Payer = null;
This sets HttpRuntime.Cache["Payer"] = null, so if I request it again it is reloaded from the DB:
List<Payer> payerList = MyApplicationCache.Payer;
Now the performance problem arises:
The payer list in the DB is about 1,700 records.
Each Payer object has a list property called PayerBranches, and filling it requires looping over the whole payer list and finding the PayerBranches for each item.
// MyApplicationCache Payer property:
public static List<LDM.DataEntityTier.Payer> Payer {
    get {
        if (HttpRuntime.Cache["Payer"] != null)
            return (List<LDM.DataEntityTier.Payer>)HttpRuntime.Cache["Payer"];

        // Request the item from its original source
        using (LDM.DataAccess.OracleManager OracleManager = new LDM.DataAccess.OracleManager()) {
            OracleManager.OpenConnection();
            List<LDM.DataEntityTier.Payer> result = new LDM.DataService.PayerService().GetPayersListWithFullName(3, OracleManager, "UTC");
            //List<LDM.DataEntityTier.Payer> result = new LDM.DataService.PayerService().GetListOfPayer("Order by Name asc", OracleManager, "UTC");
            List<PayerBranches> payerBranchesList = new LDM.DataService.PayerBranchesService().GetListOfObject(OracleManager, "UTC");
            OracleManager.CloseConnection();

            // Note: FindAll rescans the full branch list once per payer,
            // so this is O(payers x branches)
            foreach (Payer payerItem in result) {
                payerItem.PayerBranches = new List<PayerBranches>();
                foreach (PayerBranches item in payerBranchesList.FindAll(x => x.PayerID == payerItem.Id)) {
                    payerItem.PayerBranches.Add(item);
                }
            }

            // Add the item to the cache
            HttpRuntime.Cache["Payer"] = result;
            return result;
        }
    }
    set {
        if (value == null) {
            HttpRuntime.Cache.Remove("Payer");
        }
    }
}
This problem occurs with every property that has a list in it.
I don't know if there is a better way to cache data or if there is a problem in my code.
Is there a better way to do caching?
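One likely culprit is the nested FindAll scan in the getter, which walks the entire branch list once per payer. Grouping the branches by PayerID first turns the whole assignment into a single pass. A sketch of the inner loops rewritten with ToLookup, assuming the same result and payerBranchesList variables as the getter above and an int PayerID key:

```csharp
// Build the lookup in one pass over the branch list
ILookup<int, PayerBranches> branchesByPayer =
    payerBranchesList.ToLookup(b => b.PayerID);

// Indexing a lookup is cheap, and a missing key yields an empty sequence
foreach (Payer payerItem in result)
{
    payerItem.PayerBranches = branchesByPayer[payerItem.Id].ToList();
}
```

With ~1,700 payers this replaces roughly 1,700 full scans of the branch list with one, which should be the dominant cost in the current getter.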
I am trying to delete a SQL table record from C# through LINQ, but for some reason DeleteOnSubmit is not recognized. I'm not sure what I am missing here; please guide me the right way.
Here is my code:
proxy = new ServiceClient("basicHttp_IService");
//GatewayService.ServiceClient proxy = new ServiceClient("basicHttp_IService");
SearchCriteria criteria = new SearchCriteria();
criteria.UserRoles = new string[] { "*" };
var stories = proxy.GetStoryItemsByCriteria(criteria);
var programs = proxy.GetPrograms();
var Profiles = proxy.GetProfiles();
foreach (StoryProgram sp in lstStoriestoClose)
{
    try
    {
        DateTime LastUpdateTimestamp;
        DateTime CurrentTime = DateTime.Now;
        LastUpdateTimestamp = sp.Story.LastUpdatedOn;
        if ((CurrentTime - LastUpdateTimestamp).TotalHours > 24)
        {
            //Delete the story from database
            var storytoDelete = from story in stories
                                where story.Id == sp.Story.Id
                                select story;
            //I am trying to delete the record like below
            stories.DeleteOnSubmit(storytoDelete);
            stories.SubmitChanges();
            //IntelliSense doesn't show the options DeleteOnSubmit and SubmitChanges
        }
    }
    catch (Exception)
    {
        // continue with the next story
    }
}
Please guide me the right way to delete the record in SQL through LINQ.
DeleteOnSubmit is for single entities. Your query returns a collection of entities (granted, there may be only one entry, but it's still a collection). You can either use DeleteAllOnSubmit:
//Delete the story from database
var storytoDelete = from story in stories
                    where story.Id == sp.Story.Id
                    select story;
stories.DeleteAllOnSubmit(storytoDelete);
or explicitly extract one record:
//Delete the story from database
var storytoDelete = from story in stories
                    where story.Id == sp.Story.Id
                    select story;
stories.DeleteOnSubmit(storytoDelete.Single()); // or First(), depending on whether you expect more than one match
It looks like your service is returning a plain array rather than a live LINQ to SQL table, which is why it does not expose the methods you expect to see. You need to examine the return type and go from there.
You can always right-click the service reference and configure it to check/set the collection return type.
I just added delete functionality to the WCF service and passed the SQL record details to it to delete the record from SQL; that solved my problem.
Thanks all for the suggestions and advice.
foreach (StoryProgram sp in lstStoriestoClose)
{
    try
    {
        DateTime LastUpdateTimestamp;
        DateTime CurrentTime = DateTime.Now;
        LastUpdateTimestamp = sp.Story.LastUpdatedOn;
        if ((CurrentTime - LastUpdateTimestamp).TotalHours > 24)
        {
            //Delete the story from database
            //Check the gateway to delete the record in the db.
            var storytoDelete = from story in stories
                                where story.Id == sp.Story.Id
                                select story;
            // stories.DeleteAllOnSubmit(storytoDelete);
            List<StoryProgram> lstStoriestoDelete = (from story in storytoDelete
                                                     join program in programs on story.ProgramId equals program.Id
                                                     join profile in Profiles on story.ProfileId equals profile.Id
                                                     select new StoryProgram(story, program, profile)).ToList();
            foreach (StoryProgram sps in lstStoriestoDelete)
            {
                try
                {
                    proxy.DeleteStoryItem(sps.Story.Id);
                }
                catch (Exception)
                {
                    // continue with the next story
                }
            }
        }
    }
    catch (Exception)
    {
        // continue with the next story
    }
}
This section simply reads from an Excel spreadsheet. This part works fine with no performance issues.
IEnumerable<ImportViewModel> so = data.Select(row => new ImportViewModel {
    PersonId = row.Field<string>("person_id"),
    ValidationResult = ""
}).ToList();
Before I pass the model to a view I want to set ValidationResult, so I have this piece of code. If I comment it out, the model is passed to the view quickly; with the foreach it takes over a minute. If I hardcode a value for item.PersonId, it also runs quickly. I know I'm doing something wrong; I'm just not sure where to start or what best practice I should be following.
foreach (var item in so)
{
    if (db.Entity.Any(w => w.ID == item.PersonId))
    {
        item.ValidationResult = "Successful";
    }
    else
    {
        item.ValidationResult = "Error: ";
    }
}
return View(so.ToList());
You are performing a database call per item in your list, which is really hard on your database and thus on your performance. Instead, iterate through your Excel result, gather all the user IDs, and select them in one query. Make a list from the query result (otherwise the query is executed every time you access it), then match the result list against your Excel data.
You need to do something like this :
var ids = so.Select(i => i.PersonId).Distinct().ToList();
// Hitting the database just this once to get all matching user IDs
var usersIds = db.Entity.Where(u => ids.Contains(u.ID)).Select(u => u.ID).ToList();
foreach (var item in so)
{
    if (usersIds.Contains(item.PersonId))
    {
        item.ValidationResult = "Successful";
    }
    else
    {
        item.ValidationResult = "Error: ";
    }
}
return View(so.ToList());
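A small follow-up: List&lt;T&gt;.Contains is a linear scan per item, so with many rows the in-memory matching itself can become quadratic. A HashSet makes each membership test O(1). A sketch of the same loop, assuming PersonId and ID are strings:

```csharp
var ids = so.Select(i => i.PersonId).Distinct().ToList();

// One database hit; the HashSet then gives constant-time lookups in memory
var userIds = new HashSet<string>(
    db.Entity.Where(u => ids.Contains(u.ID)).Select(u => u.ID));

foreach (var item in so)
{
    item.ValidationResult = userIds.Contains(item.PersonId)
        ? "Successful"
        : "Error: ";
}
```

The single database round trip is usually the big win here; the HashSet only matters once the spreadsheet has thousands of rows.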
This seems like a popular question, as I've seen several threads on it. However, I can't get updates to work. I have some LINQ to SQL code that looks like the following:
int orderID = GetOrderID();
using (DBDataContext database = new DBDataContext())
{
    var order = database.Orders.FirstOrDefault(x => x.OrderID == orderID);
    if (order != null)
    {
        order.IsOpen = GetIsOpen();
        database.SubmitChanges();
    }
}
I can set a breakpoint and see that execution gets into my if statement. I've also fired up SQL Profiler and noticed that no statements come in for this code. Yet I can successfully add Orders using the following code:
Order newOrder = GetNewOrder();
using (DBDataContext database = new DBDataContext())
{
    database.Orders.InsertOnSubmit(newOrder);
    database.SubmitChanges();
}
What am I doing wrong?
I think you may need to mark the Order as changed by calling
database.MarkAsModified(order)
before database.SubmitChanges().
I am writing a small application that does a lot of feed processing. I want to use LINQ to Entities for this, since speed is not an issue: it is a single-user app and, in the end, will only be used once a month.
My question revolves around the best way to do bulk inserts using LINQ to Entities.
After parsing the incoming data stream I end up with a List of values. Since the end user may try to import some duplicate data, I would like to "clean" the data during the insert rather than reading all the records, doing a for loop, rejecting records, and finally importing the remainder.
This is what I am currently doing:
DateTime minDate = dataTransferObject.Min(c => c.DoorOpen);
DateTime maxDate = dataTransferObject.Max(c => c.DoorOpen);
using (LabUseEntities myEntities = new LabUseEntities())
{
    var recCheck = myEntities.ImportDoorAccess.Where(a => a.DoorOpen >= minDate && a.DoorOpen <= maxDate).ToList();
    if (recCheck.Count > 0)
    {
        foreach (ImportDoorAccess ida in recCheck)
        {
            // FirstOrDefault: First() would throw when there is no match,
            // which would make the null check below unreachable
            DoorAudit da = dataTransferObject.Where(a => a.DoorOpen == ida.DoorOpen && a.CardNumber == ida.CardNumber).FirstOrDefault();
            if (da != null)
                da.DoInsert = false;
        }
    }
    ImportDoorAccess newIDA;
    foreach (DoorAudit newDoorAudit in dataTransferObject)
    {
        if (newDoorAudit.DoInsert)
        {
            newIDA = new ImportDoorAccess
            {
                CardNumber = newDoorAudit.CardNumber,
                Door = newDoorAudit.Door,
                DoorOpen = newDoorAudit.DoorOpen,
                Imported = newDoorAudit.Imported,
                RawData = newDoorAudit.RawData,
                UserName = newDoorAudit.UserName
            };
            myEntities.AddToImportDoorAccess(newIDA);
        }
    }
    myEntities.SaveChanges();
}
I am also getting this error:
System.Data.UpdateException was unhandled
Message="Unable to update the EntitySet 'ImportDoorAccess' because it has a DefiningQuery and no <InsertFunction> element exists in the <ModificationFunctionMapping> element to support the current operation."
Source="System.Data.SqlServerCe.Entity"
What am I doing wrong?
Any pointers are welcome.
You can do multiple inserts this way.
I've seen the exception you're getting in cases where the model (EDMX) is not set up correctly: either you don't have a primary key (an EntityKey in EF terms) on that table, or the designer has tried to guess what the EntityKey should be. In the latter case, you'll see two or more properties in the EDM designer with keys next to them.
Make sure the ImportDoorAccess table has a single primary key and refresh the model.