Adding items to an object is taking too long - c#

I have a method that builds out a list of objects and returns it to the parent. I was having a terrible time with performance in my MVC app, so I added a stopwatch to time the various areas of code being called. I've now isolated it down to this area:
var items = new List<object>();
_stopwatch.Start();
var query = (from img in db.links
             join link in db.lScc on img.pkID equals link.nLinkConfig
             select new
             {
                 ImageBase64 = img.bzImage,
                 ImageType = img.szImageType,
                 Description = img.szDescription,
                 URL = img.szURI,
                 HrefTarget = img.nWindowBehavior,
                 GroupName = link.szGroupName,
                 LinkConfig = link.nLinkConfig
             }).DistinctBy(x => x.LinkConfig);
_stopwatch.Stop();
_stopwatch.Start();
foreach (var item in query)
{
    items.Add(new
    {
        ImageBase64 = item.ImageBase64 != null && item.ImageBase64.Length > 0 ? Convert.ToBase64String(item.ImageBase64) : "",
        ImageType = string.IsNullOrEmpty(item.ImageType) ? "" : item.ImageType,
        Description = string.IsNullOrEmpty(item.Description) ? "" : item.Description,
        URL = string.IsNullOrEmpty(item.URL) ? "" : item.URL,
        HrefTarget = item.HrefTarget,
        GroupName = item.GroupName
    });
}
_stopwatch.Stop(); // takes around 11 seconds for this to complete about 20 iterations
I first thought it might be the Convert.ToBase64String(item.ImageBase64) call, but I commented that out and it had basically no effect.
Anyone have any ideas what could be causing the slowness? This should only take a fraction of a second to complete. It feeds the UI, so it needs to be a lot more responsive.

The issue looks to be that you want to load all the items from the database up front rather than lazy loading the results when the query is enumerated. The round trip to the database can be very slow.
var query = (from img in db.links
             join link in db.lScc on img.pkID equals link.nLinkConfig
             select new
             {
                 ImageBase64 = img.bzImage,
                 ImageType = img.szImageType,
                 Description = img.szDescription,
                 URL = img.szURI,
                 HrefTarget = img.nWindowBehavior,
                 GroupName = link.szGroupName,
                 LinkConfig = link.nLinkConfig
             }).DistinctBy(x => x.LinkConfig);
query.Count();
Calling query.Count() forces the query to execute right away and load all the items at once rather than lazy loading your collection when it is first enumerated.
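A common way to force that is to materialize the query with ToList() so the database round trip happens once, up front, instead of on first enumeration inside the loop. A minimal sketch, assuming the same context and the same DistinctBy extension used in the question:
_stopwatch.Start();
// ToList() executes the SQL now, so the database round trip is timed here
// rather than inside the foreach that builds the view items.
var materialized = (from img in db.links
                    join link in db.lScc on img.pkID equals link.nLinkConfig
                    select new
                    {
                        ImageBase64 = img.bzImage,
                        ImageType = img.szImageType,
                        Description = img.szDescription,
                        URL = img.szURI,
                        HrefTarget = img.nWindowBehavior,
                        GroupName = link.szGroupName,
                        LinkConfig = link.nLinkConfig
                    })
                    .ToList()                        // single round trip to the database
                    .DistinctBy(x => x.LinkConfig)   // de-duplicate in memory
                    .ToList();
_stopwatch.Stop();
// The later foreach over "materialized" now only does in-memory work.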

It turns out there was something wrong with my data model, so the this.Configuration.LazyLoadingEnabled = false setting in the model.edmx wasn't actually disabling lazy loading. I updated the model from the database, fixed the issue (there was a deleted table that kept coming back), and re-tried. It now takes a quarter of the time it did previously. Thanks for the tips on this!
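For reference, the lazy loading switch mentioned here normally lives in the generated context's constructor (EF6, database first). A sketch of what that typically looks like, with a hypothetical context name:
public partial class PortalEntities : DbContext
{
    public PortalEntities()
        : base("name=PortalEntities")
    {
        // With lazy loading off, related entities are not silently fetched
        // one query at a time while results are enumerated.
        this.Configuration.LazyLoadingEnabled = false;
    }
}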


Entity is incorrectly detaching from tracking EF Core

I am trying to query objects from a database, loop through them, check whether a column has a value and, if it does not, create a value, assign it to that column, and save it to the database. The problem I'm having is that the entities are detached after the query, so I cannot save the changes. Below is the code I am using to query and update the entities.
DateTime runTime = passedDateTime ?? DateTime.Now;
await using DiscordDatabaseContext database = new();
DateTime startOfWeek = exactlyOneWeek ? runTime.OneWeekAgo() : runTime.StartOfWeek(StartOfWeek);
// Add if not in a weekly playlist already and if the video was submitted after the start of the week
List<PlaylistData> pld = await database.PlaylistsAdded.Select(playlist => new PlaylistData
{
    PlaylistId = playlist.PlaylistId,
    WeeklyPlaylistID = playlist.WeeklyPlaylistID,
    Videos = playlist.Videos.Where(
            video => (video.WeeklyPlaylistItemId == null ||
                      video.WeeklyPlaylistItemId.Length == 0) &&
                     startOfWeek <= video.TimeSubmitted)
        .Select(video => new VideoData
        {
            WeeklyPlaylistItemId = video.WeeklyPlaylistItemId,
            VideoId = video.VideoId
        }).ToList()
}).ToListAsync().ConfigureAwait(false);
int count = 0;
int nRows = 0;
foreach (PlaylistData playlistData in pld)
{
    if (string.IsNullOrEmpty(playlistData.WeeklyPlaylistID))
    {
        playlistData.WeeklyPlaylistID = await YoutubeAPIs.Instance.MakeWeeklyPlaylist().ConfigureAwait(false);
    }
    foreach (VideoData videoData in playlistData.Videos)
    {
        PlaylistItem playlistItem = await YoutubeAPIs.Instance.AddToPlaylist(videoData.VideoId, playlistId: playlistData.WeeklyPlaylistID, makeNewPlaylistOnError: false).ConfigureAwait(false);
        videoData.WeeklyPlaylistItemId = playlistItem.Id;
        ++count;
    }
}
nRows += await database.SaveChangesAsync().ConfigureAwait(false);
The query works correctly: I get all the relevant Playlist and Video rows to work with, they have the right data in only the specified columns, and the query that is logged looks good. But saves do not work, and calling database.Entry() on any of the Playlist or Video objects shows that they are all detached. What am I doing wrong? Are collections saved a different way? Should my query be changed? Is there a setting on initialization that should be changed? (The only setting I have set on init that I feel may affect this is .UseQuerySplittingBehavior(QuerySplittingBehavior.SplitQuery), but the logged query isn't even split as far as I can see.)
You are working with projected objects (PlaylistData and VideoData). Projected objects are not tracked by EF Core, as far as I know. So the solution is either to select the DbSet's entity objects (that is, the entity types behind the database.PlaylistsAdded and playlist.Videos properties), or to select those entity objects before the update and then update them.
UPDATE:
Example code for second option:
foreach (PlaylistData playlistData in pld)
{
    var playlist = database.PlaylistsAdded
        .Include(x => x.Videos)
        .First(x => x.PlaylistId == playlistData.PlaylistId);
    if (string.IsNullOrEmpty(playlistData.WeeklyPlaylistID))
    {
        playlist.WeeklyPlaylistID = await YoutubeAPIs.Instance.MakeWeeklyPlaylist().ConfigureAwait(false);
    }
    foreach (VideoData videoData in playlistData.Videos)
    {
        var video = playlist.Videos.First(x => x.VideoId == videoData.VideoId);
        PlaylistItem playlistItem = await YoutubeAPIs.Instance.AddToPlaylist(videoData.VideoId, playlistId: playlistData.WeeklyPlaylistID, makeNewPlaylistOnError: false).ConfigureAwait(false);
        video.WeeklyPlaylistItemId = playlistItem.Id;
        ++count;
    }
}
NOTICE: this produces an extra select per playlist, so the first option is preferred.
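For the first option, the idea is to query the tracked entity types directly and let the change tracker pick up the modifications. A minimal sketch, assuming the underlying entity types are named Playlist and Video (hypothetical names) with the same properties used above; the filtered Include requires EF Core 5 or later:
// Query the entity types themselves so EF Core tracks them.
List<Playlist> playlists = await database.PlaylistsAdded
    .Include(p => p.Videos.Where(v =>
        (v.WeeklyPlaylistItemId == null || v.WeeklyPlaylistItemId.Length == 0) &&
        startOfWeek <= v.TimeSubmitted))
    .ToListAsync()
    .ConfigureAwait(false);

foreach (Playlist playlist in playlists)
{
    if (string.IsNullOrEmpty(playlist.WeeklyPlaylistID))
    {
        playlist.WeeklyPlaylistID = await YoutubeAPIs.Instance.MakeWeeklyPlaylist().ConfigureAwait(false);
    }
    foreach (Video video in playlist.Videos)
    {
        PlaylistItem playlistItem = await YoutubeAPIs.Instance
            .AddToPlaylist(video.VideoId, playlistId: playlist.WeeklyPlaylistID, makeNewPlaylistOnError: false)
            .ConfigureAwait(false);
        video.WeeklyPlaylistItemId = playlistItem.Id; // tracked modification
        ++count;
    }
}

// All tracked modifications are persisted in one call.
nRows += await database.SaveChangesAsync().ConfigureAwait(false);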

C# Linq get descendants on a subquery

I've been banging my head on the desk for the past couple of hours trying to decipher this issue.
I'm trying to query an XML file with LINQ. The XML has the following format:
<MRLGroups>
  <MRLGroup>
    <MarketID>6084</MarketID>
    <MarketName>European Union</MarketName>
    <ActiveIngredientID>28307</ActiveIngredientID>
    <ActiveIngredientName>2,4-DB</ActiveIngredientName>
    <IndexCommodityID>59916</IndexCommodityID>
    <IndexCommodityName>Cucumber</IndexCommodityName>
    <ScientificName>Cucumis sativus</ScientificName>
    <MRLs>
      <MRL>
        <PublishedCommodityID>60625</PublishedCommodityID>
        <PublishedCommodityName>Cucumbers</PublishedCommodityName>
        <MRLTypeID>238</MRLTypeID>
        <MRLTypeName>General</MRLTypeName>
        <DeferredToMarketID>6084</DeferredToMarketID>
        <DeferredToMarketName>European Union</DeferredToMarketName>
        <UndefinedCommodityLinkInd>false</UndefinedCommodityLinkInd>
        <MRLValueInPPM>0.0100</MRLValueInPPM>
        <ResidueDefinition>2,4-DB</ResidueDefinition>
        <AdditionalRegulationNotes>Comments.</AdditionalRegulationNotes>
        <ExpiryDate xsi:nil="true" />
        <PrimaryInd>true</PrimaryInd>
        <ExemptInd>false</ExemptInd>
      </MRL>
      <MRL>
        <PublishedCommodityID>60626</PublishedCommodityID>
        <PublishedCommodityName>Gherkins</PublishedCommodityName>
        <MRLTypeID>238</MRLTypeID>
        <MRLTypeName>General</MRLTypeName>
        <DeferredToMarketID>6084</DeferredToMarketID>
        <DeferredToMarketName>European Union</DeferredToMarketName>
        <UndefinedCommodityLinkInd>false</UndefinedCommodityLinkInd>
        <MRLValueInPPM>0.0100</MRLValueInPPM>
        <ResidueDefinition>2,4-DB</ResidueDefinition>
        <AdditionalRegulationNotes>More Comments.</AdditionalRegulationNotes>
        <ExpiryDate xsi:nil="true" />
        <PrimaryInd>false</PrimaryInd>
        <ExemptInd>false</ExemptInd>
      </MRL>
    </MRLs>
  </MRLGroup>
So far I've created classes for the "MRLGroup" section of the file and built catalogues from it:
var queryMarket = from market in doc.Descendants("MRLGroup")
                  select new xMarketID
                  {
                      MarketID = Convert.ToString(market.Element("MarketID").Value),
                      MarketName = Convert.ToString(market.Element("MarketName").Value)
                  };
List<xMarketID> markets = queryMarket.Distinct().ToList();

var queryIngredient = from ingredient in doc.Descendants("MRLGroup")
                      select new xActiveIngredients
                      {
                          ActiveIngredientID = Convert.ToString(ingredient.Element("ActiveIngredientID").Value),
                          ActiveIngredientName = Convert.ToString(ingredient.Element("ActiveIngredientName").Value)
                      };
List<xActiveIngredients> ingredientes = queryIngredient.Distinct().ToList();

var queryCommodities = from commodity in doc.Descendants("MRLGroup")
                       select new xCommodities
                       {
                           IndexCommodityID = Convert.ToString(commodity.Element("IndexCommodityID").Value),
                           IndexCommodityName = Convert.ToString(commodity.Element("IndexCommodityName").Value),
                           ScientificName = Convert.ToString(commodity.Element("ScientificName").Value)
                       };
List<xCommodities> commodities = queryCommodities.Distinct().ToList();
After I got the "catalogues", I'm trying to query the document against them to build some sort of groups, and after all this I'm going to send the data to the database. The issue is that the XML files are around 600 MB each and I get them every day, so my approach is to create the catalogues and just send the MRLs to the database joined to the "header" table that contains the catalogue IDs. Here's what I've done so far, but it failed miserably:
// markets
foreach (xMarketID market in markets)
{
    // ingredients
    foreach (xActiveIngredients ingredient in ingredientes)
    {
        // commodities
        foreach (xCommodities commodity in commodities)
        {
            var mrls = from m in doc.Descendants("MRLGroup")
                       where Convert.ToString(m.Element("MarketID").Value) == market.MarketID
                          && Convert.ToString(m.Element("ActiveIngredientID").Value) == ingredient.ActiveIngredientID
                          && Convert.ToString(m.Element("IndexCommodityID").Value) == commodity.IndexCommodityID
                       select new
                       {
                           ms = new List<xMRLIndividial>(from a in m.Element("MRLs").Descendants()
                                select new xMRLIndividial
                                {
                                    publishedCommodityID = string.IsNullOrEmpty(a.Element("PublishedCommodityID").Value) ? "" : a.Element("PublishedCommodityID").Value,
                                    publishedCommodityName = a.Element("PublishedCommodityName").Value,
                                    mrlTypeId = a.Element("MRLTypeID").Value,
                                    mrlTypeName = a.Element("MRLTypeName").Value,
                                    deferredToMarketId = a.Element("DeferredToMarketID").Value,
                                    deferredToMarketName = a.Element("DeferredToMarketName").Value,
                                    undefinedCommodityLinkId = a.Element("UndefinedCommodityLinkInd").Value,
                                    mrlValueInPPM = a.Element("MRLValueInPPM").Value,
                                    residueDefinition = a.Element("ResidueDefinition").Value,
                                    additionalRegulationNotes = a.Element("AdditionalRegulationNotes").Value,
                                    expiryDate = a.Element("ExpiryDate").Value,
                                    primaryInd = a.Element("PrimaryInd").Value,
                                    exemptInd = a.Element("ExemptInd").Value
                                })
                       };
            foreach (var item in mrls)
            {
                Console.WriteLine(item.ToString());
            }
        }
    }
}
If you notice, I'm trying to get just the MRLs descendants, but I got an error.
All I can reach through the "a" variable is the very first node of MRLs->MRL, not all of them. What is going on?
If you guys could lend me a hand, that would be super!
Thanks in advance.
This line...
from a in m.Element("MRLs").Descendants()
...iterates through all sub-elements, including children of children. Hence your error, since your <PublishedCommodityID> element does not have any child elements.
Unless you specifically want to return all child elements at all levels, always use the Element and Elements axes instead of Descendant and Descendants:
from a in m.Element("MRLs").Elements()
That should solve your problem.
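To make the axis difference concrete, here is a small self-contained sketch based on the sample document above:
using System;
using System.Linq;
using System.Xml.Linq;

class AxisDemo
{
    static void Main()
    {
        // A trimmed-down fragment of the sample document above.
        var mrls = XElement.Parse(@"
            <MRLs>
              <MRL>
                <PublishedCommodityID>60625</PublishedCommodityID>
                <MRLValueInPPM>0.0100</MRLValueInPPM>
              </MRL>
              <MRL>
                <PublishedCommodityID>60626</PublishedCommodityID>
                <MRLValueInPPM>0.0100</MRLValueInPPM>
              </MRL>
            </MRLs>");

        // Elements(): only the direct <MRL> children -> prints 2.
        Console.WriteLine(mrls.Elements().Count());

        // Descendants(): the <MRL> children plus their children -> prints 6.
        // Some of those descendants (e.g. <PublishedCommodityID>) have no
        // child elements, so a.Element("PublishedCommodityID") is null for
        // them and .Value throws - the error described above.
        Console.WriteLine(mrls.Descendants().Count());
    }
}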
However, your query is also difficult to read with the nested foreach loops and the multiple tests for the IDs. You can simplify it with a combination of LINQ and XPath:
// doc.XPathSelectElements requires: using System.Xml.XPath;
var mrls =
    from market in markets
    from ingredient in ingredientes
    from commodity in commodities
    let xpath = $"/MRLGroups/MRLGroup[MarketID={market.MarketID}]" +
                $"[ActiveIngredientID={ingredient.ActiveIngredientID}]" +
                $"[IndexCommodityID={commodity.IndexCommodityID}]/MRLs/MRL"
    select new
    {
        ms =
            (from a in doc.XPathSelectElements(xpath)
             select new xMRLIndividial
             {
                 publishedCommodityID = string.IsNullOrEmpty(a.Element("PublishedCommodityID").Value) ? "" : a.Element("PublishedCommodityID").Value,
                 publishedCommodityName = a.Element("PublishedCommodityName").Value,
                 mrlTypeId = a.Element("MRLTypeID").Value,
                 mrlTypeName = a.Element("MRLTypeName").Value,
                 deferredToMarketId = a.Element("DeferredToMarketID").Value,
                 deferredToMarketName = a.Element("DeferredToMarketName").Value,
                 undefinedCommodityLinkId = a.Element("UndefinedCommodityLinkInd").Value,
                 mrlValueInPPM = a.Element("MRLValueInPPM").Value,
                 residueDefinition = a.Element("ResidueDefinition").Value,
                 additionalRegulationNotes = a.Element("AdditionalRegulationNotes").Value,
                 expiryDate = a.Element("ExpiryDate").Value,
                 primaryInd = a.Element("PrimaryInd").Value,
                 exemptInd = a.Element("ExemptInd").Value
             }).ToList()
    };

Why does this EF upsert logic result in deletions instead of updates?

The following code results in deletions instead of updates.
My question is: is this a bug in the way I'm coding against Entity Framework or should I suspect something else?
Update: I got this working, but I'm leaving the question up with both the original and the working versions in hopes that I can learn something I didn't understand about EF.
In the original, non-working code below, when the database is fresh all the additions of SearchDailySummary objects succeed, but on the second run, when my code was supposedly going to perform the updates, the net result is a once-again-empty table in the database; i.e., this logic manages to be the equivalent of removing each entity.
//Logger.Info("Upserting SearchDailySummaries..");
using (var db = new ClientPortalContext())
{
foreach (var item in items)
{
var campaignName = item["campaign"];
var pk1 = db.SearchCampaigns.Single(c => c.SearchCampaignName == campaignName).SearchCampaignId;
var pk2 = DateTime.Parse(item["day"].Replace('-', '/'));
var source = new SearchDailySummary
{
SearchCampaignId = pk1,
Date = pk2,
Revenue = decimal.Parse(item["totalConvValue"]),
Cost = decimal.Parse(item["cost"]),
Orders = int.Parse(item["conv1PerClick"]),
Clicks = int.Parse(item["clicks"]),
Impressions = int.Parse(item["impressions"]),
CurrencyId = item["currency"] == "USD" ? 1 : -1 // NOTE: non USD (if exists) -1 for now
};
var target = db.Set<SearchDailySummary>().Find(pk1, pk2) ?? new SearchDailySummary();
if (db.Entry(target).State == EntityState.Detached)
{
db.SearchDailySummaries.Add(target);
addedCount++;
}
else
{
// TODO?: compare source and target and change the entity state to unchanged if no diff
updatedCount++;
}
AutoMapper.Mapper.Map(source, target);
itemCount++;
}
Logger.Info("Saving {0} SearchDailySummaries ({1} updates, {2} additions)", itemCount, updatedCount, addedCount);
db.SaveChanges();
}
Here is the working version. I'm not 100% sure it's optimized, but it's working reliably and performing fine as long as I batch it out in groups of 500 or fewer items at a time; beyond that it slows down exponentially, but I think that may be a different question/subject...
//Logger.Info("Upserting SearchDailySummaries..");
using (var db = new ClientPortalContext())
{
foreach (var item in items)
{
var campaignName = item["campaign"];
var pk1 = db.SearchCampaigns.Single(c => c.SearchCampaignName == campaignName).SearchCampaignId;
var pk2 = DateTime.Parse(item["day"].Replace('-', '/'));
var source = new SearchDailySummary
{
SearchCampaignId = pk1,
Date = pk2,
Revenue = decimal.Parse(item["totalConvValue"]),
Cost = decimal.Parse(item["cost"]),
Orders = int.Parse(item["conv1PerClick"]),
Clicks = int.Parse(item["clicks"]),
Impressions = int.Parse(item["impressions"]),
CurrencyId = item["currency"] == "USD" ? 1 : -1 // NOTE: non USD (if exists) -1 for now
};
var target = db.Set<SearchDailySummary>().Find(pk1, pk2);
if (target == null)
{
db.SearchDailySummaries.Add(source);
addedCount++;
}
else
{
AutoMapper.Mapper.Map(source, target);
db.Entry(target).State = EntityState.Modified;
updatedCount++;
}
itemCount++;
}
Logger.Info("Saving {0} SearchDailySummaries ({1} updates, {2} additions)", itemCount, updatedCount, addedCount);
db.SaveChanges();
}
The thing that keeps popping up in my mind is that maybe the Entry(entity) or Find(pk) method has some side effects? I should probably be consulting the documentation, but any advice is appreciated.
It's a slight assumption on my part (without looking into your models/entities), but have a look at what's going on within this block (see if the objects being attached here are related to the deletions):
if (db.Entry(target).State == EntityState.Detached)
{
    db.SearchDailySummaries.Add(target);
    addedCount++;
}
Your detached object won't be able to use its navigation properties to locate its related objects; you'll be re-attaching an object in a potentially conflicting state (without the correct relationships).
You haven't mentioned what exactly is being deleted, so I may be way off. I'm just heading out, so this is a little rushed; hope there's something useful in there.
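For comparison, a common EF upsert shape that avoids AutoMapper altogether is to let Find decide between insert and update and copy scalar values with CurrentValues.SetValues. A minimal sketch against the same (assumed) model and loop variables as above:
var target = db.Set<SearchDailySummary>().Find(pk1, pk2);
if (target == null)
{
    // Not in the database yet: insert the fully populated new object.
    db.SearchDailySummaries.Add(source);
    addedCount++;
}
else
{
    // Already exists and is tracked by the context: copy the scalar values
    // from the detached "source" onto the tracked "target". The change
    // tracker then marks the modified properties, so SaveChanges issues an
    // UPDATE instead of anything destructive.
    db.Entry(target).CurrentValues.SetValues(source);
    updatedCount++;
}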

How to work around NotMapped properties in queries?

I have method that looks like this:
private static IEnumerable<OrganizationViewModel> GetOrganizations()
{
    var db = new GroveDbContext();
    var results = db.Organizations.Select(org => new OrganizationViewModel
    {
        Id = org.OrgID,
        Name = org.OrgName,
        SiteCount = org.Sites.Count(),
        DbSecureFileCount = 0,
        DbFileCount = 0
    });
    return results;
}
This returns results pretty promptly.
However, you'll notice the OrganizationViewModel has two properties which are getting set to 0. There are corresponding properties in the Organization model which I added via a partial class and decorated with [NotMapped]: UnsecureFileCount and SecureFileCount.
If I change those 0s to something useful...
DbSecureFileCount = org.SecureFileCount,
DbFileCount = org.UnsecureFileCount
... I get the "Only initializers, entity members, and entity navigation properties are supported" exception. I find this a little confusing because I don't feel I'm asking the database about them, I'm only setting properties of the view model.
However, since EF isn't listening to my argument I tried a different approach:
private static IEnumerable<OrganizationViewModel> GetOrganizations()
{
    var db = new GroveDbContext();
    var results = new List<OrganizationViewModel>();
    foreach (var org in db.Organizations)
    {
        results.Add(new OrganizationViewModel
        {
            Id = org.OrgID,
            Name = org.OrgName,
            DbSecureFileCount = org.SecureFileCount,
            DbFileCount = org.UnsecureFileCount,
            SiteCount = org.Sites.Count()
        });
    }
    return results;
}
Technically this gives me the correct results without an exception but it takes forever. (By "forever" I mean more than 60 seconds whereas the first version delivers results in under a second.)
Is there a way to optimize the second approach? Or is there a way to get the first approach to work?
Another option would be to load the values back as an anonymous type and then loop through those to load your view models (n+1 queries are most likely the reason for the slowness).
For example:
var results = db.Organizations.Select(org => new
{
    Id = org.OrgID,
    Name = org.OrgName,
    DbSecureFileCount = org.SecureFileCount,
    DbFileCount = org.UnsecureFileCount,
    SiteCount = org.Sites.Count()
}).ToList();

var viewmodels = results.Select(x => new OrganizationViewModel
{
    Id = x.Id,
    Name = x.Name,
    DbSecureFileCount = x.DbSecureFileCount,
    DbFileCount = x.DbFileCount,
    SiteCount = x.SiteCount
});
Sorry about the formatting; I'm typing on a phone.
You are basically lazy loading each object on each iteration of the loop, causing n+1 queries.
What you should do is bring the entire collection into memory and use it from there.
Sample code:
db.Organizations.Load(); // Load() returns void; it populates the context's local cache
foreach (var org in db.Organizations.Local)
{
    // Here you are free to do whatever you want
}
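If the two [NotMapped] counts are derived from data the database can reach, another option is to compute them inside the projection so everything stays in a single query. A sketch, assuming (purely hypothetically) that the counts come from a Files navigation property with an IsSecure flag; the real expressions depend on how SecureFileCount and UnsecureFileCount are actually computed:
var results = db.Organizations.Select(org => new OrganizationViewModel
{
    Id = org.OrgID,
    Name = org.OrgName,
    SiteCount = org.Sites.Count(),
    // Hypothetical: express whatever the [NotMapped] properties calculate
    // in terms of mapped columns/navigations so EF can translate it to SQL.
    DbSecureFileCount = org.Files.Count(f => f.IsSecure),
    DbFileCount = org.Files.Count(f => !f.IsSecure)
}).ToList();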

Why am I getting index out of bounds error from database

I know what index out of bounds is all about, and when I debug I can see why it happens. Basically, I filter my database for records that are potential/pending, gather an array of those numbers, and send them off to another server to check whether those numbers have been upgraded to a sale. If one has been upgraded, the server responds with the new Sales Order ID and my old pending Sales Order ID (SourceID). I then loop over that list, filter down to that specific SourceID, update the SourceID to be the Sales Order ID, and change a couple of other values. The problem is that when I use that filter on the very first one, it throws an index out of bounds error. The filter reports 0 results, which I find strange because I took the sales order number from the list, so it should be there. I don't know what the deal is, and it doesn't happen every time: I just ran the code this morning and it didn't throw the error, but it did last night before I went home. Here is the code in question that throws the error.
filter.RowFilter = string.Format("Stage = '{0}'", Potential.PotentialSale);
if (filter.Count > 0)
{
var Soids = new int[filter.Count];
Console.Write("Searching for Soids - (");
for (int i = 0; i < filter.Count; i++)
{
Console.Write(filter[i][1].ToString() + ",");
Soids[i] = (int)filter[i][1];
}
Console.WriteLine(")");
var pendingRecords = Server.GetSoldRecords(Soids);
var updateRecords = new NameValueCollection();
for (int i = 0; i < pendingRecords.Length; i++)
{
filter.RowFilter = "Soid = " + pendingRecords[i][1];
filter[0].Row["Soid"] = pendingRecords[i][0];
filter[0].Row["SourceId"] = pendingRecords[i][1];
filter[0].Row["Stage"] = Potential.ClosedWon;
var potentialXML = Potential.GetUpdatePotentialXML(filter[0].Row["Soid"].ToString(), filter[0].Row["Stage"].ToString());
updateRecords.Add(filter[0].Row["ZohoID"].ToString(), potentialXML);
}
If I'm counting right, line 17 is where the error is thrown. pendingRecords is an object[][] array: pendingRecords[i] is an individual record, pendingRecords[i][0] is the new Sales Order ID (SOID), and pendingRecords[i][1] is the old SOID (now the SourceID).
Any help on this one? Is it because I'm changing the SOID to the new SOID and the filter auto-updates itself? I just don't know.
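As an aside, the immediate IndexOutOfRange symptom can be guarded against by checking the view's count before indexing into it. A minimal sketch of such a guard (it only protects against the crash; the eventual fix is described below):
for (int i = 0; i < pendingRecords.Length; i++)
{
    filter.RowFilter = "Soid = " + pendingRecords[i][1];
    if (filter.Count == 0)
    {
        // No row matched this SourceID - possibly because an earlier
        // iteration already rewrote its Soid. Log it and move on instead of
        // indexing filter[0] on an empty view.
        Console.WriteLine("No pending row found for " + pendingRecords[i][1]);
        continue;
    }
    filter[0].Row["Soid"] = pendingRecords[i][0];
    filter[0].Row["SourceId"] = pendingRecords[i][1];
    filter[0].Row["Stage"] = Potential.ClosedWon;
}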
Well, I ended up changing how it worked altogether, and it actually sorts things a bit more nicely now. The code I'm about to post has a bunch of hard-coded indexes due to the structure of the table that is returned; sorry about that. I've since learned not to do that, but I'm on a different project now and will change it when I next have to touch this program. Here is the solution.
var potentials = Server.GetNewPotentials(); // loads all records from server
for (int i = 0; i < potentials.Length; i++)
{
    var filter = AllPotentials.DefaultView;
    var result1 = CheckSoidOrSource(potentials[i].Soid, true);
    var result2 = CheckSoidOrSource(potentials[i].SourceID, false);
    // This potential can't be found at all, so let's add it to our table
    if (result1 + result2 == 0)
    {
        Logger.WriteLine("Found new record. Adding it to DataTable and sending it to Zoho");
        AllPotentials.Add(potentials[i]);
        filter.RowFilter = string.Format("Soid = '{0}'", potentials[i].SourceID);
        var index = AllPotentials.Rows.IndexOf(filter[0].Row);
        ZohoPoster posterInsert = new ZohoPoster(Zoho.Fields.Potentials, Zoho.Calls.insertRecords);
        AllPotentials.Rows[index]["ZohoID"] = posterInsert.PostNewPotentialRecord(3, filter[0].Row);
    }
    // This potential is not found, but has a SourceId that matches a Soid of another record.
    if (result1 == 0 && result2 == 1)
    {
        Logger.WriteLine("Found a record that needs to be updated on Zoho");
        ZohoPoster posterUpdate = new ZohoPoster(Zoho.Fields.Potentials, Zoho.Calls.updateRecords);
        filter.RowFilter = string.Format("Soid = '{0}'", potentials[i].SourceID);
        var index = AllPotentials.Rows.IndexOf(filter[0].Row);
        AllPotentials.Rows[index]["Soid"] = potentials[i].Soid;
        AllPotentials.Rows[index]["SourceId"] = potentials[i].SourceID;
        AllPotentials.Rows[index]["PotentialStage"] = potentials[i].PotentialStage;
        AllPotentials.Rows[index]["UpdateRecord"] = true;
        AllPotentials.Rows[index]["Amount"] = potentials[i].Amount;
        AllPotentials.Rows[index]["ZohoID"] = posterUpdate.UpdatePotentialRecord(3, filter[0].Row);
    }
}
AllPotentials.AcceptChanges();
}

private int CheckSoidOrSource(string Soid, bool checkSource)
{
    var filter = AllPotentials.DefaultView;
    if (checkSource)
        filter.RowFilter = string.Format("Soid = '{0}' OR SourceId = '{1}'", Soid, Soid);
    else
        filter.RowFilter = string.Format("Soid = '{0}'", Soid);
    return filter.Count;
}
Basically, I noticed something about my data when I filter it this way: the two results only ever come back as (0,0), (0,1), or (1,0). (0,0) means the record doesn't exist at all in this table, so I need to add it. (1,0) means the Sales Order ID (Soid) matches another Soid in the table, so it already exists. Lastly, (0,1) means the Soid doesn't exist in this table but another record has that Soid as its source, which to me means that record has been upgraded from a potential to a sale, which in turn means I have to update the record and Zoho. This worked out to much less work for me because now I don't have to search for won and lost records; I only have to search for lost records. Less code, same results, is always a good thing :)
