In my project, grid data is ordered by priority, and the user can change a record's priority with up/down arrows.
To change the priority I use this code:
public virtual JsonResult ChangeOrder(int selectedCode, bool isUp)
{
var NewsObj = _newsService.Get(selectedCode);
if (NewsObj == null)
return Json(new { result = false, message = "error" });
int CurrentPriority = NewsObj.Priority;
int OtherPriority = 0;
if (isUp)
{
OtherPriority = CurrentPriority - 1;
}
else
{
OtherPriority = CurrentPriority + 1;
}
var OtherNews = _newsService.GetByPriority(OtherPriority);
if (OtherNews == null)
return Json(new { result = false, message = "error" });
int tmp = NewsObj.Priority;
NewsObj.Priority = OtherNews.Priority;
OtherNews.Priority = tmp;
_uow.MarkAllAsChanges(NewsObj);
_uow.MarkAllAsChanges(OtherNews);
_uow.SaveAllChanges();
return Json(new { result = true, message = "success" });
}
But this code depends on the sort order; for example, when I use OrderByDescending it no longer works, because the -1 and +1 would have to be swapped.
I want to be independent of the sort direction.
I think you are mixing up the meanings of Priority and Ordering.
Priority normally does not define an order, because two or more elements can have the same priority. It only expresses a kind of demand for that element.
An Order, by contrast, lines the elements up in a sequence so that each element has a unique position. This can be derived from the Priority and/or other fields, perhaps something like a timestamp. It can also be ascending or descending, but that does not change the Priority.
So I would suggest implementing one function for changing the Priority and, separately, a second function for ordering that can include an OrderBy on Priority. If someone changes the Priority of an element, just refresh the ordered list and the element will move up or down or wherever...
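A minimal sketch of that separation, reusing _newsService and _uow from the question; GetAll(), the News type name, and the CreatedOn tie-breaker are assumptions, not the asker's actual API:
public virtual JsonResult ChangePriority(int selectedCode, int newPriority)
{
    var newsObj = _newsService.Get(selectedCode);
    if (newsObj == null)
        return Json(new { result = false, message = "error" });
    newsObj.Priority = newPriority; // only the demand changes here
    _uow.MarkAllAsChanges(newsObj);
    _uow.SaveAllChanges();
    return Json(new { result = true, message = "success" });
}
// Ordering is a separate, read-only concern; flipping the sort direction
// no longer touches any Priority value.
public IEnumerable<News> GetOrdered(bool descending)
{
    var all = _newsService.GetAll();
    return descending
        ? all.OrderByDescending(n => n.Priority).ThenByDescending(n => n.CreatedOn)
        : all.OrderBy(n => n.Priority).ThenBy(n => n.CreatedOn);
}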
How do I combine the Id values from the /test.json file with the id values from ourOrders[i].id?
Or is there another way?
private RegionModel FilterByOurOrders(RegionModel region, List<OurOrderModel> ourOrders, MarketSettings market, bool byOurOrders)
{
var result = new RegionModel
{
updatedTs = region.updatedTs,
orders = new List<OrderModel>(region.orders.Count)
};
var json = File.ReadAllText("/test.json");
var otherBotOrders = JsonSerializer.Deserialize<OrdersTimesModel>(json);
OtherBotOrders = new Dictionary<string, OrderTimesInfoModel>();
foreach (var otherBotOrder in otherBotOrders.OrdersTimesInfo)
{
//OtherBotOrders.Add(otherBotOrder.Id, otherBotOrder);
BotController.WriteLine($"{otherBotOrder.Id}"); // outputting the order IDs to the console works
}
foreach (var order in region.orders)
{
if (ConvertToDecimal(order.price) < 1 || !byOurOrders)
{
int i = 0;
var isOurOrder = false;
while (i < ourOrders.Count && !isOurOrder)
{
if (ourOrders[i].id.Equals(order.id, StringComparison.InvariantCultureIgnoreCase))
{
isOurOrder = true;
}
++i;
}
if (!isOurOrder)
{
result.orders.Add(order);
}
}
}
return result;
}
OrdersTimesModel looks like this:
public class OrdersTimesModel
{
public List<OrderTimesInfoModel> OrdersTimesInfo { get; set; }
}
test.json:
{"OrdersTimesInfo":[{"Id":"1"},{"Id":"2"}]}
Added:
I'll try to clarify the question:
There are three lists with ID:
First (all orders): region.orders, as order.id
Second (our orders): ourOrders, as ourOrders[i].id in a while loop
Third (our orders 2): from the /test.json file, as an array {"Orders":[{"Id":"12345..."...},{"Id":"12345..." ...}...]}
There is a foreach containing a while, where the First (all orders) list and the Second (our orders) list are compared. If the ids match, then it is one of our orders: isOurOrder = true;
Accordingly, the orders for which isOurOrder = false are added to the result: result.orders.Add(order)
I need:
The check if (ourOrders[i].id.Equals(order.id, StringComparison.InvariantCultureIgnoreCase)) to also match the Ids from the Third (our orders 2) list.
Or any other way to do it?
You should be able to completely avoid writing loops if you use LINQ (there will be loops running in the background, but it's way easier to read)
You can access some documentation here: https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/linq/introduction-to-linq-queries
and you have some pretty cool extension methods for arrays: https://learn.microsoft.com/en-us/dotnet/api/system.linq.enumerable?view=net-6.0 (these are great to get your code easy to read)
Solution
using System.Linq;
private RegionModel FilterByOurOrders(RegionModel region, List<OurOrderModel> ourOrders, MarketSettings market, bool byOurOrders)
{
var result = new RegionModel
{
updatedTs = region.updatedTs,
orders = new List<OrderModel>(region.orders.Count)
};
var json = File.ReadAllText("/test.json");
var otherBotOrders = JsonSerializer.Deserialize<OrdersTimesModel>(json);
// This line should get you an array containing
// JUST the ids in the JSON file
var idsFromJsonFile = otherBotOrders.OrdersTimesInfo.Select(x => x.Id);
// Here you'll get an array with the ids for your orders
var idsFromOurOrders = ourOrders.Select(x => x.id);
// Union will only take unique values,
// so you avoid repetition.
var mergedArrays = idsFromJsonFile.Union(idsFromOurOrders);
// Now we just need to query the region orders
// We'll keep every element whose id is NOT contained in the arrays we created earlier
var filteredRegionOrders = region.orders.Where(x => !mergedArrays.Contains(x.id));
result.orders.AddRange(filteredRegionOrders);
return result;
}
You can add conditions to any of those steps (like checking the order price or the boolean flag you get as a parameter). And of course you can do it without assigning so many variables; I did it that way just to make it easier to explain.
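For example, a sketch reusing the names above: the price/flag guard from the original foreach can be folded straight into the Where:
var filteredRegionOrders = region.orders.Where(x =>
    (ConvertToDecimal(x.price) < 1 || !byOurOrders) // same guard as the original loop
    && !mergedArrays.Contains(x.id));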
Yes, I'm well aware one shouldn't do this, but I have no choice. I'd agree that it's an XY problem, but since I can't update the service I have to use, it's out of my hands. I need some help to save some time, and maybe learn something handy in the process.
I'm looking to map a list of models (items in this example) to what are essentially numbered variables of a service I'm posting to; in the example, those are the fields of 'newUser'.
Additionally, there may not always be X items in the list (on the right in the example), yet I have a finite number (say 10) of numbered variables from 'newUser' to map to (on the left in the example). So I'll have to perform a bunch of checks to avoid indexing a null value as well.
Current example:
if (items.Count >= 1 && !string.IsNullOrWhiteSpace(items[0].id))
{
newUser.itemId1 = items[0].id;
newUser.itemName1 = items[0].name;
newUser.itemDate1 = items[0].date;
newUser.itemBlah1 = items[0].blah;
}
else
{
// This isn't necessary, but this is effectively what will happen
newUser.itemId1 = string.Empty;
newUser.itemName1 = string.Empty;
newUser.itemDate1 = string.Empty;
newUser.itemBlah1 = string.Empty;
}
if (items.Count >= 2 && !string.IsNullOrWhiteSpace(items[1].id))
{
newUser.itemId2 = items[1].id;
newUser.itemName2 = items[1].name;
newUser.itemDate2 = items[1].date;
newUser.itemBlah2 = items[1].blah;
}
// Removed the else to clean it up, but you get the idea.
// And so on, repeated many more times..
I looked into an example using Dictionary, but I'm unsure of how to map that to the model without just manually mapping all the variables.
PS: To all who come across this question, if you're implementing numbered variables in your API, please don't; it's wildly unnecessary and time-consuming.
As an alternative to fiddling with the JSON, you could get down and dirty and use Reflection.
Given the following test data:
using System.Linq;
const int maxItemsToSend = 3;
ItemToSend newUser = new();
Item[] items = { new("1", "A"), new("2", "B") };
record Item(string id, string name);
class ItemToSend {
public string
itemId1, itemName1,
itemId2, itemName2,
itemId3, itemName3;
}
Using the rules you set forth in the question, we can loop through the projected fields like so:
// If `itemid1`,`itemId2`, etc are fields:
var fields = typeof(ItemToSend).GetFields();
// If they're properties, replace GetFields() with
// .GetProperties(BindingFlags.Instance | BindingFlags.Public);
for(var i = 1; i <= maxItemsToSend; i++){
// bounds check
var item = (items.Count() >= i && !string.IsNullOrWhiteSpace(items[i-1].id))
? items[i-1] : null;
// Use Reflection to find and set the fields
fields.FirstOrDefault(f => f.Name.Equals($"itemId{i}"))
?.SetValue(newUser, item?.id ?? string.Empty);
fields.FirstOrDefault(f => f.Name.Equals($"itemName{i}"))
?.SetValue(newUser, item?.name ?? string.Empty);
}
It's not pretty, but it works. Here's a fiddle.
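If this runs hot, the repeated FirstOrDefault scans can be avoided by indexing the fields once; a sketch using the same test data as above:
var fieldsByName = typeof(ItemToSend).GetFields().ToDictionary(f => f.Name);
for (var i = 1; i <= maxItemsToSend; i++)
{
    var item = (items.Count() >= i && !string.IsNullOrWhiteSpace(items[i - 1].id))
        ? items[i - 1] : null;
    if (fieldsByName.TryGetValue($"itemId{i}", out var idField))
        idField.SetValue(newUser, item?.id ?? string.Empty);
    if (fieldsByName.TryGetValue($"itemName{i}", out var nameField))
        nameField.SetValue(newUser, item?.name ?? string.Empty);
}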
I have a list whose size is not fixed. In each iteration, the number of elements in the list may decrease, increase, or stay the same but with different values.
In each iteration, I receive the newer list in a setter as following:
public List<int> IconsColor
{
get { return iconsColorList; }
set
{
newIconsColorList = new List<int>(value);
if (newIconsColorList.Count == iconsColorList.Count && newIconsColorList.All(iconsColorList.Contains))
return;
//Else
nIconsChanged = true;
//??????????????????????????
//?????????- How do I update Old list with New values
//Something like iconsColorList = newIconsColorList;
//but above line makes the If-condition true since both the lists are same now
}
}
How do I modify the elements of the previous list (iconsColorList) with the new values (present in newIconsColorList)? And if the number of elements in the new list is greater than that of the older list, the new elements should be appended to the older list as well.
So you want to merge both lists (update and add new):
public List<int> IconsColor
{
set
{
for (int i = 0; i < Math.Min(iconsColorList.Count, value.Count); i++)
{
if (value[i] != iconsColorList[i])
{
iconsColorList[i] = value[i];
nIconsChanged = true;
}
}
if (value.Count > iconsColorList.Count)
{
// append new items to the end of the list
iconsColorList.AddRange(value.Skip(iconsColorList.Count));
nIconsChanged = true;
}
}
}
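If the incoming list can also shrink (the question mentions the count may decrease), a third branch inside the setter, alongside the two above, would trim the tail; a sketch:
if (value.Count < iconsColorList.Count)
{
    // remove the trailing items that no longer exist in the new list
    iconsColorList.RemoveRange(value.Count, iconsColorList.Count - value.Count);
    nIconsChanged = true;
}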
Side-note: I hope the lack of a getter was just because it wasn't relevant here. A property without a getter is not really useful and smells fishy. In this case it would just return iconsColorList;.
I need to match email sends with email bounces so I can find if they were delivered or not. The catch is, I have to limit the bounce to within 4 days of the send to eliminate matching the wrong send to the bounce. Send records are spread over a period of 30 days.
LinkedList<event_data> sent = GetMyHugeListOfSends(); //for example 1M+ records
List<event_data> bounced = GetMyListOfBounces(); //for example 150k records
bounced = bounced.OrderBy(o => o.event_date).ToList(); //this ensures the most accurate match of bounce to send (since we find the first match)
List<event_data> delivered = new List<event_data>();
event_data deliveredEmail = new event_data();
foreach (event_data sentEmail in sent)
{
event_data bounce = bounced.Find(item => item.email.ToLower() == sentEmail.email.ToLower() && (item.event_date > sentEmail.event_date && item.event_date < sentEmail.event_date.AddDays(deliveredCalcDelayDays)));
//create delivered records
if (bounce != null)
{
//there was a bounce! don't add a delivered record!
//remove the bounce here; it only applies to one send
bounced.Remove(bounce);
}
else
{
//if sent is not bounced, it's delivered
deliveredEmail.sid = siteid;
deliveredEmail.mlid = mlid;
deliveredEmail.mid = mid;
deliveredEmail.email = sentEmail.email;
deliveredEmail.event_date = sentEmail.event_date;
deliveredEmail.event_status = "Delivered";
deliveredEmail.event_type = "Delivered";
deliveredEmail.id = sentEmail.id;
deliveredEmail.number = sentEmail.number;
deliveredEmail.laststoretransaction = sentEmail.laststoretransaction;
delivered.Add(deliveredEmail); //add the new delivered
deliveredEmail = new event_data();
}
if (bounced.Count() == 0)
{
break; //no more bounces to match!
}
}
So I did some testing and it's processing about 12 sent records per second. At 1M+ records, it will take 25+ hours to process!
Two questions:
How can I find the exact line that is taking the most time?
I am assuming it's the lambda expression finding the bounce that is taking the longest since this was much faster before I put that in there. How can I speed this up?
Thanks!
Edit
---Ideas---
One idea I just had is to sort the sends by date like I did the bounces so that the search through the bounces will be more efficient, since an early send would be likely to hit an early bounce as well.
Another idea I just had is to run a couple of these processes in parallel, although I would hate to multi-thread this simple application.
I would be reasonably confident in saying that yes, it is your Find that is taking the time.
It looks like you are certain that the Find method will return 0 or 1 records (not a list). In that case, the way to speed this up is to create a lookup instead: rather than keeping a List<event_data> for your bounced var, create a Dictionary<key, event_data>; then you can look the value up by key instead of doing a Find.
The trick is in creating your key (I don't know enough about your app to help with that), but it is essentially the same criteria as in your Find.
EDIT (adding some pseudo-code):
void Main()
{
var hugeListOfEmails = GetHugeListOfEmails();
var allBouncedEmails = GetAllBouncedEmails();
IDictionary<string, EmailInfo> bouncedEmailLookup = CreateLookupOfBouncedEmails(allBouncedEmails);
foreach(var info in hugeListOfEmails)
{
if(bouncedEmailLookup.ContainsKey(info.emailAddress))
{
// Email is bounced;
}
else
{
// Email is not bounced
}
}
}
public IEnumerable<EmailInfo> GetHugeListOfEmails()
{
yield break;
}
public IEnumerable<EmailInfo> GetAllBouncedEmails()
{
yield break;
}
public IDictionary<string, EmailInfo> CreateLookupOfBouncedEmails(IEnumerable<EmailInfo> emailList)
{
var result = new Dictionary<string, EmailInfo>();
foreach(var e in emailList)
{
if(!result.ContainsKey(e.emailAddress))
{
// pseudo: apply the date conditions from the question here, then:
result.Add(e.emailAddress, e);
}
}
return result;
}
public class EmailInfo
{
public string emailAddress { get; set; }
public DateTime DateSent { get; set; }
}
You could improve this by using the ToLookup method to create a lookup table keyed on the email address:
var bouncedLookup = bounced.ToLookup(k => k.email.ToLower());
and use this in the loop to look up by email first:
var filteredBounced = bouncedLookup[sent_email.email.ToLower()];
// mini optimisation here
var endDate = sentEmail.event_date.AddDays(deliveredCalcDelayDays);
event_data bounce = filteredBounced.FirstOrDefault(item => item.event_date > sentEmail.event_date && item.event_date < endDate);
I could not compile it, but I think that should do. Please try it.
You are finding items in a list, which means it has to traverse the whole list, so it is an O(n) operation. Could you not store those sent emails in a Dictionary, with the key being the email address you are searching on? Then go through the bounces, linking back to the emails in the dictionary. The lookup will be constant time, and you go through the bounces once, so it will be O(n) overall. Your current method is O(n²).
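A sketch of that idea, assuming for brevity one send per address (with duplicates, store a List<event_data> per key instead); deliveredCalcDelayDays is the window from the question:
var sentByEmail = sent.ToDictionary(s => s.email.ToLowerInvariant());
foreach (var bounce in bounced)
{
    if (sentByEmail.TryGetValue(bounce.email.ToLowerInvariant(), out var send)
        && bounce.event_date > send.event_date
        && bounce.event_date < send.event_date.AddDays(deliveredCalcDelayDays))
    {
        // matched: this send bounced, so it is not delivered
        sentByEmail.Remove(bounce.email.ToLowerInvariant());
    }
}
// every send still left in sentByEmail was delivered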
Converting bounced to a SortedList might be a good solution:
var sl = new SortedList<string, event_data>(bounced.ToDictionary(s => s.email, s => s));
and to find a bounce use
event_data bounce = sl
    .Where(c => c.Key.Equals(sentEmail.email, StringComparison.OrdinalIgnoreCase) /* && the date conditions on c.Value */)
    .Select(c => c.Value)
    .FirstOrDefault();
There's another concern about your code that I want to point out: memory consumption. I don't know your machine configuration, but here are some thoughts about the code:
Initially you are allocating space for 1.2M+ objects of the event_data type. I can't see the full event_data type definition, but assuming the emails are all unique, and seeing that the type has quite a few properties, such a collection is rather heavy (possibly hundreds of megabytes).
Next you are allocating another batch of event_data objects (almost 1M, if I've counted right), which makes things even heavier in terms of memory consumption.
I don't know about the other objects in your application's data model, but considering everything mentioned above, you can easily get close to the memory limit for a 32-bit process and thus force the GC to run very often. In fact, you could easily trigger a GC collection after each call to bounced.Remove(bounce), and that would significantly slow down your app.
So even if you have plenty of memory left and/or your app is 64-bit, I would try to minimize memory consumption; I'm pretty sure it would make your code run faster. For example, you could completely process each deliveredEmail without storing it, or load your initial event_data in chunks, etc.
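A sketch of the "process without storing" idea; IsBounced and WriteDeliveredRecord are hypothetical placeholders for the matching logic and the output sink, not APIs from the question:
foreach (event_data sentEmail in sent)
{
    if (!IsBounced(sentEmail))           // hypothetical: the bounce-matching check
    {
        WriteDeliveredRecord(sentEmail); // hypothetical: persist immediately
    }
    // nothing accumulates, so the delivered list never grows in memory
}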
On consideration, the number of bounces is relatively small, so why not pre-optimize the bounce lookup as much as possible? This code makes a delegate for each possible bounce and groups them into a dictionary accessed by the e-mail key.
private static bool DateInRange(
DateTime sentDate,
DateTime bouncedDate,
int deliveredCalcDelayDays)
{
// the bounce must fall after the send...
if (bouncedDate <= sentDate)
{
return false;
}
// ...and within the allowed window after it (matching the question's criteria)
return bouncedDate < sentDate.AddDays(deliveredCalcDelayDays);
}
static IEnumerable<event_data> GetDeliveredMails(
IEnumerable<event_data> sent,
IEnumerable<event_data> bounced,
int siteId,
int mlId,
int mId,
int deliveredCalcDelayDays)
{
var grouped = bounced.GroupBy(
b => b.email.ToLowerInvariant());
var lookup = grouped.ToDictionary(
g => g.Key,
g => g.OrderBy(e => e.event_date).Select(
e => new Func<DateTime, bool>(
s => DateInRange(s, e.event_date, deliveredCalcDelayDays))).ToList());
foreach (var s in sent)
{
var key = s.email.ToLowerInvariant();
List<Func<DateTime, bool>> checks;
if (lookup.TryGetValue(key, out checks))
{
var match = checks.FirstOrDefault(c => c(s.event_date));
if (match != null)
{
checks.Remove(match);
continue;
}
}
yield return new event_data
{
sid = siteId,
mlid = mlId,
mid = mId,
email = s.email,
event_date = s.event_date,
event_status = "Delivered",
event_type = "Delivered",
id = s.id,
number = s.number,
laststoretransaction = s.laststoretransaction
};
}
}
You could try pre-compiling the delegates in lookup if this is not fast enough.
OK, the final solution I found was a Dictionary for the bounces.
The sent LinkedList was sorted by sent_date so it would loop through in chronological order. That's important because I have to match the right send to the right bounce.
I made a Dictionary<string, List<event_data>>, so the key was the email, and the value was a List of all event_data bounces for that email address. The List was sorted by event_date, since I wanted to make sure the first bounce was matched to the send.
Final result...it went from processing 700 records/minute to 500k+ records/second.
Here is the final code:
LinkedList<event_data> sent = GetMyHugeListOfSends();
IEnumerable<event_data> sentOrdered = sent.OrderBy(send => send.event_date);
Dictionary<string, List<event_data>> bounced = GetMyListOfBouncesAsDictionary();
List<event_data> delivered = new List<event_data>();
event_data deliveredEmail = new event_data();
List<event_data> bounces = null;
bool matchedBounce = false;
foreach (event_data sentEmail in sentOrdered)
{
matchedBounce = false;
//create delivered records
if (bounced.TryGetValue(sentEmail.email, out bounces))
{
//there was a bounce! find out if it was within 4 days after the send!
foreach (event_data bounce in bounces)
{
if (bounce.event_date > sentEmail.event_date &&
bounce.event_date <= sentEmail.event_date.AddDays(4))
{
matchedBounce = true;
//remove the record because a bounce can only match once back to a send
bounces.Remove(bounce);
if(bounces.Count == 0) //no more bounces for this email
{
bounced.Remove(sentEmail.email);
}
break;
}
}
if (matchedBounce == false) //no matching bounces in the list!
{
//if sent is not bounced, it's delivered
deliveredEmail.sid = siteid;
deliveredEmail.mlid = mlid;
deliveredEmail.mid = mid;
deliveredEmail.email = sentEmail.email;
deliveredEmail.event_date = sentEmail.event_date;
deliveredEmail.event_status = "Delivered";
deliveredEmail.event_type = "Delivered";
deliveredEmail.id = sentEmail.id;
deliveredEmail.number = sentEmail.number;
deliveredEmail.laststoretransaction = sentEmail.laststoretransaction;
delivered.Add(deliveredEmail); //add the new delivered
deliveredEmail = new event_data();
}
}
else
{
//if sent is not bounced, it's delivered
deliveredEmail.sid = siteid;
deliveredEmail.mlid = mlid;
deliveredEmail.mid = mid;
deliveredEmail.email = sentEmail.email;
deliveredEmail.event_date = sentEmail.event_date;
deliveredEmail.event_status = "Delivered";
deliveredEmail.event_type = "Delivered";
deliveredEmail.id = sentEmail.id;
deliveredEmail.number = sentEmail.number;
deliveredEmail.laststoretransaction = sentEmail.laststoretransaction;
delivered.Add(deliveredEmail); //add the new delivered
deliveredEmail = new event_data();
}
if (bounced.Count() == 0)
{
break; //no more bounces to match!
}
}
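The duplicated delivered-record block in the two branches could be factored into a small helper; a sketch using the same field names as the code above:
private static event_data CreateDelivered(event_data sentEmail, int siteid, int mlid, int mid)
{
    return new event_data
    {
        sid = siteid,
        mlid = mlid,
        mid = mid,
        email = sentEmail.email,
        event_date = sentEmail.event_date,
        event_status = "Delivered",
        event_type = "Delivered",
        id = sentEmail.id,
        number = sentEmail.number,
        laststoretransaction = sentEmail.laststoretransaction
    };
}
Both branches then reduce to delivered.Add(CreateDelivered(sentEmail, siteid, mlid, mid));.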
There are two parts to my question: how to do it, and whether it's good style.
TestDatabaseEntities is the DbContext item being passed.
myQuery is the query being executed and read.
StatusMessage is passed out of the function so the UI can report whether the operation was a success.
Records is what I need help figuring out: how to pass each record back to the caller of ReadMan().
I was thinking that passing back a 2-D string array would benefit me, because I'd have the data read by this DataAccess class and ready to display in my UserInterface class. But therein lies the problem.
To declare the string[,], I would need to know its size, or give it dimensions implicitly like Records[,] = new string[,] { { FirstField, SecondField }, { FirstField, SecondField } ... } and so on. But I can't, because the first record being read in doesn't give the information needed to tell Records how large its second index is supposed to be ([, ThisIndex]). If I had this working, I'd pass the 2-D Records back for the UserInterface class to display to the user.
Why have this Read function? Because I'm supposed to separate EF functions from the UI, right?
public class DataAccess
{
public bool ReadMan(TestDatabaseEntities dbEntities, IQueryable myQuery, out string StatusMessage, out string[,] Records)
{
string ErrorMessage;
bool bSuccessful;
string[] ThisRecord;
bSuccessful = TryDataBaseAction(dbEntities, out ErrorMessage,
() =>
{
foreach (Man m in myQuery)
{
// was thinking ThisRecord could be temporary storage but doesn't seem to work to my benefit.
ThisRecord = new string[] { m.ManID.ToString(), m.Name };
}
});
if (bSuccessful)
StatusMessage = "Records read successfully";
else
StatusMessage = ErrorMessage;
return bSuccessful;
}
public bool TryDataBaseAction(TestDatabaseEntities MyDBEntities, out string ErrorMessage, Action MyDBAction)
{
UserInterface MyUI = new UserInterface();
try
{
MyDBAction();
ErrorMessage = "No Error";
return true;
}
catch (Exception e)
{
ErrorMessage = e.ToString();
return false;
}
}
}
EDIT: FIXED
public bool ReadMan(TestDatabaseEntities dbEntities, IQueryable myQuery, out string StatusMessage, out string[,] Records)
{
string ErrorMessage;
bool bSuccessful;
string[,] TheseRecords = null;
// hands an Action() to TryDataBaseAction, as indicated by the lambda expression in the 3rd argument.
bSuccessful = TryDataBaseAction(dbEntities, out ErrorMessage,
() =>
{
List<Man> men = myQuery.OfType<Man>().ToList();
TheseRecords = new string[men.Count, 2];
for (int i = 0; i < men.Count; i++)
{
TheseRecords[i, 0] = men[i].ManID.ToString();
TheseRecords[i, 1] = men[i].Name;
}
});
Records = TheseRecords;
if (bSuccessful)
StatusMessage = "Records read successfully";
else
StatusMessage = ErrorMessage;
return bSuccessful;
}
Does this help?
bSuccessful = TryDataBaseAction(dbEntities, out ErrorMessage,
() =>
{
List<Man> men = myQuery.OfType<Man>().ToList();
Records = new string[men.Count, 2];
for (int i = 0; i < men.Count; i++)
{
Records[i, 0] = men[i].ManID.ToString();
Records[i, 1] = men[i].Name;
}
});
The key thing here is to convert myQuery into a list so that we can get access to its Count property. Once we have that, it is straightforward to create the Records array. (Note that, as your edit shows, an out parameter can't be used inside the lambda, so the array has to be built in a local variable first.)
Whether doing so is good style or not is more subjective and really depends on how your application is architected. Generally speaking I have the data access layer charged with merely executing queries and retrieving data - it is left to the user interface to take care of the visual representation.
Applying this approach to your specific case, it may have been easier if the ReadMan method took a List or some other collection as the out parameter rather than the 2-D array. That way the calling method has an easy time creating the desired representation (you know how many items you are dealing with), plus you avoid having user-interface details creeping into the data access layer.
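For example, a sketch of that List-based signature, reusing the names from the question; the list is built in a local because an out parameter can't be used inside the lambda:
public bool ReadMan(TestDatabaseEntities dbEntities, IQueryable myQuery, out string StatusMessage, out List<string[]> Records)
{
    string ErrorMessage;
    var records = new List<string[]>();
    bool bSuccessful = TryDataBaseAction(dbEntities, out ErrorMessage,
        () =>
        {
            foreach (Man m in myQuery.OfType<Man>())
                records.Add(new[] { m.ManID.ToString(), m.Name });
        });
    Records = records;
    StatusMessage = bSuccessful ? "Records read successfully" : ErrorMessage;
    return bSuccessful;
}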
You may be doing this already, but the other thing to consider is how your current approach deals with very large amounts of data. We are materializing everything retrieved by myQuery, so if you want to implement some form of paging, it will need to manifest itself in myQuery.