PerformDataBinding, extract row count from ObjectDataSource - c#

I have a custom GridView which automatically puts the row count from SqlDataSources into grids for me. It computes that count in the code below. Note that this question relates to the custom inherited GridView control, not page-level stuff.
How do I recognise in PerformDataBinding that the "IEnumerable" thing is an ObjectDataSource? I want to find out specifically what ObjectDataSource type it is, then call its "get total row count" function.
The reason is that the total row count is (say) millions, whereas at the moment the ICollection clause returns the count of just what has been retrieved from the database, which is typically "one page" of data, so (say) 20 records, not 20,000,000!
I only have a couple of specific ObjectDataSource types, so I could pick them out one by one if I knew how to find their names from this IEnumerable thing.
I have reviewed this answer:
How to get row count of ObjectDataSource
but I don't know how to work out which precise BLL I'm dealing with. The debugger has lots of stuff inside this object, but I can't see what I want there.
protected override void PerformDataBinding(IEnumerable data)
{
    // This does not work for my Object Data Sources, which return one page of
    // records only, not the whole set. There must however be a way...
    if (data is IListSource)
    {
        IListSource list = (IListSource)data;
        rowcount = list.GetList().Count;
    }
    else if (data is ICollection)
    {
        ICollection collection = (ICollection)data;
        rowcount = collection.Count;
    }
    base.PerformDataBinding(data);
}

Just enumerate without casting.
protected override void PerformDataBinding(IEnumerable data)
{
    var enumerator = data.GetEnumerator();
    int count = 0;
    while (enumerator.MoveNext())
    {
        count++;
    }
    this.TotalRecordCount = count;
    base.PerformDataBinding(data);
}
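If the goal is the total number of rows in the database rather than the size of the page that was fetched, another option is to ask the data source itself instead of the IEnumerable it handed you. A rough sketch, assuming the custom control derives from DataBoundControl (so the protected GetDataSource() is available) and that your BLL classes expose a static count method; the BLL type and method names below are hypothetical placeholders:

protected override void PerformDataBinding(IEnumerable data)
{
    // GetDataSource() returns the IDataSource the control is bound to;
    // for declarative binding this is the ObjectDataSource declared in markup.
    ObjectDataSource ods = GetDataSource() as ObjectDataSource;
    if (ods != null)
    {
        // TypeName identifies which BLL class the ObjectDataSource wraps.
        switch (ods.TypeName)
        {
            case "MyApp.BLL.CustomerProvider":   // hypothetical BLL type
                rowcount = MyApp.BLL.CustomerProvider.GetTotalRowCount();
                break;
            case "MyApp.BLL.OrderProvider":      // hypothetical BLL type
                rowcount = MyApp.BLL.OrderProvider.GetTotalRowCount();
                break;
        }
    }
    else if (data is ICollection)
    {
        rowcount = ((ICollection)data).Count;
    }
    base.PerformDataBinding(data);
}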

Related

Unity Firebase Realtime Database - Get index of child in database

I am attempting to retrieve the index of an entry in my database. I can do this by ordering the data, then grabbing all of the data in my database and counting the elements up to the entry whose index I want to find. However, this is not a viable solution once the database goes online.
I have tried to use StartAt but it returns Season1 and null
databaseReference.Child("Leaderboards").Child("Season1").StartAt("Player0").GetValueAsync().ContinueWith(task =>
{
print(task.Result.Key);
print(task.Result.GetRawJsonValue());
});
I have also tried to use EndAt but it returns all of the data in Season1
databaseReference.Child("Leaderboards").Child("Season1").EndAt("Player0").GetValueAsync().ContinueWith(task =>
{
print(task.Result.Key);
print(task.Result.GetRawJsonValue());
});
I have also added .OrderByValue() after .Child("Season1") which works fine in foreach(DataSnapshot dataSnapshot in task.Result.Children) but without limiting the data received, there is no point.
Perhaps, I need to restructure my database (Which I am fully willing to do as it does not hold actual data yet) but I do not know what format would fit my needs.
The database being worked on (screenshot omitted).
From what I have found out through extensive testing, the functions StartAt() and EndAt() for Firebase Realtime Database only search by value and not key.
In my case, all of my values are integers, so my StartAt() and EndAt() calls must be given an integer value to filter the data. Perhaps you can send a JSON string to filter objects, but that requires further testing, as the documentation is nonexistent.
The best solution to find an index while retrieving as little data as possible would be to loop through the database searching for the key.
A few relevant things that I discovered during my testing are:
.OrderByValue() returns items in low-to-high order (0-100, and presumably a-z) and this cannot be changed
.StartAt(int value) will begin showing all values starting at the given value (inclusive)
.EndAt(int value) will stop showing values at the given value; oddly enough, this is also inclusive
.LimitToLast(int count) will only retrieve the last count elements from the previous query, which in my case is safer to use than combining .StartAt() with .EndAt(), as those can return an unknown amount of data
.LimitToLast(100).LimitToLast(5) and .LimitToLast(100).LimitToFirst(5) both return the same 5 last values
There may be some cases when using .EndAt(int value).LimitToLast(int count) does not return all of the elements at value, which can prevent a recursive function from progressing. Because stacking .LimitToLast().LimitToFirst/Last() only returns the last values, you cannot use it to iterate through the elements in the database. In this case, you can use .StartAt(int value).EndAt(int value) to obtain all elements with that value, which could unfortunately return up to the entire database. This issue would happen at the start of a season when all scores are 0. A possible workaround is to check whether the current element is 0 (the lowest possible value) and then return the count of higher elements.
My example will be run inside an IEnumerator (coroutine) instead of .GetValueAsync().ContinueWith(task => { });, since I am updating on-screen elements.
Rough example:
bool hasData = false;
int index = 0;
// NOTE: endAtValue is assumed to be initialised elsewhere in the full implementation
// (e.g. to the lowest score seen in the first batch); it is the value the next query pages back from.
int endAtValue = 0;
//start getting baseData
System.Threading.Tasks.Task<DataSnapshot> getLeaderboardTask = databaseReference.Child(DB_LEADERBOARD).Child(season).OrderByValue().LimitToLast(100).GetValueAsync();
//wait for data to be loaded
while (!getLeaderboardTask.IsCompleted) yield return null;
//Check for player value
foreach (DataSnapshot snap in getLeaderboardTask.Result.Children)
{
    if (snap.Key.Equals(playerName))
    {
        hasData = true;
        //Do something with the data - Remember to get the index, you should use
        //((int)getLeaderboardTask.Result.ChildrenCount) - index because Firebase counts in ascending order
        break;
    }
    index++;
}
//check if we need to iterate deeper
if (!hasData)
{
    //get new task
    getLeaderboardTask = databaseReference.Child(DB_LEADERBOARD).Child(season).OrderByValue().EndAt(endAtValue).LimitToLast(100).GetValueAsync();
    //continue searching while data hasn't been found
    while (!hasData)
    {
        //Wait for current data
        while (!getLeaderboardTask.IsCompleted) yield return null;
        #region Check if we have the player ranks yet
        //get global Data
        if (!hasData)
        {
            int lastRank = 0;
            //loop through the data that we just grabbed to look for the player global rank
            foreach (DataSnapshot snap in getLeaderboardTask.Result.Children)
            {
                if (snap.Key.Equals(playerName))
                {
                    hasData = true;
                    //Do something with the data - Remember to get the index, you should use
                    //((int)getLeaderboardTask.Result.ChildrenCount) - index because Firebase counts in ascending order
                    break;
                }
                lastRank = int.Parse(snap.GetRawJsonValue());
                index++;
            }
            //we did not find the player yet, look for them 1 step deeper
            if (!hasData)
            {
                //we are caught in a loop, unable to look deeper, so search through all elements at this value
                if (endAtValue == lastRank)
                {
                    getLeaderboardTask = databaseReference.Child(DB_LEADERBOARD).Child(season).OrderByValue().StartAt(endAtValue).EndAt(endAtValue).GetValueAsync();
                    //decrement endAtValue in case we do not find it at the current value
                    endAtValue--;
                }
                else
                {
                    endAtValue = lastRank;
                    //we are not in a loop and can continue searching at lastRank
                    getLeaderboardTask = databaseReference.Child(DB_LEADERBOARD).Child(season).OrderByValue().EndAt(endAtValue).LimitToLast(100).GetValueAsync();
                }
            }
        }
        #endregion
    }
}
Not getting any query results
In your code you have an uppercase P in Player0, but in the database all keys use a lowercase p like player0. That might explain why your reads are not giving results, as queries in Firebase are case-sensitive.
Getting the index of a node
I can [get the index] by ordering the data then grabbing all of the data in my database and counting the elements up to the entry that I want to find the index of. However, this is not a viable solution when the database goes online.
Unfortunately, counting the nodes before it is the only way to get the index of a node in the Firebase Realtime Database. There is nothing built into the database to get the index of a node, so even if it exposed an API for this, it would essentially be doing the same thing internally.
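For completeness, a minimal sketch of that counting approach inside a Unity coroutine, assuming the same Leaderboards/Season1 structure shown above (the player key is a placeholder):

// Minimal sketch: order the season, then count children until the target key.
// This retrieves the whole season, so it is only reasonable for small data sets.
var indexTask = databaseReference.Child("Leaderboards").Child("Season1")
    .OrderByValue().GetValueAsync();
while (!indexTask.IsCompleted) yield return null;

int index = 0;
bool found = false;
foreach (DataSnapshot snap in indexTask.Result.Children)
{
    if (snap.Key.Equals("player0")) { found = true; break; }
    index++;
}
// If found, "index" is the 0-based position in ascending score order.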

Create a copy of IEnumerable<T> to modify collection from different threads?

I am using a third-party library which uses its own custom data model. The hierarchy of the data model is as below:
Model
---Tables(type of Table)
-----Rows(type of Row)
-------Cells( type of Cell)
Table has a Rows property, much like DataTable, and I have to access this property from more than one task. I need to get a row from the table which has a specified value in a particular column.
To do this, I have created a method with a lock statement, so it can be accessed by only one thread at a time.
public static Row GetRowWithColumnValue(Model model, string tableKey, string indexColumnKey, string indexColumnValue)
{
    Row simObj = null;
    lock (syncRoot)
    {
        SimWrapperFromValueFactory wrapperSimSystem = new SimWrapperFromValueFactory(model, tableKey, indexColumnKey);
        simObj = wrapperSimSystem.GetWrapper(indexColumnValue);
    }
    return simObj;
}
To create the lookup for one of the columns in the Table, I have created a method which always tries to create a copy of the rows to avoid the "collection modified" exception:
Private Function GetTableRows(table As Table) As List(Of Row)
    Dim rowsList As New List(Of Row)(table.Rows) 'Case 1
    'rowsList.AddRange(table.Rows) 'Case 2
    'Case 3
    'For i As Integer = 0 To table.Rows.Count - 1
    '    rowsList.Add(table.Rows.ElementAt(i))
    'Next
    Return rowsList
End Function
but other threads can modify the table (e.g. add or remove rows, or update a column value in any row). I am getting the "collection modified" exception below:
at System.ThrowHelper.ThrowInvalidOperationException(ExceptionResource resource)
at System.Collections.Generic.List`1.Enumerator.MoveNextRare()
at System.Collections.Generic.List`1.InsertRange(Int32 index, IEnumerable`1 collection)
I cannot modify this third-party library to use concurrent collections, and the same data model is shared between multiple projects.
Question: I am hunting for a solution that allows multiple readers on this collection even while it is being modified by other threads. Is it possible to get a copy of the collection without getting an exception?
Referenced below SO threads but did not find exact solution:
Lock vs. ToArray for thread safe foreach access of List collection
Can ToArray() throw an exception?
Is returning an IEnumerable<> thread-safe?
The simplest solution is to retry on exception, like this:
private List<Row> CopyVolatileList(IEnumerable<Row> original)
{
    while (true)
    {
        try
        {
            List<Row> copy = new List<Row>();
            foreach (Row row in original)
            {
                copy.Add(row);
            }
            // Validate.
            if (copy.Count != 0 && copy[copy.Count - 1] == null) // Assuming Row is a reference type.
            {
                // At least one element was removed from the list while we were copying.
                continue;
            }
            return copy;
        }
        catch (InvalidOperationException)
        {
            // Check ex.Message?
        }
        // Keep trying.
    }
}
Eventually you'll get a run where the exception isn't thrown and the data integrity validation passes.
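A hypothetical usage of that helper in the context of the question's GetTableRows, just to show where the retry-based copy slots in (Table and Row come from the third-party model):

private List<Row> GetTableRows(Table table)
{
    // Take a retry-validated snapshot; the live table.Rows can keep changing.
    List<Row> rowsSnapshot = CopyVolatileList(table.Rows);
    return rowsSnapshot;
}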
Alternatively, you can dive deep (and I mean very, very deep).
DISCLAIMER: Never ever use this in production. Unless you're desperate and really have no other option.
So we've established that you're working with a custom collection (TableRowCollection) which ultimately uses List<Row>.Enumerator to iterate through the rows. This strongly suggests that your collection is backed by a List<Row>.
First things first, you need to get a reference to that list. Your collection will not expose it publicly, so you'll need to fiddle a bit. You will need to use Reflection to find and get the value of the backing list. I recommend looking at your TableRowCollection in the debugger. It will show you non-public members and you will know what to reflect.
If you can't find your List<Row>, then take a closer look at TableRowCollection.GetEnumerator() - specifically GetEnumerator().GetType(). If that returns List<Row>.Enumerator, then bingo: we can get the backing list out of it, like so:
List<Row> list;
using (IEnumerator<Row> enumerator = table.GetEnumerator())
{
    list = (List<Row>)typeof(List<Row>.Enumerator)
        .GetField("list", BindingFlags.Instance | BindingFlags.NonPublic)
        .GetValue(enumerator);
}
If the above methods of getting your List<Row> have failed, there is no need to read further. You might as well give up.
In case you've succeeded, now that you have the backing List<Row>, we'll have to look at Reference Source for List<T>.
What we see is 3 fields being used:
private T[] _items;
private int _size; // Accessible via "Count".
private int _version;
Our goal is to copy the items whose indexes are between zero and _size - 1 from the _items array into a new array, and to do so in between _version changes.
Observations re thread safety: List<T> does not use locks, none of the fields are marked as volatile and _version is incremented via ++, not Interlocked.Increment. Long story short this means that it is impossible to read all 3 field values and confidently say that we're looking at stable data. We'll have to read the field values repeatedly in order to be somewhat confident that we're looking at a reasonable snapshot (we will never be 100% confident, but you might choose to settle for "good enough").
using System;
using System.Collections.Generic;
using System.Linq.Expressions;
using System.Reflection;
using System.Threading;

private Row[] CopyVolatileList(List<Row> original)
{
    while (true)
    {
        // Get _items and _size values which are safe to use in tandem.
        int version = GetVersion(original); // _version.
        Row[] items = GetItems(original);   // _items.
        int count = original.Count;         // _size.
        if (items.Length < count)
        {
            // Definitely a torn read. Copy will fail.
            continue;
        }
        // Copy.
        Row[] copy = new Row[count];
        Array.Copy(items, 0, copy, 0, count);
        // Stabilization window.
        Thread.Sleep(1);
        // Validate.
        if (version == GetVersion(original))
        {
            return copy;
        }
        // Keep trying.
    }
}

static Func<List<Row>, int> GetVersion = CompilePrivateFieldAccessor<List<Row>, int>("_version");
static Func<List<Row>, Row[]> GetItems = CompilePrivateFieldAccessor<List<Row>, Row[]>("_items");

static Func<TObject, TField> CompilePrivateFieldAccessor<TObject, TField>(string fieldName)
{
    ParameterExpression param = Expression.Parameter(typeof(TObject), "o");
    MemberExpression fieldAccess = Expression.PropertyOrField(param, fieldName);
    return Expression
        .Lambda<Func<TObject, TField>>(fieldAccess, param)
        .Compile();
}
Note re stabilization window: the bigger it is, the more confidence you have that you're not dealing with a torn read (because the list is in the process of modifying all 3 fields). I've settled on the smallest value I could not make fail in my tests, where I called CopyVolatileList in a tight loop on one thread and used another thread to add items to the list, remove them, or clear the list at random intervals between 0 and 20ms.
If you remove the stabilization window, you will occasionally get a copy with uninitialized elements at the end of the array because the other thread has removed a row while you were copying - that's why it's needed.
You should obviously validate the copy once it's built, to the best of your ability (at least check for uninitialized elements at the end of the array in case the stabilization window fails).
Good luck.

WPF CollectionChanged Event OldItems.Count

I have an ObservableCollection and I attach to the CollectionChanged event:
void OnCollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
{
    if (e.Action == NotifyCollectionChangedAction.Add)
    {
        for (int i = 0; i < e.NewItems.Count; i++)
        {
            int id = -1 * i;
            DocumentWatchList d = (DocumentWatchList)e.NewItems[i];
            d.UID = id;
            _dataDc.DocumentWatchLists.InsertOnSubmit(d);
        }
    }
    else if (e.Action == NotifyCollectionChangedAction.Remove)
    {
        for (int i = 0; i < e.OldItems.Count; i++)
        {
            DocumentWatchList d = (DocumentWatchList)e.OldItems[i];
            _dataDc.DocumentWatchLists.DeleteOnSubmit(d);
        }
    }
    _dataDc.SubmitChanges();
}
My collection is bound to a datagrid, and the viewmodel code (above) gets called as expected. When I select multiple rows and hit delete, I expect the OldItems collection to contain the number of rows I had selected (n). However, what actually happens is that the event gets called n times, and each time the OldItems collection count is 1. So under what conditions will the OldItems collection contain more than one item? Is this behavior coming from the datagrid control, or is it the way ObservableCollection.CollectionChanged is meant to work?
For some reason, ObservableCollection doesn't support a NotifyCollectionChanged event with multiple items.
You can do this:
OnCollectionChanged(
new NotifyCollectionChangedEventArgs(
NotifyCollectionChangedAction.Remove, singleItem));
But not this: (you'll get a NotSupportedException)
OnCollectionChanged(
new NotifyCollectionChangedEventArgs(
NotifyCollectionChangedAction.Remove, multipleItems));
I can't think of any scenario where e.OldItems.Count would be greater than 1.
Here is a good article about it, where someone actually implemented the handling of multiple items themselves, for performance purposes.
If you take a look at the interface that ObservableCollection offers you, you already know what you will get:
https://msdn.microsoft.com/en-us/library/ms668604%28v=vs.110%29.aspx
Basically it does not offer any way to insert or remove multiple items at once. So this effectively means that you can clear the whole collection, but if you need to, say, remove 2 items while there are 6 in the collection, you will have to remove them one by one. This is what the datagrid will do in your case, but if you were to implement your own datagrid, you would be forced to do it the same way.
To make the answer complete, I must add that there is a way to get multiple items in the deleted list, but only by clearing the collection.
Replacing an item in the collection is also possible by using the index operator; you can even replace an item with itself. I've tried this and it works: you will get the same item in the deleted collection and the inserted collection in that case, but again one by one.
You can of course create your own implementation of an observable collection that would solve these issues. But I think you would need a different implementation of the datagrid too, one that would use your custom new bulk insert or bulk delete operations.
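As a sketch of that idea (not part of the framework; the class name and the decision to raise a Reset are my own assumptions), a bulk-remove collection might look roughly like this. Raising Reset avoids the NotSupportedException that WPF's collection views throw for multi-item Remove events, at the cost of losing the per-item information in OldItems:

using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.ComponentModel;
using System.Linq;

public class BulkObservableCollection<T> : ObservableCollection<T>
{
    public void RemoveRange(IEnumerable<T> itemsToRemove)
    {
        // Modify the protected backing list directly so no per-item events fire.
        foreach (T item in itemsToRemove.ToList())
        {
            Items.Remove(item);
        }
        // Tell binders that everything may have changed.
        OnPropertyChanged(new PropertyChangedEventArgs("Count"));
        OnPropertyChanged(new PropertyChangedEventArgs("Item[]"));
        OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
    }
}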

Looping on IEnumerator<T>, Any Suggestions

I am having a situation where looping through the result of LINQ is getting on my nerves. Well here is my scenario:
I have a DataTable, that comes from database, from which I am taking data as:
var results = from d in dtAllData.AsEnumerable()
              select new MyType
              {
                  ID = d.Field<Decimal>("ID"),
                  Name = d.Field<string>("Name")
              };
After doing the order by depending on the sort order as:
if(orderBy != "")
{
string[] ord = orderBy.Split(' ');
if (ord != null && ord.Length == 2 && ord[0] != "")
{
if (ord[1].ToLower() != "desc")
{
results = from sorted in results
orderby GetPropertyValue(sorted, ord[0])
select sorted;
}
else
{
results = from sorted in results
orderby GetPropertyValue(sorted, ord[0]) descending
select sorted;
}
}
}
The GetPropertyValue method is as:
private object GetPropertyValue(object obj, string property)
{
    System.Reflection.PropertyInfo propertyInfo = obj.GetType().GetProperty(property);
    return propertyInfo.GetValue(obj, null);
}
After this I am taking out 25 records for first page like:
results = from sorted in results
.Skip(0)
.Take(25)
select sorted;
So far things are going well. Now I have to pass these results to a method which does some manipulation on the data and returns the desired data. In this method, when I loop over these 25 records, it takes a long time. My method definition is:
public MyTypeCollection GetMyTypes(IEnumerable<MyType> myData, String dateFormat, String offset)
I have tried foreach and it takes around 8-10 seconds on my machine; the time is spent at this line:
foreach(var _data in myData)
I tried a while loop and it does the same thing. I used it like:
var enumerator = myData.GetEnumerator();
while (enumerator.MoveNext())
{
    var item = enumerator.Current;
    Console.WriteLine(item);
}
This piece of code is taking time at MoveNext
Then I tried a for loop:
int length = myData.Count();
for (int i = 0; i < 25; i++)
{
    var temp = myData.ElementAt(i);
}
This code is taking time at ElementAt
Can anyone please guide me on what I am doing wrong? I am using Framework 3.5 in VS 2008.
Thanks in advance
EDIT: I suspect the problem is in how you're ordering. You're using reflection to first fetch and then invoke a property for every record. Even though you only want the first 25 records, it has to call GetPropertyValue on all the records first, in order to order them.
It would be much better if you could do this without reflection at all... but if you do need to use reflection, at least call Type.GetProperty() once instead of for every record.
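A rough sketch of that suggestion, applied to the ordering code from the question (still reflection-based, just with the property looked up once instead of once per record):

// Look the property up a single time, outside the ordering pipeline.
System.Reflection.PropertyInfo propertyInfo = typeof(MyType).GetProperty(ord[0]);
if (ord[1].ToLower() != "desc")
{
    results = results.OrderBy(x => propertyInfo.GetValue(x, null));
}
else
{
    results = results.OrderByDescending(x => propertyInfo.GetValue(x, null));
}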
(In some ways this is more to do with helping you diagnose the problem more easily than a full answer as such...)
As Henk said, this is very odd:
results = from sorted in results
.Skip(0)
.Take(25)
select sorted;
You almost certainly really just want:
results = results.Take(25);
(Skip(0) is pointless.)
It may not actually help, but it will make the code simpler to debug.
The next problem is that we can't actually see all your code. You've written:
After doing the order by depending on the sort order
... but you haven't shown how you're performing the ordering.
You should show us a complete example going from DataTable to its use.
Changing how you iterate over the sequence will not help - it's going to do the same thing either way, really - although it's surprising that in your last attempt, Count() apparently works quickly. Stick to the foreach - but work out exactly what that's going to be doing. LINQ uses a lot of lazy evaluation, and if you've done something which makes that very heavy going, that could be the problem. It's hard to know without seeing the whole pipeline.
The problem is that your "results" IEnumerable isn't actually being evaluated until it is passed into your method and enumerated. That means that the whole operation, getting all the data from dtAllData, selecting out the new type (which is happening on the whole enumerable, not just the first 25), and then finally the take 25 operation, are all happening on the first enumeration of the IEnumerable (foreach, while, whatever).
That's why your method is taking so long. It's actually doing some of the work defined elsewhere inside the method. If you want that to happen before your method, you could do a "ToList()" prior to the method.
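For instance (a sketch using the names from the question), materialising the page before the call moves the cost out of GetMyTypes:

// Force the select/order/take pipeline to run once, up front.
List<MyType> pageData = results.ToList();
// The method now iterates an in-memory list of 25 items.
MyTypeCollection result = GetMyTypes(pageData, dateFormat, offset);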
You might find it easier to adopt a hybrid approach;
In order:
1) Sort your datatable in-situ. It's probably best to do this at the database level, but, if you can't, then DataTable.DefaultView.Sort is pretty efficient:
dtAllData.DefaultView.Sort = ord[0] + " " + ord[1];
This assumes that ord[0] is the column name, and ord[1] is either ASC or DESC
2) Page through the DefaultView by index:
int pageStart = 0;
List<DataRowView> pageRows = new List<DataRowView>();
for (int i = pageStart; i < dtAllData.DefaultView.Count; i++)
{
    // Exit once we have a full page of 25 rows (the loop condition handles the end of the rows).
    if (i >= pageStart + 25) { break; }
    pageRows.Add(dtAllData.DefaultView[i]);
}
...and create your objects from this much smaller list... (I've assumed the columns are called Id and Name, as well as the types)
List<MyType> myObjects = new List<MyType>();
foreach (DataRowView pageRow in pageRows)
{
    myObjects.Add(new MyType() { Id = Convert.ToInt32(pageRow["Id"]), Name = Convert.ToString(pageRow["Name"]) });
}
You can then proceed with the rest of what you were doing.

LINQ: find all checked checkboxes in a GridView

Consider the current algorithm below that iterates through a GridView's rows to find whether the contained Checkbox is selected/checked.
List<int> checkedIDs = new List<int>();
foreach (GridViewRow msgRow in messagesGrid.Rows)
{
    CheckBox chk = (CheckBox)msgRow.FindControl("chkUpdateStatus");
    if (chk.Checked)
    {
        //we want the GridViewRow's DataKey value
        checkedIDs.Add(int.Parse(messagesGrid.DataKeys[msgRow.RowIndex].Value.ToString()));
    }
}
This works as expected: you're left with a fully populated List<int>.
Question: How would you or could you re-write or improve this algorithm using LINQ to search the GridView for all the rows who have their Checkbox selected/checked?
I'm pretty sure you're not going to get any performance improvement from this, but it might make it slightly easier to read:
var checkedIDs = from GridViewRow msgRow in messagesGrid.Rows
                 where ((CheckBox)msgRow.FindControl("chkUpdateStatus")).Checked
                 select Int32.Parse(messagesGrid.DataKeys[msgRow.RowIndex].Value.ToString());
Again, not sure it makes a difference. Also, why are you converting to a string then to an int? Is there something Convert.ToInt32 can't do for you?
I am not sure if Rows is IEnumerable<GridViewRow> (it may not be), but I am going to assume it can be treated as one:
List<int> checkedIDs = messagesGrid.Rows
    .Cast<GridViewRow>()
    .Where(i => ((CheckBox)i.FindControl("chkUpdateStatus")).Checked)
    .Select(i => int.Parse(messagesGrid.DataKeys[i.RowIndex].Value.ToString()))
    .ToList();
I just did this in notepad, there might be a compile error in there. But this is how you could do the same thing with Linq.
I have something similar but I was using it in more than one place so I created an extension method.
public static void ActOnCheckedRows(this GridView gridView, string checkBoxId, Action<IEnumerable<int>> action)
{
    var checkedRows = from GridViewRow msgRow in gridView.Rows
                      where ((CheckBox)msgRow.FindControl(checkBoxId)).Checked
                      select (int)gridView.DataKeys[msgRow.RowIndex].Value;
    action(checkedRows);
}
So now I can do something with all the checked rows. The compiler is pretty good at deducing the types, but occasionally I need to explicitly declare checkedRows as type IEnumerable<int>.
gvTasksToBill.ActOnCheckedRows("RowLevelCheckBox", checkedRows =>
{
    foreach (int id in checkedRows)
    {
        // do something with id
    }
});
