I have a problem joining multiple collections into one.
I need to merge collections with data from many sensors into one, so that for each time the output file has values from all sensors; e.g. if one sensor has no data for a time, the file is filled with 0 for it.
Please help me, I am desperate.
public class MeasuredData
{
public DateTime Time { get; }
public double Value { get; }
public MeasuredData(DateTime time, double value)
{
Time = time;
Value = value;
}
}
If you have multiple variables containing List<MeasuredData>, one for each sensor, you can group them in an array and then query them.
First, you need an extension method to round the DateTimes, per @jdweng, if you aren't already canonicalizing them as you acquire them.
public static DateTime Round(this DateTime dt, TimeSpan rnd) {
if (rnd == TimeSpan.Zero)
return dt;
else {
var ansTicks = dt.Ticks + Math.Sign(dt.Ticks) * rnd.Ticks / 2;
return new DateTime(ansTicks - ansTicks % rnd.Ticks);
}
}
Now you can create an array of the sensor reading Lists:
var sensorData = new[] { sensor0, sensor1, sensor2, sensor3 };
Then you can extract all the rounded times to create the left hand side of the table:
var roundTo = TimeSpan.FromSeconds(1);
var times = sensorData.SelectMany(sdl => sdl.Select(md => md.Time.Round(roundTo)))
.Distinct()
.Select(t => new { Time = t, Measurements = Enumerable.Empty<MeasuredData>() });
Then you can join each sensor to the table:
foreach (var oneSensorData in sensorData)
times = times.GroupJoin(oneSensorData, t => t.Time, md => md.Time.Round(roundTo),
(t, mdj) => new { t.Time, Measurements = t.Measurements.Concat(mdj) });
Finally, you can convert each row to the time and a List of measurements ordered by time:
var ans = times.Select(tm => new { tm.Time, Measurements = tm.Measurements.ToList() })
.OrderBy(tm => tm.Time);
If you wanted to flatten the List of measurements out to fields in the answer, you would need to do that manually with another Select.
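For instance, a hand-rolled flattening might look like this (a sketch with invented row and field names; `ElementAtOrDefault` conveniently yields 0.0 for a sensor with no reading, matching the "fill with 0" requirement):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch only: the row shape and the Sensor0/Sensor1 names are invented
// for illustration; substitute your own fixed sensor order.
var rows = new[]
{
    new { Time = new DateTime(2019, 1, 1, 0, 0, 0), Values = new List<double> { 1.5, 2.5 } },
    new { Time = new DateTime(2019, 1, 1, 0, 0, 1), Values = new List<double> { 3.5 } },
};

var flat = rows.Select(r => new
{
    r.Time,
    Sensor0 = r.Values.ElementAtOrDefault(0), // 0.0 when the sensor has no reading
    Sensor1 = r.Values.ElementAtOrDefault(1),
}).ToList();

foreach (var f in flat)
    Console.WriteLine($"{f.Time:HH:mm:ss} {f.Sensor0} {f.Sensor1}");
```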
Assuming you have something to join on, you can use Enumerable.Join:
var result = collection1.Join(collection2,
    /* whatever your join is */ x => x.id,
    y => y.id,
    (a, b) => new { x = a, y = b });
foreach (var obj in result)
{
    Console.WriteLine($"{obj.x.id}, {obj.y.id}");
}
This prints the ids of the two objects, but you could access anything on them. The link is probably more helpful, but you didn't give us much info.
What I have is a string data type which stores the duration.
I am looking for the sum of the durations, and then the average of that sum.
I am using ASP.NET MVC.
Example:
00:30:21
00:40:01
00:21:10
Model class
public DateTime? FeedbackDateTime { get; set; }
public DateTime? FeedbackSharedDateTime { get; set; }
public string AuditorAHT { get; set; }
ReportVM, used to group data and display it in the view:
public string FeedbackSharedBy { get; set; }
public int AuditCount { get; set; }
public string AudtAht { get; set; }
Controller code that saves the action performed by the auditor as a duration in:
public string AuditorAHT { get; set; }
dto.FeedbackSharedDateTime = DateTime.Now;
string ahtString = string.Format("{0:hh\\:mm\\:ss}", dto.FeedbackSharedDateTime - dto.FeedbackDateTime);
dto.AuditorAHT = ahtString;
db.SaveChanges();
The action below should display the auditors' names, counts, and average time spent. Name and count are working, but not the average time spent.
var audtName = db.Chats.Where(x => System.Data.Entity.DbFunctions.TruncateTime(x.MSTChatCreatedDateTime) >= mostRecentMonday
&& System.Data.Entity.DbFunctions.TruncateTime(x.MSTChatCreatedDateTime) <= weekEnd && x.Feedback != null && x.FeedbackSharedBy != null).Select(x => new {
x.FeedbackSharedBy,
x.AuditorAHT
}).ToList() // this hits the database
// We need to do grouping in the code (rather than the db)
// because timespans are stored as strings
.GroupBy(e => e.FeedbackSharedBy)
.Select(g => new ReportVM
{
FeedbackSharedBy = g.Key,
AuditCount = g.Count(),
AudtAht = TimeSpan.FromSeconds(g.Sum(t => TimeSpan.Parse(t.AuditorAHT).TotalSeconds / g.Count())).ToString()
})
.OrderByDescending(s => s.AuditCount).ToList();
ViewBag.AudtReport = audtName;
The above code is working for me; I managed to make it work.
You can convert the string duration into a TimeSpan and use that for time calculations. To convert it you can use TimeSpan.Parse(), or, if you have a fixed format, TimeSpan.ParseExact().
With TimeSpan you can get the totals via the various Total* properties (TotalSeconds, TotalMinutes, and so on). You can also get the internal tick count with .Ticks; that one has the highest precision.
Now it's simple math: sum of all ticks / count = average ticks.
You can pass that average tick count back into a TimeSpan to read it as .TotalMilliseconds or output it formatted with .ToString().
Here is a basic sample:
using System;
public class Program
{
public static void Main()
{
var duration1 = TimeSpan.Parse("00:30:21");
var duration2 = TimeSpan.Parse("00:40:01");
var duration3 = TimeSpan.Parse("00:21:10");
var totalDuration = duration1.Add(duration2).Add(duration3);
var averageDurationTicks = totalDuration.Ticks / 3;
var averageDuration = TimeSpan.FromTicks(averageDurationTicks);
Console.WriteLine($"Total duration: {totalDuration}, Average duration: {averageDuration}");
}
}
Here is a .Net Fiddle: https://dotnetfiddle.net/1Q9tmV
After spending a lot of time, and with help from tymtam, I made the code work (the working query is the one shown above).
If second precision is enough for you, you can combine LINQ's Sum and Average with TimeSpan's Parse and TotalSeconds:
Sum = TimeSpan.FromSeconds(g.Sum(t => TimeSpan.Parse(t.Time).TotalSeconds)),
Avg = TimeSpan.FromSeconds(g.Average(t => TimeSpan.Parse(t.Time).TotalSeconds))
Here is a full example:
var data = new []{
new { X = "A", Time = "00:30:21"},
new { X = "B", Time = "00:40:01"},
new { X = "B", Time = "00:21:10"}
};
var grouped = data
.GroupBy(e => e.X)
.Select( g => new {
X = g.Key,
Count = g.Count(),
Sum = TimeSpan.FromSeconds(g.Sum(t => TimeSpan.Parse(t.Time).TotalSeconds)),
Avg = TimeSpan.FromSeconds(g.Average(t => TimeSpan.Parse(t.Time).TotalSeconds))
});
foreach (var item in grouped)
{
Console.WriteLine( $"'{item.X}' has {item.Count} item(s), sum = {item.Sum}, avg = {item.Avg}");
}
This produces:
'A' has 1 item(s), sum = 00:30:21, avg = 00:30:21
'B' has 2 item(s), sum = 01:01:11, avg = 00:30:35.5000000
You could use TotalMilliseconds + FromMilliseconds, or even go super precise with Ticks.
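For example, a tick-precision average might look like this (a sketch; 1 tick = 100 ns, the finest resolution TimeSpan offers):

```csharp
using System;
using System.Linq;

var durations = new[] { "00:30:21", "00:40:01", "00:21:10" }
    .Select(TimeSpan.Parse)
    .ToList();

// Averaging over Ticks keeps the full 100 ns resolution; the cast back
// to long truncates only the fractional tick.
var avg = TimeSpan.FromTicks((long)durations.Average(t => t.Ticks));
Console.WriteLine(avg); // 00:30:30.6666666
```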
Variant with Aggregate
Another option is to Parse earlier:
Sum = g.Select(e => TimeSpan.Parse(e.Time)).Aggregate((t1, t2) => t1 + t2),
Avg = g.Select(e => TimeSpan.Parse(e.Time)).Aggregate((t1, t2) => t1 + t2) / g.Count()
For LINQ to Entities
As you report in your comment, if we try the above with your real code it results in LINQ to Entities does not recognize the method 'System.TimeSpan FromSeconds(Double)' method, and this method cannot be translated into a store expression (with the same for TimeSpan.Parse).
Because of this we need to do the grouping in code. This is less efficient than it would be if the duration were stored in the database as a time type rather than a string.
var grouped = db.Chats
.Where(...)
.Select( x => new {
x.FeedbackSharedBy,
x.AuditorAHT
})
.ToList() // this hits the database
// We need to do grouping in the code (rather than the db)
// because timespans are stored as strings
.GroupBy(e => e.FeedbackSharedBy)
.Select(g => new
{
FeedbackSharedBy = g.Key,
AuditCount = g.Count(),
AuditorAHTSumSeconds = TimeSpan.FromSeconds(g.Sum(t => TimeSpan.Parse(t.AuditorAHT).TotalSeconds) / g.Count())
.ToString(),
})
.OrderByDescending(s => s.AuditCount)
.ToList(); // Optional
I have a list with that each object has two fields:
Date as DateTime
Estimated as double.
I have some values like this:
01/01/2019 2
01/02/2019 3
01/03/2019 4
... and so on.
I need to generate another list, same format, but accumulating the Estimated field, date by date. So the result must be:
01/01/2019 2
01/02/2019 5 (2+3)
01/03/2019 9 (5+4) ... and so on.
Right now, I'm calculating it in a loop:
for (int iI = 0; iI < SData.TotalDays; iI++)
{
DateTime oCurrent = SData.ProjectStart.AddDays(iI);
oRet.Add(new GraphData(oCurrent, GetProperEstimation(oCurrent)));
}
Then, I can execute a Linq Sum for all the dates prior or equal to the current date:
private static double GetProperEstimation(DateTime pDate)
{
return Data.Where(x => x.Date.Date <= pDate.Date).Sum(x => x.Estimated);
}
It works, but it is absolutely slow, taking more than 1 minute for a 271-element list.
Is there a better way to do this?
Thanks in advance.
You can write a simple LINQ-like extension method that accumulates values. This version is generalized to allow different input and output types:
static class ExtensionMethods
{
public static IEnumerable<TOut> Accumulate<TIn, TOut>(this IEnumerable<TIn> source, Func<TIn,double> getFunction, Func<TIn,double,TOut> createFunction)
{
double accumulator = 0;
foreach (var item in source)
{
accumulator += getFunction(item);
yield return createFunction(item, accumulator);
}
}
}
Example usage:
public static void Main()
{
var list = new List<Foo>
{
new Foo { Date = new DateTime(2018,1,1), Estimated = 1 },
new Foo { Date = new DateTime(2018,1,2), Estimated = 2 },
new Foo { Date = new DateTime(2018,1,3), Estimated = 3 },
new Foo { Date = new DateTime(2018,1,4), Estimated = 4 },
new Foo { Date = new DateTime(2018,1,5), Estimated = 5 }
};
var accumulatedList = list.Accumulate
(
(item) => item.Estimated, //Given an item, get the value to be summed
(item, sum) => new { Item = item, Sum = sum } //Given an item and the sum, create an output element
);
foreach (var item in accumulatedList)
{
Console.WriteLine("{0:yyyy-MM-dd} {1}", item.Item.Date, item.Sum);
}
}
Output:
2018-01-01 1
2018-01-02 3
2018-01-03 6
2018-01-04 10
2018-01-05 15
This approach will only require one iteration over the set so should perform much better than a series of sums.
Link to DotNetFiddle example
This is exactly the job of MoreLinq's Scan:
var newModels = list.Scan((x, y) => new MyModel(y.Date, x.Estimated + y.Estimated));
New models will have the values you want.
in (x, y), x is the previous item and y is the current item in the enumeration.
Why is your query slow?
Because Where iterates your collection from the beginning every time you call it, so the number of operations grows quadratically: 1 + 2 + 3 + ... + n = n^2/2 + n/2.
You can try this. Simple yet effective.
double i = 0; // Estimated is a double, so the accumulator must be too
var result = myList.Select(x => new MyObject
{
Date = x.Date,
Estimated = i = i + x.Estimated
}).ToList();
Edit: try it this way:
.Select(x => new GraphData(x.Date, i = i + x.Estimated))
I will assume that what you said is really what you need, hehehe.
Algorithm
Create a list or array of the values from the original data, ordered by date ascending.
sumValues = 0;
foreach (var x in collection)
{
    sumValues += x.Estimated; // accumulates all past values plus the present one
    oRet.Add(new GraphData(x.Date, sumValues));
}
The first step (ordering the values) is the most important; the foreach itself is very fast.
See List<T>.Sort or LINQ's OrderBy.
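A self-contained version of this sketch, using value tuples in place of the question's GraphData class (an assumption, since that class definition isn't shown):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var data = new List<(DateTime Date, double Estimated)>
{
    (new DateTime(2019, 1, 2), 3),
    (new DateTime(2019, 1, 1), 2),
    (new DateTime(2019, 1, 3), 4),
};

// One linear pass over the date-ordered values: O(n log n) for the sort
// plus O(n) for the accumulation, instead of the original O(n^2).
var oRet = new List<(DateTime Date, double Estimated)>();
double sumValues = 0;
foreach (var x in data.OrderBy(d => d.Date))
{
    sumValues += x.Estimated; // all past values plus the present one
    oRet.Add((x.Date, sumValues));
}

foreach (var r in oRet)
    Console.WriteLine($"{r.Date:dd/MM/yyyy} {r.Estimated}");
```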
I have two linq queries, one to get confirmedQty and another one is to get unconfirmedQty.
There is a condition for getting unconfirmedQty: it should be an average instead of a sum.
result = Sum(confirmedQty) + Avg(unconfirmedQty)
Is there any way to just write one query and get the desired result instead of writing two separate queries?
My Code
class Program
{
static void Main(string[] args)
{
List<Item> items = new List<Item>(new Item[]
{
new Item{ Qty = 100, IsConfirmed=true },
new Item{ Qty = 40, IsConfirmed=false },
new Item{ Qty = 40, IsConfirmed=false },
new Item{ Qty = 40, IsConfirmed=false },
});
int confirmedQty = Convert.ToInt32(items.Where(o => o.IsConfirmed == true).Sum(u => u.Qty));
int unconfirmedQty = Convert.ToInt32(items.Where(o => o.IsConfirmed != true).Average(u => u.Qty));
//Output => Total : 140
Console.WriteLine("Total : " + (confirmedQty + unconfirmedQty));
Console.Read();
}
public class Item
{
public int Qty { get; set; }
public bool IsConfirmed { get; set; }
}
}
Actually, the accepted answer enumerates your items collection 2N + 1 times and adds unnecessary complexity to your original solution. If I'd met this piece of code
(from t in items
let confirmedQty = items.Where(o => o.IsConfirmed == true).Sum(u => u.Qty)
let unconfirmedQty = items.Where(o => o.IsConfirmed != true).Average(u => u.Qty)
let total = confirmedQty + unconfirmedQty
select new { tl = total }).FirstOrDefault();
it would take some time to understand what type of data you are projecting items to. Yes, this query is a strange projection. It creates a SelectIterator to project each item of the sequence, then it creates some range variables, which involves iterating items twice, and finally it selects the first projected item. Basically you have wrapped your original queries into an additional useless query:
items.Select(i => {
var confirmedQty = items.Where(o => o.IsConfirmed).Sum(u => u.Qty);
var unconfirmedQty = items.Where(o => !o.IsConfirmed).Average(u => u.Qty);
var total = confirmedQty + unconfirmedQty;
return new { tl = total };
}).FirstOrDefault();
The intent is hidden deep in the code and you still have the same two nested queries. What can you do here? You can simplify your two queries to make them more readable and show your intent clearly:
int confirmedTotal = items.Where(i => i.IsConfirmed).Sum(i => i.Qty);
// NOTE: Average will throw exception if there is no unconfirmed items!
double unconfirmedAverage = items.Where(i => !i.IsConfirmed).Average(i => i.Qty);
int total = confirmedTotal + (int)unconfirmedAverage;
If performance is more important than readability, you can calculate the total in a single pass (moved to an extension method for readability):
public static int Total(this IEnumerable<Item> items)
{
int confirmedTotal = 0;
int unconfirmedTotal = 0;
int unconfirmedCount = 0;
foreach (var item in items)
{
if (item.IsConfirmed)
{
confirmedTotal += item.Qty;
}
else
{
unconfirmedCount++;
unconfirmedTotal += item.Qty;
}
}
if (unconfirmedCount == 0)
return confirmedTotal;
// NOTE: Will not throw if there is no unconfirmed items
return confirmedTotal + unconfirmedTotal / unconfirmedCount;
}
Usage is simple:
items.Total();
BTW, the second solution from the accepted answer is not correct. It's just a coincidence that it returns the correct value, because all your unconfirmed items have an equal Qty. That solution calculates a sum instead of an average. A solution with grouping would look like this:
var total =
items.GroupBy(i => i.IsConfirmed)
.Select(g => g.Key ? g.Sum(i => i.Qty) : (int)g.Average(i => i.Qty))
.Sum();
Here you group the items into two groups, confirmed and unconfirmed. Then you calculate either the sum or the average based on the group key, and finally sum the two group values. This is neither the most readable nor the most efficient solution, but it is correct.
I need a list with some objects for calculation.
my current code looks like this
private class HelperClass
{
public DateTime TheDate {get;set;}
public TimeSpan TheDuration {get;set;}
public bool Enabled {get;set;}
}
private TimeSpan TheMethod()
{
// create entries for every date
var items = new List<HelperClass>();
foreach(DateTime d in GetAllDatesOrdered())
{
items.Add(new HelperClass { TheDate = d, Enabled = GetEnabled(d), });
}
// calculate the duration for every entry
for (int i = 0; i < items.Count; i++)
{
var item = items[i];
if (i == items.Count -1) // the last one
item.TheDuration = DateTime.Now - item.TheDate;
else
item.TheDuration = items[i+1].TheDate - item.TheDate;
}
// calculate the total duration and return the result
var result = TimeSpan.Zero;
foreach(var item in items.Where(x => x.Enabled))
result = result.Add(item.TheDuration);
return result;
}
Now I find it a bit ugly to introduce a type just for my calculation (HelperClass).
My first approach was to use Tuple<DateTime, TimeSpan, bool> like I usually do, but since I need to modify the TimeSpan after creating the instance, I can't use Tuple: Tuple.ItemX is read-only.
I thought about an anonymous type, but I can't figure out how to declare my List:
var item1 = new { TheDate = DateTime.Now,
TheDuration = TimeSpan.Zero, Enabled = true };
var items = new List<?>(); // How to declare this ???
items.Add(item1);
Using a projection looks like the way forward to me - but you can compute the durations as you go, by "zipping" your collection with itself, offset by one. You can then do the whole method in one query:
// Materialize the result to avoid computing possibly different sequences
var allDatesAndNow = GetAllDatesOrdered().Concat(new[] { DateTime.Now })
                                         .ToList();
return allDatesAndNow.Zip(allDatesAndNow.Skip(1),
(x, y) => new { Enabled = GetEnabled(x),
Duration = y - x })
.Where(x => x.Enabled)
.Aggregate(TimeSpan.Zero, (t, pair) => t + pair.Duration);
The Zip call pairs up each date with its subsequent one, converting each pair of values into a duration and an enabled flag. The Where call filters out disabled pairs. The Aggregate call sums the durations from the resulting pairs.
You could do it with LINQ like:
var itemsWithoutDuration = GetAllDatesOrdered()
.Select(d => new { TheDate = d, Enabled = GetEnabled(d) })
.ToList();
var items = itemsWithoutDuration
.Select((it, k) => new { TheDate = it.TheDate, Enabled = it.Enabled,
TheDuration = (k == (itemsWithoutDuration.Count - 1) ? DateTime.Now : itemsWithoutDuration[k+1].TheDate) - it.TheDate })
.ToList();
But by that point the Tuple is both more readable and more concise!
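As an aside (assuming C# 7 or later is available, which the answers above don't state): ValueTuple elements are mutable fields, so the read-only objection to Tuple goes away, at least when the tuples live in an array, whose indexer hands back a reference rather than a copy:

```csharp
using System;

// ValueTuple fields, unlike Tuple's ItemX properties, can be reassigned.
// Note: this works on arrays; a List<T> indexer returns a copy of the
// struct, so `list[0].TheDuration = ...` would not compile there.
var items = new (DateTime TheDate, TimeSpan TheDuration, bool Enabled)[]
{
    (new DateTime(2019, 1, 1), TimeSpan.Zero, true),
    (new DateTime(2019, 1, 2), TimeSpan.Zero, false),
};

items[0].TheDuration = items[1].TheDate - items[0].TheDate;
Console.WriteLine(items[0].TheDuration); // 1.00:00:00
```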
Update 1, following Ayende's answer
This is my first journey into RavenDb, and to experiment with it I wrote a small map/reduce index, but unfortunately the result is empty.
I have around 1.6 million documents loaded into RavenDb
A document:
public class Tick
{
public DateTime Time;
public decimal Ask;
public decimal Bid;
public double AskVolume;
public double BidVolume;
}
and wanted to get Min and Max of Ask over a specific period of Time.
The collection by Time is defined as:
var ticks = session.Query<Tick>().Where(x => x.Time > new DateTime(2012, 4, 23) && x.Time < new DateTime(2012, 4, 24, 00, 0, 0)).ToList();
Which gives me 90280 documents, so far so good.
But then the map/ reduce:
Map = rows => from row in rows
select new
{
Max = row.Bid,
Min = row.Bid,
Time = row.Time,
Count = 1
};
Reduce = results => from result in results
group result by new{ result.MaxBid, result.Count} into g
select new
{
Max = g.Key.MaxBid,
Min = g.Min(x => x.MaxBid),
Time = g.Key.Time,
Count = g.Sum(x => x.Count)
};
...
private class TickAggregationResult
{
public decimal MaxBid { get; set; }
public decimal MinBid { get; set; }
public int Count { get; set; }
}
I then create the index and try to Query it:
Raven.Client.Indexes.IndexCreation.CreateIndexes(typeof(TickAggregation).Assembly, documentStore);
var session = documentStore.OpenSession();
var g1 = session.Query<TickAggregationResult>(typeof(TickAggregation).Name);
var group = session.Query<Tick, TickAggregation>()
.Where(x => x.Time > new DateTime(2012, 4, 23) &&
x.Time < new DateTime(2012, 4, 24, 00, 0, 0)
)
.Customize(x => x.WaitForNonStaleResults())
.AsProjection<TickAggregationResult>();
But the group is just empty :(
As you can see I've tried two different Queries, I'm not sure about the difference, can someone explain?
Now I get an error:
The group is still empty :(
Let me explain what I'm trying to accomplish in pure sql:
select min(Ask), count(*) as TickCount from Ticks
where Time between '2012-04-23' and '2012-04-24'
Unfortunately, Map/Reduce doesn't work that way. Well, at least the Reduce part of it doesn't. In order to reduce your set, you would have to predefine specific time ranges to group by, for example - daily, weekly, monthly, etc. You could then get min/max/count per day if you reduced daily.
There is a way to get what you want, but it has some performance considerations. Basically, you don't reduce at all, but you index by time and then do the aggregation when transforming results. This is similar to if you ran your first query to filter and then aggregated in your client code. The only benefit is that the aggregation is done server-side, so you don't have to transmit all of that data to the client.
The performance concern here is how big of a time range are you filtering to, or more precisely, how many items will there be inside your filter range? If it's relatively small, you can use this approach. If it's too large, you will be waiting while the server goes through the result set.
Here is a sample program that illustrates this technique:
using System;
using System.Linq;
using Raven.Client.Document;
using Raven.Client.Indexes;
using Raven.Client.Linq;
namespace ConsoleApplication1
{
public class Tick
{
public string Id { get; set; }
public DateTime Time { get; set; }
public decimal Bid { get; set; }
}
/// <summary>
/// This index is a true map/reduce, but its totals are for all time.
/// You can't filter it by time range.
/// </summary>
class Ticks_Aggregate : AbstractIndexCreationTask<Tick, Ticks_Aggregate.Result>
{
public class Result
{
public decimal Min { get; set; }
public decimal Max { get; set; }
public int Count { get; set; }
}
public Ticks_Aggregate()
{
Map = ticks => from tick in ticks
select new
{
Min = tick.Bid,
Max = tick.Bid,
Count = 1
};
Reduce = results => from result in results
group result by 0
into g
select new
{
Min = g.Min(x => x.Min),
Max = g.Max(x => x.Max),
Count = g.Sum(x => x.Count)
};
}
}
/// <summary>
/// This index can be filtered by time range, but it does not reduce anything
/// so it will not be performant if there are many items inside the filter.
/// </summary>
class Ticks_ByTime : AbstractIndexCreationTask<Tick>
{
public class Result
{
public decimal Min { get; set; }
public decimal Max { get; set; }
public int Count { get; set; }
}
public Ticks_ByTime()
{
Map = ticks => from tick in ticks
select new {tick.Time};
TransformResults = (database, ticks) =>
from tick in ticks
group tick by 0
into g
select new
{
Min = g.Min(x => x.Bid),
Max = g.Max(x => x.Bid),
Count = g.Count()
};
}
}
class Program
{
private static void Main()
{
var documentStore = new DocumentStore { Url = "http://localhost:8080" };
documentStore.Initialize();
IndexCreation.CreateIndexes(typeof(Program).Assembly, documentStore);
var today = DateTime.Today;
var rnd = new Random();
using (var session = documentStore.OpenSession())
{
// Generate 100 random ticks
for (var i = 0; i < 100; i++)
{
var tick = new Tick { Time = today.AddMinutes(i), Bid = rnd.Next(100, 1000) / 100m };
session.Store(tick);
}
session.SaveChanges();
}
using (var session = documentStore.OpenSession())
{
// Query items with a filter. This will create a dynamic index.
var fromTime = today.AddMinutes(20);
var toTime = today.AddMinutes(80);
var ticks = session.Query<Tick>()
.Where(x => x.Time >= fromTime && x.Time <= toTime)
.OrderBy(x => x.Time);
// Output the results of the above query
foreach (var tick in ticks)
Console.WriteLine("{0} {1}", tick.Time, tick.Bid);
// Get the aggregates for all time
var total = session.Query<Tick, Ticks_Aggregate>()
.As<Ticks_Aggregate.Result>()
.Single();
Console.WriteLine();
Console.WriteLine("Totals");
Console.WriteLine("Min: {0}", total.Min);
Console.WriteLine("Max: {0}", total.Max);
Console.WriteLine("Count: {0}", total.Count);
// Get the aggregates with a filter
var filtered = session.Query<Tick, Ticks_ByTime>()
.Where(x => x.Time >= fromTime && x.Time <= toTime)
.As<Ticks_ByTime.Result>()
.Take(1024) // max you can take at once
.ToList() // required!
.Single();
Console.WriteLine();
Console.WriteLine("Filtered");
Console.WriteLine("Min: {0}", filtered.Min);
Console.WriteLine("Max: {0}", filtered.Max);
Console.WriteLine("Count: {0}", filtered.Count);
}
Console.ReadLine();
}
}
}
I can envision a solution to the problem of aggregating over a time filter with a potentially large scope. The reduce would have to break things down into decreasingly smaller units of time at different levels. The code for this is a bit complex, but I am working on it for my own purposes. When complete, I will post over in the knowledge base at www.ravendb.net.
UPDATE
I was playing with this a bit more, and noticed two things in that last query.
You MUST do a ToList() before calling Single() in order to get the full result set.
Even though this runs on the server, the max you can have in the result range is 1024, and you have to specify Take(1024) or you get the default max of 128. Since this runs on the server I didn't expect that, but I guess it's because you don't normally do aggregations in the TransformResults section.
I've updated the code for this. However, unless you can guarantee that the range is small enough for this to work, I would wait for the better full map/reduce that I spoke of. I'm working on it. :)