Rx Amb extension - C#

I'm working with the Reactive framework for Silverlight and would like to achieve the following.
I am trying to create a typical data provider for a Silverlight client that also takes advantage of the caching framework available in MS Ent Lib. The scenario requires that I check the cache for the key-value pair before hitting the WCF data client.
By using the Rx extension Amb, I am able to pull the data from the cache or the WCF data client, whichever returns first, but how can I stop the WCF client from executing the call if the value is already in the cache?
I would also like to handle race conditions, e.g. if the first subscriber requests some data and the provider is fetching it from the WCF data client (async), how do I prevent subsequent async requests from doing the same thing (at this stage the cache has yet to be populated)?

I had exactly the same problem. I solved it with an extension method with the following signature:
IObservable<R> FromCacheOrFetch<T, R>(
this IObservable<T> source,
Func<T, R> cache,
Func<IObservable<T>, IObservable<R>> fetch,
IScheduler scheduler) where R : class
Effectively what this did was take in the source observable and return an observable that would match each input value with its output value.
To get each output value it would check the cache first. If the value existed in the cache, it would use that. If not, it would spin up the fetch function only on the values that weren't in the cache. If all of the values were in the cache then the fetch function would never be spun up, so there would be no service connection set-up penalty, etc.
I'll give you the code, but it's based on a slightly different version of the extension method that uses a Maybe<T> monad - so you might find you need to fiddle with the implementation.
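A minimal Maybe<T> along the following lines is enough to make the code below compile (this is my sketch; the author's actual type may differ):
// Minimal Maybe<T> sketch, with only the members the method below uses.
public struct Maybe<T>
{
    private readonly T value;
    private readonly bool hasValue;

    private Maybe(T value) { this.value = value; this.hasValue = true; }

    public static Maybe<T> Something(T value) { return new Maybe<T>(value); }
    public static Maybe<T> Nothing { get { return default(Maybe<T>); } }

    public bool HasValue { get { return hasValue; } }
    public bool IsNothing { get { return !hasValue; } }
    public T Value { get { return value; } }
}

public static class MaybeExtensions
{
    // Treats the supplied sentinel (null in the overload below) as "nothing".
    public static Maybe<T> ToMaybe<T>(this T value, T nothing) where T : class
    {
        return value == nothing ? Maybe<T>.Nothing : Maybe<T>.Something(value);
    }
}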
Here it is:
public static IObservable<R> FromCacheOrFetch<T, R>(this IObservable<T> source, Func<T, R> cache, Func<IObservable<T>, IObservable<R>> fetch, IScheduler scheduler)
    where R : class
{
    return source.FromCacheOrFetch<T, R>(t => cache(t).ToMaybe(null), fetch, scheduler);
}

public static IObservable<R> FromCacheOrFetch<T, R>(this IObservable<T> source, Func<T, Maybe<R>> cache, Func<IObservable<T>, IObservable<R>> fetch, IScheduler scheduler)
{
    var results = new Subject<R>();
    var disposables = new CompositeDisposable();
    var loop = new EventLoopScheduler();
    disposables.Add(loop);
    var sourceDone = false;
    var pairsDone = true;
    var exception = (Exception)null;
    var fetchIn = new Subject<T>();
    var fetchOut = (IObservable<R>)null;
    var pairs = (IObservable<KeyValuePair<int, R>>)null;
    var lookup = new Dictionary<T, int>();
    var list = new List<Maybe<R>>();
    var cursor = 0;
    Action checkCleanup = () =>
    {
        if (sourceDone && pairsDone)
        {
            if (exception == null)
            {
                results.OnCompleted();
            }
            else
            {
                results.OnError(exception);
            }
            loop.Schedule(() => disposables.Dispose());
        }
    };
    Action dequeue = () =>
    {
        while (cursor != list.Count)
        {
            var mr = list[cursor];
            if (mr.HasValue)
            {
                results.OnNext(mr.Value);
                cursor++;
            }
            else
            {
                break;
            }
        }
    };
    Action<KeyValuePair<int, R>> nextPairs = kvp =>
    {
        list[kvp.Key] = Maybe<R>.Something(kvp.Value);
        dequeue();
    };
    Action<Exception> errorPairs = ex =>
    {
        fetchIn.OnCompleted();
        pairsDone = true;
        exception = ex;
        checkCleanup();
    };
    Action completedPairs = () =>
    {
        pairsDone = true;
        checkCleanup();
    };
    Action<T> sourceNext = t =>
    {
        var mr = cache(t);
        list.Add(mr);
        if (mr.IsNothing)
        {
            lookup[t] = list.Count - 1;
            if (fetchOut == null)
            {
                pairsDone = false;
                fetchOut = fetch(fetchIn.ObserveOn(Scheduler.ThreadPool));
                pairs = fetchIn.Select(x => lookup[x]).Zip(fetchOut, (i, r2) => new KeyValuePair<int, R>(i, r2));
                disposables.Add(pairs.ObserveOn(loop).Subscribe(nextPairs, errorPairs, completedPairs));
            }
            fetchIn.OnNext(t);
        }
        else
        {
            dequeue();
        }
    };
    Action<Exception> errorSource = ex =>
    {
        sourceDone = true;
        exception = ex;
        fetchIn.OnCompleted();
        checkCleanup();
    };
    Action completedSource = () =>
    {
        sourceDone = true;
        fetchIn.OnCompleted();
        checkCleanup();
    };
    disposables.Add(source.ObserveOn(loop).Subscribe(sourceNext, errorSource, completedSource));
    return results.ObserveOn(scheduler);
}
Example usage would look like this:
You would have a source of the indices that you want to fetch:
IObservable<X> source = ...
You would have a function that can get values from the cache and an action that can put them in (and both should be thread-safe):
Func<X, Y> getFromCache = x => ...;
Action<X, Y> addToCache = (x, y) => ...;
Then you would have the actual call to go get the data from your database or service:
Func<X, Y> getFromService = x => ...;
Then you could define fetch like so:
Func<IObservable<X>, IObservable<Y>> fetch =
xs => xs.Select(x =>
{
var y = getFromService(x);
addToCache(x, y);
return y;
});
And finally you can make your query by calling the following:
IObservable<Y> results =
source.FromCacheOrFetch(
getFromCache,
fetch,
Scheduler.ThreadPool);
Of course you would need to subscribe to the result to make the computation take place.
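For example, a minimal subscription might look like this (the handler bodies are placeholders):
// Values surface in source order, whether they came from the cache or from the service.
IDisposable subscription = results.Subscribe(
    y => Console.WriteLine(y),             // handle each Y as it becomes available
    ex => Console.WriteLine(ex),           // a cache or service failure ends the sequence
    () => Console.WriteLine("Completed"));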

Clearly Amb is not the right way to go, since that will hit both the cache and the service every time. What does EntLib return to you on a cache miss?
Note that Observable.Timeout is a reasonable alternative:
cache(<parameters>).Timeout(TimeSpan.FromSeconds(1), service(<parameters>));
But clearly a timeout is not a great idea if you would rather process the return value from EntLib and act on it appropriately.
I'm not seeing why this is necessarily a Reactive Extensions problem.

A simple approach, which is probably less fully featured than @Enigmativity's solution, could be something along the lines of:
public IObservable<TResult> GetCachedValue<TKey, TResult>(TKey key,
    Func<TKey, IObservable<TResult>> getFromCache,
    Func<TKey, IObservable<TResult>> getFromSource)
{
    return getFromCache(key).Concat(getFromSource(key)).Take(1);
}
This is just a loosely formed idea, you'd need to add:
A mechanism to add the item to the cache, or assume getFromSource caches the result
Some kind of thread safety to prevent multiple hits on the source for the same uncached key (if required)
getFromCache would need to return Observable.Empty<TResult>() if the item wasn't in the cache (a sketch follows below).
But if you want something simple, it's not a bad place to start.
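To illustrate that last point, here is a minimal sketch of such a getFromCache, written against concrete types (int keys, string values). The cache variable is assumed to be a thread-safe IDictionary<int, string> and is not part of the original answer:
// Hypothetical cache lookup wrapped as an observable; "cache" is an assumed
// thread-safe IDictionary<int, string>, not something from the original answer.
Func<int, IObservable<string>> getFromCache = key =>
{
    string cached;
    return cache.TryGetValue(key, out cached)
        ? Observable.Return(cached)       // cache hit: emit the cached value
        : Observable.Empty<string>();     // cache miss: emit nothing, so Concat falls through to the source
};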

Related

Rx combining many streams by joining on a property

I am trying to 'zip' an arbitrary number of streams in Rx, where elements correspond but may be processed out of order. Each stream's elements have an identifier that can be used to match them together. E.g. elements look like:
public class Element
{
public string Key {get; set;}
}
Normally, zip will just combine elements by their index of occurrence:
|-A-----------A
|--B---------B-
|-----C------C-
|-----ABC-----ABC <- zip
But what if we want to only match elements that share the same Key? I'm looking for a sequence that works more like this:
(In this example, the key is 1 or 2)
|--2A-------1A----------
|----1B----------2B-----
|------1C-----------2C--
|-----------1ABC----2ABC <- zipped by key 1 & 2 respectively
I feel that GroupJoin suits this scenario, but it only serves two Observables and chaining them got out of hand pretty quickly.
I also looked at And/Then/When but didn't really understand how to structure it for this scenario.
Ideally, I'd want an extension method I can call and provide a result selector for, where the inputs of the result selector are guaranteed to have the same Key.
How would you approach this problem?
Here is something I knocked together in LINQPad. It meets the requirements of your marble diagram. It is, however, messier than I would like.
NuGet dependency: Rx-Testing.
void Main()
{
TestScheduler scheduler = new TestScheduler();
/*
|--2A-------1A----------
|----1B----------2B-----
|------1C-----------2C--
|-----------1ABC----2ABC <- zipped by key 1 & 2 respectively
*/
var sourceA = scheduler.CreateColdObservable(
ReactiveTest.OnNext(3, "2A"),
ReactiveTest.OnNext(12, "1A"));
var sourceB = scheduler.CreateColdObservable(
ReactiveTest.OnNext(5, "1B"),
ReactiveTest.OnNext(17, "2B"));
var sourceC= scheduler.CreateColdObservable(
ReactiveTest.OnNext(7, "1C"),
ReactiveTest.OnNext(20, "2C"));
var observer = scheduler.CreateObserver<string>();
var query = Observable.Merge(sourceA, sourceB, sourceC)
.GroupBy(x => GetKey(x))
.SelectMany(grp => grp.Select(x=>GetValue(x))
.Take(3)
.Aggregate(new List<string>(),
(accumulator, current) => {
accumulator.Add(current);
return accumulator;
})
.Select(acc=>CreateGroupResult(grp.Key, acc)));
query.Subscribe(observer);
scheduler.Start();
ReactiveAssert.AreElementsEqual(
new[]{
ReactiveTest.OnNext(12, "1ABC"),
ReactiveTest.OnNext(20, "2ABC")
},
observer.Messages
);
}
// Define other methods and classes here
private static string CreateGroupResult(string key, IEnumerable<string> values)
{
var combinedOrderedValues = string.Join(string.Empty, values.OrderBy(v => v));
return string.Format("{0}{1}", key, combinedOrderedValues);
}
private static string GetKey(string message)
{
return message.Substring(0, 1);
}
private static string GetValue(string message)
{
return message.Substring(1);
}
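A possible generalization of this approach (my own sketch, not part of the original answer; the name ZipByKey and its parameters are invented) merges any number of sources, groups by key, and emits a result once each group has collected one element per source:
// Requires System.Reactive.Linq. Assumes each source contributes exactly one
// element per key, mirroring the Take(3) in the answer above.
public static IObservable<TResult> ZipByKey<TSource, TKey, TResult>(
    this IEnumerable<IObservable<TSource>> sources,
    Func<TSource, TKey> keySelector,
    Func<TKey, IList<TSource>, TResult> resultSelector)
{
    var sourceList = sources.ToList();
    return sourceList
        .Merge()                                 // interleave all of the sources
        .GroupBy(keySelector)                    // one group per key
        .SelectMany(grp => grp
            .Take(sourceList.Count)              // wait for one element per source
            .ToList()                            // completes with the collected elements (arrival order)
            .Select(items => resultSelector(grp.Key, items)));
}
The elements reach the resultSelector in whatever order the sources produce them, so sort them there if order matters (as CreateGroupResult does above).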

Create an asynchronous LinQ query

I checked and tried this example: How to write Asynchronous LINQ query?
This works well and executes both procedures asynchronously, but my goal is to create a query where, as soon as you get the first result, you can already start using that value while the query is still executing and finding more values.
I did this while I was programming for Android and the result was amazing and really fast, but I have no clue how to do it in C# using LINQ.
Asynchronous sequences are modeled in .NET as IObservable<T>. Reactive Extensions is the asynchronous side of LINQ.
You can generate an observable sequence in any number of ways, such as through a call to Generate:
var sequence = Observable.Generate(0,
i => i < 10,
i => i + 1,
i => SomeExpensiveGeneratorFunction());
And then query it using all of the regular LINQ operators (which means query syntax works too), along with a number of additional operators that make sense specifically for asynchronous sequences (and there are also many other ways of creating observable sequences):
var query = from item in sequence
where ConditionIsTrue(item)
select item.ToString();
The short description of what's going on here is that it does exactly what you want. The generator function notifies its subscribers whenever it successfully generates a value (or when it's done) and keeps generating values without waiting for the subscribers to finish. The Where method subscribes to sequence and notifies its subscribers whenever it observes a value that passes the condition. Select subscribes to the sequence returned by Where, performs its transformation (asynchronously) whenever it gets a value, and then pushes the result to all of its subscribers.
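For completeness, consuming the results as they arrive is just a matter of subscribing; a minimal sketch (the Console calls are placeholders for whatever work you want to start on the first value):
// Each result is pushed to the subscriber as soon as it is produced, so work on
// the first value can begin while later values are still being generated.
using (query.Subscribe(
    s => Console.WriteLine("Got: " + s),
    () => Console.WriteLine("Sequence completed")))
{
    Console.ReadLine(); // keep the subscription alive in a console app
}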
I have modified TheSoftwareJedi's answer from the link you gave.
You can raise a 'first item processed' event from the asynchronous class and use it to start your work.
Here's the class:
public static class AsynchronousQueryExecutor
{
private static Action<object> m_OnFirstItemProcessed;
public static void Call<T>(IEnumerable<T> query, Action<IEnumerable<T>> callback, Action<Exception> errorCallback, Action<object> OnFirstItemProcessed)
{
m_OnFirstItemProcessed = OnFirstItemProcessed;
Func<IEnumerable<T>, IEnumerable<T>> func =
new Func<IEnumerable<T>, IEnumerable<T>>(InnerEnumerate<T>);
IEnumerable<T> result = null;
IAsyncResult ar = func.BeginInvoke(
query,
new AsyncCallback(delegate(IAsyncResult arr)
{
try
{
result = ((Func<IEnumerable<T>, IEnumerable<T>>)((AsyncResult)arr).AsyncDelegate).EndInvoke(arr);
}
catch (Exception ex)
{
if (errorCallback != null)
{
errorCallback(ex);
}
return;
}
//errors from inside here are the callbacks problem
//I think it would be confusing to report them
callback(result);
}),
null);
}
private static IEnumerable<T> InnerEnumerate<T>(IEnumerable<T> query)
{
int iCount = 0;
foreach (var item in query) //the method hangs here while the query executes
{
if (iCount == 0)
{
iCount++;
m_OnFirstItemProcessed(item);
}
yield return item;
}
}
}
Here are the associated handlers:
private void OnFirstItem(object value) // Your first item is processed here.
{
//You can start your work here.
}
public void HandleResults(IEnumerable<int> results)
{
foreach (var item in results)
{
}
}
public void HandleError(Exception ex)
{
}
And here's how you should call the function:
private void buttonclick(object sender, EventArgs e)
{
IEnumerable<int> range = Enumerable.Range(1,10000);
var qry = TestSlowLoadingEnumerable(range);
//We begin the call and give it our callback delegate
//and a delegate to an error handler
AsynchronousQueryExecutor.Call(qry, HandleResults, HandleError, OnFirstItem);
}
If this meets your expectations, you can use it to start your work as soon as the first item is processed.
Try again ...
If I understand you correctly, the logic you want is something like this:
var query = getData.Where( ... );
query.AsParallel().ForAll(r => {
//other stuff
});
What will happen here ...
Well, in short, the compiler will evaluate this to something like: while iterating through the query results in parallel, perform the logic in the area where the comment is.
This is asynchronous and makes use of a thread pool managed by .NET to ensure the results are acquired as quickly as possible.
It is an automatically managed, parallel asynchronous operation.
It's also worth noting that if I do this ...
var query = getData.Where( ... );
... no actual code runs until I begin iterating the IQueryable, and by declaring the operation a parallel one, the framework is able to operate on more than one of the results at any point in time by threading the code for you.
The ForAll is essentially just a normal foreach loop where each iteration is handled in parallel.
The logic you put in there could call some sort of callback if you wanted but that's down to how you wrap this code ...
Might I suggest something like this:
void DoAsync<T>(IQueryable<T> items, Action<T> operation, Action<T> callback)
{
    items.AsParallel().ForAll(x => {
        operation(x);
        callback(x);
    });
}
This is pretty simple with the TPL.
Here's a dummy "slow" enumerator that has to do a bit of work between getting items:
static IEnumerable<int> SlowEnumerator()
{
for (int i = 0; i < 10; i++)
{
Thread.Sleep(1000);
yield return i;
}
}
Here's a dummy bit of work to do with each item in the sequence:
private static void DoWork(int i)
{
Thread.Sleep(1000);
Console.WriteLine("{0} at {1}", i, DateTime.Now);
}
And here's how you can simultaneously run the "bit of work" on one item that the enumerator has returned and ask the enumerator for the next item:
foreach (var i in SlowEnumerator())
{
Task.Run(() => DoWork(i));
}
You should get work done every second - not every 2 seconds as you would expect if you had to interleave the two types of work:
0 at 20/01/2015 10:56:52
1 at 20/01/2015 10:56:53
2 at 20/01/2015 10:56:54
3 at 20/01/2015 10:56:55
4 at 20/01/2015 10:56:56
5 at 20/01/2015 10:56:57
6 at 20/01/2015 10:56:58
7 at 20/01/2015 10:56:59
8 at 20/01/2015 10:57:00
9 at 20/01/2015 10:57:01
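If the caller also needs to know when all of that work has finished, a small variation (my addition, not part of the original answer) keeps the tasks and awaits them:
// Collect the tasks so the caller can await completion of all of the work.
// This snippet assumes it runs inside an async method.
var tasks = new List<Task>();
foreach (var i in SlowEnumerator())
{
    tasks.Add(Task.Run(() => DoWork(i)));
}
await Task.WhenAll(tasks);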

Using LINQ's Zip with a closure that doesn't return a value

Disclaimer: this question is driven by my personal curiosity more than an actual need to accomplish something. So my example is going to be contrived.
Nevertheless I think it's an issue that might very well crop up.
Let's say we are using Zip to iterate over two sequences, invoking a void method that just throws an exception if one item of the pair is found to be different from the other (thereby discarding any return value). The point here is not that the method throws an exception, so much as that it returns void.
In other words, we're kind of doing a ForEach over two collections (and by the way, I know what Eric Lippert thinks about ForEach, and fully agree with him and never use it).
Now, Zip wants a Func<TFirst, TSecond, TResult>, so of course passing something equivalent to Action<TFirst, TSecond> won't work.
My question is: is there an idiomatic way that is better than this (i.e. returning a dummy value)?
var collection1 = new List<int>() { ... };
var collection2 = new List<int>() { ... };
collection1.Zip(collection2, (first, second) =>
{
VoidMethodThatThrows(first, second);
return true;
});
Use Zip() to throw the items into an object, then do your foreach however you choose (do a normal foreach loop please, not the bad ToList/ForEach combo).
var items = collection1.Zip(collection2, (x, y) => new { First = x, Second = y });
foreach (var item in items)
{
VoidMethodThatThrows(item.First, item.Second);
}
As of C# 7.0, improved tuple support and deconstruction make this far more pleasant to work with.
var items = collection1.Zip(collection2, (x, y) => (x, y));
// or collection1.Zip(collection2, ValueTuple.Create);
foreach (var (first, second) in items)
{
VoidMethodThatThrows(first, second);
}
Furthermore, .NET Core and .NET 5 add an overload which automatically pairs the values into tuples, so you don't have to do that mapping yourself.
var items = collection1.Zip(collection2); // IEnumerable<(Type1, Type2)>
.NET 6 adds a third collection to the mix.
var items = collection1.Zip(collection2, collection3); // IEnumerable<(Type1, Type2, Type3)>
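The tuple results deconstruct directly in the loop, for example:
// .NET 6: iterate the three-way zip by deconstructing each tuple.
foreach (var (first, second, third) in collection1.Zip(collection2, collection3))
{
    // use first, second and third here
}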
I often need to execute an action on each pair in two collections. The Zip method is not useful in this case.
This extension method ForPair can be used:
public static void ForPair<TFirst, TSecond>(this IEnumerable<TFirst> first, IEnumerable<TSecond> second,
Action<TFirst, TSecond> action)
{
using (var enumFirst = first.GetEnumerator())
using (var enumSecond = second.GetEnumerator())
{
while (enumFirst.MoveNext() && enumSecond.MoveNext())
{
action(enumFirst.Current, enumSecond.Current);
}
}
}
So for your example, you could write:
var collection1 = new List<int>() { 1, 2 };
var collection2 = new List<int>() { 3, 4 };
collection1.ForPair(collection2, VoidMethodThatThrows);

How to yield return inside anonymous methods?

Basically I have an anonymous method that I use for my BackgroundWorker:
worker.DoWork += ( sender, e ) =>
{
foreach ( var effect in GlobalGraph.Effects )
{
// Returns EffectResult
yield return image.Apply (effect);
}
};
When I do this the compiler tells me:
"The yield statement cannot be used
inside an anonymous method or lambda
expression"
So in this case, what's the most elegant way to do this? Btw this DoWork method is inside a static method, in case that matters for the solution.
Unfortunately you can't.
The compiler does not allow you to combine the two "magic" pieces of code. Both involve rewriting your code to support what you want to do:
An anonymous method is implemented by moving the code to a proper method and lifting local variables to fields on a class containing that method
An iterator method is rewritten as a state machine
You can, however, rewrite the code to return the collection, so in your particular case I would do this:
worker.DoWork += ( sender, e ) =>
{
return GlobalGraph.Effects
.Select(effect => image.Apply(effect));
};
though it looks odd for an event handler (sender, e) to return anything at all. Are you sure you're showing us a real scenario?
Edit: OK, I think I see what you're trying to do here.
You have a static method call, and then you want to execute code in the background, and then return data from that static method once the background call completes.
This is, while possible, not a good solution, since you're effectively pausing one thread to wait for another that was started directly before you paused the first. In other words, all you're doing is adding the overhead of context switching.
Instead you need to just kick off the background work, and then when that work is completed, process the resulting data.
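A hedged sketch of that shape, reusing the names from the question (GlobalGraph, image, EffectResult); the rest is illustrative rather than a drop-in solution:
worker.DoWork += (sender, e) =>
{
    // Runs on the background thread; materialize the results here.
    e.Result = GlobalGraph.Effects.Select(effect => image.Apply(effect)).ToList();
};
worker.RunWorkerCompleted += (sender, e) =>
{
    // Back on the calling thread; consume the data once the work has finished.
    var results = (List<EffectResult>)e.Result;
    foreach (var result in results)
    {
        // process result
    }
};
worker.RunWorkerAsync();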
Perhaps just return the LINQ expression and defer execution, as yield does:
return GlobalGraph.Effects.Select(x => image.Apply(x));
Unless I'm missing something, you can't do what you're asking.
(I do have an answer for you, so please read my explanation of why you can't do what you're doing first, and then read on.)
Your full method would look something like this:
public static IEnumerable<EffectResult> GetSomeValues()
{
// code to set up worker etc
worker.DoWork += ( sender, e ) =>
{
foreach ( var effect in GlobalGraph.Effects )
{
// Returns EffectResult
yield return image.Apply (effect);
}
};
}
If we assume that your code was "legal", then when GetSomeValues is called, even though the DoWork handler is added to worker, the lambda expression isn't executed until the DoWork event is fired. So the call to GetSomeValues completes without returning any results, and the lambda may or may not get called at a later stage - which is then too late for the caller of the GetSomeValues method anyway.
Your best answer is to use Rx.
Rx turns IEnumerable<T> on its head. Instead of requesting values from an enumerable, Rx has values pushed to you from an IObservable<T>.
Since you're using a background worker and responding to an event you are effectively having the values pushed to you already. With Rx it becomes easy to do what you're trying to do.
You have a couple of options. Probably the simplest is to do this:
public static IObservable<IEnumerable<EffectResult>> GetSomeValues()
{
// code to set up worker etc
return from e in Observable.FromEvent<DoWorkEventArgs>(worker, "DoWork")
select (
from effect in GlobalGraph.Effects
select image.Apply(effect)
);
}
Now callers of your GetSomeValues method would do this:
GetSomeValues().Subscribe(ers =>
{
foreach (var er in ers)
{
// process each er
}
});
If you know that DoWork is only going to fire once, then this approach might be a little better:
public static IObservable<EffectResult> GetSomeValues()
{
// code to set up worker etc
return Observable
.FromEvent<DoWorkEventArgs>(worker, "DoWork")
.Take(1)
.Select(e => from effect in GlobalGraph.Effects.ToObservable()
select image.Apply(effect))
.Switch();
}
This code looks a little more complicated, but it just turns a single do work event into a stream of EffectResult objects.
Then the calling code looks like this:
GetSomeValues().Subscribe(er =>
{
// process each er
});
Rx can even be used to replace the background worker. This might be the best option for you:
public static IObservable<EffectResult> GetSomeValues()
{
// set up code etc
return Observable
.Start(() => from effect in GlobalGraph.Effects.ToObservable()
select image.Apply(effect), Scheduler.ThreadPool)
.Switch();
}
The calling code is the same as the previous example. The Scheduler.ThreadPool tells Rx how to "schedule" the processing of subscriptions to the observer.
I hope this helps.
For new readers: the most elegant way to implement 'anonymous iterators' (i.e. nested in other methods) in C# 5 is probably something like this cool trick with async/await (don't be confused by those keywords, the code below executes completely synchronously - see the details in the linked page):
public IEnumerable<int> Numbers()
{
return EnumeratorMonad.Build<int>(async Yield =>
{
await Yield(11);
await Yield(22);
await Yield(33);
});
}
[Microsoft.VisualStudio.TestTools.UnitTesting.TestMethod]
public void TestEnum()
{
var v = Numbers();
var e = v.GetEnumerator();
int[] expected = { 11, 22, 33 };
Numbers().Should().ContainInOrder(expected);
}
C#7 (available now in Visual Studio 15 Preview) supports local functions, which allow yield return:
public IEnumerable<T> Filter<T>(IEnumerable<T> source, Func<T, bool> filter)
{
if (source == null) throw new ArgumentNullException(nameof(source));
if (filter == null) throw new ArgumentNullException(nameof(filter));
return Iterator();
IEnumerable<T> Iterator()
{
foreach (var element in source)
{
if (filter(element)) { yield return element; }
}
}
}
DoWork is of type DoWorkEventHandler, which returns nothing (void), so it's not possible at all in your case.
The worker should set the Result property of DoWorkEventArgs.
worker.DoWork += (s, e) => e.Result = GlobalGraph.Effects.Select(x => image.Apply(x));
Ok so I did something like this which does what I wanted (some variables omitted):
public static void Run ( Action<float, EffectResult> action )
{
    worker.DoWork += ( sender, e ) =>
    {
        foreach ( var effect in GlobalGraph.Effects )
        {
            var result = image.Apply (effect);
            action (100 * ( index / count ), result );
        }
    };
}
and then at the call site:
GlobalGraph.Run ( ( p, r ) =>
{
this.Progress = p;
this.EffectResults.Add ( r );
} );
I wanted to supplement user1414213562's answer with an implementation of the ForEachMonad.
static class ForEachMonad
{
public static IEnumerable<A> Lift<A>(A a) { yield return a; }
// Unfortunately, this doesn't compile
// public static Func<IEnumerable<A>, IEnumerable<B>> Lift<A, B>(Func<A, IEnumerable<B>> f) =>
// (IEnumerable<A> ea) => { foreach (var a in ea) { foreach (var b in f(a)) { yield return b; } } }
// Fortunately, this does :)
public static Func<IEnumerable<A>, IEnumerable<B>> Lift<A, B>(Func<A, IEnumerable<B>> f)
{
IEnumerable<B> lift(IEnumerable<A> ea)
{
foreach (var a in ea) { foreach (var b in f(a)) { yield return b; } }
}
return lift;
}
public static void Demo()
{
var f = (int x) => (IEnumerable<int>)new int[] { x + 1, x + 2, x + 3 };
var g = (int x) => (IEnumerable<double>)new double[] { Math.Sqrt(x), x*x };
var log = (double d) => { Console.WriteLine(d); return Lift(d); };
var e1 = Lift(0);
var e2 = Lift(f)(e1);
var e3 = Lift(g)(e2);
// we call ToArray in order to materialize the IEnumerable
Lift(log)(e3).ToArray();
}
}
Running ForEachMonad.Demo() produces the following output:
1
1
1,4142135623730951
4
1,7320508075688772
9

First Time Calling Extension Methods is Slower Than Subsequent Calls

I have a class that modifies data via some extension methods. In order to debug performance, I have created some rough debug code that makes multiple calls to the same methods with the same data a number of times. I am finding that the calculations consistently take significantly longer the first time through the loop than on subsequent calls.
For example, for a small set of data, it appears to take about 5 seconds to do the calculations, while each subsequent call is a second or so.
Thanks,
wTs
The code looks something like this:
Test Code
void TestCode()
{
    for (int i = 0; i < iterationsPerLoop; i++)
    {
        DateTime startTime = DateTime.Now;
        // The test is actually being done in a BackgroundWorker
        dispatcher.Invoke(DispatcherPriority.Normal,
            (Action)(() => this.PropertyCausingCodeToRun = "Run"));
        while (this.WaitForSomeCondition)
            Thread.Sleep(125);
        DateTime endTime = DateTime.Now;
        double result = endTime.Subtract(startTime).TotalSeconds;
    }
}
Method where the extension methods are called
private static List<ObservableItem> GetAvailableItems(MyObject myObject)
{
var items = new List<ObservableItem>(myObject.Items.ToList());
var selectedItems = items.OrderByDescending(item => item.ASortableProperty)
.SetItemIsAvailable(false)
.SetItemPriority()
.OrderByDescending(item => item.Priority)
.Where(item => item.Priority > 0)
.SetItemIsAvailable(true)
.OrderBy(item => item.Date);
return selectedItems.ToList();
}
Extension methods (the ObservableItems are all created on a different thread)
static class MyExtensionMethods
{
public static IEnumerable<T> SetItemIsAvailable<T>(this IEnumerable<T> sourceList,
Boolean isAvailable) where T : ObservableItem
{
Action<T> setAvailable = i => i.IsAvailable = isAvailable;
List<DispatcherOperation> invokeResults = new List<DispatcherOperation>();
foreach (var item in sourceList)
{
invokeResults.Add(
item.ItemDispatcher.BeginInvoke(setAvailable , new object[] { item }));
}
invokeResults.ForEach(ir => ir.Wait());
return sourceList;
}
public static IEnumerable<T> SetItemPriority<T>(this IEnumerable<T> sourceList) where T : ObservableItem
{
Action<T, double> setPriority = new Action<T, double>((item, priority) =>
{
item.Priority = priority;
});
List<DispatcherOperation> invokeResults = new List<DispatcherOperation>();
foreach (var item in sourceList)
{
double priority = ......; // Some set of calculations
invokeResults.Add(
item.ItemDispatcher.BeginInvoke(setPriority,
new object[] { item, priority }));
}
invokeResults.ForEach(ir => ir.Wait());
return sourceList;
}
}
Most often, the first time methods are called, there is some overhead associated with the JIT compilation time. This will have an effect (though most likely not that much).
However, looking at your code, you're spending a huge amount of time waiting on asynchronous calls being marshalled to the UI via the dispatcher. This is going to put a large hit on your overall performance and slow things way down.
I'd recommend doing all of your operations in one dispatch call, and using Invoke instead of BeginInvoke. Instead of marshalling one message per item, just marshal a single delegate that includes the foreach loop through your items.
This will be significantly faster.
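As a rough sketch of that suggestion, SetItemIsAvailable could marshal a single delegate instead of one BeginInvoke per item (this assumes all items share one dispatcher, which the original code does not guarantee):
// Single marshal: one Invoke that updates every item, rather than one BeginInvoke
// per item. Assumes all items live on the same System.Windows.Threading.Dispatcher.
public static IEnumerable<T> SetItemIsAvailable<T>(this IEnumerable<T> sourceList,
    bool isAvailable, Dispatcher dispatcher) where T : ObservableItem
{
    dispatcher.Invoke((Action)(() =>
    {
        foreach (var item in sourceList)
        {
            item.IsAvailable = isAvailable;
        }
    }));
    return sourceList;
}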
The real issue, as I've since figured out, was the property used to sort the items being evaluated for the first time (before the extension methods are even called).
The property is of the form:
public Double ASortableProperty
{
get
{
if (mASortableProperty.HasValue)
{
return mASortableProperty.Value;
}
mASortableProperty = this.TryGetDoubleValue(...);
return (mASortableProperty.HasValue ? mASortableProperty.Value : 0);
}
}
Therefore, the first time through the loop the values had not yet been initialized from the database, and the cost was in retrieving them before the sort could take place.
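If you want the first timed pass to be representative, one option (my suggestion, not part of the original post) is to force that lazy lookup once before the timing loop starts:
// Touch the lazy property once per item outside the timed region, so the first
// measured iteration does not also pay the database retrieval cost.
foreach (var item in myObject.Items)
{
    var warmup = item.ASortableProperty;
}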
