How to wait until an item goes through the pipeline? - c#

So, I'm trying to wrap my head around Microsoft's Dataflow library. I've built a very simple pipeline consisting of just two blocks:
var start = new TransformBlock<Foo, Bar>(...);
var end = new ActionBlock<Bar>(...);
start.LinkTo(end);
Now I can asynchronously process Foo instances by calling:
start.SendAsync(new Foo());
What I do not understand is how to do the processing synchronously, when needed. I thought that waiting on SendAsync would be enough:
start.SendAsync(new Foo()).Wait();
But apparently it returns as soon as the item is accepted by the first block in the pipeline, not when the item has been fully processed. So is there a way to wait until a given item has been processed by the last (end) block? Apart from passing a WaitHandle through the entire pipeline.

In short, that's not supported out of the box in Dataflow. Essentially what you need to do is tag the data so you can retrieve it when processing is done. I've written up a way to do this that lets the consumer await a Job as it gets processed by the pipeline. The only concession to pipeline design is that each block takes a KeyValuePair<Guid, T>. Below is the basic JobManager, and the post I wrote about it has more detail. Note that the code in the post is a bit dated and needs some updates, but it should get you going in the right direction.
namespace ConcurrentFlows.DataflowJobs {
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
/// <summary>
/// A generic interface defining that:
/// for a specified input type => an awaitable result is produced.
/// </summary>
/// <typeparam name="TInput">The type of data to process.</typeparam>
/// <typeparam name="TOutput">The type of data the consumer expects back.</typeparam>
public interface IJobManager<TInput, TOutput> {
Task<TOutput> SubmitRequest(TInput data);
}
/// <summary>
/// A TPL-Dataflow based job manager.
/// </summary>
/// <typeparam name="TInput">The type of data to process.</typeparam>
/// <typeparam name="TOutput">The type of data the consumer expects back.</typeparam>
public class DataflowJobManager<TInput, TOutput> : IJobManager<TInput, TOutput> {
/// <summary>
/// It is anticipated that jobHandler is an injected
/// singleton instance of a Dataflow based 'calculator', though this implementation
/// does not depend on it being a singleton.
/// </summary>
/// <param name="jobHandler">A singleton Dataflow block through which all jobs are processed.</param>
public DataflowJobManager(IPropagatorBlock<KeyValuePair<Guid, TInput>, KeyValuePair<Guid, TOutput>> jobHandler) {
if (jobHandler == null) { throw new ArgumentNullException(nameof(jobHandler)); }
this.JobHandler = jobHandler;
if (!alreadyLinked) {
JobHandler.LinkTo(ResultHandler, new DataflowLinkOptions() { PropagateCompletion = true });
alreadyLinked = true;
}
}
private static bool alreadyLinked = false;
/// <summary>
/// Submits the request to the JobHandler and asynchronously awaits the result.
/// </summary>
/// <param name="data">The input data to be processd.</param>
/// <returns></returns>
public async Task<TOutput> SubmitRequest(TInput data) {
var taggedData = TagInputData(data);
var job = CreateJob(taggedData);
Jobs.TryAdd(job.Key, job.Value);
await JobHandler.SendAsync(taggedData);
return await job.Value.Task;
}
private static ConcurrentDictionary<Guid, TaskCompletionSource<TOutput>> Jobs {
get;
} = new ConcurrentDictionary<Guid, TaskCompletionSource<TOutput>>();
private static ExecutionDataflowBlockOptions Options {
get;
} = GetResultHandlerOptions();
private static ITargetBlock<KeyValuePair<Guid, TOutput>> ResultHandler {
get;
} = CreateReplyHandler(Options);
private IPropagatorBlock<KeyValuePair<Guid, TInput>, KeyValuePair<Guid, TOutput>> JobHandler {
get;
}
private KeyValuePair<Guid, TInput> TagInputData(TInput data) {
var id = Guid.NewGuid();
return new KeyValuePair<Guid, TInput>(id, data);
}
private KeyValuePair<Guid, TaskCompletionSource<TOutput>> CreateJob(KeyValuePair<Guid, TInput> taggedData) {
var id = taggedData.Key;
var jobCompletionSource = new TaskCompletionSource<TOutput>();
return new KeyValuePair<Guid, TaskCompletionSource<TOutput>>(id, jobCompletionSource);
}
private static ExecutionDataflowBlockOptions GetResultHandlerOptions() {
return new ExecutionDataflowBlockOptions() {
MaxDegreeOfParallelism = Environment.ProcessorCount,
BoundedCapacity = 1000
};
}
private static ITargetBlock<KeyValuePair<Guid, TOutput>> CreateReplyHandler(ExecutionDataflowBlockOptions options) {
return new ActionBlock<KeyValuePair<Guid, TOutput>>((result) => {
ReceiveOutput(result);
}, options);
}
private static void ReceiveOutput(KeyValuePair<Guid, TOutput> result) {
var jobId = result.Key;
TaskCompletionSource<TOutput> jobCompletionSource;
if (!Jobs.TryRemove(jobId, out jobCompletionSource)) {
throw new InvalidOperationException($"The jobId: {jobId} was not found.");
}
var resultValue = result.Value;
jobCompletionSource.SetResult(resultValue);
}
}
}
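To make the wiring concrete, here is a minimal usage sketch (my addition, not part of the original post). It reuses the Foo/Bar types from the question and a hypothetical ProcessFoo function; the important point is that the handler block must carry the Guid tag through to its output so the ResultHandler can complete the right job.
// Hypothetical wiring: the handler preserves the Guid key so results can be matched to jobs.
var handler = new TransformBlock<KeyValuePair<Guid, Foo>, KeyValuePair<Guid, Bar>>(
    pair => new KeyValuePair<Guid, Bar>(pair.Key, ProcessFoo(pair.Value)));

var manager = new DataflowJobManager<Foo, Bar>(handler);

// Each caller awaits only its own item, even with many items in flight.
Bar result = await manager.SubmitRequest(new Foo());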

I ended up using the following pipeline:
var start = new TransformBlock<FooBar, FooBar>(...);
var end = new ActionBlock<FooBar>(item => item.Complete());
start.LinkTo(end);
var input = new FooBar {Input = new Foo()};
start.SendAsync(input);
input.Task.Wait();
Where
class FooBar
{
public Foo Input { get; set; }
public Bar Result { get; set; }
public Task<Bar> Task { get { return _taskSource.Task; } }
public void Complete()
{
_taskSource.SetResult(Result);
}
private TaskCompletionSource<Bar> _taskSource = new TaskCompletionSource<Bar>();
}
Less than ideal, but it works.
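For readers trying this pattern, here is a purely illustrative sketch of what the elided transform might look like; DoWork is a placeholder for the real Foo-to-Bar processing and is not part of the original answer.
// Illustrative only: the transform fills in Result and forwards the same FooBar envelope,
// so the final ActionBlock can call Complete() and release the waiting caller.
var start = new TransformBlock<FooBar, FooBar>(item =>
{
    item.Result = DoWork(item.Input); // DoWork is a hypothetical Foo -> Bar function
    return item;
});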

Related

How to implement a sorted buffer?

I need to traverse a collection of disjoint folders; each folder is associated with a visited time configured somewhere inside the folder.
I then sort the folders and process the one with the earliest visited time first. Note that the processing is generally slower than the traversal.
My code targets .NET Framework 4.8.1; currently my implementation is as follows:
public class BySeparateThread
{
ConcurrentDictionary<string, DateTime?> _dict = new ConcurrentDictionary<string, DateTime?>();
private object _lock = new object();
/// <summary>
/// this will be called by producer thread;
/// </summary>
/// <param name="address"></param>
/// <param name="time"></param>
public void add(string address,DateTime? time) {
_dict.TryAdd(address, time);
}
/// <summary>
/// called by subscriber thread;
/// </summary>
/// <returns></returns>
public string? next() {
lock (_lock) {
var r = _dict.FirstOrDefault();
//return sortedList.FirstOrDefault().Value;
if (r.Key is null)
{
return r.Key;
}
if (r.Value is null)
{
_dict.TryRemove(r.Key, out var _);
return r.Key;
}
foreach (var item in _dict.Skip(1))
{
if (item.Value is null)
{
_dict.TryRemove(item.Key, out var _);
return item.Key;
}
if (item.Value < r.Value)
{
r = item;
}
}
// remove and return the entry with the earliest time found during the scan
_dict.TryRemove(r.Key, out var _);
return r.Key;
}
}
/// <summary>
/// this will be set to false by the producer thread;
/// </summary>
public bool _notComplete = true;
/// <summary>
/// shared configuration for subscribers;
/// </summary>
fs.addresses_.disjoint.deV_._bak.Io io; //.io_._CfgX.Create(cancel, git)
/// <summary>
/// run this in a separate thread from the one calling <see cref="add(string, DateTime?)"/>
/// </summary>
/// <param name="sln"></param>
/// <returns></returns>
public async Task _asyn_ofAddress(string sln)
{
while (_notComplete)
{
var f = next();
if (f is null )
{
await Task.Delay(30*1000);
//await Task.Yield();
continue;
}
/// degree of concurrency is controlled by a semaphore; for instance, at most 4 are tackled:
new dev.srcs.each.sln_.delvable.Bak_srcsInAddresses(io)._startTask_ofAddress(sln);
}
}
}
For the above, I'm concerned about the while (_notComplete) part, as many iterations would be spent doing nothing. I think there should be a better way to remove the polling loop by exploiting the fact that the collection can signal whether it is empty, for example at the point where we add.
There is probably a better implementation based on some mature framework, such as the ones I've been considering these days, but I often get stopped short by the implementation details:
BlockingCollection
for this one, I don't know how to keep the collection sorted dynamically while the producer and the consumer are both running;
Channel
Again, I could not come up with one fitting my needs after reading its examples;
Pipeline
I have not fully understood it yet;
Rx
I tried to implement an observable and an observer. It only gives me a high-level skeleton; when I get into the details, I end up with what I'm currently doing, and I begin to wonder whether, with what I'm doing, I even need Rx here.
Dataflow
Shall I implement my own BufferBlock or ActionBlock? It seems the built-in BufferBlock cannot be customized to sort items before releasing them to the next block.
Sorting buffered Observables seems similar to my problem, but it ends with a solution similar to the one I currently have and am not satisfied with, as stated above.
Could someone give me some sample code? Please give as concrete code as you can; as you can see, I have researched some general ideas/paths, and what finally stops me short is the details, which are often glossed over in the docs.
I just found one solution which is better than my current one. I believe there are even better ones, so please do post your answers if you find some; my current one is just what I could hack together with what I know so far.
I found Prioritized queues in Task Parallel Library, and I wrote a similar one for my case:
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reactive.Subjects;
using System.Threading;
using System.Threading.Tasks;
namespace nilnul.dev.srcs.every.slns._bak
{
public class BySortedSet : IProducerConsumerCollection<(string, DateTime)>
{
private class _Comparer : IComparer<(string, DateTime)>
{
public int Compare((string, DateTime) first, (string, DateTime) second)
{
var returnValue = first.Item2.CompareTo(second.Item2);
if (returnValue == 0)
returnValue = first.Item1.CompareTo(second.Item1);
return returnValue;
}
static public _Comparer Singleton
{
get
{
return nilnul._obj.typ_.nilable_.unprimable_.Singleton<_Comparer>.Instance;// just some magic to get an instance
}
}
}
SortedSet<(string, DateTime)> _dict = new SortedSet<(string, DateTime)>(
_Comparer.Singleton
);
private object _lock=new object();
public int Count
{
get
{
lock(_lock){
return _dict.Count;
}
}
}
public object SyncRoot => _lock;
public bool IsSynchronized => true;
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
//throw new NotImplementedException();
}
public void CopyTo((string, DateTime)[] array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array[index++] = item;
}
}
}
public void CopyTo(Array array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array.SetValue(item, index++);
}
}
}
public bool TryAdd((string, DateTime) item)
{
lock (_lock)
{
return _dict.Add(item);
}
}
public bool TryTake(out (string, DateTime) item)
{
lock (_lock)
{
item = _dict.Min;
if (item==default)
{
return false;
}
return _dict.Remove(item);
}
}
public (string, DateTime)[] ToArray()
{
lock (_lock)
{
return this._dict.ToArray();
}
}
public IEnumerator<(string, DateTime)> GetEnumerator()
{
return ToArray().AsEnumerable().GetEnumerator();
}
/// <summary>
/// </summary>
/// <returns></returns>
public BlockingCollection<(string, DateTime)> asBlockingCollection() {
return new BlockingCollection<(string, DateTime)>(
this
);
}
}
}
Then I can use that like:
static public void ExampleUse(CancellationToken cancellationToken) {
var s = new BySortedSet().asBlockingCollection();
/// traversal thread:
s.Add(("", DateTime.MinValue));
//...
s.CompleteAdding();
/// tackler thread:
///
foreach (var item in s.GetConsumingEnumerable(cancellationToken))
{
/// process the item;
/// todo: degree of parallelism is controlled by the tackler, or is there a better way like in dataflow or Rx or sth else?
}
}
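Regarding the degree-of-parallelism question in the comment above, one possible direction (a sketch only, requiring System.Threading.Tasks.Dataflow, with ProcessItem as a placeholder for the real work) is to pump the consuming enumerable into an ActionBlock with a bounded MaxDegreeOfParallelism:
// Sketch: at most 4 items are processed concurrently while the foreach keeps feeding the block.
var worker = new ActionBlock<(string, DateTime)>(
    item => ProcessItem(item),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

foreach (var item in s.GetConsumingEnumerable(cancellationToken))
{
    worker.Post(item);
}
worker.Complete();
worker.Completion.Wait(cancellationToken);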
Thanks!

Mocking Redlock.CreateAsync does not return mocked object

I am trying to Mock Redlock
I have the test below
using Moq;
using RedLockNet;
using System;
using System.Threading;
using System.Threading.Tasks;
using Xunit;
namespace RedLock.Tests
{
public class RedLockTests
{
[Fact]
public async Task TestMockingOfRedlock()
{
var redLockFactoryMock = new Mock<IDistributedLockFactory>();
var mock = new MockRedlock();
redLockFactoryMock.Setup(x => x.CreateLockAsync(It.IsAny<string>(),
It.IsAny<TimeSpan>(), It.IsAny<TimeSpan>(),
It.IsAny<TimeSpan>(), It.IsAny<CancellationToken>()))
.ReturnsAsync(mock);
var sut = new TestRedlockHandler(redLockFactoryMock.Object);
var data = new MyEventData();
await sut.Handle(data);
}
}
}
MockRedlock is a simple mock class that implements IRedLock
public class MockRedlock: IRedLock
{
public void Dispose()
{
}
public string Resource { get; }
public string LockId { get; }
public bool IsAcquired => true;
public RedLockStatus Status => RedLockStatus.Acquired;
public RedLockInstanceSummary InstanceSummary => new RedLockInstanceSummary();
public int ExtendCount { get; }
}
await sut.Handle(data); is a call into a separate handler class.
I have shown this below. It has been simplified, but using the code below and the test above the null reference error can be reproduced.
public class MyEventData
{
public string Id { get; set; }
public MyEventData()
{
Id = Guid.NewGuid().ToString();
}
}
public class TestRedlockHandler
{
private IDistributedLockFactory _redLockFactory;
public TestRedlockHandler(IDistributedLockFactory redLockFactory)
{
_redLockFactory = redLockFactory;
}
public async Task Handle(MyEventData data)
{
var lockexpiry = TimeSpan.FromMinutes(2.5);
var waitspan = TimeSpan.FromMinutes(2);
var retryspan = TimeSpan.FromSeconds(20);
using (var redlock =
await _redLockFactory.CreateLockAsync(data.Id.ToString(), lockexpiry, waitspan, retryspan, null))
{
if (!redlock.IsAcquired)
{
string errorMessage =
$"Did not acquire Lock on Lead {data.Id.ToString()}. Aborting.\n " +
$"Acquired{redlock.InstanceSummary.Acquired} \n " +
$"Error{redlock.InstanceSummary.Error} \n" +
$"Conflicted {redlock.InstanceSummary.Conflicted} \n" +
$"Status {redlock.Status}";
throw new Exception(errorMessage);
}
}
}
}
When I try to call this I expect my object to be returned, but instead I get null
On the line if (!redlock.IsAcquired) redLock is null
What is missing?
The definition of CreateLockAsync
/// <summary>
/// Gets a RedLock using the factory's set of redis endpoints. You should check the IsAcquired property before performing actions.
/// Blocks and retries up to the specified time limits.
/// </summary>
/// <param name="resource">The resource string to lock on. Only one RedLock should be acquired for any given resource at once.</param>
/// <param name="expiryTime">How long the lock should be held for.
/// RedLocks will automatically extend if the process that created the RedLock is still alive and the RedLock hasn't been disposed.</param>
/// <param name="waitTime">How long to block for until a lock can be acquired.</param>
/// <param name="retryTime">How long to wait between retries when trying to acquire a lock.</param>
/// <param name="cancellationToken">CancellationToken to abort waiting for blocking lock.</param>
/// <returns>A RedLock object.</returns>
Task<IRedLock> CreateLockAsync(string resource, TimeSpan expiryTime, TimeSpan waitTime, TimeSpan retryTime, CancellationToken? cancellationToken = null);
requires a nullable CancellationToken
CancellationToken? cancellationToken = null
But the setup of the mock uses
It.IsAny<CancellationToken>() //NOTE CancellationToken instead of CancellationToken?
Because the setup expects the non-nullable struct, while at invocation time the nullable CancellationToken? is what actually gets passed (even though it is null), the setup does not match the actual invocation and the mock returns null by default.
Once the correct type is used, the factory returns the desired mock:
//...
redLockFactoryMock
.Setup(x => x.CreateLockAsync(It.IsAny<string>(),
It.IsAny<TimeSpan>(), It.IsAny<TimeSpan>(),
It.IsAny<TimeSpan>(), It.IsAny<CancellationToken?>()))
.ReturnsAsync(mock);
//...

How to build a dynamic command object?

I'll try to make this as clear as possible.
A plugin architecture using reflection, two attributes, and an abstract class:
PluginEntryAttribute(Targets.Assembly, typeof(MyPlugin))
PluginImplAttribute(Targets.Class, ...)
abstract class Plugin
Commands are routed to a plugin via an interface and a delegate:
Ex: public delegate TTarget Command<TTarget>(object obj);
Using extension methods with Command<> as the target, a CommandRouter executes the delegate on the correct target interface:
Ex:
public static TResult Execute<TTarget, TResult>(this Command<TTarget> target, Func<TTarget, TResult> func) {
return CommandRouter.Default.Execute(func);
}
Putting this together, I have a class hard-coded with the command delegates like so:
public class Repositories {
public static Command<IDispatchingRepository> Dispatching = (o) => { return (IDispatchingRepository)o; };
public static Command<IPositioningRepository> Positioning = (o) => { return (IPositioningRepository)o; };
public static Command<ISchedulingRepository> Scheduling = (o) => { return (ISchedulingRepository)o; };
public static Command<IHistographyRepository> Histography = (o) => { return (IHistographyRepository)o; };
}
When an object wants to query from the repository, practical execution looks like this:
var expBob = Dispatching.Execute(repo => repo.AddCustomer("Bob"));
var actBob = Dispatching.Execute(repo => repo.GetCustomer("Bob"));
My question is this: how can I create such a class as Repositories dynamically from the plugins?
I can see the possibility that another attribute might be necessary. Something along the lines of:
[RoutedCommand("Dispatching", typeof(IDispatchingRepository)")]
public Command<IDispatchingRepository> Dispatching = (o) => { return (IDispatchingRepository)o; };
This is just an idea, but I'm at a loss as to how I'd still create a dynamic menu of sorts like the Repositories class.
For completeness, the CommandRouter.Execute(...) method and related Dictionary<,>:
private readonly Dictionary<Type, object> commandTargets;
internal TResult Execute<TTarget, TResult>(Func<TTarget, TResult> func) {
var result = default(TResult);
if (commandTargets.TryGetValue(typeof(TTarget), out object target)) {
result = func((TTarget)target);
}
return result;
}
OK, I am not sure if this is what you are looking for. I am assuming that each plugin contains a field of the following form:
public Command<T> {Name} = (o) => { return (T)o; };
for example, from the code you provided:
public Command<IDispatchingRepository> Dispatching = (o) => { return (IDispatchingRepository)o; };
One way to dynamically create a class in .NET Core is by using the Microsoft.CodeAnalysis.CSharp NuGet package - this is Roslyn.
The result is a compiled assembly with a class called DynamicRepositories that exposes, as public static fields, all the command fields from all plugins in all DLLs loaded into the current AppDomain.
The code has 3 main components: the DynamicRepositoriesBuildInfo class, the GetDynamicRepositoriesBuildInfo method and the LoadDynamicRepositoryIntoAppDomain method.
DynamicRepositoriesBuildInfo - holds the command fields discovered in the plugins and all assemblies that need to be referenced during the dynamic compilation. These are the assemblies that define the Command type and the generic arguments of the Command type (e.g. IDispatchingRepository).
GetDynamicRepositoriesBuildInfo method - creates a DynamicRepositoriesBuildInfo using reflection by scanning loaded assemblies for the PluginEntryAttribute and PluginImplAttribute.
LoadDynamicRepositoryIntoAppDomain method - from the DynamicRepositoriesBuildInfo it creates an assembly called DynamicRepository.dll with a single public class, App.Dynamic.DynamicRepositories.
Here is the code
public class DynamicRepositoriesBuildInfo
{
public IReadOnlyCollection<Assembly> ReferencesAssemblies { get; }
public IReadOnlyCollection<FieldInfo> PluginCommandFieldInfos { get; }
public DynamicRepositoriesBuildInfo(
IReadOnlyCollection<Assembly> referencesAssemblies,
IReadOnlyCollection<FieldInfo> pluginCommandFieldInfos)
{
this.ReferencesAssemblies = referencesAssemblies;
this.PluginCommandFieldInfos = pluginCommandFieldInfos;
}
}
private static DynamicRepositoriesBuildInfo GetDynamicRepositoriesBuildInfo()
{
var pluginCommandProperties = (from a in AppDomain.CurrentDomain.GetAssemblies()
let entryAttr = a.GetCustomAttribute<PluginEntryAttribute>()
where entryAttr != null
from t in a.DefinedTypes
where t == entryAttr.PluginType
from p in t.GetFields(BindingFlags.Public | BindingFlags.Instance)
where p.FieldType.IsGenericType && p.FieldType.GetGenericTypeDefinition() == typeof(Command<>)
select p).ToList();
var referenceAssemblies = pluginCommandProperties
.Select(x => x.DeclaringType.Assembly)
.ToList();
referenceAssemblies.AddRange(
pluginCommandProperties
.SelectMany(x => x.FieldType.GetGenericArguments())
.Select(x => x.Assembly)
);
var buildInfo = new DynamicRepositoriesBuildInfo(
pluginCommandFieldInfos: pluginCommandProperties,
referencesAssemblies: referenceAssemblies.Distinct().ToList()
);
return buildInfo;
}
private static Assembly LoadDynamicRepositoryIntoAppDomain()
{
var buildInfo = GetDynamicRepositoriesBuildInfo();
var csScriptBuilder = new StringBuilder();
csScriptBuilder.AppendLine("using System;");
csScriptBuilder.AppendLine("namespace App.Dynamic");
csScriptBuilder.AppendLine("{");
csScriptBuilder.AppendLine(" public class DynamicRepositories");
csScriptBuilder.AppendLine(" {");
foreach (var commandFieldInfo in buildInfo.PluginCommandFieldInfos)
{
var commandNamespaceStr = commandFieldInfo.FieldType.Namespace;
var commandTypeStr = commandFieldInfo.FieldType.Name.Split('`')[0];
var commandGenericArgStr = commandFieldInfo.FieldType.GetGenericArguments().Single().FullName;
var commandFieldNameStr = commandFieldInfo.Name;
csScriptBuilder.AppendLine($"public {commandNamespaceStr}.{commandTypeStr}<{commandGenericArgStr}> {commandFieldNameStr} => (o) => ({commandGenericArgStr})o;");
}
csScriptBuilder.AppendLine(" }");
csScriptBuilder.AppendLine("}");
var sourceText = SourceText.From(csScriptBuilder.ToString());
var parseOpt = CSharpParseOptions.Default.WithLanguageVersion(LanguageVersion.CSharp7_3);
var syntaxTree = SyntaxFactory.ParseSyntaxTree(sourceText, parseOpt);
var references = new List<MetadataReference>
{
MetadataReference.CreateFromFile(typeof(object).Assembly.Location),
MetadataReference.CreateFromFile(typeof(System.Runtime.AssemblyTargetedPatchBandAttribute).Assembly.Location),
};
references.AddRange(buildInfo.ReferencesAssemblies.Select(a => MetadataReference.CreateFromFile(a.Location)));
var compileOpt = new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary,
optimizationLevel: OptimizationLevel.Release,
assemblyIdentityComparer: DesktopAssemblyIdentityComparer.Default);
var compilation = CSharpCompilation.Create(
"DynamicRepository.dll",
new[] { syntaxTree },
references: references,
options: compileOpt);
using (var memStream = new MemoryStream())
{
var result = compilation.Emit(memStream);
if (result.Success)
{
var assembly = AppDomain.CurrentDomain.Load(memStream.ToArray());
return assembly;
}
else
{
throw new ArgumentException();
}
}
}
This is how to execute the code
var assembly = LoadDynamicRepositoryIntoAppDomain();
var type = assembly.GetType("App.Dynamic.DynamicRepositories");
The type variable represents the compiled class, which has all the plugin commands as public static fields. You lose all type safety once you start using dynamic code compilation/building. If you need to execute some code from the type variable, you will need reflection.
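For example, pulling one of the generated static fields out and invoking it might look roughly like this (my sketch; it assumes the plugins define a Dispatching command as in the question and reuses the Execute extension method shown there):
// Sketch: read the generated static field via reflection and use it like the hand-written one.
var dispatchingField = type.GetField("Dispatching", BindingFlags.Public | BindingFlags.Static);
var dispatching = (Command<IDispatchingRepository>)dispatchingField.GetValue(null);
var bob = dispatching.Execute(repo => repo.AddCustomer("Bob"));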
So if you have
PluginA
{
public Command<IDispatchingRepository> Dispatching= (o) => ....
}
PluginB
{
public Command<IDispatchingRepository> Scheduling = (o) => ....
}
the dynamically created type will look like this
public class DynamicRepositories
{
public static Command<IDispatchingRepository> Dispatching= (o) => ....
public static Command<IDispatchingRepository> Scheduling = (o) => ....
}
Here's another take, which does not require building code dynamically.
I'm assuming the following code for the plugin framework. Note that I did not make any assumptions regarding the abstract Plugin class, because I had no further information.
#region Plugin Framework
public delegate TTarget Command<out TTarget>(object obj);
/// <summary>
/// Abstract base class for plugins.
/// </summary>
public abstract class Plugin
{
}
#endregion
Next, here are two sample plugins. Note the DynamicTarget custom attributes, which I will describe in the next step.
#region Sample Plugin: ICustomerRepository
/// <summary>
/// Sample model class, representing a customer.
/// </summary>
public class Customer
{
public Customer(string name)
{
Name = name;
}
public string Name { get; }
}
/// <summary>
/// Sample target interface.
/// </summary>
public interface ICustomerRepository
{
Customer AddCustomer(string name);
Customer GetCustomer(string name);
}
/// <summary>
/// Sample plugin.
/// </summary>
[DynamicTarget(typeof(ICustomerRepository))]
public class CustomerRepositoryPlugin : Plugin, ICustomerRepository
{
private readonly Dictionary<string, Customer> _customers = new Dictionary<string, Customer>();
public Customer AddCustomer(string name)
{
var customer = new Customer(name);
_customers[name] = customer;
return customer;
}
public Customer GetCustomer(string name)
{
return _customers[name];
}
}
#endregion
#region Sample Plugin: IProductRepository
/// <summary>
/// Sample model class, representing a product.
/// </summary>
public class Product
{
public Product(string name)
{
Name = name;
}
public string Name { get; }
}
/// <summary>
/// Sample target interface.
/// </summary>
public interface IProductRepository
{
Product AddProduct(string name);
Product GetProduct(string name);
}
/// <summary>
/// Sample plugin.
/// </summary>
[DynamicTarget(typeof(IProductRepository))]
public class ProductRepositoryPlugin : Plugin, IProductRepository
{
private readonly Dictionary<string, Product> _products = new Dictionary<string, Product>();
public Product AddProduct(string name)
{
var product = new Product(name);
_products[name] = product;
return product;
}
public Product GetProduct(string name)
{
return _products[name];
}
}
#endregion
Here's what your static Repositories class would look like with the two sample plugins:
#region Static Repositories Example Class from Question
public static class Repositories
{
public static readonly Command<ICustomerRepository> CustomerRepositoryCommand = o => (ICustomerRepository) o;
public static readonly Command<IProductRepository> ProductRepositoryCommand = o => (IProductRepository) o;
}
#endregion
To begin the actual answer to your question here's the custom attribute used to mark the plugins. This custom attribute has been used on the two example plugins shown above.
/// <summary>
/// Marks a plugin as the target of a <see cref="Command{TTarget}" />, specifying
/// the type to be registered with the <see cref="DynamicCommands" />.
/// </summary>
[AttributeUsage(AttributeTargets.Class, Inherited = false, AllowMultiple = true)]
public class DynamicTargetAttribute : Attribute
{
public DynamicTargetAttribute(Type type)
{
Type = type;
}
public Type Type { get; }
}
The custom attribute is parsed in the RegisterDynamicTargets(Assembly) method of the following DynamicCommands class to identify the plugins and the types (e.g., ICustomerRepository) to be registered. The targets are registered with the CommandRouter shown below.
/// <summary>
/// A dynamic command repository.
/// </summary>
public static class DynamicCommands
{
/// <summary>
/// For all assemblies in the current domain, registers all targets marked with the
/// <see cref="DynamicTargetAttribute" />.
/// </summary>
public static void RegisterDynamicTargets()
{
foreach (Assembly assembly in AppDomain.CurrentDomain.GetAssemblies())
{
RegisterDynamicTargets(assembly);
}
}
/// <summary>
/// For the given <see cref="Assembly" />, registers all targets marked with the
/// <see cref="DynamicTargetAttribute" />.
/// </summary>
/// <param name="assembly"></param>
public static void RegisterDynamicTargets(Assembly assembly)
{
IEnumerable<Type> types = assembly
.GetTypes()
.Where(type => type.CustomAttributes
.Any(ca => ca.AttributeType == typeof(DynamicTargetAttribute)));
foreach (Type type in types)
{
// Note: This assumes that we simply instantiate an instance upon registration.
// You might have a different convention with your plugins (e.g., they might be
// singletons accessed via an Instance or Default property). Therefore, you
// might have to change this.
object target = Activator.CreateInstance(type);
IEnumerable<CustomAttributeData> customAttributes = type.CustomAttributes
.Where(ca => ca.AttributeType == typeof(DynamicTargetAttribute));
foreach (CustomAttributeData customAttribute in customAttributes)
{
CustomAttributeTypedArgument argument = customAttribute.ConstructorArguments.First();
CommandRouter.Default.RegisterTarget((Type) argument.Value, target);
}
}
}
/// <summary>
/// Registers the given target.
/// </summary>
/// <typeparam name="TTarget">The type of the target.</typeparam>
/// <param name="target">The target.</param>
public static void RegisterTarget<TTarget>(TTarget target)
{
CommandRouter.Default.RegisterTarget(target);
}
/// <summary>
/// Gets the <see cref="Command{TTarget}" /> for the given <typeparamref name="TTarget" />
/// type.
/// </summary>
/// <typeparam name="TTarget">The target type.</typeparam>
/// <returns>The <see cref="Command{TTarget}" />.</returns>
public static Command<TTarget> Get<TTarget>()
{
return obj => (TTarget) obj;
}
/// <summary>
/// Extension method used to help dispatch the command.
/// </summary>
/// <typeparam name="TTarget">The type of the target.</typeparam>
/// <typeparam name="TResult">The type of the result of the function invoked on the target.</typeparam>
/// <param name="_">The <see cref="Command{TTarget}" />.</param>
/// <param name="func">The function invoked on the target.</param>
/// <returns>The result of the function invoked on the target.</returns>
public static TResult Execute<TTarget, TResult>(this Command<TTarget> _, Func<TTarget, TResult> func)
{
return CommandRouter.Default.Execute(func);
}
}
Instead of dynamically creating properties, the above utility class offers a simple Command<TTarget> Get<TTarget>() method, with which you can create the Command<TTarget> instance, which is then used in the Execute extension method. The latter method finally delegates to the CommandRouter shown next.
/// <summary>
/// Command router used to dispatch commands to targets.
/// </summary>
public class CommandRouter
{
public static readonly CommandRouter Default = new CommandRouter();
private readonly Dictionary<Type, object> _commandTargets = new Dictionary<Type, object>();
/// <summary>
/// Registers a target.
/// </summary>
/// <typeparam name="TTarget">The type of the target instance.</typeparam>
/// <param name="target">The target instance.</param>
public void RegisterTarget<TTarget>(TTarget target)
{
_commandTargets[typeof(TTarget)] = target;
}
/// <summary>
/// Registers a target instance by <see cref="Type" />.
/// </summary>
/// <param name="type">The <see cref="Type" /> of the target.</param>
/// <param name="target">The target instance.</param>
public void RegisterTarget(Type type, object target)
{
_commandTargets[type] = target;
}
internal TResult Execute<TTarget, TResult>(Func<TTarget, TResult> func)
{
var result = default(TResult);
if (_commandTargets.TryGetValue(typeof(TTarget), out object target))
{
result = func((TTarget)target);
}
return result;
}
}
#endregion
Finally, here are a few unit tests showing how the above classes work.
#region Unit Tests
public class DynamicCommandTests
{
[Fact]
public void TestUsingStaticRepository_StaticDeclaration_Success()
{
ICustomerRepository customerRepository = new CustomerRepositoryPlugin();
CommandRouter.Default.RegisterTarget(customerRepository);
Command<ICustomerRepository> command = Repositories.CustomerRepositoryCommand;
Customer expected = command.Execute(repo => repo.AddCustomer("Bob"));
Customer actual = command.Execute(repo => repo.GetCustomer("Bob"));
Assert.Equal(expected, actual);
Assert.Equal("Bob", actual.Name);
}
[Fact]
public void TestUsingDynamicRepository_ManualRegistration_Success()
{
ICustomerRepository customerRepository = new CustomerRepositoryPlugin();
DynamicCommands.RegisterTarget(customerRepository);
Command<ICustomerRepository> command = DynamicCommands.Get<ICustomerRepository>();
Customer expected = command.Execute(repo => repo.AddCustomer("Bob"));
Customer actual = command.Execute(repo => repo.GetCustomer("Bob"));
Assert.Equal(expected, actual);
Assert.Equal("Bob", actual.Name);
}
[Fact]
public void TestUsingDynamicRepository_DynamicRegistration_Success()
{
// Register all plugins, i.e., CustomerRepositoryPlugin and ProductRepositoryPlugin
// in this test case.
DynamicCommands.RegisterDynamicTargets();
// Invoke ICustomerRepository methods on CustomerRepositoryPlugin target.
Command<ICustomerRepository> customerCommand = DynamicCommands.Get<ICustomerRepository>();
Customer expectedBob = customerCommand.Execute(repo => repo.AddCustomer("Bob"));
Customer actualBob = customerCommand.Execute(repo => repo.GetCustomer("Bob"));
Assert.Equal(expectedBob, actualBob);
Assert.Equal("Bob", actualBob.Name);
// Invoke IProductRepository methods on ProductRepositoryPlugin target.
Command<IProductRepository> productCommand = DynamicCommands.Get<IProductRepository>();
Product expectedHammer = productCommand.Execute(repo => repo.AddProduct("Hammer"));
Product actualHammer = productCommand.Execute(repo => repo.GetProduct("Hammer"));
Assert.Equal(expectedHammer, actualHammer);
Assert.Equal("Hammer", actualHammer.Name);
}
}
#endregion
You can find the whole implementation here.

Performance and Threading improvements ASP.Net MVC

I'm totally new to threading and want to make use of a 6-core processor to gain some improvements.
I'm trying to find some quick wins. My little business is growing and I've noticed some performance hits (a couple of customers have told me) when completing a few sections on my service. Some of it, I'm guessing, may be down to sending emails and waiting for the third party to respond. Is there an easy way to pass this off onto another thread while not breaking the session/service?
I have an action when an appointment is "Completed"
switch (appointment.State)
{
case DomainObjects.AppointmentState.Completed:
_clientService.SendMessageToClient(clientId, "Email title", EmailMessage(appointment, "AppointmentThankYou"), appointment.Id, userId);
break;
}
Is this better?
case DomainObjects.AppointmentState.Completed:
var emailThread = new Thread(() => _clientService.SendMessageToClient(clientId,"Email Subject",
EmailMessage(appointment, "AppointmentThankYou"),
appointment.Id, userId))
{
IsBackground = true
};
emailThread.Start();
Constructive Feedback welcomed.
To be honest I believe that your approach outlined above needs a considerable amount of work before releasing to production. Imagine the situation where 1000 users click on your site - you now have 1000 background tasks all trying to send messages at the same time. This will probably bottleneck your system for disk and network IO.
While there are a number of ways to approach this problem, one of the most common is to use a producer-consumer queue, preferably built on a thread-safe collection such as ConcurrentQueue, with the number of long-running operations (e.g. email sends) in flight at any time controlled by a synchronization mechanism such as SemaphoreSlim.
I've created a very simple application to demonstrate this approach as shown below. The key classes in this are
The MessagingProcessor class maintains the queue and controls access, both for adding items (the AddToQueue method) and for sending messages (ReadFromQueue). The class itself implements the Singleton pattern to ensure only one instance is ever present in the application (you don't want multiple queues). The ReadFromQueue method also sets up a timer (set to 2 seconds) which specifies how often a task should be spawned to send messages.
The SingletonBase class is just an abstract class for implementing the Singleton pattern
The MessageSender class is used for the actual work of sending the message
The CreateMessagesForTest class simply simulates creating test messages for the purpose of this answer
Hope this helps
using System;
using System.Collections.Concurrent;
using System.Globalization;
using System.Reactive.Linq;
using System.Reflection;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApplication9
{
internal class Program
{
private static void Main(string[] args)
{
MessagingProcessor.Instance.ReadFromQueue(); // starts the message sending tasks
var createMessages = new CreateMessagesForTest();
createMessages.CreateTestMessages(); // creates sample test messages for processing
Console.ReadLine();
}
}
/// <summary>
/// Simply creates test messages every second for sending
/// </summary>
public class CreateMessagesForTest
{
public void CreateTestMessages()
{
IObservable<long> observable = Observable.Interval(TimeSpan.FromSeconds(1));
// Token for cancelation
var source = new CancellationTokenSource();
// Create task to execute.
Action action = (CreateMessage);
// Subscribe the observable to the task on execution.
observable.Subscribe(x =>
{
var task = new Task(action);
task.Start();
}, source.Token);
}
private static void CreateMessage()
{
var message = new Message { EMailAddress = "aa@aa.com", MessageBody = "abcdefg" };
MessagingProcessor.Instance.AddToQueue(message);
}
}
/// <summary>
/// The contents of the email to send
/// </summary>
public class Message
{
public string EMailAddress { get; set; }
public string MessageBody { get; set; }
}
/// <summary>
/// Handles all aspects of processing the messages, only one instance of this class is allowed
/// at any time
/// </summary>
public class MessagingProcessor : SingletonBase<MessagingProcessor>
{
private MessagingProcessor()
{
}
private ConcurrentQueue<Message> _messagesQueue = new ConcurrentQueue<Message>();
// create a semaphore to limit the number of threads which can send an email at any given time
// In this case only allow 2 to be processed at any given time
private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(2, 2);
public void AddToQueue(Message message)
{
_messagesQueue.Enqueue(message);
}
/// <summary>
/// Used to start the process for sending emails
/// </summary>
public void ReadFromQueue()
{
IObservable<long> observable = Observable.Interval(TimeSpan.FromSeconds(2));
// Token for cancelation
var source = new CancellationTokenSource();
// Create task to execute.
Action action = (SendMessages);
// Subscribe the observable to the task on execution.
observable.Subscribe(x =>
{
var task = new Task(action);
task.Start();
}, source.Token);
}
/// <summary>
/// Handles dequeuing and synchronisation of the queue
/// </summary>
public void SendMessages()
{
try
{
Semaphore.Wait();
Message message;
while (_messagesQueue.TryDequeue(out message)) // if we have a message to send
{
var messageSender = new MessageSender();
messageSender.SendMessage(message);
}
}
finally
{
Semaphore.Release();
}
}
}
/// <summary>
/// Sends the emails
/// </summary>
public class MessageSender
{
public void SendMessage(Message message)
{
// do some long running task
}
}
/// <summary>
/// Implements singleton pattern on all classes which derive from it
/// </summary>
/// <typeparam name="T">Derived class</typeparam>
public abstract class SingletonBase<T> where T : class
{
public static T Instance
{
get { return SingletonFactory.Instance; }
}
/// <summary>
/// The singleton class factory to create the singleton instance.
/// </summary>
private class SingletonFactory
{
static SingletonFactory()
{
}
private SingletonFactory()
{
}
internal static readonly T Instance = GetInstance();
private static T GetInstance()
{
var theType = typeof (T);
T inst;
try
{
inst = (T) theType
.InvokeMember(theType.Name,
BindingFlags.CreateInstance | BindingFlags.Instance
| BindingFlags.NonPublic,
null, null, null,
CultureInfo.InvariantCulture);
}
catch (MissingMethodException ex)
{
var exception = new TypeLoadException(string.Format(
CultureInfo.CurrentCulture,
"The type '{0}' must have a private constructor to " +
"be used in the Singleton pattern.", theType.FullName)
, ex);
//LogManager.LogException(LogManager.EventIdInternal, exception, "error in instantiating the singleton");
throw exception;
}
return inst;
}
}
}
}
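To connect this back to the original action: instead of spawning a thread per email, the Completed case would just enqueue a message and let the processor's semaphore decide how many sends run concurrently. A rough sketch follows; the Message here is the sample class above, so in practice it would need to carry whatever SendMessageToClient requires, and the variables shown are hypothetical.
case DomainObjects.AppointmentState.Completed:
    // Sketch only: enqueue the work; MessagingProcessor drains the queue on its own schedule
    // and the semaphore limits how many emails are being sent at once.
    MessagingProcessor.Instance.AddToQueue(new Message
    {
        EMailAddress = clientEmailAddress,   // hypothetical variable - the client's address
        MessageBody = renderedEmailBody      // hypothetical variable - the rendered "AppointmentThankYou" email
    });
    break;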

How can I make method signature caching?

I'm building an app in .NET and C#, and I'd like to cache some of the results by using attributes/annotations instead of explicit code in the method.
I'd like a method signature that looks a bit like this:
[Cache, timeToLive=60]
String getName(string id, string location)
It should make a hash based on the inputs, and use that as the key for the result.
Naturally, there'd be some config file telling it how to actually put in memcached, local dictionary or something.
Do you know of such a framework?
I'd even be interested in one for Java as well
With CacheHandler in Microsoft Enterprise Library you can easily achieve this.
For instance:
[CacheHandler(0, 30, 0)]
public Object GetData(Object input)
{
}
would cache all calls to that method for 30 minutes. Each invocation gets a unique cache key based on the input data and the method name, so calls with different inputs are cached separately, but if you call it more than once within the timeout interval with the same input, the method only gets executed once.
I've added some extra features to Microsoft's code:
My modified version looks like this:
using System;
using System.Diagnostics;
using System.IO;
using System.Reflection;
using System.Runtime.Remoting.Contexts;
using System.Text;
using System.Web;
using System.Web.Caching;
using System.Web.UI;
using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
using Microsoft.Practices.Unity.InterceptionExtension;
namespace Middleware.Cache
{
/// <summary>
/// An <see cref="ICallHandler"/> that implements caching of the return values of
/// methods. This handler stores the return value in the ASP.NET cache or the Items object of the current request.
/// </summary>
[ConfigurationElementType(typeof (CacheHandler)), Synchronization]
public class CacheHandler : ICallHandler
{
/// <summary>
/// The default expiration time for the cached entries: 5 minutes
/// </summary>
public static readonly TimeSpan DefaultExpirationTime = new TimeSpan(0, 5, 0);
private readonly object cachedData;
private readonly DefaultCacheKeyGenerator keyGenerator;
private readonly bool storeOnlyForThisRequest = true;
private TimeSpan expirationTime;
private GetNextHandlerDelegate getNext;
private IMethodInvocation input;
public CacheHandler(TimeSpan expirationTime, bool storeOnlyForThisRequest)
{
keyGenerator = new DefaultCacheKeyGenerator();
this.expirationTime = expirationTime;
this.storeOnlyForThisRequest = storeOnlyForThisRequest;
}
/// <summary>
/// This constructor is used when we wrap cached data in a CacheHandler so that
/// we can reload the object after it has been removed from the cache.
/// </summary>
/// <param name="expirationTime"></param>
/// <param name="storeOnlyForThisRequest"></param>
/// <param name="input"></param>
/// <param name="getNext"></param>
/// <param name="cachedData"></param>
public CacheHandler(TimeSpan expirationTime, bool storeOnlyForThisRequest,
IMethodInvocation input, GetNextHandlerDelegate getNext,
object cachedData)
: this(expirationTime, storeOnlyForThisRequest)
{
this.input = input;
this.getNext = getNext;
this.cachedData = cachedData;
}
/// <summary>
/// Gets or sets the expiration time for cache data.
/// </summary>
/// <value>The expiration time.</value>
public TimeSpan ExpirationTime
{
get { return expirationTime; }
set { expirationTime = value; }
}
#region ICallHandler Members
/// <summary>
/// Implements the caching behavior of this handler.
/// </summary>
/// <param name="input"><see cref="IMethodInvocation"/> object describing the current call.</param>
/// <param name="getNext">delegate used to get the next handler in the current pipeline.</param>
/// <returns>Return value from target method, or cached result if previous inputs have been seen.</returns>
public IMethodReturn Invoke(IMethodInvocation input, GetNextHandlerDelegate getNext)
{
lock (input.MethodBase)
{
this.input = input;
this.getNext = getNext;
return loadUsingCache();
}
}
public int Order
{
get { return 0; }
set { }
}
#endregion
private IMethodReturn loadUsingCache()
{
//We need to synchronize calls to the CacheHandler on method level
//to prevent duplicate calls to methods that could be cached.
lock (input.MethodBase)
{
if (TargetMethodReturnsVoid(input) || HttpContext.Current == null)
{
return getNext()(input, getNext);
}
var inputs = new object[input.Inputs.Count];
for (int i = 0; i < inputs.Length; ++i)
{
inputs[i] = input.Inputs[i];
}
string cacheKey = keyGenerator.CreateCacheKey(input.MethodBase, inputs);
object cachedResult = getCachedResult(cacheKey);
if (cachedResult == null)
{
var stopWatch = Stopwatch.StartNew();
var realReturn = getNext()(input, getNext);
stopWatch.Stop();
if (realReturn.Exception == null && realReturn.ReturnValue != null)
{
AddToCache(cacheKey, realReturn.ReturnValue);
}
return realReturn;
}
var cachedReturn = input.CreateMethodReturn(cachedResult, input.Arguments);
return cachedReturn;
}
}
private object getCachedResult(string cacheKey)
{
//When the method uses input that is not serializable
//we cannot create a cache key and can therefore not
//cache the data.
if (cacheKey == null)
{
return null;
}
object cachedValue = !storeOnlyForThisRequest ? HttpRuntime.Cache.Get(cacheKey) : HttpContext.Current.Items[cacheKey];
var cachedValueCast = cachedValue as CacheHandler;
if (cachedValueCast != null)
{
//This is an object that is reloaded when it is being removed.
//It is therefore wrapped in a CacheHandler-object and we must
//unwrap it before returning it.
return cachedValueCast.cachedData;
}
return cachedValue;
}
private static bool TargetMethodReturnsVoid(IMethodInvocation input)
{
var targetMethod = input.MethodBase as MethodInfo;
return targetMethod != null && targetMethod.ReturnType == typeof (void);
}
private void AddToCache(string key, object valueToCache)
{
if (key == null)
{
//When the method uses input that is not serializable
//we cannot create a cache key and can therefore not
//cache the data.
return;
}
if (!storeOnlyForThisRequest)
{
HttpRuntime.Cache.Insert(
key,
valueToCache,
null,
System.Web.Caching.Cache.NoAbsoluteExpiration,
expirationTime,
CacheItemPriority.Normal, null);
}
else
{
HttpContext.Current.Items[key] = valueToCache;
}
}
}
/// <summary>
/// This interface describes classes that can be used to generate cache key strings
/// for the <see cref="CacheHandler"/>.
/// </summary>
public interface ICacheKeyGenerator
{
/// <summary>
/// Creates a cache key for the given method and set of input arguments.
/// </summary>
/// <param name="method">Method being called.</param>
/// <param name="inputs">Input arguments.</param>
/// <returns>A (hopefully) unique string to be used as a cache key.</returns>
string CreateCacheKey(MethodBase method, object[] inputs);
}
/// <summary>
/// The default <see cref="ICacheKeyGenerator"/> used by the <see cref="CacheHandler"/>.
/// </summary>
public class DefaultCacheKeyGenerator : ICacheKeyGenerator
{
private readonly LosFormatter serializer = new LosFormatter(false, "");
#region ICacheKeyGenerator Members
/// <summary>
/// Create a cache key for the given method and set of input arguments.
/// </summary>
/// <param name="method">Method being called.</param>
/// <param name="inputs">Input arguments.</param>
/// <returns>A (hopefully) unique string to be used as a cache key.</returns>
public string CreateCacheKey(MethodBase method, params object[] inputs)
{
try
{
var sb = new StringBuilder();
if (method.DeclaringType != null)
{
sb.Append(method.DeclaringType.FullName);
}
sb.Append(':');
sb.Append(method.Name);
TextWriter writer = new StringWriter(sb);
if (inputs != null)
{
foreach (var input in inputs)
{
sb.Append(':');
if (input != null)
{
//Different instances of DateTime that represent the same value
//sometimes serialize differently due to some internal variables being different.
//We therefore serialize DateTime values using Ticks instead.
var inputDateTime = input as DateTime?;
if (inputDateTime.HasValue)
{
sb.Append(inputDateTime.Value.Ticks);
}
else
{
//Serialize the input and write it to the key StringBuilder.
serializer.Serialize(writer, input);
}
}
}
}
return sb.ToString();
}
catch
{
//Something went wrong when generating the key (probably an input value was not serializable).
//Return a null key.
return null;
}
}
#endregion
}
}
Microsoft deserves most credit for this code. We've only added stuff like caching at request level instead of across requests (more useful than you might think) and fixed some bugs (e.g. equal DateTime-objects serializing to different values).
To do exactly what you are describing, i.e. writing
public class MyClass {
[Cache, timeToLive=60]
string getName(string id, string location){
return ExpensiveCall(id, location);
}
}
// ...
MyClass c = new MyClass();
string name = c.getName("id", "location");
string name_again = c.getName("id", "location");
and having only one invocation of the expensive call, without needing to wrap the class in some other code (e.g. CacheHandler<MyClass> c = new CacheHandler<MyClass>(new MyClass());), you need to look into an Aspect-Oriented Programming (AOP) framework. Those usually work by rewriting the byte code, so you need to add another step to your compilation process, but you gain a lot of power in return. There are many AOP frameworks, but PostSharp for .NET and AspectJ for Java are among the most popular. You can easily Google how to use those to add the caching aspect you want.
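For illustration, a caching aspect in PostSharp might look roughly like the sketch below. This is my sketch, not code from the answer: it assumes PostSharp's MethodInterceptionAspect and System.Runtime.Caching.MemoryCache, the attribute plumbing and serialization requirements vary between PostSharp versions, and the naive ToString-based key only works for simple argument types.
using System;
using System.Runtime.Caching;   // MemoryCache
using System.Text;
using PostSharp.Aspects;        // MethodInterceptionAspect

// Sketch of a caching aspect; treat as illustrative, not production-ready.
[Serializable]
public class CacheAttribute : MethodInterceptionAspect
{
    private readonly int _timeToLiveSeconds;

    public CacheAttribute(int timeToLiveSeconds)
    {
        _timeToLiveSeconds = timeToLiveSeconds;
    }

    public override void OnInvoke(MethodInterceptionArgs args)
    {
        // Naive key: declaring type + method name + ToString of each argument.
        var keyBuilder = new StringBuilder()
            .Append(args.Method.DeclaringType.FullName)
            .Append('.')
            .Append(args.Method.Name);
        foreach (object argument in args.Arguments)
        {
            keyBuilder.Append(':').Append(argument ?? "<null>");
        }
        string key = keyBuilder.ToString();

        object cached = MemoryCache.Default.Get(key);
        if (cached != null)
        {
            args.ReturnValue = cached;   // short-circuit with the cached value
            return;
        }

        args.Proceed();                  // run the real method
        if (args.ReturnValue != null)
        {
            MemoryCache.Default.Set(key, args.ReturnValue,
                DateTimeOffset.UtcNow.AddSeconds(_timeToLiveSeconds));
        }
    }
}
Usage would then be close to the question's pseudo-attribute, e.g. [Cache(60)] on getName, with the aspect woven in at build time.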
