I am modifying an old, large (and un-documented) program in C# that uses an API to talk on a serial bus.
Is there some way of letting OnIndication trigger SendRepeatRequest to continue?
I would like to avoid polling a flag with a wait of X ms, as the response time varies greatly and I need quick responses.
//Pseudocode
public void SendRepeatRequest(int X)
{
SendToAPI();
// Wait until API responds, usually a few ms but can take 1-2min
// loop X times
}
//this is activated by the API response
public void OnIndication()
{
// Handle request from API...
// Tell SendRepeatRequest to continue
}
Do you have any suggestions on how to do this?
Thanks!
You may want to look into the Task library (introduced in .NET 4.0 under the System.Threading.Tasks namespace). It has a variety of threading operations to do this pretty easily.
I believe the following section might help (or get you started).
public void OnIndication()
{
Task doWork = new Task(() =>
{
// Handle request
});
Action<Task> onComplete = (task) =>
{
SendRepeatRequest(X, args);
};
doWork.Start();
doWork.ContinueWith(onComplete, TaskScheduler.FromCurrentSynchronizationContext());
}
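If the goal is specifically to make SendRepeatRequest block until OnIndication fires, a signalling primitive such as SemaphoreSlim (from System.Threading, also available since .NET 4.0) may be an even simpler fit than a continuation. Below is a rough sketch that slots into your existing class; it assumes exactly one indication arrives per SendToAPI call, and the field name and two-minute timeout are only illustrative:
private readonly SemaphoreSlim _responseReceived = new SemaphoreSlim(0);

public void SendRepeatRequest(int x)
{
    for (int i = 0; i < x; i++)
    {
        SendToAPI();
        // Blocks here until OnIndication releases the semaphore; the timeout
        // covers the "can take 1-2 min" case so the loop never hangs forever.
        if (!_responseReceived.Wait(TimeSpan.FromMinutes(2)))
            throw new TimeoutException("No indication received from the API.");
    }
}

// This is activated by the API response
public void OnIndication()
{
    // Handle request from API...
    _responseReceived.Release(); // lets SendRepeatRequest continue immediately
}
If you would rather not block the calling thread at all, the same shape works with an async method that awaits _responseReceived.WaitAsync(...) instead.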
Related
I need to add a delay to some code block. I am using Task.ContinueWith to achieve that; it works as expected when tested on my machine, but the deployed code is not working.
public void DailyReminder()
{
//Do some things
System.Threading.Tasks.Task.Delay(300000).ContinueWith(t => EmailAlert()); //This should not block UI
//Do more things, this should not wait for 300000 ms to get executed
}
public void EmailAlert()
{
//Send Email
}
For example, I need the code below to produce A, B, C and then D only after the 5 second delay:
using System;
using System.Threading.Tasks;
namespace HelloWorld
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("A");
EmailAlert();
Console.WriteLine("C");
}
private static async Task EmailAlert()
{
Console.WriteLine("B");
await Task.Delay(5000);
Console.WriteLine("D");
}
}
}
You should separate your EmailAlert (which is a fire-and-forget task) from the logic that does not depend on that task.
public void DailyReminder()
{
// A discard pattern to make it explicit this is a fire-and-forget
_ = EmailAlert();
// Additional code that does not depend on the preceding task
}
public async Task EmailAlert()
{
await Task.Delay(300_000);
// await Some email sending logic
}
it works as expected when tested on my machine but the deployed code is not working.
If you're using some kind of shared or cloud hosting, it's normal to have your app shut down when it's done servicing requests for some time. This is why fire-and-forget on ASP.NET is dangerous.
Since you definitely want your email to be sent, fire-and-forget is not an appropriate solution. Instead, you want a basic distributed architecture:
Instead of sending the email from ASP.NET, serialize it to a message sent to a durable queue.
Have a separate background process read that queue and send the actual email.
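A rough sketch of that split, where every type here (IEmailQueue, IEmailSender, EmailMessage) is a hypothetical placeholder for whatever durable queue and mail client you actually choose (an Azure Storage queue, RabbitMQ, a database table, etc.):
using System.Threading;
using System.Threading.Tasks;

// Hypothetical abstractions over your chosen queue and mail client.
public class EmailMessage { public string To; public string Template; }
public interface IEmailQueue
{
    Task EnqueueAsync(EmailMessage message);
    Task<EmailMessage> DequeueAsync(CancellationToken ct);
}
public interface IEmailSender { Task SendAsync(EmailMessage message); }

// In the web app: just enqueue, so the request returns immediately
// and the message survives an app-pool recycle.
public class ReminderService
{
    private readonly IEmailQueue _queue;
    public ReminderService(IEmailQueue queue) { _queue = queue; }

    public Task DailyReminder() =>
        _queue.EnqueueAsync(new EmailMessage { To = "user@example.com", Template = "DailyReminder" });
}

// In a separate background process: read the queue and actually send the email.
public class EmailWorker
{
    public async Task RunAsync(IEmailQueue queue, IEmailSender sender, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            var message = await queue.DequeueAsync(ct);
            await sender.SendAsync(message);
        }
    }
}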
I'm not sure where in your code you are trying to create this delay, but you normally don't have control over how long a thread stays alive when your application is deployed to a server.
Normally a thread is created when a user sends a request to the application, and when it returns its response the thread will be stopped.
Also, the web server may close down the entire application when there's no traffic for a while.
The solution for these long running methods is to use Worker services.
You can read more from the documentation at https://learn.microsoft.com/en-us/dotnet/core/extensions/workers
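As a starting point, here is a minimal sketch of such a worker, assuming the .NET Generic Host and the Microsoft.Extensions.Hosting package; the five-minute delay and the EmailAlert body are placeholders for your own logic:
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class ReminderWorker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // The worker keeps running between iterations, unlike a web request thread.
            await Task.Delay(TimeSpan.FromMinutes(5), stoppingToken);
            await EmailAlert();
        }
    }

    private Task EmailAlert()
    {
        // Send email...
        return Task.CompletedTask;
    }
}
Registered with AddHostedService<ReminderWorker>(), this runs for the lifetime of the host rather than the lifetime of a request.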
You could try this:
public void DailyReminder()
{
//Do some things
System.Threading.Tasks.Task.Delay(300000).ContinueWith(t => EmailAlert()).Wait(); //This should not block UI
//Do more things, this should not wait for 300000 ms to get executed
}
public void EmailAlert()
{
//Send Email
}
I want to call a method inside another method and do not want the caller to wait for that other method to complete.
like this
public ActionResult Insert(int userId)
{
_userService.Insert(userId);
SyncUserInSomeOtherCollection(userId);
return new EmptyResult();
}
private void SyncUserInSomeOtherCollection(int userId)
{
//Do Sync work which will actually take some time
}
I want to make SyncUserInSomeOtherCollection() work in such a way that the main method returns its result without any wait.
I tried to run a task like this
Task.Run(async () => await SyncUserInSomeOtherCollection(userId)).Result;
But I am not sure if this is a good approach to follow.
Based on the code that you shared, I assume you're working in ASP.NET. I've come across the same scenario and found QueueBackgroundWorkItem to work for me; it also allows a graceful shutdown of your long-running task when the worker process is stopped or shut down.
HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
{
if (!cancellationToken.IsCancellationRequested)
{
//Do Action
}
else
{
//Handle Graceful Shutdown
}
});
There are more ways to accomplish this. I suggest you go through this excellent article by Scott Hanselman.
I am a newbie to .NET Core and asynchronous programming. I am trying to implement a console application that does the following:
The console application should work as an intermediary between two external APIs, e.g. API-1 and API-2.
It should call API-1 after every 10 milliseconds to get data.
Immediately call API-2 to submit the data that is received from API-1.
The console application needs to wait for API-1 to return data, but does not have to wait for the response from API-2.
Below is my code. It is not working as expected. At first it invokes API-1 after 10 milliseconds as expected, but after that it invokes API-1 ONLY AFTER it receives the response from API-2.
So, assuming API-2 takes 20 seconds, API-1 also only gets invoked every 20 seconds.
How do I make the API-2 call asynchronous, so that it does not have to wait for the API-2 response?
using System;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApp1
{
public class Program
{
private static Timer _timer;
private const int TIME_INTERVAL_IN_MILLISECONDS = 10; // 10 milliseconds
private const int API2_DELAY = 20000; // 20 seconds
public static void Main(string[] args)
{
Dowork().Wait();
Console.WriteLine("Press Any Key to stop");
Console.ReadKey();
Console.WriteLine("Done");
}
private static async Task Dowork()
{
var data = new SomeData();
_timer = new Timer(CallAPI1, data, TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
await Task.Yield();
}
private static async void CallAPI1(object state)
{
var data = state as SomeData;
Console.WriteLine("Calling API One to get some data.");
data.SomeStringValue = DateTime.Now.ToString();
await CallAPI2(data);
_timer.Change(TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
}
private static async Task CallAPI2(SomeData data)
{
Console.WriteLine("Calling API Two by passing some data received from API One " + data.SomeStringValue);
// the delay represent long running call to API 2
await Task.Delay(API2_DELAY);
}
}
}
POCO class
namespace ConsoleApp1
{
public class SomeData
{
public string SomeStringValue { get; set; }
}
}
Also note that API-1 and API-2 will be developed in ASP.NET Core 1.
Update1
Let me rephrase the above sentence. API-1 would be developed in .NET Core, but API-2 would be a Windows Workflow service. That means we can make multiple calls to WF; the WF will persist the requests and process them one at a time.
Update2
After going through all the answers and links provided, I am thinking of using a Windows service as the intermediary instead of a console application. Right now .NET Core does not support Windows services, but there is a NuGet package that can host .NET Core inside a Windows service, or I might use a classic Windows service on .NET 4.6.2. I guess I can do the asynchronous implementation inside a Windows service as well.
There are a lot of things that I would do differently in this situation. Rather than using a timer I would use Task.Delay; I would also most certainly wait for API2 to complete before attempting to throw more data at it. Additionally, I would ensure that my async methods return Task or Task<T>; notice that your CallAPI1 doesn't (I understand it's a timer callback, but that is another issue).
Consider the following:
async Task IntermediateAsync()
{
Console.WriteLine("Press ESC to exit...");
while (Console.ReadKey(true).Key != ConsoleKey.Escape)
{
var result = await _apiServiceOne.GetAsync();
await _apiServiceTwo.PostAsync(result);
// Wait ten milliseconds after each successful mediation phase
await Task.Delay(10);
}
}
This will act in the following manner:
Print a line instructing the user how to exit
Start loop
Get the result of API1
Pass the result to API2
Wait 10 milliseconds
Go back to step 2
Finally, this is the same suggestion regardless of whether or not you're using .NET Core. Any API interactions should follow the same guidelines.
Notes
Using fire-and-forget on the second API call is simply setting your code up for failure. Since it is an API call, there is more than likely going to be some latency in the I/O-bound operations, and one should assume that a tight 10-millisecond loop is only going to flood that endpoint's availability. Why not simply wait for it to finish; what reason could you possibly have?
Remove the await when calling API2
private static async void CallAPI1(object state)
{
var data = state as SomeData;
Console.WriteLine("Calling API One to get some data.");
data.SomeStringValue = DateTime.Now.ToString();
// Before, awaiting here caused the callback to wait for API-2:
// await CallAPI2(data);
// Now it will call and forget
CallAPI2(data);
_timer.Change(TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
}
Edit:
As David points out, there are of course many ways to solve this problem, and this is not a correct approach to solving yours.
Another way of doing things is to use Quartz.NET:
Schedule API1 as a repeating job
When API1 is done, schedule another job to run API2 as a standalone job
This way when API2 fails you can replay/repeat the job.
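If you go that route, a rough sketch with a recent Quartz.NET (3.x, where the API is async) might look like the following; Api1Job is a placeholder for your own job class and the actual API calls are left as comments:
using System;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class Api1Job : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        // Call API-1 here; when it completes, schedule a separate one-off job
        // (or post to a queue) for API-2 so a slow API-2 never delays this job.
        await Task.CompletedTask;
    }
}

public static class SchedulerSetup
{
    public static async Task StartAsync()
    {
        IScheduler scheduler = await new StdSchedulerFactory().GetScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<Api1Job>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithInterval(TimeSpan.FromMilliseconds(10)).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}
Note that Quartz keeps jobs in memory by default; pairing it with a persistent job store is what makes the replay/repeat behaviour mentioned above survive a restart.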
I have a requirement to fire off web service requests to an online api and I thought that Parallel Extensions would be a good fit for my needs.
The web service in question is designed to be called repeatedly, but has a mechanism that charges you if you go over a certain number of calls per second. I obviously want to minimize my charges, and so was wondering if anyone has seen a TaskScheduler that can cope with the following requirements:
Limit the number of tasks scheduled per timespan. I guess if the number of requests exceeded this limit then it would need to throw away the task, or possibly block (to stop a backlog of tasks)?
Detect if the same request is already in the scheduler to be executed but hasn't been yet and if so not queue the second task but return the first instead.
Do people feel that these are the sorts of responsibilities a task scheduler should be dealing with, or am I barking up the wrong tree? If you have alternatives, I am open to suggestions.
I agree with others that TPL Dataflow sounds like a good solution for this.
To limit the processing, you could create a TransformBlock that doesn't actually transform the data in any way, it just delays it if it arrived too soon after the previous data:
static IPropagatorBlock<T, T> CreateDelayBlock<T>(TimeSpan delay)
{
DateTime lastItem = DateTime.MinValue;
return new TransformBlock<T, T>(
async x =>
{
var waitTime = lastItem + delay - DateTime.UtcNow;
if (waitTime > TimeSpan.Zero)
await Task.Delay(waitTime);
lastItem = DateTime.UtcNow;
return x;
},
new ExecutionDataflowBlockOptions { BoundedCapacity = 1 });
}
Then create a method that produces the data (for example integers starting from 0):
static async Task Producer(ITargetBlock<int> target)
{
int i = 0;
while (await target.SendAsync(i))
i++;
}
It's written asynchronously, so that if the target block isn't able to process the items right now, it will wait.
Then write a consumer method:
static void Consumer(int i)
{
Console.WriteLine(i);
}
And finally, link it all together and start it up:
var delayBlock = CreateDelayBlock<int>(TimeSpan.FromMilliseconds(500));
var consumerBlock = new ActionBlock<int>(
(Action<int>)Consumer,
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = DataflowBlockOptions.Unbounded });
delayBlock.LinkTo(consumerBlock, new DataflowLinkOptions { PropagateCompletion = true });
Task.WaitAll(Producer(delayBlock), consumerBlock.Completion);
Here, delayBlock will accept at most one item every 500 ms and the Consumer() method can run multiple times in parallel. To finish processing, call delayBlock.Complete().
If you want to add some caching per your #2, you could create another TransformBlock to do the work there and link it to the other blocks; a rough sketch follows.
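For instance, a small deduplicating block could look like this (Request and its Key property are placeholders; a real implementation that hands the first result back to later callers would keep TaskCompletionSource instances in the dictionary instead of booleans):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks.Dataflow;

// Placeholder request type; in practice this is whatever you post into the pipeline.
public class Request
{
    public string Key { get; set; }
}

static IPropagatorBlock<Request, Request> CreateDedupBlock()
{
    var seen = new ConcurrentDictionary<string, bool>();
    // Pass a request through the first time its key is seen; swallow duplicates.
    return new TransformManyBlock<Request, Request>(r =>
        seen.TryAdd(r.Key, true) ? new[] { r } : Array.Empty<Request>());
}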
Honestly I would work at a higher level of abstraction and use the TPL Dataflow API for this. The only catch is you would need to write a custom block that will throttle the requests at the rate at which you need because, by default, blocks are "greedy" and will just process as fast as possible. The implementation would be something like this:
Start with a BufferBlock<T> which is the logical block that you would post to.
Link the BufferBlock<T> to a custom block which has the knowledge of requests/sec and throttling logic.
Link the custom block from step 2 to your ActionBlock<T>.
I don't have the time to write the custom block for #2 right this second, but I will check back later and try to fill in an implementation for you if you haven't already figured it out.
I haven't used RX much, but AFAICT the Observable.Window method would work fine for this.
http://msdn.microsoft.com/en-us/library/system.reactive.linq.observable.window(VS.103).aspx
It would seem to be a better fit than Throttle, which seems to throw elements away; I'm guessing that is not what you want.
If you need to throttle by time, you should check out Quartz.net. It can facilitate consistent polling. If you care about all requests, you should consider using some sort of queueing mechanism. MSMQ is probably the right solution but there are many specific implementations if you want to go bigger and use an ESB like NServiceBus or RabbitMQ.
Update:
In that case, TPL Dataflow is your preferred solution if you can leverage the CTP. A throttled BufferBlock is the solution.
This example comes from the documentation provided by Microsoft:
// Hand-off through a bounded BufferBlock<T>
private static BufferBlock<int> m_buffer = new BufferBlock<int>(
new DataflowBlockOptions { BoundedCapacity = 10 });
// Producer
private static async Task Producer()
{
while(true)
{
await m_buffer.SendAsync(Produce());
}
}
// Consumer
private static async Task Consumer()
{
while(true)
{
Process(await m_buffer.ReceiveAsync());
}
}
// Start the Producer and Consumer
private static async Task Run()
{
await Task.WhenAll(Producer(), Consumer());
}
Update:
Check out RX's Observable.Throttle.
This might be discussed elsewhere, but I can't find it. I have a .NET web service with a function that loops through a date range, runs calculations, and updates records in a database. If you give said function a long date range, it can take quite some time to complete. This being the case, I need a way to stop this function.
Is there a way of making the web service function call run in an identified thread so that I can cancel that thread if need be? Or am I overthinking or underthinking this? I am using a C# .NET web page with jQuery to perform the AJAX calls to the web service function. Any help will be greatly appreciated.
Add a Cancel() method to your web service that sets a state variable. Then simply have your long-running operation periodically check this variable and stop if it's set (with appropriate protection, of course).
You need two web service methods:
StartCalculation(parms), which spawns a long-running operation and returns an ID
CancelCalculation(ID), which cancels the calculation by terminating the long-running operation
The implementation of the 'long running operation' depends on your service host (IIS, Windows service, etc.).
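A sketch of what that could look like, assuming the long-running work is started on a Task and each running calculation's CancellationTokenSource is kept in a dictionary keyed by the returned ID (all names here are placeholders, not an existing API):
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public class CalculationService
{
    private static readonly ConcurrentDictionary<Guid, CancellationTokenSource> _running =
        new ConcurrentDictionary<Guid, CancellationTokenSource>();

    public Guid StartCalculation(DateTime from, DateTime to)
    {
        var cts = new CancellationTokenSource();
        var id = Guid.NewGuid();
        _running[id] = cts;

        Task.Factory.StartNew(() =>
        {
            foreach (var day in EachDay(from, to))
            {
                cts.Token.ThrowIfCancellationRequested(); // the periodic check described above
                // ...run the calculations and update records for 'day'...
            }
        }, cts.Token);

        return id;
    }

    public void CancelCalculation(Guid id)
    {
        CancellationTokenSource cts;
        if (_running.TryRemove(id, out cts))
            cts.Cancel();
    }

    private static IEnumerable<DateTime> EachDay(DateTime from, DateTime to)
    {
        for (var day = from.Date; day <= to.Date; day = day.AddDays(1))
            yield return day;
    }
}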
Sure, you can do that. If you're using .NET 4, you can easily cancel a task:
CancellationTokenSource cts = new CancellationTokenSource();
var processingTask = Task.Factory.StartNew(() =>
{
foreach(var item in StuffToProcess())
{
cts.Token.ThrowIfCancellationRequested();
// Do your processing in a loop
}
});
var cancelTask = Task.Factory.StartNew(() =>
{
Thread.Sleep(/* The time you want to allow before cancelling your processing */);
cts.Cancel();
});
try
{
Task.WaitAll(processingTask, cancelTask);
}
catch(AggregateException ae)
{
ae.Flatten().Handle(x =>
{
if(x is OperationCanceledException)
{
// Do Anything you need to do when the task was canceled.
return true;
}
// return false on any unhandled exceptions, true for handled ones
});
}