I created a bot called picturesaver using Microsoft's Bot Framework, added a GroupMe channel, and hosted it in Azure. The bot works perfectly, saving pictures to Google Drive.
However, the bot gives an error saying "Service Error: POST to picturesaver timed out after 15s". Is it possible to extend the timeout, or even stop the bot from posting the error at all? Could this be an Azure issue, or is it a GroupMe issue?
If your bot needs longer than 15 seconds to process a message, you can process the message on another thread and acknowledge the call right away. Something like:
public async Task<HttpResponseMessage> Post([FromBody]Activity activity)
{
    if (activity.Type == ActivityTypes.Message)
    {
        if ([determine if this will take > 15s])
        {
            // process the message asynchronously
            Task.Factory.StartNew(async () => await Conversation.SendAsync(activity, () => new Dialogs.RootDialog()));
        }
        else
        {
            // process the message normally
            await Conversation.SendAsync(activity, () => new Dialogs.RootDialog());
        }
    }
    return Request.CreateResponse(HttpStatusCode.OK); // ack the call
}
This will avoid the 15 second timeout between connector and bot.
Edit: the above will not scale; it just fires work off with Task.Factory.StartNew. Please refer to https://learn.microsoft.com/en-us/azure/bot-service/bot-builder-howto-long-operations-guidance for the recommended guidance on processing long-running operations from a bot.
The Bot Connector service has a 15s timeout, so you need to make sure any async API calls complete within that timeframe, or have your bot respond with some kind of message if it is waiting for another operation to finish. Currently the 15s timeout cannot be modified.
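As a rough sketch of that second option (not from the original answer; the ConnectorClient usage assumes Bot Builder v3, and credentials are omitted for brevity), the bot can reply with an acknowledgement right away and then continue the slow work:
public async Task<HttpResponseMessage> Post([FromBody]Activity activity)
{
    if (activity.Type == ActivityTypes.Message)
    {
        // Acknowledge immediately so the connector call returns well within 15s.
        var connector = new ConnectorClient(new Uri(activity.ServiceUrl));
        var reply = activity.CreateReply("Working on it, this may take a while...");
        await connector.Conversations.ReplyToActivityAsync(reply);

        // Continue the slow work in the background (see the scaling caveat above).
        Task.Factory.StartNew(async () =>
            await Conversation.SendAsync(activity, () => new Dialogs.RootDialog()));
    }
    return Request.CreateResponse(HttpStatusCode.OK);
}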
The solution of processing the message on another thread and acknowledging the call right away works only for a bot hosted on an App Service.
For a Functions Bot, returning from the method immediately also terminates the Azure Function.
I tried it: the Azure Function stops running, and the real response never reaches the chat. So it is not a solution at all for Functions Bots.
I ended up with the following code for a Functions Bot, which resolves the problem.
Using Azure Queues
public static class Functions
{
    [FunctionName("messages")]
    [return: Queue("somequeue")]
    public static async Task<MessagePayload> Messages(
        [HttpTrigger(WebHookType = "genericJson")] HttpRequestMessage req) =>
        // return from this Azure Function immediately to avoid the timeout warning
        // message in the chat: just put the request into "somequeue".
        // We can't pass the whole request via the queue, so pass only what we need
        // for the message to be processed by Bot Framework.
        new MessagePayload
        {
            RequestUri = req.RequestUri,
            Content = await req.Content.ReadAsStringAsync(),
            AuthScheme = req.Headers.Authorization.Scheme,
            AuthParameter = req.Headers.Authorization.Parameter
        };

    // Do the actual message processing in another Azure Function, which is
    // triggered by a message enqueued in the Azure Queue "somequeue".
    [FunctionName("processTheMessage")]
    public static async Task ProcessTheMessage(
        [QueueTrigger("somequeue")] MessagePayload payload, TraceWriter logger)
    {
        // we don't want the queue to process this message 5 times if it fails,
        // so we won't throw any exceptions here at all, but handle them properly.
        try
        {
            // recreate the request
            var request = new HttpRequestMessage
            {
                Content = new StringContent(payload.Content),
                RequestUri = payload.RequestUri
            };
            request.Headers.Authorization =
                new AuthenticationHeaderValue(payload.AuthScheme, payload.AuthParameter);

            // initialize dependency injection container, services, etc.
            var initializer = new SomeInitializer(logger);
            initializer.Initialize();

            // handle the request in the usual way and reply back to the chat
            await initializer.HandleRequestAsync(request);
        }
        catch (Exception ex)
        {
            try
            {
                // TODO: handle the exception
            }
            catch (Exception anotherException)
            {
                // swallow any exceptions in the exception handler?
            }
        }
    }
}

[Serializable]
public class MessagePayload
{
    public string Content { get; set; }
    public string AuthParameter { get; set; }
    public string AuthScheme { get; set; }
    public Uri RequestUri { get; set; }
}
(Be sure to use different Azure Queues for local development with the Bot Framework Emulator and for the cloud-deployed Function App. Otherwise, messages sent to your bot by real customers may be processed locally while you are debugging on your machine.)
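One way to keep those environments apart (a sketch, not from the original answer, assuming an app setting named somequeue_name that holds a different queue name in local.settings.json and in the deployed Function App) is to use a binding expression instead of a hard-coded queue name:
// The %...% syntax resolves the queue name from app settings at runtime, so the
// local run (Bot Framework Emulator) and the deployed Function App point at
// different queues. "somequeue_name" is an assumed setting name.
[FunctionName("messages")]
[return: Queue("%somequeue_name%")]
public static async Task<MessagePayload> Messages(
    [HttpTrigger(WebHookType = "genericJson")] HttpRequestMessage req) =>
    new MessagePayload
    {
        RequestUri = req.RequestUri,
        Content = await req.Content.ReadAsStringAsync(),
        AuthScheme = req.Headers.Authorization.Scheme,
        AuthParameter = req.Headers.Authorization.Parameter
    };

// ...and use [QueueTrigger("%somequeue_name%")] on the processTheMessage function.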
Using an HTTP request
Of course, the same can be done without an Azure Queue by making a direct call to another Azure Function's public URL - https://<my-bot>.azurewebsites.net/api/processMessageHttp?code=<function-secret>. This call has to be made on another thread, without waiting for the result in the messages function.
[FunctionName("messages")]
public static async Task Run([HttpTrigger(WebHookType = "genericJson")]
HttpRequestMessage req)
{
// return from this Azure Function immediately to avoid timeout warning message
// in the chat.
using (var client = new HttpClient())
{
string secret = ConfigurationManager.AppSettings["processMessageHttp_secret"];
// change the RequestUri of the request to processMessageHttp Function's
// public URL, providing the secret code, stored in app settings
// with key 'processMessageHttp_secret'
req.RequestUri = new Uri(req.RequestUri.AbsoluteUri.Replace(
req.RequestUri.PathAndQuery, $"/api/processMessageHttp?code={secret}"));
// don't 'await' here. Simply send.
#pragma warning disable CS4014
client.SendAsync(req);
#pragma warning restore CS4014
// wait a little bit to ensure the request is sent. It will not
// send the request at all without this line, because it would
// terminate this Azure Function immediately
await Task.Delay(500);
}
}
[FunctionName("processMessageHttp")]
public static async Task ProcessMessageHttp([HttpTrigger(WebHookType = "genericJson")]
HttpRequestMessage req,
Microsoft.Extensions.Logging.ILogger log)
{
// first and foremost: initialize dependency
// injection container, logger, services, set default culture/language, etc.
var initializer = FunctionAppInitializer.Initialize(log);
// handle the request in a usual way and reply back to the chat
await initializer.HandleRequest(req);
}
Related
I have two Web APIs developed in ASP.NET Core. The idea is: WebAPI_1 sends a message to Azure Service Bus, and WebAPI_2 has to catch that moment and read the message shortly after it is sent. I understand how to catch it when I have a console app instead of WebAPI_2, but I am not sure how to subscribe WebAPI_2 to such an event from Azure Service Bus.
Below is the code I have for WebAPI_1 and the console app.
WebAPI_1 (Sender):
public class QueueService : IQueueService
{
    private readonly IConfiguration _config;

    public QueueService(IConfiguration config)
    {
        _config = config;
    }

    public async Task SendMessageAsync<T>(T serviceBusMessage, string queueName)
    {
        var queueClient = new QueueClient(_config.GetConnectionString("AzureServiceBus"), queueName);
        string messageBody = JsonSerializer.Serialize(serviceBusMessage);
        var message = new Message(Encoding.UTF8.GetBytes(messageBody));
        await queueClient.SendAsync(message);
    }
}
And this is how I send it:
await queue.SendMessageAsync(obj, "myqueue");
And this is the Console App (Receiver):
static IQueueClient queueClient;

static void Main()
{
    queueClient = new QueueClient(connectionString, queueName);
    var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
    {
        MaxConcurrentCalls = 1,
        AutoComplete = false
    };
    queueClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);
    Console.ReadKey();
}

private static async Task ProcessMessagesAsync(Message message, CancellationToken token)
{
    var jsonString = Encoding.UTF8.GetString(message.Body);
    Model obj = JsonSerializer.Deserialize<Model>(jsonString);
    Console.WriteLine($"Person Received: {obj.Field1} {obj.Field2}");
    await queueClient.CompleteAsync(message.SystemProperties.LockToken);
}
But I want WebAPI_2 to be able to receive the messages instead of the Console App.
Please advise.
Receiving messages requires a continuously running job. An ASP.NET Core controller, as you've probably found out, is not the right place: it doesn't run continuously and is only intended to respond to requests. For continuous execution, a background service or task is the right option. ASP.NET Core can run a BackgroundService, which is exactly what you need here; a minimal sketch follows the links below.
There are multiple blog posts with the details in case you want some inspiration:
Getting Started With Azure Service Bus Queues And ASP.NET Core Background Services
Using An ASP.NET Core IHostedService To Run Azure Service Bus Subscriptions and Consumers
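Here is a minimal sketch of that approach for WebAPI_2, assuming the same Microsoft.Azure.ServiceBus package and queue name ("myqueue") used by your sender; the class name and the Model type are just placeholders:
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

public class QueueListenerService : BackgroundService
{
    private readonly IConfiguration _config;
    private IQueueClient _queueClient;

    public QueueListenerService(IConfiguration config)
    {
        _config = config;
    }

    protected override Task ExecuteAsync(CancellationToken stoppingToken)
    {
        _queueClient = new QueueClient(_config.GetConnectionString("AzureServiceBus"), "myqueue");

        var options = new MessageHandlerOptions(ExceptionReceivedHandler)
        {
            MaxConcurrentCalls = 1,
            AutoComplete = false
        };

        // The pump keeps running for the lifetime of the host, just like the console app.
        _queueClient.RegisterMessageHandler(ProcessMessagesAsync, options);
        return Task.CompletedTask;
    }

    private async Task ProcessMessagesAsync(Message message, CancellationToken token)
    {
        var jsonString = Encoding.UTF8.GetString(message.Body);
        Model obj = JsonSerializer.Deserialize<Model>(jsonString);
        // ... do whatever WebAPI_2 needs to do with obj ...
        await _queueClient.CompleteAsync(message.SystemProperties.LockToken);
    }

    private Task ExceptionReceivedHandler(ExceptionReceivedEventArgs args)
    {
        // log args.Exception somewhere; don't throw from the handler
        return Task.CompletedTask;
    }

    public override async Task StopAsync(CancellationToken cancellationToken)
    {
        await _queueClient.CloseAsync();
        await base.StopAsync(cancellationToken);
    }
}
Register it in Startup.ConfigureServices with services.AddHostedService<QueueListenerService>(); and WebAPI_2 starts listening as soon as the host starts.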
I am building a Web API that will serve as a connector between a 3rd-party application and mine.
The application will run on a server, receiving POST requests from the 3rd-party application and sending POST requests of its own in response.
Before it starts sending these requests, my Web API needs to make a POST to the 3rd-party service so it can be registered and receive an authorization token, which it will use on the requests it sends back, kind of similar to an OAuth token from what I understand.
Since my code is all inside an HttpPost method, it only gets activated when it receives a call, and that part works as expected. Once the service is authenticated and receiving requests, everything is fine. The problem is that when my service or the 3rd party is restarted, the current token is invalidated or lost and a new one needs to be requested.
What I want is for the call that registers my service and receives the token to be sent automatically when the service starts.
Currently I trigger the registration with a manual call, but that means I have to be at my computer to do it, and the connection is not made until I send that request.
Here is a sample of my code:
public class Controller : ApiController
{
    static string SessionToken = "";

    [HttpPost]
    [Route("connector/webhook")]
    public async Task<HttpStatusCode> Webhook(UpdateContentRequestBody body)
    {
        var NO_ERROR = 0;
        try
        {
            if (string.IsNullOrEmpty(SessionToken))
            {
                // This registers my service.
                var registerConector = ConectorOSCCApi.RegisterConector();
                if (registerConector.ErrorCode != NO_ERROR)
                {
                    throw new Exception();
                }
                SessionToken = registerConector.SessionToken;
            }
            ConectorApi.KeepAliveRequest(SessionToken);
            RepeatKeepAlive();
            ProccessDataAndSendResponseRequest(body);
            return HttpStatusCode.OK;
        }
        catch (Exception e)
        {
            SessionToken = "";
            return HttpStatusCode.InternalServerError;
        }
    }
}
I want the method that registers the service to run without needing a call to "connector/webhook", but the rest of the processing and the response should only happen when such a call is received. How can I do that?
EDIT:
My code is inside an ASP.NET Web Application.
I am using .NET Framework 4.5 and hosting my web application on IIS.
This should do it for you :)
public class Controller : ApiController
{
    const int NO_ERROR = 0;
    static string _sessionToken = "";

    static string SessionToken
    {
        get
        {
            if (string.IsNullOrEmpty(_sessionToken))
            {
                InitToken();
            }
            return _sessionToken;
        }
        set
        {
            _sessionToken = value;
        }
    }

    static void InitToken()
    {
        if (string.IsNullOrEmpty(_sessionToken))
        {
            // This registers my service.
            var registerConector = ConectorOSCCApi.RegisterConector();
            if (registerConector.ErrorCode != NO_ERROR)
            {
                throw new Exception();
            }
            _sessionToken = registerConector.SessionToken;
        }
    }

    public Controller() : base()
    {
        InitToken();
        // anything else
    }

    [HttpPost]
    [Route("connector/webhook")]
    public async Task<HttpStatusCode> Webhook(UpdateContentRequestBody body)
    {
        try
        {
            ConectorApi.KeepAliveRequest(SessionToken);
            RepeatKeepAlive();
            ProccessDataAndSendResponseRequest(body);
            return HttpStatusCode.OK;
        }
        catch (Exception e)
        {
            SessionToken = "";
            return HttpStatusCode.InternalServerError;
        }
    }
}
You don't need to wait for a request to your service before requesting a token.
Prerequisite: make sure you know what error code the third-party API returns when your token is no longer valid.
When your API initializes, you will have a method available (Application_Start, or something else in Startup.cs, depending on version and setup). Use that method to request the token from the third-party API and cache it in the application-level cache.
An example of caching can be found here: Caching Data in Web API
When your application receives a request, grab the token from the cache and issue the call to the third-party API. If everything works, happy days. If it fails with the invalid-token error code, re-issue the token request and try again with the fresh token, then replace the cached token with the new one.
So basically, keep using a token until it fails, then automatically request a new one and update the cache. This way you don't need to be there to request the token manually.
You could wrap this token logic in a service class so you don't have much to do in the endpoints.
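A minimal sketch of such a service class, assuming System.Runtime.Caching's MemoryCache and reusing the ConectorOSCCApi call from the question (the class and cache-key names are just illustrative):
using System;
using System.Runtime.Caching;

public class TokenService
{
    const string CacheKey = "ThirdPartySessionToken";
    static readonly ObjectCache Cache = MemoryCache.Default;

    // Called from Application_Start and from the endpoints.
    public string GetToken()
    {
        var token = Cache[CacheKey] as string;
        return string.IsNullOrEmpty(token) ? RefreshToken() : token;
    }

    // Called when the third-party API reports the token is no longer valid.
    public string RefreshToken()
    {
        var registration = ConectorOSCCApi.RegisterConector();
        Cache.Set(CacheKey, registration.SessionToken,
            new CacheItemPolicy { SlidingExpiration = TimeSpan.FromHours(1) });
        return registration.SessionToken;
    }
}
In the webhook you would call GetToken(), make the third-party call, and on the invalid-token error code call RefreshToken() and retry once.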
How do I efficiently limit request timeout on the server side? I'm using Microsoft.Owin.Host.HttpListener, and there are cases when (due to a call to an external service) serving a request takes a ridiculous amount of time. That in itself is not the problem, but the web server should give up sooner than, well, never (I did some tests, and after 5 minutes I stopped it).
Is there a way to limit the time for serving a single request (similar to <httpRuntime maxRequestLength="..." /> in the IIS ecosystem)?
Sample controller code:
public async Task<HttpResponseMessage> Get()
{
    // ... calls to 3rd-party services here
    await Task.Delay(TimeSpan.FromMinutes(5));
    return new HttpResponseMessage(HttpStatusCode.OK);
}
Starting web server:
WebApp.Start(this.listeningAddress, new Action<IAppBuilder>(this.Build));
Note: I've read about limiting the HTTP listener, but that just limits incoming request properties; it doesn't cancel a request that is slow because of slow server processing:
var listener = appBuilder.Properties[typeof(OwinHttpListener).FullName] as OwinHttpListener;
var timeoutManager = listener.Listener.TimeoutManager;
timeoutManager.DrainEntityBody = TimeSpan.FromSeconds(20);
timeoutManager.EntityBody = TimeSpan.FromSeconds(20);
timeoutManager.HeaderWait = TimeSpan.FromSeconds(20);
timeoutManager.IdleConnection = TimeSpan.FromSeconds(20);
timeoutManager.RequestQueue = TimeSpan.FromSeconds(20);
Related:
https://github.com/aspnet/AspNetKatana/issues/152
Conceptually "older" web server solutions - i.e. IIS are using one-thread-per-request separation and ThreadAbortException to kill slow requests. Owin is using different philosophy - i.e. it fires new task per request and forcibly cancelling task is best avoided. There are two sides of this problem:
shus client away if it takes too long
cancel server processing if it takes too long
Both can be achieved using middleware component. There also is a cancellation token provided directly by owin infrastructure for cases when client disconnects (context.Request.CallCancelled where context is IOwinContext)
If you're interested only in cancelling server flow ASAP when it takes to long, I'd recommend something like
public class MyMiddlewareClass : OwinMiddleware
{
    // 5 secs is ok for testing, you might want to increase this
    const int WAIT_MAX_MS = 5000;

    public MyMiddlewareClass(OwinMiddleware next) : base(next)
    {
    }

    public override async Task Invoke(IOwinContext context)
    {
        using (var source = CancellationTokenSource.CreateLinkedTokenSource(
            context.Request.CallCancelled))
        {
            source.CancelAfter(WAIT_MAX_MS);
            // combined "client disconnected" and "it takes too long" token
            context.Set("RequestTerminated", source.Token);
            await Next.Invoke(context);
        }
    }
}
And then in the controller:
public async Task<string> Get()
{
    var context = this.Request.GetOwinContext();
    var token = context.Get<CancellationToken>("RequestTerminated");
    // simulate a long async call
    await Task.Delay(10000, token);
    token.ThrowIfCancellationRequested();
    return "Hello !";
}
Shooing the client away is more complex. The middleware will look like this:
public static async Task ShutDownClientWhenItTakesTooLong(IOwinContext context,
    CancellationToken timeoutToken)
{
    try
    {
        await Task.Delay(WAIT_MAX_MS, timeoutToken);
    }
    catch (TaskCanceledException)
    {
        // the main request finished first, nothing to do
        return;
    }
    context.Response.StatusCode = (int)HttpStatusCode.ServiceUnavailable;
}

public async Task ExecuteMainRequest(IOwinContext context,
    CancellationTokenSource timeoutSource, Task timeoutTask)
{
    try
    {
        await Next.Invoke(context);
    }
    finally
    {
        timeoutSource.Cancel();
        await timeoutTask;
    }
}

public override async Task Invoke(IOwinContext context)
{
    using (var source = CancellationTokenSource.CreateLinkedTokenSource(
        context.Request.CallCancelled))
    using (var timeoutSource = new CancellationTokenSource())
    {
        source.CancelAfter(WAIT_MAX_MS);
        context.Set("RequestTerminated", source.Token);
        var timeoutTask = ShutDownClientWhenItTakesTooLong(context, timeoutSource.Token);
        await Task.WhenAny(
            timeoutTask,
            ExecuteMainRequest(context, timeoutSource, timeoutTask)
        );
    }
}
So I'm using Microsoft's Bot Framework and its DirectLine API to talk to it. I do this because I need to send a notification to the bot. The class below is called by an endpoint in my backend. When I call my notify endpoint, this class is invoked and is supposed to start a conversation with the bot to trigger certain events in it. The problem is that it doesn't work as expected. When I run the code and make a request to my endpoint, it gets stuck at var conversation = await client.Conversations.StartConversationAsync();
The await keyword stops execution until the call finishes; the problem is that it never finishes. BUT I can see in the debug window that the request is sent and returns a 201 Created status code, so it should finish, but it never does. Not sure what to do here.
private static async Task StartBotConversation()
{
    string directLineSecret = "SECRET";
    string fromUser = "DirectLineSampleClientUser";
    DirectLineClient client = new DirectLineClient(directLineSecret);

    Debug.WriteLine("Before starting con");
    var conversation = await client.Conversations.StartConversationAsync();
    Debug.WriteLine("After starting con");

    Activity userMessage = new Activity
    {
        From = new ChannelAccount(fromUser),
        Text = "ERROR1337",
        Type = ActivityTypes.Trigger
    };

    Debug.WriteLine("Before posting activity");
    await client.Conversations.PostActivityAsync(conversation.ConversationId, userMessage);
    Debug.WriteLine("After posting activity");
}
Do this:
BotConversation = await Client.Conversations.StartConversationAsync().ConfigureAwait(false);
It worked for me; I hope it helps you.
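For context, this symptom usually means something further up the stack is blocking on the async call (for example with .Result or .Wait()), which deadlocks when the continuation tries to resume on the captured synchronization context; ConfigureAwait(false) sidesteps that by not resuming on the context. Here is a sketch of the question's method with the pattern applied to both awaits (better still, await StartBotConversation() all the way up instead of blocking on it):
private static async Task StartBotConversation()
{
    var client = new DirectLineClient("SECRET");

    // ConfigureAwait(false) keeps continuations off the captured synchronization
    // context, so a caller that blocks on this method can no longer deadlock it.
    var conversation = await client.Conversations.StartConversationAsync()
        .ConfigureAwait(false);

    var userMessage = new Activity
    {
        From = new ChannelAccount("DirectLineSampleClientUser"),
        Text = "ERROR1337",
        Type = ActivityTypes.Trigger
    };

    await client.Conversations.PostActivityAsync(conversation.ConversationId, userMessage)
        .ConfigureAwait(false);
}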
I am making an app for a chatroom that uses XHR polling. I am able to query the server correctly and receive/send data from it, but I have no idea why the XHR polling acts uncontrollably.
This is what I have to run the XHR polling:
private void MakeInitialConnection() // Now that we have the session and all the required pieces of data, we can connect to the chatroom
{
    this.XHRPolling.DoWork += (sender, args) => // Tell the XHR polling thread what to do
    {
        MakeConstantConnection();
    };
    this.XHRPolling.RunWorkerCompleted += (sender, args) => // Tell the thread what to do when the thread ends (i.e. the connection is cut)
    {
        this.OnDisconnect();
    };
    this.XHRPolling.RunWorkerAsync(); // Run the thread
}

private async void MakeConstantConnection()
{
    while (!this.isChatQuit) // Continuously send XHR requests until the connection is determined to have ended
    {
        await this.SendXHR();
    }
}

private async Task SendXHR()
{
    HttpClient chatXHR = new HttpClient();
    try
    {
        HttpResponseMessage response = await chatXHR.GetAsync(currentChatURL);
        response.EnsureSuccessStatusCode();
        string responseString = await response.Content.ReadAsStringAsync();
        if (responseString == "1::")
        {
            this.OnConnect();
        }
        chatXHR.Dispose();
    }
    catch (HttpRequestException e)
    {
        this.OnErrorInConnection("SendXHR()", e.Message);
        chatXHR.Dispose();
    }
}
I get the data that I am expecting, but I keep getting the same data repeatedly until the app crashes. Even when I turn off networking entirely, I still keep getting duplicated data, so I have absolutely no idea what is going on. Doesn't HttpResponseMessage response = await chatXHR.GetAsync(currentChatURL); keep the request alive until it is fulfilled? Why is it being fulfilled when there is no network connection?
I have built a desktop app using almost the same code, and that works exactly as intended. SendXHR must be done synchronously, otherwise the XHR polling wouldn't be XHR polling. Does anyone have any idea what is going on? I cannot seem to pinpoint the problem.