I'd like to know the best way to create a background task on the server to send e-mails.
The idea is that when a person bids on an item, a mail is automatically sent to the person responsible, who then sends it on to the corresponding person. But how can I do this without affecting the website's functionality or making it slow?
I've read some things about async tasks, but I'm not sure if this is the solution to my problem.
You can create asynchronous background threads: look at the usage of
the .NET Framework Task class if you are using .NET 4.x; in prior versions
you have to look at Thread or ThreadStart.
But be careful with accessing data: to prevent the main thread and your email
thread from getting into trouble, you also need to look at locking resources
with the "lock" statement.
This is a good fit if you need to send many emails in one go and it should be done
asynchronously, which means "the user should not have to wait for it".
On the web this is also the best way to do such things, in a thread, as you
could get a request timeout if it takes too long.
But of course, at the end of the thread you should somehow create a report
and send it to the executing user, so that he knows the mailing has finished
or whether any errors occurred.
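For example, a minimal sketch of the Task approach might look like this (SendEmail, SendReport, recipients and currentUser are hypothetical placeholders for your own code):

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Sketch only: background mailing with a lock around shared state and a report at the end.
var errors = new List<string>();
var errorLock = new object();

Task.Run(() =>
{
    foreach (var recipient in recipients)
    {
        try
        {
            SendEmail(recipient);            // your actual mail-sending logic
        }
        catch (Exception ex)
        {
            lock (errorLock)                 // guard the shared list against concurrent access
            {
                errors.Add(recipient + ": " + ex.Message);
            }
        }
    }

    SendReport(currentUser, errors);         // tell the executing user how the mailing went
});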
We solved this in our company by creating a web service which is responsible
for shipping emails to the SMTP service and logging them, including the content and
status of each email sent.
Our applications build up the emails in the format defined by our web service
and are responsible for the status report to the end user.
Of course, by doing this you will still need a layer which builds up
the email you want to send and then forwards it to the mail service, and this
may still need to be done asynchronously, so only the relaying to SMTP itself would
be separated from your application like this.
But if you have an application which needs to do mass mailing or something like that,
and you want to separate it from your "normal" tool (e.g. to separate processes
and process load), then simply create a separate service which "knows" the domain of your main application.
By doing this, you only have to trigger the mailing service by sending the relevant information from the main application to it.
But again, you may still need to build a background thread which collects and sends the
information required to that service.
You could create this service in many ways: using WCF, for example, as a background Windows service with message queuing, or as an HTTP-based service with a REST API, etc.
You did not give that much information about what you need to do, but maybe this will
point you in the right direction.
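As a rough illustration, triggering a hypothetical HTTP-based mail service from the main application could be as simple as posting the email data to it (the endpoint, payload shape and serializer choice here are all made up):

using System.Net;
using System.Web.Script.Serialization;   // JavaScriptSerializer; any JSON serializer will do

// Sketch only: "mailservice.local" and the payload are placeholders for your own service.
var payload = new JavaScriptSerializer().Serialize(new
{
    To = "bidder@example.com",
    Subject = "You have been outbid",
    Body = "..."
});

using (var client = new WebClient())
{
    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    client.UploadString("http://mailservice.local/api/mail", payload);
}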
Related
I have a website where I need to take a bit of data from the user, make an ajax call to a .net webservice, and then the webservice does some work for about 5-10 minutes.
I naturally don't want the user to have to sit there that whole time, so I have made it an asynchronous ajax call to the webservice, and after the call has been sent, I redirect the user to a "you are done!" page.
What I want to happen is for the webservice to keep running to finish, and not abort, after it receives the information from the user.
From my testing, this is more or less what happens, but now I'm finding that this might be limited by time, i.e. if the webservice runs past a certain amount of time, it will abort if the user isn't still connected.
I might be off here in this assessment, but this is what I THINK is going on from my testing.
So my question is whether this is indeed what happens with .NET web services. Does it get aborted after some time if the user isn't still on the other end? Is there any way to disable this abort?
Thanks in advance!
When you invoke a web service, it will always finish its work, even if the user leaves the page that invoked it.
Of course, web services have their own configuration, and one of the settings is a timeout.
If you're creating a WCF service (SOAP service), you can set it in its contract (by changing the binding properties); if you're creating a service with Web API or MVC (REST/HTTP service), then you can either add it to the config file or set it programmatically in the controller, as follows.
HttpContext.Server.ScriptTimeout = 3600; //Number of seconds
That can be one reason for the web service to interrupt its work, but it is not related to what happens on the client side.
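For the WCF case, the relevant knobs are the binding timeouts; a minimal sketch of raising them programmatically (the values are just examples) might be:

using System;
using System.ServiceModel;

// Sketch: generous timeouts on a WCF binding so a long-running call isn't cut off early.
var binding = new BasicHttpBinding
{
    SendTimeout = TimeSpan.FromMinutes(15),      // how long to wait for the operation to complete
    ReceiveTimeout = TimeSpan.FromMinutes(15)    // how long the connection may stay idle
};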
Have a nice day,
Alberto
Whilst I agree that the answer here is technically correct, I just
wanted to post a more robust alternative approach that avoids some of
the pitfalls possible with your current approach, such as:
Web Server being bounced during the long-running processing of request
Web Server App pool being recycled during processing
Web server running out of threads due to too many long-running requests and not being able to process any more requests
I would recommend you take a thoroughly asynchronous approach and use
Message Queues (MSMQ for example) with a trigger on the queue that
will execute the work.
The process would be:
Your page makes Ajax call to the Webservice
Webservice writes a message into the Queue and returns right away. The message contains details of what work needs to be carried out.
User continues on your site as usual, or goes home, etc.
A trigger on the Queue is watching for messages and when a message
arrives in the queue, it activates a process which:
Reads the message
Performs the necessary work
Updates any back-end storage, etc, with the results of the work
This is much more robust because it totally decouples the web service from any long-running work, and it means that if the user makes a request and the web server goes down a moment later (for whatever reason), the work will still be queued up when the server comes back online.
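As a rough sketch of step 2 above, writing the message with the classic System.Messaging API could be as small as this (the queue path and payload format are placeholders):

using System.Messaging;   // reference System.Messaging.dll

// The web service just drops a message on a local private queue and returns immediately.
const string queuePath = @".\private$\work-requests";

if (!MessageQueue.Exists(queuePath))
    MessageQueue.Create(queuePath);

using (var queue = new MessageQueue(queuePath))
{
    // Body + label; use whatever serialized payload describes the work to be done.
    queue.Send("parse-upload:12345", "WorkRequest");
}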
You can read more about it here (MSMQ is the MS Message Queue tech; there are many others!)
Just my 2c
I'm working on a task where I'm trying to ensure delivery of data to a database in the order in which it is written. The database will be located on another machine, and it's possible that the database machine could fail while the sending computer continues to queue up database messages.
I'm trying to use WCF for this task and from my initial reading believed that this might be possible using WCF with message queue and a ReceiveContext enabled channel. The documentation states that ReceiveContext 'enables an application to decide whether to access the message or leave it in the queue for further processing.'
The problem that I've encountered is that if I call the Abandon method, indicating that the message has not been successfully processed and should be left on the queue, WCF appears to place the message at the back of the queue instead of leaving it at the front. Since I need to write the messages to the database in the order they were originally sent, this solution will not work.
Is there any way to force WCF to 'peek' at a message before removing it, or to cause an abandoned message to be placed at the front of the queue? If not, could anyone suggest a method of accomplishing what I'm attempting to do without having to write a lot of code from scratch?
Thanks,
Al
I am working on a project in which a user will upload a file to the server that will be parsed.
I would like the user to receive a status message when the upload is completed and then for them to be able to poll the server for updates regarding the status of the parsing.
I was thinking of using an ajax file upload where, once the client receives an upload success message from the server, it begins polling every 2 seconds for the status. I do not know how to return data to the user while still having the server continue execution of the parser and being able to track the status of that execution.
What is the best way to go about continuing script execution after a view is returned from a controller?
EDIT:
I suspect I may need to spawn another process, but I have no idea how to do this
I think that in this particular case it would make sense to decouple your file processing from the web request. The ThreadPool.QueueUserWorkItem approach suggested by C.M. is one option, but you might also want to consider using a real queuing mechanism (like MSMQ or RabbitMQ) and process your uploads in a separate application. This way, your web tier is decoupled from your business processes and you can scale each piece independently if you need to.
You should take a look at SignalR (https://github.com/SignalR/SignalR); it's a library for building web apps with very easy communication between client and server.
If you want to know about how to do background processing you can use threading to spin up a thread that will run even after the webpage has been returned to the user. There are plenty of examples of this on StackOverflow and the web.
A simple way I've seen this done is using ThreadPool.QueueUserWorkItem along with a static list that keeps track of the status of the background threads.
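A minimal sketch of that pattern, assuming a hypothetical ParseFile method and using a ConcurrentDictionary in place of the static list so a polling endpoint can read job status safely:

using System;
using System.Collections.Concurrent;
using System.Threading;

public static class ParseJobs
{
    // Job id -> status text; thread-safe so the worker and the polling request can both touch it.
    public static readonly ConcurrentDictionary<Guid, string> Status =
        new ConcurrentDictionary<Guid, string>();

    public static Guid Enqueue(string filePath)
    {
        var jobId = Guid.NewGuid();
        Status[jobId] = "Queued";

        ThreadPool.QueueUserWorkItem(_ =>
        {
            Status[jobId] = "Parsing";
            ParseFile(filePath);          // your long-running parser goes here
            Status[jobId] = "Done";
        });

        return jobId;                     // hand this back to the client so it can poll for status
    }

    static void ParseFile(string path) { /* placeholder */ }
}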
I am working on an ASP.NET application and need to send mails periodically based on some event. First I thought of creating a thread in global.asax and starting it in Application_Start. But that becomes a bit of a problem when the application pool crashes or something. So I implemented a Windows service, started the thread in that, and log any errors in the Windows event log. This works fine. But I need to know whether I am implementing it correctly, or is there a better way of doing it?
I think you are moving (or have already moved) in the right direction.
We have similar architectures as well; in some cases we used MSMQ to queue outgoing notifications from the ASP.NET application, and then the Windows service, usually called a Messaging Manager, asynchronously grabs the incoming messages and sends the emails or alerts out.
This proves to be effective and robust: if anything crashes after the message has been queued, nothing will be lost, because the Windows service will always process the messages in the queue. You can have ASP.NET recycling, or the machine with the Windows service being rebooted, and nothing is ever lost. In fact, in normal production mode messages are sent out instantly; the decoupling, or loss of synchrony, is mostly hidden when everything is working smoothly and the servers are not overloaded or suffering in any way.
In a later project we are now implementing something similar using TIBCO technologies, EMS for the queues and Business Works for queue subscribers.
Using a Windows service for this kind of task is the preferred way instead of doing it in the ASP.NET application. You may also take a look at Quartz.NET, which could simplify your code for scheduling the task execution and dealing with threads. But if you don't want to write Windows services, probably the simplest would be to have a console application that does the job of sending emails, and then simply use the Windows Task Scheduler to run it at regular intervals.
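A minimal sketch of that console-app route (the Email type, GetPendingEmails/MarkAsSent and the SMTP host are placeholders for whatever storage and mail setup you actually have):

using System.Collections.Generic;
using System.Net.Mail;

// Run by the Windows Task Scheduler every few minutes; everything except the SMTP call is a stub.
class Email { public string To, Subject, Body; }

class Program
{
    static void Main()
    {
        using (var smtp = new SmtpClient("smtp.example.local"))
        {
            foreach (var email in GetPendingEmails())          // e.g. rows from a database table
            {
                smtp.Send(new MailMessage("noreply@example.com", email.To, email.Subject, email.Body));
                MarkAsSent(email);                             // so it isn't picked up again next run
            }
        }
    }

    static IEnumerable<Email> GetPendingEmails() { yield break; }   // your storage access goes here
    static void MarkAsSent(Email email) { }
}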
Another option is a message-based approach. You could have a Windows service/console application reading messages off a message queue (like MSMQ) and sending an email when a message is received. You can then have your ASP.NET application publish messages to this queue.
Minibuss is a lightweight client for MSMQ which is very easy to work with. Another option is NServiceBus.
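A sketch of the receiving side, assuming plain System.Messaging rather than one of those libraries (the queue path, string payload and SendEmail helper are all placeholders):

using System.Messaging;

// A Windows service / console app that blocks on the queue and sends an email per message.
static void ProcessQueue()
{
    using (var queue = new MessageQueue(@".\private$\outgoing-emails"))
    {
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

        while (true)
        {
            Message message = queue.Receive();     // blocks until a message arrives
            SendEmail((string)message.Body);       // your actual mail-sending logic
        }
    }
}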
I have an email queue with emails to be sent. A web service calls a SOAP web service that processes the queue one by one.
We send emails using an external vendor through their REST API. My problem is that calls to this API can take from 0.1 ms to 12 s. We send thousands of emails to customers that subscribe to our notices, and it's important that within each batch there's not too much delay between the first and the last in the queue (ideally they'd all be sent simultaneously).
I've complained to the vendor but as they suck I'm quite sure they will not do anything about this.
Can I somehow Threadify this process, instantiating simultaneous calls to the server? The server is also my web server so I can't use all the juice. How many threads is appropriate? Is this a good idea? What's the best way to generically manage these threads?
You shouldn't be creating threads within an ASP.Net application. If you have a large enough queue to warrant multithreading you should create a windows service to handle the queue.
I would queue the email in a database table and generate a separate windows service that reads from the table and spawns a thread for each email, up to some max thread limit. The database can also be used to capture throughput time.
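One way to sketch the "spawn threads up to some max limit" part is Parallel.ForEach with a capped degree of parallelism (GetQueuedEmails, SendViaVendor and the limit of 8 are assumptions):

using System.Threading.Tasks;

// Reads pending emails (e.g. from the database table) and sends them with bounded concurrency.
var pending = GetQueuedEmails();

Parallel.ForEach(
    pending,
    new ParallelOptions { MaxDegreeOfParallelism = 8 },   // cap so the box isn't saturated
    email => SendViaVendor(email));                        // the slow REST call to the vendor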
You also should find out how many simultaneous web service requests your vendor can handle. BCC yourself on the emails to find out if simultaneous submissions on your end end up as a single-threaded transmission on their end. And perhaps start shopping for an alternative to this vendor (you did say they suck).
If you want to get fancy and offload the effort from your own server, you send a batch of emails to a cloud service (Amazon Web Services, Microsoft Azure, or Google App Server) and spawn a process on the cloud to spray the emails to your vendor simultaneously.
You can also send the emails directly from the cloud, at least you can with Amazon. They provide a default limit, but then here's a link on how to remove the limit: http://aws.amazon.com/contact-us/ec2-email-limit-request/.
I have had some success with ThreadPool.QueueUserWorkItem() for an ASP.NET app. You can google for some usage examples.
There is no need to spawn threads yourself. The class generated by Visual Studio to access a web service already contains asynchronous methods. For each web service call Foo, you will see that there is a BeginFoo and an EndFoo method. The BeginFoo method will immediately return an IAsyncResult object while the web service call is done on another thread.
See this MSDN topic for more information on how to use IAsyncResult.
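For illustration, using such a generated pair might look roughly like this (EmailService, SendBatch and emails are hypothetical names standing in for your own generated proxy and data):

using System;

var client = new EmailService();            // the proxy class Visual Studio generated for you

// BeginSendBatch returns immediately; the callback runs on a thread-pool thread when done.
client.BeginSendBatch(emails, asyncResult =>
{
    var result = client.EndSendBatch(asyncResult);   // completes the call, rethrows any fault
    Console.WriteLine("Batch finished: " + result);
}, null);

// The calling thread is free to carry on right away.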