How to prevent SendGrid throttling when sending multiple emails? - c#

For a client of mine I am currently building a 'payslip' export. The export has to be emailed to around 90 of their employees. Each email contains a unique attachment with their own payslip in it.
My current solution is to use the SendGrid.Helpers.Mail.MailHelper package to create an email using the CreateSingleTemplateEmail method. I then use the AddAttachment method to include an attachment. Finally, I send the email by using the SendGridClient.SendEmailAsync method.
All of the above logic is in a foreach loop that iterates over all 90 employees. The problem, however, is that the first 15 emails are sent instantaneously, after which the API seems to get throttled. The application is built on a simple Azure Static Web App (cheaper), so there's no real room for complex solutions unless we start paying for them. Our goal is to keep this on the cheap side.
After around two minutes, the Azure Function times out and it stops sending the emails.
I've been searching the internet for possible solutions, but I haven't really found a good solution that includes attachments.
Do you guys have any suggestions for me, or is this not possible at all?
Thanks!
Thomas

What if you did not send all of the mails from within one function, but instead made use of the power of Azure Functions and used queues, basically to implement Queued Load Levelling? Since I do not know your code, I won't come up with an actual example, but basically you could do the following.
Each of the following stages is a function.
The first one is responsible only for retrieving the payslip data (from a database I guess). It iterates over all the employees and writes the data for each employee to a queue.
The second function is triggered by this very queue and generates a PDF from the payslip data for one employee at a time. This PDF, along with some metadata, is then written to a second queue (or to a blob container; you may have to experiment to see which one is most useful for you, as they both have their advantages, I guess).
Then a third function would be triggered by the second queue (or by the blob storage), generate an email for that one PDF (based on the metadata) and send it via SendGrid.
This way, even if the API throttles (which seems to be unlikely, based on the comments on the question), each executed function would still finish within a time that is not prone to a timeout. This also seems to me a more cloud-native (or serverless) approach than doing it all in one function.
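A rough sketch of what the first and third stages could look like, assuming the in-process Azure Functions model with the WebJobs queue bindings and the SendGrid C# client. The queue names, the Payslip/PayslipMessage types, the LoadPayslipsAsync helper and the template id are all made up for illustration:

```csharp
public static class PayslipFunctions
{
    // Stage 1: fan out - one queue message per employee.
    [FunctionName("EnqueuePayslips")]
    public static async Task EnqueuePayslips(
        [TimerTrigger("0 0 6 28 * *")] TimerInfo timer,        // e.g. monthly
        [Queue("payslip-data")] IAsyncCollector<Payslip> queue)
    {
        foreach (var payslip in await LoadPayslipsAsync())     // your DB query
            await queue.AddAsync(payslip);
    }

    // Stage 3: triggered once per message, so each execution sends a single
    // email and comfortably finishes inside the function timeout.
    [FunctionName("SendPayslipMail")]
    public static async Task SendPayslipMail(
        [QueueTrigger("payslip-pdf")] PayslipMessage msg)
    {
        var client = new SendGridClient(
            Environment.GetEnvironmentVariable("SENDGRID_API_KEY"));
        var mail = MailHelper.CreateSingleTemplateEmail(
            new EmailAddress("payroll@example.com"),
            new EmailAddress(msg.EmployeeEmail),
            "d-your-template-id",
            new { name = msg.EmployeeName });
        mail.AddAttachment($"{msg.EmployeeName}-payslip.pdf", msg.PdfBase64);
        await client.SendEmailAsync(mail);
    }
}
```

A side benefit of the queue trigger: if a send fails, the message goes back on the queue and is retried automatically, which the single big loop can't do.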

Related

In C#, how can I get an event from MySQL when somebody inserts, deletes or modifies a record?

I am developing a program in WPF .NET, and I need to know when somebody makes a change to any table of the database.
The idea is to receive an event from the database when it is changed. I have been reading a lot of articles but I can't find a method that solves my problem.
Kind Regards
The best solution is to use a message queue. After your app commits a change to the database, the app also publishes a message on the message queue. Other clients then just wait for notifications on that message queue.
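A minimal sketch of that ordering, assuming a MySQL client library such as MySqlConnector; `queue.Publish` stands in for whatever broker client you pick (RabbitMQ, MSMQ, Azure Service Bus, ...):

```csharp
// Publish the change notification only AFTER the transaction commits,
// so listeners never hear about data they can't read yet.
using (var conn = new MySqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        var cmd = new MySqlCommand(
            "INSERT INTO orders (customer) VALUES (@c)", conn, tx);
        cmd.Parameters.AddWithValue("@c", "alice");
        cmd.ExecuteNonQuery();
        tx.Commit();                     // the change is durable here
    }
}
queue.Publish("orders.changed");         // notify other clients afterwards
```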
There are a few other common solutions, but all of them have disadvantages.
Polling. If a client is interested in recent changes, they run a query searching for new data every N seconds.
The downside is you have to keep polling even during times when there are no changes. You might have to poll very frequently, depending on how promptly you need to notice the changes. This adds to database load just to support the polling queries.
Also it costs more if you have many clients all polling. In one system I supported, the database was struggling to process 30,000 queries per second that existed only to support the polling.
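A minimal polling sketch, assuming an auto-increment `id` column and a MySQL client library such as MySqlConnector; `HandleChange` is a placeholder for your notification logic, and the sleep interval is the trade-off knob described above:

```csharp
long lastSeenId = 0;
using var conn = new MySqlConnection(connectionString);
conn.Open();
while (true)
{
    using (var cmd = new MySqlCommand(
        "SELECT id, payload FROM events WHERE id > @last ORDER BY id", conn))
    {
        cmd.Parameters.AddWithValue("@last", lastSeenId);
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            lastSeenId = reader.GetInt64(0);       // remember the high-water mark
            HandleChange(reader.GetString(1));     // your notification logic
        }
    }
    Thread.Sleep(TimeSpan.FromSeconds(5));         // interval = how fast you notice
}
```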
Change Data Capture. Using the binary log as a de facto message queue, because it records all the changes. Use a client tool such as Debezium, or write your own binlog tail client (this is a lot of work).
The downside is the binlog records all changes, not just those you want to be notified about. You have to filter it somehow. Also you have to learn how to use Debezium or equivalent tool.
Triggers. Write a trigger on the table that invokes a UDF to post notification outside the database. This is a bad idea, because the trigger executes when your insert/update/delete executes, not when the transaction commits. Clients could be notified of changes before the changes are committed, so if they go query the database right after they get the notification, the change is not visible to them yet.
It's also a disadvantage that it requires you to install a UDF extension in MySQL Server; MySQL doesn't normally have any way of posting an external notification.
I'm not a C# developer so I can't suggest specific code. But the general methods above are similar regardless of which language the app is written in.
I don't think this is possible with MySQL; databases like MongoDB have this sort of feature.
You may like to use the method described in this answer.
Essentially, have date/time fields on rows so you can pull data since a certain date and time. Or you could use a CQRS/event-sourcing strategy and maybe use a message queue.

Best approach for c# web service with concurrent users

I'm fairly new to C# but have experience with C++.
I'm looking to write a web service that I can call from a website to pull hotel rates, similar to rate-comparison websites. My question is: what is the best method to deal with the concurrent load and allow the data requests to run in the background whilst updating the user interface?
Basically I have all the code ready to pull the XML data from the likes of Booking.com, on a single per-user basis.
But I want to change this way of working to allow me to send off requests to, say, 10 suppliers at once, and then have the web service work in the background to get the rates.
The front end will then refresh to get the latest data.
Any help or advice would be appreciated.
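Since the per-supplier code already exists, the concurrent fan-out part can be as small as a `Task.WhenAll` over `HttpClient` calls. In this sketch the URLs and the `ParseRates` method are placeholders for your existing per-supplier XML code:

```csharp
var http = new HttpClient();
var suppliers = new[]
{
    "https://supplier1.example/rates",
    "https://supplier2.example/rates",
    // ... up to 10 suppliers
};

// Start all requests at once; each task downloads and parses one feed.
var tasks = suppliers.Select(async url =>
{
    var xml = await http.GetStringAsync(url);
    return ParseRates(xml);   // your existing per-supplier XML parsing
});

// Completes when the slowest supplier has answered.
var allRates = await Task.WhenAll(tasks);
```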

Sending SMS in automated batches AND through web-interface

This is more about the design/efficiency of the application rather than the syntax - I need to create a process that sends a batch of texts that will be run on a scheduler (automated batches), but I also need to allow an admin to send a batch manually (manual batch) or individual SMS messages (triggered). My initial thought was to build a server-side console application that can be executed with parameters to handle the sending of all texts, but I'm not positive if this would be the best option. I'm a bit worried about conflicts arising with multiple instances of the console app running (which I would obviously need to code for). Any suggestions on the best way to tackle this?
The batches will process one at a time in a loop, which will post the message to the operator (Twilio) and log the message in our database as sent.
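With Twilio's C# helper library, the body of that loop could look roughly like this; the credentials, the sender number, the `batch` items and the `LogAsSent` call are placeholders:

```csharp
TwilioClient.Init(accountSid, authToken);       // your Twilio credentials

foreach (var sms in batch)
{
    var message = MessageResource.Create(
        to: new PhoneNumber(sms.To),
        from: new PhoneNumber("+15005550006"),  // your Twilio number
        body: sms.Body);

    LogAsSent(sms, message.Sid);                // mark as sent in your database
}
```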
It probably depends on your operator. This one has quite a lot of tech samples and docs.

Send out multiple emails from Webpage Asp.net

I am making a webpage with ASP.NET C#.
I want people to log on and enter quote requests. Then the quote request is emailed to all the relevant people to quote (could be 100+ people).
Obviously I cannot have the user sit and wait for the 100+ people to be emailed, as the webpage will freeze.
I have thought about implementing a backend program on the server, perhaps one that checks for a text file, and when that text file is there, searches the database for any un-emailed quotes, emails the relevant people, then marks the record as emailed.
But there must be a better way? Is there a queue system or something designed to do things like this?
You can use ThreadPool.QueueUserWorkItem in a loop to queue your email sending. See http://msdn.microsoft.com/en-us/library/system.threading.threadpool.queueuserworkitem.aspx for info.
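As a sketch, where `SendQuoteEmail` is a stand-in for your existing email code. One caveat worth knowing: thread-pool work queued this way in ASP.NET is lost if the app pool recycles before it runs.

```csharp
foreach (var recipient in recipients)
{
    var r = recipient;  // capture a copy so the closure doesn't share the loop variable
    ThreadPool.QueueUserWorkItem(_ => SendQuoteEmail(quote, r));
}
// The page can return immediately; the sends run on pool threads.
```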
I would use something like Microsoft message queue, and then have a task that sweeps the queue periodically.
I'd get the website to push to the queue, and have the queue set up as persistent, so if the box crashes or restarts, nothing is lost.
Simple. Just send it. If it freezes for more than 0.1 seconds or so, you need better hardware and configuration.
Have the SMTP class write to a folder on disc, and have the locally installed SMTP service use this as its pickup directory. Then your page is done the moment the file is written. Standard .NET classes support this setup via a simple configuration setting. Most people never bother to read the documentation, though.
No network involved, just 100 file generations or so. Or even 10 files with 10 BCC recipients each. Finished.
Generating 10 small files should be fast. If your discs overload, get a small SSD ;) They cost next to nothing.
All the rest is a lot of programming work and introduces more to watch into your system.
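The configuration setting referred to above is the mailSettings section in web.config; with it, SmtpClient drops .eml files into the pickup folder instead of opening a network connection. The path here is just an example:

```xml
<system.net>
  <mailSettings>
    <smtp deliveryMethod="SpecifiedPickupDirectory">
      <specifiedPickupDirectory
        pickupDirectoryLocation="C:\inetpub\mailroot\Pickup" />
    </smtp>
  </mailSettings>
</system.net>
```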
You could combine the two, although it wouldn't be the approach I'd use.
But using your example, you could create a static listener interface to your backend (server) application, and instead of dropping a file and checking for it periodically, you could ping your backend to start the operation.
Generating 1 email with 100 people in the "To" or "CC" fields should not cause the page to freeze up. Have you actually observed this behavior? If so, check your SMTP configuration, as something sounds amiss.
However, the solution I've seen put into good use, is to have a SQL database that holds all pending messages to be sent out, then have SQL Server run a job every ten minutes to run through that pending table and do the emailing (as opposed to emailing straight from the .NET app).

Building a scalable ASP.NET MVC Web Application

I'm currently in the process of building an ASP.NET MVC web application in c#.
I want to make sure that this application is built so that it can scale out in the future without the need for major re-factoring.
I'm quite keen on using some sort of queue to post any database writes to, and having a process which polls that queue asynchronously to perform the update. Once the data has been posted back to the database, the client then needs to be updated with the new information. The implication here is that the process of writing the data back to the database could take a short while, based on business rules executing on the server.
My question is what would be the best way to handle the update from the client\browser perspective.
I'm thinking along the lines of posting the data back to the server, adding it to the queue and immediately sending a response to the client, then polling at some frequency to get the updated data. Any best practices or patterns on this would be appreciated.
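One common shape for that idea, sketched as an ASP.NET MVC controller; the `_queue` and `_statusStore` fields and the command/DTO types are placeholders for whatever infrastructure you end up choosing:

```csharp
public class OrdersController : Controller
{
    [HttpPost]
    public ActionResult Create(OrderDto order)
    {
        var id = Guid.NewGuid();
        // Enqueue the write and return immediately with an id to poll on.
        _queue.Enqueue(new CreateOrderCommand(id, order));
        return Json(new { id, status = "Pending" });
    }

    [HttpGet]
    public ActionResult Status(Guid id)
    {
        // The background worker updates this store when the write completes;
        // the client polls this action until the status changes.
        return Json(new { id, status = _statusStore.Get(id) },
                    JsonRequestBehavior.AllowGet);
    }
}
```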
Also in terms of reading data from the database would you suggest using any particular techniques or would reading straight from db be sufficient given my scenario.
Update
Thought I'd post an update on this as it's been a while. We've actually ended up using Windows Azure but the solution is applicable to other platforms.
What we've ended up doing is using the Windows Azure Queue to post messages/commands to. This is a very quick process that returns immediately. We then have a worker role which processes these messages on another thread. This allows us to minimize any db writes/updates on the web role, in theory allowing us to scale more easily.
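For reference, the web-role side of that pattern is only a few lines with the classic Azure storage client library; the queue name and the use of JSON serialization are just examples:

```csharp
// Parse the storage connection string and get (or create) the queue.
var account = CloudStorageAccount.Parse(connectionString);
var queue = account.CreateCloudQueueClient().GetQueueReference("commands");
queue.CreateIfNotExists();

// Enqueue the command; a worker role dequeues and processes it later.
queue.AddMessage(new CloudQueueMessage(JsonConvert.SerializeObject(command)));
```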
We handle informing the user via emails or even silently depending on the type of data we are dealing with.
Not sure if this helps, but why don't you have an auto-refresh on the page every 30 seconds, for example? This is sometimes how news feeds work on sports websites, saying the page will be updated every x minutes.
<meta http-equiv="refresh" content="120;url=index.aspx">
Why not let the user manually poll the status of the request? This is how your typical e-commerce app is implemented. When you purchase something online, the order is submitted to a queue for fulfillment. After it's submitted, the user is presented with a "Thank you for your order" page and a link where they can check the status of the order. The user can visit the link anytime to check the status, no need for an auto-poll mechanism.
Is your scenario so different from this?
Sorry, in my previous answer I might have misunderstood. I was talking of a "queue" as something stored in a SQL DB, but on reading your post again it seems you may be talking about a separate message-queueing component like MSMQ or JMS?
I would never put a message queue in the front end, between a user and backend SQL DB. Queues are good for scaling across time, which is suitable between backend components, where variances in processing times are acceptable (e.g. order fulfillment)... when dealing with users, this variance is usually not acceptable.
While I don't know if I agree with the logic of why, I do know that something like jQuery is going to make your life a LOT easier. I would suggest making a RESTful web API that your client-side code consumes. For example, you want to post a new order to the system and have the client stay responsive? Make a post to www.mystore.com/order/create and have that return the URI to access the new order (i.e. order#) as a URI (www.mystore.com/order/1234). That response is then stored in the client code and a jQuery call is set up to poll for a response or stop polling on an error.
For further reading check out this Wikipedia article on the concept of REST.
Additionally you might consider the Reactive Extensions for .NET, and within that check out the RxJS sub-project, which has some pretty slick ways of handling the polling problem without making you write the polling code yourself. Fun things to play with!
Maybe you can add a "pending transactions" area to the UI. When you queue a transaction, add it to the user's "pending transactions" list.
When it completes, show that in the user's "pending transactions" list the next time they request a new page.
You can make a completed transaction stay listed until the user clicks on it, or for a predetermined length of time.
