What are some ideas (using .NET and SQL 2005) for implementing a service that sends emails? The emails are to be data-driven. The date and time an email is to be sent is a field in a table.
I have built several high volume email notification services used to send data driven emails in the past. A few recommendations:
Look into using a high-quality email service provider that specializes in managing bounces, unsubscribes, ISP relations, blacklisting, etc. If sending email is critical to your business but is not your main business, it will be worth it. Most will have an API for sending templated messages, click tracking and open rates, and will provide triggers, etc.
Look into SQL Server Service Broker to queue the actual messages; otherwise you can consider Microsoft Message Queuing (MSMQ). There is no need to build your own queuing service. We spent too much time dealing with queuing infrastructure code when this was already a solved problem.
Develop a flexible set of events in your business tier to allow such messages to be triggered, and put them in your queue asynchronously. This will save you a lot of grief in the long run compared to polling the DB or hacking it in with database triggers. A sketch of the idea follows.
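To illustrate that last point, here is a minimal sketch of raising such an event by dropping a message on an MSMQ queue. The queue path and the EmailNotification type are just examples, not something from your system:

```csharp
using System;
using System.Messaging; // requires a reference to System.Messaging.dll

// Illustrative payload type, not from the original post.
public class EmailNotification
{
    public string To { get; set; }
    public string TemplateName { get; set; }
    public DateTime SendAtUtc { get; set; }
}

public static class NotificationPublisher
{
    // Assumed private queue; create it once with MessageQueue.Create(QueuePath).
    private const string QueuePath = @".\Private$\EmailNotifications";

    // Called from the business tier when an order ships, a reminder is due, etc.
    public static void Publish(EmailNotification notification)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            // The default XmlMessageFormatter serializes the body; Recoverable
            // persists the message to disk so it survives a service restart.
            var message = new Message(notification) { Recoverable = true };
            queue.Send(message);
        }
    }
}
```

A separate worker (Windows service, or Service Broker activation if you go that route) drains the queue and does the actual SMTP work, so the business transaction never waits on a mail server.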
You can use triggers to send emails on UPDATE/DELETE/INSERT. The triggers can be implemented in .NET; just send the mail from there using the classes in the System.Net.Mail namespace.
Here is a good article on how to implement CLR (.NET) triggers.
For a lightweight SMTP server, and to minimize delays, you can use the one recommended in Kenny's answer.
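For reference, a rough sketch of what such a CLR trigger could look like. The table, column and SMTP host names are placeholders, the assembly would need EXTERNAL_ACCESS permission for System.Net.Mail to work, and note that sending mail synchronously inside a trigger delays the INSERT (which is why the queue-based answers above avoid it):

```csharp
using System.Data.SqlClient;
using System.Net.Mail;
using Microsoft.SqlServer.Server;

public partial class Triggers
{
    // Hypothetical table and columns, purely for illustration.
    [SqlTrigger(Name = "trg_Reminders_SendMail", Target = "dbo.Reminders", Event = "FOR INSERT")]
    public static void SendMailOnInsert()
    {
        using (var conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            // The INSERTED pseudo-table is visible through the context connection inside a CLR trigger.
            var cmd = new SqlCommand("SELECT Recipient, Subject, Body FROM INSERTED", conn);
            var smtp = new SmtpClient("smtp.example.com"); // placeholder SMTP host
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    smtp.Send(new MailMessage("noreply@example.com",
                        reader.GetString(0), reader.GetString(1), reader.GetString(2)));
                }
            }
        }
    }
}
```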
Thanks everyone for the feedback. For simplicity's sake I've started out with a SP that looks up the reminders to be sent and uses sp_send_dbmail (SQL Database Mail) to send the emails. This runs on a job every minute. I update the record to indicate the reminder was sent with the MailItemId sent back from sp_send_dbmail. The volume of reminders expected is worst case in the 10^2 range per day.
I'd love to hear feedback about any shortcomings people think this solution may have.
By the way, I can't believe Vista doesn't come with a local SMTP server! Luckily Google is more generous, I used Gmail's server for testing.
Usually, I just spin up a process such as http://caspian.dotconf.net/menu/Software/SendEmail/
I was going to suggest SQL Server Notification Services, which will handle the job nicely. But I see that's been dropped from SQL Server 2008, so you probably don't want to go there.
Data Driven SSRS Subscriptions? Just a thought.
I'm just starting to work on a particular piece of development.
We have a .NET WCF application with a MySql/EF DAL/ORM that is called by a threaded job scheduler, which pulls data from one client, stores it in our DB and passes the latest data to another client, and vice versa.
So, to think about it as messages:
ClientB sends an order to ClientA through our system which transforms the order into a readable format for ClientA.
ClientA can then send messages to ClientB through our system to say things like "your order is shipped" or "your order is late".
I need to take these messages and relay them on to ClientB, but I want it to be transactional and for us to have full control over failed messages, etc.
My current thoughts are, for simplicity's sake, to have an OrderMessages table in our DB which receives messages with a state of "Ready", which can then be processed by a factory and forwarded to the relevant client using configuration stored against the clients.
Sorry for this being all over the place, but hopefully I've explained what I'm trying to do :/
Neil
Your proposed architecture is a classic queue table pattern. Remus Rusanu is the canonical resource for building such a thing with SQL Server. The ideas apply to other databases as well.
There is nothing wrong with this architecture. Note that in case of an error when messaging a client, you cannot know whether the message was received or not. There is no 100% solution for this problem; it is the Two Generals Problem.
If you make the clients pull directly from the database you can avoid this problem. Clients can use their own transactions in this case.
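To make the queue table pattern concrete, here is a minimal consumer sketch in the style Remus Rusanu describes. The OrderMessages schema and column names are assumptions based on the question; the UPDLOCK/READPAST hints let several workers dequeue concurrently without blocking each other, and a rollback simply leaves the row in the 'Ready' state:

```csharp
using System.Data.SqlClient;

public class OrderMessagePump
{
    private readonly string _connectionString;
    public OrderMessagePump(string connectionString) { _connectionString = connectionString; }

    // Dequeues one 'Ready' message; the row only moves to 'Sent' if forwarding succeeds.
    public bool ProcessNext()
    {
        using (var conn = new SqlConnection(_connectionString))
        {
            conn.Open();
            using (SqlTransaction tx = conn.BeginTransaction())
            {
                var dequeue = new SqlCommand(@"
                    SELECT TOP (1) Id, ClientId, Payload
                    FROM dbo.OrderMessages WITH (UPDLOCK, READPAST, ROWLOCK)
                    WHERE State = 'Ready'
                    ORDER BY Id", conn, tx);

                int id; int clientId; string payload;
                using (SqlDataReader r = dequeue.ExecuteReader())
                {
                    if (!r.Read()) return false;           // nothing to do
                    id = r.GetInt32(0);
                    clientId = r.GetInt32(1);
                    payload = r.GetString(2);
                }

                ForwardToClient(clientId, payload);        // per-client delivery logic

                var done = new SqlCommand(
                    "UPDATE dbo.OrderMessages SET State = 'Sent' WHERE Id = @id", conn, tx);
                done.Parameters.AddWithValue("@id", id);
                done.ExecuteNonQuery();

                tx.Commit();                               // any failure before this leaves State = 'Ready'
                return true;
            }
        }
    }

    private void ForwardToClient(int clientId, string payload)
    {
        // Placeholder: look up the client's configuration and relay the message.
    }
}
```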
Consider leveraging a message platform for publishers and subscribers.
Specifically, consider using a hub and spoke pattern.
Also, BizTalk specializes in workflows across distributed systems.
Also consider the effort involved:
Transactions (short and long)
Error handling
Expected message formats
Orchestrations
We have a number of different old school client-server C# WinForm client-side apps that are essentially front-ends for the database. Then there is a C# server-side windows service that waits on the client apps to submit orders and then it processes them.
The way the server-side service finds out whether there is work to do is that it polls the database. Over the years the logic of polling for waiting orders has gotten a lot more complicated due to the myriad of business rules. Because of this, the polling stored proc itself uses quite a bit of SQL Server resources even if there is nothing to do. Add to this the requirement that orders be processed the moment they are submitted and you've got yourself a performance problem, as the database is being polled constantly.
The setup actually works fine right now, but the load is about to go through the roof and it is obvious that it won't hold up.
What are some effective ways to communicate between a bunch of different client-side apps and a server-side windows service, that will be more future-proof than the current method?
The database server is SQL Server 2005. I can probably get the powers that be to pony up for the latest SQL Server if it really comes to that, but I'd rather not fight that battle.
There are numerous ways you can notify the clients.
You can use a ready-made solution like NServiceBus, to publish information from the server to the clients or other servers. NServiceBus uses MSMQ to publish one message to multiple subscribers in a very easy and durable way.
You can use MSMQ or another queuing product to publish messages from the server that will be delivered to the clients.
You can host a WCF service on the Windows service and connect to it from each client using a Duplex channel. Each time there is a change the service will notify the appropriate clients or even all of them. This is more complex to code but also much more flexible. You could probably send enough information back to the clients that they wouldn't need to poll the database at all.
You can have the service broadcast a UDP packet to all clients to notify them there are changes they need to pull. You can probably add enough information in the packet to allow the clients to decide whether they need to pull data from the server or not. This is very lightweight for the server and the network, but it assumes that all clients are on the same LAN.
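A rough sketch of that UDP idea (the port number and payload format are arbitrary): the service broadcasts a tiny "something changed" datagram and each client decides whether to go back to the database.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;

public static class ChangeBroadcaster
{
    private const int Port = 17500; // arbitrary port, must match what the clients listen on

    // Server side: tell everyone on the LAN which order changed.
    public static void NotifyOrderChanged(int orderId)
    {
        byte[] payload = Encoding.UTF8.GetBytes("ORDER_CHANGED:" + orderId);
        using (var udp = new UdpClient())
        {
            udp.EnableBroadcast = true;
            udp.Send(payload, payload.Length, new IPEndPoint(IPAddress.Broadcast, Port));
        }
    }
}
```

Each client runs a UdpClient bound to the same port, reads the datagram, and only then decides whether to query the server for details.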
Perhaps you can leverage SqlDependency to receive notifications only when the data actually changes.
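A minimal SqlDependency sketch, in case it helps. The connection string and query are placeholders; the query has to follow the query-notification rules (two-part table names, no SELECT *), and Service Broker must be enabled on the database:

```csharp
using System.Data.SqlClient;

public class OrderWatcher
{
    private readonly string _connectionString;
    public OrderWatcher(string connectionString) { _connectionString = connectionString; }

    public void Start()
    {
        SqlDependency.Start(_connectionString);   // once per AppDomain
        Subscribe();
    }

    private void Subscribe()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "SELECT OrderId, Status FROM dbo.Orders WHERE Status = 'Waiting'", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += OnOrdersChanged;
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { /* process currently waiting orders */ }
            }
        }
    }

    private void OnOrdersChanged(object sender, SqlNotificationEventArgs e)
    {
        // Notifications are one-shot: re-subscribe, then process whatever changed.
        Subscribe();
    }
}
```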
You can use any messaging middleware like MSMQ, JMS or TIBCO to communicate between your client and the service.
By far the easiest, and most likely the cheapest, answer is to simply buy a bigger server.
Barring that, you are in for a development effort that has a high probability of early failure. By failure I don't mean that you end up scrapping whatever it is you end up building. Rather, I mean you launch the changes and orders will be screwed up while you are debugging your myriad of business rules.
Quite frankly, I wouldn't consider approaching a communications change under this kind of pressure, presuming your statement about load going "through the roof" in the near term is accurate.
If your risk exposure is such that it has to be 100% functional on day one with no hiccups (which is normal when you are expecting a large increase in orders), then just upsize the DB server. Heck, I wouldn't even install the latest SQL Server on it. Instead, just buy a larger machine, install the exact same OS and DB server (and patch levels) and move your database.
Then look at your architecture to determine what needs to go away and what can be salvaged.
If everybody connects to SQL Server, then there is also the option of Service Broker. Unlike the other messaging/queueing solutions recommended so far, it is entirely contained in your database (no separate product to deploy, administer and configure), it offers a single story for your backup/recovery and high-availability needs (no separate backup for the message store, no separate DR/HA; whatever your DB solution is, it is also your messaging solution) and it offers a uniform programming API (SQL).
Even when everything is within one single SQL Server instance (i.e. there is no need to communicate over the network between multiple SQL Server instances), Service Broker still has an ace that no one can match: activation. With activation you completely eliminate the need to poll, because the system itself will launch your processing code (will 'activate' it) when there are events to process. The processing code can be internal (a T-SQL procedure or a SQLCLR .NET procedure) or external (see the external activator).
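Even without activation, a blocking RECEIVE already removes the client-side polling loop. A rough sketch of draining a Service Broker queue from .NET; the queue name and timeout are illustrative, and the message types, contract and conversations are assumed to be set up already:

```csharp
using System;
using System.Data.SqlClient;

public static class BrokerReceiver
{
    // Blocks server-side for up to 60 seconds waiting for a message; no tight polling loop.
    public static string ReceiveNext(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(@"
            WAITFOR (
                RECEIVE TOP (1) CAST(message_body AS NVARCHAR(MAX)) AS body
                FROM dbo.OrderEventQueue      -- hypothetical queue name
            ), TIMEOUT 60000;", conn))
        {
            cmd.CommandTimeout = 120;         // must exceed the WAITFOR timeout
            conn.Open();
            object body = cmd.ExecuteScalar();
            return (body == null || body == DBNull.Value) ? null : (string)body;
        }
    }
}
```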
I am using SmtpClient and MailMessage to send emails from ASP.Net code.
I want to be able to delay the delivery of emails based on business logic.
I will be able to generate a DateTime object about when the email should actually be delivered.
I am aware that I can create a Windows Service to schedule email delivery using Sql Server database as an email storage. But I do not want to deploy an extra service on the hosting server.
I am also aware that I can create a SQL Server job to achieve this, but I would rather not, unless this cannot be achieved using pure ASP.NET.
A point to note is that my code checks whether MSMQ is enabled on the server. If it is, the MessageQueue class is used for email delivery. So any pointers on achieving this using queues will be helpful.
By the way, the solution should work in both cases (i.e. with SmtpClient as well as MessageQueue).
I have already reviewed this and this question and their accepted answers. I do not want to use 3rd-party products unless this cannot be achieved using pure ASP.NET.
P.S. The environment is .NET Framework 3.5, IIS 7 and SQL Server 2008.
Some information about keeping a timer running in ASP.NET, including a great comment about an always-on scenario:
Start timer on web application start
You can use the following to keep ASP.NET from shutting down:
Can you prevent your ASP.NET application from shutting down?
You can simply use the timer to check a database and send all pending mail.
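A minimal sketch of that idea wired up in Global.asax. The PendingEmails table, its columns and the connection string name are assumptions, and a worker-process recycle can still delay delivery until the next request wakes the application:

```csharp
using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    private static Timer _mailTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Check for due mail once a minute.
        _mailTimer = new Timer(SendPendingMail, null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
    }

    private static void SendPendingMail(object state)
    {
        string cs = ConfigurationManager.ConnectionStrings["Default"].ConnectionString;
        using (var conn = new SqlConnection(cs))
        {
            conn.Open();
            var cmd = new SqlCommand(@"
                SELECT Id, Recipient, Subject, Body
                FROM dbo.PendingEmails
                WHERE SendAtUtc <= GETUTCDATE() AND Sent = 0", conn);

            var smtp = new SmtpClient();   // host/port come from web.config <system.net/mailSettings>
            using (SqlDataReader r = cmd.ExecuteReader())
            {
                while (r.Read())
                {
                    smtp.Send(new MailMessage("noreply@example.com",
                        r.GetString(1), r.GetString(2), r.GetString(3)));
                    // Marking the row as sent is omitted for brevity; do it in a second command.
                }
            }
        }
    }
}
```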
After some research and clarification from the client, it was decided that I should create a long-running background thread which will handle sending the emails at pre-defined times, and not bother about whether the application pool has been shut down due to inactivity.
I have an existing asp.net c# application for which I'd like to implement a feature that allows users to post content via email. A user would send an email to a designated address and the system would parse the email and create database entries using the email subject, body and any attached images. My proposed approach is to create a windows service that polls a POP3/IMAP-enabled email provider to retrieve incoming emails. The service would then parse the emails using an existing library I found here http://www.lesnikowski.com/mail/. The user would be matched according to the email address in the from field to the asp.net membership, and then new records would be inserted from the contents of the email for that user. Initially the windows service would run on a separate EC2 instance that I'll set up for this purpose, since the current host does not permit root access. But eventually I'll probably migrate the entire site to EC2.
Before I dive in I wanted to get some feedback from you all on my overall approach and architecture. More specifically:
Is what I described above the approach you would take?
Would you recommend implementing a web service to manage the interactions between the windows service and the database of the asp.net site? Or would you recommend hitting the database directly?
If I program the windows service to ping the email provider every 30 seconds, will that be a problem?
Do you foresee any security issues with this approach I've outlined?
What about issues with reliability (needs to be a 24x7 service)?
Additional Background --- the asp.net website is an inventory system where each entry has a name, description and optional images. From the email the subject will become the name, the body will become the description and the images are the images. If you're familiar with the Posterous blogging platform you'll have an excellent reference point for what I am trying to accomplish.
Is what I described above the approach you would take?
It would be better if you could set up an Exchange server or something similar where you get notifications about new emails, so you don't have to ping every 30 seconds, but I have never done it this way and cannot tell you if this is even possible.
The approach itself sounds plausible, because sending emails is really easy and everybody knows how to do that.
Would you recommend implementing a web service to manage the interactions between the windows service and the database of the asp.net site? Or would you recommend hitting the database directly?
I would recommend an extra abstraction layer, because it is not much effort and improves the design. This decreases performance (shouldn't be that much), so it depends on your requirements.
If I program the windows service to ping the email provider every 30 seconds, will that be a problem?
Depends on your email provider. Normally, and if they allow it: no. You should definitely ask them first.
If it's your own: you're good to go.
There can be problems, however, if you're doing this inside a thread and you're accessing the IMAP mailbox multiple times at the same time. You should try to avoid that.
Do you foresee any security issues with this approach I've outlined?
Yes. You can easily forge the "from" field of an email you send. There can be issues then, if the email address is known. You should definitely add some kind of extra security, like sending the mail to <SaltedHashThatIsDifferentForEachUser>@example.com. (Facebook does this too, for example.)
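A sketch of generating such a per-user posting address; the secret key and domain are placeholders. The token is an HMAC of the user id, so it can be verified on arrival without storing anything extra:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class PostingAddress
{
    private static readonly byte[] Secret = Encoding.UTF8.GetBytes("server-side-secret"); // placeholder key

    // Produces something like "post+<16 hex chars>@example.com" for a given user.
    public static string ForUser(int userId)
    {
        using (var hmac = new HMACSHA256(Secret))
        {
            byte[] hash = hmac.ComputeHash(BitConverter.GetBytes(userId));
            string token = BitConverter.ToString(hash, 0, 8).Replace("-", "").ToLowerInvariant();
            return string.Format("post+{0}@example.com", token);
        }
    }

    // On the receiving side: recompute the address for the claimed user and compare.
    public static bool IsValid(int userId, string toAddress)
    {
        return string.Equals(ForUser(userId), toAddress, StringComparison.OrdinalIgnoreCase);
    }
}
```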
What about issues with reliability (needs to be a 24x7 service)?
I see more problems with the reliability of your email provider than with your service, because as long as the emails are saved, you can still parse them later.
You should investigate the maximum mailbox size of your IMAP account to avoid rejected mails (e.g. delete messages once you've successfully parsed them).
Would you recommend implementing a web service to manage the interactions between the windows service and the database of the asp.net site? Or would you recommend hitting the database directly?
There is no need to have a web service, it will just add complexity as well as introduce another attack target on your web server. Having your windows service hit your database directly will be simpler and more secure.
If I program the windows service to ping the email provider every 30 seconds, will that be a problem?
It should not be a problem. Email providers offer POP3 and IMAP precisely so that external services (Outlook, Thunderbird, iPhone) can use them, so they expect to be pinged constantly.
Do you foresee any security issues with this approach I've outlined?
As Simon stated, emails can be easily forged, providing a security vulnerability. This link discusses a hacking incident on Posterous and the trade-off between ease of use and security. As a CISSP, I tend to lean toward security, especially when the vulnerability is very easy to exploit.
The unique, "secret" email address is a better solution in terms of security. However, it takes a lot away from your goal of simplifying the update process. It also makes your solution more complex and costly, since you will need to be able to support (and programmatically create) a unique address for every user.
What about issues with reliability (needs to be a 24x7 service)?
Most mainstream email providers have outstanding availability. In regards to the availability of this solution (setting aside pre-existing factors such as your current hardware and hosting facility), you would want to ensure the windows service is well written and includes some fault tolerance. For example, the services I have written in the past handle a few select errors caused by external dependencies (the database or email being unavailable) so that the service does not crash but simply waits until the dependency is back online. This provides better availability, since the service is ready to go when the dependency is OK again, without someone having to manually restart the windows service.
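A rough sketch of that kind of tolerant polling loop (the interval and logging are placeholders):

```csharp
using System;
using System.Threading;

public class MailPollerLoop
{
    private volatile bool _stopping;

    public void Run()
    {
        while (!_stopping)
        {
            try
            {
                CheckMailboxAndImport();   // hits POP3/IMAP and the database
            }
            catch (Exception ex)
            {
                // A dependency is down: log and keep the service alive; it will retry next pass.
                Console.Error.WriteLine("Poll failed, will retry: " + ex.Message);
            }
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }

    public void Stop() { _stopping = true; }

    private void CheckMailboxAndImport()
    {
        // Placeholder for the actual mailbox fetch and database insert.
    }
}
```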
Is what I described above the approach you would take?
Due to the security vulnerability exposed by relying on the sender of the email for authentication and authorization, I would not take this approach. If the main goal was to simplify and streamline the addition of new items from mobile platforms, I would probably create a "mobile friendly" web page to accomplish this.
I just returned from a web design conference in Seattle and it was heavily focused on non-PC platforms. After listening to their very innovative ideas and best practices for designing for the mobile industry, I can see a web app being a great solution for achieving this goal.
I'm writing a simple accounting program that consists of several C# WinForms clients and a Java server app that reads/writes data in a database. One of the requirements is that all C# clients should receive updates from the server. For example, if user A creates a new invoice from his C# client, other users should see the new invoice from their clients.
My experience is mainly in web development and I don't know the best way to fulfill this requirement with C# clients and a Java servlet server.
My initial thought is to run ActiveMQ with Glassfish and use the messaging pub/sub method so that updates can be pushed to the C# clients. I will create different topics like newInvoice, cancelInvoice, etc. in order to differentiate the message types. Each message will simply contain the object encoded in JSON.
But it seems to me that this involves quite a lot of work. Given that my user base is very small (just 3 or 4 concurrent users), it seems to me that there should be some simpler solutions. (I'm not familiar with socket programming :))
I know this is a client-server programming 101 question, but it would be great if an experienced programmer could point me to some simple solutions.
The simplest approach here is often to simply use a poll - i.e. have the clients query for data every (your time interval). That avoids a whole family of issues (firewalls, security, line-of-sight, resolution, client-tracking, etc).
With WCF, you can have callbacks on duplex channels (allowing the server to actively send a message to clients), but this is more complex. I value simplicity, so I usually just poll.
Tricks that help here are designing the system to have an inbuilt mechanism for querying "changes since x" - for example, an audit table, perhaps fed by database triggers. The exact details vary per project, of course.
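For example, a rough sketch of that "changes since x" poll driven by a rowversion column (the table and column names are assumptions): each client remembers the highest version it has seen and asks only for newer rows.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class InvoicePoller
{
    private readonly string _connectionString;
    private byte[] _lastVersion = new byte[8];   // last rowversion this client has seen

    public InvoicePoller(string connectionString) { _connectionString = connectionString; }

    // Call this from a UI timer every few seconds.
    public List<int> FetchChangedInvoiceIds()
    {
        var changed = new List<int>();
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(@"
            SELECT InvoiceId, RowVersion
            FROM dbo.Invoices
            WHERE RowVersion > @last
            ORDER BY RowVersion", conn))
        {
            cmd.Parameters.Add("@last", SqlDbType.Timestamp).Value = _lastVersion;
            conn.Open();
            using (SqlDataReader r = cmd.ExecuteReader())
            {
                while (r.Read())
                {
                    changed.Add(r.GetInt32(0));
                    _lastVersion = (byte[])r.GetValue(1);   // rows are ordered, so this ends as the max
                }
            }
        }
        return changed;
    }
}
```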
Another option that you might want to look at is ADO.NET Sync Services; this does much of what you ask for, for keeping a local copy of the database up to date with the server - but has a few complexities of its own. This is available (IIRC) in the "Local Database Cache" VS template.
Rather than pushing information from the server to 1:N clients, would it not be easier to have the clients poll the server for updates every so often? Or, when a client launches and creates a connection to the server, the server could dynamically generate a new message queue for that client connection, which the client could then poll for updates.
There are several push technologies available to you, like ActiveMQ (as you mentioned), or XMPP. But if you only have 3 or 4 clients to concern yourself with, polling would be the simplest solution. It doesn't scale well, but that isn't really a concern in your case, unless your server is an 8086 or something 8-)
You may want to take a look at StreamHub Push Server - its a popular Comet server written in Java that has a .NET Client SDK for receiving updates from the server in C#. It also has a Java Client SDK and the usual Ajax/Comet web browser support giving you more flexibility in the future to push data to web, Java and C# clients.