I have a requirement to build an SSIS package that sends HTML-formatted emails and then saves each email as a TIFF file. I have created a script task that processes the necessary records and then converts the HTML to the TIFF. I have split the process into separate packages: the email send works fine; converting the HTML to TIFF is what is causing the issue.
When running the package manually it processes all files without any issues. My current test is about 315 files; the finished process needs to handle at least 1,000, with the ability to send up to 10,000 at one time. The problem is that when I execute the package using SQL Server Agent it stops at 207 files. The package is deployed to SQL Server 2019 in the SSIS Catalog.
What I have tried so far
I started with the script in an SSIS package deployed to the server and called from a job step (this works 99.999999% of the time with all our packages), trying both the 32- and 64-bit runtime. There are never any error messages, just "Unexpected Termination" in the execution reports. When I execute the package directly from the catalog it processes all the files. The SQL Server Agent job uses a proxy, and I also created another proxy account with my admin credentials to rule out any issues with the account.
I created another package that calls the first package with an Execute Package Task: same result, 207 files. I changed the Execute Process Task to an Execute SQL Task and tried the script that is generated to manually start a package in the catalog: 207 files. I tried executing that script from the command line, both through the other SSIS package and through SQL Server Agent directly: same result, 207 files. If I run any of those methods directly, outside SQL Server Agent, the process completes with no issues.
I converted the script task to a console application, and it works, processing all the files. When the executable is called by any method from SQL Server Agent, it once again stops at 207 files.
I have consulted with the company's DBA and systems teams and they have not found anything that could be causing this error. There seems to be some type of limit that SQL Server Agent enforces no matter the method of execution. I have suggested looking at third-party applications but have been told no.
I have included the code below that I have been able to piece together. I am a SQL developer, so C# is outside my knowledge base. Is there a way to optimize the code so it only uses one thread, or performs a cleanup between each letter? There may be a need for this to create over ten thousand letters at certain times.
Update
I have replaced the code with the new, updated version. The email and image creation are both included, as this is what the final product must do. When sending the emails there is a primary and a secondary email address, and the body of the email changes depending on which address is used. In the code there is a try/catch section that sends to the primary address when indicated and, if that fails, sends to the secondary instead. I am guessing there is a much cleaner way of doing that section (a rough sketch follows), but this is my first program, as I work in SQL for everything else.
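For illustration, one way that section might be flattened (an untested sketch only; ProcessAndLog is a hypothetical helper that just wraps the existing GetDataPoints/GetEmailBody/SendEmail/Archive/LogEmail calls):

static void ProcessAndLog(string connDataExtract, string logProc, string archiveFolder,
                          int emailID, bool usePrimary, string priorError)
{
    var dp = GetDataPoints(connDataExtract, emailID, usePrimary);
    string indexfqdn = Path.Combine(archiveFolder, dp.IndexFile);
    string filefqdn = Path.Combine(archiveFolder, dp.ArchiveFileName);
    string mailBody = GetEmailBody(connDataExtract, dp.SqlProc, emailID, dp.EmailBody_id);
    SendEmail(dp.EmailFrom, dp.EmailSubject, dp.Email, mailBody);
    Archive(mailBody, filefqdn, dp.FileWidth, dp.FileHeight, dp.IndexFileInsert, indexfqdn, dp.ArchiveFile);
    // log success, recording whether the primary address was used and whether an earlier attempt failed
    LogEmail(connDataExtract, logProc, emailID, dp.EmailBody_id,
             usePrimary ? 1 : 0, 1, priorError, priorError == "" ? 0 : 1);
}

// inside the foreach loop, in place of the nested try/catch blocks:
try
{
    ProcessAndLog(connDataExtract, logProc, archiveFolder, (int)emailID, (bool)usePrimary, "");
    rowCount++;
}
catch (Exception ex)
{
    if ((bool)usePrimary)
    {
        // the primary address failed, so retry once against the secondary address
        ProcessAndLog(connDataExtract, logProc, archiveFolder, (int)emailID, false, ex.Message);
        rowCount++;
    }
    else
    {
        LogEmail(connDataExtract, logProc, (int)emailID, 0, 0, 0, ex.Message, 1);
    }
}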
Thank you for all the suggestions and help.
Updated Code
using System;
using System.Data;
using System.Windows.Forms;
using System.Data.SqlClient;
using System.Threading;
using System.Threading.Tasks;
using System.Drawing;
using System.IO;
using System.Net.Mail;
using System.Configuration;
using System.Diagnostics;
namespace DocCreator
{
class Program
{
static void Main(string[] args)
{
var connSSIS = ConfigurationManager.ConnectionStrings["ssis_ssrs"].ConnectionString;
int logid = 0;
int count = 0;
string previous = "";
try
{
var connDataExtract = ConfigurationManager.ConnectionStrings["dataExtract"].ConnectionString;
string archiveFolder = @"Folder Path";
string project = "Mobile Pay";
string logProc = "[dbo].[usp_EmailSentandError]";
int rowCount = 0;
DataTable dt = new DataTable();
using (SqlConnection connLog = new SqlConnection(connDataExtract))
{
connLog.Open();
SqlCommand command = new SqlCommand("etl.usp_GetEmailListID", connLog)
{
CommandType = CommandType.StoredProcedure
};
command.Parameters.Add("#Project", SqlDbType.NText).Value = project;
command.Parameters.Add("#NullValue", SqlDbType.Int).Value = 0;
using (SqlDataReader dr = command.ExecuteReader())
{
dt.Load(dr);
}
connLog.Close();
}
foreach (DataRow dr in dt.Rows)
//Parallel.ForEach(dt.AsEnumerable(), dr =>
{
try
{
var emailID = dr["Email_ID"];
var usePrimary = dr["UsePrimaryEmail"];
try
{
if ((bool)usePrimary)
{
try
{
var dp = GetDataPoints(connDataExtract, (int)emailID, (bool)usePrimary);
string indexfqdn = Path.Combine(archiveFolder, dp.IndexFile);
string filefqdn = Path.Combine(archiveFolder, dp.ArchiveFileName);
string mailBody = GetEmailBody(connDataExtract, dp.SqlProc, (int)emailID, dp.EmailBody_id);
SendEmail(dp.EmailFrom, dp.EmailSubject, dp.Email, mailBody);
Archive(mailBody, filefqdn, dp.FileWidth, dp.FileHeight, dp.IndexFileInsert, indexfqdn, dp.ArchiveFile);
LogEmail(connDataExtract, logProc, (int)emailID, dp.EmailBody_id, 1, 1, "", 0);
rowCount++;
}
catch (Exception e)
{
try
{
var dp = GetDataPoints(connDataExtract, (int)emailID, false);
string indexfqdn = Path.Combine(archiveFolder, dp.IndexFile);
string filefqdn = Path.Combine(archiveFolder, dp.ArchiveFileName);
string mailBody = GetEmailBody(connDataExtract, dp.SqlProc, (int)emailID, dp.EmailBody_id);
SendEmail(dp.EmailFrom, dp.EmailSubject, dp.Email, mailBody);
Archive(mailBody, filefqdn, dp.FileWidth, dp.FileHeight, dp.IndexFileInsert, indexfqdn, dp.ArchiveFile);
LogEmail(connDataExtract, logProc, (int)emailID, dp.EmailBody_id, 0, 1, e.Message.ToString(), 1);
rowCount++;
}
catch (Exception e2)
{
LogEmail(connDataExtract, logProc, (int)emailID, 0, 0, 0, e2.Message.ToString(), 1);
Console.Clear();
Console.WriteLine(e2.Message);
Console.ReadLine();
}
}
}
else
{
try
{
var dp = GetDataPoints(connDataExtract, (int)emailID, (bool)usePrimary);
string indexfqdn = Path.Combine(archiveFolder, dp.IndexFile);
string filefqdn = Path.Combine(archiveFolder, dp.ArchiveFileName);
string mailBody = GetEmailBody(connDataExtract, dp.SqlProc, (int)emailID, dp.EmailBody_id);
SendEmail(dp.EmailFrom, dp.EmailSubject, dp.Email, mailBody);
Archive(mailBody, filefqdn, dp.FileWidth, dp.FileHeight, dp.IndexFileInsert, indexfqdn, dp.ArchiveFile);
LogEmail(connDataExtract, logProc, (int)emailID, dp.EmailBody_id, 0, 1, "", 0);
rowCount++;
}
catch (Exception e)
{
LogEmail(connDataExtract, logProc, (int)emailID, 0, 0, 0, e.Message.ToString(), 1);
Console.Clear();
Console.WriteLine(e.Message);
Console.ReadLine();
}
}
}
catch (Exception e)
{
LogEmail(connDataExtract, logProc, (int)emailID, 0, 0, 0, e.Message.ToString(), 1);
Console.Clear();
Console.WriteLine(e.Message);
Console.ReadLine();
}
}
catch (Exception e2)
{
//Console.WriteLine(e2.Message);
using (SqlConnection connLog2 = new SqlConnection(connSSIS))
{
connLog2.Open();
SqlCommand command2 = new SqlCommand("dbo.usp_InsertssisScriptTaskLog", connLog2)
{
CommandType = CommandType.StoredProcedure
};
command2.Parameters.Add("#PackageLogID", SqlDbType.Int).Value = ((ulong)logid);
command2.Parameters.Add("#ErrorMessage", SqlDbType.NText).Value = e2.Message.ToString();
command2.ExecuteNonQuery();
connLog2.Close();
}
}
count++;
int directThreadsCount = Process.GetCurrentProcess().Threads.Count;
Console.Clear();
Console.WriteLine(previous + Environment.NewLine+ count + " Memory usage: " + GC.GetTotalMemory(false) + " " + GC.GetTotalMemory(true) + " Threads: " + directThreadsCount);
previous = count + " Memory usage: " + GC.GetTotalMemory(false) + " " + GC.GetTotalMemory(true) + " Threads: " + directThreadsCount;
GC.Collect(1);
}
Console.WriteLine("Files have been created");
}
catch (Exception e)
{
//Console.WriteLine(e.Message);
using (SqlConnection connLog = new SqlConnection(connSSIS))
{
connLog.Open();
SqlCommand command = new SqlCommand("dbo.usp_InsertssisScriptTaskLog", connLog)
{
CommandType = CommandType.StoredProcedure
};
command.Parameters.Add("#PackageLogID", SqlDbType.Int).Value = ((ulong)logid);
command.Parameters.Add("#ErrorMessage", SqlDbType.NText).Value = e.Message.ToString();
command.ExecuteNonQuery();
connLog.Close();
}
}
}
public static
(int EmailBody_id, bool ArchiveFile, int FileHeight, int FileWidth, string IndexFile, string IndexFileInsert, string Email, string ArchiveFileName, string EmailFrom, string EmailSubject, string SqlProc)
GetDataPoints(string connDataExtract, int Email_ID, bool UsePrimary)
{
string dataExtract = connDataExtract;
int emailID = Email_ID;
bool usePri = UsePrimary;
using (SqlConnection connLog = new SqlConnection(dataExtract))
{
connLog.Open();
SqlCommand command = new SqlCommand("etl.usp_EmailGetDataPoints", connLog)
{
CommandType = CommandType.StoredProcedure
};
command.Parameters.Add("#Email_ID", SqlDbType.Int).Value = emailID;
command.Parameters.Add("#UsePrimary", SqlDbType.Bit).Value = usePri;
SqlDataReader sqlDataReader;
int EmailBody_id = 0;
bool ArchiveFile = true;
int FileHeight = 0;
int FileWidth = 0;
string IndexFile = "";
string IndexFileInsert = "";
string Email = "";
string ArchiveFileName = "";
string EmailFrom = "";
string EmailSubject = "";
string SqlProc = "";
sqlDataReader = command.ExecuteReader();
while (sqlDataReader.Read())
{
EmailBody_id = (int)sqlDataReader.GetValue(sqlDataReader.GetOrdinal("EmailBody_ID"));
ArchiveFile = (bool)sqlDataReader.GetValue(sqlDataReader.GetOrdinal("ArchiveFile"));
FileHeight = (int)sqlDataReader.GetValue(sqlDataReader.GetOrdinal("FileHeight"));
FileWidth = (int)sqlDataReader.GetValue(sqlDataReader.GetOrdinal("FileWidth"));
IndexFile = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("IndexFile")).ToString();
IndexFileInsert = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("IndexFileInsert")).ToString();
Email = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("Email")).ToString();
ArchiveFileName = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("ArchiveFileName")).ToString();
EmailFrom = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("EmailFrom")).ToString();
EmailSubject = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("EmailSubject")).ToString();
SqlProc = sqlDataReader.GetValue(sqlDataReader.GetOrdinal("SqlProc")).ToString();
}
connLog.Close();
return (
EmailBody_id,
ArchiveFile,
FileHeight,
FileWidth,
IndexFile,
IndexFileInsert,
Email,
ArchiveFileName,
EmailFrom,
EmailSubject,
SqlProc);
}
}
public static string GetEmailBody(string connDataExtract, string SQLProc, int Email_ID, int EmailBodyID)
{
string dataExtract = connDataExtract;
string proc = SQLProc;
int emailID = Email_ID;
int bodyID = EmailBodyID;
string MailBody;
using (SqlConnection connLog = new SqlConnection(dataExtract))
{
connLog.Open();
SqlCommand command = new SqlCommand(proc, connLog)
{
CommandType = CommandType.StoredProcedure
};
command.Parameters.Add("#Email_ID", SqlDbType.Int).Value = ((ulong)emailID);
command.Parameters.Add("#EmailBody_ID", SqlDbType.Int).Value = ((ulong)bodyID);
SqlDataReader dataReader;
string Output = "";
dataReader = command.ExecuteReader();
while (dataReader.Read())
{
Output = dataReader.GetValue(0).ToString();
}
connLog.Close();
MailBody = Output;
return MailBody;
}
}
public static void SendEmail(string emailFrom, string emailSubject, string emailTo, string mailBody)
{
string from = emailFrom;
string subject = emailSubject;
string to = emailTo;
string source = mailBody;
using (MailMessage myHtmlFormattedMail = new MailMessage())
{
MailAddress fromMail = new MailAddress(from);
myHtmlFormattedMail.From = fromMail;
myHtmlFormattedMail.Subject = subject;
myHtmlFormattedMail.Body = source;
foreach (var address in to.Split(new[] { ";" }, StringSplitOptions.RemoveEmptyEntries))
{
myHtmlFormattedMail.To.Add(address);
}
myHtmlFormattedMail.IsBodyHtml = true;
SmtpClient mySmtpClient = new SmtpClient();
mySmtpClient.Send(myHtmlFormattedMail);
}
}
public static void IndexFile(string indexFileInsert, string indexfqdn)
{
string insert = indexFileInsert;
string fqdn = indexfqdn;
try
{
if (!File.Exists(fqdn))
{
// create the file and write the line once; returning here avoids writing
// the same line a second time through the append below
using (StreamWriter sw = File.CreateText(fqdn))
{
sw.WriteLine(insert);
}
return;
}
using (StreamWriter sw = File.AppendText(fqdn))
{
sw.WriteLine(insert);
}
}
catch { }
}
public static void LogEmail(string databaseServer, string logProc, int email_ID, int emailBodyID, int primaryEmailUsed, int emailSent, string errorMessage, int errorExists)
{
string dataExtract = databaseServer;
string proc = logProc;
int emailID = email_ID;
int bodyID = emailBodyID;
int usePri = primaryEmailUsed;
int sent = emailSent;
string error = errorMessage;
int exists = errorExists;
using (SqlConnection connLog = new SqlConnection(dataExtract))
{
connLog.Open();
SqlCommand command = new SqlCommand(proc, connLog)
{
CommandType = CommandType.StoredProcedure
};
command.Parameters.Add("#Email_ID", SqlDbType.Int).Value = ((ulong)emailID);
command.Parameters.Add("#EmailBody_ID", SqlDbType.Int).Value = ((ulong)bodyID);
command.Parameters.Add("#PrimaryEmailUsed", SqlDbType.Int).Value = ((ulong)usePri);
command.Parameters.Add("#EmailSent", SqlDbType.Int).Value = ((ulong)sent);
command.Parameters.Add("#ErrorMessage", SqlDbType.NText).Value = error;
command.Parameters.Add("#ErrorExists", SqlDbType.Int).Value = ((ulong)exists);
command.ExecuteNonQuery();
connLog.Close();
}
}
public static void StartBrowser(string mailBody, string file, int width, int height, string fileInsert, string indexFile)
{
try
{
string source = mailBody;
string fqdn = file;
int w = width;
int h = height;
string insert = fileInsert;
string indexFQDN = indexFile;
IndexFile(insert, indexFQDN);
using (WebBrowser wb = new WebBrowser())
{
wb.ScrollBarsEnabled = false;
wb.Width = w;
wb.Height = h;
wb.Visible = false;
wb.DocumentCompleted +=
(sender, e) => WebBrowser_DocumentCompleted(sender, e, fqdn);
wb.DocumentText = source;
Application.Run();
}
}
finally
{
Application.Exit();
}
}
public static void WebBrowser_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e, string file)
{
string fqdn = file;
var webBrowser = (WebBrowser)sender;
using (Bitmap bitmap = new Bitmap(webBrowser.Width, webBrowser.Height))
{
webBrowser
.DrawToBitmap(bitmap, new Rectangle(0, 0, bitmap.Width, bitmap.Height));
bitmap.Save(fqdn, System.Drawing.Imaging.ImageFormat.Jpeg);
}
}
static void Wait(int milliseconds)
{
using (System.Windows.Forms.Timer timer1 = new System.Windows.Forms.Timer())
{
if (milliseconds == 0 || milliseconds < 0) return;
// Console.WriteLine("start wait timer");
timer1.Interval = milliseconds;
timer1.Enabled = true;
timer1.Start();
timer1.Tick += (s, e) =>
{
timer1.Enabled = false;
timer1.Stop();
// Console.WriteLine("stop wait timer");
};
while (timer1.Enabled)
{
System.Windows.Forms.Application.DoEvents();
}
}
}
public static void Archive(string emailBody, string file, int width, int height, string fileInsert, string indexFile, bool archiveFile)
{
string source = emailBody;
string fqdn = file;
int w = width;
int h = height;
string insert = fileInsert;
string indexFQDN = indexFile;
bool archive = archiveFile;
if (archive)
{
Thread tr = new Thread(() => StartBrowser(source, fqdn, w, h, insert, indexFQDN))
{
Name = "Fred",
IsBackground = true
};
tr.SetApartmentState(ApartmentState.STA);
tr.Start();
int wc = 800;
while (!File.Exists(file) && wc > 0) // wait until the file exists or the retry count runs out
{
Wait(50);
wc--;
};
tr.Abort();
}
}
}
}
I have resolved the issue so that it meets the needs of my project. There is probably a better solution, but this does work. Using the code above I created an executable and limited its result set to the top 100 rows. I then created an SSIS package with a For Loop that checks the record count in the staging table and kicks off the executable (a rough sketch of that loop is below). I performed several tests and was able to exceed the 10,000-file requirement of the project.
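In rough C# terms, the For Loop is doing something like the sketch below (the staging table, flag column, and executable path are placeholders rather than the real project objects):

int remaining;
do
{
    // check how many unprocessed rows are left in the staging table
    using (var conn = new SqlConnection(connDataExtract))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM etl.EmailStaging WHERE Processed = 0", conn))
    {
        conn.Open();
        remaining = (int)cmd.ExecuteScalar();
    }

    if (remaining > 0)
    {
        // each run of the executable handles the TOP 100 rows and then exits,
        // so all WebBrowser/GDI resources are released with the process
        using (var exe = Process.Start(@"D:\Apps\DocCreator.exe"))
        {
            exe.WaitForExit();
        }
    }
} while (remaining > 0);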
I have the following code, and it works fine when I loop through the items one by one using foreach, but when I change it to Parallel.ForEach I get errors. I am trying to figure out how I can correct this. FYI, the dParms list contains unique IDs. (One possible variation is sketched after the code below.)
The error I'm getting is:
Error writing to path..... ErrorMessage: Item has already been added. Key in dictionary: 'IIS_WasUrlRewritten' Key being added: 'IIS_WasUrlRewritten'
private void GeneratePages(RequestStatus response, string localDir, List<Parameters> dParms, int total, DateTime generatedTime)
{
int current = 0;
Parallel.ForEach(dParms, curJob =>
{
try
{
DownloadPage(localDir, curJob, generatedTime);
}
catch (Exception ex)
{
response.Status = false;
response.Message = ex.Message;
}
finally
{
Interlocked.Increment(ref current);
if (current % 10 == 0)
{
//to do: Send Progress to UI
}
});
}
private string DownloadPage(string localDir, Parameters p, DateTime generatedTime)
{
string strExtension = "html";
string url = string.Empty;
url = this.Url.Action("MyAction", "Home", new { area = "", @id = p.MyId, @generatedTime = generatedTime }, this.Request.Url.Scheme);
var document = new HtmlWeb().Load(url);
string strFileName = url.Substring(url.LastIndexOf("/") + 1);
strFileName = strFileName.Substring(0, strFileName.IndexOf("generatedTime") - 1);
string strDiskFileName = strFileName.Replace(".aspx?", "");
strDiskFileName = strDiskFileName.Replace("?", "");
strDiskFileName = strDiskFileName.Replace(".aspx", "");
strDiskFileName = strDiskFileName.Replace("&", "");
strDiskFileName = strDiskFileName.Replace("=", "");
strDiskFileName = strDiskFileName.Replace("%20", "");
strDiskFileName += "." + strExtension;
document.Save(localDir + strDiskFileName);
return url;
}
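One possible variation, sketched only and assuming the collision comes from touching the request context (Url.Action / Request) on worker threads, is to resolve the URLs serially and parallelize just the download and save. BuildFileName below is a hypothetical helper holding the same file-name cleanup that DownloadPage does today:

// build the URLs on the request thread, before the parallel loop
var urls = dParms
    .Select(p => this.Url.Action("MyAction", "Home",
        new { area = "", id = p.MyId, generatedTime = generatedTime },
        this.Request.Url.Scheme))
    .ToList();

int current = 0;
Parallel.ForEach(urls, url =>
{
    try
    {
        // only HtmlAgilityPack work happens on the worker threads
        var document = new HtmlWeb().Load(url);
        document.Save(localDir + BuildFileName(url)); // hypothetical helper, same cleanup as DownloadPage
    }
    catch (Exception ex)
    {
        response.Status = false;
        response.Message = ex.Message;
    }
    finally
    {
        Interlocked.Increment(ref current);
    }
});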
I have a script that imports a CSV file and reads each line to update the corresponding item in Sitecore. It works for many of the products, but the problem is with products where certain cells in the row have commas in them (such as the product description).
protected void SubmitButton_Click(object sender, EventArgs e)
{
if (UpdateFile.PostedFile != null)
{
var file = UpdateFile.PostedFile;
// check if valid csv file
message.InnerText = "Updating...";
Sitecore.Context.SetActiveSite("backedbybayer");
_database = Database.GetDatabase("master");
SitecoreContext context = new SitecoreContext(_database);
Item homeNode = context.GetHomeItem<Item>();
var productsItems =
homeNode.Axes.GetDescendants()
.Where(
child =>
child.TemplateID == new ID(TemplateFactory.FindTemplateId<IProductDetailPageItem>()));
try
{
using (StreamReader sr = new StreamReader(file.InputStream))
{
var firstLine = true;
string currentLine;
var productIdIndex = 0;
var industryIdIndex = 0;
var categoryIdIndex = 0;
var pestIdIndex = 0;
var titleIndex = 0;
string title;
string productId;
string categoryIds;
string industryIds;
while ((currentLine = sr.ReadLine()) != null)
{
var data = currentLine.Split(',').ToList();
if (firstLine)
{
// find index of the important columns
productIdIndex = data.IndexOf("ProductId");
industryIdIndex = data.IndexOf("PrimaryIndustryId");
categoryIdIndex = data.IndexOf("PrimaryCategoryId");
titleIndex = data.IndexOf("Title");
firstLine = false;
continue;
}
title = data[titleIndex];
productId = data[productIdIndex];
categoryIds = data[categoryIdIndex];
industryIds = data[industryIdIndex];
var products = productsItems.Where(x => x.DisplayName == title);
foreach (var product in products)
{
product.Editing.BeginEdit();
try
{
product.Fields["Product Id"].Value = productId;
product.Fields["Product Industry Ids"].Value = industryIds;
product.Fields["Category Ids"].Value = categoryIds;
}
finally
{
product.Editing.EndEdit();
}
}
}
}
// when done
message.InnerText = "Complete";
}
catch (Exception ex)
{
message.InnerText = "Error reading file";
}
}
}
The problem is that when a description field has commas, like "Product is an effective, preventative biofungicide," it gets split as well and throws off the index, so categoryIds = data[8] gets the wrong value.
The spreadsheet is data that is provided by our client, so I would rather not require the client to edit the file unless necessary. Is there a way I can handle this in my code? Is there a different way I can read the file that won't split everything by comma?
I suggest using ADO.NET (OLE DB). If a field's data is inside quotes, it is parsed as a single field and any commas inside it are ignored. A code example and a sample call are below.
Code Example:
// Requires: using System.Data; using System.Data.OleDb; using System.Globalization; using System.IO;
static DataTable GetDataTableFromCsv(string path, bool isFirstRowHeader)
{
string header = isFirstRowHeader ? "Yes" : "No";
string pathOnly = Path.GetDirectoryName(path);
string fileName = Path.GetFileName(path);
string sql = @"SELECT * FROM [" + fileName + "]";
using(OleDbConnection connection = new OleDbConnection(
@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + pathOnly +
";Extended Properties=\"Text;HDR=" + header + "\""))
using(OleDbCommand command = new OleDbCommand(sql, connection))
using(OleDbDataAdapter adapter = new OleDbDataAdapter(command))
{
DataTable dataTable = new DataTable();
dataTable.Locale = CultureInfo.CurrentCulture;
adapter.Fill(dataTable);
return dataTable;
}
}
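For example, a possible call site (the path is illustrative; the column names are the ones from the question):

DataTable csv = GetDataTableFromCsv(@"C:\uploads\products.csv", true);
foreach (DataRow row in csv.Rows)
{
    // quoted fields such as "Product is an effective, preventative biofungicide"
    // come back as a single value, commas intact
    string title = row["Title"].ToString();
    string productId = row["ProductId"].ToString();
    string categoryIds = row["PrimaryCategoryId"].ToString();
    string industryIds = row["PrimaryIndustryId"].ToString();
    // ... update the Sitecore item as before
}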
I have a Windows service which periodically fetches data from a table, creates an Excel file, and mails it to users. After sending the mail I need to delete that file. I have used the following code:
public void LABInstrumentExcelGeneration(string filePath) {
try {
string connectionString = GetConnectionString(filePath);
List < LABInstruments > listLABInstrument = null;
listLABInstrument = new List < LABInstruments > ();
listLABInstrument = LABInstrumentBL.GetLABInstrumentList();
if (listLABInstrument.Count > 0) {
using(OleDbConnection conn = new OleDbConnection(connectionString)) {
conn.Open();
OleDbCommand cmd = new OleDbCommand();
cmd.Connection = conn;
cmd.CommandText = "CREATE TABLE [table2] (SrNo string,CalibrationDoneOn Date);";
cmd.ExecuteNonQuery();
foreach(LABInstruments tc1 in listLABInstrument) {
cmd.CommandText = "INSERT INTO [table2](SrNo,CalibrationDoneOn) VALUES('" + tc1.SrNo + "','" + tc1.CalibrationDoneOn + "');";
cmd.ExecuteNonQuery();
}
conn.Close();
conn.Dispose();
}
}
} catch (Exception ex) {}
}
SendMail(filePath, role);
if (File.Exists(filePath)) {
File.Delete(filePath);
eLog.WriteEntry("file deleted");
}
But it gives the error "File is being used by another process."
How can I delete the file? Moreover, I've used OLEDB for the file creation. Is there any other best practice for file creation? I have tried ExcelLibrary, but the files it creates do not work in all versions of Office, so I have dropped it.
Try this:
protected virtual bool IsLocked(FileInfo fileName)
{
FileStream fStream = null;
try
{
fStream = fileName.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
}
catch (IOException)
{
return true;
}
finally
{
if (fStream != null)
{
fStream.Close();
}
}
return false;
}
And then:
if (File.Exists(filePath))
{
FileInfo myfile = new FileInfo(filePath);
if(IsLocked(myfile))
{
File.Create(filePath).Close();
File.Delete(filePath);
eLog.WriteEntry("file deleted");
}
else
{
File.Delete(filePath);
eLog.WriteEntry("file deleted");
}
}
I think the problem may be that SendMail returns before it has finished using the file.
Instead of SendMail() I used this function, which frees up the file for deletion (a sample call is shown after it):
public static void send(string subject, string body, string from, string to, List<string> attachments = null)
{
using (MailMessage message = new MailMessage(new MailAddress(from), new MailAddress(to)))
{
message.Subject = subject;
message.Body = body;
if (attachments != null && attachments.Count > 0)
{
foreach (string s in attachments)
{
if (s != null)
{
/* this code fixes the error where the attached file is
* prepended with the path of the file */
Attachment attachment = new Attachment(s, MediaTypeNames.Application.Octet);
ContentDisposition disposition = attachment.ContentDisposition;
disposition.CreationDate = File.GetCreationTime(s);
disposition.ModificationDate = File.GetLastWriteTime(s);
disposition.ReadDate = File.GetLastAccessTime(s);
disposition.FileName = Path.GetFileName(s);
disposition.Size = new FileInfo(s).Length;
disposition.DispositionType = DispositionTypeNames.Attachment;
message.Attachments.Add(attachment);
}
}
}
using (SmtpClient client = new SmtpClient())
{
client.Send(message);
}
}
}
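For example (a sketch; the subject and addresses are placeholders): because the MailMessage, Attachment, and SmtpClient are all disposed inside send(), the attachment file is no longer locked once it returns, and the delete succeeds:

send("LAB Instrument Report", "Please find the report attached.",
     "reports@example.com", "user@example.com",
     new List<string> { filePath });

if (File.Exists(filePath))
{
    File.Delete(filePath);   // no longer "being used by another process"
    eLog.WriteEntry("file deleted");
}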