I am using Koolwired.Imap to retrieve attachments via IMAP - C#

I am using Koolwired.Imap to retrieve attachments. The following is the code that I have written.
using K = Koolwired.Imap;

public void GetAttachmentsTest(string thread, string selectFolder, string fileName)
{
    K.ImapConnect connect = new K.ImapConnect(Global.host);
    K.ImapCommand command = new K.ImapCommand(connect);
    K.ImapAuthenticate auth = new K.ImapAuthenticate(connect, Global.username, Global.password);
    connect.Open();
    auth.Login();
    K.ImapMailbox mailBox = command.Select(Global.inbox);
    mailBox = command.Fetch(mailBox);
    K.ImapMailboxMessage mbstructure = new K.ImapMailboxMessage();
    while (true)
    {
        try
        {
            int mailCount = mailBox.Messages.Count;
            if (mailCount == 0)
            {
                Console.WriteLine("no more emails");
                break;
            }
            for (int i = 0; i < mailCount; ++i)
            {
                mbstructure = mailBox.Messages[mailCount - 1];
                mbstructure = command.FetchBodyStructure(mbstructure);
                for (int j = 0; j < mbstructure.BodyParts.Count; ++j)
                {
                    if (mbstructure.BodyParts[j].Attachment)
                    {
                        // Attachment
                        command.FetchBodyPart(mbstructure, mbstructure.BodyParts.IndexOf(mbstructure.BodyParts[j]));
                        // Write binary file
                        string tempPath = Path.GetTempPath();
                        FileStream fs = new FileStream(tempPath + mbstructure.BodyParts[j].FileName, FileMode.Create);
                        int length = Convert.ToInt32(mbstructure.BodyParts[j].DataBinary.Length);
                        fs.Write(mbstructure.BodyParts[j].DataBinary, 0, length);
                        fs.Flush();
                        fs.Close();
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("T1 " + ex.Message);
            Console.WriteLine("T1 " + ex.StackTrace);
            if (ex.InnerException != null)
                Console.WriteLine("T1 " + ex.InnerException.Message);
        }
    }
}
I am getting an error on the statement:
int length = Convert.ToInt32(mbstructure.BodyParts[j].DataBinary.Length);
and
fs.Write(mbstructure.BodyParts[j].DataBinary, 0,length);
and the error is:
The input is not a valid Base-64 string as it contains a non-base 64 characters, more than two padding characters, or an illegal character among the padding characters.
The above code breaks down at the lines shown when there is only one attachment.
If there is more than one attachment, the code breaks down on the line:
mbstructure = command.FetchBodyStructure(mbstructure);
and the error is:
Invalid format could not parse body part headers.
I am so close to getting this assignment taken care of. Could anyone please help me?
I would also like to know how to delete the emails once I retrieve them.
Thanks.

I experienced the same problem.
In case anyone cares, I solved it by downloading the latest source code for the library from CodePlex.
Once compiled, it works with no change. It looks like they have fixed it.
Also, for deleting an email, just mark it for deletion:
command.SetDeleted(n, true); // where n is the message number
Since this is an IMAP connection, you actually have to expunge the deleted messages to complete the deletion.
command.Expunge();
Hope it helps someone.
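For context, here is a minimal sketch of how those two calls could slot into the question's loop once the attachments have been saved. It assumes the connect/auth/Select/Fetch setup shown above, and using i + 1 as the message number passed to SetDeleted is an assumption about how this library numbers messages:

// Sketch only: mark each processed message for deletion, then expunge.
for (int i = 0; i < mailBox.Messages.Count; ++i)
{
    K.ImapMailboxMessage msg = mailBox.Messages[i];
    msg = command.FetchBodyStructure(msg);
    // ... fetch and save the attachments as in the question ...
    command.SetDeleted(i + 1, true); // n = message number (assumed to be i + 1 here)
}
command.Expunge(); // IMAP: actually removes the messages marked as deleted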

Related

Index was outside the bounds of the array in MSCORLIB.DLL

I will be amazed if I find a solution for this, since it is very specific and vague, but I figured I would try. I'll try to give as much information as humanly possible, since I've been searching for answers for some time now.
I am building a utility in C# which copies records from a file in a library on the i-series/AS400 and builds an encrypted text file with each record from the AS400 as a comma-separated string. In the file, it will have values like filename, fieldvalue1, fieldvalue2, fieldvalue3. I then take that text file to another PC, and run a C# utility which copies that record into the same file name in a library over there on a different i-series machine. Unfortunately, I receive the "Index was outside the bounds of the array" exception in some cases, but I cannot determine why. In the record just prior to the exception, the record looks pretty much the same and it works fine. My code is below in a nutshell. I usually don't give up, but I don't expect to ever figure this out. If someone does, I'll probably sing karaoke tonight.
// Select records from AS400 file and write them to text file
Recordset rs = new Recordset();
sqlQuery = "SELECT * FROM " + dataLibrary + "." + fileName;
try
{
rs.Open(sqlQuery, con);
while (!rs.EOF)
{
int[] fieldLengths;
fieldLengths = new int[rs.Fields.Count];
String[] fieldValues;
fieldValues = new String[rs.Fields.Count];
String fullString = "";
for (i = 0; i < rs.Fields.Count; i++)
{
fieldLengths[i] += rs.Fields[i].DefinedSize;
fieldValues[i] += rs.Fields[i].Value;
}
fullString = fileName + "," + String.Join(",", fieldValues);
fullString = Functions.EncryptString(fullString);
File.AppendAllText(savefile.FileName, fullString + Environment.NewLine);
rs.MoveNext();
}
}
catch (Exception ex)
{
}
cmd.Dispose();
// This gives me a text file of filename, fieldvalue1, fieldvalue2, etc...
// Next, I take the file to another system and run this process:
while ((myString = inputFile.ReadLine()) != null)
{
int stringLength = myString.Length;
String[] valuesArray = myString.Split(',');
for (i = 0; i < valuesArray.Length; i++)
{
if (i == 0)
{
fileName = valuesArray[0];
// Create file if it doesn't exist already
createPhysicalFile(newLibrary, fileName);
SQLStatement = "INSERT INTO " + newLibrary + "." + fileName + "VALUES(";
}
else
{
if (i == valuesArray.Length - 1)
{
SQLStatement += "#VAL" + i + ")";
}
else
{
SQLStatement += "#VAL" + i + ", ";
}
}
}
try
{
using (connection)
{
try
{
connection.Open();
}
catch (Exception ex)
{
}
// Create a new SQL command
iDB2Command command = new iDB2Command(SQLStatement, connection);
for (i = 1; i < valuesArray.Length; i++)
{
try
{
command.Parameters.AddWithValue("#VAL" + i, (valuesArray[i]));
}
catch (Exception ex)
{
}
}
// Just split the array into a string to visually check
// differences in the records
String arraySplit = ConvertStringArrayToString(valuesArray);
// The query gets executed here. The command looks something
// like:
// INSERT INTO LIBNAME.FILENAME VALUES(@VAL1, @VAL2, @VAL3, @VAL4)
// There are actually 320 fields in the file I'm having a problem with,
// so it's possible I'm overlooking something. I have narrowed it down to
// field # 316 when the exception occurs, but in both cases
// field 316 is blanks (when it works and when it doesn't).
command.ExecuteNonQuery();
}
}
catch (Exception ex)
{
// Here I get the exception out of bounds error in MSCORLIB.DLL.
// Some records are added fine, while others cause this exception.
// I cannot visibly tell any major differences, nor do I see any
// errors in the AS400 job log or anything in C# that would lead me
// down a certain path.
String error = ex.Message;
}
}
For what it's worth, I found this happening on a smaller file in the system and was able to figure out what was going on, after painstaking research into the code and the net. Basically, the file has numeric fields on the i-series. Somehow, the records were written to the file on the original system with null values in the numeric fields instead of numeric values. When storing the original records, I had to do this check:
String fieldType = rs.Fields[i].Type.ToString();
object objValue = rs.Fields[i].Value;
if (fieldType == "adNumeric" && objValue is DBNull)
{
fieldValues[i] += "0";
}
else
{
fieldValues[i] += rs.Fields[i].Value;
}
After this, if null values were found in one of the numeric fields, it just put "0" in its place so that when writing to the new machine, it would put a valid numeric character in there and continue writing the rest of the values. Thanks for all the advice and moral support. :)

"The process cannot access the file because it is being used by another process"

The full error I am receiving is:
"The process cannot access the file 'e:\Batch\NW\data_Test\IM_0232\input\RN318301.WM' because it is being used by another process.>>> at IM_0232.BatchModules.BundleSort(String bundleFileName)
at IM_0232.BatchModules.ExecuteBatchProcess()"
The involved code can be seen below. The RN318301.WM file being processed is a text file that contains information which will eventually be placed in PDF documents. There are many documents referenced in the RN318301.WM text file, with each one represented by a collection of rows. As can be seen in the code, the RN318301.WM text file is first parsed to determine the number of documents represented in it as well as the maximum number of lines in a document. This information is then used to create a two-dimensional array that will contain all of the document information. The RN318301.WM text file is parsed again to populate the two-dimensional array, and at the same time information is collected into a dictionary that will be sorted later in the routine.
The failure occurs at the last line below:
File.Delete(_bundlePath + Path.GetFileName(bundleFileName));
This is a sporadic problem that occurs only rarely. It has even been seen to occur with a particular text file with which it had not previously occurred. That is, a particular text file will process fine but then on reprocessing the error will be triggered.
Can anyone help us to diagnose the cause of this error? Thank you very much...
public void BundleSort(string bundleFileName)
{
Dictionary<int, string> memberDict = new Dictionary<int, string>();
Dictionary<int, string> sortedMemberDict = new Dictionary<int, string>();
//int EOBPosition = 0;
int EOBPosition = -1;
int lineInEOB = 0;
int eobCount = 0;
int lineCount = 0;
int maxLineCount = 0;
string compareString;
string EOBLine;
//string[][] EOBLineArray;
string[,] EOBLineArray;
try
{
_batch.TranLog_Write("\tBeginning sort of bundle " + _bundleInfo.BundleName + " to facilitate householding");
//Read the bundle and create a dictionary of comparison strings with EOB position in the bundle being the key
StreamReader file = new StreamReader(@_bundlePath + _bundleInfo.BundleName);
//The next section of code counts CH records as well as the maximum number of CD records in an EOB. This information is needed for initialization of the 2-dimensional EOBLineArray array.
while ((EOBLine = file.ReadLine()) != null)
{
if (EOBLine.Substring(0, 2) == "CH" || EOBLine.Substring(0, 2) == "CT")
{
if (lineCount == 0)
lineCount++;
if (lineCount > maxLineCount)
{
maxLineCount = lineCount;
}
eobCount++;
if (lineCount != 1)
lineCount = 0;
}
if (EOBLine.Substring(0, 2) == "CD")
{
lineCount++;
}
}
EOBLineArray = new string[eobCount, maxLineCount + 2];
file = new StreamReader(@_bundlePath + _bundleInfo.BundleName);
try
{
while ((EOBLine = file.ReadLine()) != null)
{
if (EOBLine.Substring(0, 2) == "CH")
{
EOBPosition++;
lineInEOB = 0;
compareString = EOBLine.Substring(8, 40).Trim() + EOBLine.Substring(49, 49).TrimEnd().TrimStart() + EOBLine.Substring(120, 5).TrimEnd().TrimStart();
memberDict.Add(EOBPosition, compareString);
EOBLineArray[EOBPosition, lineInEOB] = EOBLine;
}
else
{
if (EOBLine.Substring(0, 2) == "CT")
{
EOBPosition++;
EOBLineArray[EOBPosition, lineInEOB] = EOBLine;
}
else
{
lineInEOB++;
EOBLineArray[EOBPosition, lineInEOB] = EOBLine;
}
}
}
}
catch (Exception ex)
{
throw ex;
}
_batch.TranLog_Write("\tSending original unsorted bundle to archive");
if(!(File.Exists(_archiveDir + "\\" +DateTime.Now.ToString("yyyyMMdd")+ Path.GetFileName(bundleFileName) + "_original")))
{
File.Copy(_bundlePath + Path.GetFileName(bundleFileName), _archiveDir + "\\" +DateTime.Now.ToString("yyyyMMdd")+ Path.GetFileName(bundleFileName) + "_original");
}
file.Close();
file.Dispose();
GC.Collect();
File.Delete(_bundlePath + Path.GetFileName(bundleFileName));
You didn't close/dispose your StreamReader the first time round, so the file handle is still open.
Consider using the using construct - this will automatically dispose of the object when it goes out of scope:
using(var file = new StreamReader(args))
{
// Do stuff
}
// file has now been disposed/closed etc
You need to close your StreamReaders for one thing.
StreamReader file = new StreamReader(@_bundlePath + _bundleInfo.BundleName);
You need to close the StreamReader object, and you could do this in a finally block:
finally {
file.Close();
}
A better way is to use a using block:
using (StreamReader file = new StreamReader(@_bundlePath + _bundleInfo.BundleName)) {
...
}
It looks to me like you are calling GC.Collect to try to force the closing of these StreamReaders, but that doesn't guarantee that they will be closed immediately as per the MSDN doc:
http://msdn.microsoft.com/en-us/library/xe0c2357.aspx
From that doc:
"All objects, regardless of how long they have been in memory, are considered for collection;"

Using StreamWriter to implement a rolling log, and deleting from top

My C# WinForms 4.0 application has been using a thread-safe StreamWriter for internal debug logging. When my app opens, it deletes the file and recreates it. When the app closes, it saves the file.
What I'd like to do is modify my application so that it does appending instead of replacing. This is a simple fix.
However, here's my question:
I'd like to keep my log file AROUND 10 megabytes maximum. My constraint would be simple. When you go to close the file, if the file is greater than 10 megabytes, trim out the first 10%.
Is there a 'better' way than doing the following:
Close the file
Check if the file is > 10 meg
If so, open the file
Parse the entire thing
Cull the first 10%
Write the file back out
Close
Edit: well, I ended up rolling my own (shown below). The suggestion to move over to log4net is a good one, but the time it would take to learn the new library and move all my log statements (thousands) over isn't time-effective for the small enhancement I was trying to make.
private static void PerformFileTrim(string filename)
{
var FileSize = Convert.ToDecimal((new System.IO.FileInfo(filename)).Length);
if (FileSize > 5000000)
{
var file = File.ReadAllLines(filename).ToList();
var AmountToCull = (int)(file.Count * 0.33);
var trimmed = file.Skip(AmountToCull).ToList();
File.WriteAllLines(filename, trimmed);
}
}
I researched this once and never came up with anything, but I can offer you plan B here:
I use the code below to keep a maximum of 3 log files. At first, log file 1 is created and appended to. When it exceeds the maximum size, log 2 and later log 3 are created. When log 3 is too large, log 1 is deleted and the remaining logs get pushed down the stack.
string[] logFileList = Directory.GetFiles(Path.GetTempPath(), "add_all_*.log", SearchOption.TopDirectoryOnly);
if (logFileList.Count() > 1)
{
Array.Sort(logFileList, 0, logFileList.Count());
}
if (logFileList.Any())
{
string currFilePath = logFileList.Last();
string[] dotSplit = currFilePath.Split('.');
string lastChars = dotSplit[0].Substring(dotSplit[0].Length - 3);
ctr = Int32.Parse(lastChars);
FileInfo f = new FileInfo(currFilePath);
if (f.Length > MaxLogSize)
{
if (logFileList.Count() > MaxLogCount)
{
File.Delete(logFileList[0]);
for (int i = 1; i < MaxLogCount + 1; i++)
{
Debug.WriteLine(string.Format("moving: {0} {1}", logFileList[i], logFileList[i - 1]));
File.Move(logFileList[i], logFileList[i - 1]); // push older log files back, in order to pop new log on top
}
}
else
{
ctr++;
}
}
}
The solutions here did not really work for me. I took user3902302's answer, which again was based on bigtech's answer, and wrote a complete class. Also, I am NOT using StreamWriter; you can change the one line (AppendAllText against the StreamWriter equivalent).
There is little error handling (e.g. retrying access when it fails, though the lock should catch all internal concurrent access).
This might be enough for some people who had to use a big solution like log4net or NLog before. (And the log4net RollingAppender is not even thread-safe, this one is. :) )
public class RollingLogger
{
readonly static string LOG_FILE = @"c:\temp\logfile.log";
readonly static int MaxRolledLogCount = 3;
readonly static int MaxLogSize = 1024; // 1 * 1024 * 1024; <- small value for testing that it works, you can try yourself, and then use a reasonable size, like 1M-10M
public static void LogMessage(string msg)
{
lock (LOG_FILE) // lock is optional, but.. should this ever be called by multiple threads, it is safer
{
RollLogFile(LOG_FILE);
File.AppendAllText(LOG_FILE, msg + Environment.NewLine, Encoding.UTF8);
}
}
private static void RollLogFile(string logFilePath)
{
try
{
var length = new FileInfo(logFilePath).Length;
if (length > MaxLogSize)
{
var path = Path.GetDirectoryName(logFilePath);
var wildLogName = Path.GetFileNameWithoutExtension(logFilePath) + "*" + Path.GetExtension(logFilePath);
var bareLogFilePath = Path.Combine(path, Path.GetFileNameWithoutExtension(logFilePath));
string[] logFileList = Directory.GetFiles(path, wildLogName, SearchOption.TopDirectoryOnly);
if (logFileList.Length > 0)
{
// only take files like logfilename.log and logfilename.0.log, so there also can be a maximum of 10 additional rolled files (0..9)
var rolledLogFileList = logFileList.Where(fileName => fileName.Length == (logFilePath.Length + 2)).ToArray();
Array.Sort(rolledLogFileList, 0, rolledLogFileList.Length);
if (rolledLogFileList.Length >= MaxRolledLogCount)
{
File.Delete(rolledLogFileList[MaxRolledLogCount - 1]);
var list = rolledLogFileList.ToList();
list.RemoveAt(MaxRolledLogCount - 1);
rolledLogFileList = list.ToArray();
}
// move remaining rolled files
for (int i = rolledLogFileList.Length; i > 0; --i)
File.Move(rolledLogFileList[i - 1], bareLogFilePath + "." + i + Path.GetExtension(logFilePath));
var targetPath = bareLogFilePath + ".0" + Path.GetExtension(logFilePath);
// move original file
File.Move(logFilePath, targetPath);
}
}
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex.ToString());
}
}
}
Edit:
Since I just noticed that you asked a slightly different question: should your lines vary greatly in size, this would be a variation. In 90% of cases it does not improve over yours, though, and it might be very slightly faster, but it also introduces a new unhandled error ('\n' not being present):
private static void PerformFileTrim(string filename)
{
var fileSize = (new System.IO.FileInfo(filename)).Length;
if (fileSize > 5000000)
{
var text = File.ReadAllText(filename);
var amountToCull = (int)(text.Length * 0.33);
amountToCull = text.IndexOf('\n', amountToCull);
var trimmedText = text.Substring(amountToCull + 1);
File.WriteAllText(filename, trimmedText);
}
}
This is derived from bigtech's answer:
private static string RollLogFile()
{
string path = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
string appName = Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]);
string wildLogName = string.Format("{0}*.log",appName);
int fileCounter = 0;
string[] logFileList = Directory.GetFiles(path, wildLogName, SearchOption.TopDirectoryOnly);
if (logFileList.Length > 0)
{
Array.Sort(logFileList, 0, logFileList.Length);
fileCounter = logFileList.Length - 1;
//Make sure we apply the MaxLogCount (but only once to reduce the delay)
if (logFileList.Length > MaxLogCount)
{
//Too many files - remove one and rename the others
File.Delete(logFileList[0]);
for (int i = 1; i < logFileList.Length; i++)
{
File.Move(logFileList[i], logFileList[i - 1]);
}
--fileCounter;
}
string currFilePath = logFileList[fileCounter];
FileInfo f = new FileInfo(currFilePath);
if (f.Length < MaxLogSize)
{
//still room in the current file
return currFilePath;
}
else
{
//need another filename
++fileCounter;
}
}
return string.Format("{0}{1}{2}{3:00}.log", path, Path.DirectorySeparatorChar, appName, fileCounter);
}
Usage:
string logFileName = RollLogFile();
using (StreamWriter sw = new StreamWriter(logFileName, true))
{
sw.AutoFlush = true;
sw.WriteLine(string.Format("{0:u} {1}", DateTime.Now, message));
}
This function will allow you to rotate your log based on weekdays. The first time your application launches on a Monday, it will check for an existing entry for Monday's date; if the file has not already been initialized for today, it will discard the old entries and reinitialize the file. For the rest of that day, it will keep appending text to the same log file.
So, a total of 7 log files will be created:
debug-Mon.txt, debug-Tue.txt, ...
It will also add the name of the method that actually logged the message, along with the date and time. Very useful for general-purpose use.
private void log(string text)
{
string dd = DateTime.Now.ToString("yyyy-MM-dd");
string mm = DateTime.Now.ToString("ddd");
if (File.Exists("debug-" + mm + ".txt"))
{
String contents = File.ReadAllText("debug-" + mm + ".txt");
if (!contents.Contains("Date: " + dd))
{
File.Delete("debug-" + mm + ".txt");
}
}
File.AppendAllText("debug-" + mm + ".txt", "\r\nDate: " + DateTime.Now.ToString("yyyy-MM-dd HH:mm:s") + " =>\t" + new System.Diagnostics.StackFrame(1, true).GetMethod().Name + "\t" + text);
}
I liked greggorob64's solution but also wanted to zip the old file. This has everything you need other than the part of compressing the old file to a zip, which you can find here: Create zip file in memory from bytes (text with arbitrary encoding)
static int iMaxLogLength = 2000; // Probably should be bigger, say 200,000
static int KeepLines = 5; // minimum of how much of the old log to leave
public static void ManageLogs(string strFileName)
{
try
{
FileInfo fi = new FileInfo(strFileName);
if (fi.Length > iMaxLogLength) // if the log file length is already too long
{
int TotalLines = 0;
var file = File.ReadAllLines(strFileName);
var LineArray = file.ToList();
var AmountToCull = (int)(LineArray.Count - KeepLines);
var trimmed = LineArray.Skip(AmountToCull).ToList();
File.WriteAllLines(strFileName, trimmed);
string archiveName = strFileName + "-" + DateTime.Now.ToString("MM-dd-yyyy") + ".zip";
File.WriteAllBytes(archiveName, Compression.Zip(string.Join("\n", file)));
}
}
catch (Exception ex)
{
Console.WriteLine("Failed to write to logfile : " + ex.Message);
}
}
I have this as part of the initialization / reinitialization section of my application, so it gets run a few times a day.
ErrorLogging.ManageLogs("Application.log");
I was looking through the Win32 API, and I'm not even sure it's possible to do this with native Win32 VFS calls, never mind through .NET.
About the only solution I would have would be to use memory-mapped files and move the data manually, which .NET seems to support as of .NET 4.0.
Memory Mapped Files
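For what that might look like, here is a hypothetical sketch using MemoryMappedFile to shift the tail of the log to the front and then truncate the file. The class and method names, the paths, and the assumption that the file is under 2 GB are all illustrative, not from the original post:

using System.IO;
using System.IO.MemoryMappedFiles;

// Hypothetical sketch of trimming the front of a log in place with a memory-mapped view.
// Assumes the file is smaller than 2 GB so the tail fits in one managed buffer.
static class MappedLogTrimmer
{
    public static void TrimFront(string path, long bytesToDrop)
    {
        long length = new FileInfo(path).Length;
        int remaining = (int)(length - bytesToDrop);
        if (remaining <= 0) return;

        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
        using (var view = mmf.CreateViewAccessor(0, length))
        {
            var tail = new byte[remaining];
            view.ReadArray(bytesToDrop, tail, 0, remaining); // read everything after the cut point
            view.WriteArray(0, tail, 0, remaining);          // shift it to the start of the file
        }

        // A mapping cannot shrink the file, so truncate it afterwards.
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Write))
            fs.SetLength(remaining);
    }
}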

Windows service of OpenPop exit from GetMessage code, same code OK win app

We are implementing code to download attachments from a Gmail account using the OpenPop namespace. In this code we check the attachment size; if the attachment size is greater than a specified value (set in the config file, in KB), then it has to send an email to the sender.
It works fine in a Windows application, but whenever I implement the code in a Windows service there is a problem: it exits the function at this line of code
OpenPop.Mime.Message m = popClient.GetMessage(i);
Framework: 3.5
VS: 2008
Language: C#
OpenPop namespace v2.0.4.369
This is my code:
private void ReceiveMails()
{
    Utility.Log = true;
    if (popClient.Connected)
    {
        popClient.Disconnect();
    }
    popClient.Connect(POPServer, port, ssl);
    popClient.Authenticate(username, password);
    int Count = popClient.GetMessageCount();
    writeToLogFile("Total Mail count is:" + Count.ToString());
    if (Count > 0)
    {
        for (int i = 1; i <= Count; i++)
        {
            flag = false;
            OpenPop.Mime.Message m = popClient.GetMessage(i);
            Sub = m.Headers.Subject;
            int size = popClient.GetMessageSize(i);
            int mailsize = int.Parse(ConfigurationSettings.AppSettings["emailSize"]) * 1024;
            if (size < mailsize)
            {
                // we are checking the subject of the email
                for (int j = 1; j < 30; j++)
                {
                    strFranchisekey = ConfigurationSettings.AppSettings["Franchise" + j];
                    if (strFranchisekey != "")
                    {
                        int inex = strFranchisekey.IndexOf("=");
                        strFranchiseshortvalue = strFranchisekey.Substring(0, inex);
                        if (Sub.Contains(strFranchiseshortvalue))
                        {
                            flag = true;
                            foreach (OpenPop.Mime.MessagePart attachment in m.FindAllAttachments())
                            {
                                writeToLogFile(attachment.FileName);
                                file = attachment.FileName;
                                index = strFranchisekey.IndexOf("=");
                                string StrCity = strFranchisekey.Substring(index + 1);
                                strFolderPath = ConfigurationSettings.AppSettings["FolderPath" + StrCity];
                                StrSubFolderPath = ConfigurationSettings.AppSettings["SubPath" + StrCity];
                                if (Directory.Exists(strFolderPath)) // we are checking whether the folder exists
                                {
                                    File.WriteAllBytes(strFolderPath + "\\" + file, attachment.Body);
                                }
                                else if (Directory.Exists(StrSubFolderPath))
                                {
                                    File.WriteAllBytes(StrSubFolderPath + "\\" + file, attachment.Body);
                                }
                                else
                                {
                                    // we can give here an invalid path.
                                    File.WriteAllBytes(ConfigurationSettings.AppSettings["InvalidPath"] + "\\" + file, attachment.Body);
                                    sendEmail(i);
                                }
                            }
                            break;
                        }
                    }
                }
                if (flag != true)
                {
                    writeToLogFile("matching franchise name is not found");
                    foreach (OpenPop.Mime.MessagePart attachment in m.FindAllAttachments())
                    {
                        File.WriteAllBytes(ConfigurationSettings.AppSettings["InvalidPath"] + "\\" + file, attachment.Body);
                    }
                    sendEmail(i);
                }
            }
            else
            {
                writeToLogFile("Please reduce the email size");
            }
        }
    }
    else
    {
        writeToLogFile("No New Attachment");
    }
}
Thanks @Antonio Bakula. I wrote a try/catch block in my Windows service app and logged the exception. Then I understood my bug: it threw an exception saying the message could not be read because another instance was already reading it. This was because the code is timer-based and I was firing the event every minute. I have now added code to stop the timer as soon as email processing starts and to start it again once the email processing code finishes.
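A rough sketch of that timer pattern, assuming a System.Timers.Timer field named emailTimer (the names here are illustrative, not taken from the original service code):

// Hypothetical sketch: pause the timer while mail is being processed so two
// ticks can never use the POP3 session at the same time.
private void emailTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    emailTimer.Stop();           // stop the timer as soon as processing starts
    try
    {
        ReceiveMails();          // the method shown above
    }
    catch (Exception ex)
    {
        writeToLogFile(ex.ToString());
    }
    finally
    {
        emailTimer.Start();      // restart once processing has finished
    }
}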

Move files in C#

I am moving some images (the filenames are (1).PNG, (2).PNG and so on) from one directory to another. I am using the following code:
for (int i = 1; i < n; i++)
{
try
{
from = "E:\\vid\\(" + i + ").PNG";
to = "E:\\ConvertedFiles\\" + i + ".png";
File.Move(from, to); // Try to move
Console.WriteLine("Moved"); // Success
}
catch (IOException ex)
{
Console.WriteLine(ex); // Write error
}
}
However, I am getting the following error:
A first chance exception of type System.IO.FileNotFoundException occurred in mscorlib.dll
System.IO.FileNotFoundException: Could not find file 'E:\vid\(1).PNG'.
Also, I am planning to rename the files so that the converted file name will be 00001.png, 00002.png, ... 00101.png and so on.
I suggest you use '@' (verbatim string literals) so that backslashes don't need escaping, which is more readable. Also use Path.Combine(...) to concatenate paths, and PadLeft to format your filenames as specified.
for (int i = 1; i < n; i++)
{
try
{
from = System.IO.Path.Combine(@"E:\vid\", "(" + i.ToString() + ").PNG");
to = System.IO.Path.Combine(@"E:\ConvertedFiles\", i.ToString().PadLeft(6, '0') + ".png");
File.Move(from, to); // Try to move
Console.WriteLine("Moved"); // Success
}
catch (IOException ex)
{
Console.WriteLine(ex); // Write error
}
}
Why don't you use something like this?
var folder = new DirectoryInfo(@"E:\vid\");
if (folder.Exists)
{
    var files = folder.GetFiles("*.png");
    files.ToList().ForEach(f => File.Move(f.FullName, Path.Combine(@"E:\ConvertedFiles\", f.Name)));
}
The exception means that the file E:\vid\(1).PNG doesn't exist. Do you mean E:\vid\1.PNG?
Use the System.IO.Path class for constructing paths; it's better than concatenating strings, and you don't have to worry about escaping backslashes.
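As a small illustration of that (the folder names come from the question; the five-digit padding matches the 00001.png rename the poster mentioned, and n is the same loop bound as in the question):

// Sketch: build both paths with Path.Combine and zero-pad the target name.
string sourceDir = @"E:\vid";
string targetDir = @"E:\ConvertedFiles";

for (int i = 1; i < n; i++)
{
    string from = Path.Combine(sourceDir, "(" + i + ").PNG");
    string to = Path.Combine(targetDir, i.ToString("D5") + ".png"); // 00001.png, 00002.png, ...
    if (File.Exists(from))
        File.Move(from, to);
}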
I just ran this in Visual Studio. It worked.
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace ConsoleApplication2
{
class Program
{
static void Main()
{
int n = 3;
for (int i = 1; i < n; i++)
{
string from = "C:\\vid\\(" + i + ").PNG";
string to = "C:\\ConvertedFiles\\" + i + ".png";
{
try
{
File.Move(from, to); // Try to move
Console.WriteLine("Moved"); // Success
}
catch (System.IO.FileNotFoundException e)
{
Console.WriteLine(e); // Write error
}
}
}
}
}
}
Maybe when you were moving files into the vid directory to begin the test, Windows shaved off the parentheses, and (1).png became 1.png. I got a file-not-found error from that phenomenon. Otherwise, your code is solid; my version is almost identical.
i.ToString()
might help you. You are passing
from = "E:\\vid\\(" + i + ").PNG";
to = "E:\\ConvertedFiles\\" + i + ".png";
i as an integer, and the concatenation doesn't work due to that.
Also, instead of using \\, add @ like this:
from = #"E:\vid\(" + i + ").PNG";
var folder = new DirectoryInfo(sourcefolder);
if (folder.Exists)
{
    var files = folder.GetFiles("*.png");
    files.ToList().ForEach(f => File.Move(f.FullName, Path.Combine(newFolderName, f.Name)));
}
I believe this will help.
