Timeout if a method takes too long to finish - C#

I am trying to timeout and throw an exception after waiting a specified amount of time, and am wondering if the way I'm currently doing it is best.
class Timeout
{
    XmlDocument doc;
    System.Object input;

    public Timeout(XmlDocument doc, System.Object input)
    {
        this.doc = doc;
        this.input = input;
    }

    public void run()
    {
        if (input is Stream)
        {
            doc.Load((Stream)input);
        }
        else if (input is XmlReader)
        {
            doc.Load((XmlReader)input);
        }
        else if (input is TextReader)
        {
            doc.Load((TextReader)input);
        }
        else
        {
            doc.Load((string)input);
        }
        System.Threading.Thread.CurrentThread.Abort();
    }
}

private void LoadXmlDoc(XmlDocument doc, System.Object input)
{
    Timeout timeout = new Timeout(doc, input);
    System.Threading.Thread timeoutThread = new System.Threading.Thread(new ThreadStart(timeout.run));
    timeoutThread.Start();
    System.Threading.Thread.Sleep(this.timeout * 1000);
    if (timeoutThread.IsAlive)
    {
        throw new DataSourceException("timeout reached", timeout.GetHashCode());
    }
}
This current approach does work, so I'm just wondering if there's a simpler/better way to go about accomplishing the same thing.

In addition to my comment (here's the link from it), here is some more information regarding threading. Basically it comes down to learning the different designs/libraries, what their pros and cons are, and which one suits your needs best.
From my understanding, and hopefully someone with more knowledge on the subject will pitch in on this, there are basically two categories that you can put threading designs in: synchronous and asynchronous. You have used the asynchronous design, employing the thread pool. If you want to stick with this design, you can try using Task or, for synchronous operations, Parallel.
On a side note: I'm not sure about the wisdom of using an exception to handle simple logic. In other words, the exception could simply be replaced by returning a boolean.
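As a rough illustration of both points (wrapping the work in a Task and reporting the timeout through the return value instead of an exception), a minimal sketch might look like the following; TryLoadXmlDoc and its timeout parameter are illustrative names, not part of the original code:
private bool TryLoadXmlDoc(XmlDocument doc, object input, TimeSpan timeout)
{
    // Run the blocking Load call on the thread pool.
    Task loadTask = Task.Run(() =>
    {
        if (input is Stream stream) doc.Load(stream);
        else if (input is XmlReader xmlReader) doc.Load(xmlReader);
        else if (input is TextReader textReader) doc.Load(textReader);
        else doc.Load((string)input);
    });

    // Wait returns false if the task did not finish within the timeout.
    // Note that the Load call itself keeps running on the thread pool; it is not aborted.
    return loadTask.Wait(timeout);
}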

Try doing something like this:
try
{
    var cts = new System.Threading.CancellationTokenSource();
    cts.CancelAfter(TimeSpan.FromSeconds(0.01));
    // Note: this snippet has to live inside an async method because of the await below.
    var tw = Task.Run<System.Xml.XmlDocument>(() =>
    {
        var doc = new System.Xml.XmlDocument();
        doc.Load("https://maps.googleapis.com/maps/api/geocode/xml?address=1+Exchange+Plaza+,+Floor+26+,+NY&sensor=false");
        return doc;
    }, cts.Token);
    var xml = await tw;
}
catch (TaskCanceledException e)
{
}
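One caveat with this approach: the token passed to Task.Run only prevents the task from starting once cancellation has been requested; it cannot interrupt a Load call that is already in progress. A sketch of a variation (the five-second timeout is arbitrary) that at least stops the caller from waiting is to race the load against Task.Delay:
var loadTask = Task.Run(() =>
{
    var doc = new System.Xml.XmlDocument();
    doc.Load("https://maps.googleapis.com/maps/api/geocode/xml?address=1+Exchange+Plaza+,+Floor+26+,+NY&sensor=false");
    return doc;
});

// Whichever task finishes first "wins"; if it is the delay, treat it as a timeout.
var finished = await Task.WhenAny(loadTask, Task.Delay(TimeSpan.FromSeconds(5)));
if (finished != loadTask)
    throw new TimeoutException("timeout reached");
var xml = await loadTask; // the load completed within the timeout
Note that the abandoned Load still runs to completion in the background; only the waiting is cut short.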

What I ended up doing:
class Timeout
{
    XmlDocument doc;
    System.Object input;

    public Timeout(XmlDocument doc, System.Object input)
    {
        this.doc = doc;
        this.input = input;
    }

    public void run()
    {
        if (input is Stream)
        {
            doc.Load((Stream)input);
        }
        else if (input is XmlReader)
        {
            doc.Load((XmlReader)input);
        }
        else if (input is TextReader)
        {
            doc.Load((TextReader)input);
        }
        else
        {
            doc.Load((string)input);
        }
    }
}

private void LoadXmlDoc(XmlDocument doc, System.Object input)
{
    Timeout timeout = new Timeout(doc, input);
    System.Threading.Thread timeoutThread = new System.Threading.Thread(new ThreadStart(timeout.run));
    timeoutThread.Name = "XmlWorker" + threadNumber++;
    timeoutThread.Start();
    if (!timeoutThread.Join(this.timeout)) // Join returning false implies the timeout was reached
    {
        if (timeoutThread.IsAlive)
            timeoutThread.Abort();
        throw new DataConnectionException("timeout reached: " + this.timeout.TotalMilliseconds + "ms",
            new TimeoutException("timeout reached: " + this.timeout.TotalMilliseconds + "ms"));
    }
}

Related

How to get parallel access to the method for multiple users

My Telegram bot needs to let each user answer questions in order and save those answers in the same order for that specific user, with multiple users handled in parallel.
static readonly ConcurrentDictionary<int, string[]> Answers = new ConcurrentDictionary<int, string[]>();
static async Task Main(string[] args)
{
try
{
Task t1 = CreateHostBuilder(args).Build().RunAsync();
Task t2 = BotOnMessage();
await Task.WhenAll(t1, t2);
}
catch (Exception ex)
{
Console.WriteLine("Error" + ex);
}
}
Here is my BotOnMessage() method, which receives and processes messages from users:
async static Task BotOnMessage()
{
int offset = 0;
int timeout = 0;
try
{
await bot.SetWebhookAsync("");
while (true)
{
var updates = await bot.GetUpdatesAsync(offset, timeout);
foreach (var update in updates)
{
var message = update.Message;
if (message.Text == "/start")
{
Registration(message.Chat.Id.ToString(), message.Chat.FirstName.ToString(), createdDateNoTime.ToString("yyyy-MM-dd HH:mm:ss.fff", CultureInfo.InvariantCulture));
var replyKeyboard = new ReplyKeyboardMarkup
{
Keyboard = new[]
{
new[]
{
new KeyboardButton("eng"),
new KeyboardButton("ger")
},
}
};
replyKeyboard.OneTimeKeyboard = true;
await bot.SendTextMessageAsync(message.Chat.Id, "choose language", replyMarkup: replyKeyboard);
}
switch (message.Text)
{
case "eng":
var replyKeyboardEN = new ReplyKeyboardMarkup
{
Keyboard = new[]
{
new[]
{
new KeyboardButton("choice1"),
new KeyboardButton("choice2")
},
}
};
replyKeyboardEN.OneTimeKeyboard = true;
await bot.SendTextMessageAsync(message.Chat.Id, "Enter choice", replyMarkup: replyKeyboardEN);
await AnonymEN();
break;
case "ger":
var replyKeyboardGR = new ReplyKeyboardMarkup
{
Keyboard = new[]
{
new[]
{
new KeyboardButton("choice1.1"),
new KeyboardButton("choice2.2")
},
}
};
replyKeyboardGR.OneTimeKeyboard = true;
await bot.SendTextMessageAsync(message.Chat.Id, "Enter choice", replyMarkup: replyKeyboardGR);
await AnonymGR();
break;
}
offset = update.Id + 1;
}
}
}
catch (Exception ex)
{
Console.WriteLine("Error" + ex);
}
}
And here is the AnonymEN() method for the "eng" case in the switch. The problem appears when I call this method from the switch case in BotOnMessage(). Up to switch (message.Text), multiple users can send messages asynchronously and get responses. But once the first user enters AnonymEN(), a second user gets no response from this method until the first user has finished it completely. I also call BotOnMessage() at the end of AnonymEN() to get back to the initial point so the bot can be started again. For the ordered structure of questions and answers I used the ConcurrentDictionary approach from Save user messages sent to bot and send finished form to other user. Any suggestions on how to edit the code so the bot can serve multiple users at the same time?
async static Task AnonymEN()
{
int offset = 0;
int timeout = 0;
try
{
await bot.SetWebhookAsync("");
while (true)
{
var updates = await bot.GetUpdatesAsync(offset, timeout);
foreach (var update in updates)
{
var message = update.Message;
int userId = (int)message.From.Id;
if (message.Type == MessageType.Text)
{
if (Answers.TryGetValue(userId, out string[] answers))
{
var title = message.Text;
if (answers[0] == null)
{
answers[0] = message.Text;
await bot.SendTextMessageAsync(message.Chat, "Enter age");
}
else
{
SaveMessage(message.Chat.Id.ToString(), "anonym", "anonym", "anonym", answers[0].ToString(), title.ToString(), createdDateNoTime.ToString("yyyy-MM-dd HH:mm:ss.fff", CultureInfo.InvariantCulture));
Answers.TryRemove(userId, out string[] _);
await bot.SendTextMessageAsync(message.Chat.Id, "ty for request click /start");
await BotOnMessage();
}
}
else if (message.Text == "choice1")
{
Answers.TryAdd(userId, new string[1]);
await bot.SendTextMessageAsync(message.Chat.Id, "Enter name");
}
}
offset = update.Id + 1;
}
}
}
catch (Exception ex)
{
Console.WriteLine("Error" + ex);
}
}
I can see multiple issues with your code:
It is hard to read. While this is a personal preference, I strongly advise writing short, concise methods that have one responsibility. This will make your code easier to understand and maintain. https://en.wikipedia.org/wiki/Single-responsibility_principle
Everything is static. This makes it very hard to keep track of any state, such as the language, that should be tracked per user.
Using infinite loops and recursion with no escape. I highly doubt this was intended, but you can get an infinite chain of calls like BotOnMessage -> AnonymEN -> BotOnMessage -> AnonymEN. I think you want to exit the AnonymEN function using a return, break or while(someVar) approach instead of calling the BotOnMessage function.
If two users are sending messages you get mixed responses. Example message flow: user1: /start, user1: eng, user2: hello. The bot will now give an English response to user2. I'm sure this is not intended.
The code below is a minimal example that addresses the issues I mentioned. It is not perfect code but should help you get started.
private Dictionary<string, UserSession> userSessions = new();

async Task BotOnMessage()
{
    try
    {
        while (true)
        {
            var message = await GetMessage(timeout);
            var userSession = GetUserSession(message.user);
            await userSession.ProcessMessage(message);
        }
    }
    catch { }
}

UserSession GetUserSession(string user)
{
    if (!userSessions.ContainsKey(user))
    {
        userSessions[user] = new UserSession();
    }
    return userSessions[user];
}

public class UserSession
{
    public async Task ProcessMessage(Message message)
    {
        // Existing message processing code goes here.
        // Do not use a loop or recursion.
        // Instead track the state (e.g. language) using fields.
    }
}
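To make the last comment more concrete, here is a rough sketch (assuming the Telegram.Bot Message type; the enum, field and step names are made up for illustration) of how a UserSession could track the per-user state instead of looping or recursing:
public enum ConversationStep { AwaitingLanguage, AwaitingChoice, AwaitingName, AwaitingAge, Done }

public class UserSession
{
    private string _language;                         // "eng" or "ger", remembered per user
    private ConversationStep _step = ConversationStep.AwaitingLanguage;
    private readonly List<string> _answers = new List<string>();

    public async Task ProcessMessage(Message message)
    {
        switch (_step)
        {
            case ConversationStep.AwaitingLanguage:
                _language = message.Text;
                _step = ConversationStep.AwaitingChoice;
                // send the language-specific keyboard here
                break;
            case ConversationStep.AwaitingName:
                _answers.Add(message.Text);
                _step = ConversationStep.AwaitingAge;
                // ask the next question here
                break;
            // ...the remaining steps advance the same way until Done
        }
        await Task.CompletedTask; // placeholder so the method compiles without a real await
    }
}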

StreamWriter keeps writing to the same file after instantiating it again for another file

I'm trying to implement a simple file-logger object, with the possibility to truncate the file when its size reaches a threshold.
I am using a StreamWriter, which gets written to at each call of the method Log(). In order to decide when to truncate, I am checking the StreamWriter.BaseStream.Length property before each write and, if it is bigger than the threshold, I close the StreamWriter, create a new file and open the StreamWriter on that file.
For example, if I set the threshold to 10 MB, it will create a new file for every 10 MB of written data.
Under normal load (let's say 3-4 seconds between calls to Log()), everything works as it should. However, the product that is going to use this logger works with lots of data and requires logging every second, or even more often.
The problem is that the logger seems to completely ignore the creation of the new file (and the opening of the new stream): it fails to truncate and keeps writing to the existing stream.
I also tried to manually compute the stream's length, hoping it would be a problem with the stream, but it does not work.
I have found out that going step by step with the debugger makes it work correctly, but it does not solve my problem. Logging each second seems to make the program skip the UpdateFile() method entirely.
public class Logger
{
private static Logger _logger;
private const string LogsDirectory = "Logs";
private StreamWriter _streamWriter;
private string _path;
private readonly bool _truncate;
private readonly int _maxSizeMb;
private long _currentSize;
//===========================================================//
public static void Set(string filename, bool truncate = false, int maxSizeMb = 10)
{
if (_logger == null)
{
if (filename.Contains('_'))
{
throw new Exception("Filename cannot contain the _ character!");
}
if (filename.Contains('.'))
{
throw new Exception("The filename must not include the extension");
}
_logger = new Logger(filename, truncate, maxSizeMb);
}
}
//===========================================================//
public static void Log(string message, LogType logType = LogType.Info)
{
_logger?.InternalLog(message, logType);
}
//===========================================================//
public static void LogException(Exception ex)
{
_logger?.InternalLogException(ex);
}
//===========================================================//
private Logger(string filename, bool truncate = false, int maxSizeMb = 10)
{
_path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, LogsDirectory, $"{filename}_{DateTimeToPrefix(DateTime.Now)}.log");
if (CheckForExistingLogs())
{
_path = GetLatestLogFilename();
}
_truncate = truncate;
_maxSizeMb = maxSizeMb;
_streamWriter = new StreamWriter(File.Open(_path, FileMode.Append, FileAccess.Write, FileShare.ReadWrite));
_currentSize = _streamWriter.BaseStream.Length;
}
//===========================================================//
private bool CheckForExistingLogs()
{
var directory = Path.GetDirectoryName(_path);
var filename = Path.GetFileNameWithoutExtension(_path);
if (filename.Contains('_'))
{
filename = filename.Split('_').First();
}
return new DirectoryInfo(directory).GetFiles().Any(x => x.Name.ToLower().Contains(filename.ToLower()));
}
//===========================================================//
private string GetLatestLogFilename()
{
var directory = Path.GetDirectoryName(_path);
var filename = Path.GetFileNameWithoutExtension(_path);
if (filename.Contains('_'))
{
filename = filename.Split('_').First();
}
var files = new DirectoryInfo(directory).GetFiles().Where(x => x.Name.ToLower().Contains(filename.ToLower()));
files = files.OrderBy(x => PrefixToDateTime(x.Name.Split('_').Last()));
return files.Last().FullName;
}
//===========================================================//
private void UpdateFile()
{
_streamWriter.Flush();
_streamWriter.Close();
_streamWriter.Dispose();
_streamWriter = StreamWriter.Null;
_path = GenerateNewFilename();
_streamWriter = new StreamWriter(File.Open(_path, FileMode.Append, FileAccess.Write, FileShare.ReadWrite));
_currentSize = _streamWriter.BaseStream.Length;
}
//===========================================================//
private string GenerateNewFilename()
{
var directory = Path.GetDirectoryName(_path);
var oldFilename = Path.GetFileNameWithoutExtension(_path);
if (oldFilename.Contains('_'))
{
oldFilename = oldFilename.Split('_').First();
}
var newFilename = $"{oldFilename}_{DateTimeToPrefix(DateTime.Now)}.log";
return Path.Combine(directory, newFilename);
}
//===========================================================//
private static string DateTimeToPrefix(DateTime dateTime)
{
return dateTime.ToString("yyyyMMddHHmm");
}
//===========================================================//
private static DateTime PrefixToDateTime(string prefix)
{
var year = Convert.ToInt32(string.Join("", prefix.Take(4)));
var month = Convert.ToInt32(string.Join("", prefix.Skip(4).Take(2)));
var day = Convert.ToInt32(string.Join("", prefix.Skip(6).Take(2)));
var hour = Convert.ToInt32(string.Join("", prefix.Skip(8).Take(2)));
var minute = Convert.ToInt32(string.Join("", prefix.Skip(10).Take(2)));
return new DateTime(year, month, day, hour, minute, 0);
}
//===========================================================//
private int ConvertSizeToMb()
{
return Convert.ToInt32(Math.Truncate(_currentSize / 1024f / 1024f));
}
//===========================================================//
public void InternalLog(string message, LogType logType = LogType.Info)
{
if (_truncate && ConvertSizeToMb() >= _maxSizeMb)
{
UpdateFile();
}
var sendMessage = string.Empty;
switch (logType)
{
case LogType.Error:
{
sendMessage += "( E ) ";
break;
}
case LogType.Warning:
{
sendMessage += "( W ) ";
break;
}
case LogType.Info:
{
sendMessage += "( I ) ";
break;
}
}
sendMessage += $"{DateTime.Now:dd.MM.yyyy HH:mm:ss}: {message}";
_streamWriter.WriteLine(sendMessage);
_streamWriter.Flush();
_currentSize += Encoding.ASCII.GetByteCount(sendMessage);
Console.WriteLine(_currentSize);
}
//===========================================================//
public void InternalLogException(Exception ex)
{
if (_truncate && ConvertSizeToMb() >= _maxSizeMb)
{
UpdateFile();
}
var sendMessage = $"( E ) {DateTime.Now:dd.MM.yyyy HH:mm:ss}: {ex.Message}{Environment.NewLine}{ex.StackTrace}";
_streamWriter.WriteLine(sendMessage);
_streamWriter.Flush();
_currentSize += Encoding.ASCII.GetByteCount(sendMessage);
}
}
Usage example:
private static void Main(string[] args)
{
Logger.Set("Log", true, 10);
while (true)
{
Logger.Log("anything");
}
}
Have you ever encountered such a problem before? How can it be solved? Thanks :)
I don't know how much data your application writes to the log each minute, but if the amount is more than 10 MB then the method DateTimeToPrefix will return the same name for a second call within the same minute interval. (At least that is what happens for me with the code included in the Main method.)
I changed the ToString() format to also include the seconds, and this writes the correct amount of data to the expected files.
private static string DateTimeToPrefix(DateTime dateTime)
{
return dateTime.ToString("yyyyMMddHHmmss");
}
Your code looks "fine", so I'm guessing the issue is being caused by multiple threads accessing the InternalLog method at once. Your code is not thread safe as it does not use any locking mechanisms. The easiest and probably totally sufficient solution for your project is to add a locking object at class level:
private readonly object _lock = new object();
then wrap your whole InternalLog method in a lock(_lock) statement:
public void InternalLog(string message, LogType logType = LogType.Info)
{
    lock (_lock)
    {
        // your existing code
    }
}
This is not a perfect solution and could cause a bottleneck especially as you are flushing the StreamWriter in every call to InternalLog. But for now it should be perfectly fine!
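To spell that out, the truncation check and the file swap should sit inside the same lock as the write, roughly like this (a sketch that keeps the body of the existing method unchanged); the same lock should also cover InternalLogException, since it writes to the same stream:
public void InternalLog(string message, LogType logType = LogType.Info)
{
    lock (_lock)
    {
        // Checking the size and swapping the file inside the lock prevents two threads
        // from both deciding to truncate and then racing on _streamWriter.
        if (_truncate && ConvertSizeToMb() >= _maxSizeMb)
        {
            UpdateFile();
        }
        // ...the existing formatting, WriteLine, Flush and _currentSize update go here
    }
}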

C# hanging code when XML from URL is interrupted

I asked this question yesterday.
Essentially, I'm trying to parse an XML from a URL but my code hangs forever if the connection is lost when attempting to read the XML.
I am still having the same problem, however I changed the code in a way I thought would prevent the program from freezing if the connection to the URL was interrupted. Could someone please explain why my solution didn't work and how I can fix it? Thanks!
Here are the two functions I am using. CanReach just checks the connection to make sure the URL is there, and GetTopTags gets all the parent tags of the XML file. I want it to break if the connection is interrupted. I tried to do this by loading the XML file instead of parsing it right from the URL, and using try and catch to catch the error. xmlLocation is the URL.
public static bool CanReach(string xmlLocation)
{
WebRequest request = WebRequest.Create(xmlLocation);
request.Timeout = 1000;
try
{
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.Dispose();
request.Abort();
return true;
}
catch (System.Net.WebException)
{
request.Abort();
return false;
}
}
public static List<string> GetTopTags(string xmlLocation)
{
bool canBeReached = CanReach(xmlLocation);
if (canBeReached)
{
try
{
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.Load(xmlLocation);
XmlReader reader = new XmlNodeReader(xmlDoc);
List<string> dataList = new List<string>();
while (reader.Read())
{
switch (reader.NodeType)
{
case XmlNodeType.Text:
dataList.Add(reader.Name);
break;
}
}
reader.Dispose();
return dataList;
}
catch
{
return null;
}
}
else
{
return null;
}
}
We can start another thread that reads the XML file while the current method continuously checks whether the connection is still alive. If we can no longer reach the URL, we cancel the operation and return null. If the read operation finishes, the isFinished flag is set and we return returnValue.
Try this:
public static List<string> GetTopTags(string xmlLocation)
{
bool canBeReached = CanReach(xmlLocation);
if (!canBeReached)
return null;
List<string> returnValue = null;
CancellationTokenSource cts = new CancellationTokenSource();
bool isFinished = false;
Task.Factory.StartNew(() =>
{
try
{
var xmlDoc = new XmlDocument();
xmlDoc.Load(xmlLocation);
using var reader = new XmlNodeReader(xmlDoc);
List<string> dataList = new List<string>();
while (reader.Read())
{
switch (reader.NodeType)
{
case XmlNodeType.Text:
dataList.Add(reader.Name);
break;
}
}
if (reader.ReadState == ReadState.Error)
returnValue = null;
else
returnValue = dataList;
}
catch
{
returnValue = null;
}
isFinished = true;
}, cts.Token, TaskCreationOptions.LongRunning, TaskScheduler.Default);
while (!isFinished)
if (!CanReach(xmlLocation))
cts.Cancel();
return returnValue;
}
I'm not wild about XmlDocument.Load() being handed a URL directly. You're giving up too much control and it's making it hard for you to debug. I would separate out the networking and the XML reading. With the networking separated out, you eliminate the need for your CanReach() function. Anything network-related needs to be run on a separate thread. Based on your source, something like the following is what I might start with.
public static async Task<List<string>> GetTopTags(string xmlLocation)
{
string xmlText = null;
try
{
// You are free to use WebRequest here, I've used WebClient for simplicity.
using (var webClient = new WebClient())
{
xmlText = await webClient.DownloadStringTaskAsync(xmlLocation);
}
}
catch (Exception)
{
// Handle network related issues.
}
if (string.IsNullOrWhiteSpace(xmlText))
{
// We weren't able to download the XML, or the downloaded XML is not valid,
// "CanReach()" is false.
return null;
}
// We downloaded the XML successfully if you get here, now just read it.
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.Load(new StringReader(xmlText));
using (XmlReader reader = new XmlNodeReader(xmlDoc))
{
List<string> dataList = new List<string>();
while (reader.Read())
{
switch (reader.NodeType)
{
case XmlNodeType.Text:
dataList.Add(reader.Name);
break;
}
}
return dataList;
}
}
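As a side note, WebClient does not expose a request timeout for DownloadStringTaskAsync (you would have to subclass it), so if you want a hard cap on how long the download may take, HttpClient might be the simpler choice. A sketch, with an arbitrary 10-second value and a hypothetical helper name:
// Reuse a single HttpClient instance; its Timeout caps the whole request.
private static readonly HttpClient _httpClient = new HttpClient
{
    Timeout = TimeSpan.FromSeconds(10)
};

public static async Task<string> DownloadXmlTextAsync(string xmlLocation)
{
    try
    {
        // Timeouts surface as TaskCanceledException, other failures as HttpRequestException.
        return await _httpClient.GetStringAsync(xmlLocation);
    }
    catch (Exception)
    {
        return null; // the caller treats null the same way as a failed CanReach()
    }
}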

Database async query and processing

Let me rephrase.
I have a method that generates strings (paths) when given a start string (path).
If a generated path is a directory, I want to enqueue it back into the method's input.
After processing the path synchronously, I want to take the data and clone it asynchronously into multiple paths of a pipeline, where each path needs to receive the data block. So BroadcastBlock is out of the question (it can't send a blocking signal to the blocks before it).
Joining the results with a JoinBlock is relatively straightforward.
So, to sum up:
Is there a block in TPL Dataflow where I can access the input queue from the delegate, and if so, how?
Is there a construct that acts like BroadcastBlock but can block the blocks that come before it?
I tried doing it via almighty google:
class subversion
{
private static string repo;
private static string user;
private static string pw;
private static DateTime start;
private static DateTime end;
private static List<parserObject> output;
public static List<parserObject> svnOutputList
{
get {return output; }
}
private static List<string> extension_whitelist;
public async void run(string link, string i_user, string i_pw, DateTime i_start, DateTime i_end)
{
repo = link;
user = i_user;
pw = i_pw;
start = i_start;
end = i_end;
output = new List<parserObject>();
BufferBlock<string> crawler_que = new BufferBlock<string>();
BufferBlock<svnFile> parser_que = new BufferBlock<svnFile>();
var svn = crawl(crawler_que, parser_que);
var broadcaster = new ActionBlock<svnFile>(async file =>
{//tried to addapt the code from this ensure always send broadcastblock -> see link below
List<Task> todo = new List<Task>();
todo.Add(mLoc);//error cannot convert methodgroup to task
foreach (var task in todo)//error: Only assignment, call, increment, decrement, await, and new object expressions can be used as a statement?
{
task.SendAsync(file);//error cannot convert task to targetblock
}
await Task.WhenAll(todo.ToArray());
});
parser_que.LinkTo(broadcaster);
await Task.WhenAll(broadcaster, svn);//error cannot convert actionblock to task
}
private static async Task crawl(BufferBlock<string> in_queue, BufferBlock<svnFile> out_queue)
{
SvnClient client = new SvnClient();
client.Authentication.ForceCredentials(user, pw);
SvnListArgs arg = new SvnListArgs
{
Depth = SvnDepth.Children,
RetrieveEntries = SvnDirEntryItems.AllFieldsV15
};
while (await in_queue.OutputAvailableAsync())
{
string buffer_author = null;
string prev_author = null;
System.Collections.ObjectModel.Collection<SvnListEventArgs> contents;
string link = await in_queue.ReceiveAsync();
if (client.GetList(new Uri(link), arg, out contents))
{
foreach (SvnListEventArgs item in contents)
{
if (item.Entry.NodeKind == SvnNodeKind.Directory)
{
in_queue.Post(item.Path);
}
else if (item.Entry.NodeKind == SvnNodeKind.File)
{
try
{
int length = item.Name.LastIndexOf(".");
if (length <= 0)
{
continue;
}
string ext = item.Name.Substring(length);
if (extension_whitelist.Contains(ext))
{
Uri target = new Uri((repo + link));
SvnRevisionRange range;
SvnBlameArgs args = new SvnBlameArgs
{
Start = start.AddDays(-1),
End = end
};
try
{
svnFile file_instance = new svnFile();
client.Blame(target, args, delegate(object sender3, SvnBlameEventArgs e)
{
if (e.Author != null)
{
buffer_author = e.Author;
prev_author = e.Author;
}
else
{
buffer_author = prev_author;
}
file_instance.lines.Add(new svnLine(buffer_author, e.Line));
});
out_queue.Post(file_instance);
}
catch (Exception a) { Console.WriteLine("exception:" + a.Message);}
}
}
catch (Exception a)
{
}
}
}
}
}
}
private static async Task mLoc(svnFile file)
{
List<parserPart> parts = new List<parserPart>();
int find;
foreach (svnLine line in file.lines)
{
if ((find = parts.FindIndex(x => x.uploader_id == line.author)) > 0)
{
parts[find].count += 1;
}
else
{
parts.Add(new parserPart(line.author));
}
find = 0;
}
parserObject ret = new parserObject(parts, "mLoc");
await output.Add(ret);
return;
}
}
broadcastblock answer: Alternate to Dataflow BroadcastBlock with guaranteed delivery
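For reference, a rough sketch of the pattern that linked answer describes (the target blocks here are just placeholders, not the original code): instead of a BroadcastBlock, use an ActionBlock that awaits SendAsync on every target, so the upstream block is throttled until each target has accepted the item.
// using System.Threading.Tasks.Dataflow;
var mLocBlock = new ActionBlock<svnFile>(file => { /* compute the mLoc metric */ });
var otherBlock = new ActionBlock<svnFile>(file => { /* compute another metric */ });
var targets = new ITargetBlock<svnFile>[] { mLocBlock, otherBlock };

var broadcaster = new ActionBlock<svnFile>(async file =>
{
    // SendAsync (unlike Post) waits until the target has room, which provides the
    // back-pressure that BroadcastBlock cannot.
    foreach (var target in targets)
        await target.SendAsync(file);
});

parser_que.LinkTo(broadcaster, new DataflowLinkOptions { PropagateCompletion = true });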

C# WinForms App Maxing Processor But Doing Nothing Strenuous!

I have a netbook with 1.20Ghz Processor & 1GB Ram.
I'm running a C# WinForms app on it which, at 5 minute intervals, reads every line of a text file and depending on what the content of that line is, either skips it or writes it to an xml file. Sometimes it may be processing about 2000 lines.
When it begins this task, the processor gets maxed out, 100% use. However on my desktop with 2.40Ghz Processor and 3GB Ram it's untouched (for obvious reasons)... is there any way I can actually reduce this processor issue dramatically? The code isn't complex, I'm not bad at coding either and I'm not constantly opening the file, reading and writing... it's all done in one fell swoop.
Any help greatly appreciated!?
Sample Code
***Timer.....
#region Timers Setup
aTimer.Tick += new EventHandler(OnTimedEvent);
aTimer.Interval = 60000;
aTimer.Enabled = true;
aTimer.Start();
radioButton60Mins.Checked = true;
#endregion Timers Setup
private void OnTimedEvent(object source, EventArgs e)
{
string msgLoggerMessage = "Checking For New Messages " + DateTime.Now;
listBoxActivityLog.Items.Add(msgLoggerMessage);
MessageLogger messageLogger = new MessageLogger();
messageLogger.LogMessage(msgLoggerMessage);
if (radioButton1Min.Checked)
{
aTimer.Interval = 60000;
}
if (radioButton60Mins.Checked)
{
aTimer.Interval = 3600000;
}
if (radioButton5Mins.Checked)
{
aTimer.Interval = 300000;
}
// split the file into a list of sms messages
List<SmsMessage> messages = smsPar.ParseFile(smsPar.CopyFile());
// sanitize the list to get rid of stuff we don't want
smsPar.SanitizeSmsMessageList(messages);
ApplyAppropriateColoursToRecSMSListinDGV();
}
public List<SmsMessage> ParseFile(string filePath)
{
List<SmsMessage> list = new List<SmsMessage>();
using (StreamReader file = new StreamReader(filePath))
{
string line;
while ((line = file.ReadLine()) != null)
{
var sms = ParseLine(line);
list.Add(sms);
}
}
return list;
}
public SmsMessage ParseLine(string line)
{
string[] words = line.Split(',');
for (int i = 0; i < words.Length; i++)
{
words[i] = words[i].Trim('"');
}
SmsMessage msg = new SmsMessage();
msg.Number = int.Parse(words[0]);
msg.MobNumber = words[1];
msg.Message = words[4];
msg.FollowedUp = "Unassigned";
msg.Outcome = string.Empty;
try
{
//DateTime Conversion!!!
string[] splitWords = words[2].Split('/');
string year = splitWords[0].Replace("09", "20" + splitWords[0]);
string dateString = splitWords[2] + "/" + splitWords[1] + "/" + year;
string timeString = words[3];
string wholeDT = dateString + " " + timeString;
DateTime dateTime = DateTime.Parse(wholeDT);
msg.Date = dateTime;
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
Application.Exit();
}
return msg;
}
public void SanitizeSmsMessageList(List<SmsMessage> list)
{
// strip out unwanted messages
// list.Remove(some_message); etc...
List<SmsMessage> remove = new List<SmsMessage>();
foreach (SmsMessage message in list)
{
if (message.Number > 1)
{
remove.Add(message);
}
}
foreach (SmsMessage msg in remove)
{
list.Remove(msg);
}
//Fire Received messages to xml doc
ParseSmsToXMLDB(list);
}
public void ParseSmsToXMLDB(List<SmsMessage> list)
{
try
{
if (File.Exists(WriteDirectory + SaveName))
{
xmlE.AddXMLElement(list, WriteDirectory + SaveName);
}
else
{
xmlE.CreateNewXML(WriteDirectory + SaveName);
xmlE.AddXMLElement(list, WriteDirectory + SaveName);
}
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
Application.Exit();
}
}
public void CreateNewXML(string writeDir)
{
try
{
XElement Database = new XElement("Database");
Database.Save(writeDir);
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
}
}
public void AddXMLElement(List<SmsMessage> messages, string writeDir)
{
try
{
XElement Database = XElement.Load(writeDir);
foreach (SmsMessage msg in messages)
{
if (!DoesExist(msg.MobNumber, writeDir))
{
Database.Add(new XElement("SMS",
new XElement("Number", msg.MobNumber),
new XElement("DateTime", msg.Date),
new XElement("Message", msg.Message),
new XElement("FollowedUpBy", msg.FollowedUp),
new XElement("Outcome", msg.Outcome),
new XElement("Quantity", msg.Quantity),
new XElement("Points", msg.Points)));
EventNotify.SendNotification("A New Message Has Arrived!", msg.MobNumber);
}
}
Database.Save(writeDir);
EventNotify.UpdateDataGridView();
EventNotify.UpdateStatisticsDB();
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
}
}
public bool DoesExist(string number, string writeDir)
{
XElement main = XElement.Load(writeDir);
return main.Descendants("Number")
.Any(element => element.Value == number);
}
Use a profiler and/or Performance Monitor and/or \\live.sysinternals.com\tools\procmon.exe and/or ResourceMonitor to determine what's going on
If the 5 minute process is a background task, you can make use of Thread Priority.
MSDN here.
If you do the processing on a separate thread, change your timer to be a System.Threading.Timer and use callback events, you should be able to set a lower priority on that thread than the rest of your application.
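A rough sketch of that suggestion, assuming the actual work lives in a DoProcessing method (a hypothetical name standing in for the ParseFile/SanitizeSmsMessageList/ParseSmsToXMLDB sequence): the System.Threading.Timer fires off the UI thread, and the work runs on a dedicated background thread with lowered priority.
private System.Threading.Timer _workTimer;

private void StartBackgroundProcessing()
{
    _workTimer = new System.Threading.Timer(_ =>
    {
        var worker = new Thread(DoProcessing)
        {
            IsBackground = true,
            Priority = ThreadPriority.BelowNormal // let foreground work win the CPU
        };
        worker.Start();
    }, null, TimeSpan.Zero, TimeSpan.FromMinutes(5));
}

private void DoProcessing()
{
    // ParseFile / SanitizeSmsMessageList / ParseSmsToXMLDB calls go here;
    // any UI updates (e.g. listBoxActivityLog) must be marshalled back with Control.Invoke.
}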
Inside your ParseFile loop, you could try adding a Thread.Sleep and/or an Application.DoEvents() call to see if that helps. It's better to do this if the parsing is on a separate thread, but at least you can try this simple test to see if it helps.
Might be that the MessageBoxes in your catches are running into cross-thread problems. Try swapping them out for writing to the trace output.
In any case, you've posted an entire (little) program, which will not help you get specific advice. Try deleting method bodies -- one at a time, scientifically -- and try to get the problem to occur/stop occurring. This will help you to locate the problem and eliminate the irrelevant parts of your question (both for yourself and for SO).
Your current processing model is batch based - do the parsing, then process the messages, and so on.
You'll likely reduce the memory overhead if you switch to a LINQ-style "pull" approach.
For example, you could convert your ParseFile() method in this way:
public IEnumerable<SmsMessage> ParseFile(string filePath)
{
using (StreamReader file = new StreamReader(filePath))
{
string line;
while ((line = file.ReadLine()) != null)
{
var sms = ParseLine(line);
yield return sms;
}
}
}
The advantage is that each SmsMessage can be handled as it is generated, instead of parsing all of the messages at once and then handling all of them.
This lowers your memory overhead, which is one of the most likely causes for the performance difference between your netbook and your desktop.
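A hypothetical usage sketch of the iterator version: the timer handler can then filter and write each message as it is read, instead of materialising the whole list first.
// Streams messages one at a time; nothing is held beyond the current SmsMessage.
foreach (var sms in smsPar.ParseFile(smsPar.CopyFile()))
{
    if (sms.Number > 1)
        continue; // same filter SanitizeSmsMessageList applies today
    // append this single message to the XML file (or batch a few at a time)
}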
