Read Event Log From Newest to Oldest - C#

I have written a short program to establish the uptime for remote PCs using the event log messages that are posted at startup and shutdown. Currently the logic is:
foreach (eventlogentry)
{
    if (entryTime > OldestTime)
    {
        if (entry == Startup)
        {
            addOnTime(entry.Time);
        }
        if (entry == Shutdown)
        {
            addOffTime(entry.Time);
        }
    }
}
"OldestTime" define how far to scan backwards in time....
I would like to know if there is any way to easily amend my program to read the events from newest to oldest.
It's reading remote event logs, and this function takes a while to run because it starts at the oldest entry and reads forward.
I know this because I added an "else" block to the first "if" that breaks out of the foreach loop if the entry isn't within the timespan we are looking for, and the program stops at the first event it reads.

It has been a while since you asked this question, but I ran into the same problem and found a solution.
using System.Diagnostics;
using System.Linq;

EventLog events = new EventLog("Application", System.Environment.MachineName);
foreach (EventLogEntry entry in events.Entries.Cast<EventLogEntry>().Reverse())
{
    //Do your tasks
}
This answer is still not as fast as just enumerating forward, but it is a bit more elegant than using a loop to copy the items into a list.
#leinad13, for your application you need to change System.Environment.MachineName to a string with the name of the computer you want the events from, and change "Application" to the log you want to read - "System", I think, in your case.
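Combining Reverse() with the early exit from your question gives a minimal sketch like this (oldestTime is assumed to be your DateTime cutoff, and the machine name is illustrative):
using System;
using System.Diagnostics;
using System.Linq;

EventLog events = new EventLog("System", "RemotePcName"); // illustrative machine name

// Walk the entries newest-to-oldest and stop at the first one outside the window
foreach (EventLogEntry entry in events.Entries.Cast<EventLogEntry>().Reverse())
{
    if (entry.TimeGenerated < oldestTime)
        break;
    // classify startup/shutdown entries here
}
Note that Reverse() still buffers the whole collection, so the early break saves processing, not the enumeration itself.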

Your best solution may be to move the data into a list and then reverse the order of that. Something like the following:
EventLog eventLog = new EventLog();
eventLog.Log = myEventLog;
var eventList = new List<EventLogEntry>();
foreach (EventLogEntry entry in eventLog.Entries)
{
    eventList.Add(entry);
}
eventList.Reverse();
That should get the data in reverse order, i.e. latest first, and then you can process it as before, but this time exit the loop when you hit a date before the oldest time.
It's not an ideal solution, as you are still reading the whole log, but it may be worth trying out to see if you get an improvement in performance.

Related

detecting that a file is currently being written to

(I know it's a common problem but I couldn't find an exact answer)
I need to write a Windows service that monitors a directory and, upon the arrival of a file, opens it, parses the text, does something with it, and afterwards moves it to another directory. I used the IsFileLocked method mentioned in this post to find out whether a file is still being written to. My problem is that I don't know how long it takes the other party to finish writing the file. I could wait a few seconds before opening it, but this is not a perfect solution since I don't know the rate at which the file is written, and a few seconds may not suffice.
here's my code:
while (true)
{
    var d = new DirectoryInfo(path);
    var files = d.GetFiles("*.txt").OrderBy(f => f.Name);
    foreach (var file in files)
    {
        if (!IsFileLocked(file))
        {
            //process file
        }
        else
        {
            //???
        }
    }
}
I think you might use a FileSystemWatcher (more info about it here: http://msdn.microsoft.com/it-it/library/system.io.filesystemwatcher(v=vs.110).aspx ).
Specifically, you could hook into the Changed event and, after it is raised, check IsFileLocked to verify whether the file is still being written to.
This strategy saves you from actively waiting by polling.
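A minimal sketch of that idea (assuming IsFileLocked is the helper from the question; the retry delay is illustrative):
using System;
using System.IO;
using System.Threading;

var watcher = new FileSystemWatcher(path, "*.txt");
watcher.Created += (sender, e) =>
{
    var file = new FileInfo(e.FullPath);
    // The writer may still have the file open; back off until it is unlocked
    while (IsFileLocked(file))
    {
        Thread.Sleep(500); // illustrative delay between checks
    }
    // process the file, then move it to the other directory
};
watcher.EnableRaisingEvents = true;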

Better Technique: Reading Data in a Thread

I've got a routine called GetEmployeeList that loads when my Windows Application starts.
This routine pulls in basic employee information from our Active Directory server and retains this in a list called m_adEmpList.
We have a few Windows accounts set up as Public Profiles that most of our employees on our manufacturing floor use. This m_adEmpList gives our employees the ability to log in to select features using those Public Profiles.
Once all of the Active Directory data is loaded, I attempt to "auto logon" that employee based on the System.Environment.UserName if that person is logged in under their private profile. (employees love this, by the way)
If I do not thread GetEmployeeList, the Windows Form will appear unresponsive until the routine is complete.
The problem with GetEmployeeList is that we have had times when the Active Directory server was down, the network was down, or a particular computer was not able to connect over our network.
To get around these issues, I have included a ManualResetEvent m_mre with a THREADSEARCH_TIMELIMIT timeout so that the process does not go on forever. I cannot log someone in using their Private Profile with System.Environment.UserName until I have the list of employees.
I realize I am not showing ALL of the code, but hopefully it is not necessary.
public static ADUserList GetEmployeeList()
{
    if ((m_adEmpList == null) ||
        (((m_adEmpList.Count < 10) || !m_gotData) &&
         ((m_thread == null) || !m_thread.IsAlive))
       )
    {
        m_adEmpList = new ADUserList();
        m_thread = new Thread(new ThreadStart(fillThread));
        m_mre = new ManualResetEvent(false);
        m_thread.IsBackground = true;
        m_thread.Name = FILLTHREADNAME;
        try {
            m_thread.Start();
            m_gotData = m_mre.WaitOne(THREADSEARCH_TIMELIMIT * 1000);
        } catch (Exception err) {
            Global.LogError(_CODEFILE + "GetEmployeeList", err);
        } finally {
            if ((m_thread != null) && (m_thread.IsAlive)) {
                // m_thread.Abort();
                m_thread = null;
            }
        }
    }
    return m_adEmpList;
}
I would like to just put a basic lock using something like m_adEmpList, but I'm not sure if it is a good idea to lock something that I need to populate, and the actual data population is going to happen in another thread using the routine fillThread.
If the ManualResetEvent's WaitOne timer fails to collect the data I need in the time allotted, there is probably a network issue, and m_adEmpList does not have many records (if any). So I would need to try to pull this information again the next time.
If anyone understands what I'm trying to explain, I'd like to see a better way of doing this.
It just seems too forced, right now. I keep thinking there is a better way to do it.
I think you're going about the multithreading part the wrong way. Threads should cooperate rather than compete for resources, and that's exactly what's bothering you here. Another problem is that your timeout is too long (so that it annoys users) and at the same time too short (if the AD server is a bit slow, but still there and serving). Your goal should be to let the thread run in the background and, when it is finished, have it update the list. In the meantime, you present some fallbacks to the user along with a notification that the user list is still being populated.
A few more notes on your code above:
You have a variable m_thread that is only used locally. Further, your code contains a redundant check whether that variable is null.
If you create a user list with defaults/fallbacks first and then update it through a function (make sure you are checking the InvokeRequired flag of the displaying control!) you won't need a lock. This means that the thread does not access the list stored as member but a separate list it has exclusive access to (not a member variable). The update function then replaces (!) this list, so now it is for exclusive use by the UI.
Lastly, if the AD server is really not there, try to forward the error from the background thread to the UI in some way, so that the user knows what's broken.
If you want, you can add an event to signal the thread to stop, but in most cases that won't even be necessary.
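A rough sketch of that populate-in-the-background-and-swap approach (LoadFromActiveDirectory, ShowAdError, and UpdateEmployeeList are hypothetical names):
using System;
using System.Threading;

var thread = new Thread(() =>
{
    ADUserList freshList;
    try
    {
        // The thread fills a list it has exclusive access to, so no lock is needed
        freshList = LoadFromActiveDirectory(); // hypothetical loader
    }
    catch (Exception ex)
    {
        // Forward the failure to the UI so the user knows what's broken
        mainForm.BeginInvoke((Action)(() => ShowAdError(ex))); // hypothetical
        return;
    }
    // Replace (!) the list on the UI thread; from here on it is for exclusive use by the UI
    mainForm.BeginInvoke((Action)(() => UpdateEmployeeList(freshList)));
});
thread.IsBackground = true;
thread.Start();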

Force loop containing asynchronous task to maintain sequence

Something tells me this might be a stupid question and I have in fact approached my problem from the wrong direction, but here goes.
I have some code that loops through all the documents in a folder. The alphabetical order of the documents within each folder is important, and this order is also reflected in the order the documents are printed. Here is a simplified version:
var wordApp = new Microsoft.Office.Interop.Word.Application();
foreach (var file in Directory.EnumerateFiles(folder))
{
    fileCounter++;
    // Print file, referencing a previously instantiated Word application object
    wordApp.Documents.Open(...);
    wordApp.PrintOut(...);
    wordApp.ActiveDocument.Close(...);
}
It seems (and I could be wrong) that the PrintOut code is asynchronous, and the application sometimes gets into a situation where the documents get printed out of order. This is confirmed because if I step through, or place a long enough Sleep() call, the order of all the files is correct.
How should I prevent the next print task from starting before the previous one has finished?
I initially thought that I could use a lock(someObject){} until I remembered that they are only useful for preventing multiple threads accessing the same code block. This is all on the same thread.
There are some events I can wire into on the Microsoft.Office.Interop.Word.Application object: DocumentOpen, DocumentBeforeClose and DocumentBeforePrint
I have just thought that this might actually be a problem with the print queue not being able to accurately distinguish lots of documents that are added within the same second. This can't be the problem, can it?
As a side note, this loop is within the code called from the DoWork event of a BackgroundWorker object. I'm using this to prevent UI blocking and to feedback the progress of the process.
Your event-handling approach seems like a good one. Instead of using a loop, you could add a handler to the DocumentBeforeClose event, in which you would get the next file to print, send it to Word, and continue. Something like this:
List<...> m_files = Directory.EnumerateFiles(folder);
wordApp.DocumentBeforeClose += ProcessNextDocument;
...
void ProcessNextDocument(...)
{
File file = null;
lock(m_files)
{
if (m_files.Count > 0)
{
file = m_files[m_files.Count - 1];
m_files.RemoveAt(m_files.Count - 1);
}
else
{
// Done!
}
}
if (file != null)
{
PrintDocument(file);
}
}
void PrintDocument(File file)
{
wordApp.Document.Open(...);
wordApp.Document.PrintOut(...);
wordApp.ActiveDocument.Close(...);
}
The first parameter of Application.PrintOut specifies whether the printing should take place in the background or not. Setting it to false makes the call synchronous.
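For example (a minimal sketch, assuming C# 4 or later so the COM ref parameters can be omitted and passed by name):
// Blocks until the document has been handed to the print spooler
wordApp.ActiveDocument.PrintOut(Background: false);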

Monitoring a remote process

I have a method that stops a service (or services), but I also need to delete the logs. Usually this is not a problem, but the process can take a little time before closing. Although the service appears stopped, the process takes additional time to close properly. Since the process is still running, I cannot delete the logs, so I need a way to monitor the .exe to know when it's safe to delete them.
So far my best option is a do-while loop; unfortunately, the delete statement throws an exception on the first iteration and stops the program.
do
{
    // delete logs
}
while (System.Diagnostics.Process.GetProcessesByName(processName, machineName).Length > 0);
I'm sure there is a simple solution, but my lack of experience is the real problem.
This is probably not the best answer either, but you could invert the loop so that it waits while the process is still running and deletes the logs afterwards:
while (System.Diagnostics.Process.GetProcessesByName(processName, machineName).Length > 0)
{
    System.Threading.Thread.Sleep(1000); // wait for the process to exit
}
// delete log files.
This evaluates the condition of the loop before executing any delete, so, per your description, the logs are not touched until the process has exited.
A hackish way around this is to perform a loop and break out manually once the conditions are met:
bool CloseProcessOperation = true; // control variable in case you want to abort the loop
while (CloseProcessOperation)
{
    if (System.Diagnostics.Process.GetProcessesByName(processName, machineName).Length > 0)
    {
        System.Threading.Thread.Sleep(1000); // process still running; wait and re-check
        continue;
    }
    // break if no logs exist
    // break for some other condition
    // etc
    // delete logs
    CloseProcessOperation = false;
}
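Since Process.WaitForExit is generally not supported for processes on a remote machine, polling is a reasonable fallback. A small helper might look like this (a sketch; the names are illustrative):
using System;
using System.Diagnostics;
using System.Threading;

static void WaitForRemoteProcessExit(string processName, string machineName, TimeSpan pollInterval)
{
    // GetProcessesByName returns an empty array once no matching process is left
    while (Process.GetProcessesByName(processName, machineName).Length > 0)
    {
        Thread.Sleep(pollInterval);
    }
}

// Usage: wait for the service process to disappear, then delete the logs
// WaitForRemoteProcessExit(processName, machineName, TimeSpan.FromSeconds(1));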

Do you know a Bulked/Batched Flows Library for C#

I am working on a project with peak performance requirements, so we need to bulk (batch?) several operations (for example, persisting data to a database) for efficiency.
However, I want our code to maintain an easy to understand flow, like:
input = Read();
parsed = Parse(input);
if (parsed.Count > 10)
{
    status = Persist(parsed);
    ReportSuccess(status);
    return;
}
ReportFailure();
The feature I'm looking for here is to automatically have Persist() happen in bulk (and hence asynchronously), but behave to its user as if it were synchronous (the user should block until the bulk action completes). I want the implementor to be able to implement Persist(ICollection).
I looked into flow-based programming, with which I am not highly familiar. I saw one library for fbp in C# here, and played a bit with Microsoft's Workflow Foundation, but my impression is that both are overkill for what I need. What would you use to implement a bulked flow behavior?
Note that I would like to get code that is exactly like what I wrote (simple to understand & debug), so solutions that involve yield or configuration in order to connect flows to one another are inadequate for my purpose. Also, chaining is not what I'm looking for - I don't want to first build a chain and then run it; I want code that looks like a simple flow ("Do A, Do B, if C then do D").
Common problem - instead of calling Persist directly, I usually load commands (or something along those lines) into a Persistor class, then after the loop is finished I call Persistor.Persist to persist the batch.
Just a few pointers - if you're generating SQL, the commands you add to the Persistor can represent your queries somehow (with built-in objects, custom objects, or just query strings). If you're calling stored procedures, you can use the commands to append stuff to a piece of XML that will be passed down to the SP when you call the Persist method.
Hope it helps - pretty sure there's a pattern for this, but I don't know its name :)
I don't know if this is what you need, because it's SQL Server based, but have you tried taking a look at SSIS and/or DTS?
One simple thing you can do is create a MemoryBuffer that you push messages into; the push simply adds them to a list and returns. The MemoryBuffer has a System.Timers.Timer which fires periodically and performs the "actual" updates.
One such implementation can be found in a Syslog Server (C#) at http://www.fantail.net.nz/wordpress/?p=5, in which the syslog messages get logged to a SQL Server periodically, in batches.
This approach might not be good if the information being pushed to the database is important, because if something goes wrong, you will lose the messages in the MemoryBuffer.
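A minimal sketch of that buffer-plus-timer idea (the type and the persistBatch callback are illustrative, not taken from the linked project):
using System;
using System.Collections.Generic;

class MemoryBuffer<T>
{
    private readonly List<T> _items = new List<T>();
    private readonly object _sync = new object();
    private readonly System.Timers.Timer _timer;

    public MemoryBuffer(double intervalMs, Action<List<T>> persistBatch)
    {
        _timer = new System.Timers.Timer(intervalMs);
        _timer.Elapsed += (sender, e) =>
        {
            List<T> batch;
            lock (_sync)
            {
                if (_items.Count == 0) return;
                batch = new List<T>(_items);
                _items.Clear();
            }
            persistBatch(batch); // the "actual" update happens here, in one batch
        };
        _timer.Start();
    }

    public void Push(T item)
    {
        // Cheap from the caller's point of view: add to the list and return
        lock (_sync) { _items.Add(item); }
    }
}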
How about using the BackgroundWorker class to persist each item asynchronously on a separate thread? For example:
using System;
using System.Collections;
using System.Collections.Generic;
using System.ComponentModel;
using System.Threading;

class PersistenceManager
{
    public void Persist(ICollection persistable)
    {
        // initialize a list of background workers
        var backgroundWorkers = new List<BackgroundWorker>();

        // launch each persistable item in a background worker on a separate thread
        foreach (var persistableItem in persistable)
        {
            var worker = new BackgroundWorker();
            worker.DoWork += new DoWorkEventHandler(worker_DoWork);
            backgroundWorkers.Add(worker);
            worker.RunWorkerAsync(persistableItem);
        }

        // wait for all the workers to finish
        while (true)
        {
            // sleep a little bit to give the workers a chance to finish
            Thread.Sleep(100);

            // continue looping until all workers are done processing
            if (backgroundWorkers.Exists(w => w.IsBusy)) continue;
            break;
        }

        // dispose all the workers
        foreach (var w in backgroundWorkers) w.Dispose();
    }

    void worker_DoWork(object sender, DoWorkEventArgs e)
    {
        var persistableItem = e.Argument;
        // TODO: add logic here to save the persistableItem to the database
    }
}
