I've got an odd issue in an app I'm writing.
It reads a master XML config file, creates a local copy for the user, and then performs some actions as laid out in the XML file. It checks whether the local XML file needs updating so it doesn't have to recreate it every time.
Each of the actions is run in a new thread, and once all the threads have finished it writes a log and informs the user it's finished.
One of the actions is to import a .reg file into the registry. I'm doing that like so:
Process regeditProcess = Process.Start("regedit.exe", "/s " + RegFilePath); // /s imports silently, with no confirmation prompt
regeditProcess.WaitForExit(); // block until the import finishes
regeditProcess.Close();
The issue I have is that if the application creates a new config file, the regedit process can take anything up to 30 seconds. If it doesn't create one, it finishes within a second.
In the loop that checks that the threads have finished, I have put an
Application.DoEvents();
If I don't do this, the regedit process will run, and if I wait long enough I get a ContextSwitchDeadlock was detected error from the debugger. I have tried using a different .reg file and running only this one action, but I still get the same result.
The creation of the XML file takes place in the main worker thread before any of the actions are attempted, but I'm sure this is having some effect on it; I'm just not sure what.
I know it's a bit of a strange scenario, but has anyone hit something like this before?
Are you explicitly closing the file writer object after creating the config file? If not, you may be locking the file, preventing access from your worker.
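For example, a minimal sketch, assuming the config is written with a StreamWriter (configPath and configXml are placeholder names):

// Hypothetical sketch: the using block guarantees the handle is
// released before the worker threads touch the file.
using (StreamWriter writer = new StreamWriter(configPath))
{
    writer.Write(configXml);
} // flushed and closed here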
I've worked out what the issue was: I had the [STAThread] attribute in Program.cs. Once I took it out, it worked a treat.
STAThread and multithreading
http://ilvyanyatka.spaces.live.com/blog/cns!EA0C02AB2E2FCFAC!193.entry?wa=wsignin1.0&sa=143328961
I am looking for an idea on how I can launch a C# process based on something happening on a Windows server. My first challenge is to determine when to start the first process. It needs to monitor an SFTP folder to see if a certain file type has been delivered. My initial thought was to have the Task Scheduler start a Perl script, have the script look to see if the file exists, and then start the process. But once it has started the process, I don't want it to look for the file until the next day.
The second issue is that the first process moves files to another folder, and then a third-party application starts converting these files from PDF to text. The second process needs to start when this is done. I am not sure how to make this happen.
Thoughts?
Write a Windows service which uses a FileSystemWatcher to monitor for new files. https://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher(v=vs.110).aspx
That can then use File.Move to move the file out and into the alternate directory for further processing. https://msdn.microsoft.com/en-us/library/system.io.file.move(v=vs.110).aspx
I would use a Task for this and a Task.ContinueWith to kick off the next 'stage' of your workflow, etc. You might also want to do a file copy first, then a file delete (instead of a move; that way, if something screws up during the copy you still have your original to work with), roughly as in the sketch below.
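A rough sketch of the idea; the paths and the ConvertToText helper are placeholders, and a real Windows service would host this differently:

using System;
using System.IO;
using System.Threading.Tasks;

class SftpFolderWatcher
{
    static void Main()
    {
        string incomingDir = @"C:\sftp\incoming"; // assumed path
        string stagingDir = @"C:\sftp\staging";   // assumed path

        FileSystemWatcher watcher = new FileSystemWatcher(incomingDir, "*.pdf");
        watcher.Created += (sender, e) =>
        {
            // Copy first, then delete, so a failed copy leaves the original intact.
            string dest = Path.Combine(stagingDir, e.Name);
            File.Copy(e.FullPath, dest);
            File.Delete(e.FullPath);

            // Kick off the next stage of the workflow once the file is in place.
            Task.Run(() => ConvertToText(dest))
                .ContinueWith(t => Console.WriteLine("Stage 2 done: " + dest));
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the process alive; a service would block differently
    }

    static void ConvertToText(string path)
    {
        // Placeholder for the third-party PDF-to-text conversion.
    }
}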
I'm using a FileSystemWatcher to watch a directory. I created a _Created() event handler to fire when a file is moved to this folder. My problem is the following:
The files in this directory get created when the user hits a "real life button" (a button in our stock, not in the application). The FileSystemWatcher takes this file, does some stuff in the system, and then deletes it. That wouldn't be a problem if the application ran only once, but it is used by 6 clients. So every application on every client is trying to delete it. If one client is too slow, it will throw an exception because the file has already been deleted.
What I'm asking for is: Is there a way to avoid this?
I tried using loops and check if the file still exists, but without any success.
while (File.Exists(file))
{
    File.Delete(file);
    Thread.Sleep(100);
}
Can someone give me a hint as to how this could work?
Design
If you want a file to be processed by a single instance only (for example, the first instance that reacts gets the job), then you should implement a locking mechanism. Only the instance that is able to obtain a lock on the file is allowed to process and remove it, all other instances should skip the file.
If you're fine with all instances processing the file, and only care that at least one of them succeeds, then you need to figure out which exceptions indicate a genuine failure and which ones indicate a failure caused by the actions of another instance.
Locking
To 'lock' a file, you can open it with share-mode FileShare.None. This prevents other processes from opening it until you close the file. However, you'll then need to close the file before you can delete it, which leaves a small gap during which another instance could open the file.
A better solution is to create a separate lock file for that purpose. Create it with file-mode FileMode.Create and share-mode FileShare.None and keep it open until the whole process is finished, including the removal of the processed file. Then the lock file can be closed and optionally removed.
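A sketch of the lock-file approach; the file variable is assumed to hold the path of the file to process, and ProcessFile is a hypothetical helper:

string lockPath = file + ".lock";
try
{
    // FileMode.Create + FileShare.None: only one instance can hold this open.
    using (FileStream lockStream = new FileStream(lockPath, FileMode.Create,
                                                  FileAccess.Write, FileShare.None))
    {
        ProcessFile(file);   // hypothetical processing step
        File.Delete(file);   // remove the processed file while still holding the lock
    }
    File.Delete(lockPath);   // optionally clean up the lock file afterwards
}
catch (IOException)
{
    // Another instance holds the lock; skip this file.
}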
Exception
As for the UnauthorizedAccessException you got, according to the documentation, that means one of 4 things:
You don't have the required permission
The file is an executable file that is in use
The path is a directory
The file is read-only
1 and 4 seem most likely in this case (if the file was open in another process you'd get an IOException).
If you want to synchronize access between multiple clients on the same computer you should use a Named Mutex.
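A minimal sketch of the named-mutex idea; the mutex name is arbitrary (it just has to match across instances), and ProcessAndDeleteFile is a placeholder:

using (Mutex mutex = new Mutex(false, @"Global\StockFileProcessing"))
{
    if (mutex.WaitOne(TimeSpan.Zero))   // try to acquire without blocking
    {
        try
        {
            ProcessAndDeleteFile(file); // hypothetical helper
        }
        finally
        {
            mutex.ReleaseMutex();
        }
    }
    // else: another client on this machine is already processing; skip.
}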
I have essentially two programs:
main.exe
update.exe
Update creates a flag file (update.inprogress) so that main cannot run while the update is in progress.
If main opens and that file exists, it immediately exits to prevent a program in use conflict.
I'm only having one issue: if the update is in progress, the main program closes without any reason given when users try to go in. I need to tell them the program is updating, to keep them from calling us to report that the world has come to an end...
My question is, how can I issue a message that the update is in progress without tying up the main.exe? If I issue it from main.exe, then it will be in use and cannot be updated.
I was thinking of opening up notepad and putting a message in there but that just seems like a bad way of doing it.
I could also create another exe that only displays this message, but if I ever have to update it, it will be in use too... which kind of defeats my purpose.
Anyone have a better idea?
Clarification:
This is a peer-to-peer network. The update could be run on workstation XYZ while someone attempts to get into main.exe on workstation ABC. This is why I am using a flag file: I have no way to check the processes running on another workstation.
I assume that when update.exe runs, it does not need to update itself? If that is the case, you can modify update.exe to invoke main.exe if no updates are necessary.
For instance, if an update is necessary (you can determine this by adding a version number to your main.exe and checking it), update.exe will create your update.inprogress file and run the updates. Then, if another instance of update.exe runs, it will see the update.inprogress file, alert the user that an update is in progress, and terminate itself without tying up main.exe. If update.exe runs when no updates are necessary and update.inprogress does not exist, it will invoke main.exe programmatically.
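A rough sketch of that flow inside update.exe's entry point; GetLatestVersion and RunUpdate are hypothetical helpers, and MessageBox assumes a WinForms reference:

using System.Diagnostics;
using System.IO;
using System.Windows.Forms;

string installed = FileVersionInfo.GetVersionInfo("main.exe").FileVersion;
string available = GetLatestVersion();          // hypothetical helper

if (installed == available)
{
    Process.Start("main.exe");                  // nothing to do; hand off
}
else if (!File.Exists("update.inprogress"))
{
    File.Create("update.inprogress").Close();   // flag the update for other workstations
    RunUpdate();                                // hypothetical helper
    File.Delete("update.inprogress");
    Process.Start("main.exe");
}
else
{
    MessageBox.Show("An update is already in progress. Please try again shortly.");
}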
I would suggest creating a thread from your update.exe to check for the existence of your main.exe process. If it shows up, alert the user with a message from your update.exe.
I am working on an app that will keep a running index of work accomplished.
I could write once at the end of a work session, but I don't want to risk losing data if something blows up. Therefore, I rewrite the index to disk (XML) every time a new entry or a correction is made by the user.
private void WriteIndexFile()
{
    XmlDocument IndexDoc = new XmlDocument();
    // Build document here
    XmlTextWriter tw = new XmlTextWriter(_filePath, Encoding.UTF8);
    tw.Formatting = Formatting.Indented;
    IndexDoc.Save(tw);
    tw.Close(); // release the file handle
}
It is possible for the writes to be triggered in rapid succession. If this happens, it tries to open the file for writing before the prior write is complete. (While it would not be normal, I suppose it is possible that the file gets opened for use by another program.)
How can I check if the file can be re-written?
Edit for clarification: This is part of an automated lab data collection system. The users will click a button to capture data (saved in separate files), and identify the sub-task that the data package is for. Typically, it will be 3-10 minutes between clicks.
If they make an error, they need to be able to go back and correct it, so it's not an append-only usage.
Finally, the files will be read by other automated tools and manually by humans. (XML/XSLT)
The size will be limited as each work session (worker shift or less) will have a new index file generated.
Further question: As the overwhelming consensus is to not use XML and to write in an append-only mode, how would I solve the requirement of going back and correcting earlier entries?
I am considering having a "dirty" flag, saving a few minutes after the flag is set and again upon closing the work session. If multiple edits happen in that window, only one write will occur, so no more rapid-fire writes. I'd also have a retry/cancel dialog if the save fails. Thoughts?
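Something like this rough sketch is what I have in mind (the two-minute interval is arbitrary, and the timer's Elapsed event is wired up in the constructor):

// Restart a countdown on every edit; write once when it expires.
private readonly System.Timers.Timer _saveTimer = new System.Timers.Timer(120000);
private volatile bool _dirty;

// In the constructor: _saveTimer.Elapsed += SaveTimerElapsed;

private void OnEntryChanged()
{
    _dirty = true;
    _saveTimer.Stop();
    _saveTimer.Start();
}

private void SaveTimerElapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    if (_dirty)
    {
        WriteIndexFile();   // the existing save routine
        _dirty = false;
    }
}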
XML is a poor choice in your case because new content has to be inserted before the closing tag. Use plain text instead: simply open the file for append and write the new content at the end of the file; see How to: Open and Append to a Log File.
You can also look into a simple logging framework like log4net and use that instead of handling the low-level file stuff yourself.
If all you want is a simple log of all operations, XML may be the wrong choice here as it is difficult to append to an XML document without rewriting the whole file, which will become slower and slower as the file grows.
I'd suggest instead File.AppendText, or even better: keeping the file open for the duration of the application's lifetime and using WriteLine.
(Oh, and as others have pointed out, you need to lock to ensure that only one thread writes to the file at a time. This is still true even with this solution.)
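A minimal sketch of that approach (the file name is illustrative):

// A single writer kept open for the application's lifetime,
// with a lock so only one thread writes at a time.
private static readonly object _logLock = new object();
private static readonly StreamWriter _log =
    new StreamWriter(File.Open("worklog.txt", FileMode.Append,
                               FileAccess.Write, FileShare.Read))
    { AutoFlush = true };

public static void Append(string line)
{
    lock (_logLock)
    {
        _log.WriteLine("{0:o}\t{1}", DateTime.Now, line);
    }
}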
There are also logging frameworks that already solve this problem, such as log4net. Have you considered using an existing logging framework instead of rolling your own?
I have a logger that uses System.Collections.Queue. Basically, it waits until something is queued, then tries to write it. While it is writing items, which could be slow, more items can be added to the queue.
This also helps by grouping messages rather than trying to keep up. It runs on a separate thread.
// logQueue is assumed to be thread-safe (e.g. wrapped with Queue.Synchronized),
// since LogMessage can be called from multiple threads.
private AutoResetEvent ResetEvent { get; set; }

public void LogMessage(string fullMessage)
{
    this.logQueue.Enqueue(fullMessage);
    // Signal the processing thread that a new message is waiting.
    this.ResetEvent.Set();
}

private void ProcessQueueMessages()
{
    while (this.Running)
    {
        // Drain everything currently in the queue.
        while (this.logQueue.Count > 0)
        {
            // This method logs (and dequeues) the item at the head of the queue.
            this.LogQueueItem();
        }
        // Once the queue is empty, block until another message is queued,
        // rather than sleeping and polling with something like
        // System.Threading.Thread.Sleep(1000).
        this.ResetEvent.WaitOne();
    }
}
I handle write failures by not dequeueing an item until it has been written to the file with no errors; I just keep attempting until the write finally succeeds. This has saved me before: somebody removed permissions from one of our apps while it was running, permission was given back without shutting down our app, and we didn't lose a single log statement.
Consider using a flat text file. I have a process that I wrote that uses an XML log... it was a poor choice. You can't just write out the state as you run without constantly rewriting the file to keep the tags correct. With flat entries written to a file, you get an automatic timeline that can tell you what happened, without having to figure out whether it was the XML writer/tag set that blew up, and you don't have to worry about your logs bloating out as much.
I agree with others suggesting you avoid XML. Also, I would suggest you have one component (a "monitor") that is responsible for all access to the file. That component will have the job of handling multiple simultaneous requests and making the disk writes happen one after another.
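A hedged sketch of that monitor idea, using a queue drained by a single writer thread (class and member names are illustrative; callers hand over a full snapshot of the index):

using System.Collections.Concurrent;
using System.IO;
using System.Threading;

class IndexFileMonitor
{
    private readonly BlockingCollection<string> _pending = new BlockingCollection<string>();

    public IndexFileMonitor(string path)
    {
        // A single background thread performs all writes, one after another.
        Thread writer = new Thread(() =>
        {
            foreach (string snapshot in _pending.GetConsumingEnumerable())
                File.WriteAllText(path, snapshot);
        });
        writer.IsBackground = true;
        writer.Start();
    }

    // Callers queue a write request; no caller ever touches the file directly.
    public void RequestWrite(string snapshot)
    {
        _pending.Add(snapshot);
    }
}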
Is there a built-in method for waiting for a file to be created in C#? How about waiting for a file to be completely written?
I've baked my own by repeatedly attempting File.OpenRead() on a file until it succeeds (and failing on a timeout), but spinning on a file doesn't seem like the right thing to do. I'm guessing there's a baked-in method in .NET to do this, but I can't find it.
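Roughly, my home-grown version looks like this (the names are my own):

static FileStream WaitForRead(string path, TimeSpan timeout)
{
    DateTime deadline = DateTime.UtcNow + timeout;
    while (DateTime.UtcNow < deadline)
    {
        try
        {
            // Succeeds once the file exists and the writer has released it.
            return File.OpenRead(path);
        }
        catch (IOException)
        {
            // Not created yet, or still being written; back off and retry.
            Thread.Sleep(200);
        }
    }
    throw new TimeoutException("File never became readable: " + path);
}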
What about using the FileSystemWatcher component?
This class 'watches' a given directory or file, and can raise events when something (you can define what) has happened.
When creating a file with File.Create, you can just call the Close method.
Like this:
File.Create(savePath).Close();
FileSystemWatcher can notify you when a file is created, deleted, updated, has its attributes changed, etc. It will solve your first issue of waiting for it to be created.
As for waiting for it to be written: when a file is created, you can spin off, start tracking its size, wait for it to stop being updated, and then add in a settle time period (see the sketch below). You can also try to get an exclusive lock, but be careful of locking the file if the other process is also trying to lock it... you could cause unexpected things to occur.
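A sketch of that size-tracking idea; the poll interval and settle time are arbitrary and would need tuning:

static void WaitUntilStable(string path)
{
    long lastSize = -1;
    while (true)
    {
        long size = new FileInfo(path).Length;
        if (size == lastSize)
            break;              // unchanged since the last poll
        lastSize = size;
        Thread.Sleep(500);      // poll interval
    }
    Thread.Sleep(2000);         // settle time before touching the file
}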
FileSystemWatcher cannot monitor network paths. In such instances, you manually have to "crawl" the files in a directory, which can result in the error the user above describes.
Is there an alternative, so that we can be sure we don't open a file before it has been fully written to disk and closed?