Saving settings while reading settings - C#

I have a method which saves settings to a file. It is called whenever the value of the dateTimePicker changes. But in Form_Load I load the settings: I read the value from the file and assign it to the dateTimePicker, and this triggers the save_settings method (because the value changed). At that moment there is a problem, because the file is still open by the program for reading while the program wants to write the changes to it...
How can I do that?

I think you have a critical section. There are plenty of ways to deal with this issue. One is to put a lock statement around the file saving, so that one thread waits until the other has finished. But from your question I think the real problem is with your design: as I understand it, you are trying to read and write at the same time. Maybe you should declare a boolean field such as isToSave that indicates when it is safe to save. When working with the file, do not forget to use a using statement so the file handle is released.
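A minimal sketch of that idea; the SettingsStore class, SaveSettings method, and file path are assumptions, not the asker's actual code:

using System.IO;

public class SettingsStore
{
    private readonly object _saveLock = new object();
    private readonly string _filePath; // hypothetical path to the settings file

    public SettingsStore(string filePath)
    {
        _filePath = filePath;
    }

    public void SaveSettings(string contents)
    {
        // Only one thread at a time may run the save.
        lock (_saveLock)
        {
            // The using statement guarantees the file handle is released
            // even if the write throws.
            using (var writer = new StreamWriter(_filePath))
            {
                writer.Write(contents);
            }
        }
    }
}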

Use some kind of flag: set it when you start reading your config and unset it in a finally block. While the flag is set, ignore calls to ValueChanged. Since you are loading the config in OnLoad, there will be no other reason for the dateTimePicker's value to change, because you are on the main UI thread and the message pump is not pumping at that moment.
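A minimal sketch of the flag approach, assuming a hypothetical LoadSettingsFromFile helper that returns the stored DateTime, alongside the asker's existing save_settings method:

private bool _isLoadingSettings;

private void Form_Load(object sender, EventArgs e)
{
    _isLoadingSettings = true;
    try
    {
        // Hypothetical helper that reads the stored value from the settings file.
        dateTimePicker.Value = LoadSettingsFromFile();
    }
    finally
    {
        _isLoadingSettings = false;
    }
}

private void dateTimePicker_ValueChanged(object sender, EventArgs e)
{
    // Ignore the change we caused ourselves while loading.
    if (_isLoadingSettings)
        return;

    save_settings(); // the asker's existing save method
}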

Related

FileSystemWatcher and write completion

I am implementing an event handler that must open and process the content of a file created by a third-party application over which I have no control. I am warned by a note in "C# 4.0 in a Nutshell" (page 495) about the risk of opening a file before it is fully populated, so I am wondering how to manage this case. To keep the load on the event handler to a minimum, I am considering having the handler simply insert the file names into a queue and having a different thread do the processing. But either way, how can I make sure the write is completed and the file is safe to read? The file size could be arbitrary.
Any ideas? Thanks
A reliable way to achieve what you want might be to use FileSystemWatcher + the NTFS USN journal.
Maybe more complicated than you expected, but FileSystemWatcher alone won't tell you for sure that the newly created file has been closed.
- First, use the FileSystemWatcher to know when a file is created. From there you have the complete file path and are one or two P/Invokes away from getting the file's unique ID (which can help you track it during its whole lifetime).
- Then, read the USN journal, which tracks everything that occurs on your drive. Filter on entries corresponding to your new file's ID, and read the journal until you reach the entry with the 'Close' event.
From there, unless your file is manipulated in special ways (opened and closed multiple times by the application that generates it), you can assume it is safe to read and do whatever you wanted to do with it.
A really great C# implementation of a USN journal parser is StCroixSkipper's work, available here:
http://mftscanner.codeplex.com/
If you are interested I can give you more help about USN journal, as I use it in my project.
Our workaround is to watch for a specific extension. While a file is being uploaded, its extension is ".tmp". When the upload is done, it is renamed to have the proper extension.
Another alternative is to have the server try to move the file in a try/catch block. If the file isn't done being uploaded, the attempt to move it will throw an exception, so we wait and try again.
Realistically, you can't know. If the other application's "write" operation is to open the file denying write access to everyone else, write, and then close the file when it's done, then when you get a notification you can simply open the file requesting write access; if that fails, you know the operation isn't complete. But if the "write" operation is to open the file, write, close it, open it again, write again, and so on, then you're pretty much out of luck.
The best solution I've seen is to set a timer after the last notification. When the timer elapses, try to open the file for write: if you can, assume the "operation" is done and do what you need to do. If the open fails, assume the operation is still in progress and wait some more.
Of course, nothing is foolproof. Despite the above, another operation could start while you're doing what you want with the file and cause interaction problems.
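A hedged sketch of the "after the last notification, try to open for write" idea; the attempt count and delay are arbitrary assumptions:

using System.IO;
using System.Threading;

static bool WaitUntilWritable(string path, int maxAttempts = 30, int delayMs = 500)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            // Requesting exclusive write access fails while another process
            // still has the file open.
            using (File.Open(path, FileMode.Open, FileAccess.Write, FileShare.None))
            {
                return true;
            }
        }
        catch (IOException)
        {
            // Assume the write is still in progress; wait and retry.
            Thread.Sleep(delayMs);
        }
    }
    return false;
}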

Issue Message without locking up exe

I have essentially two programs:
main.exe
update.exe
Update creates a flag file (update.inprogress) so that main cannot run while the update is in progress.
If main opens and that file exists, it immediately exits to prevent a program-in-use conflict.
I'm only having one issue: if the update is in progress, the main program closes without any explanation when users try to go in. I need to tell them the program is updating to keep them from calling us to say the world has come to an end...
My question is, how can I issue a message that the update is in progress without tying up the main.exe? If I issue it from main.exe, then it will be in use and cannot be updated.
I was thinking of opening up notepad and putting a message in there but that just seems like a bad way of doing it.
I could also create another exe that only displays this message, but if I ever have to update that exe, it will be in use too... which kind of defeats my purpose.
Anyone have a better idea?
Clarification:
This is a peer-to-peer network. The update could be run on workstation XYZ while someone attempts to get into main.exe at workstation ABC. This is why I am using a flag file: I have no way to check the processes running on another workstation.
I assume that when update.exe runs, it does not need to update itself? If that is the case, you can modify update.exe to invoke main.exe when no updates are necessary.
For instance, if an update is necessary (you can determine this by adding a version number to your main.exe and checking it), update.exe will create your update.inprogress file and run the updates. If another instance of update.exe then runs, it will see the update.inprogress file, alert the user that an update is in progress, and terminate itself without tying up main.exe. If update.exe runs when no update is necessary and update.inprogress does not exist, it will invoke main.exe programmatically.
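A rough sketch of that flow in update.exe, assuming a hypothetical UpdateIsNecessary version check and RunUpdate routine, and that main.exe sits next to update.exe:

using System.Diagnostics;
using System.IO;
using System.Windows.Forms;

static class UpdateLauncher
{
    private const string FlagFile = "update.inprogress";

    public static void Run()
    {
        if (File.Exists(FlagFile))
        {
            // Another workstation is already updating; tell the user instead of exiting silently.
            MessageBox.Show("An update is in progress. Please try again in a few minutes.");
            return;
        }

        if (UpdateIsNecessary()) // hypothetical version-number comparison
        {
            File.Create(FlagFile).Close();
            try
            {
                RunUpdate(); // hypothetical routine that copies the new files
            }
            finally
            {
                File.Delete(FlagFile);
            }
        }

        // No update needed (or it just finished), so hand off to the main program.
        Process.Start("main.exe");
    }

    private static bool UpdateIsNecessary() { return false; } // placeholder
    private static void RunUpdate() { }                       // placeholder
}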
I would suggest creating a thread from your update.exe that checks for the existence of the main.exe process. If it shows up, alert the user with a message from your update.exe.

How to detect block level changes in a file in C#?

How may I know which file was modified and what data was changed in it?
Edit: I want to watch the file as it gets modified and then compare it against a previous version to know which data blocks changed. I guess watching the file for changes can be accomplished with the file watcher API, but I have no idea about the second part.
You may need the FileSystemWatcher class.
The most common approach is to define a FileSystemWatcher, subscribe to its events, and process them according to the logic of your application.
Here is a simple example.
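For instance, a minimal console sketch; the watched directory path and filter are assumptions:

using System;
using System.IO;

class WatcherExample
{
    static void Main()
    {
        // Watch an assumed directory for XML files; adjust the path and filter to your case.
        var watcher = new FileSystemWatcher(@"C:\watched", "*.xml")
        {
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite | NotifyFilters.Size
        };

        watcher.Created += (s, e) => Console.WriteLine("Created:  " + e.FullPath);
        watcher.Changed += (s, e) => Console.WriteLine("Modified: " + e.FullPath);
        watcher.Renamed += (s, e) => Console.WriteLine("Renamed:  " + e.OldFullPath + " -> " + e.FullPath);

        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching... press Enter to quit.");
        Console.ReadLine();
    }
}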

rewriting the same file in rapid succession?

I am working on an app that will keep a running index of work accomplished.
I could write once at the end of a work session, but I don't want to risk losing data if something blows up. Therefore, I rewrite the file (XML) to disk every time the user makes a new entry or a correction.
private void WriteIndexFile()
{
    XmlDocument IndexDoc = new XmlDocument();
    // Build the document here
    XmlTextWriter tw = new XmlTextWriter(_filePath, Encoding.UTF8);
    tw.Formatting = Formatting.Indented;
    IndexDoc.Save(tw);
}
It is possible for the writes to be triggered in rapid succession. If this happens, it tries to open the file for writing before the prior write is complete. (While it would not be normal, I suppose it is possible that the file gets opened for use by another program.)
How can I check if the file can be re-written?
Edit for clarification: This is part of an automated lab data collection system. The users will click a button to capture data (saved in separate files) and identify the sub-task that the data package is for. Typically, it will be 3-10 minutes between clicks.
If they make an error, they need to be able to go back and correct it, so it's not an append-only usage.
Finally, the files will be read by other automated tools and manually by humans. (XML/XSLT)
The size will be limited as each work session (worker shift or less) will have a new index file generated.
Further question: As the overwhelming consensus is to not use XML and write in an append-only mode, how would I solve the requirement of going back and correcting earlier entries?
I am considering having a "dirty" flag and saving a few minutes after the flag is set, as well as upon closing the work session. If multiple edits happen in that time, only one write will occur, so no more rapid rewrites. I would also have a retry/cancel dialog if the save fails. Thoughts?
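A rough sketch of that dirty-flag idea, assuming a Windows Forms timer and hypothetical TrySaveWithRetryDialog, OnEntryEdited, and OnSessionClosing helpers; the two-minute delay is arbitrary:

private bool _isDirty;
private readonly System.Windows.Forms.Timer _saveTimer =
    new System.Windows.Forms.Timer { Interval = 120000 }; // 2 minutes, arbitrary

private void InitSaveTimer()
{
    _saveTimer.Tick += (s, e) =>
    {
        _saveTimer.Stop();
        if (_isDirty)
            TrySaveWithRetryDialog(); // hypothetical wrapper around WriteIndexFile
        _isDirty = false;
    };
}

private void OnEntryEdited()
{
    // Mark dirty and restart the countdown; many edits in a row produce one write.
    _isDirty = true;
    _saveTimer.Stop();
    _saveTimer.Start();
}

private void OnSessionClosing()
{
    if (_isDirty)
        TrySaveWithRetryDialog();
}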
XML is a poor choice in your case because new content has to be inserted before the closing tag. Use plain text instead: simply open the file for append and write the new content at the end, see How to: Open and Append to a Log File.
You can also look into a simple logging framework like log4net and use that instead of handling the low-level file work yourself.
If all you want is a simple log of all operations, XML may be the wrong choice here as it is difficult to append to an XML document without rewriting the whole file, which will become slower and slower as the file grows.
I'd suggest instead File.AppendText or, even better, keeping the file open for the duration of the application's lifetime and using WriteLine.
(Oh, and as others have pointed out, you need to lock to ensure that only one thread writes to the file at a time. This is still true even with this solution.)
There are also logging frameworks that already solve this problem, such as log4net. Have you considered using an existing logging framework instead of rolling your own?
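A small sketch of the append-only idea using File.AppendText; the class name, entry format, and lock are assumptions:

using System;
using System.IO;

static class WorkLog
{
    private static readonly object _sync = new object();

    public static void Append(string logPath, string entry)
    {
        lock (_sync) // serialize writers so only one thread appends at a time
        {
            // AppendText opens (or creates) the file positioned at the end,
            // so earlier entries are never rewritten.
            using (StreamWriter writer = File.AppendText(logPath))
            {
                writer.WriteLine("{0:o}\t{1}", DateTime.Now, entry);
            }
        }
    }
}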
I have a logger that uses a System.Collections.Queue. Basically, it waits until something is queued and then tries to write it. While items are being written, which could be slow, more items can be added to the queue.
This also helps by grouping messages rather than trying to keep up. It runs on a separate thread.
// 'Running' and 'LogQueueItem()' are defined elsewhere in the logger class.
private readonly Queue logQueue = new Queue();
private AutoResetEvent ResetEvent { get; set; }

public void LogMessage(string fullMessage)
{
    this.logQueue.Enqueue(fullMessage);
    // Trigger the reset event so the worker thread wakes up and writes the message.
    this.ResetEvent.Set();
}

private void ProcessQueueMessages()
{
    while (this.Running)
    {
        // This will process all the items currently in the queue.
        while (this.logQueue.Count > 0)
        {
            // This method just logs the top item on the queue.
            this.LogQueueItem();
        }

        // Once the queue is empty, wait for another message to be queued
        // before running again. Blocking on the event avoids polling with
        // something like System.Threading.Thread.Sleep(1000).
        this.ResetEvent.WaitOne();
    }
}
I handle write failures by not dequeuing an item until it has been written to the file with no errors; I just keep attempting until the write finally succeeds. This saved me once when somebody removed permissions from one of our apps while it was running. The permissions were restored without shutting down our app, and we didn't lose a single log statement.
Consider using a flat text file. I have a process that I wrote that uses an XML log, and it was a poor choice. You can't just write out the state as you run without constantly rewriting the file to keep the tags correct. With flat entries written to a file you get an automatic timeline that shows what happened, without having to figure out whether it was the XML writer/tag set that blew up, and you don't have to worry about your logs bloating as much.
I agree with others suggesting you avoid XML. Also, I would suggest you have one component (a "monitor") that is responsible for all access to the file. That component will have the job of handling multiple simultaneous requests and making the disk writes happen one after another.

Waiting for a file to be created in C#

Is there a built-in method for waiting for a file to be created in C#? How about waiting for a file to be completely written?
I've baked my own by repeatedly attempting File.OpenRead() on the file until it succeeds (and failing on a timeout), but spinning on a file doesn't seem like the right thing to do. I'm guessing there's a baked-in method in .NET to do this, but I can't find it.
What about using the FileSystemWatcher component?
This class 'watches' a given directory or file, and can raise events when something (you can define what) has happened.
When creating a file with File.Create you can just call the Close method.
Like this:
File.Create(savePath).Close();
FileSystemWatcher can notify you when a file is created, deleted, updated, has its attributes changed, etc. That will solve your first issue of waiting for the file to be created.
As for waiting for it to be written: once the file is created, you can spin off and start tracking its size, wait for it to stop being updated, and then add a settle-time period. You can also try to get an exclusive lock, but be careful about locking the file if the other process is also trying to lock it; you could cause unexpected things to occur.
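A hedged sketch of the "track the size until it settles" idea; the polling interval and settle count are arbitrary assumptions:

using System.IO;
using System.Threading;

static void WaitForSizeToSettle(string path, int pollMs = 500, int requiredStablePolls = 4)
{
    long lastLength = -1;
    int stablePolls = 0;

    while (stablePolls < requiredStablePolls)
    {
        Thread.Sleep(pollMs);

        long length = new FileInfo(path).Length;
        if (length == lastLength)
            stablePolls++;     // size unchanged since the last poll
        else
            stablePolls = 0;   // still growing; start the settle count over

        lastLength = length;
    }
}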
FileSystemWatcher cannot monitor network paths. In such instances you have to manually "crawl" the files in a directory, which can result in the error the user above describes.
Is there an alternative, so that we can be sure we don't open a file before it has been fully written to disk and closed?
