I work on an app where the user can type in some text. The text is saved to an XML file, and I try to make the file save "on the fly" as the user is typing, so it saves instantly. However, if the text is typed quickly, I get a "file currently in use" error. How can I overcome this issue?
The reason for the error is that you are trying to write to the file while the previous write operation is still incomplete and the file is still open for writing.
Now, if you absolutely must write on every character change, I would put a queue in place: when the XML content changes, instead of writing to the file right away, add a message to a queue. Then have code that monitors that queue and only writes the next change once the previous write has finished.
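A minimal sketch of that queue idea, assuming a WinForms app; BlockingCollection gives you the one-write-at-a-time behaviour for free, and BuildXml stands in for whatever serialization you already have (the form name and file path are assumptions too):

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

// Producer/consumer queue: the UI thread enqueues snapshots, a single
// background task performs the writes one at a time.
private readonly BlockingCollection<string> _pendingWrites = new BlockingCollection<string>();

public MainForm()
{
    InitializeComponent();
    // One dedicated consumer: the next write only starts after the previous one finishes.
    Task.Run(() =>
    {
        foreach (string xml in _pendingWrites.GetConsumingEnumerable())
            File.WriteAllText(@"C:\data\document.xml", xml); // path is an assumption
    });
}

private void textBox1_TextChanged(object sender, EventArgs e)
{
    _pendingWrites.Add(BuildXml(textBox1.Text)); // BuildXml is a stand-in for your serialization
}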
You can try keeping a flag that tracks whether the file is already open for writing. If it is, hold on to the text and don't write to the XML yet; if it is not, just write.
This is a concurrency problem; you can read https://www.oreilly.com/library/view/concurrency-in-c/9781491906675/ch01.html for more options.
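A rough sketch of the flag idea above, hedged: rather than dropping changes while the file is busy, keep only the newest unsaved snapshot and let a single writer pick it up once the previous write finishes. All names and the path here are illustrative:

using System.IO;
using System.Threading;

private readonly object _gate = new object();
private string _pending;   // newest unsaved text, null if nothing is pending
private bool _writing;     // the flag: is a write currently in progress?

private void OnTextChanged(string currentText)
{
    lock (_gate)
    {
        _pending = currentText;   // rapid typing just overwrites the snapshot
        if (_writing) return;     // a write is running; it will pick this up
        _writing = true;
    }
    ThreadPool.QueueUserWorkItem(_ => WritePending());
}

private void WritePending()
{
    while (true)
    {
        string text;
        lock (_gate)
        {
            text = _pending;
            _pending = null;
            if (text == null) { _writing = false; return; }
        }
        File.WriteAllText(@"C:\data\document.xml", text); // path is an assumption
    }
}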
Related
I have an application that writes logs into a file created by NLog.
I have another application called Log Viewer. It can open and read the log file mentioned above. But there is a problem: while Log Viewer is reading the log file and the first application is writing to it, some of the log lines cannot be seen in Log Viewer.
For example, if the first application writes a log entry every millisecond, Log Viewer cannot keep up with the new log lines and misses some of them. I need a live log viewer that can track any new log lines. I do not want to read the whole file on every call; I just need to read the new log lines.
The only way you can know that a log line is new is by remembering which position in the file you last read (e.g. "long lastPosition = 0;").
You need to read from that position until the end of the file. The position of the end of file is the same as the file length. Once that block has been read, you show what you read in the viewer and save the end position into lastPosition; that is where the viewer needs to start reading next time.
But if the viewer can't open the file at all... that's another story.
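A hedged sketch of that position-tracking read; opening with FileShare.ReadWrite is what usually lets the viewer read a file the logger still has open (the field and method names here are illustrative):

using System.IO;
using System.Text;

private long _lastPosition; // where the previous read ended

private string ReadNewLogText(string path)
{
    // FileShare.ReadWrite requests read access without locking the writer out.
    using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        if (stream.Length < _lastPosition)
            _lastPosition = 0; // the file was truncated or rolled over; start again

        stream.Seek(_lastPosition, SeekOrigin.Begin);
        using (var reader = new StreamReader(stream, Encoding.UTF8))
        {
            string newText = reader.ReadToEnd(); // everything appended since last time
            _lastPosition = stream.Position;     // the current end of file
            return newText;
        }
    }
}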
Having both applications share the same log is likely to be problematic. Probably the easiest solution is to have your viewer copy the original log file and view its own dedicated copy. You can occasionally check to see if the actual log file has updates and make new copies accordingly.
Having both access the same file will require locking, and risks causing issues in your application if the file is unavailable to write to (possibly blocking, losing log entries, or generating exceptions).
The best solution is to set up an NLog target for a database. Keeping track of the last updated row is easier and safer than tracking the file position.
I wouldn't recommend sharing an active log file between a reader and a writer.
How to set up NLog Database target.
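If you go that route, a minimal programmatic setup could look roughly like this; DatabaseTarget is NLog's real class, but the connection string, table and columns are assumptions you would adapt:

using NLog;
using NLog.Config;
using NLog.Targets;

var dbTarget = new DatabaseTarget("database")
{
    ConnectionString = @"Server=.\SQLEXPRESS;Database=Logs;Integrated Security=true;", // assumption
    CommandText = "INSERT INTO LogEntries (Logged, Level, Message) VALUES (@logged, @level, @message)"
};
dbTarget.Parameters.Add(new DatabaseParameterInfo("@logged", "${date}"));
dbTarget.Parameters.Add(new DatabaseParameterInfo("@level", "${level}"));
dbTarget.Parameters.Add(new DatabaseParameterInfo("@message", "${message}"));

var config = new LoggingConfiguration();
config.AddRule(LogLevel.Info, LogLevel.Fatal, dbTarget);
LogManager.Configuration = config;

The viewer can then simply poll for rows with an Id greater than the last one it displayed, instead of juggling file positions.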
I am new to programming and I came across a problem and I'm not sure how to deal with it.
I use the line
textBox2.Text = System.IO.File.ReadAllText(path);
To read from a text file and paste the contents in textBox2.
Now the issue is that the text file I'm trying to read is large (a couple of megabytes). This text file contains logs from a program, and new logs are always added at the bottom of the file.
Now I want to update textBox2 whenever the text file is updated. However, I am not sure how to do this efficiently. One way is to just read the whole text file again, but since the file is so big, this is a very slow process.
I am interested in finding out a different and faster way to handle this. I'm not really interested in the exact code, I just hoped to find out in what direction I should look and what options I can consider.
Well, two obvious things you could check:
The size of the file (FileInfo.Length)
The last write time (FileSystemInfo.LastWriteTimeUtc)
If you keep track of those, you should be able to detect when the file has changed - at least with a reasonable degree of confidence.
Additionally, you can use FileSystemWatcher to watch for changes.
Also, you might want to consider keeping track of where you've read to - so you could just read the new data, by seeking to the right place in the file.
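For example, a rough sketch combining the watcher with a seek to the last read position (the directory, file name and UI hand-off are assumptions):

using System;
using System.IO;
using System.Text;

long lastPosition = 0; // where the previous read ended

var watcher = new FileSystemWatcher(@"C:\logs", "app.log") // path and filter are assumptions
{
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
};
watcher.Changed += (sender, args) =>
{
    using (var stream = new FileStream(args.FullPath, FileMode.Open,
                                       FileAccess.Read, FileShare.ReadWrite))
    {
        stream.Seek(lastPosition, SeekOrigin.Begin); // skip what was already shown
        using (var reader = new StreamReader(stream, Encoding.UTF8))
        {
            string appended = reader.ReadToEnd();
            lastPosition = stream.Position;
            // Append 'appended' to the TextBox here (via Invoke, since this
            // event fires on a worker thread).
        }
    }
};
watcher.EnableRaisingEvents = true;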
Finally, a TextBox may not really be the best user interface for a huge log file. If this is a structured log file, it would be good to have that structure represented in the UI - for example, one row in a table per log entry, potentially with filtering options etc.
You can check every X seconds: if the file has changed, update; if not, do nothing. You can keep the file's modification time to know whether it has changed.
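A hedged sketch of that polling check, using the two properties mentioned above (call it from any timer; the field names are illustrative):

using System;
using System.IO;

private DateTime _lastWrite = DateTime.MinValue;
private long _lastLength = -1;

// Returns true when the file's size or timestamp has changed since the last call.
private bool HasFileChanged(string path)
{
    var info = new FileInfo(path);
    if (info.LastWriteTimeUtc == _lastWrite && info.Length == _lastLength)
        return false; // nothing new, skip the reload

    _lastWrite = info.LastWriteTimeUtc;
    _lastLength = info.Length;
    return true;
}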
I'm reading the contents of an XML file and parsing that into an object model.
When I modify the values in the object model, then use the following code to save it back to the xml:
XElement optionXml = _panelElement.Elements("options").FirstOrDefault();
optionXml.SetAttributeValue("arming", value.ToString());
_document.Save(_fileName);
This works, as far as I can see, because when I close the application and restart it the values that I had saved are reflected in the object model next time I view it.
However, when I load the actual XML file, the values are still as they were originally.
Why is this? What do I need to do to save the actual XML file with the new values?
You are most likely experiencing file system virtualisation, which was introduced in Windows Vista.
Basically what this means is that you are saving your file, just not where you think you're saving it. For example, you might think that you are saving to C:\Program Files\Your App\yourFile.xml, but what is happening under the hood is that the OS is silently redirecting that to %APPDATA%\Your App\yourFile.xml. When you go to reload it, once again the OS silently redirects from that location.
This is a security measure designed to better encapsulate applications and their data and to prevent unauthorised writes to locations where damage can occur. You can still force a save to %PROGRAMFILES%\Your App, but to do that you either need to relax the ACLs applied to that folder, or you need to elevate the privilege level your application runs at.
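The usual fix is to save per-user data under %APPDATA% explicitly instead of next to the executable; something like this, where the folder and file names are placeholders:

using System;
using System.IO;

// Resolves to C:\Users\<user>\AppData\Roaming\Your App on a typical system.
string folder = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
    "Your App");
Directory.CreateDirectory(folder); // no-op if it already exists

string filePath = Path.Combine(folder, "yourFile.xml");
_document.Save(filePath); // _document as in the question's code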
I wasn't sure whether to put this as a comment or as an answer, but I think it could be a potential answer. It sounds like the XML file is being saved, because the data is persisted across instances of the application. It may be file system virtualisation like slugster mentioned, but it might be as simple as the fact that you are looking at the wrong copy of the XML file. If you are using a relative path, the file may have been copied to a new location. I would suggest you do a quick file search for that file name and see what you get back.
It turns out the file was being copied to and read from the Output Directory. I can see that it's being updated as expected from there.
I'm developing a document based desktop app which writes a fairly large and complex file to disk when the user saves his document. What is the best practice to do here to prevent data corruption? There are a number of things that can happen:
The save process may fail halfway through, which is of course a serious application error, but in that case one would rather be left with the old file than with a corrupted, half-written one. The same problem occurs if the application is terminated for some other reason halfway through writing the file.
The most robust approach I can think of is using a temporary file while saving and only replace the original file once the new file has been successfully created. But I find there are several operations (creating tempfile, saving to tempfile, deleting original, moving tempfile to original) that may or may not fail, and I end up with quite a complicated mess of try/catch statements to handle them correctly.
Is there a best practice/standard for this scenario? For example is it better to copy the original to a temp file and then overwrite the original than to save to a temp file?
Also, how does one reason with the state of a file in a document based application (in windows)? Is it better to leave the file open for writing by the application until the user closes the document, or to just quickly get in an read the file on open and quickly close it again? Pros and cons?
Typically the file shuffling dance goes something like this, aiming to end up with file.txt containing the new data:
Write to file.txt.new
Move file.txt to file.txt.old
Move file.txt.new to file.txt
Delete file.txt.old
At any point you always have at least one valid file:
If only file.txt exists, you failed to start writing file.txt.new
If file.txt and file.txt.new exist, you probably failed during the write - file.txt should be the valid old copy. (If you can validate files, you could try loading the new file - it could be the move that failed)
If file.txt.old and file.txt.new exist, the second move operation failed. You can use either file, depending on whether you want new or old
If file.txt.old and file.txt exist, the delete operation failed. Again, you can use either file.
This is assuming you're on a file system with an atomic move operation. If that's not the case, I believe the procedure is the same but you'd need to be more careful about the recovery procedure.
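For what it's worth, on .NET File.Replace wraps most of this dance, including keeping the backup, in a single call; a hedged sketch using the same file names as the steps above (newContents stands in for your serialized data):

using System.IO;

string newContents = "...";                // the new document contents (assumption)

// 1. Write the new contents to a temporary file.
File.WriteAllText("file.txt.new", newContents);

// 2-4. Swap it in, keeping the previous version as file.txt.old.
if (File.Exists("file.txt"))
    File.Replace("file.txt.new", "file.txt", "file.txt.old");
else
    File.Move("file.txt.new", "file.txt"); // first save: nothing to replace yet

// 5. Delete file.txt.old once you trust the new file.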
Answering the last question first:
If we are talking about fairly big and complex files, I would personally choose to keep the file open and locked, since while reading I may not need to load all of the data into the view, only the part the user needs right now.
On the first question:
Always save to a temp file first.
Then replace the old file with the new one. If that replace fails then, given that your app is a document-based app, your primary objective has failed, which is the worst case; but you still have the new temp file. So on this error you can close the app and reopen it (critical error), and on reopening check whether a temp file exists; if it does, run data recovery, more or less like Visual Studio does after a crash.
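The startup recovery check can then be as simple as this sketch (the temp-file naming and the OfferRecovery hook are hypothetical):

using System.IO;

string documentPath = @"C:\docs\mydocument.xml"; // assumption
string tempPath = documentPath + ".tmp";         // must match whatever your save routine uses

// A leftover temp file means the last save never completed.
if (File.Exists(tempPath))
{
    // Let the user choose between the recovered (newest) and original data,
    // much like Visual Studio's crash-recovery prompt.
    OfferRecovery(documentPath, tempPath); // hypothetical UI hook
}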
Creating a temp file and then replacing the original file with the temp file (the latter being a cheap operation in terms of I/O) is the mechanism used by MFC's document persistence classes. I've NEVER seen it fail, nor have users reported such problems. And yes, back then the documents were large (they were complex as well, but that's irrelevant as far as I/O is concerned).
I am developing a tool in C#. At one point my tool starts writing to an XML file continuously. When I suddenly restart my machine, that XML file gets corrupted. What is the reason, and how can I avoid it?
XmlDocument x = new XmlDocument(); // requires using System.Xml;
x.Load(path);                      // open the existing document
// change a value of a node every time
x.Save(path);                      // write it straight back to the same file
x = null;
This is my code.
Use the "safe replace pattern". For example, to replace foo.txt
Write to foo.new
Move foo.txt to foo.old
Move foo.new to foo.txt
Delete foo.old
At any point, you have at least one complete, valid file.
(That helps if you want to write a new file periodically; for appending, I'd go with the answers suggesting that XML isn't the best way forward for you.)
Don't use XML.
XML has a syntax which doesn't lend itself well to writing continuously to the same file, as you always need a final end tag which you can't write unless the file is complete (which it never is with log files, for example).
That means you will always get an invalid XML file when you cancel the writing prematurely (by killing the process or restarting the computer, etc.).
We had a similar situation a while ago and settled on YAML as a nice format which allows for simply appending to the file.
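For example, with a line-per-entry format every write is a self-contained append, so a crash at worst loses the final line; a small sketch (format and path are illustrative):

using System;
using System.IO;

// One complete entry per line; no closing tag is ever required.
void AppendLogEntry(string message)
{
    string line = $"{DateTime.UtcNow:O} {message}{Environment.NewLine}";
    File.AppendAllText(@"C:\logs\app.log", line); // opens, appends and closes on every call
}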
Check that your file is properly closed before the application shuts down.
Also, as someone has pointed out, an XML file must be properly ended with closing tags.
Additional details would also be useful, such as the code that you use to open, write and close the file.
The reason your file gets corrupted is that, because of the crash, it is never properly closed.
I remember solving an issue like that once with a file-overlapping flag, but that was in C++ using the CreateFile function.