We have a web application that acts as a form designer.
The form data is stored in XML files; each form has its own XML file.
When I edit a form, I basically recreate the XML file.
public void Save(Guid form_Id, IEnumerable<FormData> formData)
{
XDocument doc = new XDocument();
XElement formsDataElement = new XElement("FormsData");
doc.Add(formsDataElement);
foreach (FormData data in formData)
{
formsDataElement.Add(new XElement("FormData",
new XAttribute("Id", data.Id),
new XAttribute("Name", data.Name)
// other attributes
));
}
doc.Save(formXMLFilePath);
}
This works well, but I want to make sure that two users won't update the same XML file at the same time. I want to lock it somehow.
How can I lock the save process individually for each file?
I could lock the Save function as below, but this would block all users, even those saving a different XML file.
private static readonly object _lock = new object();
public void Save(Guid form_Id, IEnumerable<FormData> formData)
{
lock(_lock)
{
XDocument doc = new XDocument();
foreach (FormData data in formData)
{
// Code
}
doc.Save(formXMLFilePath);
}
}
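One way to scope the lock to a single file rather than all of them is to keep one lock object per form id, for example in a ConcurrentDictionary. This is a sketch, not code from the question, and it only works when all saves happen in a single process (the helper method and FormData shape are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Xml.Linq;

public class FormData
{
    public Guid Id { get; set; }
    public string Name { get; set; }
}

public class FormRepository
{
    // One lock object per form id; GetOrAdd is thread-safe,
    // so two threads asking for the same id get the same object.
    private static readonly ConcurrentDictionary<Guid, object> _fileLocks =
        new ConcurrentDictionary<Guid, object>();

    public void Save(Guid formId, IEnumerable<FormData> formData)
    {
        object fileLock = _fileLocks.GetOrAdd(formId, _ => new object());
        lock (fileLock) // blocks only writers of this particular form
        {
            var doc = new XDocument(new XElement("FormsData"));
            foreach (FormData data in formData)
            {
                doc.Root.Add(new XElement("FormData",
                    new XAttribute("Id", data.Id),
                    new XAttribute("Name", data.Name)));
            }
            doc.Save(GetFilePath(formId));
        }
    }

    // Illustrative: the question derives formXMLFilePath elsewhere.
    private static string GetFilePath(Guid formId) => formId + ".xml";
}
```

Two users saving different forms take different locks and never wait on each other; two users saving the same form are serialized.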
To prevent files being written simultaneously, the XDocument.Save method internally creates a FileStream with the following constructor:
new FileStream(
outputFileName,
FileMode.Create,
FileAccess.Write,
FileShare.Read,
0x1000,
UseAsync)
The FileShare.Read option prevents multiple writes from occurring simultaneously. If you attempted to perform two writes at the same time, you'd wind up with an IOException being thrown.
You might want to write some code to deal with this eventuality. This answer https://stackoverflow.com/a/50800/138578 shows exactly how to handle a locked file and retry until it becomes available.
If your documents are small and many, you shouldn't run into this situation very often, if at all.
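A minimal retry helper along the lines of that linked answer could look like this (my own sketch; the attempt count and delay are arbitrary choices):

```csharp
using System;
using System.IO;
using System.Threading;
using System.Xml.Linq;

public static class XmlSaver
{
    // Retries the save a few times if the file is currently
    // locked by another writer; rethrows on the final attempt.
    public static void SaveWithRetry(XDocument doc, string path,
        int maxAttempts = 5, int delayMs = 100)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                doc.Save(path);
                return;
            }
            catch (IOException) when (attempt < maxAttempts)
            {
                Thread.Sleep(delayMs); // file in use; wait and try again
            }
        }
    }
}
```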
Try something like this:
FileStream file = new FileStream(formXMLFilePath, FileMode.Create, FileAccess.Write, FileShare.None);
try {
doc.Save(file);
} finally {
file.Close();
}
This uses a FileStream and we supply a FileShare mode. In our case we are exclusively locking the file so that only we can write to it.
However, this simply means that subsequent writes will fail as there is a lock.
To get around this problem I tend to have a queue and a separate thread whose only job is to process the queue and write to the file system.
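That queue-based approach could be sketched like this, using a BlockingCollection drained by a single consumer task (the class and names are illustrative, not from the answer):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using System.Xml.Linq;

public sealed class XmlWriteQueue : IDisposable
{
    private readonly BlockingCollection<(XDocument Doc, string Path)> _queue =
        new BlockingCollection<(XDocument, string)>();
    private readonly Task _worker;

    public XmlWriteQueue()
    {
        // Single consumer: all writes are serialized through one thread,
        // so no two saves ever touch the file system concurrently.
        _worker = Task.Run(() =>
        {
            foreach (var (doc, path) in _queue.GetConsumingEnumerable())
                doc.Save(path);
        });
    }

    public void Enqueue(XDocument doc, string path) => _queue.Add((doc, path));

    public void Dispose()
    {
        _queue.CompleteAdding(); // finish pending writes, then stop
        _worker.Wait();
    }
}
```

Callers enqueue and return immediately; the worker owns every write.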
However, another solution is to simply try to access the file and, if that fails, wait a little bit and try again. This would let you do something akin to lock, where it waits until it can access the file and then proceeds:
private static bool IsLocked(string fileName)
{
if (!File.Exists(fileName)) return false;
try {
FileStream file = new FileStream(fileName,FileMode.Open,FileAccess.Write,FileShare.None);
try {
return false;
} finally {
file.Close();
}
} catch (IOException) {
return true;
}
}
public void Save(Guid form_Id, IEnumerable<FormData> formData)
{
while (IsLocked(formXMLFilePath)) Thread.Sleep(100);
XDocument doc = BuildDocument(formData); // build the document as in the question (helper name is illustrative)
FileStream file = new FileStream(formXMLFilePath, FileMode.Create, FileAccess.Write, FileShare.None);
try {
doc.Save(file);
} finally {
file.Close();
}
}
You did not close your XML file connection after opening the file. I think you need to close it at the end of your function.
You can use this:
filename.Close();
I hope this helps.
When I open an excel file, a hidden temporary file is generated in the same folder. I can open it with the TotalCommander Viewer, but I always get an IO exception when trying to open with powershell or c#.
new FileStream(@"D:\~$test.xlsx", FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
System.IO.IOException: 'The process cannot access the file 'D:\~$test.xlsx' because it is being used by another process.'
So how can I get the content?
Unfortunately, for some reason you cannot open the file directly, so I suggest another method: copy the file to a temp file, read that, and finally delete the temp copy. I suppose TotalCommander uses the same method for opening files in its Viewer.
static void Main(string[] args)
{
CopyReadAndDelete(@"c:\Documents\~$test.xlsx");
}
static void CopyReadAndDelete(string filePath)
{
var tempFileFullPath = Path.Combine(Path.GetDirectoryName(filePath), Guid.NewGuid().ToString());
File.Copy(filePath, tempFileFullPath);
try
{
using (var sr = new StreamReader(tempFileFullPath))
{
Console.WriteLine(sr.ReadToEnd()); //or do anything with the content
}
}
finally
{
File.Delete(tempFileFullPath);
}
}
I intend to load an xml file using XDocument.Load(), update values of some elements, then save it using XDocument.Save(). Reading works fine, saving just won't work.
My code:
class XmlDocHandler
{
private string filePath;
private XDocument xmlDoc;
private IList<XElement> updatedElements;
public IEnumerable<XElement> Elements => xmlDoc.Descendants();
public IEnumerable<XElement> UpdatedElements => updatedElements;
public XmlDocHandler(string filePath)
{
this.filePath = filePath;
ReloadFromFile();
updatedElements = new List<XElement>();
}
public void UpdateElements(IEnumerable<XElement> newElements)
{
updatedElements = new List<XElement>();
foreach (XElement newElement in newElements)
{
XElement element = xmlDoc.Descendants()
.FirstOrDefault(x => x.Name.LocalName == newElement.Name.LocalName);
if (element != null)
{
if (element.Value != newElement.Value)
{
element.Value = newElement.Value;
updatedElements.Add(element);
}
}
}
}
public void ReloadFromFile()
{
bool success = false;
if (File.Exists(filePath))
{
try
{
xmlDoc = XDocument.Load(filePath);
success = true;
}
catch
{
}
}
if (!success)
{
xmlDoc = new XDocument();
}
}
public void WriteToFile()
{
xmlDoc.Save(filePath);
}
}
As far as I can tell, it's a serialized set of operations, nothing parallel or otherwise fancy that could block my file. I've found no indication that XDocument.Load(@"c:\file.xml") would create a lock on the file.
I've tried to replace the straightforward operations
xmlDoc = XDocument.Load(filePath);
and
xmlDoc.Save(filePath);
with the stream based approaches found here:
XDocument.Save() unable to access file
and here
c# xml.Load() locking file on disk causing errors
so that they look like this:
Loading..
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
xmlDoc = XDocument.Load(fs);
}
or
using (var sr = new StreamReader(filePath))
{
xmlDoc = XDocument.Load(sr);
}
and writing..
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Write, FileShare.Write))
{
xmlDoc.Save(fs);
}
No matter what I do, closing the streams properly and making sure the file isn't opened in any editor or otherwise used, the stupid file is always "used by another process".
What exactly am I not seeing here? The file in question resides in my Debug output folder of VS2017 Pro next to the .exe file. I'm not aware that I have limited write access in that folder.
I found the error causing this fiasco!
The reason for this issue has nothing to do with XDocument.Load() or .Save(). While being relieved to have solved my problem, I'm also embarrassed to admit that one of my own implementations from years ago caused this.
I've once made a wrapper class for System.Windows.Forms.OpenFileDialog() that would ease my frequently used configuration of the OFD class. In it I've used something like
var ofd = new OpenFileDialog();
try
{
if ((stream = ofd.OpenFile()) != null)
{
return ofd.FileName;
}
else
{
return "file already opened";
}
}
catch
{
return "error";
}
while keeping the ofd and the stream references as instance variables. Looking at this code now makes me shiver (or laugh) but back then I didn't know any better. Of course this has now bitten me in my a.., because I did not close that stream! A crime I've committed years ago now cost me almost 2 days of work.
My remedy: I've rewritten my OFD wrapper class to use the Microsoft.Win32.OpenFileDialog class, because that does not seem to have any stream based stuff that would lock files unintentionally.
Lesson(s) learned (and what I warmly recommend to others): never trust yourself when using streams or other constructs that can be left open and cause leaks. Read up on the using block, which makes use of the dispose pattern by calling IDisposable.Dispose() and usually takes care of such cleanup work. And lastly, use the neat profiling features in VS that weren't available back when I made my OFD wrapper; they help discover such issues.
Thanks to all who helped and to SO to make their help available.
try
fs.Flush();
and maybe
fs.Close();
after writing. AFAIK there is some sort of caching going on in the IO Streams. It might help.
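Equivalently, wrapping the stream in a using block guarantees both the flush and the close even if Save throws. A sketch, assuming the same fs/xmlDoc names as in the question:

```csharp
using System.IO;
using System.Xml.Linq;

class Program
{
    static void Main()
    {
        var xmlDoc = new XDocument(new XElement("root"));
        string filePath = "out.xml"; // illustrative path

        using (var fs = new FileStream(filePath, FileMode.Create,
                                       FileAccess.Write, FileShare.None))
        {
            xmlDoc.Save(fs);
        } // Dispose flushes any buffered bytes and releases the handle
    }
}
```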
Regards.
I have a function that should check a directory's files, and if it finds a file with a specific name it should delete the directory (kind of like a cleanup job).
public static void consumeFolderFiles(string path)
{
// get directory info
DirectoryInfo info = new DirectoryInfo(path);
// I want the files to be in order because I am planning to send them via FTP in order in the future.
FileInfo[] files = info.GetFiles().OrderBy(p => p.CreationTime).ToArray();
bool success = false;
if (files.Length > 0)
{
// FTP logic (not implemented yet)
// Check if there's a file's name that contains "balance"
success = files.FirstOrDefault(f => f.Name.ToLower().Contains("balance")) != null;
// Check if it's time to delete the directory
if (success)
{
Directory.Delete(path, true);
}
}
else
{
// Custom exception
throw new NoNewTransactionsInFolderException();
}
}
This function seems to be working, but occasionally I get the exception "the process cannot access the file ****.*** because it's being used by another process". Also, I noticed that all the files before the exception were deleted normally.
The function throws the exception inconsistently, which means it might delete the folder or stop somewhere in the middle. Thus, I don't think I forgot to close a resource or a handler.
I think it's a good idea to include the file-creating function; I might have missed something there.
public static void createXMLFile(object o, string path, string fileName)
{
XmlSerializer xml = new XmlSerializer(o.GetType());
XmlSerializerNamespaces ns = new XmlSerializerNamespaces();
ns.Add("", "");
XmlTextWriter xtw = new XmlTextWriter(Path.Combine(path, fileName), Encoding.UTF8);
xtw.Formatting = Formatting.Indented;
xml.Serialize(xtw, o, ns);
xtw.Close();
xtw.Dispose();
}
I have seen other answers where people suggested adding a try/catch and calling Directory.Delete(path, true) in the catch clause, or using the .NET Transactional File Manager. However, I would like to know the reason for this exception. Is it something I am doing wrong, or an OS bug?
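The retry-in-catch idea mentioned above could be sketched like this (my own sketch; the attempt count and delay are arbitrary). A brief retry often helps here because antivirus scanners and indexers can hold a transient handle on freshly written files:

```csharp
using System;
using System.IO;
using System.Threading;

public static class SafeDelete
{
    // Retries the recursive delete if another process briefly
    // holds a handle on one of the files inside the directory.
    public static void DeleteDirectoryWithRetry(string path,
        int maxAttempts = 3, int delayMs = 200)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                Directory.Delete(path, true);
                return;
            }
            catch (IOException) when (attempt < maxAttempts)
            {
                Thread.Sleep(delayMs); // transient lock; wait and retry
            }
        }
    }
}
```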
I am trying to export the strings in a nested list to a txt or csv file of the user's choice. Everything seems to be working, but when I actually check the file after exporting, it is absolutely blank. I reproduced the problem in a separate test program, where the code worked, but when I moved it over it would still not export anything.
This is just my initialized nested list, in case it's needed.
List<List<string>> aQuestion = new List<List<string>>();
This is the problem area for the code.
static void writeCSV(List<List<string>> aQuestion, List<char> aAnswer)
{
StreamWriter fOut = null;
string fileName = "";
//export questions
//determine if the file can be found
try
{
Console.Write("Enter the file path for where you would like to export the exam to: ");
fileName = Console.ReadLine();
if (!File.Exists(fileName))
{
throw new FileNotFoundException();
}
}
catch (FileNotFoundException)
{
Console.WriteLine("File {0} cannot be found", fileName);
}
//writes to the file
try
{
fOut = new StreamWriter(fileName, false);
//accesses the nested lists
foreach (var line in aQuestion)
{
foreach (var value in line)
{
fOut.WriteLine(string.Join("\n", value));
}
}
Console.WriteLine("File {0} successfully written", fileName);
}
catch (IOException ioe)
{
Console.WriteLine("File {0} cannot be written {1}", fileName, ioe.Message);
}
So if any of you can help me with this problem, that would be great, because it seems like such a small problem but I can't figure it out for the life of me.
It may happen that the buffer was not flushed to the disk. You should dispose the stream writer and it will push everything out to disk:
using (StreamWriter writer = new StreamWriter(fileName, false)) // <-- this is the change
{
//accesses the nested lists
foreach (var line in aQuestion)
{
foreach (var value in line)
{
writer.WriteLine(string.Join("\n", value));
}
}
}
On a more elaborate note: streams whose underlying writes are expensive are normally buffered. File streams are definitely buffered, because it would be very inefficient to push each separate piece of data to the IO immediately.
When you're working with file streams, you can flush their content explicitly using the StreamWriter.Flush() method - that is useful if you want to debug code and wish to see how far it has gone writing the data.
However, you normally do not flush the stream yourself but just let its internal mechanisms choose the best moment to do that. Instead, you make sure to dispose the stream object, and that will force buffer to be flushed before closing the stream.
Use this simple method instead; it is much easier and it takes care of creating and disposing the StreamWriter.
File.WriteAllLines(PathToYourFile,aQuestion.SelectMany(x=>x));
More reference on File.WriteAllLines Here
Also, in your code you're not disposing the StreamWriter. Enclose it in a using block, like this:
using (var writer = new StreamWriter(PathToYourFile, false))
{
//Your code here
}
The following code gives me a System.IO.IOException with the message 'The process cannot access the file'.
private void UnPackLegacyStats()
{
DirectoryInfo oDirectory;
XmlDocument oStatsXml;
//Get the directory
oDirectory = new DirectoryInfo(msLegacyStatZipsPath);
//Check if the directory exists
if (oDirectory.Exists)
{
//Loop files
foreach (FileInfo oFile in oDirectory.GetFiles())
{
//Check if file is a zip file
if (C1ZipFile.IsZipFile(oFile.FullName))
{
//Open the zip file
using (C1ZipFile oZipFile = new C1ZipFile(oFile.FullName, false))
{
//Check if the zip contains the stats
if (oZipFile.Entries.Contains("Stats.xml"))
{
//Get the stats as a stream
using (Stream oStatsStream = oZipFile.Entries["Stats.xml"].OpenReader())
{
//Load the stats as xml
oStatsXml = new XmlDocument();
oStatsXml.Load(oStatsStream);
//Close the stream
oStatsStream.Close();
}
//Loop hit elements
foreach (XmlElement oHitElement in oStatsXml.SelectNodes("/*/hits"))
{
//Do stuff
}
}
//Close the file
oZipFile.Close();
}
}
//Delete the file
oFile.Delete();
}
}
}
I am struggling to see where the file could still be locked. All objects that could be holding onto a handle to the file are in using blocks and are explicitly closed.
Is it something to do with using FileInfo objects rather than the strings returned by the static GetFiles method?
Any ideas?
I do not see problems in your code; everything looks OK. To check whether the problem lies in C1ZipFile, I suggest you initialize the zip from a stream instead of from a file, so that you close the stream explicitly:
//Open the zip file
using (Stream ZipStream = oFile.OpenRead())
using (C1ZipFile oZipFile = new C1ZipFile(ZipStream, false))
{
// ...
Several other suggestions:
You do not need to call Close() on objects wrapped in using (...); remove those calls.
Move the XML processing (the hit-elements loop) outside the zip processing, i.e. after the zip file is closed, so you keep the file open for as short a time as possible.
I assume you're getting the error on the oFile.Delete call. I was able to reproduce this error. Interestingly, the error only occurs when the file is not a zip file. Is this the behavior you are seeing?
It appears that the C1ZipFile.IsZipFile call is not releasing the file when it's not a zip file. I was able to avoid this problem by using a FileStream instead of passing the file path as a string (the IsZipFile function accepts either).
So the following modification to your code seems to work:
if (oDirectory.Exists)
{
//Loop files
foreach (FileInfo oFile in oDirectory.GetFiles())
{
using (FileStream oStream = new FileStream(oFile.FullName, FileMode.Open))
{
//Check if file is a zip file
if (C1ZipFile.IsZipFile(oStream))
{
// ...
}
}
//Delete the file
oFile.Delete();
}
}
In response to the original question in the subject: I don't know if it's possible to know if a file can be deleted without attempting to delete it. You could always write a function that attempts to delete the file and catches the error if it can't and then returns a boolean indicating whether the delete was successful.
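Such a helper might look like this (a sketch of the idea described above, not code from the answer):

```csharp
using System;
using System.IO;

public static class FileHelper
{
    // Attempts the delete and reports success instead of throwing.
    public static bool TryDelete(string path)
    {
        try
        {
            File.Delete(path);
            return true;
        }
        catch (IOException)
        {
            return false; // still locked by another process
        }
        catch (UnauthorizedAccessException)
        {
            return false; // read-only file or insufficient permissions
        }
    }
}
```

Note that this only tells you the file *was* deletable at that moment; another process could lock a similar file an instant later, so checking without deleting has the same race.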
I'm just guessing: are you sure that oZipFile.Close() is enough? Perhaps you have to call oZipFile.Dispose() to be sure it has actually released the resources.
More than likely it's not being disposed. Any time you access something outside of managed code (streams, files, etc.) you MUST dispose of it. I learned this the hard way with ASP.NET and image files; it will fill up your memory, crash your server, etc.
In the interest of completeness, I am posting my working code, as the changes came from more than one source.
private void UnPackLegacyStats()
{
DirectoryInfo oDirectory;
XmlDocument oStatsXml;
//Get the directory
oDirectory = new DirectoryInfo(msLegacyStatZipsPath);
//Check if the directory exists
if (oDirectory.Exists)
{
//Loop files
foreach (FileInfo oFile in oDirectory.GetFiles())
{
//Set empty xml
oStatsXml = null;
//Load file into a stream
using (Stream oFileStream = oFile.OpenRead())
{
//Check if file is a zip file
if (C1ZipFile.IsZipFile(oFileStream))
{
//Open the zip file
using (C1ZipFile oZipFile = new C1ZipFile(oFileStream, false))
{
//Check if the zip contains the stats
if (oZipFile.Entries.Contains("Stats.xml"))
{
//Get the stats as a stream
using (Stream oStatsStream = oZipFile.Entries["Stats.xml"].OpenReader())
{
//Load the stats as xml
oStatsXml = new XmlDocument();
oStatsXml.Load(oStatsStream);
}
}
}
}
}
//Check if we have stats
if (oStatsXml != null)
{
//Process XML here
}
//Delete the file
oFile.Delete();
}
}
}
The main lesson I learned from this is to manage file access in one place in the calling code, rather than letting other components manage their own file access. This is most appropriate when you want to use the file again after the other component has finished its task.
Although this takes a little more code, you can clearly see where the stream is disposed (at the end of the using block), compared to having to trust that a component has correctly disposed of the stream.