After an image is uploaded to my server, my code moves it into a folder determined by the user's details. I think the move is sometimes attempted too soon, while the uploaded file is still in use, so nine times out of ten the move fails.
Is there a way to add a 'wait', or a way to check whether a file is in use and loop until the file can be moved?
Current move function in my controller:
while (!File.Exists(uploadedPath))
{
// busy-wait until the uploaded file appears on disk
}
File.Move(uploadedPath, savePath);
PS: I intend to add a counter so the while loop can't get stuck indefinitely and has a timeout.
If you have control over the code receiving the file, I would update it to notify the moving code when the file has been received completely. Alternatively, I would move the file from there, or even save the file directly where it should end up.
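For example, the receiving code could write the upload straight to its final location; a sketch, assuming an ASP.NET-style HttpPostedFile named postedFile and a hypothetical GetUserFolder helper that resolves the folder from the user details:
// save the uploaded image directly to its final location, so no later move is needed
string savePath = Path.Combine(GetUserFolder(user), Path.GetFileName(postedFile.FileName)); // GetUserFolder is hypothetical
postedFile.SaveAs(savePath);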
Otherwise, it will be a hack. You need to:
1. Try to move the file.
2. Catch the exception if the move fails.
3. Thread.Sleep for a few seconds.
4. Go to 1.
Something along the lines of:
bool success = false;
for (var count = 0; !success && count < 10; ++count)
{
try
{
File.Move(uploadedPath, savePath);
success = true;
}
catch (IOException)
{
Thread.Sleep(1000); // wait a second before retrying
}
}
You also need to handle the situation where the file cannot be moved at all (success is still false after the loop). So it is a hack, and it should not be done in general if there is another way to notify the moving code.
Also note:
From the File.Move documentation on MSDN:
If you try to move a file across disk volumes and that file is in use, the file is copied to the destination, but it is not deleted from the source.
which means that your file will remain in the received files directory after moving.
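If that can happen in your setup, you may want to clean up the leftover source file yourself; a minimal sketch, reusing uploadedPath and savePath from the question:
File.Move(uploadedPath, savePath);
if (File.Exists(uploadedPath))
{
    // a cross-volume move of an in-use file copies the data but leaves the source behind
    File.Delete(uploadedPath); // may need the same retry treatment as the move itself
}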
Are UploadFile and MoveFile two different components that are independent of each other? If so, I don't think it's a good architecture. I would recommend having UploadFile pass control to MoveFile once its part is done. This way you can avoid multiple processes trying to access the same file.
Related
I made a system where a user can upload a file (an image) to a server and the server saves it. All is good, but when I want to delete the files uploaded by the user, I get an exception saying:
the process cannot access the file because it is being used by another process
This is the code for saving the file:
HttpFileCollection files = httpRequest.Files;
for (int i = 0; i < files.Count; i++) {
var postedFile = files[i];
// I tried this one before, but I read that I should .Dispose() files, therefore
// I settled to the other, uncommented solution (however, both of them do the same thing)
//postedFile.SaveAs(filePath);
using (FileStream fs = File.Create(filePath)) {
postedFile.InputStream.CopyTo(fs);
postedFile.InputStream.Close();
postedFile.InputStream.Dispose();
fs.Dispose();
fs.Close();
}
}
Deleting the files is quite simple. In a method called Delete, I call this:
...
File.Delete(HttpContext.Current.Server.MapPath(CORRECT_PATH_TO_FILE));
...
Any suggestions on how to solve this?
Thanks
Just as Michael Perrenoud suggested in the comment on my main question, I was also opening the file in another class and not disposing of it when I was done working with it. The problem is therefore solved. Thanks!
Where are you calling the file delete method? As part of the loop? If so, it is natural for the file to be locked. If outside of the loop, then it is a different problem (perhaps the stream has not been garbage collected yet?).
To avoid the loop problem, gather a list of locations you are going to delete (declare outside of the loop, can be populated within) and then delete in another "clean up" loop (another method is even better for reusability).
NOTE: Close() before Dispose(), not the other way around. You actually do not have to call both, as Dispose() should always make sure everything is cleaned up (especially in .NET Framework implementations of IDisposable), but I don't see any harm in Close() followed by Dispose().
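A rough sketch of the gather-then-clean-up idea (filesToProcess and file.Path are placeholders for whatever your loop actually iterates over):
var pathsToDelete = new List<string>();   // declared outside the loop

foreach (var file in filesToProcess)      // whatever loop currently has the files open
{
    // ...work with the file...
    pathsToDelete.Add(file.Path);         // only remember it here, don't delete yet
}

// separate "clean up" pass, after all handles from the loop have been released
foreach (var path in pathsToDelete)
{
    File.Delete(path);
}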
(I know it's a common problem, but I couldn't find an exact answer.)
I need to write a Windows service that monitors a directory and, upon the arrival of a file, opens it, parses the text, does something with it, and moves it to another directory afterwards. I used the IsFileLocked method mentioned in this post to find out whether a file is still being written to. My problem is that I don't know how long it takes the other party to finish writing to the file. I could wait a few seconds before opening the file, but this is not a perfect solution, since I don't know at what rate the file is being written and a few seconds may not suffice.
Here's my code:
while (true)
{
var d = new DirectoryInfo(path);
var files = d.GetFiles("*.txt").OrderBy(f => f.Name);
foreach (var file in files)
{
if (!IsFileLocked(file))
{
//process file
}
else
{
//???
}
}
}
I think you could use a FileSystemWatcher (more info about it here: http://msdn.microsoft.com/it-it/library/system.io.filesystemwatcher(v=vs.110).aspx ).
Specifically, you could hook into the OnChanged event, and when it is raised you can check IsFileLocked to verify whether the file is still being written to.
This strategy should save you from actively waiting by polling.
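A minimal sketch of that approach, assuming your existing IsFileLocked helper and a ProcessFile method of your own for the parse-and-move work (Created is used here for newly arrived files; Changed can be handled the same way):
var watcher = new FileSystemWatcher(path, "*.txt");
watcher.Created += (sender, e) =>
{
    var file = new FileInfo(e.FullPath);
    // wait until the writer has released the file before touching it
    while (IsFileLocked(file))
    {
        Thread.Sleep(500); // back off briefly instead of spinning
    }
    ProcessFile(file); // your existing parsing/moving logic
};
watcher.EnableRaisingEvents = true;
Note that sleeping inside the event handler delays the handling of later events, so for heavier loads you would queue the path and process it on another thread instead.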
I'm using a System.IO.FileSystemWatcher to get notified of file renames inside a directory. These files are log files, created by a different process.
The event handler looks like this:
private async void FileRenamedHandler(object sender, RenamedEventArgs e)
{
//when file is renamed
//try to upload it to a storage
//if upload is successful, delete it from disk
}
All looks good so far, but I need to add a second method that iterates through the directory when the application starts, in order to upload any existing log files to storage.
So:
public async Task UploadAllFilesInDirectory()
{
foreach (var file in Directory.GetFiles(_directoryPath))
{
await TryUploadLogAsync(file);
}
}
The problem is that I get into race conditions, for example:
a file has just been renamed and FileRenamedHandler is triggered, but the same file would also be picked up by the UploadAllFilesInDirectory method. At that moment I may upload the same file twice, or I would get an exception when trying to delete it from disk because it has already been deleted.
I can see more race condition cases with this code.
Any idea how I can solve this?
Thanks
You can use a ConcurrentDictionary to keep track of the items currently being processed, and let it worry about the thread safety.
Create the dictionary in which the key is the file path (or some other identifying object) and the value is...whatever. We're treating this as a set, not a dictionary, but there is no ConcurrentSet, so this will have to do.
Then for each file you have to process call TryAdd. If it returns true you added the object, and you can process the file. If it returns false then the file was there, and it's being processed elsewhere.
You can then remove the object when you're done processing it:
//store this somewhere
var dic = new ConcurrentDictionary<string, string>();
//to process each file
if (dic.TryAdd(path, path))
{
//process the file at "path"
dic.TryRemove(path, out path);
}
I would suggest building a queue and storing the files to be uploaded as jobs in that queue. When you process the items in the queue, you can check that each file still exists before trying to upload it.
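A rough sketch of that idea, using a ConcurrentQueue with a single consumer so the rename handler and the startup scan can never race over the same file (UploadLogAsync stands in for your own upload call):
var queue = new ConcurrentQueue<string>();

// both FileRenamedHandler and UploadAllFilesInDirectory only enqueue paths
void Enqueue(string path) => queue.Enqueue(path);

// a single consumer drains the queue, so no two uploads compete for one file
async Task ProcessQueueAsync()
{
    while (queue.TryDequeue(out var path))
    {
        if (!File.Exists(path))       // it may already have been uploaded and deleted
            continue;

        await UploadLogAsync(path);   // assumed upload call
        File.Delete(path);
    }
}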
Something tells me this might be a stupid question and I have in fact approached my problem from the wrong direction, but here goes.
I have some code that loops through all the documents in a folder. The alphabetical order of these documents in each folder is important, and that order must also be reflected in the order the documents are printed. Here is a simplified version:
var wordApp = new Microsoft.Office.Interop.Word.Application();
foreach (var file in Directory.EnumerateFiles(folder))
{
fileCounter++;
// Print file, referencing a previously instantiated word application object
wordApp.Documents.Open(...)
wordApp.PrintOut(...)
wordApp.ActiveDocument.Close(...)
}
It seems (and I could be wrong) that the PrintOut code is asynchronous, and the application sometimes gets into a situation where the documents get printed out of order. This is confirmed because if I step through, or place a long enough Sleep() call, the order of all the files is correct.
How should I prevent the next print task from starting before the previous one has finished?
I initially thought that I could use a lock(someObject){} until I remembered that they are only useful for preventing multiple threads accessing the same code block. This is all on the same thread.
There are some events I can wire into on the Microsoft.Office.Interop.Word.Application object: DocumentOpen, DocumentBeforeClose and DocumentBeforePrint
I have just thought that this might actually be a problem with the print queue not being able to accurately distinguish lots of documents that are added within the same second. This can't be the problem, can it?
As a side note, this loop is within the code called from the DoWork event of a BackgroundWorker object. I'm using this to prevent UI blocking and to feedback the progress of the process.
Your event-handling approach seems like a good one. Instead of using a loop, you could add a handler to the DocumentBeforeClose event, in which you would get the next file to print, send it to Word, and continue. Something like this:
List<string> m_files = Directory.EnumerateFiles(folder).ToList();
wordApp.DocumentBeforeClose += ProcessNextDocument;
...
void ProcessNextDocument(...)
{
string file = null;
lock(m_files)
{
if (m_files.Count > 0)
{
file = m_files[0]; // take from the front so the original (alphabetical) order is preserved
m_files.RemoveAt(0);
}
else
{
// Done!
}
}
if (file != null)
{
PrintDocument(file);
}
}
void PrintDocument(string file)
{
wordApp.Documents.Open(...);
wordApp.PrintOut(...);
wordApp.ActiveDocument.Close(...);
}
The first parameter of Application.PrintOut specifies whether printing should take place in the background. By setting it to false, the call works synchronously.
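For example (a sketch using the C# 4+ named/optional arguments for COM calls; pathToFile is a placeholder and the remaining optional arguments are left out):
var doc = wordApp.Documents.Open(pathToFile);
doc.PrintOut(Background: false);     // false = print in the foreground, so the call waits
doc.Close(SaveChanges: false);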
I have a function that reads a file, adds some of its strings to a list, and returns that list. Because I wanted nobody and nothing to be able to change or delete the file I was currently reading, I locked it. Everything was fine; I did it somewhat like this:
public static List<string> Read(string myfile)
{
using (FileStream fs = File.Open(myfile, FileMode.Open, FileAccess.Read, FileShare.None))
{
//read lines, add string to a list
//return list
}
}
That's fine. Now I have another function in another class that does stuff with the list, calls other functions, and so on. Sometimes I want to move the file that I was reading. And here is the problem: because I'm now in a new function and Read(string myfile) has already completed, the file is no longer locked.
//in another class
public static void DoStuff(/*somefile*/)
{
List<string> lines = Read(/*somefile*/);
//the file (somefile) is no longer locked!
//do stuff
if (something)
Move(/*somefile*/); //could get an error; the file may no longer be there, or it may have changed...
}
So another function or user could change the file, rename it, delete it, or whatever, and then I am not able to move it. Or I will move the changed file, but I don't want that. If I used threading, another thread running the same function could lock the file again and I could not move it.
That's why I somehow need to lock this file for a longer time. Is there an easy way? Or do I have to replace my using (FileStream fs = File.Open(myfile, FileMode.Open, FileAccess.Read, FileShare.None)) code? Any suggestions? Thank you.
If you want to keep the file locked for longer, then you need to refactor your code so that the stream object is kept around for longer. I would change the Read method to accept a FileStream, a little bit like this:
using (FileStream fs = File.Open(myfile, FileMode.Open, FileAccess.Read, FileShare.None))
{
List<string> lines = Read(fs);
if (something)
{
File.Move(/* somefile */);
}
}
The problem you are going to have is that the File.Move call is going to fail, because the file is already locked (by you, but File.Move doesn't know that).
Depending on what exactly you want to do, it might be possible to work out a way of keeping the file locked while also "moving" it (for example, if you know in advance that you will need this, you could open the file specifying FileOptions.DeleteOnClose and write a new file with the same contents in the desired destination). However, this isn't really the same as moving the file, so it all depends on what exactly you are trying to do.
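A rough sketch of that DeleteOnClose idea (destinationPath is an assumed variable for where the copy should end up):
// the source stays exclusively locked, and is deleted automatically when the stream closes
using (var source = new FileStream(myfile, FileMode.Open, FileAccess.Read,
    FileShare.None, 4096, FileOptions.DeleteOnClose))
using (var destination = File.Create(destinationPath))
{
    source.CopyTo(destination); // write a new file with the same contents
}
// after the using block: the copy exists at destinationPath and the original is gone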
In general, such things are almost always more trouble than they are worth. You are better off unlocking the file just before you move it and catching/handling any exception that is thrown as a result of the move.
The only way you could keep it locked is to keep it exclusively open, like you have done in your code.
Maybe you need to do the //do stuff part within your using statement, and then call Move straight after.
No amount of locking will prevent this. A lock only protects the data in the file. The user (or any other program) can still move or rename the file. The file's metadata, including its name, time stamps, and attributes, is stored separately and can be changed at will.
This is just something you'll have to deal with in any Windows program. It is rare enough that simply catching the exception is good enough to let you know that something happened to the file. The user will rarely be surprised. If you really need to know up front then you can use FileSystemWatcher to get a notification when it happens.
You are locking the file only while the Read method is running.
If you want to keep it locked and release it only when you decide, create your own OpenFile(string filename) and CloseFile(string filename) methods, and remove the using statement from the Read method.
Open it when you start working (lock). Read it when you need it. When you have to move it, simply create a new file with the same name and copy the content. Close the original file (unlock) and delete it.
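A rough sketch of that approach (the _openFiles dictionary, the MoveFile method, and its destination parameter are assumptions for illustration):
static readonly Dictionary<string, FileStream> _openFiles = new Dictionary<string, FileStream>();

public static void OpenFile(string filename)   // lock: open exclusively and remember the stream
{
    _openFiles[filename] = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None);
}

public static void MoveFile(string filename, string destination)
{
    FileStream fs = _openFiles[filename];
    fs.Position = 0;                           // rewind in case the file was already read
    using (var target = File.Create(destination))
    {
        fs.CopyTo(target);                     // copy the content to the new location
    }
    CloseFile(filename);                       // release the lock...
    File.Delete(filename);                     // ...then delete the original
}

public static void CloseFile(string filename)  // unlock
{
    _openFiles[filename].Dispose();
    _openFiles.Remove(filename);
}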