On my web API, I want to delete a folder after my return statement.
public string Post(HttpRequestMessage request)
{
    // Do cool stuff with request
    try
    {
        return "10.0.2.2:8080/myFolder/index.html";
    }
    finally
    {
        Thread.Sleep(60000);
        Directory.Delete(myFolder, true);
    }
}
What I expected is that the device making the POST would receive the response and load the HTML file. After a minute, we delete the folder to free space on the server.
What actually happens is that the response is only sent after the finally block runs.
How can I run code after a return statement with delay, without delaying the return?
There is no way to do that and it would be bad anyway, since you would keep resources (webserver threads for example) busy waiting.
Write another process (preferably a Windows service if you are on Windows) that checks the directory periodically and deletes all files of a certain age.
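As a rough sketch of that cleanup process (the root path and the one-minute age threshold here are assumptions, not from the question), a periodic timer inside a service could look like:

```csharp
using System;
using System.IO;
using System.Threading;

class FolderCleaner
{
    private Timer _timer;
    private readonly string _root = @"C:\inetpub\myFolders"; // assumed path

    public void Start()
    {
        // Check every minute; delete folders older than one minute.
        _timer = new Timer(_ => CleanOldFolders(TimeSpan.FromMinutes(1)),
                           null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
    }

    private void CleanOldFolders(TimeSpan maxAge)
    {
        foreach (string dir in Directory.EnumerateDirectories(_root))
        {
            if (DateTime.Now - Directory.GetCreationTime(dir) > maxAge)
            {
                try { Directory.Delete(dir, true); }
                catch (IOException) { /* still in use; retry on the next tick */ }
            }
        }
    }
}
```

Because the deletion happens in its own process (or at least its own timer thread), the web request returns immediately and no request thread is held hostage for a minute.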
You can't execute code after return.
In my opinion, you can instead check the files periodically and delete any that were created more than a minute ago:
DirectoryInfo objDirectoryInfo = new DirectoryInfo(Server.MapPath("~/files"));
foreach (FileInfo file in objDirectoryInfo.GetFiles())
{
    if (file.LastAccessTime < DateTime.Now.AddMinutes(-1))
    {
        file.Delete();
    }
}
Related
I have a web form with a button. When you click that button, it creates a text file and writes something to it. Imagine I am writing something large, say 1 GB of content, which changes once a day. This is an ASP.NET application used by many users. Suppose the first user clicks at 6 o'clock in the morning and the file is generated. I want to reuse it for the other users, rather than creating a new one, until 6 o'clock the next morning. How can I do that? I am posting a small prototype:
try
{
    File.WriteAllText("E:\\test.txt", "welcome");
}
catch (Exception ex)
{
    Response.Write(ex.Message);
}
NB: This is an ASP.NET application, so I can't rely on a dedicated background thread. That is, I am not considering something like:
while (true)
{
    Thread.Sleep(...); // etc.
}
Use the File.GetLastWriteTime method to check when the file was last modified:
string path = "E:\\test.txt";
try
{
    if (!File.Exists(path))
    {
        File.WriteAllText(path, "welcome");
    }
    else
    {
        if (File.GetLastWriteTime(path).Date != DateTime.Now.Date)
        {
            // code for the next day
        }
    }
}
catch (Exception ex)
{
    Response.Write(ex.Message);
}
Assuming you are making a new file every day, and already have delete logic in place at the end of the day.
Check and see if the file exists before you create it.
try
{
    if (!File.Exists("E:\\test.txt"))
    {
        File.WriteAllText("E:\\test.txt", "welcome");
    }
}
catch (Exception ex)
{
    Response.Write(ex.Message);
}
You could also check on the date of the file and if outside of your parameters then delete and create a new one (in the same if condition as the 'exists' logic).
Perhaps you should try using an Application variable to store the last time the file was written (a date value) and just make sure that the file is only ever written once per day. For example:
Dim dt As DateTime
If Application("LastFileWrite") IsNot Nothing Then
    dt = CDate(Application("LastFileWrite"))
    If dt.Date <> Now.Date Then
        ' we're on a different day today, go ahead and write the file here
    End If
Else
    ' we've never written this application variable, this is
    ' the first run, go ahead and write the file here as well
End If
For more information about the Application state, take a look at the following documentation:
https://msdn.microsoft.com/en-us/library/bf9xhdz4(v=vs.71).aspx
This should prevent two or more threads from writing the same file twice.
The first thread to grab the lock will create the file, then the other threads will skip creating the file with the second check of the file inside the lock.
public static object fileLock = new object();

public void createFile()
{
    string path = "E:\\test.txt";
    if (File.Exists(path) == false)
    {
        lock (fileLock)
        {
            if (File.Exists(path) == false)
            {
                File.WriteAllText(path, "welcome");
            }
        }
    }
}
When a file is created (FileSystemWatcher_Created) in one directory, I copy it to another. But when I create a big (>10 MB) file, the copy fails because it starts while the file is still being written.
This raises the error Cannot copy the file, because it's used by another process. ;(
Any help?
class Program
{
    static void Main(string[] args)
    {
        string path = @"D:\levan\FolderListenerTest\ListenedFolder";
        FileSystemWatcher listener;

        listener = new FileSystemWatcher(path);
        listener.Created += new FileSystemEventHandler(listener_Created);
        listener.EnableRaisingEvents = true;

        while (Console.ReadLine() != "exit") ;
    }

    public static void listener_Created(object sender, FileSystemEventArgs e)
    {
        Console.WriteLine
        (
            "File Created:\n"
            + "ChangeType: " + e.ChangeType
            + "\nName: " + e.Name
            + "\nFullPath: " + e.FullPath
        );
        File.Copy(e.FullPath, @"D:\levan\FolderListenerTest\CopiedFilesFolder\" + e.Name);
        Console.Read();
    }
}
There is only a workaround for the issue you are facing: check whether the file is in use before starting the copy. Call the following function until it returns false.
1st Method, copied directly from this answer:
private bool IsFileLocked(FileInfo file)
{
    FileStream stream = null;
    try
    {
        stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    }
    catch (IOException)
    {
        // The file is unavailable because it is:
        // - still being written to,
        // - being processed by another thread, or
        // - does not exist (has already been processed)
        return true;
    }
    finally
    {
        if (stream != null)
            stream.Close();
    }

    // File is not locked
    return false;
}
2nd Method:
const int ERROR_SHARING_VIOLATION = 32;
const int ERROR_LOCK_VIOLATION = 33;

private bool IsFileLocked(string file)
{
    // Check that the problem is not in the destination file
    if (File.Exists(file) == true)
    {
        FileStream stream = null;
        try
        {
            stream = File.Open(file, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
        }
        catch (Exception ex2)
        {
            //_log.WriteLog(ex2, "Error in checking whether file is locked " + file);
            int errorCode = Marshal.GetHRForException(ex2) & ((1 << 16) - 1);
            if ((ex2 is IOException) && (errorCode == ERROR_SHARING_VIOLATION || errorCode == ERROR_LOCK_VIOLATION))
            {
                return true;
            }
        }
        finally
        {
            if (stream != null)
                stream.Close();
        }
    }
    return false;
}
From the documentation for FileSystemWatcher:
The OnCreated event is raised as soon as a file is created. If a file
is being copied or transferred into a watched directory, the
OnCreated event will be raised immediately, followed by one or more
OnChanged events.
So, if the copy fails, (catch the exception), add it to a list of files that still need to be moved, and attempt the copy during the OnChanged event. Eventually, it should work.
Something like (incomplete; catch specific exceptions, initialize variables, etc):
public static void listener_Created(object sender, FileSystemEventArgs e)
{
    Console.WriteLine
    (
        "File Created:\n"
        + "ChangeType: " + e.ChangeType
        + "\nName: " + e.Name
        + "\nFullPath: " + e.FullPath
    );
    try
    {
        File.Copy(e.FullPath, @"D:\levani\FolderListenerTest\CopiedFilesFolder\" + e.Name);
    }
    catch
    {
        _waitingForClose.Add(e.FullPath);
    }
    Console.Read();
}

public static void listener_Changed(object sender, FileSystemEventArgs e)
{
    if (_waitingForClose.Contains(e.FullPath))
    {
        try
        {
            File.Copy(...);
            _waitingForClose.Remove(e.FullPath);
        }
        catch { }
    }
}
It's an old thread, but I'll add some info for other people.
I experienced a similar issue with a program that writes PDF files, sometimes they take 30 seconds to render.. which is the same period that my watcher_FileCreated class waits before copying the file.
The files were not locked.
In this case I checked the size of the PDF and then waited 2 seconds before comparing the new size, if they were unequal the thread would sleep for 30 seconds and try again.
You're actually in luck - the program writing the file locks it, so you can't open it. If it hadn't locked it, you would have copied a partial file, without having any idea there's a problem.
When you can't access a file, you can assume it's still in use (better yet - try to open it in exclusive mode, and see if someone else is currently opening it, instead of guessing from the failure of File.Copy). If the file is locked, you'll have to copy it at some other time. If it's not locked, you can copy it (there's slight potential for a race condition here).
When is that 'other time'? I don't remember when FileSystemWatcher sends multiple events per file - check it out; it might be enough for you to simply ignore the event and wait for another one. If not, you can always set up a timer and recheck the file in 5 seconds.
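If you go the timer route, a minimal sketch might look like the following (the 5-second interval, target directory, and the `_pending` set are assumptions for illustration):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Timers;

class RetryCopier
{
    private readonly HashSet<string> _pending = new HashSet<string>(); // files that failed to copy
    private readonly Timer _timer = new Timer(5000); // recheck every 5 seconds

    public RetryCopier()
    {
        _timer.Elapsed += (s, e) => RetryPending();
        _timer.Start();
    }

    // Called from the watcher's Created handler when the first copy attempt fails.
    public void Add(string path) { lock (_pending) _pending.Add(path); }

    private void RetryPending()
    {
        lock (_pending)
        {
            _pending.RemoveWhere(path =>
            {
                try
                {
                    File.Copy(path, Path.Combine(@"D:\target", Path.GetFileName(path)));
                    return true;  // copied successfully; drop from the pending set
                }
                catch (IOException)
                {
                    return false; // still locked; keep it for the next tick
                }
            });
        }
    }
}
```

The lock around `_pending` matters because the watcher's event handlers and the timer's Elapsed handler run on different threadpool threads.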
Well, you already gave the answer yourself: you have to wait for the creation of the file to finish. One way to do this is to check whether the file is still in use. An example of this can be found here: Is there a way to check if a file is in use?
Note that you will have to modify this code for it to work in your situation. You might want to have something like (pseudocode):
public static void listener_Created()
{
    while CheckFileInUse()
        wait 1000 milliseconds
    CopyFile()
}
Obviously you should protect yourself from an infinite while just in case the owner application never releases the lock. Also, it might be worth checking out the other events from FileSystemWatcher you can subscribe to. There might be an event which you can use to circumvent this whole problem.
When the file is being written in binary (byte by byte), opening a FileStream and the solutions above don't work, because the file exists and has content at every byte written so far. In this situation you need a different workaround, like this:
Do this when the file is created, or when you want to start processing it:
long fileSize = 0;
FileInfo currentFile = new FileInfo(path);

while (fileSize < currentFile.Length) // loop until the size stops growing
{
    fileSize = currentFile.Length;       // remember the current size
    System.Threading.Thread.Sleep(500);  // wait a moment for the copy to progress
    currentFile.Refresh();               // refresh the cached length value
}
// Now the file is ready for any processing!
So, having glanced quickly through some of these and other similar questions I went on a merry goose chase this afternoon trying to solve a problem with two separate programs using a file as a synchronization (and also file save) method. A bit of an unusual situation, but it definitely highlighted for me the problems with the 'check if the file is locked, then open it if it's not' approach.
The problem is this: the file can become locked between the time that you check it and the time you actually open it. It's really hard to track down the sporadic Cannot copy the file, because it's used by another process error if you aren't looking for it, too.
The basic resolution is to just try to open the file inside a try/catch block so that if it's locked, you can try again. That way there is no elapsed time between the check and the opening; the OS does them at the same time.
The code here uses File.Copy, but it works just as well with any of the static methods of the File class: File.Open, File.ReadAllText, File.WriteAllText, etc.
/// <param name="timeout">how long to keep trying, in milliseconds</param>
static void safeCopy(string src, string dst, int timeout)
{
    while (timeout > 0)
    {
        try
        {
            File.Copy(src, dst);

            // Don't forget to either return from the function or break out of the while loop
            break;
        }
        catch (IOException)
        {
            // You could do the sleep in here, but it's probably a good idea to
            // exit the error handler as soon as possible
        }
        Thread.Sleep(100);

        // If it's a very long wait this will accumulate very small errors.
        // For most things that's probably fine, but if you need precision over a
        // long time span, consider using some sort of timer or DateTime.Now as a
        // better alternative
        timeout -= 100;
    }
}
Another small note on parallelism:
This is a synchronous method, which will block its thread both while waiting and while working. This is the simplest approach, but if the file remains locked for a long time your program may become unresponsive. Parallelism is too big a topic to go into in depth here (and the number of ways you could set up asynchronous read/write is kind of preposterous), but here is one way it could be parallelized.
public class FileEx
{
    // Returns Task (not void) so callers can await the copy.
    public static async Task CopyWaitAsync(string src, string dst, int timeout, Action doWhenDone)
    {
        while (timeout > 0)
        {
            try
            {
                File.Copy(src, dst);
                doWhenDone();
                break;
            }
            catch (IOException) { }

            await Task.Delay(100);
            timeout -= 100;
        }
    }

    public static async Task<string> ReadAllTextWaitAsync(string filePath, int timeout)
    {
        while (timeout > 0)
        {
            try
            {
                return File.ReadAllText(filePath);
            }
            catch (IOException) { }

            await Task.Delay(100);
            timeout -= 100;
        }
        return "";
    }

    public static async Task WriteAllTextWaitAsync(string filePath, string contents, int timeout)
    {
        while (timeout > 0)
        {
            try
            {
                File.WriteAllText(filePath, contents);
                return;
            }
            catch (IOException) { }

            await Task.Delay(100);
            timeout -= 100;
        }
    }
}
And here is how it could be used:
public static void Main()
{
    test_FileEx();
    Console.WriteLine("Me First!");
}

public static async void test_FileEx()
{
    await Task.Delay(1);

    // You can do this, but it gives a compiler warning because it can potentially
    // return immediately without finishing the copy.
    // As a side note, if the file is not locked this will not return until the copy
    // operation completes. Async functions run synchronously until the first 'await'.
    // See the documentation for async: https://msdn.microsoft.com/en-us/library/hh156513.aspx
    FileEx.CopyWaitAsync("file1.txt", "file1.bat", 1000, () => { });

    // This is the normal way of using this kind of async function. Execution of the
    // following lines will always occur AFTER the copy finishes.
    await FileEx.CopyWaitAsync("file1.txt", "file1.readme", 1000, () => { });
    Console.WriteLine("file1.txt copied to file1.readme");

    // The following line doesn't cause a compiler error, but it doesn't make any sense either.
    FileEx.ReadAllTextWaitAsync("file1.readme", 1000);

    // To get the return value of the function, you have to use it with the await keyword.
    string text = await FileEx.ReadAllTextWaitAsync("file1.readme", 1000);
    Console.WriteLine("file1.readme says: " + text);
}
//Output:
//Me First!
//file1.txt copied to file1.readme
//file1.readme says: Text to be duplicated!
You can use the following code to check if the file can be opened with exclusive access (that is, it is not opened by another application). If the file isn't closed, you could wait a few moments and check again until the file is closed and you can safely copy it.
You should still check if File.Copy fails, because another application may open the file between the moment you check the file and the moment you copy it.
public static bool IsFileClosed(string filename)
{
    try
    {
        using (var inputStream = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return true;
        }
    }
    catch (IOException)
    {
        return false;
    }
}
I would like to add an answer here, because this worked for me. I used time delays, while loops, everything I could think of.
I had the Windows Explorer window of the output folder open. I closed it, and everything worked like a charm.
I hope this helps someone.
What I would like to do, and have worked towards developing, is a standard class which I can use for retrieving all sub-directories (and their sub directories and files, and so on) and files.
WalkthroughDir(Dir)
    Files a
    Folders b
    WalkthroughDir(b[i])
A straightforward recursive directory search.
Using this as a basis I wanted to extend it to fire events when:
A file is found;
A directory is found;
The search is completed
private void GetDirectories(string path)
{
    GetFiles(path);
    foreach (string dir in Directory.EnumerateDirectories(path))
    {
        if (DirectoryFound != null)
        {
            IOEventArgs<DirectoryInfo> args = new IOEventArgs<DirectoryInfo>(new DirectoryInfo(dir));
            DirectoryFound(this, args);
        }
        // do something with the directory...
        GetDirectories(dir);
    }
}

private void GetFiles(string path)
{
    foreach (string file in Directory.EnumerateFiles(path))
    {
        if (FileFound != null)
        {
            IOEventArgs<FileInfo> args = new IOEventArgs<FileInfo>(new FileInfo(file));
            FileFound(this, args);
        }
        // do something with the file...
    }
}
Where you find the comments above ("do something[...]") is where I might add the file or directory to some data structure.
The most common factor in doing this type of search though is the processing time, particularly for large directories. So naturally I wanted to take this yet another step forward and implement threading. Now, my knowledge of threading is pretty limited but so far this is an outline of what I've come up with:
public void Search()
{
    m_searchThread = new Thread(new ThreadStart(SearchThread));
    m_searching = true;
    m_searchThread.Start();
}

private void SearchThread()
{
    GetDirectories(m_path);
    m_searching = false;
}
If I use this implementation and assign the events in a control, it throws errors (as I expected) saying that my GUI application is trying to access a control from another thread.
Could anyone give feedback on this implementation, as well as on how to accomplish the threading? Thanks.
UPDATE (selkathguy recommendation):
This is the adjusted code following selkathguy's recommendation:
private void GetDirectories(DirectoryInfo path)
{
    GetFiles(path);
    foreach (DirectoryInfo dir in path.GetDirectories())
    {
        if (DirectoryFound != null)
        {
            IOEventArgs<DirectoryInfo> args = new IOEventArgs<DirectoryInfo>(dir);
            DirectoryFound(this, args);
        }
        // do something with the directory...
        GetDirectories(dir);
    }
}

private void GetFiles(DirectoryInfo path)
{
    foreach (FileInfo file in path.GetFiles())
    {
        if (FileFound != null)
        {
            IOEventArgs<FileInfo> args = new IOEventArgs<FileInfo>(file);
            FileFound(this, args);
        }
        // do something with the file...
    }
}
Original code time taken: 47.87s
Altered code time taken: 46.14s
To address the first part of your request about raising your own events from the standard class: you can create a delegate to which other methods can be hooked as callbacks for the event. Please see http://msdn.microsoft.com/en-us/library/aa645739(v=vs.71).aspx as a good resource. It's fairly trivial to implement.
As for threading, I believe it is unnecessary, at least for your performance concerns. Most of the performance bottleneck in recursively checking directories is waiting for the node information to load from disk. Relatively speaking, that is what takes all of your time, as fetching directory info is a blocking operation. Making numerous threads that all check different directories can easily slow down the overall speed of your search, and it tremendously complicates your application with the management of worker threads and the delegation of work. With that said, having a thread per disk might be desirable if your search spans multiple disks or resource locations.
I have found that something as simple as recursion using DirectoryInfo.GetDirectories() was one of the fastest solutions, as it takes advantage of the caching that Windows already does. A search application I made using it can search tens of thousands of filenames and directory names per second.
When a file is being copied to the file watcher folder, how can I identify whether the file is completely copied and ready to use? Because I am getting multiple events during file copy. (The file is copied via another program using File.Copy.)
When I ran into this problem, the best solution I came up with was to continually try to get an exclusive lock on the file; while the file is being written, the locking attempt will fail, essentially the method in this answer. Once the file isn't being written to any more, the lock will succeed.
Unfortunately, the only way to do that is to wrap a try/catch around opening the file, which makes me cringe - having to use try/catch is always painful. There just doesn't seem to be any way around that, though, so it's what I ended up using.
Modifying the code in that answer does the trick, so I ended up using something like this:
private void WaitForFile(FileInfo file)
{
    FileStream stream = null;
    bool FileReady = false;
    while (!FileReady)
    {
        try
        {
            using (stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None))
            {
                FileReady = true;
            }
        }
        catch (IOException)
        {
            // File isn't ready yet, so we need to keep on waiting until it is.
        }

        // We'll want to wait a bit between polls, if the file isn't ready.
        if (!FileReady) Thread.Sleep(1000);
    }
}
Here is a method that will retry file access up to X number of times, with a Sleep between tries. If it never gets access, the application moves on:
private static bool GetIdleFile(string path)
{
    var fileIdle = false;
    const int MaximumAttemptsAllowed = 30;
    var attemptsMade = 0;

    while (!fileIdle && attemptsMade <= MaximumAttemptsAllowed)
    {
        try
        {
            using (File.Open(path, FileMode.Open, FileAccess.ReadWrite))
            {
                fileIdle = true;
            }
        }
        catch
        {
            attemptsMade++;
            Thread.Sleep(100);
        }
    }

    return fileIdle;
}
It can be used like this:
private void WatcherOnCreated(object sender, FileSystemEventArgs e)
{
    if (GetIdleFile(e.FullPath))
    {
        // Do something like...
        foreach (var line in File.ReadAllLines(e.FullPath))
        {
            // Do more...
        }
    }
}
I had this problem when writing a file. I got events before the file was fully written and closed.
The solution is to use a temporary filename and rename the file once finished. Then watch for the file rename event instead of file creation or change event.
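A sketch of that rename approach (the paths and the .tmp extension here are assumptions): the writer writes under a temporary name and renames when finished, so the watcher only ever sees complete files.

```csharp
using System.IO;

// Writer side: write to a temporary name, then rename when finished.
File.WriteAllText(@"C:\watched\report.txt.tmp", "large content...");
File.Move(@"C:\watched\report.txt.tmp", @"C:\watched\report.txt");

// Watcher side: react only to the rename, when the file is fully written.
var watcher = new FileSystemWatcher(@"C:\watched", "*.txt");
watcher.Renamed += (s, e) =>
{
    // e.FullPath is the final name; the content is complete at this point.
    File.Copy(e.FullPath, Path.Combine(@"C:\copied", e.Name));
};
watcher.EnableRaisingEvents = true;
```

On NTFS a rename within the same volume is atomic, which is why the Renamed event is a reliable "file is done" signal here.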
Note: this problem is not solvable in the general case. Without prior knowledge of how the file is used, you can't know whether other program(s) have finished operating on it.
In your particular case, you should be able to figure out what operations File.Copy consists of.
Most likely the destination file is locked during the whole operation. In that case you should be able to simply try to open the file and handle the "sharing violation" exception.
You can also wait for some time... - a very unreliable option, but if you know the size range of the files, a reasonable delay may be enough to let Copy finish.
You can also "invent" some sort of transaction system - i.e. create another file like "destination_file_name.COPYLOCK", which the program that copies the file would create before copying "destination_file_name" and delete afterward.
private Stream ReadWhenAvailable(FileInfo finfo, TimeSpan? ts = null) => Task.Run(() =>
{
    ts = ts ?? new TimeSpan(long.MaxValue);
    var start = DateTime.Now;
    while (DateTime.Now - start < ts)
    {
        Thread.Sleep(200);
        try
        {
            return new FileStream(finfo.FullName, FileMode.Open);
        }
        catch { }
    }
    return null;
})
.Result;
...of course, you can modify aspects of this to suit your needs.
One possible solution (it worked in my case) is to use the Changed event. You can log the name of the file just created in the Created event, then catch the Changed event and verify whether the file was just created. When I manipulated the file in the Changed event, it didn't throw the "File is in use" error.
If you are doing some sort of inter-process communication, as I do, you may want to consider this solution:
App A writes the file you are interested in, eg "Data.csv"
When done, app A writes a 2nd file, eg. "Data.confirmed"
In your C# app B make the FileWatcher listen to "*.confirmed" files. When you get this event you can safely read "Data.csv", as it is already completed by app A.
(Edit: inspired by comments) Delete the *.confirmed file with app B when done processing the "Data.csv" file.
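A minimal sketch of app B's side of this protocol (the exchange directory and file names are assumptions for illustration):

```csharp
using System.IO;

var watcher = new FileSystemWatcher(@"C:\exchange", "*.confirmed");
watcher.Created += (s, e) =>
{
    // "Data.confirmed" appearing means "Data.csv" is complete and safe to read.
    string csvPath = Path.ChangeExtension(e.FullPath, ".csv");
    string[] lines = File.ReadAllLines(csvPath);

    // ...process the lines...

    // Delete the marker so app A knows processing is finished.
    File.Delete(e.FullPath);
};
watcher.EnableRaisingEvents = true;
```

The key property is ordering: app A must close "Data.csv" before it creates the ".confirmed" marker, so the marker's Created event can never fire while the data file is still being written.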
I have solved this issue with two features:
Implement the MemoryCache pattern seen in this question: A robust solution for FileSystemWatcher firing events multiple times
Implement a try\catch loop with a timeout for access
You need to collect average copy times in your environment and set the memory cache timeout to be at least as long as the shortest lock time on a new file. This eliminates duplicates in your processing directive and allows some time for the copy to finish. You will have much better success on first try, which means less time spent in the try\catch loop.
Here is an example of the try\catch loop:
public static IEnumerable<string> GetFileLines(string theFile)
{
    DateTime startTime = DateTime.Now;
    TimeSpan timeOut = TimeSpan.FromSeconds(TimeoutSeconds);
    TimeSpan timePassed;
    do
    {
        try
        {
            return File.ReadLines(theFile);
        }
        catch (FileNotFoundException ex)
        {
            EventLog.WriteEntry(ProgramName, "File not found: " + theFile, EventLogEntryType.Warning, ex.HResult);
            return null;
        }
        catch (PathTooLongException ex)
        {
            EventLog.WriteEntry(ProgramName, "Path too long: " + theFile, EventLogEntryType.Warning, ex.HResult);
            return null;
        }
        catch (DirectoryNotFoundException ex)
        {
            EventLog.WriteEntry(ProgramName, "Directory not found: " + theFile, EventLogEntryType.Warning, ex.HResult);
            return null;
        }
        catch (Exception ex)
        {
            // We swallow all other exceptions here so we can try again
            EventLog.WriteEntry(ProgramName, ex.Message, EventLogEntryType.Warning, ex.HResult);
        }
        Task.Delay(777).Wait();
        timePassed = DateTime.Now.Subtract(startTime);
    }
    while (timePassed < timeOut);

    EventLog.WriteEntry(ProgramName, "Timeout after waiting " + timePassed + " to read " + theFile, EventLogEntryType.Warning, 258);
    return null;
}
Where TimeoutSeconds is a setting that you can put wherever you hold your settings. This can be tuned for your environment.
Suppose we have c:\d1\d2\d3\... where d3 contains many files and directories.
We want to move all items in d3 to c:\d1\new\.
How do we do it cleanly and safely?
c:\d1\new exists!
If c:\d1\new does not exist yet, and you don't want to keep an empty c:\d1\d2\d3 folder afterward, you can use the Directory.Move() method:
using System.IO;

try {
    Directory.Move(@"c:\d1\d2\d3", @"c:\d1\new");
} catch (UnauthorizedAccessException) {
    // Permission denied, recover...
} catch (IOException) {
    // Other I/O error, recover...
}
If c:\d1\new does exist, you'll have to iterate over the contents of c:\d1\d2\d3 and move its files and folders one by one:
foreach (string item in Directory.GetFileSystemEntries(@"c:\d1\d2\d3")) {
    // GetFileSystemEntries already returns full paths
    string absoluteSource = item;
    string absoluteTarget = Path.Combine(@"c:\d1\new", Path.GetFileName(item));
    if ((File.GetAttributes(absoluteSource) & FileAttributes.Directory) != 0) {
        Directory.Move(absoluteSource, absoluteTarget);
    } else {
        File.Move(absoluteSource, absoluteTarget);
    }
}
Use Directory.Move
Also, MSDN has a handy table of what functions to use for Common I/O Tasks which is a good reference for questions like this.
try
{
    System.IO.Directory.Move(@"c:\d1\d2\d3\", @"c:\d1\new\");
}
catch (...)
{
}
The Move method can throw any of the following exceptions that depending on your usage may or may not be thrown. So you need to code the exception handler in a manner that suits your application.
System.IO.IOException
System.UnauthorizedAccessException
System.ArgumentException
System.ArgumentNullException
System.IO.PathTooLongException
System.IO.DirectoryNotFoundException
As an general example (you probably don't want/need to display message boxes on errors):
try
{
    System.IO.Directory.Move(@"c:\d1\d2\d3\", @"c:\d1\new\");
}
catch (System.UnauthorizedAccessException)
{
    MessageBox.Show("You do not have access to move these files/directories");
}
catch (System.IO.DirectoryNotFoundException)
{
    MessageBox.Show("The directory to move files/directories from was not found");
}
catch
{
    MessageBox.Show("Something blew up!");
}
Finally, it is worth mentioning that the call to Move will block the current thread until the move is complete. So if you are doing this from a UI, it will block the UI until the move is complete. This might take some time depending on how many files/directories are being moved. Therefore it might be prudent to run this in a separate thread and/or display a cycling progress bar.
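For example, the move could be pushed off the UI thread with Task.Run (a sketch; the paths and the console completion messages are assumptions):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

// Kick off the move in the background so the UI stays responsive.
Task.Run(() => Directory.Move(@"c:\d1\d2\d3", @"c:\d1\new"))
    .ContinueWith(t =>
    {
        if (t.IsFaulted)
            Console.WriteLine("Move failed: " + t.Exception.InnerException.Message);
        else
            Console.WriteLine("Move complete.");
    });
```

In a WinForms or WPF app the continuation would update the UI instead of writing to the console, which requires marshalling back to the UI thread (e.g. via TaskScheduler.FromCurrentSynchronizationContext()).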
Use Directory.Move.
Moves a file or a directory and its contents to a new location.