Question: I have an ASP.NET application which creates temporary PDF files (for the user to download).
Now, many users over many days can create many PDFs, which take up a lot of disk space.
What's the best way to schedule deletion of files older than 1 day / 8 hours?
Preferably in the ASP.NET application itself...
For each temporary file that you need to create, make a note of the filename in the session:
// create temporary file:
string fileName = System.IO.Path.GetTempFileName();
Session[string.Concat("temporaryFile", Guid.NewGuid().ToString("d"))] = fileName;
// TODO: write to file
Next, add the following cleanup code to global.asax:
<%@ Application Language="C#" %>

<script RunAt="server">
    void Session_End(object sender, EventArgs e) {
        // Code that runs when a session ends.
        // Note: The Session_End event is raised only when the sessionstate mode
        // is set to InProc in the Web.config file. If session mode is set to StateServer
        // or SQLServer, the event is not raised.

        // Remove files that have been uploaded, but not actively 'saved' or 'canceled' by the user.
        foreach (string key in Session.Keys) {
            if (key.StartsWith("temporaryFile", StringComparison.OrdinalIgnoreCase)) {
                try {
                    string fileName = (string)Session[key];
                    Session[key] = string.Empty;
                    if ((fileName.Length > 0) && (System.IO.File.Exists(fileName))) {
                        System.IO.File.Delete(fileName);
                    }
                } catch (Exception) { }
            }
        }
    }
</script>
UPDATE: I'm now actually using a new (improved) method instead of the one described above. The new one involves HttpRuntime.Cache and checking that the files are older than 8 hours. I'll post it here in case anyone's interested. Here's my new global.asax.cs:
using System;
using System.Web;
using System.Text;
using System.IO;
using System.Xml;
using System.Web.Caching;
public partial class global : System.Web.HttpApplication {

    protected void Application_Start() {
        RemoveTemporaryFiles();
        RemoveTemporaryFilesSchedule();
    }

    public void RemoveTemporaryFiles() {
        string pathTemp = "d:\\uploads\\";
        if ((pathTemp.Length > 0) && (Directory.Exists(pathTemp))) {
            foreach (string file in Directory.GetFiles(pathTemp)) {
                try {
                    FileInfo fi = new FileInfo(file);
                    if (fi.CreationTime < DateTime.Now.AddHours(-8)) {
                        File.Delete(file);
                    }
                } catch (Exception) { }
            }
        }
    }

    public void RemoveTemporaryFilesSchedule() {
        HttpRuntime.Cache.Insert("RemoveTemporaryFiles", string.Empty, null,
            DateTime.Now.AddHours(1), Cache.NoSlidingExpiration, CacheItemPriority.NotRemovable,
            delegate(string id, object o, CacheItemRemovedReason cirr) {
                if (id.Equals("RemoveTemporaryFiles", StringComparison.OrdinalIgnoreCase)) {
                    RemoveTemporaryFiles();
                    RemoveTemporaryFilesSchedule();
                }
            });
    }
}
Try using Path.GetTempPath(). It will give you the path to the Windows temp folder. Then it will be up to Windows to clean up :)
You can read more about the method here: http://msdn.microsoft.com/en-us/library/system.io.path.gettemppath.aspx
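A minimal sketch of that idea (pdfBytes is a placeholder for whatever your PDF generator produced, and the ".pdf" extension is just for readability):

using System.IO;

// Write the PDF into the system temp folder instead of a folder the application owns.
string tempFile = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".pdf");
File.WriteAllBytes(tempFile, pdfBytes);
// Serve tempFile to the user; cleanup of the temp folder is left to Windows / disk cleanup tools.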
The best way is to create a batch file that is called by the Windows Task Scheduler at the interval you want.
OR
You can create a Windows service with the class below:
using System.Threading;

public class CleanUpBot
{
    public bool KeepAlive;   // set to true before calling Run() to keep the loop going
    private Thread _cleanUpThread;

    public void Run()
    {
        _cleanUpThread = new Thread(StartCleanUp);
        _cleanUpThread.Start();   // the thread must be started, not just created
    }

    private void StartCleanUp()
    {
        do
        {
            // HERE THE LOGIC FOR DELETING FILES

            // Wait before the next pass (TIME_IN_MILLISECOND is whatever interval you choose).
            Thread.Sleep(TIME_IN_MILLISECOND);
        } while (KeepAlive);
    }
}
Notice that you can also call this class from Page_Load and it won't affect the request's processing time, because the work happens on another thread. Just remove the do-while loop and the delay.
How do you store the files? If possible, you could just go with a simple solution, where all files are stored in a folder named after the current date and time.
Then create a simple page or HTTP handler that deletes old folders. You could call this page at intervals using a Windows scheduled task or other cron job.
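A minimal sketch of such a cleanup handler, assuming the folders sit under an assumed root and are named with a sortable date format like yyyy-MM-dd (the handler name, root path, and format are all placeholders):

using System;
using System.Globalization;
using System.IO;
using System.Web;

public class CleanupHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string root = context.Server.MapPath("~/App_Data/pdf");   // assumed storage root
        foreach (string dir in Directory.GetDirectories(root))
        {
            DateTime created;
            // The folder name is assumed to be the creation date, e.g. "2010-06-21".
            if (DateTime.TryParseExact(Path.GetFileName(dir), "yyyy-MM-dd",
                    CultureInfo.InvariantCulture, DateTimeStyles.None, out created)
                && created < DateTime.Today.AddDays(-1))
            {
                try { Directory.Delete(dir, true); }
                catch (IOException) { /* a file may still be in use; try again next run */ }
            }
        }
    }
}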
Create a timer in Application_Start and schedule it to call a method every hour that flushes the files older than 8 hours, 1 day, or whatever duration you need.
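A minimal sketch of that, assuming the cleanup logic lives in a FlushOldFiles method you provide and the temp folder is the one from the answer above (keep a static reference to the timer so it isn't garbage collected):

using System;
using System.IO;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    private static Timer _cleanupTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Run once right away, then every hour.
        _cleanupTimer = new Timer(_ => FlushOldFiles(), null, TimeSpan.Zero, TimeSpan.FromHours(1));
    }

    private static void FlushOldFiles()
    {
        string tempPath = @"d:\uploads";   // assumed temp folder
        foreach (string file in Directory.GetFiles(tempPath))
        {
            try
            {
                if (File.GetCreationTime(file) < DateTime.Now.AddHours(-8))
                    File.Delete(file);
            }
            catch (IOException) { /* file still in use; the next run will get it */ }
        }
    }
}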
I sort of agree with what's said in the answer by dirk.
The idea is that the temp folder you drop the files into is a fixed, known location; however, I differ slightly...
Each time a file is created, add the filename to a list in the session object (assuming there aren't thousands; if there are, do the next bit once the list hits a given cap).
When the session ends, the Session_End event should be raised in global.asax. Iterate over all the files in the list and remove them (a sketch follows).
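A minimal sketch of that bookkeeping (the "tempFiles" session key is arbitrary; requires System.Collections.Generic and System.IO):

// Wherever a temporary file is created: remember it in the session.
var tempFiles = Session["tempFiles"] as List<string> ?? new List<string>();
tempFiles.Add(fileName);
Session["tempFiles"] = tempFiles;

// In global.asax (InProc session state only, as noted in the first answer):
void Session_End(object sender, EventArgs e)
{
    var tempFiles = Session["tempFiles"] as List<string>;
    if (tempFiles == null) return;
    foreach (string path in tempFiles)
    {
        try { if (File.Exists(path)) File.Delete(path); }
        catch (IOException) { /* ignore; a later sweep can pick it up */ }
    }
}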
private const string TEMPDIRPATH = @"C:\mytempdir\";
private const int DELETEAFTERHOURS = 8;

private void cleanTempDir()
{
    foreach (string filePath in Directory.GetFiles(TEMPDIRPATH))
    {
        FileInfo fi = new FileInfo(filePath);
        // Created or modified more than x hours ago? If not, continue to the next file.
        if (fi.LastWriteTime > DateTime.Now.AddHours(-DELETEAFTERHOURS))
        {
            continue;
        }
        try
        {
            File.Delete(filePath);
        }
        catch (Exception)
        {
            // Something happened and the file probably isn't deleted. Give it another shot next time.
        }
    }
}
The code above will remove the files in the temp directory that were created or modified more than 8 hours ago.
However, I would suggest another approach. As Fredrik Johansson suggested, you can delete the files created by the user when the session ends. Better still is to work with an extra directory in your temp directory based on the user's session ID. When the session ends you simply delete the directory created for that user.
private const string TEMPDIRPATH = @"C:\mytempdir\";

// One folder per session, as described above.
string tempDirUserPath = Path.Combine(TEMPDIRPATH, HttpContext.Current.Session.SessionID);

private void removeTempDirUser(string path)
{
    try
    {
        // Delete the per-user directory and everything in it.
        Directory.Delete(path, true);
    }
    catch (Exception)
    {
        // An exception occurred while deleting the directory.
    }
}
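To complete the picture, here is a minimal sketch of the other half: creating the per-session folder and hooking Session_End (the session key used to remember the path is an assumption):

// When the user first needs a temp file, create a folder for this session.
string tempDirUserPath = Path.Combine(TEMPDIRPATH, HttpContext.Current.Session.SessionID);
Directory.CreateDirectory(tempDirUserPath);        // no-op if it already exists
Session["tempDirUserPath"] = tempDirUserPath;      // remember it for cleanup

// In global.asax (InProc session state only, as noted earlier):
void Session_End(object sender, EventArgs e)
{
    var path = Session["tempDirUserPath"] as string;
    if (!string.IsNullOrEmpty(path))
    {
        removeTempDirUser(path);
    }
}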
Use the cache expiry notification to trigger file deletion:
private static void DeleteLater(string path)
{
    HttpContext.Current.Cache.Add(path, path, null, Cache.NoAbsoluteExpiration,
        new TimeSpan(0, 8, 0, 0), CacheItemPriority.NotRemovable, UploadedFileCacheCallback);
}

private static void UploadedFileCacheCallback(string key, object value, CacheItemRemovedReason reason)
{
    var path = (string)value;
    Debug.WriteLine(string.Format("Deleting uploaded file '{0}'", path));
    File.Delete(path);
}
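Usage would look something like this (a sketch; the call site is whatever code writes the temporary PDF):

string fileName = Path.GetTempFileName();
// TODO: write the PDF to fileName
DeleteLater(fileName);   // schedules deletion roughly 8 hours from now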
ref: MSDN | How to: Notify an Application When an Item Is Removed from the Cache
I wrote a program using CSOM to upload documents to SharePoint and insert metadata into the properties. Once in a while (like every 3 months) the SharePoint server gets busy, or we reset IIS, or there is some other communication problem, and we get "The operation has timed out" on clientContext.ExecuteQuery(). To resolve the issue I wrote an extension method for ExecuteQuery that retries every 10 seconds, up to 5 times, to connect to the server and execute the query.

My code works in the Dev and QA environments without any problem, but in Prod, when it fails the first time with the timeout error, on the second attempt it only uploads the document and doesn't update the properties, so all the properties are empty in the library. It doesn't return any error as a result of ExecuteQuery(), but it seems that of the two requests in the batch, which are uploading the file and updating the properties, it only does the uploading, and I don't know what happens to the properties. It kind of removes that from the batch on the second attempt!
I used both upload methods, docs.RootFolder.Files.Add and File.SaveBinaryDirect, in different parts of my code, but I'm copying just one of them here so you can see what I have.
I appreciate your help.
public static void ExecuteSharePointQuery(ClientContext context)
{
    int cnt = 0;
    bool isExecute = false;
    while (cnt < 5)
    {
        try
        {
            context.ExecuteQuery();
            isExecute = true;
            break;
        }
        catch (Exception ex)
        {
            cnt++;
            Logger.Error(string.Format("Communication attempt with SharePoint failed. Attempt {0}", cnt));
            Logger.Error(ex.Message);
            Thread.Sleep(10000);

            if (cnt == 5 && isExecute == false)
            {
                Logger.Error(string.Format("Couldn't execute the query in SharePoint."));
                Logger.Error(ex.Message);
                throw;
            }
        }
    }
}
public static void UploadSPFileWithProperties(string siteURL, string listTitle, FieldMapper item)
{
    Logger.Info(string.Format("Uploading to SharePoint: {0}", item.pdfPath));

    using (ClientContext clientContext = new ClientContext(siteURL))
    {
        using (FileStream fs = new FileStream(item.pdfPath, FileMode.Open))
        {
            try
            {
                FileCreationInformation fileCreationInformation = new FileCreationInformation();
                fileCreationInformation.ContentStream = fs;
                fileCreationInformation.Url = Path.GetFileName(item.pdfPath);
                fileCreationInformation.Overwrite = true;

                List docs = clientContext.Web.Lists.GetByTitle(listTitle);
                Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(fileCreationInformation);
                uploadFile.CheckOut();

                // Update the metadata
                ListItem listItem = uploadFile.ListItemAllFields;

                // Set field values on the item
                foreach (List<string> list in item.fieldMappings)
                {
                    if (list[FieldMapper.SP_VALUE_INDEX] != null)
                    {
                        TrySet(ref listItem, list[FieldMapper.SP_FIELD_NAME_INDEX],
                            (FieldType)Enum.Parse(typeof(FieldType), list[FieldMapper.SP_TYPE_INDEX]),
                            list[FieldMapper.SP_VALUE_INDEX]);
                    }
                }

                listItem.Update();
                uploadFile.CheckIn(string.Empty, CheckinType.OverwriteCheckIn);
                SharePointUtilities.ExecuteSharePointQuery(clientContext);
            }
            catch (Exception ex)
            {
            }
        }
    }
}
There are too many possible reasons for me to really comment on a solution, especially considering it only happens in the prod environment.
What I can say is that it's probably easiest to keep a reference to the last uploaded file. If your code fails, then check whether the last file has been uploaded correctly before retrying.
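For example, something along these lines could verify the last upload before re-running the whole batch (a sketch only; serverRelativeUrl and the field name are assumptions, and loading File.Exists requires a reasonably recent CSOM version):

// After a failed ExecuteQuery, check what actually made it to the server
// before blindly re-running the same batch.
Microsoft.SharePoint.Client.File file =
    clientContext.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
ListItem item = file.ListItemAllFields;
clientContext.Load(file, f => f.Exists);
clientContext.Load(item);
clientContext.ExecuteQuery();

bool fileUploaded = file.Exists;
bool propertiesSet = item["MyMetadataField"] != null;   // hypothetical field name

// If the file is there but the properties are not, retry only the metadata update.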
Side note: I'm not sure if this is relevant, but if it's a large file you'll want to upload it in slices.
I have an issue with the FileSystemWatcher. I'm using it in a Windows service to monitor certain folders, and when a file is copied, it processes that file using an SSIS package. Everything works fine, but every now and then the FileSystemWatcher picks up the same file and fires the Created event multiple times in an infinite loop. The code below works as follows:
Firstly, this method is called by the Windows service and creates a watcher:
private void CreateFileWatcherEvent(SSISPackageSetting packageSettings)
{
    // Create a new FileSystemWatcher and set its properties.
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.IncludeSubdirectories = false;
    watcher.Path = packageSettings.FileWatchPath;

    /* Watch for changes in LastAccess and LastWrite times, and
       the renaming of files or directories. */
    watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
        | NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.Size;

    // Watch for all files
    watcher.Filter = "*.*";

    watcher.Created += (s, e) => FileCreated(e, packageSettings);

    // Begin watching.
    watcher.EnableRaisingEvents = true;
}
Next up, the Watcher.Created event handler looks something like this:
private void FileCreated(FileSystemEventArgs e, SSISPackageSetting packageSettings)
{
    // Bunch of other code not important to the issue
    ProcessFile(packageSettings, e.FullPath, fileExtension);
}
The ProcessFile method looks something like this:
private void ProcessFile(SSISPackageSetting packageSetting, string Filename, string fileExtension)
{
    //COMPLETE A BUNCH OF SSIS TASKS TO PROCESS THE FILE

    //NOW WE NEED TO CREATE THE OUTPUT FILE SO THAT SSIS CAN WRITE TO IT
    string errorOutPutfileName = packageSetting.ImportFailurePath + @"\FailedRows"
        + System.DateTime.Now.ToFileTime() + packageSetting.ErrorRowsFileExtension;
    File.Create(errorOutPutfileName).Close();

    MoveFileToSuccessPath(Filename, packageSetting);
}
Lastly, the MoveFile Method looks like this:
private void MoveFileToSuccessPath(string filename, SSISPackageSetting ssisPackage)
{
    try
    {
        string newFilename = MakeFilenameUnique(filename);
        System.IO.File.Move(filename, ssisPackage.ArchivePath.EndsWith("\\")
            ? ssisPackage.ArchivePath + newFilename
            : ssisPackage.ArchivePath + "\\" + newFilename);
    }
    catch (Exception ex)
    {
        SaveToApplicationLog(string.Format(
            "Error occurred while moving a file to the success path. Filename {0}. Archive Path {1}. Error {2}",
            filename, ssisPackage.ArchivePath, ex.ToString()), EventLogEntryType.Error);
    }
}
So somewhere in there, we go into an infinite loop and the FileSystemWatcher keeps on picking up the same file. Anyone have any idea? This happens randomly and intermittently.
When using the FileSystemWatcher I tend to use a dictionary to add the files to when the notification event fires. I then have a separate thread using a timer which picks files up from this collection when they are more than a few seconds old, somewhere around 5 seconds.
If my processing is also likely to change the last access time, and I'm watching that too, then I also compute a checksum, which I keep in a dictionary along with the filename and last processed time for every file, and use it to suppress the event firing multiple times in a row. You don't have to use an expensive one to calculate; I have used MD5 and even CRC32 - you are only trying to prevent multiple notifications.
EDIT
This example code is very situation specific and makes lots of assumptions you may need to change. It doesn't list all your code, just something like the bits you need to add:
// So, first thing to do is add a dictionary to store file info:
internal class FileWatchInfo
{
    public DateTime LatestTime { get; set; }
    public DateTime ProcessedTime { get; set; }
    public bool IsProcessing { get; set; }
    public bool Processed { get; set; }
    public string FullName { get; set; }
    public string Checksum { get; set; }
}

SortedDictionary<string, FileWatchInfo> fileInfos = new SortedDictionary<string, FileWatchInfo>();
private readonly object SyncRoot = new object();
// Now, when you set up the watcher, also set up a Timer to monitor that dictionary.
CreateFileWatcherEvent(new SSISPackageSetting { FileWatchPath = "H:\\test" });
int processFilesInMilliseconds = 5000;
Timer timer = new Timer(ProcessFiles, null, processFilesInMilliseconds, processFilesInMilliseconds);
// In FileCreated, don't process the file but add it to a list
private void FileCreated(FileSystemEventArgs e)
{
    var finf = new FileInfo(e.FullPath);
    DateTime latest = finf.LastAccessTimeUtc > finf.LastWriteTimeUtc
        ? finf.LastAccessTimeUtc : finf.LastWriteTimeUtc;
    latest = latest > finf.CreationTimeUtc ? latest : finf.CreationTimeUtc;

    // Beware of issues if other code sets the file times to crazy times in the past/future
    lock (SyncRoot)
    {
        // You need to work out what to do if you actually need to add this file again (i.e. someone
        // has edited it in the 5 seconds since it was created, and the time it took you to process it)
        if (!this.fileInfos.ContainsKey(e.FullPath))
        {
            FileWatchInfo info = new FileWatchInfo
            {
                FullName = e.FullPath,
                LatestTime = latest,
                IsProcessing = false,
                Processed = false,
                Checksum = null
            };
            this.fileInfos.Add(e.FullPath, info);
        }
    }
}
And finally, here is the process method as it now is
private void ProcessFiles(object state)
{
    FileWatchInfo toProcess = null;
    List<string> toRemove = new List<string>();

    lock (this.SyncRoot)
    {
        foreach (var info in this.fileInfos)
        {
            // You may want to sort your list by latest to avoid files being left in the queue for a long time
            if (info.Value.Checksum == null)
            {
                // If this fires the watcher, it doesn't matter, but beware of big files,
                // which may mean you need to move this outside the lock
                using (var md5 = MD5.Create())
                using (var stream = File.OpenRead(info.Value.FullName))
                {
                    info.Value.Checksum =
                        BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLower();
                }
            }

            // Data store (myFileInfoStore) is code I haven't included - use a Dictionary which you remove files from
            // after a few minutes, or a permanent database to store file checksums
            if ((info.Value.Processed && info.Value.ProcessedTime.AddSeconds(5) < DateTime.UtcNow)
                || myFileInfoStore.GetFileInfo(info.Value.FullName).Checksum == info.Value.Checksum)
            {
                toRemove.Add(info.Key);
            }
            else if (!info.Value.Processed && !info.Value.IsProcessing
                && info.Value.LatestTime.AddSeconds(5) < DateTime.UtcNow)
            {
                info.Value.IsProcessing = true;
                toProcess = info.Value;
                // This processes one file at a time, you could equally add a bunch to a list for parallel processing
                break;
            }
        }

        foreach (var filePath in toRemove)
        {
            this.fileInfos.Remove(filePath);
        }
    }

    if (toProcess != null)
    {
        ProcessFile(packageSettings, toProcess.FullName, new FileInfo(toProcess.FullName).Extension);
    }
}
Finally, ProcessFile needs to process your file, then once completed go inside a lock, mark the info in the fileInfos dictionary as Processed, set the ProcessedTime, and then exit the lock and move the file. You will also want to update the checksum if it changes after an acceptable amount of time has passed.
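A minimal sketch of that completion step, keeping the original ProcessFile signature (the SSIS work itself is elided, and the dictionary/lock fields are assumed to be reachable from here as in the code above):

private void ProcessFile(SSISPackageSetting packageSetting, string filename, string fileExtension)
{
    // ... run the SSIS package against the file as before ...

    // Once processing is done, mark the dictionary entry so the timer thread
    // purges it later instead of processing it again.
    lock (this.SyncRoot)
    {
        FileWatchInfo info;
        if (this.fileInfos.TryGetValue(filename, out info))
        {
            info.Processed = true;
            info.IsProcessing = false;
            info.ProcessedTime = DateTime.UtcNow;
        }
    }

    // The move happens outside the lock; duplicate Created events are then
    // suppressed by the Processed/checksum checks in ProcessFiles.
    MoveFileToSuccessPath(filename, packageSetting);
}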
It is very hard to provide a complete sample as I know nothing about your situation, but this is the general pattern I use. You will need to consider file rates, how frequently they are updated, etc. You can probably bring the time intervals down to sub-second instead of 5 seconds and still be OK.
OK, I have some code that will scan my computer, find .txt files, and display them in a listbox:
private void button2_Click(object sender, EventArgs e)
{
    IEnumerable<string> files = System.IO.Directory.EnumerateFiles(@"C:\", "*.txt*", System.IO.SearchOption.AllDirectories);
    foreach (var f in files)
    {
        listBox1.Items.Add(String.Format("{0}", f));
    }
}
I get an error every time I run this. It says I do not have authorization to the recycle bin. I do not care whether it scans the recycle bin or not. Is there any way to exclude the recycle bin from the scan? Also, can someone help me improve my code if you see anything wrong? Thanks!
The quickest way is to put the calls inside try-catch blocks, because the EnumerateFiles function does not have access to files that are restricted by operating system permissions.
private void SearchDrives()
{
    foreach (String drive in Directory.GetLogicalDrives())
    {
        try
        {
            // Search for folders in the drive.
            SearchFolders(drive);
        }
        catch (Exception) { }
    }
}
//---------------------------------------------------------------------------
private void SearchFolders(String prmPath)
{
    try
    {
        foreach (String folder in Directory.GetDirectories(prmPath))
        {
            // Recursive call for each subdirectory.
            SearchFolders(folder);
            // Create the list of files.
            SearchFiles(folder);
        }
    }
    catch (Exception) { }
}
//---------------------------------------------------------------------------
private void SearchFiles(String prmPath)
{
    try
    {
        foreach (String file in Directory.GetFiles(prmPath))
        {
            FileInfo info = new FileInfo(file);
            if (info.Extension == ".txt")
            {
                listBox1.Items.Add(info.Name);
            }
        }
    }
    catch (Exception) { }
}
//---------------------------------------------------------------------------
It's not just the recycle bin; it will also fail to read the file headers of several files in your system directory.
In general you can do it by making recursive calls for every folder and just using try/catch blocks to see which ones you can or can't access. But as Andras suggested, I would also go with what already exists; it will save you time.
Another approach to your example:
I'm not really into multithreading, so the question is probably stupid, but I can't seem to find a way to solve this problem (especially because I'm using C# and I've only been using it for a month).
I have a dynamic number of directories (I got them from a query in the DB). Inside those directories there are a certain number of files.
For each directory I need to use a method to transfer these files using FTP in a concurrent way, because I have basically no limit on FTP max connections (not my words; it's written in the spec).
But I still need to control the max number of files transferred per directory. So I need to count the files I'm transferring (increment/decrement).
How could I do it? Should I use something like an array and use the Monitor class?
Edit: Framework 3.5
You can use the Semaphore class to throttle the number of concurrent files per directory. You would probably want to have one semaphore per directory so that the number of FTP uploads per directory can be controlled independently.
public class Example
{
    public void ProcessAllFilesAsync()
    {
        var semaphores = new Dictionary<string, Semaphore>();

        foreach (string filePath in GetFiles())
        {
            string filePathCapture = filePath; // Needed to perform the closure correctly.
            string directoryPath = Path.GetDirectoryName(filePath);

            if (!semaphores.ContainsKey(directoryPath))
            {
                int allowed = NUM_OF_CONCURRENT_OPERATIONS;
                semaphores.Add(directoryPath, new Semaphore(allowed, allowed));
            }
            var semaphore = semaphores[directoryPath];

            ThreadPool.QueueUserWorkItem(
                (state) =>
                {
                    semaphore.WaitOne();
                    try
                    {
                        DoFtpOperation(filePathCapture);
                    }
                    finally
                    {
                        semaphore.Release();
                    }
                }, null);
        }
    }
}
var allDirectories = db.GetAllDirectories();

foreach (var directoryPath in allDirectories)
{
    DirectoryInfo directory = new DirectoryInfo(directoryPath);

    // Loop through every file in that directory
    foreach (var fileInDir in directory.GetFiles())
    {
        // Wait while we are at our max limit of concurrent transfers
        while (numberFTPConnections == MAXFTPCONNECTIONS)
        {
            Thread.Sleep(1000);
        }

        // code to copy to FTP
        // This can be async; when the transfer is completed,
        // decrement numberFTPConnections so the next file can be transferred.
    }
}
You can try something along the lines above. Note that it's just the basic logic and there are probably better ways to do this.
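For the increment/decrement itself, Interlocked is a lighter-weight option than locking with Monitor (a sketch; the counter is shared by whatever code starts and finishes your FTP transfers):

using System.Threading;

private int numberFTPConnections = 0;   // shared counter, updated atomically

private void OnTransferStarted()
{
    Interlocked.Increment(ref numberFTPConnections);
}

private void OnTransferCompleted()
{
    Interlocked.Decrement(ref numberFTPConnections);
}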
I am opening a file with read access and allowing subsequent read|write|delete file share access to the file (tailing the file). If the file is deleted during processing is there a way to detect that the file is pending delete (see Files section http://msdn.microsoft.com/en-us/library/aa363858(v=VS.85).aspx)? If some outside process (the owning process) has issued a delete, I want to close my handle as soon as possible to allow the file deletion so as not to interfere with any logic in the owning process.
I'm using C# and see no method of detecting the pending delete. The file was opened using a FileStream object. Is there some way to detect the delete in C#, or some other Windows function?
You can use the Windows API function GetFileInformationByHandleEx to detect a pending delete on a file you have open. The second argument is an enumeration value which lets you specify what kind of information the function should return. The FileStandardInfo (1) value will cause it to return the FILE_STANDARD_INFO structure, which includes a DeletePending boolean.
Here is a demonstration utility:
using System;
using System.Text;
using System.IO;
using System.Runtime.InteropServices;
using System.Threading;

internal static class Native
{
    [DllImport("kernel32.dll", SetLastError = true)]
    public extern static bool GetFileInformationByHandleEx(IntPtr hFile,
                                                           int FileInformationClass,
                                                           IntPtr lpFileInformation,
                                                           uint dwBufferSize);

    public struct FILE_STANDARD_INFO
    {
        public long AllocationSize;
        public long EndOfFile;
        public uint NumberOfLinks;
        public byte DeletePending;
        public byte Directory;
    }

    public const int FileStandardInfo = 1;
}

internal static class Program
{
    public static bool IsDeletePending(FileStream fs)
    {
        IntPtr buf = Marshal.AllocHGlobal(4096);
        try
        {
            IntPtr handle = fs.SafeFileHandle.DangerousGetHandle();
            if (!Native.GetFileInformationByHandleEx(handle,
                                                     Native.FileStandardInfo,
                                                     buf,
                                                     4096))
            {
                Exception ex = new Exception("GetFileInformationByHandleEx() failed");
                ex.Data["error"] = Marshal.GetLastWin32Error();
                throw ex;
            }
            else
            {
                Native.FILE_STANDARD_INFO info = Marshal.PtrToStructure<Native.FILE_STANDARD_INFO>(buf);
                return info.DeletePending != 0;
            }
        }
        finally
        {
            Marshal.FreeHGlobal(buf);
        }
    }

    public static int Main(string[] args)
    {
        TimeSpan MAX_WAIT_TIME = TimeSpan.FromSeconds(10);

        if (args.Length == 0)
        {
            args = new string[] { "deleteme.txt" };
        }

        for (int i = 0; i < args.Length; ++i)
        {
            string filename = args[i];
            FileStream fs = null;
            try
            {
                fs = File.Open(filename,
                               FileMode.CreateNew,
                               FileAccess.Write,
                               FileShare.ReadWrite | FileShare.Delete);

                byte[] buf = new byte[4096];
                UTF8Encoding utf8 = new UTF8Encoding(false);
                string text = "hello world!\r\n";
                int written = utf8.GetBytes(text, 0, text.Length, buf, 0);
                fs.Write(buf, 0, written);
                fs.Flush();
                Console.WriteLine("{0}: created and wrote line", filename);

                DateTime t0 = DateTime.UtcNow;
                for (;;)
                {
                    Thread.Sleep(16);
                    if (IsDeletePending(fs))
                    {
                        Console.WriteLine("{0}: detected pending delete", filename);
                        break;
                    }
                    if (DateTime.UtcNow - t0 > MAX_WAIT_TIME)
                    {
                        Console.WriteLine("{0}: timeout reached with no delete", filename);
                        break;
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("{0}: {1}", filename, ex.Message);
            }
            finally
            {
                if (fs != null)
                {
                    Console.WriteLine("{0}: closing", filename);
                    fs.Dispose();
                }
            }
        }
        return 0;
    }
}
I would use a different signaling mechanism. (I am making the assumption all file access is within your control and not from a closed external program, mainly due to the flags being employed.)
The only "solution" within those bounds I can think of is a poll on file-access and check the exception (if any) you get back. Perhaps there is something much more tricky (at a lower-level than the win32 file API?!?), but this is already going down the "uhg path" :-)
FileSystemWatcher would probably be the closest thing, but it can't detect a "pending" delete; when the file IS deleted, an event will be raised on FileSystemWatcher, and you can attach a handler that will gracefully interrupt your file processing. If the lock (or lack of one) you acquire in opening the file makes it possible for the file to be deleted at all, simply closing your read-only FileStream when that happens should not affect the file system.
The basic steps are to create a watcher, passing the directory to watch (and optionally a filter for the file name) to the constructor. Then set its NotifyFilter to the type(s) of file system modifications you want to watch for on this file. Finally, attach your process's event handler to the Deleted event. This event handler can probably be as simple as setting a flag somewhere that your main process can read, and closing the FileStream. You'll then get an exception on your next attempt to work with the stream; catch it, read the flag, and if it's set just gracefully stop doing file stuff. You can also put the file processing in a separate worker thread, and the event handler can just tell the thread to die in some graceful way.
If the file is small enough, your application could process a copy of the file, rather than the file itself. Also, if your application needs to know whether the owning process deleted the original file, set up a FileSystemWatcher (FSW) on the file. When the file disappears, the FSW could set a flag to interrupt processing:
private volatile bool _fileExists = true;

public void Process(string pathToOriginalFile, string pathToCopy)
{
    File.Copy(pathToOriginalFile, pathToCopy);

    // Watch the directory that contains the original file, filtered to just that file
    // (FileSystemWatcher.Path must be a directory, not a file path).
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = Path.GetDirectoryName(pathToOriginalFile);
    watcher.Filter = Path.GetFileName(pathToOriginalFile);
    watcher.Deleted += new FileSystemEventHandler(OnFileDeleted);

    bool doneProcessing = false;
    watcher.EnableRaisingEvents = true;

    while (_fileExists && !doneProcessing)
    {
        // process the copy here
    }
    ...
}

private void OnFileDeleted(object source, FileSystemEventArgs e)
{
    _fileExists = false;
}
No, there's no clean way to do this. If you were concerned about other processes opening and/or modifying the file, then oplocks could help you. But if you're just looking for notification of when the delete disposition gets set to deleted, there isn't a straightforward way to do this (sans building a file system filter, hooking the APIs, etc., all of which are spooky things for an application to be doing without a very good reason).