Delete uploaded file if it is not moved to another folder - c#

I am developing a commenting system with ASP.NET. A user can attach an image with an "Attach" button and post the comment with a "Post" button. Uploading the image starts when the user attaches it. An ASHX handler saves the uploaded file to a "temp" folder. If the user clicks the "Post" button, I move the image into a safe place. If the user doesn't click "Post" and instead closes the browser and goes away, the file remains in the "temp" folder. How can I delete a file from this "temp" folder one hour after it is uploaded?
Details:
I thought of using System.Timers.Timer in the ASHX file used for uploading:
System.Timers.Timer timer;
string fileName;

public void Cleaner()
{
    timer = new System.Timers.Timer(3000); // 3 seconds, just for testing
    timer.Elapsed += new System.Timers.ElapsedEventHandler(timer_Elapsed);
    timer.Start();
}

protected void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs a)
{
    timer.Stop();
    timer.Close();
    string path = "temp";
    string mapPath = HttpContext.Current.Server.MapPath("../" + path);
    FileInfo theFile = new FileInfo(mapPath + "\\" + fileName);
    if (theFile.Exists) File.Delete(mapPath + "\\" + fileName);
}

public void ProcessRequest(HttpContext context)
{
    // ... save the uploaded file ...
    Cleaner();
}
but I feel that I am not doing this right.
The timer ticks after 3 seconds, but HttpContext.Current in timer_Elapsed() is null. The fileName field is also null by the time the timer ticks, and I couldn't find a way to pass the file name as a parameter when wiring up the event. In short, it is problematic. I am looking for a more elegant way to delete the uploaded file after one hour.

I would avoid timers, as you will create one timer per file, which will not scale well at all.
How about this: run a clean-up process on another thread in the web app, started on application start, that deletes temp files each time a session expires. That way you need no timers, because the process is prompted whenever a session ends. You will need a class that stores a reference (by unique file name, I guess) to the files that are still live (by which I mean the session they belong to is still live), which the clean-up process can check.
Let me know if you want some code pointers.
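Roughly, something like this (a sketch only; the class and method names are made up, and it assumes InProc session state so that Session_End actually fires):

// Hypothetical helper: tracks temp files per session and removes them
// when that session ends (wire Cleanup into Session_End in Global.asax).
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;

public static class TempFileTracker
{
    // Session ID -> temp files uploaded during that session.
    private static readonly ConcurrentDictionary<string, List<string>> _files =
        new ConcurrentDictionary<string, List<string>>();

    // Call this from the upload handler after saving to the temp folder.
    public static void Register(string sessionId, string fullPath)
    {
        var list = _files.GetOrAdd(sessionId, _ => new List<string>());
        lock (list) { list.Add(fullPath); }
    }

    // Call this from the "Post" code path so the file is no longer cleaned up.
    public static void MarkAsKept(string sessionId, string fullPath)
    {
        if (_files.TryGetValue(sessionId, out var list))
            lock (list) { list.Remove(fullPath); }
    }

    // Call this from Session_End; deletes whatever the session left behind.
    public static void Cleanup(string sessionId)
    {
        if (_files.TryRemove(sessionId, out var list))
        {
            lock (list)
            {
                foreach (var path in list)
                {
                    try { if (File.Exists(path)) File.Delete(path); }
                    catch (IOException) { /* file busy; a later sweep can retry */ }
                }
            }
        }
    }
}

In Global.asax you would then call it from Session_End, and the upload handler would call TempFileTracker.Register(context.Session.SessionID, savedPath) after saving (the handler needs to implement IRequiresSessionState to see the session):

void Session_End(object sender, EventArgs e)
{
    TempFileTracker.Cleanup(Session.SessionID);
}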

HttpContext.Current is expected to be null there, because the context dies as soon as the response is sent.
If you were on Unix, I would suggest writing a script and running it with cron, but it seems you are on Windows.
So, write a small program (an exe) that deletes files (even better, only image files) from the temp directory based on creation date. Google and you will find lots of tutorials on how to do it: deleting a file is one line of code, and if you are using the system temp directory, getting its path is another line. If you are using a custom temp directory, you already know the path.
If you want to check the creation time (or last modified time), you need a few more lines.
Now schedule the exe as often as you need using the Windows Task Scheduler, or use one of the third-party schedulers available for Windows.
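A minimal sketch of such a cleanup exe (the temp folder path and the one-hour threshold are placeholders; pass the real path on the command line):

// Cleanup.exe - deletes files older than one hour from the given folder.
using System;
using System.IO;

class Cleanup
{
    static void Main(string[] args)
    {
        // Placeholder default path; normally passed as the first argument.
        string tempDir = args.Length > 0 ? args[0] : @"C:\inetpub\wwwroot\MyApp\temp";
        DateTime cutoff = DateTime.Now.AddHours(-1);

        foreach (string file in Directory.GetFiles(tempDir))
        {
            // Based on creation time; File.GetLastWriteTime would work as well.
            if (File.GetCreationTime(file) < cutoff)
            {
                try { File.Delete(file); }
                catch (IOException) { /* still in use; try again on the next run */ }
            }
        }
    }
}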

Related

Is there a way to check if there are any files uploading to an azure storage account container?

I've got an API that allows a user to create a container in a storage account and then upload an unlimited number of files. Once they have finished triggering the uploads, they can trigger a process that starts validating those files and downloading them to another machine. They can trigger this process before the uploads have finished.
So I was hoping to find a way to check whether a container has any file uploads in progress. I've not been able to find anything other than workarounds that track whether specific files are still uploading. I need to make a call against a container and see whether anything is in progress or not.
• Yes, there is a way to check whether the container has any file uploads in progress, but for that purpose you will have to use the FileUpload control together with a Boolean check that verifies whether the control has a file to upload. In addition, the File.Exists method is called to check whether a file with the same name already exists in the path; if it does, the name of the file to upload is prefixed with a number before the SaveAs method is called.
Thus, at least through this method, you can determine which files are queued for upload from a particular directory to the container. Please find the code snippets below for your reference:
protected void UploadButton_Click(object sender, EventArgs e)
{
    // Before attempting to save the file, verify
    // that the FileUpload control contains a file.
    if (FileUpload1.HasFile)
    {
        // Call a helper method routine to save the file.
        SaveFile(FileUpload1.PostedFile);
    }
    else
    {
        // Notify the user that a file was not uploaded.
        UploadStatusLabel.Text = "You did not specify a file to upload.";
    }
}
The section of code above checks whether the FileUpload control actually has a file to upload, while the one below finds out whether a file with the same name already exists in the given path and, if so, prefixes the file name with a number so that it can still be identified when it is uploaded and queried for later.
// Snippet from the linked FileUpload.HasFile example; tempfileName, fileName,
// savePath and pathToCheck are defined earlier in that example.
if (System.IO.File.Exists(pathToCheck))
{
    int counter = 2;
    while (System.IO.File.Exists(pathToCheck))
    {
        // If a file with this name already exists,
        // prefix the file name with a number.
        tempfileName = counter.ToString() + fileName;
        pathToCheck = savePath + tempfileName;
        counter++;
    }
}
For more details, I would suggest referring to the links below:
https://learn.microsoft.com/en-us/dotnet/api/system.web.ui.webcontrols.fileupload.hasfile?view=netframework-4.8#examples
https://www.educba.com/asp-dot-net-fileupload/
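If the uploads go to the container as staged blocks through the Azure Storage SDK, another possible way (a rough, untested sketch) is to list blobs in the uncommitted state: a name that only shows up when uncommitted blobs are included is an upload that has been started but not committed yet. This assumes the Azure.Storage.Blobs package; the connection string and container name below are placeholders, and single-shot uploads that never stage blocks will not show up this way.

// Sketch: find blobs that exist only as uncommitted blocks, i.e. uploads
// that have been started but not yet committed.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class InProgressCheck
{
    static async Task Main()
    {
        var container = new BlobContainerClient(
            "UseDevelopmentStorage=true",   // placeholder connection string
            "my-container");                // placeholder container name

        // Names of blobs that are fully committed.
        var committed = new HashSet<string>();
        await foreach (BlobItem blob in container.GetBlobsAsync())
            committed.Add(blob.Name);

        // Listing again with BlobStates.Uncommitted also returns blobs that so far
        // exist only as uploaded-but-uncommitted blocks; the difference between the
        // two listings is therefore the set of brand-new uploads still in progress.
        var inProgress = new List<string>();
        await foreach (BlobItem blob in container.GetBlobsAsync(states: BlobStates.Uncommitted))
            if (!committed.Contains(blob.Name))
                inProgress.Add(blob.Name);

        Console.WriteLine(inProgress.Count == 0
            ? "No uploads appear to be in progress."
            : "Uploads in progress: " + string.Join(", ", inProgress));
    }
}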

C# FileSystemWatcher file size only updates with Windows Explorer

I have a problem with the FileSystemWatcher in C#.
I watch a file that is used by another program; that part is not the problem. The problem is that the only value that changes on the file is its size, because the other program writes the file without updating the change or write date.
And the size value only updates when (Windows 7) Explorer refreshes (F5, or clicking on the file).
FileSystemWatcher fileWatcher = new FileSystemWatcher();
fileWatcher.Changed += new FileSystemEventHandler(fileWatcher_Changed);
fileWatcher.Path = Path.GetDirectoryName(path);   // directory of the file path
fileWatcher.Filter = Path.GetFileName(path);      // only this file
fileWatcher.NotifyFilter = NotifyFilters.Size;    // and maybe other flags
fileWatcher.EnableRaisingEvents = true;

private void fileWatcher_Changed(object sender, FileSystemEventArgs e)
{
    // ...
}
I guess this problem can only be solved by polling, because something has to touch the file for its file info to be refreshed. But I hope there is another solution without polling.
About my application:
It reads a file that is in use by another program. Getting the text of the file works fine with a FileStream opened with FileShare.ReadWrite. I want to update the textbox (by re-reading the file) when the file has changed, but the only value that updates while the other program is writing to it is the file size, and only when Explorer refreshes or I click on the file. That is the issue in this question. If the problem is unsolvable, the alternative is to re-read the file content every x amount of time (polling), without a file watcher.
Perhaps the following would solve your problem and avoid polling. Try using WMI, querying the root\cimv2 namespace with something like:
SELECT FileSize FROM CIM_DataFile WHERE Name = '_your_path_'
It might differ slightly when applied from C#, but along those lines. Regarding WMI, there are lots of tutorials on how to set up a WMI listener in .NET; search around.
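A rough sketch of such a listener using System.Management (the file path is a placeholder; backslashes must be doubled inside WQL, and WMI evaluates the condition internally at the WITHIN interval, so it is not strictly poll-free, but your code only runs when the size actually changed):

// Sketch: get notified via WMI when the FileSize of one file changes.
using System;
using System.Management;

class WmiSizeWatcher
{
    static void Main()
    {
        var query = new WqlEventQuery(
            "__InstanceModificationEvent",
            TimeSpan.FromSeconds(5),                            // WITHIN interval
            @"TargetInstance ISA 'CIM_DataFile' AND " +
            @"TargetInstance.Name = 'C:\\temp\\watched.log'");  // placeholder path

        using (var watcher = new ManagementEventWatcher(new ManagementScope(@"root\cimv2"), query))
        {
            watcher.EventArrived += (sender, e) =>
            {
                var file = (ManagementBaseObject)e.NewEvent["TargetInstance"];
                Console.WriteLine("File size is now {0} bytes", file["FileSize"]);
            };
            watcher.Start();

            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
            watcher.Stop();
        }
    }
}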

Read files based on their modified time in C#

I have a situation here. I want to read files based on their creation or last modified time. Initially I used a FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system on which my software is running goes down or restarts, files will still keep being dropped in the watched location.
To make it easier to understand, I will give an example:
System A - File server (files are created every 2 minutes in a directory on this server)
System B - My software runs here and monitors files in the path on System A
If System B restarts and is up again after 10 minutes, the FileSystemWatcher will have missed all the files generated during those 10 minutes.
How can I ensure that the files generated during those 10 minutes are also captured?
Let me know if my question is still not understandable.
If you don't want to split it up into two systems, you have to persist a little bit of information.
You could store the current timestamp in a file every time an event fires on the FileSystemWatcher. Every time your service starts, read all files from the file system that are newer than the last timestamp; that way you shouldn't miss a file.
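A minimal sketch of that idea (the watched folder and state-file paths are placeholders): persist the time of the last handled event, and on startup process anything written since then.

// Sketch: persist the last-seen timestamp so files dropped while the
// service was down can be picked up on the next start.
using System;
using System.Globalization;
using System.IO;

class CatchUpReader
{
    const string WatchedFolder = @"\\SystemA\drop";          // placeholder
    const string StateFile = @"C:\MyService\lastseen.txt";   // placeholder

    static void Main()
    {
        DateTime lastSeen = File.Exists(StateFile)
            ? DateTime.Parse(File.ReadAllText(StateFile), null, DateTimeStyles.RoundtripKind)
            : DateTime.MinValue;

        // Catch up on anything written while we were down.
        foreach (string file in Directory.GetFiles(WatchedFolder))
        {
            DateTime written = File.GetLastWriteTimeUtc(file);
            if (written > lastSeen)
            {
                ProcessFile(file);
                SaveLastSeen(written);
            }
        }

        // From here on, a FileSystemWatcher can take over and call
        // ProcessFile + SaveLastSeen from its Created/Changed handlers.
    }

    static void ProcessFile(string path)
    {
        Console.WriteLine("Processing " + path);
    }

    static void SaveLastSeen(DateTime utc)
    {
        File.WriteAllText(StateFile, utc.ToString("o")); // round-trip format
    }
}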
I would split this application into two parts and run a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is restarted. I can't see another way, sorry.
I think the FileSystemWatcher handler should write information about the file system into a DB (or some other kind of storage). When System B starts, it can compare the current state of the file system with that stored information and raise events for the changes.
Copy the files from the source machine and paste them into the destination based on a condition (here: skip files that already exist):
string dirPath = @"C:\A";
string destinPath = @"C:\B";

if (Directory.Exists(dirPath) && Directory.Exists(destinPath))
{
    DirectoryInfo di = new DirectoryInfo(dirPath);
    foreach (var file in di.GetFiles())
    {
        string destinFile = destinPath + "\\" + file.Name;
        if (File.Exists(destinFile))
        {
            continue;
        }
        else
        {
            file.CopyTo(destinFile);
        }
    }
}
Not sure if I understood your question correctly, but based on what I get, and assuming both systems are in sync in terms of time, if, for example, you want to get files that have been modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);

foreach (string file in systemAFiles)
{
    DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
    if (lastWriteTime > tenMinutesAgo) // modified within the last ten minutes
    {
        // read file
    }
}
I understood that these files are "generated", i.e. created or modified; if they have simply been moved from one folder to another, this will not work. In that case the best way is to write a snapshot of the files in that folder to some sort of save file and compare against it when the program runs again.
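A rough sketch of that snapshot idea (the folder and snapshot-file paths are placeholders): save the list of file names seen on the last run, and on the next run treat anything not in that list as new.

// Sketch: persist a snapshot of file names and report anything new on the next run.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class SnapshotCompare
{
    const string WatchedFolder = @"\\SystemA\drop";            // placeholder
    const string SnapshotFile = @"C:\MyService\snapshot.txt";  // placeholder

    static void Main()
    {
        var previous = File.Exists(SnapshotFile)
            ? new HashSet<string>(File.ReadAllLines(SnapshotFile))
            : new HashSet<string>();

        string[] current = Directory.GetFiles(WatchedFolder)
                                    .Select(p => Path.GetFileName(p))
                                    .ToArray();

        // Anything present now but not in the previous snapshot is new to us.
        foreach (string name in current.Where(n => !previous.Contains(n)))
        {
            Console.WriteLine("New since last run: " + name);
            // process the file here
        }

        File.WriteAllLines(SnapshotFile, current);
    }
}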

Automatically check if a webpage's source code has changed

I am new to C#/WPF programming, and I am trying to automatically update a local copy of a page's source code if the source code for that page has changed. Is there a way to check the source code, say, every other day without me having to go in and manually do a diff?
To get the source code for the website, I have:
private bool getSourceCode(string UserInputSub)
{
    // insert error catching..
    using (WebClient webClient = new WebClient()) // get source code of the page; user enters the URL
    {
        string s = webClient.DownloadString(UserInputSub);
        string fixedString = s.Replace("\n", "\r\n");
        string desktopPath = Environment.GetFolderPath(Environment.SpecialFolder.Desktop);
        string filePath = desktopPath + "\\SourceCode.txt";
        using (System.IO.StreamWriter wr = new System.IO.StreamWriter(filePath))
        {
            wr.Write(fixedString); // writes the source to the file
        }
    }
    return true;
}
This only runs when the program itself is run. I would like it so the user does not have to run the program for it to update the txt file it produces.
Add a timer to the project, set the interval to 86,400,000 ms (24 hours), and call your function in the tick event. It is not the best solution; it would be better to add it as a cron-style scheduled job or something similar, but if it is a dedicated machine it will certainly work.
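A minimal sketch of that timer idea, with a simple string comparison against the last saved copy so the file is only rewritten when the page actually changed (the URL and the timer wiring are placeholders; the download/save logic follows the question's getSourceCode):

// Sketch: check the page once a day and only overwrite the local copy when it changed.
using System;
using System.IO;
using System.Net;
using System.Timers;

class DailySourceCheck
{
    private static readonly Timer _timer = new Timer(86400000); // 24 hours
    private const string Url = "http://example.com/page";       // placeholder URL
    private static readonly string LocalCopy = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "SourceCode.txt");

    static void Main()
    {
        _timer.Elapsed += (sender, e) => CheckForChanges();
        _timer.AutoReset = true;
        _timer.Start();

        CheckForChanges();   // also check once at startup
        Console.ReadLine();  // keep the process alive (or host this inside the WPF app)
    }

    static void CheckForChanges()
    {
        using (var webClient = new WebClient())
        {
            string current = webClient.DownloadString(Url).Replace("\n", "\r\n");
            string previous = File.Exists(LocalCopy) ? File.ReadAllText(LocalCopy) : "";

            if (current != previous) // the "diff" is a simple string compare
            {
                File.WriteAllText(LocalCopy, current);
                Console.WriteLine("Page changed; local copy updated.");
            }
        }
    }
}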
The simple solution would be to write a Windows service that fires once a day and does this for you.
It'll add some complexity, but it does what you want.
Edit:
If you want this as part of the Windows application, you can set a timer or poll every x amount of time, but then your application needs to be open all the time for this to happen.
If you want to collect the data independently from the Windows program that uses it, you'll have to have a separate service running in the background. Of course, you could have a simple console window running in the background to do it for you, but that is so hackish it should be illegal.

FileSystemWatcher and Monitoring Config File Changes

I have about 5-6 Server Manager programs that write their own configuration file out to a particular folder, such as C:\ACME. The config files all end with "*ServerConfig.cfg", where * is the name of the program that created it.
I have a Windows service that has a FileSystemWatcher setup that I want to FTP the configuration files each time the program updates. I've gotten everything to work, but I'm noticing that the different Server Manager programs are behaving differently.
When saving a configuration file, the FileSystemWatcher is picking up two "change" events. This is causing my program to FTP the configuration file twice where I only need it once.
In other instances I'm seeing where it may create 4, 5, or 6 "change" events when saving a configuration file.
What is the best way to handle processing/FTPing these files only once, when they are really done being saved?
I really don't want to set something up to poll the directory for file changes every so often... and I like the idea that each time a configuration is saved, I get a duplicate copy, with a date/timestamp appended to the filename, copied elsewhere.
I have seen lots of suggestions Googling around and even here on Stackoverflow, but nothing that seems to be all-in-one for me.
I suppose I could put the filename in a queue when a "change" event occurs, if it isn't already in the queue. Not sure if this is the best approach.
Here is my sample code:
Startup-code:
private DateTime _lastTimeFileWatcherEventRaised = DateTime.Now;
_watcherCFGFiles = new FileSystemWatcher();
_watcherCFGFiles.Path = @"C:\ACME";
_watcherCFGFiles.IncludeSubdirectories = true;
_watcherCFGFiles.Filter = "*ServerConfig.cfg";
_watcherCFGFiles.NotifyFilter = NotifyFilters.Size;
//_watcherCFGFiles.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.FileName;
_watcherCFGFiles.Changed += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Created += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Deleted += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Renamed += new RenamedEventHandler(LogFileSystemRenaming);
_watcherCFGFiles.Error += new ErrorEventHandler(LogBufferError);
_watcherCFGFiles.EnableRaisingEvents = true;
Here is the actual handler for the "change" event. I'm skipping a "change" event if it arrives within 700 ms of the previous one, but this doesn't account for the files that raise 3-4 change events...
void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    string log = string.Format("{0} | {1}", e.FullPath, e.ChangeType);
    if (e.ChangeType == WatcherChangeTypes.Changed)
    {
        if (DateTime.Now.Subtract(_lastTimeFileWatcherEventRaised).TotalMilliseconds < 700)
        {
            return;
        }
        _lastTimeFileWatcherEventRaised = DateTime.Now;
        LogEvent(log);

        // Process file
        FTPConfigFileUpdate(e.FullPath);
    }
}
I had the exact same issue. I used a hash map (a Dictionary in C#) that mapped filenames to write times, and used it as a lookup table to check whether a changed event had fired again very quickly for the same file. I defined some epsilon (for me it was about 2 seconds, to make sure events were flushed); if the time found in the map was older than that, I would put the file on a queue to be processed. Essentially all I had to do was keep the map up to date with events and changes, and this worked out (although you may want to change your epsilon value depending on your application).
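A minimal sketch of that debounce idea (the 2-second epsilon, the sweep interval, and the names are just placeholders to illustrate the approach):

// Sketch: debounce FileSystemWatcher Changed events. Event times are recorded per
// file; a background sweep hands a file over only once no event has been seen for
// it for at least EpsilonMs, i.e. the save is presumed finished.
using System;
using System.Collections.Concurrent;
using System.Threading;

class ChangeDebouncer
{
    private const double EpsilonMs = 2000; // tune per application
    private readonly ConcurrentDictionary<string, DateTime> _lastEvent =
        new ConcurrentDictionary<string, DateTime>();
    private readonly Timer _sweeper;
    private readonly Action<string> _process;

    public ChangeDebouncer(Action<string> process)
    {
        _process = process;
        // Sweep twice a second; anything quiet for EpsilonMs is handed to _process
        // (e.g. your FTPConfigFileUpdate).
        _sweeper = new Timer(Sweep, null, 500, 500);
    }

    // Call this from the FileSystemWatcher Changed (and Created) handlers.
    public void OnChanged(string fullPath) => _lastEvent[fullPath] = DateTime.Now;

    private void Sweep(object state)
    {
        DateTime now = DateTime.Now;
        foreach (var entry in _lastEvent.ToArray())
        {
            if ((now - entry.Value).TotalMilliseconds >= EpsilonMs &&
                _lastEvent.TryRemove(entry.Key, out DateTime _))
            {
                _process(entry.Key); // quiet long enough: process it exactly once
            }
        }
    }
}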
This behavior is normal, because the antivirus or other programs perform extra writes when a file's content changes. I usually create a (global) Hashtable and check whether the filename exists in it; if it doesn't, I put the filename in and start an asynchronous operation that removes it again after 3-5 seconds.
This is expected behavior, so you need to figure out how to handle it in your particular case.
The file system does not have a concept of "the program is done working with this file". For example, one could write an editor that updates (open/write/close) the file on every keystroke; the file system will report a lot of updates, but from the user's point of view there is only one update, when the editor is closed.
