Real-time file scanning - C#

Is there a way, using C#, to get all the files that are accessed in real time, similar to what an antivirus's real-time protection does? I guess there must be an API to hook into the kernel or something?

FileSystemWatcher will give you notifications of file activity.
An AV program also has to scan and possibly block access prior to the data being returned - they'd do that with a file system filter driver. I don't think there's a supported managed equivalent.
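As a rough sketch of the notification side only (not the blocking side), here is a minimal FileSystemWatcher example - the path is a placeholder, and note that it reports creations, changes, renames and deletions rather than every read access:

using System;
using System.IO;

class Program
{
    static void Main()
    {
        // Watch a folder (placeholder path) and log file activity as it happens.
        var watcher = new FileSystemWatcher(@"C:\SomeFolder")
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite | NotifyFilters.Size
        };

        watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
        watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);
        watcher.Deleted += (s, e) => Console.WriteLine("Deleted: " + e.FullPath);
        watcher.Renamed += (s, e) => Console.WriteLine("Renamed: " + e.OldFullPath + " -> " + e.FullPath);

        watcher.EnableRaisingEvents = true;
        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }
}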

You will need to get a list of all the running processes, and then for each process get a list of all the files that are being accessed by that process.
Please check http://www.sqaforums.com/showflat.php?Cat=0&Board=UBB1&Number=23978&Searchpage=1&Main=23978&Words=+AUTOMATION_GURU&topic=&Search=true
and http://hintdesk.com/c-get-all-files-being-accessed-by-a-process-in-64-bits/
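There is no fully managed API for listing every handle another process has open (the second link above drops down to native calls for that part), but the process list itself and each process's loaded module files are available from System.Diagnostics. A minimal sketch of that managed portion:

using System;
using System.ComponentModel;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        foreach (var process in Process.GetProcesses())
        {
            Console.WriteLine(process.ProcessName + " (PID " + process.Id + ")");
            try
            {
                // Only the loaded modules (EXE/DLL files), not every open file handle.
                foreach (ProcessModule module in process.Modules)
                    Console.WriteLine("  " + module.FileName);
            }
            catch (Win32Exception)
            {
                // Access denied (protected/system process) or 32/64-bit mismatch; skip it.
            }
            catch (InvalidOperationException)
            {
                // The process exited while we were enumerating.
            }
        }
    }
}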

Related

How to write a Windows service that tracks the number of times a specific folder was opened

I hope this is the correct way of asking this question. First, my problem is that I want to know how many times a specific folder was opened from the time my Windows service starts. I don't want to write a desktop application for this purpose because I want it to happen in the background, and I may also want to add more functionality later. That is why it needs to be a Windows service.
Is there some kind of OS event that I can handle in my code, i.e. an event that is fired when a user opens a folder?
If this is not the correct method then please let me know some other method that can help.
That's not possible in C#. You can be notified of changes within a directory and infer from that that the directory was opened--but there are many times when a directory is opened and nothing will be changed. What you're describing is most like a File System Filter Driver.
From What is a File System Filter Driver:
A file system filter driver can filter I/O operations for one or more file systems or file system volumes. Depending on the nature of the driver, filter can mean log, observe, modify, or even prevent.
Writing a filter is relatively easy, considering there are templates you can base your work on. But they consist of kernel-mode code, meaning they're not written in C# (they are typically written in C), and they are drivers.
For more details: http://msdn.microsoft.com/en-us/library/windows/hardware/ff540382(v=vs.85).aspx
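If inferring opens from change notifications is good enough for your case, a minimal sketch of the counting part of such a service could look like the following. The folder path is a placeholder, and counting change events as a proxy for "opened" is an assumption - a purely read-only open of the folder raises no event at all:

using System;
using System.IO;
using System.Threading;

class FolderActivityCounter
{
    static int activityCount;   // incremented on every notification received

    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\WatchedFolder") { IncludeSubdirectories = true };

        FileSystemEventHandler onEvent = (s, e) =>
        {
            int total = Interlocked.Increment(ref activityCount);
            Console.WriteLine(e.ChangeType + ": " + e.FullPath + " (total events: " + total + ")");
        };

        watcher.Created += onEvent;
        watcher.Changed += onEvent;
        watcher.Deleted += onEvent;
        watcher.EnableRaisingEvents = true;

        Console.ReadLine();   // in a real Windows service this would run until OnStop
    }
}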

Detect file deletion from any folder of my operating system and prevent the deletion with C# or the Win32 API

I know we can monitor a particular folder with FileSystemWatcher, and with its help we can log which file was deleted. Suppose I have a Windows service which runs all the time; if any user tries to delete a file with a specific extension anywhere on my OS, my Windows service should show a message box to the user and prevent the deletion. I just want to know whether I can do this with the FileSystemWatcher class. If it is possible with FileSystemWatcher, please discuss how; if it is not, how could I make it possible with my Windows service or a normal Windows application? Would it be possible with the Win32 API? Thanks.
Use proper Windows security measures - file permissions together with access groups.
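For example, a deny-Delete ACL on the file keeps ordinary users from removing it without any monitoring code at all. A minimal sketch, assuming .NET Framework (on .NET Core the same types come from the System.IO.FileSystem.AccessControl package); the path and the "Users" group are placeholders:

using System.IO;
using System.Security.AccessControl;

class DenyDeleteExample
{
    static void Main()
    {
        string path = @"C:\Protected\important.dat";   // placeholder path

        // Add a rule denying Delete to the built-in Users group.
        FileSecurity security = File.GetAccessControl(path);
        security.AddAccessRule(new FileSystemAccessRule(
            "Users", FileSystemRights.Delete, AccessControlType.Deny));
        File.SetAccessControl(path, security);
    }
}

Note that a user who has "Delete subfolders and files" on the parent folder can still delete the file, so that right may need to be restricted on the folder as well.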
The only way is to have a file system filter driver which will track deletion and move requests (here, a move to the Recycle Bin) and cancel such requests. You can write your own filter driver or use our CallbackFilter product. With CallbackFilter the task is trivial - less than a dozen lines of code in user mode (possibly in C#).

Is it possible to trace file operations with .NET?

Is it possible, when a file operation such as open or close is called, to handle it before the operating system carries out the request and, if possible, cancel it from .NET? If .NET has no such abilities, how can I do this?
What you're asking to do can be done; virus scanners, for example, do it all the time. You can easily monitor file activity with Process Monitor, and you can also do it programmatically in C# using the FileSystemWatcher class. But preventing a program from opening a file, or stopping a program from accessing a file, cannot be done in C#. You will need to use either C or C++ and create a file system filter driver. It is a complex thing to build, but it is exactly what you need. To quote MSDN:
A file system filter driver intercepts requests targeted at a file system or another file system filter driver. By intercepting the request before it reaches its intended target, the filter driver can extend or replace functionality provided by the original target of the request. Examples of file system filter drivers include anti-virus filters, backup agents, and encryption products.
You can hook the Windows API if you want to. Check out this way to do that in .NET/C#:
EasyHook Windows API
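As a rough sketch of what in-process hooking with EasyHook looks like (based on its documented LocalHook class and its file-monitor tutorial - verify the details against the EasyHook version you install). This only observes CreateFileW calls made by the current process; it is not a system-wide filter:

using System;
using System.Runtime.InteropServices;
using EasyHook;

class CreateFileHookDemo
{
    [UnmanagedFunctionPointer(CallingConvention.StdCall, CharSet = CharSet.Unicode, SetLastError = true)]
    delegate IntPtr CreateFileDelegate(
        string fileName, uint access, uint share, IntPtr security,
        uint creation, uint flags, IntPtr template);

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr CreateFileW(
        string fileName, uint access, uint share, IntPtr security,
        uint creation, uint flags, IntPtr template);

    // Runs instead of CreateFileW for this process; logs the call, then forwards it.
    static IntPtr CreateFile_Hooked(
        string fileName, uint access, uint share, IntPtr security,
        uint creation, uint flags, IntPtr template)
    {
        Console.WriteLine("CreateFileW: " + fileName);
        return CreateFileW(fileName, access, share, security, creation, flags, template);
    }

    static void Main()
    {
        var hook = LocalHook.Create(
            LocalHook.GetProcAddress("kernel32.dll", "CreateFileW"),
            new CreateFileDelegate(CreateFile_Hooked), null);
        hook.ThreadACL.SetExclusiveACL(new int[] { 0 });   // activate the hook, as in the EasyHook tutorial

        Console.WriteLine("Hook installed; file opens by this process will be logged. Press Enter to quit.");
        Console.ReadLine();
        hook.Dispose();
    }
}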
Sysinternals offers a free tool called Process Monitor, one function of which is to attach to arbitrary Windows processes (including .NET applications) and capture system calls, including file open, close, read, etc.
You can download it at the Process Monitor Download Page.
EDIT
As I re-read your question, I see that you're asking about intercepting and possibly cancelling such operations. I believe the FileSystemWatcher class will be your best bet, although I don't think it can cancel file operations unilaterally - you'd need to build some kind of cooperative mechanism to signal the caller to abort its operation.
I'm pretty sure you've got to get into the kernel for that kind of operation, and I'm pretty sure that means you'll need to code in C. Look at File System Drivers.
UPDATE: this SO link may help.
UPDATE: added a google search for Windows File System Drivers
ALSO What is a good resource to get started with Windows file system driver development?

Polling directory on File Server

I need to write an application that polls a directory which contains images on a file server and display 4 at a time.
This application will be run up to 50 times across the network at the same time.
I'm trying to think of the best architecture to complete this requirement.
I was working on the idea of opening a file with read/write access and no file sharing allowed, so that if another PC came in to read it, it would get an error and have to move on to the next one. The problem is that I need to access all 4 images in sequence on the same PC while ensuring other PCs don't try to open them. So, for example, if PC1 tries to open 1.jpg it needs to be able to open 1, 2, 3 and 4.jpg. If another PC comes in at the same time to read them, I need a way for it to open 5, 6, 7 and 8.jpg instead, and so on.
It seems a simple requirement but a nightmare to try and build successfully.
You're basically dealing with a race condition here, and I don't see a way to handle it from separate instances of your application running on separate machines unless you can guarantee your file naming will always follow a standard naming convention that would allow you to work with the sequence of 4 files using only the name of the first.
The best way to handle this would be using a centralized resource to manage access to your files, either a database as was suggested in a comment or else a service (such as WCF) that would "hand out" each set of 4 files.
What about creating a 1.jpg.lock file? The presence of the file indicates the images are locked, and any other instance of the application should skip that set.
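A minimal sketch of that idea (the folder, file names and batch size of 4 are assumptions): FileMode.CreateNew is atomic, so it fails if the lock file already exists, and only one machine can claim a given set even if two try at the same moment.

using System;
using System.IO;

class ImageBatchClaimer
{
    // Try to claim the set starting at firstIndex (1.jpg..4.jpg, 5.jpg..8.jpg, ...).
    static bool TryClaimSet(string folder, int firstIndex)
    {
        string lockPath = Path.Combine(folder, firstIndex + ".jpg.lock");
        try
        {
            // CreateNew throws if the lock file already exists - another PC owns the set.
            using (new FileStream(lockPath, FileMode.CreateNew, FileAccess.Write, FileShare.None)) { }
            return true;
        }
        catch (IOException)
        {
            return false;   // move on to the next set
        }
    }

    static void Main()
    {
        string folder = @"\\fileserver\images";   // placeholder share
        for (int first = 1; first <= 100; first += 4)
        {
            if (TryClaimSet(folder, first))
            {
                Console.WriteLine("Claimed images " + first + " to " + (first + 3));
                break;
            }
        }
    }
}

You would still need to delete the .lock file (or have a cleanup job expire stale ones) once a machine has finished with its set.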

.NET FileInfo.LastWriteTime & FileInfo.LastAccessTime are wrong

When I call FileInfo(path).LastAccessTime or FileInfo(path).LastWriteTime on a file that is in the process of being written, it returns the time the file was created, not the last time it was written to (i.e. now).
Is there a way to get this information?
Edit: To all the responses so far: I hadn't tried Refresh(), but that does not do it either. I am still returned the time the file started being written to. The same goes for the static method and for creating a new instance of FileInfo.
Codymanix might have the answer, but I'm not running Windows Server (I'm using Windows 7), and I don't know where the setting is in order to test it.
Edit 2: Does nobody find it interesting that this function doesn't seem to work?
The FileInfo values are only loaded once and then cached. To get the current value, call Refresh() before getting a property:
f.Refresh();
t = f.LastAccessTime;
Another way to get the current value is by using the static methods on the File class:
t = File.GetLastAccessTime(path);
Starting in Windows Vista, last access time is not updated by default. This is to improve file system performance. You can find details here:
http://blogs.technet.com/b/filecab/archive/2006/11/07/disabling-last-access-time-in-windows-vista-to-improve-ntfs-performance.aspx
To reenable last access time on the computer, you can run the following command:
fsutil behavior set disablelastaccess 0
As James has pointed out, LastAccessTime is not updated.
The LastWriteTime has also undergone a twist since Vista. When a process still has the file open and another process checks the LastWriteTime, it will not see the new write time for a long time - not until the writing process has closed the file.
As a workaround, you can open and close the file from your external process. After you have done that, you can read the LastWriteTime again and it will be up to date.
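A minimal sketch of that workaround (the path is a placeholder); opening the file with read access and immediately disposing the stream is enough to make the next query return the current value:

using System;
using System.IO;

class LastWriteTimeWorkaround
{
    static DateTime GetFreshLastWriteTime(string path)
    {
        // Open and immediately close the file; FileShare.ReadWrite lets this succeed
        // even while the writing process still has the file open.
        using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
        }
        return File.GetLastWriteTime(path);
    }

    static void Main()
    {
        Console.WriteLine(GetFreshLastWriteTime(@"C:\logs\current.log"));   // placeholder path
    }
}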
File System Tunneling:
If an application implements something like a rolling logger, which closes the file and then renames it to a different file name, you will also run into issues, since the creation time and file size of the "old" file are remembered by the OS even though you created a new file. This includes wrong reports of the file size, even if you recreated log.txt from scratch and it is still 0 bytes in size. This feature is called file system tunneling, and it is still present on Windows 8.1. For an example of how to work around this issue, check out the RollingFlatFileTraceListener from Enterprise Library.
You can see the effects of file system tunneling on your own machine from the cmd shell.
echo test > file1.txt
ren file1.txt file2.txt
Wait one minute
echo test > file1.txt
dir /tc file*.txt
...
05.07.2015 19:26 7 file1.txt
05.07.2015 19:26 7 file2.txt
The file system is a state machine. Keeping states correctly synchronized is hard if you care about performance and correctness.
This strange tunneling behaviour is apparently still relied on by applications which, for example, autosave a file, move it to a safe location and then recreate the file at the same location. For these applications it makes no sense to give the file a new creation date, because it was only copied around. Some installers also use such tricks, moving files temporarily to a different location and writing the contents back later, to get past "file exists" checks in install hooks.
Have you tried calling Refresh() just before accessing the property (to avoid getting a cached value)? If that doesn't work, have you looked at what Explorer shows at the same time? If Explorer is showing the wrong information, then it's probably something you can't really address - it might be that the information is only updated when the file handle is closed, for example.
There is a setting in Windows, sometimes enabled especially on server systems, which stops modified and accessed times for files from being updated, for better performance.
From MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
FileSystemInfo.Refresh()
If your application is the one doing the writing, I think you are going to have to "touch" the file by setting the LastWriteTime property yourself between each buffer of data you write. Some pseudocode:
while (bytesWritten < totalBytes)
{
    br.Write(buffer);                          // br is e.g. a BinaryWriter
    bytesWritten += buffer.Length;
    myFileInfo.LastWriteTime = DateTime.Now;   // "touch" the file
}
I'm not sure how severely this will affect write performance.
Tommy Carlier's answer got me thinking....
A good way to visualise the difference is to separately run the two snippets below (I just used LINQPad) while also running Sysinternals Process Monitor.
while (true)
    File.GetLastAccessTime([file path here]);
and
FileInfo bob = new FileInfo(path);
while (true)
{
    string accessed = bob.LastAccessTime.ToString();
}
If you look at Process Monitor while running the first snippet, you will see repeated and constant access attempts to the file from the LINQPad process. The second snippet will do an initial access of the file, for which you will see activity in Process Monitor, and then very little afterwards.
However, if you go and modify the file (I just opened the text file I was monitoring, added a character and saved it), you will see a series of access attempts by the LINQPad process to the file in Process Monitor.
This illustrates the non-cached and cached behaviour of the two approaches respectively.
Will the non-cached approach wear a hole in the hard drive?!
EDIT
I went away feeling all clever after my testing and then used the caching behaviour of FileInfo in my Windows service (basically to sit in a loop asking "has the file changed, has the file changed..." before doing any processing).
While this approach worked on my dev box, it did not work in the production environment, i.e. the process just kept running regardless of whether the file had changed or not. I ended up changing my approach and just used GetLastAccessTime as part of the check. I don't know why it would behave differently on the production server, but I am not too concerned at this point.
