C# File change/create/delete event

I'm currently getting into security and trying to build a basic "access limiter" which should register when a file or folder in a specific directory (including subdirectories) is created, changed, deleted, etc.
I already figured out how to detect it after the action has taken place using FileSystemWatcher, but I want to catch the request/event before it happens so I can process it. I have searched for a while but haven't really found a solution yet.
If something like this is possible, I would be grateful for tips or short samples/references.
Thanks in advance.

As you've stated, you can already see what happens after the fact that something has been done to a file. To my knowledge, you can't preempt file system access from .NET, as this happens at a much lower layer (right above storage). If you are trying to ensure security, you are better off focusing on NTFS/share-level security (standard permissions, like Blorgbeard said).
If you really want to intercept file system calls, a file system filter driver may be what you're after, but it looks very difficult.
Intro to the concept: http://msdn.microsoft.com/en-us/library/windows/hardware/dn641617
Another SO answer restating the same thing (which I based the above link on): How to intercept the access to a file in a .NET Program
Alternatively, you can tailor your security app to enumerate the security settings on files/folders as a preventative measure, ensuring the proper file system security settings are in place, or take corrective action if they are not. You can use the FileSecurity class to get an idea of what is available to you (quite a bit!): http://msdn.microsoft.com/en-us/library/system.security.accesscontrol.filesecurity(v=vs.110).aspx
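For example, here is a minimal sketch (the path is a placeholder) that dumps the access rules on one file with FileSecurity; note that on .NET Core/5+ GetAccessControl comes from the System.IO.FileSystem.AccessControl package as an extension method rather than living on FileInfo itself:

    using System;
    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class AclDump
    {
        static void Main()
        {
            // Placeholder path; point this at a file you want to inspect.
            var file = new FileInfo(@"C:\SensitiveData\report.docx");
            FileSecurity security = file.GetAccessControl();

            // List every explicit and inherited rule, resolved to account names.
            foreach (FileSystemAccessRule rule in
                     security.GetAccessRules(true, true, typeof(NTAccount)))
            {
                Console.WriteLine("{0,-35} {1,-6} {2}",
                    rule.IdentityReference.Value,   // e.g. BUILTIN\Users
                    rule.AccessControlType,         // Allow or Deny
                    rule.FileSystemRights);         // e.g. FullControl, Modify
            }
        }
    }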
If you want to go the audit route and get stuck on using the FileSecurity and/or DirectorySecurity class, I would post a separate question and we can tackle that.
EDIT:
Some actual code on writing the filter driver, if you are so inclined (beware, no .NET):
http://www.codeproject.com/Articles/43586/File-System-Filter-Driver-Tutorial

Related

Prevent other processes and users from accessing a file

I have a Windows service written in C# running on a machine and it creates and uses a number of files. Is there a way to prevent a user on the machine, administrators included, from messing with these files (moving, editing, renaming, deleting) from code?
I know that StreamWriter can achieve this, but I don't want to keep the files open all the time when I don't actually need to access the data in them, and I can't seem to find any other way.
EDIT: Let me rephrase the question based on the comments below. Is there a way to set up an ACL so that only my service can access the files? I would also accept it if only services could access the files (I have seen mention of an "All Services" security group in the Microsoft Docs, but I can't seem to actually find it on the system or in .NET).
You can do it by changing access privileges, BUT I strongly suggest simply keeping them open (just be careful to flush the stream after each batch write).
In the first part I try to address your question directly ("How to prevent..."), and in the second part I outline a different approach (make your application resilient: keep a backup).
How to prevent...
Assuming that you're running on Windows, to keep other users from messing with the files you should:
Set the hidden attribute. By default hidden files are not shown, so many users won't even see them. If you can do it at the directory level, even better.
Change the ACL to deny Full Control to the Users and Administrators groups. Better if you cherry-pick and just leave Read permissions. Windows picks the most restrictive policy when a user belongs to two groups, so this will effectively stop everyone from writing to that file (if you also deny Read permissions then they won't even be able to see its content, but see later). See the sketch after this list.
Create a special group (with the required permissions, and only those) with one single user. Be sure that user isn't automatically added to the Users group.
Change your application to impersonate that user when writing those files. If you left the Read permissions in place, the code for reading isn't affected.
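Here is a rough sketch of the ACL point above: denying Write and Delete to the built-in Users group with FileSecurity while leaving Read alone (the path is made up, and you need sufficient rights to change the ACL in the first place):

    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class DenyUsersWrite
    {
        static void Main()
        {
            // Made-up path; point it at the files your application owns.
            var file = new FileInfo(@"C:\MyService\data.bin");
            FileSecurity acl = file.GetAccessControl();

            // Deny Write and Delete to BUILTIN\Users; Read is left untouched so
            // existing code that only reads the file keeps working.
            var users = new SecurityIdentifier(WellKnownSidType.BuiltinUsersSid, null);
            acl.AddAccessRule(new FileSystemAccessRule(
                users,
                FileSystemRights.Write | FileSystemRights.Delete,
                AccessControlType.Deny));

            file.SetAccessControl(acl);
        }
    }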
Don't forget to check with different versions and editions of Windows (HomeUsers keeps bouncing around in my mind). If your application is a Windows Service then things may be slightly easier; see eryksun's comment.
You can experiment with all these things simply by using Windows Explorer; just find the right balance, but don't forget that every single installation is a different world and only God knows what the environment is (but he doesn't know why).
A few obvious drawbacks:
An administrator can ALWAYS do what he wants, so they may find those files and revert the permissions. I think (but I'm not sure) that the System Installer has some special privileges to prevent this, though I can't imagine how it does it.
Installation is way more complicated (and you will need an installer if you don't have one). You may do it the first time the application is executed, but then you will need administrative privileges (just once, but probably worse).
Your code is more complex.
More setup means more things that may go wrong; balance this against the effort required from your technical support team.
Updates (and the tech support's job) will be more complicated.
Users with certain privileges won't be affected (see the other comment), but this is really a good thing and you shouldn't ever try to circumvent it.
Backup is the key!
Don't forget that if they really want to break your application then they will just delete the application directory...
I think (but I don't know your specific use case) that maybe you're approaching the problem from the wrong angle. If what you want is to prevent the user from corrupting your data files (intentionally or not), then what you need is a BACKUP. Save a copy in a different location each time you write them, mark it as hidden, and live happily. If they're not too big you may even save the content directly inside the Windows Registry. For encrypted/hashed/checksummed files your application can easily detect when they're broken or missing: just restore the backup and you're done.
I don't want to keep the files open all the time
But keeping them open is a good way that closely follows your intent and requirements.
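A minimal sketch of the keep-them-open approach (the path is made up): hold the handle with FileShare.None and flush after each batch write so nothing is lost if the service dies.

    using System.IO;

    class ExclusiveWriter
    {
        // Opened once at service start and held for the service's lifetime.
        // FileShare.None means no other process can open, rename or delete
        // the file while this handle exists. The path is a placeholder.
        static readonly FileStream Stream = new FileStream(
            @"C:\MyService\data.bin",
            FileMode.OpenOrCreate,
            FileAccess.ReadWrite,
            FileShare.None);

        public static void WriteBatch(byte[] batch)
        {
            Stream.Write(batch, 0, batch.Length);
            Stream.Flush(true);   // true = flush to disk, not just the OS cache
        }
    }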
As long as it's not about hundreds of files or more, this seems the best option.
The other way is to set the security properties (ACL), but that is messy and requires higher privileges.
Excluding the Admin is not totally possible, and you should not really want that. Avoiding accidental deletes or renames is doable; total control is not.
Two other options are:
Set some permissions on the locations where the files are so that no one else can access them
If all of the files in question will be created by your application, you could check the options in CreateFile, where you can set the sharing options to 0x00000000 to "Prevent other processes from opening a file or device if they request delete, read, or write access."
If you want to use CreateFile, I guess you will have to P/Invoke it.
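If you do go the P/Invoke route, a sketch might look like the following (constants and error handling kept minimal). Note that new FileStream(path, mode, access, FileShare.None) gives you the same share-nothing behaviour without any P/Invoke at all:

    using System;
    using System.IO;
    using System.Runtime.InteropServices;
    using Microsoft.Win32.SafeHandles;

    static class ExclusiveOpen
    {
        const uint GENERIC_READ          = 0x80000000;
        const uint GENERIC_WRITE         = 0x40000000;
        const uint OPEN_ALWAYS           = 4;
        const uint FILE_ATTRIBUTE_NORMAL = 0x80;

        [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        static extern SafeFileHandle CreateFile(
            string lpFileName, uint dwDesiredAccess, uint dwShareMode,
            IntPtr lpSecurityAttributes, uint dwCreationDisposition,
            uint dwFlagsAndAttributes, IntPtr hTemplateFile);

        public static FileStream Open(string path)
        {
            // dwShareMode = 0: no other process may open the file for
            // read, write or delete while this handle stays open.
            SafeFileHandle handle = CreateFile(
                path, GENERIC_READ | GENERIC_WRITE, 0,
                IntPtr.Zero, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, IntPtr.Zero);

            if (handle.IsInvalid)
                throw new IOException("CreateFile failed",
                    Marshal.GetHRForLastWin32Error());

            return new FileStream(handle, FileAccess.ReadWrite);
        }
    }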

Seeking Advice on Design for Multiple Users working in the same file

I am at the earliest stage of writing a desktop application for use by multiple users. I am looking for advice on the best way to approach this.
The Spec
I will persist my Model in a file which would often be used on a mapped network drive. (It is for the design of roadways and other linear features like railways and streams.)
The various end users need to be able to connect to and edit the file simultaneously. For example, Billy Bob is working on the road named US321 while Rupert is working on I40. The models for each road live in the same file. End users can "claim" any road name, in which case only the claimant can edit the given road. Rupert can't edit US321 while Billy Bob has it claimed, but Rupert can read US321 for reference. Once a user is finished editing the road data, he can release the claim and someone else can edit it.
Limitations on Serialization?
My understanding of Serialization is quite limited (see my profile). But it looks to me like there is a one-to-one correlation between objects and serialization files. So if I use serialization to implement this, it would not be possible to claim just a part of it nor would it be possible to update only a part of it. (Is this correct? If not, then I can use Serialization, right?)
The Solution I am Considering
I am considering using SQL Server Express, and I am interested in the community's warnings, corrections, or affirmations on this.
The end users would not have to know that I am using SQL Server Express in the background. (I would even change the file extension to something suitable to my app.) I would load roads into a list, and each road would be "claimable". Claiming a road would mark it in the database for the other instances of the app to react to accordingly, kind of like a shared MS Excel file that multiple people can edit simultaneously, but (in the Excel analogy) with the ability to lock individual worksheets.
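To illustrate what I mean by claiming, a rough sketch against SQL Server Express might look like this (the Roads table, its columns, and the connection string are invented for the example); the single UPDATE only succeeds when nobody else holds the claim, so the server arbitrates races between users:

    using System.Data.SqlClient;

    static class RoadClaimTable
    {
        // Invented schema: Roads(Name, ClaimedBy, ...).
        public static bool TryClaim(string connectionString, string roadName, string userId)
        {
            const string sql =
                @"UPDATE Roads
                  SET ClaimedBy = @user
                  WHERE Name = @road AND ClaimedBy IS NULL";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@road", roadName);
                cmd.Parameters.AddWithValue("@user", userId);
                conn.Open();
                return cmd.ExecuteNonQuery() == 1;   // one row updated => claim won
            }
        }
    }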
[Edit] See Micah Armantrout's very informative response, below. So now I am wondering about using Microsoft Access as the intermediating db app.
[Edit]
Conclusion
Thanks to everyone for their helpful answers and comments. Micah's answer was very helpful since I did not realize I would be limited to the file being controlled by only one server. Although it makes perfect sense now, I had not anticipated it, and if I had gone that route, I would have run aground on it after many hours of working in that direction.
When I first read urbadave's idea, I dismissed it as something I had already considered and not liked. But after thinking it over, it is clearly the simplest approach. I just use a directory as if it were a file, but with user transparency for my top-level sub-objects. But there clearly is an appeal to having my whole model be encapsulated into a single file.
So this is what I have decided to do: Start with just writing to a directory, just as urbadave suggests. Then later test out putting it in a zip directory and using the ZipPackage class to pluck out and insert the individual serialized files (or XML files -- another decision I have to make some day).
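For the later zip experiment, I'm picturing something like this rough sketch with the packaging API (Package.Open gives back a ZipPackage for a zip file; the path, part names and content type are just guesses on my part, and the project needs a reference to WindowsBase):

    using System;
    using System.IO;
    using System.IO.Packaging;   // requires a reference to WindowsBase

    static class RoadPackage
    {
        // Made-up file name; it is an ordinary zip archive under the hood.
        const string PackagePath = @"C:\Projects\Highway.roads";

        // Insert (or replace) one serialized road without rewriting the others.
        public static void WriteRoad(string roadName, byte[] serializedRoad)
        {
            Uri partUri = PackUriHelper.CreatePartUri(
                new Uri("roads/" + roadName + ".xml", UriKind.Relative));

            using (Package package = Package.Open(PackagePath, FileMode.OpenOrCreate))
            {
                if (package.PartExists(partUri))
                    package.DeletePart(partUri);

                PackagePart part = package.CreatePart(partUri, "text/xml");
                using (Stream stream = part.GetStream())
                    stream.Write(serializedRoad, 0, serializedRoad.Length);
            }
        }

        // Pluck one road back out as raw bytes for deserialization.
        public static byte[] ReadRoad(string roadName)
        {
            Uri partUri = PackUriHelper.CreatePartUri(
                new Uri("roads/" + roadName + ".xml", UriKind.Relative));

            using (Package package = Package.Open(PackagePath, FileMode.Open, FileAccess.Read))
            using (Stream stream = package.GetPart(partUri).GetStream())
            using (var buffer = new MemoryStream())
            {
                stream.CopyTo(buffer);
                return buffer.ToArray();
            }
        }
    }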
Paul
SQL Server will work for what you are looking for, but if you're going to have multiple users you need to have a machine set up to be a server. It will not do you any good to have SQL Server Express installed on each machine. It might be one of the users' machines or an actual server with SQL Server Express, but you are going to need to set it up to be accessible outside of that machine; to do this, follow this tutorial.
If you are using anything past Windows XP SP2 you will need to open up the ports on the firewall; follow these instructions (this is also discussed in the link below).
http://blogs.msdn.com/b/sqlexpress/archive/2005/05/05/415084.aspx
As far as sharing data, I mean seeing other people's work. If you do not want to install SQL Server on a server, you can use MS Access. I would refer you to an article on which one to use when:
http://www.techrepublic.com/article/should-you-use-sql-server-express-edition-or-microsoft-access-for-your-small-business-applications/6140859
While I have access to a nice database at work, most of my personal programming does not use a database. One of the tricks I've used in the past is that file extensions are meant to carry meaning. In your case, you can exploit file extensions to indicate claims and control writing to the master file.
You're right, you would want to serialize each road object to its own file. The Master File would be the serialization of a collection object that holds all of these individual road objects.
The users select and open these road files. Before opening the file, the user's app renames the file, adding an extension (perhaps the user's id). This way, you can use directory scans to find claimed and unclaimed files.
The master file is only written to when the user releases their claim on the road they are working on. The user's app opens all the road files, assembles a master object using the road objects, and then serializes this object into the master file. When finished, the user's app releases the user's claim on the road file by renaming it.
Before writing to the master file, the user's app renames the file, indicating it is about to be written to. If another user's app needs to write, it can check whether the file is renamed and wait for the file name to be restored to a writable name.
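A rough sketch of the renaming idea (the share path and the user-id-as-extension convention are just placeholders):

    using System;
    using System.IO;
    using System.Linq;

    static class RoadClaims
    {
        const string Folder = @"\\server\share\roads";   // placeholder path

        // Claim a road file by renaming it to carry the user's id.
        // The rename is atomic on a single volume, so two users can't both win.
        public static bool TryClaim(string roadName, string userId)
        {
            string free    = Path.Combine(Folder, roadName + ".road");
            string claimed = free + "." + userId;
            try
            {
                File.Move(free, claimed);
                return true;
            }
            catch (IOException)        // already claimed, renamed, or missing
            {
                return false;
            }
        }

        // Directory scan: anything with an extra extension is claimed.
        public static string[] ClaimedRoads()
        {
            return Directory.GetFiles(Folder, "*.road.*")
                            .Select(Path.GetFileName)
                            .ToArray();
        }

        // Releasing the claim restores the plain file name.
        public static void Release(string roadName, string userId)
        {
            File.Move(Path.Combine(Folder, roadName + ".road." + userId),
                      Path.Combine(Folder, roadName + ".road"));
        }
    }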
This is a sketch of how I would attack this spec. Good luck.

How to disable an exe file after first installation?

Does anybody know a solution for this? I create an exe file for my software. After the first installation I have to disable the exe so it cannot be run again, because when someone purchases the software from me they can install it only once.
To do this you'll need to store something somewhere; that something could be:
A file
A registry entry (see the sketch after this list)
A call to a web service you own that stores a unique identifier for the machine, and is checked on subsequent installation attempts (Note: If you choose this method you must be clear and up-front with your users that it's what you're doing).
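A minimal sketch of the registry option (the key path and value name are made up; writing under HKEY_LOCAL_MACHINE needs elevation, so the installer usually does it rather than the app):

    using System;
    using Microsoft.Win32;

    static class InstallMarker
    {
        // Invented key; anything a user is unlikely to stumble over will do.
        const string KeyPath = @"SOFTWARE\MyCompany\MyProduct";

        public static bool AlreadyInstalled()
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(KeyPath))
            {
                return key != null && key.GetValue("Installed") != null;
            }
        }

        public static void MarkInstalled()
        {
            // Requires administrative rights when targeting HKLM.
            using (RegistryKey key = Registry.LocalMachine.CreateSubKey(KeyPath))
            {
                key.SetValue("Installed", DateTime.UtcNow.ToString("o"));
            }
        }
    }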
Bear in mind that a determined user will be able to circumvent file and registry methods and also quite possibly the web service method. The former two by using something such as Process Monitor to identify the files/registry entries you're writing to and clear them. For the latter, by using something like Fiddler to identify the web service calls you're making and replacing the responses with ones that allow them to bypass your protection.
Remember, ultimately the user can disassemble your code and remove the protection mechanisms you've put in place, so don't rely on them being 100% unbreakable.
Forget it, mate. It's software - you absolutely cannot enforce something like that because the user has complete control over the environment where the binary runs, including reverse engineering, virtualization, backups etc. etc. And the ones who you want to foil are precisely the ones who will go to any length to thwart any protection measure you could invent.
No, the only thing that works is to force an online connection and register, on your system, the fact that a particular binary was installed once, then forbid it the next time. That requires you to make each installer different and have a cryptographically strong key generator, and it's still susceptible to replay attacks - but it's the only thing that is not useless by definition.
(Well, either that, or make your software so insanely great that people will fall in love with you and want to give you the money. That solution is probably even harder.)
You could store the installation path in the registry or some secret location and have your .exe check it; if it has started from a location different from the one stored, it simply exits, as you probably don't want to tell the user what you are doing.
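A sketch of that check (the registry key and value name are invented; the idea is just to compare the stored path with where the exe is actually running from and bail out quietly):

    using System;
    using System.Reflection;
    using Microsoft.Win32;

    class Program
    {
        static void Main()
        {
            // Written by the installer; the key and value names are made up here.
            string expected = Registry.GetValue(
                @"HKEY_LOCAL_MACHINE\SOFTWARE\MyCompany\MyProduct",
                "InstallPath", null) as string;

            string actual = Assembly.GetExecutingAssembly().Location;

            // Exit silently if the exe was copied somewhere else.
            if (expected == null ||
                !string.Equals(expected, actual, StringComparison.OrdinalIgnoreCase))
            {
                Environment.Exit(0);
            }

            // ...normal startup continues here...
        }
    }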

C# solution for analysing files as they are written/modified

I have several projects that require me to monitor files, and then edit them as they are getting written to disk. I have a feeling that what I am looking for is operationally the same as how anti-virus tools operate. Let me give more details:
1) I need to trap all files saved by Office applications, and then add specific company tags to the headers/footers of each document as they are getting written to disk.
2) I need to know immediately when an editable file (of pretty much any type) is written to disk, so that I can undertake some scanning operations to check if the file's content meets certain company policies.
In short, you can see that I need to process any user files as they are being written to disk.
Here is my problem. I want to use C# for this task, but I am not sure if it has the ability to meet my requirements. Everything I have seen on the net is geared towards lower-level C programming, which I specifically want to avoid due to time constraints for this project. Is anyone aware of how to easily do this task in C#? Is it even feasible (i.e. too high-level a language, too slow a language, etc.)?
Performance won't be the issue. I guess I'd question the entire process; it sounds like a recipe for disaster. You can easily hack something together in C# using a FileSystemWatcher in a matter of minutes, but it will be fraught with issues. AV software is bad enough about locking files and screwing up various software, and it's not even trying to modify the file. How do you know when the other app is "done" writing the file? What do you do when you've got the file locked and something else breaks because it can't get access?
Have you looked at the FileSystemWatcher?
C# can easily do this. Look at the FileSystemWatcher class (http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx).
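If you do go down that road, a minimal FileSystemWatcher sketch looks like this (the watched folder is just an example); keep in mind it only tells you about changes after they have hit the disk, and Changed typically fires more than once per save:

    using System;
    using System.IO;

    class Watcher
    {
        static void Main()
        {
            // Example folder; IncludeSubdirectories covers nested files too.
            var watcher = new FileSystemWatcher(@"C:\Users\Public\Documents")
            {
                IncludeSubdirectories = true,
                NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite
            };

            watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
            watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);
            watcher.Deleted += (s, e) => Console.WriteLine("Deleted: " + e.FullPath);

            watcher.EnableRaisingEvents = true;

            Console.WriteLine("Watching... press Enter to quit.");
            Console.ReadLine();
        }
    }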

How to determine who changed a file?

In Windows, how can I programmatically determine which user account last changed or deleted a file?
I know that setting up object access auditing may be an option, but if I use that I then have the problem of trying to match up audit log entries to specific files... sounds complex and messy! I can't think of any other way, so does anyone either have any tips for this approach or any alternatives?
You can divide your problem into two parts:
Write to a log whenever a file is accessed.
Parse, filter and present the relevant information of the log.
Of those two, part 1, writing to the log, is a built-in function through auditing, as you mention. Reinventing that would be hard and would probably never be as good as the built-in functionality.
I would use the built-in functionality for logging by setting up an audit ACL on those files. Then I would focus my efforts on providing a good interface that reads the event log, filters out the relevant events, and presents them in a way that is suitable and relevant for your users.
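For instance, something along these lines adds an audit (SACL) entry for writes and deletes by Everyone on one file. The path is an example; the code has to run elevated (it needs the security privilege), and "Audit object access" must be enabled in the local security policy for entries to actually show up in the Security event log:

    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class AddAuditRule
    {
        static void Main()
        {
            // Example file to audit.
            var file = new FileInfo(@"C:\Shared\important.docx");
            FileSecurity security = file.GetAccessControl(AccessControlSections.Audit);

            // Audit successful and failed write/delete attempts by anyone.
            var everyone = new SecurityIdentifier(WellKnownSidType.WorldSid, null);
            security.AddAuditRule(new FileSystemAuditRule(
                everyone,
                FileSystemRights.Write | FileSystemRights.Delete,
                AuditFlags.Success | AuditFlags.Failure));

            file.SetAccessControl(security);
        }
    }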
You could always create a file system filter. This might be overkill, but it depends on your purposes. You can have it load at boot, and it sits behind pretty much every file access (it's what virus scanners usually use to scan files as they are accessed).
You simply need to log the "owner" of the application that is writing to the file.
Also see the MSDN documentation
The only way I know of to do this is to set up a FileSystemWatcher and keep it running. Oh, and if it's across a network drive, it may randomly lose connection, so it may be good to force a disconnect/reconnect every few hours just to make sure it has a fresh connection.
