In Windows, how can I programmatically determine which user account last changed or deleted a file?
I know that setting up object access auditing may be an option, but if I use that I then have the problem of trying to match up audit log entries to specific files... sounds complex and messy! I can't think of any other way, so does anyone either have any tips for this approach or any alternatives?
You can divide your problem into two parts:
Write to a log whenever a file is accessed.
Parse, filter and present the relevant information of the log.
Of those two, part 1, writing to the log, is built-in functionality through auditing, as you mention. Reinventing it would be hard and would probably never be as good as the built-in functionality.
I would use the built-in functionality for logging by setting up an audit ACL on those files. Then I would focus my efforts on providing a good interface that reads the event log, filters out relevant events, and presents them in a way that is suitable and relevant for your users.
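A minimal sketch of that setup, assuming .NET Framework on Windows, administrative rights (setting a SACL requires the security privilege), and that "Audit object access" is already enabled in the local security policy; the path is hypothetical:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class AuditExample
    {
        static void Main()
        {
            const string path = @"C:\Data\important.txt"; // hypothetical file to track

            // Part 1: add an audit ACL so Windows logs successful writes/deletes.
            FileSecurity security = File.GetAccessControl(path, AccessControlSections.Audit);
            var everyone = new SecurityIdentifier(WellKnownSidType.WorldSid, null);
            security.AddAuditRule(new FileSystemAuditRule(
                everyone,
                FileSystemRights.WriteData | FileSystemRights.Delete,
                AuditFlags.Success));
            File.SetAccessControl(path, security);

            // Part 2: read matching events back (4663 = "an attempt was made to
            // access an object") and filter them down to the file in question.
            var securityLog = new EventLog("Security");
            foreach (EventLogEntry entry in securityLog.Entries)
            {
                if (entry.InstanceId == 4663 && entry.Message.Contains(path))
                    Console.WriteLine("{0}: {1}", entry.TimeGenerated, entry.Message);
            }
        }
    }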
You could always create a file system filter. This might be overkill, but it depends on your purposes. You can have it load at boot, and it sits behind pretty much every file access (it's what virus scanners usually use to scan files as they are accessed).
Then you simply need to log the "owner" of the application that is writing to the file.
Also see the MSDN documentation.
The only way I know of to do this is to set up a FileSystemWatcher and keep it running. Oh, and if it's across a network drive, it may randomly lose connection, so it may be good to force a disconnect/reconnect every few hours just to make sure it has a fresh connection.
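A minimal sketch of that approach (the UNC path is hypothetical); note that FileSystemWatcher only tells you that something changed, not which account did it, and the Error event is the hook for recreating a watcher that lost its connection:

    using System;
    using System.IO;

    class WatcherExample
    {
        static FileSystemWatcher watcher;

        static void Main()
        {
            Start(@"\\server\share"); // hypothetical network path
            Console.WriteLine("Watching; press Enter to quit.");
            Console.ReadLine();
        }

        static void Start(string path)
        {
            watcher = new FileSystemWatcher(path)
            {
                IncludeSubdirectories = true,
                NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite
            };
            watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);
            watcher.Deleted += (s, e) => Console.WriteLine("Deleted: " + e.FullPath);
            // On network drives the watcher can silently lose its connection;
            // the Error event is where you tear it down and start fresh.
            watcher.Error += (s, e) => { watcher.Dispose(); Start(path); };
            watcher.EnableRaisingEvents = true;
        }
    }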
I have a Windows service written in C# running on a machine, and it creates and uses a number of files. Is there a way to prevent a user on the machine, administrators included, from messing with these files (moving, editing, renaming, deleting) from code?
I know that StreamWriter can achieve this, but I don't want to keep the files open all the time without actually needing to access the data in them, and I can't seem to find any other way.
EDIT: Let me rephrase the question based on the comments below. Is there a way to set up ACLs so that only my service can access the files? I would also accept it if only services could access the files (I have seen mention of an "All Services" security group in the Microsoft Docs, but I can't seem to actually find it on the system or in .NET).
You can do it by changing access privileges, BUT I strongly suggest simply keeping them open (just be careful to flush the stream after each batch of writes).
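A minimal sketch of what that looks like, with a hypothetical path: opening with FileShare.None means no other process can open the file at all, which blocks moving, renaming, deleting, and editing for as long as the handle is held.

    using System;
    using System.IO;

    class KeepOpenExample
    {
        static void Main()
        {
            // FileShare.None: any other attempt to open the file fails until we close it.
            using (var stream = new FileStream(@"C:\MyService\data.txt", // hypothetical path
                       FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None))
            using (var writer = new StreamWriter(stream))
            {
                writer.WriteLine("a batch of records");
                writer.Flush(); // flush after each batch so a crash loses as little as possible
                Console.ReadLine(); // while we wait here, the file cannot be touched
            }
        }
    }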
In the first part I address your question directly ("How to prevent...") and in the second part I outline a different approach (make your application resilient: keep a backup).
How to prevent...
Assuming that you're running on Windows, to prevent other users from messing with the files you should:
Set the hidden attribute. By default Explorer doesn't show hidden files, and many users won't even see them. If you can do it at the directory level, even better.
Change the ACL to deny Full Control to the Users and Administrators groups. Better if you cherry-pick and just leave Read permissions in place. Windows picks the most restrictive policy, even when a user belongs to two groups, so this will effectively stop everyone from writing to the file (if you also deny Read permissions then they won't even be able to see its content, but see later). A sketch of this step follows the list.
Create a special group (with the required permissions, and only those) with one single user. Be sure that user isn't automatically added to the Users group.
Change your application to impersonate that user when writing those files. If you left the Read permissions in place, then code for reading isn't affected.
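Here is the sketch promised above for the ACL step, assuming a hypothetical path and an elevated process (and remember the drawback below: an administrator can always take ownership and undo this):

    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class LockDownExample
    {
        static void Main()
        {
            const string path = @"C:\MyService\data.bin"; // hypothetical file

            FileSecurity security = File.GetAccessControl(path);
            foreach (var sidType in new[] { WellKnownSidType.BuiltinUsersSid,
                                            WellKnownSidType.BuiltinAdministratorsSid })
            {
                // Deny rules beat allow rules, so Write/Delete is blocked even for
                // members of groups that are otherwise allowed. Read is untouched.
                security.AddAccessRule(new FileSystemAccessRule(
                    new SecurityIdentifier(sidType, null),
                    FileSystemRights.Write | FileSystemRights.Delete,
                    AccessControlType.Deny));
            }
            File.SetAccessControl(path, security);

            // Step 1 from the list: set the hidden attribute as well.
            File.SetAttributes(path, File.GetAttributes(path) | FileAttributes.Hidden);
        }
    }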
Don't forget to check with different versions and editions of Windows (HomeUsers keeps bouncing around in my mind). If your application is a Windows service then things may be slightly easier; see eryksun's comment.
You can experiment with all these things simply using Windows Explorer, just find the right balance but don't forget that each single installation is a different world and only God knows what the environment is (but he doesn't know why).
A few obvious drawbacks:
An administrator can ALWAYS do what he wants, so they may find those files and revert the permissions. I think (though I'm not sure) that the System Installer has some special privileges to prevent this, but I can't imagine how to replicate it.
Installation is way more complicated (and you will need an installer if you don't have one). You may do the setup the first time the application is executed, but then you will need administrative privileges (just once, but that's probably worse).
Your code is more complex.
More setup means more things that may go wrong; balance this against the effort of your technical support team.
Updates (and the tech support team's job) will be more complicated.
Users with certain privileges won't be affected (see another comment), but this is really a good thing and you shouldn't ever try to circumvent it.
Backup is the key!
Don't forget that if they really want to break your application then they will just delete the application directory...
I think, though I don't know your specific use-case, that maybe you're approaching the problem from the wrong angle. If what you want is to prevent the user from corrupting your data files (intentionally or not), then what you need is a BACKUP. Save a copy in a different location each time you write them, mark it as hidden, and live happy. If they're not too big you may even save the content directly inside the Windows Registry. For encrypted/hashed/checksummed files your application can easily detect when they're broken or missing: just restore the backup and you're done.
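A minimal sketch of the backup idea, with hypothetical paths; the integrity check is left to whatever your file format allows (hash, checksum, parse attempt):

    using System.IO;

    class BackupExample
    {
        // Write the real file, then immediately refresh a hidden backup copy.
        static void SaveWithBackup(string path, byte[] content)
        {
            File.WriteAllBytes(path, content);
            string backup = path + ".bak"; // hypothetical backup location
            File.Copy(path, backup, true);
            File.SetAttributes(backup, File.GetAttributes(backup) | FileAttributes.Hidden);
        }

        // If the original went missing (or failed your integrity check), restore it.
        static byte[] LoadOrRestore(string path)
        {
            string backup = path + ".bak";
            if (!File.Exists(path) && File.Exists(backup))
                File.Copy(backup, path);
            return File.ReadAllBytes(path);
        }

        static void Main()
        {
            SaveWithBackup(@"C:\MyApp\settings.dat", new byte[] { 1, 2, 3 });
            byte[] restored = LoadOrRestore(@"C:\MyApp\settings.dat");
        }
    }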
I don't want to keep the files open all the time
But keeping them open is a good way that closely follows your intent and requirements.
As long as it's not hundreds of files or more, this seems the best option.
The other way is to set the security properties (ACL) but that is messy and requires a higher privilege.
Excluding the Admin is not totally possible and you should not really want that. Avoiding accidental delete or rename is doable, total control is not.
Two other options are:
Set permissions on the locations where the files are so that no one else can access them.
If all of the files in question will be created by your application, you could check the options in CreateFile, where you can set the sharing options to 0x00000000 to "Prevent other processes from opening a file or device if they request delete, read, or write access."
If you want to use CreateFile, I guess you will have to P/Invoke it.
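A minimal P/Invoke sketch along those lines (the path is hypothetical); note that in pure .NET, new FileStream(..., FileShare.None) achieves the same zero-sharing effect without any interop:

    using System;
    using System.IO;
    using System.Runtime.InteropServices;
    using Microsoft.Win32.SafeHandles;

    class CreateFileExample
    {
        [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        static extern SafeFileHandle CreateFile(
            string lpFileName, uint dwDesiredAccess, uint dwShareMode,
            IntPtr lpSecurityAttributes, uint dwCreationDisposition,
            uint dwFlagsAndAttributes, IntPtr hTemplateFile);

        const uint GENERIC_READ  = 0x80000000;
        const uint GENERIC_WRITE = 0x40000000;
        const uint OPEN_ALWAYS   = 4;

        static void Main()
        {
            // dwShareMode = 0: no other process may open the file for read,
            // write, or delete while this handle is open.
            SafeFileHandle handle = CreateFile(@"C:\MyService\data.bin",
                GENERIC_READ | GENERIC_WRITE, 0, IntPtr.Zero, OPEN_ALWAYS, 0, IntPtr.Zero);
            if (handle.IsInvalid)
                throw new IOException("CreateFile failed", Marshal.GetLastWin32Error());

            using (var stream = new FileStream(handle, FileAccess.ReadWrite))
            {
                // ... the file stays exclusively locked until the handle is closed ...
            }
        }
    }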
I'm currently getting into security stuff and trying to build a basic "access limiter" which should register when a file or folder in a specific directory (including subdirectories) is created, changed, deleted, etc.
I already figured out how to do it after the action has taken place using FileSystemWatcher, but I want to catch the request/event before it happens so I can process it. I've searched for a while but haven't really found a solution yet.
If something like this is possible I would be grateful for tips or short samples / references.
Thanks in advance.
As you've stated, you can already see what happens after the fact that something has been done to a file. To my knowledge, you can't preempt file system access from .Net as this is happening on a much lower layer (right above storage). If you are trying to ensure security, you are better off focusing on NTFS/Share level security (standard permissions like Blorgbeard said).
If you really want to intercept file system calls, a file system filter driver may be what you're after but it looks very difficult.
Intro to the concept: http://msdn.microsoft.com/en-us/library/windows/hardware/dn641617
Another SO answer restating the same thing (which I based the above link on): How to intercept the access to a file in a .NET Program
Alternatively, you can tailor your security app to enumerate the security settings on files/folders as a preventative measure to ensure the proper file system security settings are in place, or take corrective action if they are not. You can use the FileSecurity class to get an idea of what is available to you (quite a bit!): http://msdn.microsoft.com/en-us/library/system.security.accesscontrol.filesecurity(v=vs.110).aspx
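For example, a minimal sketch (hypothetical path) that enumerates the access rules on a file, which is the raw material for any "is this compliant?" check:

    using System;
    using System.IO;
    using System.Security.AccessControl;
    using System.Security.Principal;

    class AclAuditExample
    {
        static void Main()
        {
            // Hypothetical file whose permissions we want to inspect.
            FileSecurity security = File.GetAccessControl(@"C:\Data\secret.txt");

            // true, true: include both explicit and inherited rules.
            foreach (FileSystemAccessRule rule in
                     security.GetAccessRules(true, true, typeof(NTAccount)))
            {
                Console.WriteLine("{0,-5} {1}: {2}",
                    rule.AccessControlType,  // Allow or Deny
                    rule.IdentityReference,  // e.g. BUILTIN\Users
                    rule.FileSystemRights);  // e.g. FullControl, Modify
            }
        }
    }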
If you want to go the audit route and get stuck on using the FileSecurity and/or DirectorySecurity class, I would post a separate question and we can tackle that.
EDIT:
Some actual code on writing the filter driver, if you are so inclined (beware, no .net):
http://www.codeproject.com/Articles/43586/File-System-Filter-Driver-Tutorial
Does anybody know the solution to this? I create an exe file for my software. After the first installation I have to disable the exe so it cannot be run again, because when someone purchases the software from me they should only be able to install it once.
To do this you'll need to store something somewhere, that something could be:
A file
A registry entry (see the sketch after this list)
A call to a web service you own that stores a unique identifier for the machine, and is checked on subsequent installation attempts (Note: If you choose this method you must be clear and up-front with your users that it's what you're doing).
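A minimal sketch of the registry option; the key path and value name are hypothetical, writing under HKLM needs the elevation an installer has anyway, and (as noted below) a determined user can simply delete the key:

    using System;
    using Microsoft.Win32;

    class InstallGuardExample
    {
        const string KeyPath = @"SOFTWARE\MyCompany\MyApp"; // hypothetical key

        static bool AlreadyInstalled()
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(KeyPath))
                return key != null && key.GetValue("Installed") != null;
        }

        static void MarkInstalled()
        {
            using (RegistryKey key = Registry.LocalMachine.CreateSubKey(KeyPath))
                key.SetValue("Installed", 1);
        }

        static void Main()
        {
            if (AlreadyInstalled())
            {
                Console.WriteLine("This copy has already been installed once.");
                return;
            }
            // ... run the actual installation here ...
            MarkInstalled();
        }
    }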
Bear in mind that a determined user will be able to circumvent file and registry methods and also quite possibly the web service method. The former two by using something such as Process Monitor to identify the files/registry entries you're writing to and clear them. For the latter, by using something like Fiddler to identify the web service calls you're making and replacing the responses with ones that allow them to bypass your protection.
Remember, ultimately the user can disassemble your code and remove the protection mechanisms you've put in place, so don't rely on them being 100% unbreakable.
Forget it, mate. It's software - you absolutely cannot enforce something like that because the user has complete control over the environment where the binary runs, including reverse engineering, virtualization, backups etc. etc. And the ones who you want to foil are precisely the ones who will go to any length to thwart any protection measure you could invent.
No, the only thing that works is to force an online connection and register, on your system, the fact that a particular binary was installed once, then forbid it the next time. That requires you to make each installer different and have a cryptographically strong key generator, and it's still susceptible to replay attacks - but it's the only thing that is not useless by definition.
(Well, either that, or make your software so insanely great that people will fall in love with you and want to give you the money. That solution is probably even harder.)
You could store the installation path in the registry or some secret location and have your .exe check whether it was started from a location different from the one stored; if so, simply exit, as you probably don't want to tell the user what you are doing.
I have several projects that require me to monitor files, and then edit them as they are getting written to disk. I have a feeling that what I am looking for is operationally the same as how anti-virus tools operate. Let me give more details:
1) I need to trap all files saved by Office application, and then add specific company tags to the headers/footers of each document as they are getting written to disk.
2) I need to know immediately when an editable file (of pretty much any type) is written to disk, so that I can undertake some scanning operations to check whether the file's content meets certain company policies.
In short, you can see that I need to process any user files as they are being written to disk.
Here is my problem. I want to use C# for this task, but I am not sure whether it can meet my requirements. Everything I have seen on the net is geared towards lower-level C programming, which I specifically want to avoid due to time constraints on this project. Is anyone aware of how to do this task easily in C#? Is it even feasible (i.e., is the language too high-level, too slow, etc.)?
Performance won't be the issue. I guess I'd question the entire process: it sounds like a recipe for disaster. You can easily hack something together in C# using a FileSystemWatcher in a matter of minutes, but it will be fraught with issues. AV software is bad enough about locking files and screwing up various software, and it's not even trying to modify the file. How do you know when the other app is "done" writing the file? What do you do when you've got the file locked and something else breaks because it can't get access?
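There is no clean "the writer is finished" signal; one common workaround, sketched here, is to react to the watcher's Changed event and then poll until the file can be opened exclusively before touching it:

    using System;
    using System.IO;
    using System.Threading;

    class WaitForWriterExample
    {
        // Poll until an exclusive open succeeds, i.e. no one else has the file open.
        static bool WaitUntilUnlocked(string path, int attempts = 50)
        {
            for (int i = 0; i < attempts; i++)
            {
                try
                {
                    using (new FileStream(path, FileMode.Open,
                                          FileAccess.ReadWrite, FileShare.None))
                        return true; // exclusive open succeeded: the writer is done
                }
                catch (IOException)
                {
                    Thread.Sleep(200); // still locked by the writing application
                }
            }
            return false; // give up; the file never became free
        }

        static void Main()
        {
            if (WaitUntilUnlocked(@"C:\Docs\report.docx")) // hypothetical file
                Console.WriteLine("Safe to post-process the file now.");
        }
    }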
Have you looked at the FileSystemWatcher?
C# can easily do this. Look at the FileSystemWatcher class (http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx).
I was reading the following article:
http://odetocode.com/articles/294.aspx
This article raised me a lot of question regarding logs.
(I don't know if I should have split this into separate questions... but I don't want to spam stackoverflow.com with my questions.)
The first one is whether I should store logs in a .txt or .xml file... or even in a table inside the database.
Saving to a .txt file will probably be better for performance. But when someone needs to find something in the .txt file, it may become a pain in the... neck.
So… which one should I use, and why?
The second one: is there any specific class to deal with logging?
I have read several threads about this subject, and I didn’t find the answers to my questions.
Thanks in advance.
The easiest approach I've taken in the past is using log4net. That way you can configure the logging in the config file. If you need it to go to a database, set it up as such. If you want to be notified when a major error occurs, set it up that way.
As far as sorting through the logs, it really depends on the approach you want to take, and how much you plan on logging. Normally I log to a flat text file as I don't enable a lot of logging in my applications. So parsing through them isn't a big deal.
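A minimal sketch of log4net usage; BasicConfigurator is the quick in-code setup, while real apps normally use XmlConfigurator so the appenders (file, database, SMTP notification) live in the config file and can change without recompiling:

    using log4net;
    using log4net.Config;

    class LoggingExample
    {
        // One logger per type is the usual log4net convention.
        static readonly ILog Log = LogManager.GetLogger(typeof(LoggingExample));

        static void Main()
        {
            BasicConfigurator.Configure(); // console output; XmlConfigurator.Configure() reads App.config

            Log.Info("Application starting");
            Log.Warn("Disk space is getting low");
            Log.Error("Something major happened"); // an SMTP appender could mail this level
        }
    }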
Unless you want to write a system for education purposes, I honestly think that you'd be best off sticking with log4net or nlog.
And further, you would probably be better off studying the code to those systems instead of writing your own.
As to your question, I would stick to a text file and buffer the messages before spitting them to disk.
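A minimal sketch of that buffer-then-flush idea (the file path and batch size are arbitrary choices):

    using System;
    using System.Collections.Generic;
    using System.IO;

    class BufferedLogger
    {
        readonly List<string> buffer = new List<string>();
        readonly string path;
        readonly int flushAt;

        public BufferedLogger(string path, int flushAt = 100)
        {
            this.path = path;
            this.flushAt = flushAt;
        }

        public void Log(string message)
        {
            lock (buffer)
            {
                buffer.Add(DateTime.Now.ToString("o") + " " + message);
                if (buffer.Count >= flushAt)
                    FlushLocked(); // hit the disk only once per batch
            }
        }

        public void Flush()
        {
            lock (buffer) FlushLocked();
        }

        void FlushLocked()
        {
            File.AppendAllLines(path, buffer); // one line per entry, plain text
            buffer.Clear();
        }

        static void Main()
        {
            var log = new BufferedLogger("app.log", flushAt: 10);
            for (int i = 0; i < 25; i++)
                log.Log("event " + i);
            log.Flush(); // don't forget the tail of the last batch
        }
    }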
Why bother reinventing the wheel? You can check out the MS Enterprise Library Logging Application Block.
Definitely not XML.
With XML, you would need to read the whole file, parse it, add whatever you're logging, then generate the whole XML again and write it back to disk. Every single time you log something.
Unless, of course, you append the nodes to the XML file manually, in which case you lose most of XML's advantages.
Warnings through fatal errors, whatever will help you debug the application if it crashes: those logs I would store in a .txt file.
Append a new line for every entry.
This way you can also ask your user to check it out (if you're assisting him over the phone).
If it's not a meta log such as the one mentioned above, in other words if it's anything related to the program itself that you may need to analyze, keep it in the DB.
Regarding file vs database, it's up to you to choose.
File logs give better performance, but they're painful to access.
If the logs are there just to rarely provide information (e.g. the app crashes and you need to know why), you're better off storing the logs in a file.
If you want to give access to those logs, analyze them, etc, you should store them in a database.
.NET is really not my zone, but there are lots of reasons why you should use the framework's logging classes.
For my apps I have chosen to write to a DB. It's easier (for me) to read the logs this way. However, I do not go log-crazy as some people do; I only log what I need to log and nothing else.
I gave log4net a shot not too long ago and did not like it at all. It was a whole lot of junk just to write to a DB and send an email. I ended up writing a custom logging class; it was only ~200 lines and took just a few hours. It works great, I don't have another dependency, and it can be easily changed.
If you're dealing with ASP.NET, ELMAH is another good logging tool. It's apparently what Microsoft's Scott Hanselman uses.
It does need some additional code to get it to work with ASP.NET MVC's HandleError attribute, though.
NLog and log4net both provide a rich logging API, but neither addresses the challenges you face managing and analyzing all the data in your log files.
If you're willing to consider a commercial tool, take a look at GIBRALTAR - it works with NLog and log4net and also collects useful performance metrics. Most importantly, GIBRALTAR provides great tools for managing and analyzing logs.