Log4Net: How to handle concurrent writes to the same file? - c#

We have a big solution composed of several applications. One of them runs very regularly (every 10 minutes), and sometimes, if the computer is busy, two executions may run in parallel. (The question is not about whether that's a good idea or not.)
The only issue we currently have is that sometimes the two overlapping processes each hold an ILogger for the same file, and we get an error from log4net indicating that it cannot access the file (file already used by another process, or something like that).
Here is how we have the log configured:
RollingFileAppender appender = new RollingFileAppender
{
    Name = appenderName,
    File = fileName,
    AppendToFile = true,
    MaxSizeRollBackups = 10,
    MaximumFileSize = "10MB",
    RollingStyle = RollingFileAppender.RollingMode.Size,
    StaticLogFileName = false,
    LockingModel = new FileAppender.MinimalLock(),
    ImmediateFlush = true
};
What would be the best way to handle this issue? We cannot have one file per execution.
EDIT
Here is the error I get:
log4net:ERROR [RollingFileAppender] Unable to acquire lock on file XXXX. The process cannot access the file because it is being used by another process.

When you have multiple processes writing to the same file, you can use the FileAppender.InterProcessLock locking model. This locks and unlocks the file using a Mutex shared across processes, instead of trying to get by with a minimal locking time.
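For example, a minimal sketch of the configuration from the question with only the locking model swapped out:

RollingFileAppender appender = new RollingFileAppender
{
    Name = appenderName,
    File = fileName,
    AppendToFile = true,
    MaxSizeRollBackups = 10,
    MaximumFileSize = "10MB",
    RollingStyle = RollingFileAppender.RollingMode.Size,
    StaticLogFileName = false,
    // Mutex-based lock shared across processes: slower per write,
    // but two overlapping executions can safely append to the same file.
    LockingModel = new FileAppender.InterProcessLock(),
    ImmediateFlush = true
};

Expect some per-write overhead, since the Mutex is acquired and released around each write.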

Related

Writing each json message to unique file with NLog

I want to log each json message received from the network into a unique file.
At first I thought to do File.WriteAllText($"{Guid.NewGuid()}.json", jsonMsg);. But I need to be able to archive and delete old log files, so I would have to watch the folder when the application starts up and do all the things that NLog already knows how to do.
With NLog, I can create and add a new Target for each message from code, but I'm not sure how to remove those targets after the message has been written. Otherwise, it will create a memory leak.
So, should I just stick with my first idea and implement the archive-and-delete logic myself, or is there a way to do this with NLog?
I recommend that you use a single FileTarget for doing the writing, but avoid using MaxArchiveFiles for archive cleanup:
var myguid = Guid.NewGuid();
var logEvent = NLog.LogEventInfo.Create(NLog.LogLevel.Info, null, jsonMsg);
logEvent.Properties["msguid"] = myguid;
logger.Log(logEvent);
And then use ${event-properties} in the FileName-option:
<target type="file" name="fileNetworkMessages"
fileName="messages/Archive/Output-${event-properties:myguid}.json"
layout="${message}" keepFileOpen="false" />
When including a Guid in the filename, you should avoid using MaxArchiveFiles: it introduces a performance hit for every new file created, and the cleanup will not work anyway (the hex letters confuse the file wildcard).
NLog FileTarget MaxArchiveFiles has an overhead when rolling to a new file, because it scans all existing files to see whether cleanup is necessary. That is fine when you only roll once every hour or day, but with a Guid in the filename the FileTarget triggers a cleanup check for every single new file, which introduces a performance overhead.
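If old files still need to be deleted, one option is to run the cleanup yourself on a timer instead of using MaxArchiveFiles. A rough sketch, assuming the messages/Archive folder from the config above and an arbitrary 7-day retention:

using System;
using System.IO;

static void CleanupOldMessages(string folder, int keepDays)
{
    DateTime cutoff = DateTime.Now.AddDays(-keepDays);
    foreach (string file in Directory.EnumerateFiles(folder, "*.json"))
    {
        if (File.GetLastWriteTime(file) < cutoff)
        {
            try { File.Delete(file); }
            catch (IOException) { /* file still in use; retry next run */ }
        }
    }
}

// e.g. invoked periodically from a System.Threading.Timer:
// CleanupOldMessages(@"messages/Archive", 7);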

jpegoptim on ASP.Net - "error opening temporary file"

I suspect I'm failing to understand where jpegoptim tries to write its temp files.
I have IIS 7.5 running an ASP.Net 4 AppDomain. In it I have a process that optimizes JPEGs with jpegoptim like so:
FileHelper.Copy(existingPath, optimizerPath);
var jpegOptimResult = await ImageHelper.JpegOptim(optimizerPath, 30);
Running locally I get an optimized image. Running on the above server I get:
D:\www\hplusf.com\b\pc\test.jpg 4096x2990 24bit N Adobe [OK] jpegoptim: error opening temporary file.
I can show the code for FileHelper.Copy(), but it's basically just File.Copy() that overwrites if the file already exists.
Here's ImageHelper.JpegOptim:
public static async Task<string> JpegOptim(string path, int quality)
{
    string jpegOptimPath = Path.GetDirectoryName(new Uri(Assembly
        .GetExecutingAssembly().CodeBase).LocalPath)
        + @"\Lib\jpegoptim.exe";
    var jpegOptimResult = await ProcessRunner.O.RunProcess(
        jpegOptimPath,
        "-m" + quality + " -o -p --strip-all --all-normal \"" + path + "\"",
        false, true
    );
    return jpegOptimResult;
}
jpegOptimResult is the captured output you're seeing above as the error message. And here's ProcessRunner.RunProcess:
public async Task<string> RunProcess(string command, string args,
    bool window, bool captureOutput)
{
    var processInfo = new ProcessStartInfo(command, args);
    if (!window)
        makeWindowless(processInfo);
    string output = null;
    if (captureOutput)
        output = await runAndCapture(processInfo);
    else
        runDontCapture(processInfo);
    return output;
}

protected void makeWindowless(ProcessStartInfo processInfo)
{
    processInfo.CreateNoWindow = true;
    processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}

protected async Task<string> runAndCapture(ProcessStartInfo processInfo)
{
    processInfo.UseShellExecute = false;
    processInfo.RedirectStandardOutput = true;
    processInfo.RedirectStandardError = true;
    var process = Process.Start(processInfo);
    var output = process.StandardOutput;
    var error = process.StandardError;
    while (!process.HasExited)
    {
        await Task.Delay(100);
    }
    string s = output.ReadToEnd();
    s += '\n' + error.ReadToEnd();
    return s;
}
So:
jpegOptim runs properly on my local machine and optimizes the file, so it's not how I'm calling jpegOptim.
The Copy operation succeeds without an Exception, so it's not a permissions issue with the ASP.Net user reading/writing in that directory.
jpegOptim just optimizes and overwrites the file, so if it is in fact running under the same ASP.Net user, it should have no problem writing this file, but...
It's unclear where jpegOptim attempts to write its temp file, so perhaps the underlying issue is where that temporary file is being written.
However, judging by the Windows source:
http://sourceforge.net/p/jpegoptim/code/HEAD/tree/jpegoptim-1.3.0/trunk/jpegoptim.c
jpegOptim's "temporary file" appears to just be the destination file when used with the above options. Relevant lines of jpegOptim source:
int dest = 0;
int main(int argc, char **argv)
{
...
There's some code here looking for the -d argument that sets dest=1 - meaning here dest remains 0. It then hits an if branch, and the else clause, for dest == 0, does this:
if (!splitdir(argv[i], tmpdir, sizeof(tmpdir)))
    fatal("splitdir() failed!");
strncpy(newname, argv[i], sizeof(newname));
That copies the directory-name portion of the input image filename into the variable tmpdir - so, for example, C:\Blah\18.jpg would assign tmpdir="C:\Blah\". It then copies the entire input image filename into newname, meaning it's just going to overwrite the file in place.
At this point in the code the variables it's using should be:
dest=0
argv[i]=D:\www\hplusf.com\b\pc\test.jpg
tmpdir=D:\www\hplusf.com\b\pc\
newname=D:\www\hplusf.com\b\pc\test.jpg
It then in fact opens the file, and there's an opportunity to error out there, suggesting jpegoptim is successfully opening the file. It also decompresses the file, further confirming it's opening it successfully.
The specific error message I'm seeing occurs in these lines - I'll confess I don't know whether HAVE_MKSTEMPS is defined for a default build (which is what I'm using):
snprintf(tmpfilename, sizeof(tmpfilename),
         "%sjpegoptim-%d-%d.XXXXXX.tmp", tmpdir, (int)getuid(), (int)getpid());
#ifdef HAVE_MKSTEMPS
if ((tmpfd = mkstemps(tmpfilename, 4)) < 0)
    fatal("error creating temp file: mkstemps() failed");
if ((outfile = fdopen(tmpfd, "wb")) == NULL)
#else
tmpfd = 0;
if ((outfile = fopen(tmpfilename, "wb")) == NULL)
#endif
    fatal("error opening temporary file");
So snprintf is like C# String.Format(), which should produce a path like:
D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp
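For reference, a rough C# rendering of that snprintf call; tmpdir, uid, and pid here are stand-ins for the C variables:

string tmpfilename = String.Format("{0}jpegoptim-{1}-{2}.XXXXXX.tmp", tmpdir, uid, pid);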
Judging by what I can find, it's likely HAVE_MKSTEMPS is not defined, so fopen is being called with "wb" (write binary). It's returning NULL, meaning it failed to open the file, and out comes the error message.
So - possible causes:
Bad path in tmpdir: It's possible I'm following the C poorly (likely), but from the looks of it, tmpdir should be identical to the source path of the image. Perhaps it's being mangled by jpegoptim? The input path is clearly clean, because jpegoptim actually emits it cleanly in the error message.
Permissions issue: Seems fairly unlikely. The ASP.Net user this runs under can clearly read and write in that directory, because it copies the file there before jpegoptim fires, and the only user on the machine with any permissions to this dir is that user - so jpegoptim should have failed before this point if it were permissions. It could be attempting to access a different dir, but that would really be the bad-tmpdir scenario.
Something else I've not thought of.
Ideas?
Note: This question is similar:
Using jpegtran, jpegoptim, or other jpeg optimization/compression in C#
However, that question is about a shared environment on GoDaddy, so the answers spiral around the likelihood that he can't spin up processes at all. We have full control over our server, and as should be clear from the above, the jpegoptim process is definitely starting successfully, so it's a different scenario.
As it turns out, my reading of jpegoptim was incorrect. The tmpdir it uses is wherever the executable's working directory points - not where the input images are, and not where the executable sits. So the solution was two-fold:
Give the exe permission to write to its own directory* (but deny it access to modify itself).
Modify ProcessRunner to run processes in place - that is, set the working directory to where the exe resides.
The second modification looks like this:
var processInfo = new ProcessStartInfo(command, args);
// Ensure the exe runs in the path where it sits, rather than somewhere
// less safe like the website root
processInfo.WorkingDirectory = (new FileInfo(command)).DirectoryName;
*Note: I happen to have jpegoptim.exe isolated on the server in its own dir to limit risk. If you had it someplace more global, like Program Files, you definitely should not do this - instead set the working directory as above, but to someplace isolated/safe like a tmp dir, or even better a scratch disk. If you've got the RAM for it, a RAM drive would be fastest.
**Second note: Because of how hard drives and jpegoptim work, if the tmp location is not on the same disk as the ultimate destination of the output, a partial race condition is introduced between jpegoptim and any other code that depends on its outputs. If you use the same disk, the output JPEG is complete the moment jpegoptim finishes: the OS changes the entry in its file table, but the image data has already been written to the drive in full. When tmp and destination are separate disks, jpegoptim finishes by telling the OS to move the file from the tmpdir to the output dir. That data move completes some time after jpegoptim is done running, so if your waiting code is fast enough, it will start its work on an incomplete JPEG.
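If you have to live with a cross-disk move, one defensive option (a sketch, not part of the original fix) is to wait until the output file can be opened exclusively before consuming it:

using System.IO;
using System.Threading;

static bool WaitForFileReady(string path, int attempts = 20, int delayMs = 250)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            // An exclusive open succeeds only when no writer or mover holds the file.
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                return true;
        }
        catch (IOException)
        {
            Thread.Sleep(delayMs); // still being written or moved; retry
        }
    }
    return false;
}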

What process is preventing my file from getting deleted in C#

I have a single-threaded C# application that creates a file, uses it, and then deletes it. Sometimes the app has trouble deleting that file. The error I get is:
"The process cannot access the file --file path and file name-- because it is being used by another process."
How can I find out which process has a hold on this file, and how can I make that process let go so the file can be deleted?
This thing rocks for that very "gotcha".
http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
Process Monitor v3.05
It has a "Filter" submenu so you can fine tune it to the file that is locked.
You need to post the relevant code so we can see.
It is, however, always important to make sure that your app closes the files it has opened.
Usually something like this will ensure that:
using(var f = File.OpenRead("myfile")) {
...
}
or the equivalent:
FileStream f = File.OpenRead("myfile");
try {
    ...
} finally {
    f.Close();
}
Make sure that you are closing the file before deleting it.
If you are using the StreamWriter class, make sure you close it via its variable.
Ex. StreamWriter sw = new StreamWriter("myfile.txt");
// some writing operation
sw.Close();
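A using block (as in the earlier answer) does the same job and also closes the file if an exception is thrown mid-write:

using (StreamWriter sw = new StreamWriter("myfile.txt"))
{
    // some writing operation
} // Dispose() flushes and closes the underlying file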

FileSystemWatcher and Monitoring Config File Changes

I have about 5-6 Server Manager programs that each write their own configuration file out to a particular folder, such as C:\ACME. The config files all end with *ServerConfig.cfg, where * = the name of the program that created it.
I have a Windows service with a FileSystemWatcher set up, and I want to FTP each configuration file every time the program updates it. I've gotten everything to work, but I'm noticing that the different Server Manager programs behave differently.
When saving a configuration file, the FileSystemWatcher picks up two "change" events. This causes my program to FTP the configuration file twice when I only need it once.
In other instances I'm seeing it raise 4, 5, or 6 "change" events when saving a configuration file.
What is the best way to process/FTP these files exactly once, when they are really done saving?
I really don't want to set something up to poll the directory for file changes every so often... and I like the idea that each time a configuration is saved, I get a duplicate copy, with a date/timestamp appended to the filename, copied elsewhere.
I have seen lots of suggestions Googling around and even here on Stackoverflow, but nothing that seems to be all-in-one for me.
I suppose I could put the filename in a queue when a "change" event occurs, if it isn't already in the queue. Not sure if this is the best approach.
Here is my sample code:
Startup-code:
private DateTime _lastTimeFileWatcherEventRaised = DateTime.Now;
_watcherCFGFiles = new FileSystemWatcher();
_watcherCFGFiles.Path = @"C:\ACME";
_watcherCFGFiles.IncludeSubdirectories = true;
_watcherCFGFiles.Filter = "*ServerConfig.cfg";
_watcherCFGFiles.NotifyFilter = NotifyFilters.Size;
//_watcherCFGFiles.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.FileName;
_watcherCFGFiles.Changed += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Created += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Deleted += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Renamed += new RenamedEventHandler(LogFileSystemRenaming);
_watcherCFGFiles.Error += new ErrorEventHandler(LogBufferError);
_watcherCFGFiles.EnableRaisingEvents = true;
Here is the actual handler for the "change" event. I'm skipping the first "change" event if the second arrives within 700 ms. But this doesn't account for the files that raise 3-4 change events...
void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    string log = string.Format("{0} | {1}", e.FullPath, e.ChangeType);
    if (e.ChangeType == WatcherChangeTypes.Changed)
    {
        if (DateTime.Now.Subtract(_lastTimeFileWatcherEventRaised).TotalMilliseconds < 700)
        {
            return;
        }
        _lastTimeFileWatcherEventRaised = DateTime.Now;
        LogEvent(log);
        // Process file
        FTPConfigFileUpdate(e.FullPath);
    }
}
I had the exact same issue. I used a HashMap that mapped filenames to write times, and used it as a lookup table to check whether the change event for a file had arrived very quickly after the previous one. I defined some epsilon (for me it was about 2 seconds, to make sure events were flushed). If the time found in the map was older than that, I would put the file on a queue to be processed. Essentially all I had to do was keep the HashMap up to date with events and changes, and this worked out (although you may want to change your epsilon value depending on your application).
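A minimal C# sketch of that idea (a ConcurrentDictionary plays the role of the HashMap; the 2-second epsilon matches the answer, while the queue type is an illustrative choice):

using System;
using System.Collections.Concurrent;

class ChangeDebouncer
{
    private readonly ConcurrentDictionary<string, DateTime> _lastEvent =
        new ConcurrentDictionary<string, DateTime>();
    private static readonly TimeSpan Epsilon = TimeSpan.FromSeconds(2);
    public readonly ConcurrentQueue<string> ToProcess = new ConcurrentQueue<string>();

    // Call this from the FileSystemWatcher Changed handler.
    public void OnChanged(string fullPath)
    {
        DateTime now = DateTime.Now;
        DateTime last;
        bool seen = _lastEvent.TryGetValue(fullPath, out last);
        _lastEvent[fullPath] = now; // keep the map up to date with every event
        if (!seen || now - last > Epsilon)
            ToProcess.Enqueue(fullPath); // first event after a quiet period
    }
}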
This behavior is normal, because antivirus software or other programs perform extra writes when a file's content changes. I usually create a (global) Hashtable and check whether the filename exists in it; if it doesn't, I put the filename in and start an asynchronous operation that removes it after 3-5 seconds.
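A rough sketch of that approach (the 3-second window is one of the values mentioned above):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class RecentlySeenFiles
{
    private readonly ConcurrentDictionary<string, byte> _seen =
        new ConcurrentDictionary<string, byte>();

    // Returns true only for the first event per file within the window.
    public bool TryBegin(string fullPath)
    {
        if (!_seen.TryAdd(fullPath, 0))
            return false; // duplicate event inside the window: ignore it
        Task.Delay(TimeSpan.FromSeconds(3)).ContinueWith(_ =>
        {
            byte ignored;
            _seen.TryRemove(fullPath, out ignored);
        });
        return true;
    }
}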
This is expected behavior, so you need to figure out how to handle it in your particular case.
The file system has no concept of "the program is done working with this file". I.e. one could write an editor that updates (opens/writes/closes) the file on every keystroke. The file system would report a lot of updates, but from the user's point of view there is only one update, when the editor is closed.

How to check if I can create a file in a specific folder

I need to know whether I can create a file in a specific folder, but there are too many things to check, such as permissions, duplicate files, etc.
I'm looking for something like File.CanCreate(@"C:\myfolder\myfile.aaa"), but haven't found such a method.
The only thing I could think of is to create a dummy file and check for exceptions, but this is an ugly solution that also affects performance.
Do you know a better solution?
In reality, creating a dummy file isn't going to have a huge performance impact in most applications. Of course, if you have advanced permissions where you can create but not delete, it might get a bit hairy...
Guids are always handy for random names (to avoid conflicts) - something like:
string file = Path.Combine(dir, Guid.NewGuid().ToString() + ".tmp");
// perhaps check File.Exists(file), but it would be a long-shot...
bool canCreate;
try
{
    using (File.Create(file)) { }
    File.Delete(file);
    canCreate = true;
}
catch
{
    canCreate = false;
}
You can use CAS to verify that there are no .NET policies (caspol) restricting the creation and writing of a file in that location.
But this will not cover the Windows policies. You'll have to check the NTFS permissions manually, and even then there are processes that can decide you're not allowed to create a file (for instance, a virus scanner).
The best and most complete way is to try it.
