Serilog SelfLog to File Error - c#

I need to enable Serilog's SelfLog and write it to a text file. My configuration is as follows:
__serilogLogger = new LoggerConfiguration()
    .Enrich.WithProperty("ApplicationIPv4", _ipv4)
    .Enrich.WithProperty("ApplicationIPv6", _ipv6)
    .WriteTo.MSSqlServer(connectionString, tableName /*, columnOptions: columnOptions*/)
    .WriteTo.Seq(ConfigurationManager.AppSettings["SerilogServer"])
    .CreateLogger();

var file = File.CreateText("Self.log");
Serilog.Debugging.SelfLog.Enable(TextWriter.Synchronized(file));
But it shows a file access error when I run the application. Please find the error details below:
Additional information: The process cannot access the file 'C:\Program
Files (x86)\IIS Express\Self.log' because it is being used by another
process.
Can anyone help me with this?

Try
Serilog.Debugging.SelfLog.Enable(msg => File.AppendAllText(serilogSelfLogFilePath, msg));
for a non-locking way to write to a file. serilogSelfLogFilePath is a valid path string to the file you want to use.
Don't forget Log.CloseAndFlush(); when you're closing your other logs, per https://github.com/serilog/serilog/issues/864, or nothing will get written anyway.
Note that if you are using an async sink, you will eventually get an exception when different threads contend to write to that file once it gets busy.
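Putting the pieces together, here is a minimal sketch (the log path is made up, and the Console sink merely stands in for whatever sinks you actually use):
using System;
using System.IO;
using Serilog;
using Serilog.Debugging;

class Program
{
    static void Main()
    {
        // Hypothetical absolute path; avoid relative paths, which resolve
        // against the IIS Express working directory as seen in the question.
        var serilogSelfLogFilePath = @"C:\Logs\serilog-selflog.txt";

        // Append per message instead of holding an exclusive file handle open.
        SelfLog.Enable(msg => File.AppendAllText(serilogSelfLogFilePath, msg));

        Log.Logger = new LoggerConfiguration()
            .WriteTo.Console()   // stand-in for the MSSqlServer/Seq sinks above
            .CreateLogger();

        Log.Information("Hello");

        // Flush sinks on shutdown, otherwise buffered events (and the SelfLog
        // output about them) may never be written.
        Log.CloseAndFlush();
    }
}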

You must have another instance of this application running when it reaches this line, or maybe this code is somehow being invoked twice. Check Task Manager and kill anything that may be using the file. If this is a web app, try recycling the app pool.

I was facing a problem manifesting this same "The process cannot access the file 'X' because it is being used by another process." error message. In my case it appeared in the server's event log every time the application pool was recycled in IIS 8.5, even though Maximum worker processes was set to 1. It may be worth mentioning that the code enabling SelfLog executes in a static constructor.
I had no luck after properly closing the TextWriter, not even when adding a generous Thread.Sleep before File.CreateText in the hope of waiting for that "other process" to finish.
The solution was to set Disable Overlapped Recycle to true in Advanced Settings for the application pool.

Related

Create a Windows service without the use of a timer [duplicate]

I made a Windows service and set it to run automatically under the LocalSystem account. When the service starts, it shows this message and then stops:
The [service name] service on Local Computer started and then stopped. Some services stop automatically if they are not in use by other services or programs.
What's the problem, and what's the solution?
Either you are not starting any threads in the OnStart method to do the work, or an exception is raised within your OnStart method.
If an exception is thrown, it will appear in the Windows Event log. The Windows Event log is a good place to start in any case.
Generally an OnStart method looks like this:
Thread _thread;

protected override void OnStart(string[] args)
{
    // Comment in to debug:
    // Debugger.Break();

    // Do initial setup and initialization
    Setup();

    // Kick off a thread to do work
    _thread = new Thread(new MyClass().MyMethod);
    _thread.Start();

    // Exit this method to indicate the service has started
}
This particular error message means what it says - that your service has started but then quite soon it exited for some reason. The good news is that your service is actually doing something, so you have the executable configured and running as a service properly.
Once started, for some reason it is quitting. You need to find out why. Add some debugging to tell you it's up and running and to record known exit cases. If that doesn't reveal the problem, add some debugging to let you know it's still running and work backwards from the point where that stops.
Are you tracing out any debug information? Most likely an exception is being thrown during your initialization. I would trace out all your exceptions and use DebugView to view them.
I had a similar problem that occurred because my Event Logs were full and the service was unable to write to them. As such, it was impossible to debug by looking for messages in the Event Viewer. I put a try/catch around the code and dumped the exception out to a file. I had to change the settings on my logs to overwrite events as needed, instead of only overwriting events older than 7 days, and this allowed the services to start.
Of course, the root of the problem for me is that I have an nVidia driver issue that is flooding my event logs, and now I'm probably beating on the disk, but that's another issue.
Maybe you need to run the service as Local System Account. See this post by Srinivas Ganaparthi.
I had the same issue starting JBoss; then I changed the JAVA_HOME variable and it worked for me. The JBoss version didn't support Java 1.6, only 1.5.
I had a similar problem, and it turned out that in my case the program simply crashed in the OnStart method. It tried to read a file that it couldn't find, but I suppose any other crash would give the same result. In the case of a Windows Forms application you would get an error message, but here it was just "your service started and stopped".
If you ever need, like me, to read some files from the directory where the Windows Service .exe is located, check this topic:
Getting full path for Windows Service
In my case, a method in my service was being called recursively (no termination condition was ever true), and after a specific time my service was stopped.

Add a log to my application that can be written from different EXE files at the same time

I have a command-line application that receives a file from the user (DOC, PDF). The file is located on the same machine, and my application copies it to a specific folder, returning 0 if the operation passed and 1 otherwise.
This command-line EXE can be opened several times concurrently, and that is not a problem.
Now I want to add a log, located in the application folder, which records each file name and whether the operation passed or failed.
I wonder how to achieve that when I have several open processes, and how to avoid the situation where two EXE files try to write to my log at the same time.
Can I use lock in such a case, even though I am running several EXE files at the same time?
You can create a named system mutex to control access to the log file:
// Set this variable to false if you do not want to request
// initial ownership of the named mutex.
bool requestInitialOwnership = true;
bool mutexWasCreated;
// Request initial ownership of the named mutex by passing
// true for the first parameter. Only one system object named
// "MyMutex" can exist; the local Mutex object represents
// this system object. If "MyMutex" is created by this call,
// then mutexWasCreated contains true; otherwise, it contains
// false.
Mutex m = new Mutex(requestInitialOwnership, "MyMutex", out mutexWasCreated);
To ensure the named mutex exists you can use Mutex.TryOpenExisting("MyMutex", out resultMutex), and if it exists you can wait, log, and release it:
resultMutex.WaitOne();
Log("success");
resultMutex.ReleaseMutex();
More info is available on MSDN: http://msdn.microsoft.com/en-us/library/System.Threading.Mutex(v=vs.110).aspx
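For example, a small cross-process logging helper built on that idea might look like this sketch ("MyMutex" is the name used above; the log path is whatever you choose):
using System;
using System.IO;
using System.Threading;

static class SharedLog
{
    public static void Write(string logPath, string message)
    {
        // One system-wide named mutex shared by every process that writes this log.
        using (var mutex = new Mutex(false, "MyMutex"))
        {
            mutex.WaitOne();   // block until no other process is writing
            try
            {
                File.AppendAllText(logPath, string.Format(
                    "{0:u} {1}{2}", DateTime.UtcNow, message, Environment.NewLine));
            }
            finally
            {
                mutex.ReleaseMutex();
            }
        }
    }
}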
Do not make a log file. Use ETW and log to the Windows mechanisms. It's not as if the event log is new (it has been around for a long time), and ETW is now fully supported via NuGet packages.
ETW is also kernel based.
If you don't care too much about the log's cleanliness, you can open the log file in a way that allows multiple processes to write to it:
private static Stream CreateFile(string path)
{
    // FileShare.ReadWrite is the key: it lets other processes open the
    // same file for writing instead of failing with a sharing violation.
    return new FileStream(path, FileMode.Append, FileAccess.Write,
        FileShare.ReadWrite, 4096, FileOptions.SequentialScan);
}
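Usage might then look something like this (a sketch; the path is made up and the helper above is assumed):
using (var writer = new StreamWriter(CreateFile(@"C:\Logs\shared.log")))
{
    writer.WriteLine("{0:u} Succeeded: file \"Some.doc\"", DateTime.UtcNow);
}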
The drawback is that if two processes write to the file concurrently then you'll get a mess that will look like this:
2014-11-26 11:32:93 Suc2014-11-26 11:32:93 Failed: file "Some.doc"
seeded: file "Other.doc"
I.e. the processes race each other and you end up with intermingled log entries. If you don't have too many processes writing to the same log file, and if each process writes infrequently, you should have very few collisions like this.
There are several ways around it. One is to open the file exclusively in each process and wait until the file is available; the crudest way to do this is to try/catch a File.AppendAllText() in a while loop until you succeed. There are other options listed in questions like this one.
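That crude retry loop might look roughly like this sketch (the attempt count and delay are arbitrary):
using System;
using System.IO;
using System.Threading;

static class RetryLog
{
    public static void Append(string path, string line)
    {
        const int maxAttempts = 50;
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                File.AppendAllText(path, line + Environment.NewLine);
                return;
            }
            catch (IOException)
            {
                // Another process holds the file; back off briefly and try again.
                Thread.Sleep(100);
            }
        }
        throw new IOException("Could not write to '" + path + "' after " + maxAttempts + " attempts.");
    }
}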
Another alternative is to write the log to multiple files, or to something other than a file, e.g. a DB.
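The multiple-files variant can be as simple as putting the process ID in the file name so that no two processes ever touch the same file (a sketch; the "copytool" name is made up):
using System;
using System.Diagnostics;
using System.IO;

static class PerProcessLog
{
    // Each process gets its own file, e.g. "copytool.4712.log", so writers never collide.
    private static readonly string LogPath = Path.Combine(
        AppDomain.CurrentDomain.BaseDirectory,
        string.Format("copytool.{0}.log", Process.GetCurrentProcess().Id));

    public static void Write(string message)
    {
        File.AppendAllText(LogPath,
            DateTime.UtcNow.ToString("u") + " " + message + Environment.NewLine);
    }
}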
One of the cleanest, SOA-based approaches would be to use a separate logging service which your process (or processes) call to log information. log4net provides a mechanism for both client and server to post and consume messages respectively. See https://log4netremotelogging.codeplex.com/ for further details.

Failing to programmatically overwrite a file in an IIS Virtual Directory/Application (file is always locked)

At first I thought I was facing a very simple task. But now I realize it doesn't work as I imagined, so I hope you can help me out, because I'm pretty much stuck at the moment.
My scenario is this (on a Windows 2008 R2 Server):
A file gets uploaded 3 times per day to an FTP directory. The filename is always the same, which means the existing file gets overwritten every time.
I have programmed a simple C# service which watches the FTP upload directory; I'm using the FileSystemWatcher class for this.
The upload of the file takes a few minutes, so once the FileSystemWatcher registers a change, I periodically try to open the file to see whether it is still being uploaded (or locked).
Once the file isn't locked anymore, I try to move it over to my IIS virtual directory. I have to delete the old file first and then move the new file over. This is where my problem starts: the file always seems to be locked by IIS (the w3wp.exe process).
After some research, I found out that I have to kill the process which is locking the file (w3wp.exe in this case). In order to do this, I created a new application pool and converted the virtual directory into an application. Now my directory runs under a separate w3wp.exe process, which I can supposedly kill safely before moving the new file over.
Now I just need to find the proper w3wp.exe process (there are 3 w3wp.exe processes running in total, each under a separate application pool) which holds the lock on my target file. But this seems to be an almost impossible task in C#. I found many questions here on SO about finding the process which has locked a specific file, but none of the answers helped me.
Process Explorer, for example, tells me exactly which process is locking my file.
The next thing I don't understand is that I can delete the target file through Windows Explorer without any problem; only my C# application gets the "file is being used by another process" error. I wonder what the difference is here...
Here are the most notable questions on SO regarding locked files and C#:
Win32: How to get the process/thread that owns a mutex?
^^
The example code here does actually work, but this outputs the open handle IDs for every active process. I just can't figure out how to search for a specific filename, or at least resolve the handle ID to a filename. This WinAPI stuff is way above my head.
Using C#, how does one figure out what process locked a file?
^^
The example code here is exactly what I need, but unfortunately I can't get it to work. It is always throwing an "AccessViolationException" which I can't figure out, since the sample code is making extensive use of WinAPI calls.
Simple task, impossible to do? I appreciate any help.
EDIT
Here are some relevant parts of my server code:
Helper function to detect if a file is locked:
private bool FileReadable(string file, int timeOutSeconds)
{
    DateTime timeOut = DateTime.Now.AddSeconds(timeOutSeconds);
    while (DateTime.Now < timeOut)
    {
        try
        {
            if (File.Exists(file))
            {
                // Opening with FileShare.None succeeds only if nobody else has the file open.
                using (FileStream fs = File.Open(file, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return true;
                }
            }
            return false;
        }
        catch (Exception)
        {
            Thread.Sleep(500);
        }
    }
    m_log.LogLogic(0, "FileReadable", "Timeout after [{0}] seconds trying to open the file {1}", timeOutSeconds, file);
    return false;
}
And this is the code in my FileSystemWatcher event, which is monitoring the FTP upload directory. filepath is the newly uploaded file, targetfilepath is the target file in my IIS directory.
// here I'm waiting for the newly uploaded file to be ready
if (FileReadable(filepath, FWConfig.TimeOut))
{
    // move uploaded file to IIS virtual directory
    string targetfilepath = Path.Combine(FWConfig.TargetPath, FWConfig.TargetFileName);
    if (File.Exists(targetfilepath))
    {
        m_log.LogLogic(4, "ProcessFile", "Trying to delete old file first: [{0}]", targetfilepath);
        // targetfilepath is the full path to my file in my IIS directory
        // always fails because the file is always locked by w3wp.exe :-(
        if (FileReadable(targetfilepath, FWConfig.TimeOut))
            File.Delete(targetfilepath);
    }
    File.Move(filepath, targetfilepath);
}
EDIT2:
Killing the w3wp.exe process while clients are downloading the file would be no problem for us. I'm just having a hard time finding the right w3wp.exe process which is locking the file.
Also, my client application, which downloads the file on the clients, checks the HTTP HEAD for the Last-Modified date. The client checks the date every 10 minutes, so it is possible that the file is being locked by IIS because clients are continuously checking the HTTP HEAD for the file. Nonetheless, I don't understand why I can manually delete/rename/move the file through Windows Explorer without any problems. Why does that work, while my application gets a "locked by another process" exception?
One problem I've run into is that a file exists while it is still being written, which means it would be locked as well. If your FileReadable() function were called at this time, it would return false.
My solution was to, in the proc which writes the file, write the file to, say, OUTPUT1.TXT, and then after it is fully written and the FileStream closed, rename it to OUTPUT2.TXT. This way, the existence of OUTPUT2.TXT indicates that the file is written and (hopefully) unlocked. Simply check for OUTPUT2.TXT in your FileReadable() loop.
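A small sketch of that write-then-rename pattern, using the placeholder names from above:
// Write to a temporary name first...
using (var writer = new StreamWriter("OUTPUT1.TXT"))
{
    writer.WriteLine("...file contents...");
}   // the stream is closed (and the write lock released) here

// ...then publish under the final name. Readers only ever look for
// OUTPUT2.TXT, which never exists in a half-written state.
if (File.Exists("OUTPUT2.TXT"))
    File.Delete("OUTPUT2.TXT");
File.Move("OUTPUT1.TXT", "OUTPUT2.TXT");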
Everybody says...
"Do it a better way."
Nobody says how!
Here's how. Because you mentioned 'My Client Application,' there is a key opportunity here that you would not have if you didn't have control over the apps reading the file.
Just use new filenames each time.
You have control of the programs reading and writing the files. Put an incrementing number in the filenames and have the client pick the biggest number (actually the latest date, so your numbers can wrap around). Have the writer program clean up old files if it can; if not, they won't hurt anything. IIS will eventually let go of them. If not, just open up Explorer every week and do it yourself!
Other things that make this work are the low frequency of updates (files won't build up too badly) and the fact that the FTP server and web server are on the same drive (otherwise the move is not atomic and clients could get a half-copied file; if the FTP drive is different, the solution would be to copy to a temporary location on the web server's drive and then move).
But what if you can't change the client, or it has to read just one name?
Front-end it with a script. Have the client hit an ASPX page that sets the right HTTP headers, contains the "pick the right file" logic, and writes out the file contents. This is a very popular trick that pages use to serve images stored in a database to the browser while the img tag appears to read from a file (Google along those lines for sample code).
This sounds like a hack; it's not. Modern lock-free memory cache systems do a similar thing. It is impossible for a lock or corruption to occur; until the "write" is complete, readers see the old version.
Plus, it's simple. Everybody from a script kiddie to a punch-card veteran will know exactly what you're up to. Go low-tech!
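A rough sketch of such a front-end, written here as an ASHX-style handler rather than a full ASPX page (the folder, file pattern, and content type are invented for illustration):
using System.IO;
using System.Linq;
using System.Web;

public class LatestFileHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Pick the newest upload; the writer keeps creating new names,
        // so whatever we find here is complete and unlocked.
        var folder = context.Server.MapPath("~/App_Data/uploads");
        var latest = Directory.GetFiles(folder, "report_*.pdf")
                              .OrderByDescending(File.GetLastWriteTimeUtc)
                              .FirstOrDefault();
        if (latest == null)
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "application/pdf";
        context.Response.AddHeader("Last-Modified",
            File.GetLastWriteTimeUtc(latest).ToString("R"));
        context.Response.TransmitFile(latest);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}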
You're troubleshooting a symptom of the problem, not fixing the root cause. If you want to go down that path, here is code to kill processes: http://www.codeproject.com/Articles/20284/My-TaskManager - but the better idea would be to do it properly and work out what's wrong. I suggest, in the catch block of FileReadable:
catch (Exception ex)
{
    if (ex is IOException && IsFileLocked(ex))
    {
        // Confirm the code sees it as a file-locked issue, not some other exception.
        // It's not safe to unlock files used by other processes, because the other
        // process is likely reading/writing the file.
    }
}

// Marshal lives in System.Runtime.InteropServices.
private static bool IsFileLocked(Exception exception)
{
    int errorCode = Marshal.GetHRForException(exception) & ((1 << 16) - 1);
    return errorCode == 32 || errorCode == 33; // ERROR_SHARING_VIOLATION / ERROR_LOCK_VIOLATION
}
Turn off any Anti-Virus software and re-test
Increase the polling timeout duration to see if it's just a timing thing
Check the FTP log file, see the status for the disconnected client, and compare the status code with the ones here.
I don't see in your sample code where you are closing your file stream. Keeping the file stream open will keep a lock on the file. It would be a good idea to close the stream. You probably don't want to be killing your w3wp.exe process, as others here have mentioned.
Restarting IIS can unlock the file taken by w3wp.exe:
cmd (run as administrator) -> iisreset /stop -> update/delete the file in Windows Explorer -> iisreset /start

Windows service can't write to %LOCALAPPDATA%

I have built an app that works only when not run as a Windows service. Well, the service runs, but it doesn't do what it should. The service uses the Local Service account. So to kick off debugging, I thought I'd start with something simple: have it create a directory when it starts:
Directory.CreateDirectory(
Environment.SpecialFolder.LocalApplicationData + "\\MyService");
When I started the service, it stopped almost immediately and Windows reported that fact. When I commented out the above statement, recompiled and re-installed, the service ran without stopping.
Obviously the above line throws an exception of some sort. I have no way of logging the error because I can't write to the file system. Any ideas why Local Service can't create a directory in its own %LOCALAPPDATA%?
You should use GetFolderPath with LocalApplicationData like so:
string folderName = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
    "MyService");
Directory.CreateDirectory(folderName);
I think this might be because there is no special folder. When running as the Local Service account you are running under that user, not the logged-in user, so you are requesting a special folder that probably won't exist, as I don't think Local Service has a profile. (I may be wrong) - I was wrong :p
Just in case anyone pops by:
C:\Windows\ServiceProfiles\LocalService
is the local service profile folder, so it will end up in there.
If you want to debug it, surround that line with a try/catch and then write the error to a file:
try
{
    Directory.CreateDirectory(Environment.SpecialFolder.LocalApplicationData + "\\MyService");
}
catch (Exception ex)
{
    System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\MyServicelog.txt", true);
    file.WriteLine(ex.Message);
    file.Close();
}
At least then you can see what's causing the error.
Martyn
I suggest you write the exception details to the event log. All user accounts have permission to write to the event log as long as the log and source names have already been created by an administrator (which you can do simply by running the app as yourself first).
As to the root cause of the error, it may be because LocalService doesn't normally get a full set of profile folders created by default. I'm not sure whether this is by design, or simply what I have observed on various machines.
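A sketch of that approach, assuming the event source is registered once under an administrator account (the source name here is invented):
using System;
using System.Diagnostics;

static class ServiceEventLog
{
    private const string Source = "MyService";

    // Run once under an admin account (or at install time); afterwards the
    // service can write entries even from a low-privilege account.
    public static void EnsureSource()
    {
        if (!EventLog.SourceExists(Source))
            EventLog.CreateEventSource(Source, "Application");
    }

    public static void Error(Exception ex)
    {
        EventLog.WriteEntry(Source, ex.ToString(), EventLogEntryType.Error);
    }
}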

The process cannot access the file 'C:\inetpub\wwwroot\MyApp\5-23-2011.log' because it is being used by another process

I have a singleton logger which is used inside an ASP.NET application. Sometimes I get The process cannot access the file error on this line:
StreamWriter sw = new StreamWriter("Path to log file", true);
I checked the file handle with Process Explorer, and w3wp.exe owns the handle, so it seems different threads from the same process caused the problem.
I have used a lock around the above code, but still I get the error.
How can I make sure all threads can use the same stream safely?
Don't open the log file more than once; just open it when the application starts, close it at exit (and flush it often.) Opening and closing more than once is just inefficient.
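A minimal sketch of that approach (the path and class name are made up; TextWriter.Synchronized adds thread safety on top of the single shared writer):
using System;
using System.IO;

public static class AppLogger
{
    // Opened exactly once for the lifetime of the application.
    private static readonly TextWriter _writer = TextWriter.Synchronized(
        new StreamWriter(@"C:\inetpub\wwwroot\MyApp\app.log", true)
        {
            AutoFlush = true // flush often so entries survive an app-pool recycle
        });

    public static void Log(string message)
    {
        _writer.WriteLine("{0:u} {1}", DateTime.UtcNow, message);
    }

    public static void Shutdown()
    {
        // Call once at application exit (e.g. Application_End).
        _writer.Flush();
        _writer.Dispose();
    }
}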
