I've just written a test service in C# (Visual Studio 2010). In the OnStart method I'm opening an XML document in the root of the C: drive, parsing it, and writing out to another XML document, also in the root of the C: drive.
When I install the service and start it, I'm told it stopped again automatically. The output XML file is created but is empty. I'm running the service as the Local System account.
Can anyone tell me why no content is being written?
Thanks,
EDIT (to include code for OnStart)...
protected override void OnStart(string[] args)
{
    String win32ClassName = "";
    String nodeSubkeyName = "";
    List<String> propertyList = new List<String>();
    List<String> propertyListQuery = new List<String>();
    XmlTextReader reader = new XmlTextReader("C:\\hwin.xml");
    XmlTextWriter writer = new XmlTextWriter("C:\\hwout.xml", null);
    writer.WriteStartDocument();
    writer.WriteComment("Asset hardware Inventory for " + System.Environment.MachineName);
    writer.WriteStartElement("hardware");
The above code doesn't even write the start element to the output XML file, but it does create the empty file, so I suspect it has rights to do that. Perhaps reading the input file IS the issue? I have little to no idea how the Local System account works!
I'd suspect that it's not flushing the stream it's writing to. Stream output (including file output) is normally buffered, because batching up a bunch of bytes and writing them at once performs better than writing them one by one. (The benefit increases with buffer size up to about 4 KB or 8 KB, at which point the cost of the memory used outweighs the gain; 4 KB and 8 KB also align well with memory page sizes. In any case, the default buffer for most framework-supplied streams is 4 KB.)
Anyway, if that is the problem, then you need to flush the stream. This always happens when you close the writer, and the writer is always closed when you Dispose() it. As a matter of good practice, you should always dispose anything that implements IDisposable as soon as possible anyway (assume something bad at least could happen if you don't, even in those cases where you know a given Dispose() is currently implemented as a no-op). Most of the time, this is most easily done with a using block:
using (XmlTextWriter writer = new XmlTextWriter("C:\\hwout.xml", null))
{
    // code that uses writer here
}
Even if my suspicion is wrong, it's well worth getting into the habit of doing this.
Probably an unhandled exception...
Use try/catch, write the exception with System.Diagnostics.Trace.WriteLine, and use DebugView (http://technet.microsoft.com/en-us/sysinternals/bb896647) to read the output.
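A minimal sketch of that inside the service class (ProcessXmlFiles() is a placeholder for your existing OnStart logic):
protected override void OnStart(string[] args)
{
    try
    {
        ProcessXmlFiles(); // placeholder for your existing OnStart logic
    }
    catch (Exception ex)
    {
        // With the default listener this goes to OutputDebugString,
        // which DebugView captures
        System.Diagnostics.Trace.WriteLine("OnStart failed: " + ex);
        throw; // let the Service Control Manager see the failure
    }
}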
Check the error log in Event Viewer; it's most likely you don't have permission to read the file from C:\.
You can make the service run under an account which has permission to read/write to C:\.
You should use a try/catch around your XML write code block, and then log the error to debug this further.
It is hard to say without a code snippet, but if the service doesn't start, OnStart is almost certainly throwing an exception. Put a Thread.Sleep at the top of the OnStart method, attach to the process with Visual Studio, and debug.
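For example (the delay is arbitrary; it just needs to be long enough to attach):
protected override void OnStart(string[] args)
{
    // Arbitrary delay: long enough to use Debug > Attach to Process
    // in Visual Studio against this service's process
    System.Threading.Thread.Sleep(30000);
    // ... the rest of OnStart then runs under the debugger ...
}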
Have you tried executing your code as a console application rather than a service?
Often, run-time issues with services are due to privileges. For the writing part, that's not the case here: your code did manage to create the output XML file, so your service has write rights to the target directory.
But it may not have read rights on the input file, or there may simply be an exception happening during execution.
My advice to you:
test your code as a console app rather than a service, and test it from within Visual Studio (see the sketch below)
try/catch and log any exceptions
check the Event Viewer to see if Windows has logged any details about the problem with your service
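A rough console harness for the first suggestion (MyService and RunInventory() are illustrative names for however you factor the XML work out of OnStart):
using System;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            // Call whatever method OnStart uses to do the XML work
            new MyService().RunInventory();
            Console.WriteLine("Done.");
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex); // this is the exception the service hides from you
        }
        Console.ReadKey();
    }
}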
Related
I have the following code in a function called from Page_Load. The first time the page is loaded after starting Visual Studio, everything works out fine.
But any subsequent attempt to open the file returns IOException: "File is in use by another process" - even opening the file directly in the Visual Studio solution returns this error (of course not as an exception).
FileStream mailinglist_FileStream = new FileStream(@"\foobarFile.txt", FileMode.Open);
PeekingStreamReader mailinglist_Reader = new PeekingStreamReader(mailinglist_FileStream);
// Do some stuff with the file
mailinglist_FileStream.Close();
mailinglist_Reader.Close();
mailinglist_Reader.Dispose();
mailinglist_FileStream.Dispose();
Why is the file still locked? And why does fully restarting Visual Studio reset the file?
When checking the file's properties, it says:
Build Action: Content
Copy to output directory: Do not copy
I am only reading this file. Can I do something similar to adLockOptimistic, so that multiple processes can access the file?
Why is the file still locked? And why does fully restarting Visual Studio reset the file? When checking file properties it says [...]
I don't know why the file is still locked: probably because your code fails before the stream is closed/disposed.
About "why fully restarting Visual Studio [...]": because you may be using IIS Express or ASP.NET Dev Server whose are closed when you close the IDE, so locks on files are released since the process holding the locks is no longer running.
And about "why is the file still locked?[...]" it could be because the file stream isn't closed because sometimes the thread may not end successfully and the locks aren't released.
As another answer said, note how a using block ensures that IDisposable objects are disposed:
// FileShare.ReadWrite will allow other processes
// to read and write the target file even if other processes
// are working with the same file
using var mailinglist_FileStream = new FileStream(@"\foobarFile.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
using var mailinglist_Reader = new PeekingStreamReader(mailinglist_FileStream);
// Do your stuff. Using blocks will call Dispose() for
// you even if something goes wrong, as it's equal to a try/finally!
I am only reading this file. Can I do something similar to adLockOptimistic, so that multiple processes can access the file?
Yes, take a look at the File.Open method and the FileShare enumeration:
File.Open: http://msdn.microsoft.com/en-us/library/y973b725.aspx
FileShare enum: http://msdn.microsoft.com/en-us/library/system.io.fileshare.aspx
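For instance, a read that tolerates other readers and writers could look like this (the path and access level are illustrative):
using (FileStream fs = File.Open(@"C:\somefile.txt", FileMode.Open,
                                 FileAccess.Read, FileShare.ReadWrite))
using (StreamReader reader = new StreamReader(fs))
{
    string content = reader.ReadToEnd();
}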
Learn to use using:
using (FileStream fileStream = File.Open(@"C:\somefile", FileMode.Open, FileAccess.Read))
{
    ...
}
The using construct ensures that the file will be closed when you leave the block even if an exception is thrown.
Your problem might not be here, but somewhere else in your code. You'll have to go through all your code and look for places where you have opened files but not put them inside a using statement.
An old question, but unfortunately the given answers may not be applicable to it.
The problem specifically in Windows lies in two aspects of Windows behavior:
a) when the handle to the file, opened for writing, is closed, the Microsoft Antimalware Service opens the file to check the newly written data for malware;
b) the OS itself keeps the file opened for some time after all handles to it are closed. This time can be from seconds to many minutes depending on the nature of the file and other factors.
We saw this problem many times in our products and had to provide special support for it - our kernel-mode code attempts to close the file as soon as the last handle to it is closed.
Try using using blocks; they may not fix your lock problem, but they are better form for disposable objects.
using (FileStream mailinglist_FileStream = new FileStream(@"\foobarFile.txt", FileMode.Open))
{
    using (PeekingStreamReader mailinglist_Reader = new PeekingStreamReader(mailinglist_FileStream))
    {
        ...
    }
}
Also, try closing mailinglist_Reader before mailinglist_FileStream.
Ok guys, this one is a tough one.
The scenario:
I have multiple services running on multiple machines
Each service has multiple threads, and each thread writes a file on a FILER - the shared storage used by my machines (using a share such as \\filername\foo\bar)
The FILER machine is a NetApp machine
Both the FILER and the machines running the services are using SMB2 (http://en.wikipedia.org/wiki/Server_Message_Block)
The instruction used to write the file is as simple as the one listed below in [THE CODE]
[THE CODE]
using (StreamWriter outfile = new StreamWriter(pathToTheFile, false))
{
    outfile.Write(stringToWriteInTheFile);
}
[/THE CODE]
The problem:
Sometimes the service remains "stuck" on this instruction. The error given is:
The process cannot access the file '\\filername\foo\bar\myfile.txt' because it is being used by another process.
After a few of these errors, the service refuses to release the lock on the file. What happens then?
You can delete the file, but it is IMMEDIATELY recreated - as if some sort of permanent stream were alive, rewriting the file indefinitely.
You can stop the service: it's stuck and won't stop, so I forced a Thread.Abort (yeah, I know, bad practice, but what else?) after 2 minutes.
So the service is now stopped, but the machine retains a handle to the file, and you CANNOT kill the process keeping the handle alive except by rebooting the machine...
I don't know what to do right now, I think I tried everything.
Considerations:
Previously, the FILER and the machines were using SMB1, and this problem never arose. So I guess something fishy happens in the background, but I can't understand what...
I changed recently the code used to write the file, in a desperate attempt to "delegate" everything to .net. Now it's:
File.WriteAllText(pathToTheFile, stringToWriteInTheFile);
but my gut feeling is that, under the wraps, .NET is doing the exact same thing - the change is quite recent, though, so I can't say yet whether the "fix" is working or not.
EDIT (as per Vash's comment): Usually the file is different, but it can happen (and actually does happen) that multiple threads try to write the same file. However :( - shouldn't File.WriteAllText take care of concurrency issues?
Try explicitly opening a FileStream in "exclusive" mode, i.e.
using (var fs = new FileStream("path",
                               FileMode.Open, FileAccess.ReadWrite,
                               FileShare.None))
{
    using (var sw = new StreamWriter(fs))
    {
        ...
    }
}
Of course your code will have to anticipate that the file might be locked when it goes to write it and react appropriately. That part is left as an exercise for the reader :-)
Disclaimer: I have used this in a multi-threaded environment, but I can't guarantee it will work over Samba.
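That said, a sketch of a bounded retry loop for that exercise (the retry count and backoff are arbitrary; variable names are reused from the question):
const int maxAttempts = 5;
for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        using (var fs = new FileStream(pathToTheFile, FileMode.Create,
                                       FileAccess.Write, FileShare.None))
        using (var sw = new StreamWriter(fs))
        {
            sw.Write(stringToWriteInTheFile);
        }
        break; // success, stop retrying
    }
    catch (IOException)
    {
        if (attempt == maxAttempts) throw; // give up after the last attempt
        // Locked by another writer: back off briefly and try again
        System.Threading.Thread.Sleep(200 * attempt);
    }
}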
I have many processes reading a file stored on a network share. Originally I was only able to have one process read the file, all the others would throw exceptions. I implemented the following code to deal with that:
using (StreamReader fileStreamReader = new StreamReader(File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read)))
{
    content = fileStreamReader.ReadToEnd();
}
This let multiple processes read the same file; however, it still seems to have issues, because sometimes multiple processes still can't access the file, yet I can go back later when the file isn't in use and open it just fine. Right now I have some retry behavior with random delays implemented that, so far, seems to help. It feels a little quirky to do it this way, so what would be a better method?
This is the weird part: the exception I'm getting is not from file IO at all; it's from a library called CommStudio. In short, I dump the file to a string, modify it slightly, dump it into a memory stream, and ship it off over YModem on RS-232. The exception tells me the remote system has canceled. The device getting the data reports that there was a transmission error, which usually means that an incomplete/empty file was received.
Normally I would blame the library for this, but it works flawlessly in desk-testing and when there is only one process accessing the file. The only thing that really seems consistent is that it is likely to fail when multiple processes are accessing the file.
I had a similar problem but not a lot of time to find an ideal solution. I created a web service and kept the file local to the web service app, then created a simple one-liner GET API which was called over the office intranet, thus ensuring only the calling application edited the log file. Messy, but functional.
I have had a similar problem in the past. Try changing how you access the file to something like this.
// Use FileInfo to get around OS locking of the file
FileInfo fileInfo = new FileInfo(path);
// I actually wanted unblocked read/write access, so change your access and share appropriately
using (FileStream fs = fileInfo.Open(FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
    // I'm using CopyTo, but use whatever method matches your need
    fileInfo.CopyTo(Path.Combine(destination, fileName), false);
}
I am working on an app that will keep a running index of work accomplished.
I could write once at the end of a work session, but I don't want to risk losing data if something blows up. Therefore, I rewrite to disk (XML) every time a new entry or a correction is made by the user.
private void WriteIndexFile()
{
    XmlDocument IndexDoc = new XmlDocument();
    // Build document here
    XmlTextWriter tw = new XmlTextWriter(_filePath, Encoding.UTF8);
    tw.Formatting = Formatting.Indented;
    IndexDoc.Save(tw);
}
It is possible for the writes to be triggered in rapid succession. If this happens, it tries to open the file for writing before the prior write is complete. (While it would not be normal, I suppose it is possible that the file gets opened for use by another program.)
How can I check if the file can be re-written?
Edit for clarification: This is part of an automated lab data collection system. The users will click a button to capture data (saved in separate files) and identify the sub-task that the data package is for. Typically, it will be 3-10 minutes between clicks.
If they make an error, they need to be able to go back and correct it, so it's not an append-only usage.
Finally, the files will be read by other automated tools and manually by humans. (XML/XSLT)
The size will be limited as each work session (worker shift or less) will have a new index file generated.
Further question: As the overwhelming consensus is to not use XML and to write in an append-only mode, how would I satisfy the requirement of going back and correcting earlier entries?
I am considering having a "dirty" flag, and saving a few minutes after the flag is set and upon closing the work session. If multiple edits happen in that time, only one write will occur - no more rapid rewrites - and also having a retry/cancel dialog if the save fails. Thoughts?
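Roughly, what I have in mind (illustrative names; wiring _saveTimer.Elapsed += OnSaveTimerElapsed is assumed to happen in the constructor):
private bool _dirty;
private readonly System.Timers.Timer _saveTimer = new System.Timers.Timer(120000); // ~2 minutes

private void MarkDirty()
{
    _dirty = true;
    _saveTimer.Start(); // no-op if the timer is already running
}

private void OnSaveTimerElapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    if (!_dirty) return;
    try
    {
        WriteIndexFile(); // the method above
        _dirty = false;
    }
    catch (IOException)
    {
        // File busy: leave _dirty set so the next tick retries,
        // or show the retry/cancel dialog here
    }
}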
XML is a poor choice in your case because new content has to be inserted before the closing tag. Use text instead: simply open the file for append and write the new content at the end, see How to: Open and Append to a Log File.
You can also look into a simple logging framework like log4net and use that instead of handling the low-level file stuff yourself.
If all you want is a simple log of all operations, XML may be the wrong choice here as it is difficult to append to an XML document without rewriting the whole file, which will become slower and slower as the file grows.
I'd suggest instead File.AppendText or, even better, keeping the file open for the duration of the application's lifetime and using WriteLine.
(Oh, and as others have pointed out, you need to lock to ensure that only one thread writes to the file at a time. This is still true even with this solution.)
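For example, a minimal append-only write with that lock could look like this (names are illustrative):
private static readonly object _logLock = new object();

private static void AppendLogEntry(string logPath, string message)
{
    lock (_logLock) // one writer at a time, per the note above
    {
        using (StreamWriter sw = File.AppendText(logPath))
        {
            sw.WriteLine("{0:o}\t{1}", DateTime.Now, message);
        }
    }
}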
There are also logging frameworks that already solve this problem, such as log4net. Have you considered using an existing logging framework instead of rolling your own?
I have a logger that uses System.Collections.Queue. Basically, it waits until something is queued, then tries to write it. While items are being written, which could be slow, more items can be added to the queue.
This also helps by grouping messages rather than trying to keep up. It runs on a separate thread.
private AutoResetEvent ResetEvent { get; set; }

public void LogMessage(string fullMessage)
{
    this.logQueue.Enqueue(fullMessage);
    // Trigger the reset event to wake the processing thread
    this.ResetEvent.Set();
}

private void ProcessQueueMessages()
{
    while (this.Running)
    {
        // This will process all the items in the queue.
        while (this.logQueue.Count > 0)
        {
            // This method will just log the top item on the queue
            this.LogQueueItem();
        }
        // Once the queue is empty, wait for another message to be
        // queued before running again. Waiting on the event avoids
        // a System.Threading.Thread.Sleep(1000) polling loop.
        this.ResetEvent.WaitOne();
    }
}
I handle write failures by not dequeueing an item until it has been written to the file with no errors; I just keep attempting until the write finally succeeds. This saved me once when somebody removed permissions from one of our apps while it was running. Permission was given back without shutting down our app, and we didn't lose a single log statement.
Consider using a flat text file. I have a process that I wrote that uses an XML log... it was a poor choice. You can't just write out the state as you run without constantly rewriting the file to make sure the tags are correct. With flat entries appended to a file, you get an automatic timeline that can give you details of what happened, without trying to figure out whether it was the XML writer/tag set that blew up, and you don't have to worry about your logs bloating as much.
I agree with others suggesting you avoid XML. Also, I would suggest you have one component (a "monitor") that is responsible for all access to the file. That component will have the job of handling multiple simultaneous requests and making the disk writes happen one after another.
I am developing a tool in C#. At one point, I start writing into an XML file continuously using my tool; when I suddenly restart my machine, that XML file gets corrupted. What is the reason, and how do I avoid it?
XmlDocument x = new XmlDocument();
x.Load(filePath);   // filePath is a placeholder for the document's path
// change the value of a node every time
x.Save(filePath);
x = null;
This is my code.
Use the "safe replace pattern". For example, to replace foo.txt
Write to foo.new
Move foo.txt to foo.old
Move foo.new to foo.txt
Delete foo.old
At any point, you have at least one complete, valid file.
(That helps if you want to write a new file periodically; for appending, I'd go with the answers suggesting that XML isn't the best way forward for you.)
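A sketch of those steps in code (File.Replace bundles the swap and backup into one call; newContent stands in for the complete new file body):
// Step 1: write the complete new version to a temporary name
File.WriteAllText("foo.new", newContent);

if (File.Exists("foo.txt"))
{
    // Steps 2-4: swap foo.new into place, keeping foo.old as the backup
    File.Replace("foo.new", "foo.txt", "foo.old");
    File.Delete("foo.old");
}
else
{
    File.Move("foo.new", "foo.txt"); // first run: nothing to replace yet
}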
Don't use XML.
XML has a syntax which doesn't lend itself well to being written continuously to the same file, as you always need a final end tag, which you can't write unless the file is complete (which it never is with log files, for example).
That means you will always get an invalid XML file when you cancel the writing prematurely (by killing the process or restarting the computer, etc.).
We had a similar situation a while ago and settled on YAML as a nice format which allows for simply appending to the file.
Check that your file is properly closed before the application shuts down.
Also, as someone has pointed out, an XML file must be properly ended with closing tags.
Additional details would also be useful, such as the code that you use to open, write and close the file.
The reason for your file getting corrupted is that, due to the crash, you never closed it.
I remember solving an issue like that once with a file overlapped flag, but that was in C++ using the CreateFile method.