I've stumbled upon a somewhat unusual issue with File.WriteAllLines.
I have code that looks like this:
File.WriteAllLines(filename, data);
bool exists = File.Exists(filename);
The problem is that sometimes file writing fails, but does not raise an exception, and the code thinks the file exists when it doesn't.
The file is in a network location.
The file name is Database.lock. Does a lock extension mean anything to the OS?
Exists returns true, but the file is simply not there. No exception is raised.
Calling Exists from a separate process returns false.
Calling Process.Start(filename) results in an error (not a code exception, just the OS saying it can't find the file).
The local machine is running Windows 7.
The remote machine is running Windows XP.
How can I debug what's going on here?
Update
Following David's advice, I watched the process using procmon.exe.
This is the result: http://i.imgur.com/IBz6Ujt.png
You'll notice there are a lot of things going on repetitively, which I don't fully understand, and at the end the file is reported to have been written successfully.
Solved
Thanks to Patrick's suggestion, I discovered that, due to a code path I hadn't taken into consideration, the file was getting immediately deleted in a different part of the code. Sorry for wasting everyone's time. I am relieved, though, that it was just my own thoughtlessness rather than unforeseeable network issues.
This could be a permissions issue. File.Exists will return false if any error occurs while checking, including a lack of read permissions for the file. It could be that you are running the code that creates the file from Visual Studio, which has admin privileges, while LINQPad runs with other permissions that lack read access to that location.
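As a diagnostic sketch (the share path below is illustrative): since File.Exists swallows every error and just returns false, opening the file and inspecting the exception tells you more.
using System;
using System.IO;

string path = @"\\server\share\Database.lock"; // illustrative path

try
{
    using (File.OpenRead(path)) { } // open and immediately close
    Console.WriteLine("File exists and is readable.");
}
catch (FileNotFoundException) { Console.WriteLine("Genuinely missing."); }
catch (UnauthorizedAccessException) { Console.WriteLine("There, but not readable (permissions)."); }
catch (IOException ex) { Console.WriteLine("Other I/O problem: " + ex.Message); }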
Related
I have read a similar post, but I just can't figure out the problem.
I have changed the Windows permissions and tried different paths.
When I try to save a file, it throws the exception:
Access to the path **** denied.
string route = "D:\\";
FileStream fs = new FileStream(route, FileMode.Create); // <-- here is the problem
StreamWriter write = new StreamWriter(fs);
patient person = new patient();
person.name = textBox1.Text;
person.name2 = textBox2.Text;
You are trying to create a FileStream object for a directory (folder). Specify a file name (e.g. @"D:\test.txt") and the error will go away.
By the way, I would suggest that you use the StreamWriter constructor that takes an Encoding as its second parameter, because otherwise you might be in for an unpleasant surprise when trying to read the saved file later (using StreamReader).
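A minimal sketch of both suggestions together (the file name and encoding are illustrative choices; textBox1 is from the question's code):
using System.IO;
using System.Text;

string route = @"D:\test.txt"; // a file, not just the drive root
using (FileStream fs = new FileStream(route, FileMode.Create))
using (StreamWriter write = new StreamWriter(fs, Encoding.UTF8)) // explicit encoding
{
    write.WriteLine(textBox1.Text);
}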
Did you try specifying a file name?
e.g.:
string route = "D:\\somefilename.txt";
tl;dr version: Make sure you are not trying to open a file marked in the file system as Read-Only in Read/Write mode.
I have come across this error in my travels trying to read in an XML file.
I have found that in some circumstances (detailed below) this error would be generated for a file even though the path and file name are correct.
File details:
The path and file name are valid, the file exists
Both the service account and the logged in user have Full Control permissions to the file and the full path
The file is marked as Read-Only
It is running on Windows Server 2008 R2
The path to the file was using local drive letters, not a UNC path
When trying to read the file programmatically, the following behavior was observed while running the exact same code:
When running as the logged in user, the file is read with no error
When running as the service account, trying to read the file generates the Access Is Denied error with no details
In order to fix this, I had to change the method call from the default (opening as read/write) to opening the file as read-only. Once I made that one change, it stopped throwing the error.
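A hedged sketch of that change (the path and the XML usage are illustrative):
using System.IO;
using System.Xml;

// FileAccess.Read avoids the write-access check that fails on a file
// whose Read-Only attribute is set.
using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
{
    XmlDocument doc = new XmlDocument();
    doc.Load(fs);
}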
I had this issue for longer than I would like to admit.
I simply needed to run VS as an administrator; rookie mistake on my part...
Hope this helps someone <3
If your problem persists after all those answers, try changing the file attributes to:
File.SetAttributes(yourfile, FileAttributes.Normal);
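For context, a hedged sketch of where that call fits; "yourfile" stands in for the path in question:
using System.IO;

// Clear Read-Only/Hidden/System flags before opening the file for writing.
File.SetAttributes(yourfile, FileAttributes.Normal);
using (FileStream fs = new FileStream(yourfile, FileMode.Create))
{
    // write as usual
}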
You may not have permission to access the file. First make sure you can access that drive, then provide the name of the file to create:
string route = @"E:\Sample.text";
FileStream fs = new FileStream(route, FileMode.Create);
Try this; now you should be able to create the file.
TL;DR: On my end, it had something to do with AVAST! => Whitelist your application.
All of a sudden, I also got this UnauthorizedAccessException problem in the Windows WPF program I'm writing. None of the solutions worked. I couldn't figure out how to elevate my application to full privileges (not using VS), and since I was already on the administrator account, I didn't feel the need to dig that deep into permission concerns.
The files are image files (jpg, psd, webp, etc.). I wasn't trying to open/write a directory; it has always been a valid path to a file, and I needed to write to the file, so FileAccess.ReadWrite was inevitable. The files (and all of their parent directories) were not read-only (I even checked in code, via FileInfo.IsReadOnly, prior to calling new FileStream(path, mode, access, share)) - so what happened all of a sudden???
Thinking about it: I had a hard drive crash, so I unpacked a backup of my solution code from another drive. In the meantime, I had added code to my application to P/Invoke APIs that read physical hard drive sectors directly, as well as USB plug/unplug monitoring.
I started to get the exception when I added those, but even after I temporarily removed the related code from the application, I still got the UnauthorizedAccessException.
Then I remembered something I had done long ago for a painstakingly similar issue, where I wanted my application to send sensitive data over Wi-Fi: add the executable to the AVAST exceptions, along with the assembly directory (my app was already among the apps authorized through the firewall).
I just did the same for my application in the AVAST settings, AND THE EXCEPTION IS GONE!!! For two whole days I had been lurking on StackOverflow and the web to get moving again. FINALLY!
Details: I can't pinpoint exactly what AVAST didn't like about my application, as the only changes I made were:
Retrieved then launched the backup code - it worked like a charm; files (images) opened/wrote without problems (3 days ago)
Added USB detection (3 days ago - just tested the code, didn't try to open an image)
Added P/Invoke physical drive direct read (2 days ago - FileStream, plus the logic to define where/how to scan the damaged drive - just tested the code, didn't try to open an image)
Added image format detection, starting with Jpg/Jfif (2 days ago) - got the exception upon testing the code
While searching for solutions, added an Image Gallery WPF UserControl to display pictures based on their signature and check which files give the exception: almost all of them (some files open/write okay - why???)
Tried everything I found on SO (over the last 2 days) until I opened the AVAST settings and whitelisted my application
... now I can move on to adding a bunch of file signatures so I can retrieve as much data as I can.
Hopefully this helps those who, like me, aren't failing on the classic "I'm passing a directory path instead of a file path", yet have no time to learn exactly why antiviruses think our own code is malware.
Just using the below worked for me on OSX.
var path = "TempForTest";
I have a WinForms (.NET C#) OLTP application based on Oracle.
From our support environment we regularly experience loss of connectivity to the database, and a resulting minidump file is generated (by what, I am not entirely certain) - apparently it does not cause the application to crash, but in order to actually do anything you have to close it and start it again.
After many such minidumps have been created in the same directory, the minidumps suddenly start getting rather strange file names - file names that are apparently "illegal" on Windows.
For instance we have a file name like:
"°÷ƒ
_minidump_default_pid_20248_tid_x19AC_2015_9_1_8_31_51.dmp"
And yes the carriage return is PART of the file name.
We discovered this because log4net watches the directory and all of a sudden started to throw unhandled exceptions due to these invalid file names.
So we are trying to figure out why the minidump is generated in the first place, but the question here is, can we somehow prevent the minidump from being generated with an invalid filename or otherwise control the naming process?
Secondly, does anybody know why is it even possible to create invalid file names in the first place?
Update:
For anyone looking at this trying to figure out why the dump files are created in the first place: our issue was that Windows was generating them when it was close to running out of memory, but for some reason we wouldn't always get an OutOfMemoryException.
First, you should really try to find out how those dumps are generated. Microsoft, for example, provides a nice way via a registry key called LocalDumps, which has been a great help for me. I am sure that this approach won't generate invalid file names like the ones above.
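For reference, a hedged sketch of what the documented LocalDumps values look like in a .reg file (the folder, count, and dump type are illustrative choices; DumpFolder is documented as REG_EXPAND_SZ, shown here as a plain string for brevity):
Windows Registry Editor Version 5.00

; DumpType 2 = full dump, 1 = mini dump; DumpCount caps how many dumps are kept.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps]
"DumpFolder"="C:\\CrashDumps"
"DumpCount"=dword:0000000a
"DumpType"=dword:00000002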
Second, if the application does not crash, it has probably registered an unhandled exception handler. This is basically OK and designed to write crash dumps, but it means the unhandled exception is handled by the crashing process itself. How can the code handling the situation be sure that it is not itself affected by the crash? The better option is to let Windows, as the OS, handle the crash. Then the Windows kernel (which is not affected by the crash) can really deal with the situation. That's what LocalDumps does.
Third, direct file system access is possible in Windows via paths that start with \\.\ when passed to the Windows API. Starting a path like that skips the usual file name checks, so you can generate files with reserved characters such as *, ?, : or newlines, as you observed. The unhandled exception handler of your application is probably doing that and is affected by the crash in a way that parts of the file name get overwritten.
Chkdsk should be able to repair the file system.
Please check whether you are installing from a network path like \\remoteserver\d$\client.
If so, change it to \\remoteserver\d\client.
A "$" (administrative share) in the share path causes issues when extracting files under elevated permissions.
I have the following piece of code in my application:
if (!Directory.Exists(myPath))
Directory.CreateDirectory(myPath);
If I run it in a regular unit test, sometimes it passes, sometimes not. The directory is always there (I made sure of it, so technically it will never be "created" by code). But every once in a while Directory.Exists(myPath) returns false, which makes the code try to create the folder, and then I get an UnauthorizedAccessException!
The funny thing here is that if I put a breakpoint on the CreateDirectory and then move the yellow arrow back up to the test, the test returns true!
What's going on?
myPath is \\nameOfLocalMachine\sharedFolder. The share is reliable and constantly used... .NET 4.0
I just made Fiddler simulate 3000 sequential requests. 175 failed, all with the same message:
Access to the path '\\nameOfLocalMachine\sharedFolder\randomFileName.json' is denied
This mishap is pretty normal on Windows. Programs open a handle on a directory like this and specify delete sharing, which permits anybody to delete the directory even though the program is using it. The directory won't actually disappear from the file system until that handle is closed. It follows that trying to recreate the directory cannot work; it still exists. Windows generates an "access denied" error, reported in your C# program as an UnauthorizedAccessException.
While that sounds like an obscure feature, every program in Windows does this. Every process has a default working directory, the value of Environment.CurrentDirectory. Creating a handle on such a directory ensures that it cannot disappear while the program is using it. There are other cases: FileSystemWatcher would be another example, or a program busy iterating the directory. Anti-malware and search indexers are notorious sources of such hard-to-diagnose errors.
Otherwise this is a standard hazard of a multi-tasking operating system; you are not the only one using the file system. Not repeatedly deleting and creating the same directory ought to be very high on your list. If it is absolutely necessary, then rename the directory before you delete it. You'd still fail to delete the renamed directory, but you won't fail recreating it. You can delete it later, the next time you need to do this; the odds of trouble are much lower because more time has passed.
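A hedged sketch of that rename-first approach (the names are illustrative, and myPath is the directory from the question):
using System;
using System.IO;

// Rename the doomed directory out of the way, recreate the original name
// immediately, and only best-effort delete the renamed one.
string doomed = myPath + "." + Guid.NewGuid().ToString("N");
Directory.Move(myPath, doomed);
Directory.CreateDirectory(myPath);
try { Directory.Delete(doomed, true); }
catch (IOException) { /* still in use; retry on a later pass */ }
catch (UnauthorizedAccessException) { /* same story */ }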
I am trying to rename a file/folder. When I try to rename a file, I get an error that the file is already in use - it is just my guess that this is caused by the w3wp.exe IIS process? Sometimes it says access to the path is denied, although the file does exist and there are no special permissions; I have all the permissions to copy/delete/move and everything else for the file/folder.
How do I fix this problem?
The folder contains JPEG files.
This happens when I copy a file and then try to rename it.
This happens when I rename a file and then try to delete it.
What I mean to say is that it happens when I have already performed one file operation; the second operation then gives me this error :(
This is the error:
The process cannot access the file 'C:\images\audio-aif-old.png' because it is being used by another process.
File.Move(source, destination);
I am using C#, IIS 6, ASP.NET.
Directly, the answer is no. But you can delete the old copy and create a new copy. See this: http://www.aspnettutorials.com/tutorials/file/file-renfile-aspnet2-csharp.aspx
Some other program must relinquish the file. If you've written a program that's still running, then it must be shut down. If the file is currently open for writing, you must also ensure that it's been appropriately closed. Try creating another file and see if you have the same problem with that one.
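One common culprit with image files, as an illustrative sketch rather than a diagnosis of your exact code: System.Drawing.Image.FromFile keeps the file locked until the Image is disposed, so a follow-up Move or Delete fails.
using System.Drawing;
using System.IO;

using (Image image = Image.FromFile(source))
{
    // work with the image; the file handle stays open inside this block
}
// the handle is released here, so the rename can now proceed
File.Move(source, destination);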
When I call FileInfo(path).LastAccessTime or FileInfo(path).LastWriteTime on a file that is in the process of being written, it returns the time the file was created, not the last time it was written to (i.e. now).
Is there a way to get this information?
Edit: To all the responses so far: I hadn't tried Refresh(), but that does not do it either. I am returned the time the file started being written to. The same goes for the static method, and for creating a new instance of FileInfo.
Codymanix might have the answer, but I'm not running Windows Server (using Windows 7), and I don't know where the setting is to test.
Edit 2: Does nobody find it interesting that this function doesn't seem to work?
The FileInfo values are only loaded once and then cached. To get the current value, call Refresh() before getting a property:
f.Refresh();
t = f.LastAccessTime;
Another way to get the current value is by using the static methods on the File class:
t = File.GetLastAccessTime(path);
Starting in Windows Vista, last access time is not updated by default. This is to improve file system performance. You can find details here:
http://blogs.technet.com/b/filecab/archive/2006/11/07/disabling-last-access-time-in-windows-vista-to-improve-ntfs-performance.aspx
To re-enable last access time updates on the computer, you can run the following command:
fsutil behavior set disablelastaccess 0
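To check the current state first (from an elevated prompt), there is also a query form:
fsutil behavior query disablelastaccess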
As James has pointed out, LastAccessTime is not updated.
LastWriteTime has also undergone a twist since Vista. When a process still has the file open and another process checks the LastWriteTime, it will not see the new write time for a long time - not until the writing process has closed the file.
As a workaround, you can open and close the file from your external process. After you have done that, you can read the LastWriteTime again, and it will then be up to date.
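A hedged sketch of that workaround (the path is illustrative):
using System;
using System.IO;

// Opening and closing the file from the observing process nudges the
// file system into updating the metadata that FileInfo reads.
using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)) { }
FileInfo fi = new FileInfo(path);
DateTime lastWrite = fi.LastWriteTime; // now reflects recent writes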
File System Tunneling:
If an application implements something like a rolling logger, which closes a file and then renames it to a different file name, you will also run into issues, since the creation time and file size of the "old" file are remembered by the OS even though you created a new file. This includes wrong reports of the file size, even if you recreated log.txt from scratch and it is still 0 bytes in size. This feature is called OS file system tunneling, and it is still present in Windows 8.1. For an example of how to work around this issue, check out the RollingFlatFileTraceListener from Enterprise Library.
You can see the effects of file system tunneling on your own machine from the cmd shell.
echo test > file1.txt
ren file1.txt file2.txt
Wait one minute
echo test > file1.txt
dir /tc file*.txt
...
05.07.2015 19:26 7 file1.txt
05.07.2015 19:26 7 file2.txt
The file system is a state machine. Keeping states correctly synchronized is hard if you care about performance and correctness.
This strange tunneling syndrome is apparently still relied on by applications which, for example, autosave a file, move it to a safe location, and then recreate the file at the same location. For these applications it makes no sense to give the file a new creation date, because it was only copied around. Some installers also do such tricks, moving files temporarily to a different location and writing the contents back later, to get past "file exists" checks in install hooks.
Have you tried calling Refresh() just before accessing the property (to avoid getting a cached value)? If that doesn't work, have you looked at what Explorer shows at the same time? If Explorer is showing the wrong information, then it's probably something you can't really address - it might be that the information is only updated when the file handle is closed, for example.
There is a setting in Windows, sometimes enabled especially on server systems, that stops modified and accessed times for files from being updated, for better performance.
From MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
FileSystemInfo.Refresh()
If your application is the one doing the writing, I think you are going to have to "touch" the file by setting the LastWriteTime property yourself between each buffer of data you write. Some pseudocode:
while (bytesWritten < totalBytes)
{
    stream.Write(buffer, 0, buffer.Length); // write the next buffer
    bytesWritten += buffer.Length;
    myFileInfo.LastWriteTime = DateTime.Now; // "touch" the file
}
I'm not sure how severely this will affect write performance.
Tommy Carlier's answer got me thinking....
A good way to visualise the difference is to separately run the two snippets below (I just used LINQPad) while also running Sysinternals Process Monitor.
while (true)
    File.GetLastAccessTime([file path here]);
and
FileInfo bob = new FileInfo(path);
while (true)
{
    string accessed = bob.LastAccessTime.ToString();
}
If you look at Process Monitor while running the first snippet, you will see repeated, constant access attempts to the file from the LINQPad process. The second snippet will do an initial access of the file, for which you will see activity in Process Monitor, and then very little afterwards.
However, if you go and modify the file (I just opened the text file I was monitoring through FileInfo, added a character, and saved it), you will see a series of access attempts by the LINQPad process in Process Monitor.
This illustrates the non-cached and cached behaviour of the two approaches respectively.
Will the non-cached approach wear a hole in the hard drive?!
EDIT
I went away feeling all clever after my testing, and then used the caching behaviour of FileInfo in my Windows service (basically to sit in a loop asking "has the file changed? has the file changed?..." before doing processing).
While this approach worked on my dev box, it did not work in the production environment; the process just kept running regardless of whether the file had changed or not. I ended up changing my approach to the check and just used GetLastAccessTime as part of it. I don't know why it would behave differently on the production server... but I am not too concerned at this point.
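For what it's worth, a hedged sketch of that final non-cached polling approach (the path and interval are illustrative):
using System;
using System.IO;
using System.Threading;

// The static File methods hit the file system on every call instead of
// caching, so changes show up without needing Refresh().
DateTime last = File.GetLastAccessTime(path);
while (true)
{
    DateTime current = File.GetLastAccessTime(path);
    if (current != last)
    {
        last = current;
        // the file changed - do the processing here
    }
    Thread.Sleep(1000);
}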