IOException "Cyclic Rendundancy Check" FileStream.CopyTo using C#.Net - c#

I have some C# code that uses FileStreams to open a PhysicalDrive and take an image of the whole disk, but it consistently throws an IOException with the message "Data error (cyclic redundancy check)." after copying about 121MB of a 128MB disk.
using Microsoft.Win32.SafeHandles;
using System.IO;
using System;
using System.Runtime.InteropServices; // needed for DllImport and Marshal

[DllImport("kernel32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall, SetLastError = true)]
protected static extern SafeFileHandle CreateFile(
    string lpFileName,
    uint dwDesiredAccess,
    uint dwShareMode,
    IntPtr SecurityAttributes,
    uint dwCreationDisposition,
    uint dwFlagsAndAttributes,
    IntPtr hTemplateFile
);

public void MakeImage()
{
    SafeFileHandle TheDevice = null;
    try
    {
        // FILE_ATTRIBUTE_SYSTEM (0x04) is a const defined elsewhere in the class
        TheDevice = CreateFile(@"\\.\PHYSICALDRIVE1", (uint)FileAccess.Read, (uint)0, IntPtr.Zero, (uint)FileMode.Open, (uint)FILE_ATTRIBUTE_SYSTEM, IntPtr.Zero);
        if (TheDevice.IsInvalid) { throw new IOException("Unable to access drive. Win32 Error Code " + Marshal.GetLastWin32Error()); }

        using (FileStream Dest = System.IO.File.Open("output.bin", FileMode.Create))
        {
            using (FileStream Src = new FileStream(TheDevice, FileAccess.Read))
            {
                Src.CopyTo(Dest);
                Src.Close();
            }
            Dest.Close();
        }
    }
    catch (Exception Ex)
    {
        // Here is where I am getting the IOException
        // Handle error..
    }
    finally
    {
        if (TheDevice != null)
        {
            if (!TheDevice.IsClosed)
                TheDevice.Close();
            TheDevice.Dispose();
        }
    }
}
I have run Scan Disk on the drive in question and there doesn't appear to be anything wrong with it. If I change the first parameter of CreateFile to just read a partition (not what I want to do), then the image is created fine.
This is a follow-up to "Windows C# implementation of linux dd command", which I've been trying to do.
UPDATE:
Further investigation shows that the error has something to do with not being able to get or know the Src.Length property. I changed my code to copy byte by byte and keep a count; it errors after 126,959,616 bytes, which is 1 byte more than the total size of the image produced by dd.

The cyclic redundancy check, or "CRC", error indicates that the disk is corrupt or damaged in some way, probably a bad spot on the disk. If you're seeing it when trying to copy a file, the bad spot may be in the file itself.
Try copying the file to some other disk and see if that works.

The cause of the IOException was reading past the end of the stream. For some reason, when opening a physical disk as a FileStream, I am unable to get the Length of the stream or otherwise find out its size. The solution is to use the System.Management namespace objects to find out the length of the device file in bytes, and then use Read to copy the correct amount of data from the stream.
I think the reason this would happen after 121MB is that something along the way was using 1000 bytes to a KB instead of 1024, so that was a bit of a red herring on my part.
I still don't understand why it would throw a CRC error and not a "Sector Not Found" or "Read Past the End of the Stream" type message.
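For reference, here is a minimal sketch of the approach described above: query WMI's Win32_DiskDrive class for the exact device size, then run a bounded read loop in place of Src.CopyTo(Dest). The 1 MB buffer size and the exact WQL query form are assumptions.
using System.Management; // add a project reference to System.Management.dll

// Ask WMI for the exact size, in bytes, of the physical drive
ulong driveSize = 0;
var query = new SelectQuery(@"SELECT Size FROM Win32_DiskDrive WHERE DeviceID = '\\\\.\\PHYSICALDRIVE1'");
using (var searcher = new ManagementObjectSearcher(query))
{
    foreach (ManagementBaseObject drive in searcher.Get())
        driveSize = (ulong)drive["Size"];
}

// Inside MakeImage(), replace Src.CopyTo(Dest) with a copy bounded by driveSize:
long remaining = (long)driveSize;
byte[] buffer = new byte[1024 * 1024]; // 1 MB, a multiple of the sector size
while (remaining > 0)
{
    int toRead = (int)Math.Min(buffer.Length, remaining);
    int read = Src.Read(buffer, 0, toRead);
    if (read <= 0) break;
    Dest.Write(buffer, 0, read);
    remaining -= read;
}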

IOException: The process cannot access the file 'file path' because it is being used by another process

I have some code, and when it executes, it throws an IOException, saying that
The process cannot access the file 'filename' because it is being used by
another process
What does this mean, and what can I do about it?
What is the cause?
The error message is pretty clear: you're trying to access a file, and it's not accessible because another process (or even the same process) is doing something with it (and it didn't allow any sharing).
Debugging
It may be pretty easy to solve (or pretty hard to understand), depending on your specific scenario. Let's see some.
Your process is the only one to access that file
You're sure the other process is your own process. If you know you open that file in another part of your program, then first of all you have to check that you properly close the file handle after each use. Here is an example of code with this bug:
var stream = new FileStream(path, FileMode.Open, FileAccess.Read);
var reader = new StreamReader(stream);
// Read data from this file, when I'm done I don't need it any more
File.Delete(path); // IOException: file is in use
Fortunately FileStream implements IDisposable, so it's easy to wrap all your code inside a using statement:
using (var stream = File.Open("myfile.txt", FileMode.Open)) {
// Use stream
}
// Here stream is not accessible and it has been closed (also if
// an exception is thrown and the stack is unwound)
This pattern will also ensure that the file won't be left open in case of exceptions (it may be the reason the file is in use: something went wrong, and no one closed it; see this post for an example).
If everything seems fine (you're sure you always close every file you open, even in case of exceptions) and you have multiple working threads, then you have two options: rework your code to serialize file access (not always doable, and not always wanted) or apply a retry pattern. It's a pretty common pattern for I/O operations: you try to do something and, in case of error, you wait and try again (did you ask yourself why, for example, the Windows Shell takes some time to inform you that a file is in use and cannot be deleted?). In C# it's pretty easy to implement (see also better examples about disk I/O, networking, and database access).
private const int NumberOfRetries = 3;
private const int DelayOnRetry = 1000;
for (int i=1; i <= NumberOfRetries; ++i) {
try {
// Do stuff with file
break; // When done we can break loop
}
catch (IOException e) when (i < NumberOfRetries) { // let the last attempt's exception propagate
// You may check error code to filter some exceptions, not every error
// can be recovered.
Thread.Sleep(DelayOnRetry);
}
}
Please note a common error we see very often on StackOverflow:
var stream = File.Open(path, FileMode.Open);
var content = File.ReadAllText(path);
In this case ReadAllText() will fail because the file is in use (File.Open() in the line before). To open the file beforehand is not only unnecessary but also wrong. The same applies to all File functions that don't return a handle to the file you're working with: File.ReadAllText(), File.WriteAllText(), File.ReadAllLines(), File.WriteAllLines() and others (like File.AppendAllXyz() functions) will all open and close the file by themselves.
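In other words, the fix is simply to drop the preliminary File.Open():
var content = File.ReadAllText(path); // opens, reads and closes the file by itself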
Your process is not the only one to access that file
If your process is not the only one to access that file, then interaction can be harder. A retry pattern will help (if the file shouldn't be open by anyone else but it is, then you need a utility like Process Explorer to check who is doing what).
Ways to avoid
When applicable, always use using statements to open files. As said in previous paragraph, it'll actively help you to avoid many common errors (see this post for an example on how not to use it).
If possible, try to decide who owns access to a specific file and centralize access through a few well-known methods. If, for example, you have a data file where your program reads and writes, then you should box all I/O code inside a single class. It'll make debug easier (because you can always put a breakpoint there and see who is doing what) and also it'll be a synchronization point (if required) for multiple access.
Don't forget that I/O operations can always fail; a common example is this:
if (File.Exists(path))
File.Delete(path);
If someone deletes the file after File.Exists() but before File.Delete(), then it'll throw an IOException in a place where you may wrongly feel safe.
Whenever it's possible, apply a retry pattern, and if you're using FileSystemWatcher, consider postponing action (because you'll get notified, but an application may still be working exclusively with that file).
Advanced scenarios
It's not always so easy, so you may need to share access with someone else. If, for example, you're reading from the beginning and writing to the end, you have at least two options.
1) Share the same FileStream, with proper synchronization functions (because it is not thread-safe).
2) Use the FileShare enumeration to instruct the OS to allow other processes (or other parts of your own process) to access the same file concurrently.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.Read))
{
}
In this example I showed how to open a file for writing and share it for reading; please note that when reading and writing overlap, you get undefined or invalid data. This is a situation that must be handled when reading. Also note that this doesn't make access to the stream thread-safe, so the object can't be shared across multiple threads unless access is synchronized somehow (see the previous links). Other sharing options are available, and they open up more complex scenarios; please refer to MSDN for details.
In general, N processes can read from the same file simultaneously, but only one should write. In a controlled scenario you may even enable concurrent writes, but that can't be generalized in a few paragraphs inside this answer.
Is it possible to unlock a file used by another process? It's not always safe and not so easy but yes, it's possible.
Using FileShare fixed my issue of opening a file even when it is opened by another process.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Problem
One is trying to open a file with System.IO.File.Open(path, FileMode) and wants shared access to the file, but
if you read the documentation of System.IO.File.Open(path, FileMode), it explicitly says it does not allow sharing.
Solution
You have to use the overload that takes a FileShare:
using FileStream fs = System.IO.File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
with FileShare.Read.
Had an issue while uploading an image and couldn't delete it afterwards; found a solution.
//C# .NET
var image = Image.FromFile(filePath);
image.Dispose(); // releases the file handle held by the Image
//later...
File.Delete(filePath); //now works
As other answers in this thread have pointed out, to resolve this error you need to carefully inspect the code, to understand where the file is getting locked.
In my case, I was sending out the file as an email attachment before performing the move operation.
So the file was locked for a couple of seconds until the SMTP client finished sending the email.
The solution I adopted was to move the file first, and then send the email. This solved the problem for me.
Another possible solution, as pointed out earlier by Hudson, would've been to dispose the object after use.
public static void SendEmail()
{
    MailMessage mMailMessage = new MailMessage();
    // setup other email stuff
    Attachment attachment = null;
    if (File.Exists(attachmentPath))
    {
        attachment = new Attachment(attachmentPath);
        mMailMessage.Attachments.Add(attachment);
    }
    // ... send the message here ...
    attachment?.Dispose(); // dispose only after the send completes; this releases the file
}
I got this error because I was doing File.Move to a file path without a file name; you need to specify the full path, including the file name, in the destination.
The error indicates another process is trying to access the file. Maybe you or someone else has it open while you are attempting to write to it. "Read" or "Copy" usually doesn't cause this, but writing to it or calling delete on it would.
There are some basic things to avoid this, as other answers have mentioned:
In FileStream operations, place it in a using block with a FileShare.ReadWrite mode of access.
For example:
using (FileStream stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Note that FileAccess.ReadWrite is not possible if you use FileMode.Append.
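For appending, the valid combination is write-only access, for example (a minimal sketch):
using (FileStream stream = File.Open(path, FileMode.Append, FileAccess.Write, FileShare.Read))
{
}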
I ran across this issue when I was using an input stream to do a File.SaveAs while the file was in use. In my case I found I didn't actually need to save it back to the file system at all, so I ended up just removing that, but I probably could've tried creating a FileStream in a using statement with FileShare.ReadWrite, much like the code above.
Saving your data to a different file, going back to delete the old one when it is found to be no longer in use, and then renaming the file that saved successfully to the original name is one option. How you test for the file being in use is accomplished through the
List<Process> lstProcs = ProcessHandler.WhoIsLocking(file);
line in my code below, and this could be done in a Windows service, on a loop, if you have a particular file you want to watch and delete regularly when you want to replace it. If you don't always have the same file, a text file or database table could be kept updated with the file names for the service to check; it would then perform the check for processes, and subsequently the process kills and deletion, as I describe in the next option. Note that you'll need an account user name and password with Admin privileges on the given computer to perform the deletion and the ending of processes.
When you don't know if a file will be in use when you are trying to save it, you can close all processes that could be using it, like Word, if it's a Word document, ahead of the save.
If it is local, you can do this:
ProcessHandler.localProcessKill("winword.exe");
If it is remote, you can do this:
ProcessHandler.remoteProcessKill(computerName, txtUserName, txtPassword, "winword.exe");
where txtUserName is in the form of DOMAIN\user.
Let's say you don't know the process name that is locking the file. Then, you can do this:
List<Process> lstProcs = ProcessHandler.WhoIsLocking(file);
foreach (Process p in lstProcs)
{
if (p.MachineName == ".")
ProcessHandler.localProcessKill(p.ProcessName);
else
ProcessHandler.remoteProcessKill(p.MachineName, txtUserName, txtPassword, p.ProcessName);
}
Note that file must be the UNC path: \\computer\share\yourdoc.docx in order for the Process to figure out what computer it's on and p.MachineName to be valid.
Below is the class these functions use, which requires adding a reference to System.Management. The code was originally written by Eric J.:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using System.Diagnostics;
using System.Management;
namespace MyProject
{
public static class ProcessHandler
{
[StructLayout(LayoutKind.Sequential)]
struct RM_UNIQUE_PROCESS
{
public int dwProcessId;
public System.Runtime.InteropServices.ComTypes.FILETIME ProcessStartTime;
}
const int RmRebootReasonNone = 0;
const int CCH_RM_MAX_APP_NAME = 255;
const int CCH_RM_MAX_SVC_NAME = 63;
enum RM_APP_TYPE
{
RmUnknownApp = 0,
RmMainWindow = 1,
RmOtherWindow = 2,
RmService = 3,
RmExplorer = 4,
RmConsole = 5,
RmCritical = 1000
}
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
struct RM_PROCESS_INFO
{
public RM_UNIQUE_PROCESS Process;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_APP_NAME + 1)]
public string strAppName;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_SVC_NAME + 1)]
public string strServiceShortName;
public RM_APP_TYPE ApplicationType;
public uint AppStatus;
public uint TSSessionId;
[MarshalAs(UnmanagedType.Bool)]
public bool bRestartable;
}
[DllImport("rstrtmgr.dll", CharSet = CharSet.Unicode)]
static extern int RmRegisterResources(uint pSessionHandle,
UInt32 nFiles,
string[] rgsFilenames,
UInt32 nApplications,
[In] RM_UNIQUE_PROCESS[] rgApplications,
UInt32 nServices,
string[] rgsServiceNames);
[DllImport("rstrtmgr.dll", CharSet = CharSet.Auto)]
static extern int RmStartSession(out uint pSessionHandle, int dwSessionFlags, string strSessionKey);
[DllImport("rstrtmgr.dll")]
static extern int RmEndSession(uint pSessionHandle);
[DllImport("rstrtmgr.dll")]
static extern int RmGetList(uint dwSessionHandle,
out uint pnProcInfoNeeded,
ref uint pnProcInfo,
[In, Out] RM_PROCESS_INFO[] rgAffectedApps,
ref uint lpdwRebootReasons);
/// <summary>
/// Find out what process(es) have a lock on the specified file.
/// </summary>
/// <param name="path">Path of the file.</param>
/// <returns>Processes locking the file</returns>
/// <remarks>See also:
/// http://msdn.microsoft.com/en-us/library/windows/desktop/aa373661(v=vs.85).aspx
/// http://wyupdate.googlecode.com/svn-history/r401/trunk/frmFilesInUse.cs (no copyright in code at time of viewing)
///
/// </remarks>
static public List<Process> WhoIsLocking(string path)
{
uint handle;
string key = Guid.NewGuid().ToString();
List<Process> processes = new List<Process>();
int res = RmStartSession(out handle, 0, key);
if (res != 0) throw new Exception("Could not begin restart session. Unable to determine file locker.");
try
{
const int ERROR_MORE_DATA = 234;
uint pnProcInfoNeeded = 0,
pnProcInfo = 0,
lpdwRebootReasons = RmRebootReasonNone;
string[] resources = new string[] { path }; // Just checking on one resource.
res = RmRegisterResources(handle, (uint)resources.Length, resources, 0, null, 0, null);
if (res != 0) throw new Exception("Could not register resource.");
//Note: there's a race condition here -- the first call to RmGetList() returns
// the total number of processes. However, when we call RmGetList() again to get
// the actual processes this number may have increased.
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, null, ref lpdwRebootReasons);
if (res == ERROR_MORE_DATA)
{
// Create an array to store the process results
RM_PROCESS_INFO[] processInfo = new RM_PROCESS_INFO[pnProcInfoNeeded];
pnProcInfo = pnProcInfoNeeded;
// Get the list
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, processInfo, ref lpdwRebootReasons);
if (res == 0)
{
processes = new List<Process>((int)pnProcInfo);
// Enumerate all of the results and add them to the
// list to be returned
for (int i = 0; i < pnProcInfo; i++)
{
try
{
processes.Add(Process.GetProcessById(processInfo[i].Process.dwProcessId));
}
// catch the error -- in case the process is no longer running
catch (ArgumentException) { }
}
}
else throw new Exception("Could not list processes locking resource.");
}
else if (res != 0) throw new Exception("Could not list processes locking resource. Failed to get size of result.");
}
finally
{
RmEndSession(handle);
}
return processes;
}
public static void remoteProcessKill(string computerName, string userName, string pword, string processName)
{
var connectoptions = new ConnectionOptions();
connectoptions.Username = userName;
connectoptions.Password = pword;
ManagementScope scope = new ManagementScope(@"\\" + computerName + @"\root\cimv2", connectoptions);
// WMI query
var query = new SelectQuery("select * from Win32_process where name = '" + processName + "'");
using (var searcher = new ManagementObjectSearcher(scope, query))
{
foreach (ManagementObject process in searcher.Get())
{
process.InvokeMethod("Terminate", null);
process.Dispose();
}
}
}
public static void localProcessKill(string processName)
{
foreach (Process p in Process.GetProcessesByName(processName))
{
p.Kill();
}
}
[DllImport("kernel32.dll")]
public static extern bool MoveFileEx(string lpExistingFileName, string lpNewFileName, int dwFlags);
public const int MOVEFILE_DELAY_UNTIL_REBOOT = 0x4;
}
}
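One more fallback this class enables, though the text above doesn't show it: the MoveFileEx declaration at the bottom, called with a null destination and MOVEFILE_DELAY_UNTIL_REBOOT, schedules a locked file for deletion at the next reboot. A usage sketch (admin rights required):
// Schedule the locked file for deletion at the next reboot:
bool scheduled = ProcessHandler.MoveFileEx(file, null, ProcessHandler.MOVEFILE_DELAY_UNTIL_REBOOT);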
I had this problem, and it was solved by the code below:
var _path = MyFile.FileName;
using (var stream = new FileStream
(_path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
// Your Code! ;
}
I had a very specific situation where I was getting an "IOException: The process cannot access the file 'file path'" on the line
File.Delete(fileName);
Inside an NUnit test that looked like:
Assert.Throws<IOException>(() =>
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
});
File.Delete(fileName);
It turns out NUnit 3 uses something they call an "isolated context" for exception assertions, which probably runs on a separate thread.
My fix was to put the File.Delete in the same context.
Assert.Throws<IOException>(() =>
{
try
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
}
catch
{
File.Delete(fileName);
throw;
}
});
I had the following scenario that was causing the same error:
Upload files to the server
Then get rid of the old files after they have been uploaded
Most files were small in size; however, a few were large, and attempting to delete those resulted in the "cannot access the file" error.
It was not easy to find, but the solution was as simple as waiting for the task to complete execution:
using (var wc = new WebClient())
{
var tskResult = wc.UploadFileTaskAsync(_address, _fileName);
tskResult.Wait();
}
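On newer framework versions the same fix reads more naturally with await, which also avoids blocking the calling thread (a sketch, assuming the caller is an async method):
using (var wc = new WebClient())
{
    await wc.UploadFileTaskAsync(_address, _fileName);
}
File.Delete(_fileName); // safe now: the upload has completed and released the file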
In my case this problem was solved by opening the file for shared writing/reading. Following are sample codes for shared reading and writing:
Stream Writer
using(FileStream fs = new FileStream("D:\\test.txt",
FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
using (StreamWriter sw = new StreamWriter(fs))
{
sw.WriteLine("any thing which you want to write");
}
Stream Reader
using (FileStream fs = new FileStream("D:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (StreamReader rr=new StreamReader(fs))
{
rr.ReadLine();
}
The code below solved this issue for me, but I suggest:
First of all, you need to understand what is causing the issue, and try the solution you can find by changing your code.
I can give another way to work around it, but the better solution is to check your coding structure and analyse what makes this happen. If you do not find any solution, then you can fall back to the code below (note that the label has to sit before the try block, because C# does not allow a goto to jump into one):
Start:
try
{
    // Put your file access code here
}
catch (IOException ex)
{
    // you need to handle this error with the code below
    if (ex.Message.StartsWith("The process cannot access the file"))
    {
        // Wait 5 seconds for the file to be freed, then start execution again
        Thread.Sleep(5000);
        goto Start;
    }
    throw; // not the "file in use" error: let it propagate
}

How to read back status from Zebra receipt printer?

I'm using a Zebra KR403 receipt printer for a project and I need to programmatically read the status from the printer (out of paper, paper near-end, printhead open, paper jam, etc.). In the ZPL documentation I found that I need to send a ~HQES command, and the printer responds with its status information.
In the project the printer is connected via USB, but I figured it might be easier to get it working connected via COM port first and work from there toward getting it to work over USB. I am able to open communication with the printer and send commands to it (I can print test receipts), but whenever I try to read anything back it simply hangs forever and never reads anything.
Here's the code I'm using:
public Form1()
{
InitializeComponent();
SendToPrinter("COM1:", "^XA^FO50,10^A0N50,50^FDKR403 PRINT TEST^FS^XZ", false); // this prints OK
SendToPrinter("COM1:", "~HQES", true); // read is never completed
}
[DllImport("kernel32.dll", SetLastError = true)]
static extern SafeFileHandle CreateFile(
string lpFileName,
FileAccess dwDesiredAccess,
uint dwShareMode,
IntPtr lpSecurityAttributes,
FileMode dwCreationDisposition,
uint dwFlagsAndAttributes,
IntPtr hTemplateFile);
private int SendToPrinter(string port, string command, bool readFromPrinter)
{
int read = -2;
// Create a buffer with the command
byte[] buffer = System.Text.Encoding.ASCII.GetBytes(command);
// Use the CreateFile external func to connect to the printer port
using (SafeFileHandle printer = CreateFile(port, FileAccess.ReadWrite, 0, IntPtr.Zero, FileMode.Open, 0, IntPtr.Zero))
{
if (!printer.IsInvalid)
{
using (FileStream stream = new FileStream(printer, FileAccess.ReadWrite))
{
stream.Write(buffer, 0, buffer.Length);
// tries to read only one byte (for testing purposes; in reality many bytes will be read with the complete message)
if (readFromPrinter)
{
read = stream.ReadByte(); // THE PROGRAM ALWAYS HANGS HERE!!!!!!
}
stream.Close();
}
}
}
return read;
}
I've found that when I print the test receipt (the first call to SendToPrinter()) nothing gets printed until I close the handle with stream.Close(). I've tried these things, to no avail:
calling stream.Flush() after stream.Write(), but still nothing gets read (and nothing gets printed either until I call stream.Close())
only sending the command, then closing the stream, immediately reopening and trying to read
opening two handles, writing on handle 1, closing handle 1, reading on handle 2: nothing
Has anyone had any luck reading back status from a Zebra printer? Or does anyone have any idea of what I may be doing wrong?
As pointed out by @l33tmike in the comments, the answer to this question has already been posted on another question: Which SDK should I use for KR403 Zebra Printer
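For what it's worth, while on a COM port, System.IO.Ports.SerialPort with a read timeout is an alternative to CreateFile/FileStream that at least cannot hang forever. This is only a sketch: the baud rate, the timeout, and the assumption that the status reply is ETX-terminated all need checking against the printer's configuration and the ZPL docs.
using System.IO.Ports;

using (var port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One))
{
    port.ReadTimeout = 2000; // ms; reads throw TimeoutException instead of blocking forever
    port.Open();
    port.Write("~HQES");
    try
    {
        string status = port.ReadTo("\x03"); // assumes the reply ends with ETX (0x03)
        Console.WriteLine(status);
    }
    catch (TimeoutException)
    {
        // no response from the printer within 2 seconds
    }
}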

Unmanaged file access (WriteFile) slower than managed (FileStream)

I have an application that heavily reads and writes files (a custom format), and I was told to improve performance by using direct unmanaged code.
Before attempting it in the real application I made a small test just to see what the performance gains would be, but to my surprise the unmanaged version seems to be about 8x slower than simply using FileStream.
Here is the managed function:
private int length = 100000;
private TimeSpan tspan;
private void UsingManagedFileHandle()
{
DateTime initialTime = DateTime.Now;
using (FileStream fileStream = new FileStream("data2.txt", FileMode.Create, FileAccess.ReadWrite))
{
string line = "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890123";
byte[] bytes = Encoding.Unicode.GetBytes(line);
for (int i = 0; i < length; i++)
{
fileStream.Write(bytes, 0, bytes.Length);
}
fileStream.Close();
}
this.tspan = DateTime.Now.Subtract(initialTime);
label2.Text = "" + this.tspan.TotalMilliseconds + " Milliseconds";
}
Here is the unmanaged way:
public void UsingAnUnmanagedFileHandle()
{
DateTime initialTime;
IntPtr hFile;
hFile = IntPtr.Zero;
hFile = FileInteropFunctions.CreateFile("data1.txt",
FileInteropFunctions.GENERIC_WRITE | FileInteropFunctions.GENERIC_READ,
FileInteropFunctions.FILE_SHARE_WRITE,
IntPtr.Zero,
FileInteropFunctions.CREATE_ALWAYS,
FileInteropFunctions.FILE_ATTRIBUTE_NORMAL,
0);
uint lpNumberOfBytesWritten = 0;
initialTime = DateTime.Now;
if (hFile.ToInt64() > 0)
{
string line = "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890123";
byte[] bytes = Encoding.Unicode.GetBytes(line);
uint bytesLen = (uint)bytes.Length;
for (int i = 0; i < length; i++)
{
FileInteropFunctions.WriteFile(hFile,
bytes,
bytesLen,
out lpNumberOfBytesWritten,
IntPtr.Zero);
}
FileInteropFunctions.CloseHandle(hFile);
this.tspan = DateTime.Now.Subtract(initialTime);
label1.Text = "" + this.tspan.TotalMilliseconds + " Milliseconds";
}
else
label1.Text = "Error";
}
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool CloseHandle(IntPtr hObject);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern unsafe IntPtr CreateFile(
String lpFileName, // Filename
uint dwDesiredAccess, // Access mode
uint dwShareMode, // Share mode
IntPtr attr, // Security Descriptor
uint dwCreationDisposition, // How to create
uint dwFlagsAndAttributes, // File attributes
uint hTemplateFile); // Handle to template file
[DllImport("kernel32.dll")]
public static extern unsafe int WriteFile(IntPtr hFile,
// byte[] lpBuffer,
[MarshalAs(UnmanagedType.LPArray)] byte[] lpBuffer, // also tried this.
uint nNumberOfBytesToWrite,
out uint lpNumberOfBytesWritten,
IntPtr lpOverlapped);
The iteration using FileStream takes about 70 ms in my computer.
The one using WriteFile takes about 550ms.
I tested several times and with several amount of iterations and the difference in performance is consistent.
I have no idea why the unmanaged code is slower than the managed code.
EDIT
Thank you very much for your explanations, guys. I thought there was something "magical" going on underneath FileStream, and you have explained it well.
So I now know there's no easy path to gaining performance in this part, and I would like to ask your opinion on other simple ways to gain speed. The file is accessed randomly in the real application, and its size could range from 1MB to 1GB.
Your unmanaged calls write the data to disk as soon as possible, while FileStream is buffered (i.e., it does most operations in memory and calls the underlying unmanaged functions much less often).
There are constructors on FileStream that let you control the buffer size if you want to tweak performance further.
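For example, raising the buffer from the 4 KB default to 64 KB makes the write loop hit the OS roughly sixteen times less often (the exact size to use is an assumption to be tuned):
using (var fileStream = new FileStream("data2.txt", FileMode.Create, FileAccess.ReadWrite,
                                       FileShare.None, 64 * 1024))
{
    // same write loop as before; writes accumulate in the 64 KB buffer
    // and reach the underlying WriteFile() in large chunks
}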
Well, FileStream is just a wrapper around CreateFile/WriteFile. It's written by a bunch of smart guys, so I see no logical explanation at all for why you assume yours would be faster :P.
As already stated, FileStream probably does extra buffering before calling WriteFile(), thus minimizing unmanaged method calls. And this is important: only make unmanaged calls when they are necessary; they cost. Buffer sizes are usually a multiple of the disk sector size. You can experiment with different sizes, though this is OS-dependent and will most likely yield different results on other computers.
But it's also important to know that WriteFile() does internal buffering too. It's not like you call WriteFile() and bam, it's written to the file; it will be flushed to the HDD when it's time.
I think there is unnecessary byte[] marshaling going on, e.g. when you call WriteFile() the system makes a copy of your buffer. It should be avoidable with the unsafe keyword and a little bit of hacking.
There is also FILE_FLAG_SEQUENTIAL_SCAN, which lets the system know that you're going to read/write the file only sequentially; in fact it can be reached from FileStream, via FileOptions.SequentialScan. This might give some performance boost, theoretically.
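A sketch combining the two tweaks from the last paragraphs, the larger buffer and the sequential-scan hint:
using (var fileStream = new FileStream("data2.txt", FileMode.Create, FileAccess.Write,
                                       FileShare.None, 64 * 1024, FileOptions.SequentialScan))
{
    // FileOptions.SequentialScan maps to the Win32 FILE_FLAG_SEQUENTIAL_SCAN flag
}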
The difference is because the calls to WriteFile are synchronous while the writes to the FileStream are not.
By default CreateFile will create a synchronous file handle, so the calls to WriteFile do not return until the data is written. If you add FILE_FLAG_OVERLAPPED to the CreateFile call the un-managed implementation will take approximately the same time as the managed.
See the Synchronous and Asynchronous I/O Handles section of the CreateFile documentation.

invoking WinApi when using long path

Can anyone help me, please?
I tried to P/Invoke a WINAPI method from managed .NET code.
The CreateFile() method always fails. If I make the given path shorter than 256 characters it works fine, but not if it's longer. I might be doing something wrong.
According to this link, I should be able to use a long path that is greater than 256 characters in length.
Below is the code that I tried:
static void Main(string[] args)
{
string path = @"c:\tttttttttttaaaaaaaaaaaaaaatttttttttttttttaaaaaaaaaaaaaaatttttttttttttttttttttttttttttttaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaattttttttttttttttttaaaaaaaaaaaaaaaaatttttttttttaaaaaaaaaaatttttttaaaaaaaaaaaaaaaaattttttttttttttttttaaaaaaaaaaaaaaaaattttttttttttttaaaaaaaaaaaaaaaaatttttt";
LongPath.TestCreateAndWrite(path);
}
// This code snippet is provided under the Microsoft Permissive License.
[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
internal static extern SafeFileHandle CreateFile(
string lpFileName,
EFileAccess dwDesiredAccess,
EFileShare dwShareMode,
IntPtr lpSecurityAttributes,
ECreationDisposition dwCreationDisposition,
EFileAttributes dwFlagsAndAttributes,
IntPtr hTemplateFile);
public static void TestCreateAndWrite(string fileName) {
string formattedName = @"\\?\" + fileName;
//string formattedName = @"\\?\UNC" + fileName;
// Create a file with generic write access
SafeFileHandle fileHandle = CreateFile(formattedName, EFileAccess.GenericWrite,
EFileShare.None, IntPtr.Zero, ECreationDisposition.CreateAlways, 0, IntPtr.Zero);
// Check for errors
int lastWin32Error = Marshal.GetLastWin32Error();
if (fileHandle.IsInvalid) {
throw new System.ComponentModel.Win32Exception(lastWin32Error);
}
// Pass the file handle to FileStream. FileStream will close the
// handle
using (FileStream fs = new FileStream(fileHandle,
FileAccess.Write)) {
fs.WriteByte(80);
fs.WriteByte(81);
fs.WriteByte(83);
fs.WriteByte(84);
}
}
This method throws error code 3, which is ERROR_PATH_NOT_FOUND according to System Error Codes (0-499) (Windows).
Any help would be highly appreciated.
While the \\?\ notation allows you to use paths whose total length is longer than MAX_PATH, you still have to respect the per-component limit reported by GetVolumeInformation. For NTFS the per-component limit is 255, which means you are not allowed to go more than 255 characters without a backslash. In the code above, the file name is a single component of well over 255 characters, which is why the call fails even with the \\?\ prefix.
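A quick pre-flight check along those lines (a sketch; HasOversizedComponent is a hypothetical helper, and 255 is the NTFS limit mentioned above):
using System.Linq;

static bool HasOversizedComponent(string path) =>
    path.TrimStart('\\').Split('\\').Any(component => component.Length > 255);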

Create 2 FileStream on the same file in the same process

I'm trying to create a temporary file that will be automatically deleted.
stream = new FileStream(
tmpFilePath,
FileMode.OpenOrCreate,
FileAccess.ReadWrite,
FileShare.ReadWrite,
4096,
FileOptions.DeleteOnClose|FileOptions.RandomAccess
);
This file will be used by a 3rd party API which will also create a FileStream:
stream = new FileStream(
tmpFilePath,
FileMode.Open,
FileAccess.Read,
FileShare.Read);
I think I've tried every possible combination of flags, but I always get a "The process cannot access the file 'XXX' because it is being used by another process..." error.
Am I doing something wrong? Is there a way around?
According to the documentation, yes.
http://msdn.microsoft.com/en-us/library/system.io.fileshare.aspx
Excerpt:
Read: Allows subsequent opening of the file for reading. If this flag is not specified, any request to open the file for reading (by this process or another process) will fail until the file is closed. However, even if this flag is specified, additional permissions might still be needed to access the file.
I have exactly the same use case and encountered the same problem. Using (FileShare.ReadWrite | FileShare.Delete) for both streams works.
In my experience, a FileStream opened with FileOptions.DeleteOnClose cannot be opened by passing the file path to another FileStream regardless of the FileShare value.
When you own all the code (clearly not your case, sorry) DuplicateHandle can be used to open a DeleteOnClose file multiple times, even from different processes.
Here's some example code for .NET 4.5.1.
using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;
using System.Text;
using System.Windows.Forms;
using Microsoft.Win32.SafeHandles;
namespace Example
{
public static class DuplicatedHandleExample
{
[DllImport("kernel32.dll")]
private static extern bool DuplicateHandle(
SafeFileHandle hSourceProcessHandle,
IntPtr hSourceHandle,
SafeFileHandle hTargetProcessHandle,
out SafeFileHandle lpTargetHandle,
UInt32 dwDesiredAccess,
bool bInheritHandle,
UInt32 dwOptions);
[DllImport("kernel32.dll")]
private static extern SafeFileHandle OpenProcess(
UInt32 dwDesiredAccess,
bool bInheritHandle,
int dwProcessId);
private const UInt32 PROCESS_DUP_HANDLE = 0x0040;
private const UInt32 DUPLICATE_SAME_ACCESS = 0x0002;
public static void CreateFileInProcessA()
{
try
{
// open new temp file with FileOptions.DeleteOnClose
string tempFilePath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("D"));
using (FileStream fs = new FileStream(tempFilePath, FileMode.CreateNew,
FileAccess.ReadWrite, FileShare.Read | FileShare.Write | FileShare.Delete,
4096, FileOptions.DeleteOnClose))
{
// put a message in the temp file
fs.Write(new[] { (byte)'h', (byte)'i', (byte)'!' }, 0, 3);
fs.Flush();
// put our process ID and file handle on clipboard
string data = string.Join(",",
Process.GetCurrentProcess().Id.ToString(),
fs.SafeFileHandle.DangerousGetHandle().ToString());
Clipboard.SetData(DataFormats.UnicodeText, data);
// show messagebox (while holding file open!) and wait for user to click OK
MessageBox.Show("Temp File opened. Process ID and File Handle copied to clipboard. Click OK to close temp file.");
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
public static void OpenFileInProcessB()
{
try
{
// get process ID and file handle from clipboard
string data = (string)Clipboard.GetData(DataFormats.UnicodeText);
string[] dataParts = data.Split(',');
int sourceProcessId = int.Parse(dataParts[0]);
IntPtr sourceFileHandle = new IntPtr(Int64.Parse(dataParts[1]));
// get handle to target process
using (SafeFileHandle sourceProcessHandle =
OpenProcess(PROCESS_DUP_HANDLE, false, sourceProcessId))
{
// get handle to our process
using (SafeFileHandle destinationProcessHandle =
OpenProcess(PROCESS_DUP_HANDLE, false, Process.GetCurrentProcess().Id))
{
// duplicate handle into our process
SafeFileHandle destinationFileHandle;
DuplicateHandle(sourceProcessHandle, sourceFileHandle,
destinationProcessHandle, out destinationFileHandle,
0, false, DUPLICATE_SAME_ACCESS);
// get a FileStream wrapper around it
using (FileStream fs = new FileStream(destinationFileHandle, FileAccess.ReadWrite, 4096))
{
// read file contents
fs.Position = 0;
byte[] buffer = new byte[100];
int numBytes = fs.Read(buffer, 0, 100);
string message = Encoding.ASCII.GetString(buffer, 0, numBytes);
// show messagebox (while holding file open!) and wait for user to click OK
MessageBox.Show("Found this message in file: " + message + Environment.NewLine +
"Click OK to close temp file");
}
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
}
}
It sounds as if you may want to use a memory mapped file as a method to share a single file with multiple processes.
http://msdn.microsoft.com/en-us/library/system.io.memorymappedfiles.memorymappedfile.aspx
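A minimal sketch of that idea, assuming both processes agree on the map name "MySharedTempData":
using System.IO.MemoryMappedFiles;

// Process A: create a 4 KB map backed by the system paging file
using (var mmf = MemoryMappedFile.CreateNew("MySharedTempData", 4096))
using (var accessor = mmf.CreateViewAccessor())
{
    accessor.Write(0, (byte)42);
    // keep mmf open while process B reads...
}

// Process B: attach to the existing map by name
using (var mmf = MemoryMappedFile.OpenExisting("MySharedTempData"))
using (var accessor = mmf.CreateViewAccessor())
{
    byte value = accessor.ReadByte(0);
}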
The problem is that you still have the first stream you created open. You need to create the file, then release it (close the stream), then have the 3rd party API do its work, then delete the file. Wrapping all this up in a class that is IDisposable might be a nice solution: create and release the file in the constructor, wrap the 3rd-party work in a method, and delete the file in the Dispose method.
You can pass the existing stream to the 3rd-party API, or, if the 3rd-party API only needs read access, pass a StreamReader instance:
using (var stream = new FileStream("trace.txt", FileMode.OpenOrCreate,FileAccess.ReadWrite))
{
using (var anotherStream = new StreamReader(stream))
{
//magic here
}
}
This sequence of calls will only work if the third party API uses FileShare.ReadWrite, or your open uses FileAccess.Read.
You are opening it read/write, while allowing others to also open it read/write.
The third-party code is trying to open it read-only, while allowing others to also have it open, but only as read-only. Since you still have it open read-write, this fails.
Assuming that you can't change the third-party code, you will need to adopt the following pattern instead:
Open the file as you currently are, but without the DeleteOnClose flag.
Write any content that you need the other code to read.
Close the file.
Optionally reopen it with FileAccess.Read (and possibly DeleteOnClose).
Call the third party code.
Do any other reading (but not writing) that you want.
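Putting those steps together (a sketch; ThirdParty.Process is a hypothetical stand-in for the third-party call, which opens the file with FileAccess.Read/FileShare.Read as shown earlier):
string path = Path.GetTempFileName();

// Steps 1-3: create without DeleteOnClose, write the content, close
using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
    // write whatever the third-party code needs to read
}

// Steps 4-6: reopen read-only with read sharing, run the third-party code, read, clean up
using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    ThirdParty.Process(path); // hypothetical third-party call
    // any additional reading via 'stream' goes here
}
File.Delete(path);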
