I am using the code below to save a posted file to the server, but that file is being read continually, so I need to open it with FileShare.ReadWrite to avoid a file-locked error.
httpRequest.Files[0].SaveAs(filePath);
Below is my reading method. How can I accomplish the same thing with the HttpPostedFile the right way, and with the best performance?
using (var fileStream = new FileStream(
fileLocation,
FileMode.Open,
FileAccess.Read,
FileShare.ReadWrite))
{
using (var streamReader = new StreamReader(fileStream))
{
xDocument = XDocument.Parse(streamReader.ReadToEnd());
}
}
Is this my best option?
using (var memoryStream = new MemoryStream())
{
httpRequest.Files[0].InputStream.CopyTo(memoryStream);
var bytes = memoryStream.ToArray();
using (var fs = File.Open(filePath, FileMode.OpenOrCreate, FileAccess.Write, FileShare.ReadWrite))
{
fs.Write(bytes, 0, bytes.Length);
}
}
Problem:
You want a "Write-Once, Read-Many" lock.
Assumptions:
The file is small (the average write operation takes 5000 ms).
There are no other write or read operations (only one program with two functions).
You read the file far more often than you write to it.
Solution
using System;
using System.IO;
using System.Threading;
using System.Web.Mvc;
namespace stackoverflow_56307594.Controllers
{
public class HomeController : Controller
{
public ActionResult A()
{
readFile();
return View();
}
public ActionResult B()
{
writeFile();
return View();
}
private static object writeLock = new Object();
private void readFile()
{
    while (!Monitor.TryEnter(writeLock, 5000)) ; // wait up to 5000 ms for the writeLock (serializing access)
    try
    {
        using (var stream = new FileStream("filePath", FileMode.Open, FileAccess.Read, FileShare.Read))
        using (var reader = new StreamReader(stream))
        {
            // active read
            // xDocument = XDocument.Parse(reader.ReadToEnd());
        }
    }
    finally
    {
        Monitor.Exit(writeLock); // release the lock so the writer is not blocked forever
    }
}
private void writeFile()
{
    lock (writeLock)
    {
        FileStream stream = null;
        while (stream == null) // wait for the active read to finish
        {
            try
            {
                stream = new FileStream("filePath", FileMode.Open, FileAccess.ReadWrite, FileShare.None);
            }
            catch (IOException)
            {
                // the open fails while a read is active because we request FileShare.None;
                // the while (stream == null) loop simply retries
                Thread.Sleep(50); // brief pause to avoid a hot spin
            }
        }
        using (stream)
        {
            stream.SetLength(0); // truncate any leftover content from a longer, older upload
            Request.Files[0].InputStream.CopyTo(stream);
        }
    } // unlock
}
}
}
Note:
I did not load-test this, or even run it on a web server.
I have only tested it on paper 😁
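For comparison, .NET's built-in ReaderWriterLockSlim implements exactly this many-readers/one-writer policy, so the hand-rolled Monitor loops can be avoided. A sketch (untested, assuming the same HomeController and usings as above):
private static readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
private void readFileRw()
{
    rwLock.EnterReadLock(); // multiple readers may hold this concurrently
    try
    {
        using (var stream = new FileStream("filePath", FileMode.Open, FileAccess.Read, FileShare.Read))
        using (var reader = new StreamReader(stream))
        {
            // xDocument = XDocument.Parse(reader.ReadToEnd());
        }
    }
    finally { rwLock.ExitReadLock(); }
}
private void writeFileRw()
{
    rwLock.EnterWriteLock(); // waits until all readers have exited
    try
    {
        using (var stream = new FileStream("filePath", FileMode.Create, FileAccess.Write, FileShare.None))
        {
            Request.Files[0].InputStream.CopyTo(stream);
        }
    }
    finally { rwLock.ExitWriteLock(); }
}
Because the lock already serializes writers against readers (within a single process), the IOException retry loop is no longer needed.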
Refs:
locking - How long will a C# lock wait, and what if the code crashes during the lock? - Stack Overflow
c# - Deleting files in use - Stack Overflow
multithreading - Is there a way to detect if an object is locked? - Stack Overflow
Implementing Singleton in C# | Microsoft Docs
c# - Using the same lock for multiple methods - Stack Overflow
c# - Write-Once, Read-Many Lock - Stack Overflow
FileShare Enum (System.IO) | Microsoft Docs
Related
I have to write 4 GB short[] arrays to disk and read them back, so I have found a function to write the arrays, and I am struggling to write the code to read the array from the disk. I normally code in other languages, so please forgive me if my attempt is a bit pathetic so far:
using UnityEngine;
using System.Collections;
using System.IO;
public class RWShort : MonoBehaviour {
public static void WriteShortArray(short[] values, string path)
{
using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
{
using (BinaryWriter bw = new BinaryWriter(fs))
{
foreach (short value in values)
{
bw.Write(value);
}
}
}
} //Above is fine, here is where I am confused:
public static short[] ReadShortArray(string path)
{
byte[] thisByteArray= File.ReadAllBytes(fileName);
short[] thisShortArray= new short[thisByteArray.length/2];
for (int i = 0; i < 10; i+=2)
{
thisShortArray[i]= ? convert from byte array;
}
return thisShortArray;
}
}
Shorts are two bytes, so you have to read two bytes at a time. I'd also recommend using a yield return like this so that you aren't trying to pull everything into memory in one go. Though if you need all of the shorts together, that won't help you; it depends on what you're doing with them, I guess.
void Main()
{
short[] values = new short[] {
1, 999, 200, short.MinValue, short.MaxValue
};
WriteShortArray(values, @"C:\temp\shorts.txt");
foreach (var shortInfile in ReadShortArray(@"C:\temp\shorts.txt"))
{
Console.WriteLine(shortInfile);
}
}
public static void WriteShortArray(short[] values, string path)
{
using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
{
using (BinaryWriter bw = new BinaryWriter(fs))
{
foreach (short value in values)
{
bw.Write(value);
}
}
}
}
public static IEnumerable<short> ReadShortArray(string path)
{
using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
using (BinaryReader br = new BinaryReader(fs))
{
byte[] buffer = new byte[2];
while (br.Read(buffer, 0, 2) > 0)
yield return (short)(buffer[0]|(buffer[1]<<8));
}
}
You could also define it this way, taking advantage of the BinaryReader:
public static IEnumerable<short> ReadShortArray(string path)
{
using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
using (BinaryReader br = new BinaryReader(fs))
{
while (br.BaseStream.Position < br.BaseStream.Length)
yield return br.ReadInt16();
}
}
Memory-mapping the file is your friend. There's a MemoryMappedViewAccessor.ReadInt16 function that lets you read the data directly, with type short, out of the OS disk cache, and a Write() overload that accepts an Int16. There are also ReadArray and WriteArray functions if you are calling code that needs a traditional .NET array.
Overview of using Memory-mapped files in .NET on MSDN
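A minimal sketch of the ReadArray route (untested; the path and element count are illustrative):
using System.IO;
using System.IO.MemoryMappedFiles;
public static short[] ReadShortsMmf(string path, int count)
{
    using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
    using (var accessor = mmf.CreateViewAccessor(0, (long)count * sizeof(short), MemoryMappedFileAccess.Read))
    {
        var values = new short[count];
        accessor.ReadArray(0, values, 0, count); // bulk copy of Int16 values out of the page cache
        return values;
    }
}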
If you want to do it with ordinary file I/O, use a block size of 1 or 2 megabytes and the Buffer.BlockCopy function to move data en masse between byte[] and short[], and use the FileStream functions that accept a byte[]. Forget about BinaryWriter or BinaryReader, forget about doing 2 bytes at a time.
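For instance, a rough sketch of that block-copy write (untested, assuming a 2 MB buffer; Buffer.BlockCopy counts offsets in bytes, so the plain int offsets here cap this sketch at 2 GB of data):
using System;
using System.IO;
public static void WriteShortsBlocked(short[] values, string path)
{
    const int blockBytes = 2 * 1024 * 1024; // 2 MB per chunk
    var buffer = new byte[blockBytes];
    using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
    {
        int done = 0; // number of shorts written so far
        while (done < values.Length)
        {
            int chunk = Math.Min(values.Length - done, blockBytes / sizeof(short));
            // BlockCopy moves raw bytes between the short[] and the byte[]
            Buffer.BlockCopy(values, done * sizeof(short), buffer, 0, chunk * sizeof(short));
            fs.Write(buffer, 0, chunk * sizeof(short));
            done += chunk;
        }
    }
}
Reading back is symmetric: fill the byte[] with FileStream.Read and Buffer.BlockCopy it into the short[].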
It's also possible to do the I/O directly into a .NET array with the help of p/invoke; see my answer using ReadFile and passing the FileStream object's SafeFileHandle property here. But even though this has no extra copies, it still shouldn't keep up with the memory-mapped ReadArray and WriteArray calls.
I have this code:
using System.IO;
using System.IO.Compression;
...
UnGzip2File("input.gz","output.xls");
which runs the procedure below. It runs without error, but afterwards input.gz is empty and the created output.xls is also empty. Before the run, input.gz was 12 MB. What am I doing wrong? Or do you have a better, working solution?
public static void UnGzip2File(string inputPath, string outputPath)
{
FileStream inputFileStream = new FileStream(inputPath, FileMode.Create);
FileStream outputFileStream = new FileStream(outputPath, FileMode.Create);
using (GZipStream gzipStream = new GZipStream(inputFileStream, CompressionMode.Decompress))
{
byte[] bytes = new byte[4096];
int n;
// To be sure the whole file is correctly read,
// you should call FileStream.Read method in a loop,
// even if in the most cases the whole file is read in a single call of FileStream.Read method.
while ((n = gzipStream.Read(bytes, 0, bytes.Length)) != 0)
{
outputFileStream.Write(bytes, 0, n);
}
}
outputFileStream.Dispose();
inputFileStream.Dispose();
}
Opening the input FileStream with FileMode.Create will overwrite (truncate) the existing file, as documented here. This causes the input file to be empty when you try to decompress it, which in turn leads to an empty output file.
Below is a working code sample. Note that it is async; this can be changed by leaving out async/await, calling the regular CopyTo method instead, and changing the return type to void.
public static async Task DecompressGZip(string inputPath, string outputPath)
{
using (var input = File.OpenRead(inputPath))
using (var output = File.OpenWrite(outputPath))
using (var gz = new GZipStream(input, CompressionMode.Decompress))
{
await gz.CopyToAsync(output);
}
}
I have 2 applications, one is writing to a file, and the other one reads the file. It's a log file, so the writer will be logging until the program stops, while the reader could be invoked any time to get the content of the file.
I thought that when the writer opens the file with FileShare.Read, the reader would be able to access the file, but it produces an error saying that the file is being used by another process.
Writer Application:
FileStream fs = new FileStream("file.log", FileMode.Open, FileAccess.Write, FileShare.Read);
BinaryWriter writer = new BinaryWriter(fs);
Reader Application:
BinaryReader reader = new BinaryReader(File.OpenRead("file.log"));
How do I prevent this error?
Try specifying FileShare.ReadWrite when reading the file as well. Instead of using File.OpenRead directly (it requests FileShare.Read, which conflicts with the writer's already-open Write handle), use a FileStream with that share mode, as in the sketch below.
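A minimal sketch of such a reader (file name as in the question):
using (var fs = new FileStream("file.log", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var reader = new BinaryReader(fs))
{
    // FileShare.ReadWrite tells the OS we tolerate the writer's open handle
}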
Also, for logging, you can use log4net or another free logging framework, which manages writing to files efficiently so that you do not have to.
To read a locked file you are going to need to provide more flags for the FileStream.
Code such as below.
using (var reader = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Read,FileShare.ReadWrite))
{
using (var binary = new BinaryReader(reader))
{
//todo: add your code
binary.Close();
}
reader.Close();
}
This opens the file for reading only, with a share mode of read-write. It can be tested with a small app (using StreamReader/StreamWriter instead of binary):
static Thread writer,
reader;
static volatile bool abort = false; // volatile: the flag is written and read from different threads
static void Main(string[] args)
{
var fs = File.Create("D:\\test.txt");
fs.Dispose();
writer = new Thread(new ThreadStart(testWriteLoop));
reader = new Thread(new ThreadStart(testReadLoop));
writer.Start();
reader.Start();
Console.ReadKey();
abort = true;
}
static void testWriteLoop()
{
using (FileStream fs = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Write, FileShare.Read))
{
using (var writer = new StreamWriter(fs))
{
while (!abort)
{
writer.WriteLine(DateTime.Now.ToString());
writer.Flush();
Thread.Sleep(1000);
}
}
}
}
static void testReadLoop()
{
while (!abort)
{
Thread.Sleep(1000);
using (var reader = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
using (var stream = new StreamReader(reader))
{
Console.WriteLine(stream.ReadToEnd());
stream.Close();
}
reader.Close();
}
}
}
I realize the example above is pretty simple but the fact still remains that the "testWriteLoop" never releases the lock.
Hope this helps
While writing some code dealing with logs and files, I've discovered some baffling behaviour in Windows file I/O. Does anyone know why this test fails with the "cannot read file" message?
[TestMethod]
public void ShouldAllowReads()
{
using (var file = File.Open(_path, FileMode.Create, FileAccess.Write, FileShare.Read))
{
using (var file2 = File.Open(_path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
//works ok, doesn't throw
}
try
{
using (var file3 = File.Open(_path, FileMode.Open, FileAccess.Read, FileShare.Read))
{
//fails
}
}
catch (IOException)
{
Assert.Fail("cannot read file");
}
}
}
PS. _path = Path.GetTempFileName();
EDIT:
I'll mark elevener's answer as the correct one, but there is one thing that bothers me about this design: .NET methods such as File.ReadAllText(_path) throw exceptions here, which just shouldn't happen.
For example, with this snippet my test would also fail the assertion:
try
{
string text = File.ReadAllText(_path);
}
catch (IOException)
{
Assert.Fail("cannot read file");
}
You have file open with FileAccess.Write, and at the same time you are trying to open file3 with share mode FileShare.Read, which does not allow concurrent write access. A requested share mode has to be compatible with the access that every already-open handle holds; that is why file2, which asks for FileShare.ReadWrite, succeeds. File.ReadAllText fails for the same reason: internally it opens the file with FileShare.Read.
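A minimal sketch of the failing open, corrected (same names as in the test):
using (var file3 = File.Open(_path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // works: FileShare.ReadWrite admits the already-open handle that holds FileAccess.Write
}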
When I execute the code below, I get the common exception The process cannot access the file *filePath* because it is being used by another process.
What is the most efficient way to allow this thread to wait until it can safely access this file?
Assumptions:
the file has just been created by me, so it is unlikely that another app is accessing it.
more than one thread from my app might be trying to run this code to append text to the file.
using (var fs = File.Open(filePath, FileMode.Append)) //Exception here
{
using (var sw = new StreamWriter(fs))
{
sw.WriteLine(text);
}
}
So far, the best that I have come up with is the following. Are there any downsides to doing this?
private static void WriteToFile(string filePath, string text, int retries)
{
const int maxRetries = 10;
try
{
using (var fs = File.Open(filePath, FileMode.Append))
{
using (var sw = new StreamWriter(fs))
{
sw.WriteLine(text);
}
}
}
catch (IOException)
{
if (retries < maxRetries)
{
Thread.Sleep(1);
WriteToFile(filePath, text, retries + 1);
}
else
{
throw new Exception("Max retries reached.");
}
}
}
If you have multiple threads attempting to access the same file, consider using a locking mechanism. The simplest form could be:
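// Note: someSharedObject must be a single object shared by all writing threads, e.g.
// private static readonly object someSharedObject = new object();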
lock(someSharedObject)
{
using (var fs = File.Open(filePath, FileMode.Append)) //Exception here
{
using (var sw = new StreamWriter(fs))
{
sw.WriteLine(text);
}
}
}
As an alternative, consider File.AppendAllText, which opens, appends, and closes the file in a single call:
File.AppendAllText(filePath, text);
You can pass a FileShare value to allow shared access in the File.Open call, like this:
File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite)
But I think the cleanest way, if you have multiple threads trying to write into one file, would be to put all these messages into a Queue&lt;T&gt; and have one additional thread that writes all elements of the queue into the file; a sketch of that idea follows.
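For example, a hedged sketch using BlockingCollection&lt;string&gt; as the queue (the names QueuedLogWriter, Enqueue, and filePath are illustrative, not from the original post):
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;
public static class QueuedLogWriter
{
    static readonly BlockingCollection<string> messages = new BlockingCollection<string>();
    // Any number of threads may enqueue; no file handle is touched here.
    public static void Enqueue(string text) => messages.Add(text);
    // A single background consumer owns the file, so no sharing conflicts arise.
    public static Task Start(string filePath) => Task.Run(() =>
    {
        // GetConsumingEnumerable blocks until an item arrives or CompleteAdding is called.
        foreach (var line in messages.GetConsumingEnumerable())
            File.AppendAllText(filePath, line + Environment.NewLine);
    });
    public static void Shutdown() => messages.CompleteAdding();
}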