Creating a file and allocating storage before writing to it - C#

I have a file that gets GBs of data written to it over time (looping back to the start once it reaches the end). I would like to create the file ahead of time and preallocate the storage so that the required space is never taken up by other downloads while the file is being written. This is done using Visual Studio 2012 in C#.
I have tried:
if (fileSizeRequirement or fileName is changed) // pseudocode: the file path or required size has changed
{
    // Open or create the file, set it to the required size, then close the stream
    fileStream = new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.ReadWrite);
    fileStream.SetLength((long)fileSizeRequirement);
    fileStream.Close();
}
1) Is this an appropriate way to "preallocate" space for a file?
2) Will SetLength require a seek back to the beginning after setting the length, or does the position in the file stay at the beginning?
3) What is the correct way to achieve file preallocation of storage space?
Thanks ahead of time and I appreciate any suggestions.

Using SetLength is a common approach, although I'd generally use a using statement here:
using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    fileStream.SetLength((long)fileSizeRequirement);
}
Calling fileStream.Position straight after SetLength yields 0 so you shouldn't need to seek to the beginning.
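For the wrap-around writing the question describes, a minimal sketch could look like this (the chunk size and the moreDataAvailable flag are illustrative placeholders, not part of the answer above):
long fileSizeRequirement = 10L * 1024 * 1024;   // example size: 10 MB
byte[] chunk = new byte[4096];                  // example payload written each iteration
using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    fileStream.SetLength(fileSizeRequirement);  // reserve the space up front
    // Position is still 0 after SetLength, so writing starts at the beginning.
    while (moreDataAvailable) // hypothetical "more data to write" condition
    {
        if (fileStream.Position + chunk.Length > fileStream.Length)
            fileStream.Seek(0, SeekOrigin.Begin); // wrap around once the end is reached
        fileStream.Write(chunk, 0, chunk.Length);
    }
}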

Related

Best way to upload file using Stream

I am calling a REST API that accepts a Stream to upload a file from the local device, so right now I am using the following code to get a Stream from a file, and I then close that stream after the upload completes:
var stream = new FileStream(file, FileMode.Open, FileAccess.ReadWrite);
The problem with the above approach is that, until the entire file has been uploaded to the server, the user has no chance to delete that file because its stream is still open. What would be a good way to resolve this issue?
If your typical file is reasonably sized (and I'm hoping you won't be uploading 2GB+ files to a REST API), you could always just read the stream into memory before feeding it to your API, like so:
using (MemoryStream memoryStream = new MemoryStream())
{
    using (FileStream fileStream = new FileStream(file, FileMode.Open, FileAccess.ReadWrite))
    {
        fileStream.CopyTo(memoryStream);
    }
    memoryStream.Position = 0; // Reset to origin.
    // Now use the MemoryStream as you would a FileStream:
    api.Upload(memoryStream);
}
Another alternative is to create a temp copy of the file on your hard drive and feed that to the API - but then dealing with cleanup can become a bit cumbersome. FileOptions.DeleteOnClose is your friend and may very well suffice for your purposes, but it still offers no bulletproof guarantees.
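A minimal sketch of that temp-copy idea, reusing the api.Upload call from above (the buffer size and paths are illustrative assumptions):
string tempPath = Path.GetTempFileName();
File.Copy(file, tempPath, true); // copy first, so the original file is released immediately
using (var tempStream = new FileStream(tempPath, FileMode.Open, FileAccess.Read,
                                       FileShare.None, 4096, FileOptions.DeleteOnClose))
{
    api.Upload(tempStream); // the user can delete the original while this runs
} // the temp copy is removed automatically when the stream closes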

Creating a new file that's locked while setting the last write time

I'm creating files in C# from InputStreams and writing them to a public shared folder. I can't have someone else reading a file before I've set its LastWriteTime property. I'm using the following code:
// Write file to disk
using (var fileStream = File.Create(fileName))
{
    webStream.CopyTo(fileStream);
}
File.SetLastWriteTimeUtc(path, timestamp);
A situation could occur where someone else reads the file before File.SetLastWriteTimeUtc() is called. How do I prevent this?
You'll have to set the correct sharing options when creating the FileStream:
using (var fileStream = new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite, FileShare.None))
Your last instruction should also be inside the using statement, since using guarantees that the file is closed as soon as you leave the block (at which point others could read it).

How to open a file in memory?

I have seen this term bandied around, but I don't really understand how you open a file in memory.
I have the files written to disk in a temp location, but this folder needs cleaning when a certain form closes, and I can't do that while the files are open. It's a must that this folder gets emptied. I was wondering whether opening the files in memory instead would make a difference?
MemoryStream inMemoryCopy = new MemoryStream();
using (FileStream fs = File.OpenRead(path))
{
    fs.CopyTo(inMemoryCopy);
}
inMemoryCopy.Position = 0; // rewind before reading the copy
// Now you can delete the file at 'path' and still have an in-memory copy
I think you want to work with memory-mapped files, which were added in .NET 4.
http://blogs.msdn.com/b/salvapatuel/archive/2009/06/08/working-with-memory-mapped-files-in-net-4.aspx
Memory Mapped Files .NET
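A minimal sketch of that idea (assumes .NET 4 or later; the read size is an illustrative placeholder):
using System.IO.MemoryMappedFiles;

using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
using (var view = mmf.CreateViewStream())
{
    var reader = new BinaryReader(view);
    byte[] header = reader.ReadBytes(16); // read whatever portion you need through the mapping
}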
I think it means reading the content of that file into memory as a whole and then closing the connection to the file. Assuming it's a file that's not too big, you could just read it into a byte[]:
byte[] fileContent = File.ReadAllBytes(fileName);
If it's a text file, read it into a string using:
string fileContent = File.ReadAllText(fileName);
Once you've done that, wrap the bytes in a MemoryStream and use a StreamReader to read them later, just as you would a file on disk.
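For example, a small sketch of reading the in-memory copy (variable names are placeholders):
byte[] fileContent = File.ReadAllBytes(fileName);
using (var memory = new MemoryStream(fileContent))
using (var reader = new StreamReader(memory))
{
    string firstLine = reader.ReadLine(); // read exactly as you would from a file on disk
}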
You can use the FileOptions.DeleteOnClose flag in the FileStream constructor:
FileStream fs = new FileStream("<Path Here>", FileMode.Create,
    FileAccess.ReadWrite, FileShare.None, 1024, FileOptions.DeleteOnClose);
and the file will be deleted when closed.

How to open a file from Memory Stream

Is it possible to open a file directly from a MemoryStream, as opposed to writing it to disk and doing Process.Start()? Specifically a PDF file? If not, I guess I need to write the MemoryStream to disk (which is kind of annoying). Could someone then point me to a resource about how to write a MemoryStream to disk?
It depends on the client :) If the client will accept input from stdin, you could push the data to it. Another possibility might be to write a named-pipes server or a socket server - not trivial, but it may work.
However, the simplest option is to just grab a temp file and write to that (and delete afterwards).
var file = Path.GetTempFileName();
using (var fileStream = File.OpenWrite(file))
{
    var buffer = memStream.GetBuffer(); // underlying buffer; only Length bytes are valid
    fileStream.Write(buffer, 0, (int)memStream.Length);
}
Remember to clean up the file when you are done.
Path.GetTempFileName() returns a file name with a '.tmp' extension, so you can't use Process.Start() on it directly, since that relies on the Windows file association for the extension.
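One workaround sketch for the PDF case from the question (the path handling here is an illustrative assumption, not from the answers above):
string tempPdf = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".pdf");
using (var fileStream = File.OpenWrite(tempPdf))
{
    memStream.WriteTo(fileStream); // dump the MemoryStream contents to disk
}
Process.Start(tempPdf); // the shell can now find the viewer associated with .pdf
Remember to delete tempPdf once the viewer is done with it.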
If by opening a file you mean something like starting Adobe Reader for PDF files, then yes, you have to write it to a file. That is, unless the application provides you with some API to do that.
One way to write a stream to file would be:
using (var memoryStream = /* create the memory stream */)
using (var fileStream = File.OpenWrite(fileName))
{
    memoryStream.WriteTo(fileStream);
}

How to save a file in a SQL Server database if I have the file path?

I am building a C# desktop application and I need to save a file into the database. I have a file chooser which gives me the correct path of the file. Now my question is how to save that file into the database using its path.
It really depends on the type and size of the file. If it's a text file, then you could use File.ReadAllText() to get a string that you can save in your database.
If it's not a text file, then you could use File.ReadAllBytes() to get the file's binary data, and then save that to your database.
Be careful though: databases are not a great place to store large files (you'll run into performance issues).
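For instance, a quick sketch of the two calls mentioned above (filePath is a placeholder):
string text = File.ReadAllText(filePath);   // text file contents as a string
byte[] data = File.ReadAllBytes(filePath);  // binary file contents as a byte array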
byte[] buff;
using (var fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
using (var br = new BinaryReader(fs))
{
    long numBytes = new FileInfo(fileName).Length;
    buff = br.ReadBytes((int)numBytes); // the cast is safe for files under 2 GB
}
Then you upload it to the DB like anything else; I assume you are using a varbinary column (BLOB).
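A minimal sketch of that insert (assumes a table like Files(Name nvarchar(260), Data varbinary(max)) and an existing connectionString; these names are placeholders, and it needs System.Data and System.Data.SqlClient):
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("INSERT INTO Files (Name, Data) VALUES (@name, @data)", connection))
{
    command.Parameters.AddWithValue("@name", Path.GetFileName(fileName));
    command.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = buff; // buff from the snippet above
    connection.Open();
    command.ExecuteNonQuery();
}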
So FILESTREAM would be the way to go, but since you're using SQL Server 2005 you will have to read the file into memory instead, which consumes a lot of resources.
First off, the column type varchar(max) is your friend; this gives you ~2 GB of data to play with, which is pretty big for most uses.
Next, read the data into a byte array and convert it to a Base64 string:
FileInfo _fileInfo = new FileInfo(openFileDialog1.FileName);
if (_fileInfo.Length < 2147483647) // 2147483647 bytes is the ~2 GB upper limit
{
    byte[] _fileData = new byte[_fileInfo.Length];
    using (FileStream stream = _fileInfo.OpenRead())
    {
        stream.Read(_fileData, 0, (int)_fileInfo.Length);
    }
    string _data = Convert.ToBase64String(_fileData);
}
else
{
    MessageBox.Show("File is too large for database.");
}
And reverse the process to recover the file:
byte[] _fileData = Convert.FromBase64String(_data);
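To get the recovered bytes back onto disk, something like this would do (the destination path is just an example):
File.WriteAllBytes(@"C:\temp\recovered.dat", _fileData); // write the decoded bytes back out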
You'll want to release those strings as quickly as possible by setting them to string.Empty as soon as you have finished using them, so the garbage collector can reclaim the memory.
But if you can, just upgrade to 2008 and use FILESTREAM.
If you're using SQL Server 2008, you could use FILESTREAM (getting started guide here). An example of using this functionality from C# is here.
You would need to read the file into a byte array and then store this as a BLOB field in the database, possibly along with the name you want to give the file and the file type.
You could just reverse the process for putting the file out again.
