I have an implementation of a custom DataObject (virtual file), see here. I have drag-and-drop functionality in a control view (dragging a file OUT of the control view without creating a temporary local file).
This works fine with smaller files, but as soon as the file is larger than, say, 12-15 MB, it reports that not enough memory is available; it seems the MemoryStream runs out of memory.
What can I do about this? Can I somehow split a larger byte[] into several MemoryStreams and reassemble those into a single file?
Any help would be highly appreciated.
Can I somehow split a larger byte[] into several MemoryStreams and reassemble those into a single file?
Yes.
When I had to deal with a similar situation, I built my own stream that internally used 4 MB byte arrays. This "paging" means it never has to allocate ONE LARGE BYTE ARRAY, which is what MemoryStream does. So, dump MemoryStream and build your own stream based on another internal storage mechanism.
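A minimal sketch of that paging idea, assuming a 4 MB page size as described above. The class name and details are made up, not the answerer's original implementation:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Sketch: a stream that stores its data in a list of 4 MB pages instead
// of one contiguous array, so no single large allocation is ever needed.
public class ChunkedMemoryStream : Stream
{
    private const int ChunkSize = 4 * 1024 * 1024;
    private readonly List<byte[]> _chunks = new List<byte[]>();
    private long _length;

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => _length;
    public override long Position { get; set; } // used here as a read cursor

    public override void Write(byte[] buffer, int offset, int count)
    {
        while (count > 0)
        {
            int chunkIndex = (int)(_length / ChunkSize);
            int chunkOffset = (int)(_length % ChunkSize);
            if (chunkIndex == _chunks.Count)
                _chunks.Add(new byte[ChunkSize]); // grow one page at a time
            int toCopy = Math.Min(count, ChunkSize - chunkOffset);
            Buffer.BlockCopy(buffer, offset, _chunks[chunkIndex], chunkOffset, toCopy);
            _length += toCopy;
            offset += toCopy;
            count -= toCopy;
        }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int read = 0;
        while (count > 0 && Position < _length)
        {
            int chunkIndex = (int)(Position / ChunkSize);
            int chunkOffset = (int)(Position % ChunkSize);
            int toCopy = (int)Math.Min(Math.Min(count, ChunkSize - chunkOffset), _length - Position);
            Buffer.BlockCopy(_chunks[chunkIndex], chunkOffset, buffer, offset, toCopy);
            Position += toCopy;
            offset += toCopy;
            count -= toCopy;
            read += toCopy;
        }
        return read;
    }

    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}
```

Because each page is only 4 MB, the largest contiguous allocation stays small even for a 100 MB payload, which avoids the fragmentation-driven OutOfMemoryException that a single growing MemoryStream buffer can hit.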
Related
I have a C# program that generates a bunch of short (10 seconds or so) video files. These are stored in an Azure file storage blob. I want the user to be able to download these files at a later date as a zip. However, it would take a substantial amount of memory to load the entire collection of video files into memory to create the zip. I was wondering if it is possible to pull data from a stream into memory, zip-encode it, output it to another stream, and dispose of it before moving on to the next segment of data.
Let's say the user has generated 100 videos of 10 MB each. If possible, this would allow me to send the zip to the user without first loading the entire 1 GB of footage into memory (or storing the entire zip in memory after the fact).
The individual videos are pretty small, so loading an entire file into memory at a time is fine, as long as I can remove it from memory after it has been encoded and transmitted, before moving on to the next file.
Yes, it is certainly possible to stream in files, without requiring any of them to be entirely in memory at any one time, and to compress, stream out, and transmit a zip file containing them, without holding the entire zip file either in memory or in mass storage. The zip format is designed to be streamable. However, I am not aware of a library that will do that for you.
ZipFile would require saving the entire zip file before transmitting it. If you're ok with saving the zip file in mass storage (not memory) before transmitting, then use ZipFile.
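For illustration, a hedged sketch of that ZipFile route: spool the archive to a temp file on disk, stream it out in small chunks, then delete it. The directory and file names are made up, and the MemoryStream stands in for whatever response stream you transmit on:

```csharp
using System.IO;
using System.IO.Compression;

// Spool the archive to mass storage (not memory), stream it out,
// then delete it. Only the small copy buffer lives in RAM at any
// point, never the whole archive. All paths here are hypothetical.
string sourceDir = Path.Combine(Path.GetTempPath(), "videos-demo");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "clip1.mp4"), "fake video bytes");

string tempZip = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".zip");
var outputStream = new MemoryStream(); // stands in for the HTTP response stream
try
{
    ZipFile.CreateFromDirectory(sourceDir, tempZip);
    using (var zip = File.OpenRead(tempZip))
        zip.CopyTo(outputStream); // copies in small chunks, not all at once
}
finally
{
    File.Delete(tempZip); // nothing persists after the transfer
}
```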
To write your own zip streamer, you would need to generate the zip file format manually. The zip format is documented here. You can use DeflateStream to do the actual compression and Crc32 to compute the CRC-32s. You would transmit the local header before each file's compressed data, followed by a data descriptor after each. You would save the local header information in memory as you go along, and then transmit the central directory and end record after all of the local entries.
zip is a relatively straightforward format, so while it would take a little bit of work, it is definitely doable.
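If no ready-made Crc32 class is at hand, the checksum itself is small enough to write by hand. A sketch of the CRC-32 variant zip uses (reversed polynomial 0xEDB88320, initial value and final XOR of 0xFFFFFFFF):

```csharp
// CRC-32 as used by zip: this is the checksum that goes in each
// data descriptor and central directory entry. Bitwise (no lookup
// table) to keep the sketch short; a table makes it much faster.
static uint Crc32(byte[] data)
{
    uint crc = 0xFFFFFFFFu;
    foreach (byte b in data)
    {
        crc ^= b;
        for (int i = 0; i < 8; i++)
            crc = (crc & 1) != 0 ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
    }
    return ~crc;
}
```

The standard check value for this CRC variant is 0xCBF43926 for the ASCII input "123456789", which is a convenient sanity test.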
I am building a Windows project in .NET 4.0 / C#.
I am currently saving images to the hard drive, and that takes no program memory at all because I only load an image when I want to view it. But now I need to stop creating image files on the hard drive and store them some other way, like putting each image in a memory stream, saving it to an object, and serializing that down to the hard drive. The important part is that I can't have the images viewable on the hard drive; they must be encrypted or wrapped in an object or something.
So... when I tried to put each image in a MemoryStream, save it to a list, and then serialize the list down to the drive, I got a HUGE memory allocation: for every image I create and save as a MemoryStream in my list, I allocate that memory, and my program grows to 200-300 MB.
I really don't have any idea how to do this. Can I somehow save an image to a memory stream without allocating that memory in the program? Or can I save it some other way without having the pictures plainly visible as images on the hard drive?
The main thing is, as I said, that I can't have the images as regular images on the hard drive; they must not be viewable by the user outside the application. And I need an approach that doesn't allocate all of the computer's memory.
Thank you in advance!
save it to memory stream and not allocate that memory in the program
No. If it's in a memory stream, it is obviously in RAM.
So you need to either store the image entirely in RAM, or save it to disk.
If you don't want it to be viewable on the disk, then you need to encrypt the file so that the user can't view it outside of your application. How heavy the encryption is depends on how hard you want it to be to crack. If you don't care that much, then very basic XOR encryption will be very fast and not increase the size of the file. If you do care, then you want to use something like 3DES.
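A sketch of the basic XOR approach described above (obfuscation only, trivially breakable; the key bytes here are arbitrary):

```csharp
// XOR "encryption": applying the same key twice restores the original,
// the output is the same size as the input, and it is very fast.
// This matches the "don't care that much" case -- it only stops
// casual viewing, not a determined user.
static byte[] XorWithKey(byte[] data, byte[] key)
{
    var result = new byte[data.Length];
    for (int i = 0; i < data.Length; i++)
        result[i] = (byte)(data[i] ^ key[i % key.Length]);
    return result;
}
```

Writing `XorWithKey(imageBytes, key)` to disk makes the file unreadable as an image; calling the same method on the bytes read back recovers the original.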
File access is built on the principle of streams, which can be plugged together in a chain. Instead of reading/writing the images from/to disk directly through a FileStream, you can plug a CryptoStream in between.
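A hedged sketch of that chain using AES. The method names are made up, and key/IV handling is deliberately simplified; in a real application, store the key securely and persist the IV alongside each file:

```csharp
using System.IO;
using System.Security.Cryptography;

// Plug a CryptoStream between the application and the FileStream so
// the bytes on disk are never a viewable image. Only a small buffer
// is in memory during the write, not the whole image collection.
static void SaveEncrypted(string path, byte[] imageBytes, byte[] key, byte[] iv)
{
    using (var aes = Aes.Create())
    using (var file = File.Create(path))
    using (var crypto = new CryptoStream(file, aes.CreateEncryptor(key, iv), CryptoStreamMode.Write))
        crypto.Write(imageBytes, 0, imageBytes.Length);
}

static byte[] LoadEncrypted(string path, byte[] key, byte[] iv)
{
    using (var aes = Aes.Create())
    using (var file = File.OpenRead(path))
    using (var crypto = new CryptoStream(file, aes.CreateDecryptor(key, iv), CryptoStreamMode.Read))
    using (var ms = new MemoryStream())
    {
        crypto.CopyTo(ms);
        return ms.ToArray();
    }
}
```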
You can use a GZipStream and a CryptoStream together to make the pictures both smaller and encrypted.
This article shows you exactly how:
http://www.liensberger.it/web/blog/?p=33
I'm trying to zip a bunch of files and make the data consumable via a stream.
I would like to keep the memory footprint as small as possible.
My idea was to implement a Stream that holds a bunch of FileStream objects as data members. When the Read method on my Stream is called, I read some data from one of the file streams and use a ZipOutputStream instance to write zipped data to a temporary storage stream, to which I then forward the read request.
This temporary storage stream would just be a queue of bytes. As these bytes are moved into a buffer (via a call to Read), they'd be deleted from the queue. This way, I'd only be storing the bytes that haven't been read yet.
Unfortunately, it seems that when I dispose a ZipOutputStream, it needs to write at random file locations in order to create a valid zip file. This would prevent me from using my "fleeting data" solution.
Hopefully this is all clear :)
Is there another way to minimize memory footprint when creating zip files? Please Help!
Thanks!
ZipOutputStream doesn't need to write to random locations in the output stream (in other words, call Seek()). But if the stream you're writing into reports CanSeek as true, it will use that ability to go back and update some headers.
So, make sure that the stream you're writing to returns false from CanSeek, and everything should work fine.
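One way to do that is a thin wrapper stream. A sketch (the class name is made up):

```csharp
using System;
using System.IO;

// Wrapper that forwards writes to an inner stream but reports
// CanSeek = false, pushing zip writers into streaming mode (headers
// written once, with data descriptors) instead of seeking back to
// patch them.
public class NonSeekableStream : Stream
{
    private readonly Stream _inner;
    public NonSeekableStream(Stream inner) { _inner = inner; }

    public override bool CanSeek => false; // the whole point
    public override bool CanRead => _inner.CanRead;
    public override bool CanWrite => _inner.CanWrite;

    public override void Write(byte[] buffer, int offset, int count)
        => _inner.Write(buffer, offset, count);
    public override int Read(byte[] buffer, int offset, int count)
        => _inner.Read(buffer, offset, count);
    public override void Flush() => _inner.Flush();

    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}
```

Usage would be wrapping whatever stream you hand to the zip writer, e.g. `new ZipOutputStream(new NonSeekableStream(responseStream))`. Note that some zip libraries also read Position even on non-seekable streams, so check your library's behavior.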
How to do this in C#?
If I use Bitmap.FromFile(), the original file is locked.
If I use Bitmap.FromStream(), the original file is not locked, but the documentation says "You must keep the stream open for the lifetime of the Image." This probably means that the file is still linked to the Image object (for example, if the file changes, perhaps so does the object, or vice versa).
What I want is simply to read the bitmap into an object, and after that have no link whatsoever between the file and the Image object.
Some background info on this behavior: Bitmap uses a memory-mapped file to access the pixels in the bitmap. That's a very basic facility in the Windows API, it allows very efficient mapping of memory to file data. Data is read from the file only when the program read the memory, the virtual memory pages don't take any space in the Windows paging file.
The exact same mechanism is used to load .NET assemblies. It is the memory mapping that puts a lock on the file. Which is basically why assemblies are locked when they are used in a .NET program. The Image.Dispose() method releases the lock. Fighting the lock often indicates that you are forgetting to dispose your bitmaps. Very important, forgetting to call Dispose() doesn't often cause problems for .NET classes, except for Bitmap since it can need so much (unmanaged) memory.
Yes, FromStream() prevents the class from making this optimization. The cost is significant: you'll need double the memory while the bitmap is loaded. This becomes a problem when the bitmap is large; you're skirting OutOfMemoryException when the program has been running for a while (fragmenting the address space) and it's not running on a 64-bit operating system. Definitely avoid doing this if the bitmap's Width x Height x 4 >= 45 MB, give or take.
Some code, you don't have to jump through the CopyStream hoop:
public static Image LoadImageNoLock(string path) {
    var ms = new MemoryStream(File.ReadAllBytes(path)); // Don't use using!!
    return Image.FromStream(ms);
}
Note that you don't want to dispose the MemoryStream, you'll get a hard to diagnose "generic error" when the bitmap gets used if you do. Caused by the Image class lazy-reading the stream.
Read the file into memory by copying it from a FileStream into a MemoryStream. (Search for CopyStream in Stack Overflow to find plenty of examples of how to do that safely. Basically loop while reading, writing each chunk to the memory stream, until there's no more data to read.) Then rewind the MemoryStream (set Position = 0) and then pass that to Bitmap.FromStream.
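A minimal version of that copy loop (on .NET 4.0 and later, Stream.CopyTo does the same job):

```csharp
using System.IO;

// The read/write loop described above: pull chunks from the input
// until it is exhausted, writing each chunk to the output.
static void CopyStream(Stream input, Stream output)
{
    var buffer = new byte[8192];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        output.Write(buffer, 0, read);
}
```

After copying a FileStream into a MemoryStream this way, set the MemoryStream's Position back to 0 before passing it to Bitmap.FromStream.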
In order to create an image without locking the file, you'll have to create a copy of the image's FileStream.
Check this page Best way to copy between two Stream instances - C# for how to copy the stream.
Afterwards just create your image from the copied stream and you're ready to go.
I have used this technique of copying to a MemoryStream and then feeding the MemoryStream to Bitmap.FromStream quite a few times. However, there is one gotcha with this technique as well.
If you plan to use one of the Bitmap.Save methods later on the loaded image, you'll have to keep the stream alive (i.e., not dispose it after the image is loaded), or else you'll get the dreaded "A generic GDI+ error occurred" exception!
Since images and icons are stored in a .resx file, I am guessing that it should be relatively easy to store a byte array (or similar stream) in an embedded resource file.
How might this be done? Should I pretend the binary stream is a bitmap? Or, if the resource file is the wrong place to embed binary data, what other techniques should I investigate?
Mitch has pointed to the right answer, but one trick you can keep up your sleeve is storing the data compressed and decompressing it on first access. It helps keep your DLLs small. I use this trick to embed x64 and x32 versions of a native DLL:
See for example the code here: http://code.google.com/p/videobrowser/source/browse/trunk/MediaInfoProvider/LibraryLoader.cs
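A sketch of the compress-then-decompress-on-first-access idea. The Decompress helper works on any stream; the resource name in the usage comment is hypothetical:

```csharp
using System.IO;
using System.IO.Compression;

// Gunzip a stream into a byte array. The source would typically be a
// gzip-compressed file added to the project with
// Build Action = Embedded Resource, decompressed once and cached.
static byte[] Decompress(Stream compressed)
{
    using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
    using (var ms = new MemoryStream())
    {
        gzip.CopyTo(ms);
        return ms.ToArray();
    }
}

// Usage against an embedded resource ("MyApp.Native.dll.gz" is a
// made-up resource name):
// var asm = System.Reflection.Assembly.GetExecutingAssembly();
// byte[] dllBytes = Decompress(asm.GetManifestResourceStream("MyApp.Native.dll.gz"));
```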