I'm saving a large amount of bitmaps (screenshots) to memory. Nothing special with the code, it's trivial:
```csharp
var memory = new MemoryStream();
bitmap.Save(memory, ImageFormat.Png);
```
Since my PC is getting a bit slow, I ran a performance analysis session in Visual Studio and found that the Save() call accounts for 37% of inclusive samples. Another big part is spent saving to disk, so those 37% would be closer to 80% if I weren't saving to disk. (I don't care about saving to disk at the moment: all data is kept in RAM until a hotkey is pressed, and I can hardly influence hard disk speed.)
From my understanding, the Save() call has to convert the more or less "raw" data of the bitmap into the compressed PNG file format.
I wonder whether anyone has a performance overview of the different image formats with respect to the processing time of the Save() method. I'd like to choose the fastest format, even if the file size is larger.
I have tried:
```csharp
ImageFormat.MemoryBmp
```
but that throws an ArgumentNullException:
```
Value cannot be null. Parameter name: encoder
```
I found a related question that describes that some of the image formats are read-only, which reduces the list a bit.
These are non-representative results for taking screenshots of 3 monitors on an Intel i7 CPU, with the application assigned to a single core. I was running an x64 release build and saving to a pre-allocated memory buffer.
GIF : ~5.5% CPU load
TIFF: ~4.5% CPU load
PNG : ~4.0% CPU load
JPG : ~2.0% CPU load (note that this is lossy)
BMP : ~1.0% CPU load
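A micro-benchmark like the one behind these numbers could be sketched roughly as follows (the bitmap size, iteration count, and format list here are illustrative, not the original measurement setup):

```csharp
// Times Bitmap.Save() for each encoder into a reused in-memory buffer.
using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

class SaveBenchmark
{
    static void Main()
    {
        using var bitmap = new Bitmap(1920, 1080, PixelFormat.Format32bppArgb);
        var formats = new[] { ImageFormat.Bmp, ImageFormat.Jpeg,
                              ImageFormat.Png, ImageFormat.Tiff, ImageFormat.Gif };

        foreach (var format in formats)
        {
            using var memory = new MemoryStream();
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < 10; i++)
            {
                memory.SetLength(0);          // reuse the buffer between runs
                bitmap.Save(memory, format);
            }
            sw.Stop();
            Console.WriteLine($"{format}: {sw.ElapsedMilliseconds / 10.0} ms/frame");
        }
    }
}
```

Note that a blank bitmap compresses unrealistically well; for meaningful numbers, fill the bitmap with an actual screenshot first.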
I also tried integrating Magick.NET, but since I could not figure out how to create a Graphics object from MagickImage in order to save the screenshot, I had to use the constructor that takes the Bitmap as an argument. This resulted in ~10.0% CPU load for PNG images.
Related
I'm currently working on machine vision project. The issue is saving all of the images fast enough so that the queue of images doesn't build up in RAM and drain the user's memory. Is there any other method available for fast image saving?
This method helps with the CPU issue, but it is not fast enough: the queue of images still builds up and overloads the RAM, so I don't know what else I can do to solve both issues.
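One way to keep the queue from draining memory, independent of which format is chosen, is to bound it so the grab loop blocks when the writer falls behind. A minimal sketch (the names `frameQueue` and `WriteFrame`, and the capacity of 64, are hypothetical):

```csharp
// A bounded queue caps RAM use by blocking the producer (the camera grab
// loop) when the disk writer falls behind, instead of letting frames pile up.
using System.Collections.Concurrent;
using System.Threading.Tasks;

var frameQueue = new BlockingCollection<byte[]>(boundedCapacity: 64);

// Consumer: drains frames to disk on a background thread.
var writer = Task.Run(() =>
{
    foreach (var frame in frameQueue.GetConsumingEnumerable())
        WriteFrame(frame);   // hypothetical: your format-specific save routine
});

// Producer side: Add() blocks once 64 frames are queued, so memory stays bounded.
// frameQueue.Add(grabbedFrame);
// frameQueue.CompleteAdding();  // when acquisition stops; then await writer
```

This trades dropped or delayed frames for bounded memory, which is usually the right trade when the disk simply cannot keep up.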
The fastest way for writing images in Halcon is using their proprietary format .hobj. It is much faster than any other lossless compression:
The benchmark can be found in the example write_image_benchmark.hdev.
The only disadvantage is that you cannot open this format without the Halcon license.
I am loading some JPEG and PNG images into GridView. I am using Software Bitmap to represent the images.
However, SoftwareBitmap stores an uncompressed form of the image. So when loading many images, my app uses a lot of RAM, and I am concerned about the high memory usage.
I have known that the GridView handles virtualization by itself.
While loading only about 150 images (90 MB as compressed image files on disk), the app's memory usage rises close to 500 MB!
How can I optimize this? Is there a SoftwareBitmap feature, or an alternative to it, that I am unaware of? Or will I have to do some kind of image processing to store a compressed version in RAM (I don't even know whether that's possible)?
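Storing a compressed version in RAM is possible: keep the original encoded file bytes and only decode to a SoftwareBitmap when an item becomes visible. A minimal sketch, assuming the standard Windows.Graphics.Imaging decoding path:

```csharp
// Keep the compressed file bytes (e.g. the JPEG/PNG as read from disk) and
// decode on demand, so only visible items pay the uncompressed-bitmap cost.
using System;
using System.Runtime.InteropServices.WindowsRuntime; // for AsBuffer()
using System.Threading.Tasks;
using Windows.Graphics.Imaging;
using Windows.Storage.Streams;

static async Task<SoftwareBitmap> DecodeAsync(byte[] encodedBytes)
{
    using var stream = new InMemoryRandomAccessStream();
    await stream.WriteAsync(encodedBytes.AsBuffer());
    stream.Seek(0);
    var decoder = await BitmapDecoder.CreateAsync(stream);
    return await decoder.GetSoftwareBitmapAsync();
}
```

Combined with GridView's virtualization, this keeps memory close to the compressed size of the data set; decoding to a scaled-down thumbnail size would reduce it further.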
I need to edit (increase the height of) the image on the fly.
The files are mostly 5000×4000 in dimension. I see the memory usage shoot up to peak levels when I create a bitmap of large dimensions and call Graphics.DrawImage on the bitmap instance.
How do I get rid of the OutOfMemory exception? Is there a way to work with large bitmaps in C#?
The problem is the huge amount of memory required for the operation; yours takes several gigabytes, so one solution could be to use a Stream and process the file in chunks.
The best option, though, would be to use a third-party library. Below are some for .NET:
AForge
Image Resizer
Also have a look at this SO question: https://stackoverflow.com/questions/158756/what-is-the-best-image-manipulation-library
It depends on your application's specific requirements; it's not very clear from your post. But generally, when working with big media files (images, sounds, videos), I think a really good solution is
memory-mapped files.
Save your image on disk in a memory-mapped file and resize it there, freeing your RAM as much as possible from data that you probably don't need fast access to (at that moment, at least).
Hope this helps.
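A minimal sketch of that idea, assuming 32-bit pixels and an illustrative file name (the sizes and names are not from the original post):

```csharp
// Keep the raw pixel data in a file-backed mapping so the OS pages it in
// and out on demand, instead of holding the whole image on the GC heap.
using System.IO;
using System.IO.MemoryMappedFiles;

long pixelBytes = 5000L * 4000L * 4;   // 5000x4000 image at 32 bpp
using var mmf = MemoryMappedFile.CreateFromFile(
    "image.raw", FileMode.OpenOrCreate, "imageMap", pixelBytes);
using var accessor = mmf.CreateViewAccessor();

// Read or write individual pixels without materializing the whole image:
accessor.Write(0, (uint)0xFF0000FF);    // first pixel, ARGB
uint firstPixel = accessor.ReadUInt32(0);
```

Row-by-row processing over such a view lets you resize or extend an image far larger than available RAM, at the cost of doing the pixel arithmetic yourself.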
How to do this in C#?
If I use Bitmap.FromFile(), the original file is locked.
If I use Bitmap.FromStream(), the original file is not locked, but the documentation says "You must keep the stream open for the lifetime of the Image." This probably means the file is still linked to the Image object (for example, if the file changes, so does the object, or vice versa).
What I want is to read the bitmap into an object once, so that afterwards there is no link whatsoever between the file and the Image object.
Some background info on this behavior: Bitmap uses a memory-mapped file to access the pixels in the bitmap. That's a very basic facility in the Windows API, it allows very efficient mapping of memory to file data. Data is read from the file only when the program read the memory, the virtual memory pages don't take any space in the Windows paging file.
The exact same mechanism is used to load .NET assemblies. It is the memory mapping that puts a lock on the file. Which is basically why assemblies are locked when they are used in a .NET program. The Image.Dispose() method releases the lock. Fighting the lock often indicates that you are forgetting to dispose your bitmaps. Very important, forgetting to call Dispose() doesn't often cause problems for .NET classes, except for Bitmap since it can need so much (unmanaged) memory.
Yes, FromStream() prevents the class from making this optimization. The cost is significant: you'll need double the memory while the bitmap is loaded. This becomes a problem when the bitmap is large, the program has been running for a while (fragmenting the address space), and it's not running on a 64-bit operating system. Definitely avoid doing this if the bitmap's Width x Height x 4 >= 45 MB, give or take.
Some code, you don't have to jump through the CopyStream hoop:
```csharp
public static Image LoadImageNoLock(string path) {
    var ms = new MemoryStream(File.ReadAllBytes(path)); // Don't use using!!
    return Image.FromStream(ms);
}
```
Note that you don't want to dispose the MemoryStream, you'll get a hard to diagnose "generic error" when the bitmap gets used if you do. Caused by the Image class lazy-reading the stream.
Read the file into memory by copying it from a FileStream into a MemoryStream. (Search for CopyStream in Stack Overflow to find plenty of examples of how to do that safely. Basically loop while reading, writing each chunk to the memory stream, until there's no more data to read.) Then rewind the MemoryStream (set Position = 0) and then pass that to Bitmap.FromStream.
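The copy-then-rewind approach described above could be written out as follows; in modern .NET, Stream.CopyTo does the chunked read/write loop for you:

```csharp
// Copy the file into a MemoryStream, rewind, then decode. The file handle
// is closed immediately; only the MemoryStream must outlive the Image.
using System.Drawing;
using System.IO;

public static Image LoadImageCopy(string path)
{
    var ms = new MemoryStream();
    using (var fs = File.OpenRead(path))
        fs.CopyTo(ms);            // chunked read/write loop
    ms.Position = 0;              // rewind before decoding
    return Image.FromStream(ms);  // keep ms alive for the Image's lifetime
}
```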
In order to create an image without locking the file, you'll have to create a copy of the image's FileStream.
Check this page Best way to copy between two Stream instances - C# for how to copy the stream.
Afterwards just create your image from the copied stream and you're ready to go.
I have used this technique of copying to MemoryStream and then feeding the MemoryStream to Bitmap.FromStream quite a lot of times. However, there is one gotcha with this technique as well.
If you are planning to use one of the Bitmap.Save methods later on the loaded image, then you'll have to keep the stream alive (i.e., not dispose it after the image is loaded), else you'll get the dreaded "A generic GDI+ error occurred" exception!
I am working on a system that stores many images in a database as byte[]. Each byte[] is a multi page tiff already, but I have a need to retrieve the images, converting them all to one multi page tiff. The system was previously using the System.Drawing.Image classes, with Save and SaveAdd - this was nice in that it saves the file progressively, and therefore memory usage was minimal, however GDI+ concurrency issues were encountered - this is running on a back end in COM+.
The methods were converted to use the System.Windows.Media.Imaging classes, TiffBitmapDecoder and TiffBitmapEncoder, with a bit of massaging in between. This resolved the concurrency issue, but I am struggling to find a way to save the image progressively (i.e. frame by frame) to limit memory usage, and therefore the size of images that can be manipulated is much lower (i.e. I created a test 1.2GB image using the GDI+ classes, and could have gone on, but could only create a ~600MB file using the other method).
Is there any way to progressively save a multi page tiff image to avoid memory issues? If Save is called on the TiffBitmapEncoder more than once an error is thrown.
I think I would use the standard .NET way to decode the tiff images and write my own tiff encoder that can write progressively to disk. The tiff format specifications are public.
Decoding a TIFF is not that easy, which is why I would use the TiffBitmapDecoder for that. Encoding is easier, so I think it is doable to write an encoder that you can feed separate frames and that writes the necessary data progressively to disk. You'll probably have to go back and update the header of the resulting TIFF once you are done, to fix up the IFD (Image File Directory) entries.
Good luck!
I've done this via LibTIFF.NET; I can handle multi-gigabyte images this way with no pain. See my question at
Using LibTIFF from c# to access tiled tiff images
Although I use it for tiled access, the memory issues are similar. LibTIFF allows full access to all TIFF functions, so files can be read and stored in a directory-like manner.
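A rough sketch of page-by-page writing with LibTIFF.NET (BitMiracle.LibTiff.Classic) might look like this; the field values and the 8-bit grayscale layout are illustrative assumptions:

```csharp
// Appending one page (directory) at a time keeps only a single frame in
// memory; WriteDirectory() flushes each page to disk before the next.
using BitMiracle.LibTiff.Classic;

static void AppendPage(string path, byte[][] rows, int width, int height)
{
    using var tif = Tiff.Open(path, "a");   // "a" = append a new directory
    tif.SetField(TiffTag.IMAGEWIDTH, width);
    tif.SetField(TiffTag.IMAGELENGTH, height);
    tif.SetField(TiffTag.BITSPERSAMPLE, 8);
    tif.SetField(TiffTag.SAMPLESPERPIXEL, 1);
    tif.SetField(TiffTag.PHOTOMETRIC, Photometric.MINISBLACK);
    for (int row = 0; row < height; row++)
        tif.WriteScanline(rows[row], row);
    tif.WriteDirectory();                   // flush this page to disk
}
```

Calling this once per frame of the multi-page TIFF sidesteps the all-in-memory behavior of TiffBitmapEncoder.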
The other thing worth noting is the differences between GDI on different Windows versions. See GDI .NET exceptions.