SoftwareBitmap uses a lot of memory - c#

I am loading some JPEG and PNG images into a GridView, using SoftwareBitmap to represent the images.
However, SoftwareBitmap stores an uncompressed form of the image. So the problem is, when loading many images, my app eats a lot of RAM, and I am concerned about the high memory usage.
I understand that the GridView handles virtualization by itself.
While loading only about 150 images (90 MB as compressed image files on disk), the app's memory usage rises close to 500 MB!
How can I optimize this? Is there some SoftwareBitmap feature, or an alternative to it, that I am unaware of? Or will I have to do some kind of image processing to keep a compressed version in RAM (I don't even know if that's possible)?
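One common way to cut decoded-image memory in a virtualized list is to decode each file directly to the tile size rather than full resolution. Below is a minimal UWP sketch using BitmapImage.DecodePixelWidth; the method name LoadTileAsync and the tileWidth parameter are illustrative, not from the original question.

```csharp
// Sketch (assumes a UWP app): instead of a full-resolution SoftwareBitmap,
// decode each file straight to the width the GridView tile actually displays.
// DecodePixelWidth makes the decoder produce a small bitmap, so memory cost
// is proportional to tile size, not to the file's native resolution.
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Streams;
using Windows.UI.Xaml.Media.Imaging;

async Task<BitmapImage> LoadTileAsync(StorageFile file, int tileWidth)
{
    var image = new BitmapImage { DecodePixelWidth = tileWidth };
    using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.Read))
    {
        await image.SetSourceAsync(stream);
    }
    return image;
}
```

A 200-pixel-wide tile decoded this way costs roughly 200 × height × 4 bytes, regardless of whether the source JPEG was 4000 pixels wide.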

Related

Which codec for saving bitmaps to memory fast

I'm saving a large amount of bitmaps (screenshots) to memory. Nothing special with the code, it's trivial:
var memory = new MemoryStream();
bitmap.Save(memory, ImageFormat.Png);
Since my PC is getting a bit slow, I ran a performance analysis session in Visual Studio and found that the Save() call takes 37% of "inclusive samples". Another big part is spent saving to disk, so those 37% are closer to 80% when not saving to disk. (I don't care about saving to disk at the moment; all data is kept in RAM until a hotkey is pressed, and I hardly have influence on hard disk speed.)
From my understanding, the Save() call has to convert the more or less "raw" data of the bitmap into the compressed PNG file format.
I wonder whether someone has a performance overview of the different image formats with respect to the processing time of the Save() method. I'd like to choose the fastest format, even if the file size is larger.
I have tried:
ImageFormat.MemoryBmp
but that throws an ArgumentNullException:
Value cannot be null. Parameter name: encoder
I found a related question that describes that some of the image formats are read-only, which reduces the list a bit.
These are non-representative results for taking screenshots of 3 monitors on an Intel i7 CPU, with the application assigned to one core only. I was running an x64 release build and saving to a pre-allocated memory buffer.
GIF : ~5.5% CPU load
TIFF: ~4.5% CPU load
PNG : ~4.0% CPU load
JPG : ~2.0% CPU load (note that this is lossy)
BMP : ~1.0% CPU load
I also tried integrating Magick.NET, but since I could not figure out how to create a Graphics object from a MagickImage in order to save the screenshot, I had to use the constructor that takes a Bitmap as argument. This resulted in ~10.0% CPU load for PNG images.
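For reference, the measurement above can be reproduced with a simple Stopwatch loop. This is a sketch only (System.Drawing, Windows); the 1920×1080 blank bitmap stands in for a real screenshot, and timings on real content will differ.

```csharp
// Rough benchmark sketch: time Bitmap.Save() into a reusable MemoryStream
// for each format. BMP skips compression entirely, which is why it tends
// to be cheapest; PNG/GIF/TIFF pay for their compression passes.
using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

var bitmap = new Bitmap(1920, 1080);              // stand-in for a screenshot
var memory = new MemoryStream(32 * 1024 * 1024);  // pre-allocated buffer

foreach (var format in new[] { ImageFormat.Bmp, ImageFormat.Jpeg,
                               ImageFormat.Png, ImageFormat.Tiff, ImageFormat.Gif })
{
    memory.SetLength(0);                          // reuse the same buffer
    var sw = Stopwatch.StartNew();
    bitmap.Save(memory, format);
    sw.Stop();
    Console.WriteLine($"{format}: {sw.ElapsedMilliseconds} ms, {memory.Length} bytes");
}
```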

GetThumbnailAsync() takes too long in Local Storage

I was creating a Windows Phone 8.1 (RT) app that has a few images both in LocalStorage and in the Pictures Library, and I'm loading the images using GetThumbnailAsync().
For a PNG image of size 6 MB+, GetThumbnailAsync() on the Pictures Library copy takes a few msec, while the same image, when copied to the app's LocalStorage, takes around 10 secs to return the thumbnail.
Also I used
GetThumbnailAsync(ThumbnailMode.ListView, 100, ThumbnailOptions.ResizeThumbnail)
It still takes a long time, although it returns the thumbnail in the desired pixel size. Can anyone point out why it takes so much time in the case of LocalStorage, and whether there are any alternatives to make it faster?
The system pre-caches thumbnails for images in the pictures library, whereas it can't do that for images in an app's isolated storage.
There are two workarounds here:
Move the picture to a public location where the system can pre-generate a thumbnail
Embed a thumbnail in the EXIF data for the image in your local storage. Then the system can do a fast extract and return a thumbnail more quickly. Currently it has to decode the entire 6+ MB file to generate a thumbnail, whereas a fast extract only needs to pop open the much smaller embedded thumbnail.
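A third option is to generate and cache a small copy yourself, so later requests are a cheap file read instead of a full decode. A minimal UWP sketch; the method name SaveThumbnailAsync and its parameters are illustrative, not from the original answer.

```csharp
// Sketch (UWP): transcode the large PNG once into a scaled-down cached copy.
// BitmapEncoder.CreateForTranscodingAsync re-encodes while applying the
// transform, so the full-size pixels never need to stay in memory afterwards.
using System.Threading.Tasks;
using Windows.Graphics.Imaging;
using Windows.Storage;
using Windows.Storage.Streams;

async Task SaveThumbnailAsync(StorageFile source, StorageFile cache, uint width)
{
    using (IRandomAccessStream input = await source.OpenAsync(FileAccessMode.Read))
    using (IRandomAccessStream output = await cache.OpenAsync(FileAccessMode.ReadWrite))
    {
        var decoder = await BitmapDecoder.CreateAsync(input);
        var encoder = await BitmapEncoder.CreateForTranscodingAsync(output, decoder);
        // Scale down during the transcode; the height keeps the aspect ratio.
        encoder.BitmapTransform.ScaledWidth = width;
        encoder.BitmapTransform.ScaledHeight =
            decoder.OrientedPixelHeight * width / decoder.OrientedPixelWidth;
        await encoder.FlushAsync();
    }
}
```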

c# .net, loading images

I have to combine, let's say, about 100 images (PNG files).
The problem is not combining them, that runs quick enough.
But loading them from storage takes up to 4sec.
That is too much time.
So I could use the TPL or multiple threads, but it is still too slow.
How can I speed it up? Keeping all the images in memory is unfortunately not an option.
The images are quite small: from 4 KB to 10 KB each.
I'm loading the images that way:
Image img = Image.FromFile(file);
Creating multiple threads does not improve I/O speed in your case; that is bound by your hard disk's read/write speed.
Loading 100 high-quality images in 4 seconds seems normal.
Two ideas:
If the bulk of the lag really is from I/O, compress the files. Depending on their contents, even simple ZIP compression could reduce their size, meaning fewer bytes to read. The extra work is decompressing them in memory; I don't know if that's applicable to your case.
Lazy-load them. Do you need all 100 images loaded all the time? Perhaps you can load just the first ones, or the most important ones first, and let the software do its other work while it finishes loading the remaining images in the background.
How do you load your image files? Please share a piece of your code. My guess is that you are not reading the whole file at once, but something like byte by byte until EOF... One way to optimize file loading is to read the whole file into a preallocated memory buffer.
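The whole-file-read idea could look like the sketch below (System.Drawing, Windows); the LoadImages helper is illustrative. Each file is pulled in with a single I/O call and decoded from memory rather than streamed from disk.

```csharp
// Sketch: read each file's bytes in one call, then decode from memory.
// File.ReadAllBytes avoids any byte-by-byte reading; the parallel loop
// helps mainly on SSDs, where requests can actually overlap.
using System.Drawing;
using System.IO;
using System.Threading.Tasks;

Image[] LoadImages(string[] files)
{
    var images = new Image[files.Length];
    Parallel.For(0, files.Length, i =>
    {
        byte[] bytes = File.ReadAllBytes(files[i]);          // one read per file
        // GDI+ requires the stream to stay alive for the image's lifetime;
        // the Image keeps a reference to the MemoryStream, so that holds here.
        images[i] = Image.FromStream(new MemoryStream(bytes));
    });
    return images;
}
```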

Working with large bitmaps causes Out of Memory Exception

I need to edit (increase the height of) an image on the fly.
The file is mostly 5000×4000 in dimension. I see the memory shoot up to peak level when I create a bmp of large dimensions and call the Graphics.DrawImage method on the bmp instance.
How do I get rid of the OutOfMemoryException? Is there a way to work with large bitmaps in C#?
The problem is the huge amount of memory required for the operation: the source bitmap, the enlarged destination bitmap, and GDI+ working copies are all held uncompressed in RAM at once, which in your case runs to gigabytes. One solution could be to use a Stream and process the file in chunks.
Or the best option would be to use a third-party library for it. Below are some for .NET:
AForge
Image Resizer
Also have a look at this SO question: https://stackoverflow.com/questions/158756/what-is-the-best-image-manipulation-library
It depends on your application's specific requirements (it's not very clear from your post), but generally, when working with big media files (images, sounds, videos), I think a really good solution is
Memory Mapped Files
Save your image on disk in a memory-mapped file and resize it there, freeing your RAM as much as possible of data you probably don't need fast access to (at that moment, at least).
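A minimal sketch of the memory-mapped approach; the sizes and file name are hypothetical, and the row-reading step is left as a stub since the source pixel format isn't specified in the question.

```csharp
// Sketch: back the enlarged pixel buffer with a file instead of RAM.
// Rows are written one at a time, so only the touched pages need to be
// resident; the OS pages the rest out to disk as needed.
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

const int width = 5000, srcHeight = 4000, newHeight = 5000; // hypothetical
const int bytesPerPixel = 4;                                // 32bpp assumed
long newSize = (long)width * newHeight * bytesPerPixel;

string path = Path.Combine(Path.GetTempPath(), "big-image.raw");
using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Create, null, newSize))
using (var accessor = mmf.CreateViewAccessor())
{
    byte[] row = new byte[width * bytesPerPixel];
    for (int y = 0; y < srcHeight; y++)
    {
        // ... read row y of the source image into 'row' here ...
        accessor.WriteArray((long)y * row.Length, row, 0, row.Length);
    }
    // Rows srcHeight..newHeight-1 are the newly added (zeroed) area.
}
File.Delete(path);
```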
Hope this helps.
Regards.

Convert multiple byte[] to multipage tiff equivalent of Image.SaveAdd without GDI+

I am working on a system that stores many images in a database as byte[]. Each byte[] is a multi page tiff already, but I have a need to retrieve the images, converting them all to one multi page tiff. The system was previously using the System.Drawing.Image classes, with Save and SaveAdd - this was nice in that it saves the file progressively, and therefore memory usage was minimal, however GDI+ concurrency issues were encountered - this is running on a back end in COM+.
The methods were converted to use the System.Windows.Media.Imaging classes, TiffBitmapDecoder and TiffBitmapEncoder, with a bit of massaging in between. This resolved the concurrency issue, but I am struggling to find a way to save the image progressively (i.e. frame by frame) to limit memory usage, and therefore the size of images that can be manipulated is much lower (i.e. I created a test 1.2GB image using the GDI+ classes, and could have gone on, but could only create a ~600MB file using the other method).
Is there any way to progressively save a multi page tiff image to avoid memory issues? If Save is called on the TiffBitmapEncoder more than once an error is thrown.
I think I would use the standard .NET way to decode the TIFF images and write my own TIFF encoder that can write progressively to disk. The TIFF format specifications are public.
Decoding a TIFF is not that easy, which is why I would use the TiffBitmapDecoder for it. Encoding is easier, so I think it is doable to write an encoder that you can feed with separate frames and that writes the necessary data progressively to disk. You'll probably have to update the header of the resulting TIFF once you are done, to update the IFD (Image File Directory) entries.
Good luck!
I've done this via LibTiff.Net; I can handle multi-gigabyte images this way with no pain. See my question at:
Using LibTIFF from c# to access tiled tiff images
Although I use it for tiled access, the memory issues are similar. LibTIFF allows full access to all TIFF functions, so files can be read and stored in a directory-like manner.
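With LibTiff.Net, appending frames one directory at a time might look like the sketch below. The AppendFrame helper, the 8-bit grayscale pixel format, and the LZW choice are assumptions for illustration, since the original answer doesn't show its setup.

```csharp
// Sketch (LibTiff.Net, BitMiracle.LibTiff.Classic): write one frame's tags
// and scanlines, then WriteDirectory() flushes that frame's IFD to disk.
// Only the current frame's scanlines are ever held in memory.
using BitMiracle.LibTiff.Classic;

void AppendFrame(Tiff output, byte[][] scanlines, int width, int height)
{
    output.SetField(TiffTag.IMAGEWIDTH, width);
    output.SetField(TiffTag.IMAGELENGTH, height);
    output.SetField(TiffTag.SAMPLESPERPIXEL, 1);          // assumed grayscale
    output.SetField(TiffTag.BITSPERSAMPLE, 8);
    output.SetField(TiffTag.PHOTOMETRIC, Photometric.MINISBLACK);
    output.SetField(TiffTag.COMPRESSION, Compression.LZW);
    output.SetField(TiffTag.ROWSPERSTRIP, output.DefaultStripSize(0));

    for (int row = 0; row < height; row++)
        output.WriteScanline(scanlines[row], row);

    output.WriteDirectory();   // commit this page before starting the next
}

// Usage: open once with Tiff.Open("multi.tif", "w"),
// then call AppendFrame for each decoded page.
```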
The other thing worth noting is the differences between GDI on different windows versions. See GDI .NET exceptions.
