Ensuring exported JPEG is less than maximum file size - c#

I currently have an application which takes a screenshot of a presenter's desktop and then broadcasts it via a custom protocol to the viewers. In order for the images to be transferred quickly enough to get a frame rate of 2 - 3 images per second, I need to ensure the image size is always less than ~ 300 KB.
I'm using C# for the presenter application, which encodes the screenshot into a JPEG via the process below. My concern is that the image quality can vary greatly when using a static compression setting. If I have the application capturing my screen, the output images will be ~200 KB when I have Visual Studio full screen, but if I minimize everything so my desktop background is visible, they will be ~400 KB.
I could put the encoding process into a loop and continuously decrease the quality setting until the size of the byte array is less than 300 KB, but that seems like a tedious operation. Is there any other method I could use?
Thanks in advance.
// get the screenshot
System.Drawing.Rectangle totalSize = System.Drawing.Rectangle.Empty;
//foreach (Screen s in Screen.AllScreens)
totalSize = System.Drawing.Rectangle.Union(totalSize, Screen.PrimaryScreen.Bounds);
Bitmap screenShotBitmap = new Bitmap(totalSize.Width, totalSize.Height, System.Drawing.Imaging.PixelFormat.Format32bppRgb);
screenShotBitmap.SetResolution(96, 96);
Graphics screenShotGraphics = Graphics.FromImage(screenShotBitmap);
screenShotGraphics.CopyFromScreen(totalSize.X, totalSize.Y,
    0, 0, totalSize.Size, CopyPixelOperation.SourceCopy);
screenShotGraphics.Dispose();
// image codec information
ImageCodecInfo imageCodecInfo = GetEncoderInfo("image/jpeg");
// encoder settings
System.Drawing.Imaging.Encoder encoderQuality;
System.Drawing.Imaging.Encoder encoderColor;
encoderQuality = System.Drawing.Imaging.Encoder.Quality;
encoderColor = System.Drawing.Imaging.Encoder.ColorDepth;
// compression & quality for JPEG output
Int64 quality = 40L;
// storage for exported JPEG
byte[] screenShotByteArray;
// encoder parameters
EncoderParameter encoderQualityParameter = new EncoderParameter(encoderQuality, quality);
//EncoderParameter encoderColorParameter = new EncoderParameter(encoderColor, 8L);
// encoder parameters table
EncoderParameters encoderParameters = new EncoderParameters(1);
encoderParameters.Param[0] = encoderQualityParameter;
//encoderParameters.Param[1] = encoderColorParameter;
// encode into a memory stream
MemoryStream screenShotMemoryStream = new MemoryStream();
screenShotBitmap.Save(screenShotMemoryStream, imageCodecInfo, encoderParameters);
// convert to a byte array; ToArray() copies only the bytes actually written,
// unlike GetBuffer(), which returns the entire internal buffer (including unused capacity)
screenShotByteArray = screenShotMemoryStream.ToArray();
// close the memory stream
screenShotMemoryStream.Close();

If you're putting things into a loop, be careful to use something similar to binary search instead of just increasing/decreasing the quality parameter by a fixed amount until the desired size is reached.
EDIT: Explaining the binary search a bit. Take the hypothetical case of a picture that compresses to quality*10000 bytes, so the optimal quality setting would be 30. The naive approach would be to try some fixed quality setting (e.g. 80, which would give 800,000 bytes) and then decrease it by a fixed amount until 300,000 bytes are reached. If you decrease the quality by 5 in each step, you'd try 12 quality settings with this method until you found the desired setting. A binary search gives the result faster, like this:
Quality   Size     Next step
80        800000   Too big, so quality := quality/2
40        400000   Too big, so quality := quality/2
20        200000   Too small, so quality := (40+20)/2
30        300000   Reached desired size
This gives the result after only 4 tries (or 3, depending on whether 200,000 bytes is too small or just fine for you). As size doesn't have a linear relation to quality, this example is a bit unrealistic, but binary search should still give you better results than the naive approach.
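A minimal sketch of that search in C# (EncodeToFit and EncodeJpeg are illustrative names of my own, not from the question; jpegCodec would be the question's GetEncoderInfo("image/jpeg") result, and maxBytes ~ 300 * 1024):
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

static byte[] EncodeToFit(Bitmap bitmap, long maxBytes, ImageCodecInfo jpegCodec)
{
    long lo = 1, hi = 100;
    byte[] best = null;
    while (lo <= hi)
    {
        long quality = (lo + hi) / 2;
        byte[] candidate = EncodeJpeg(bitmap, quality, jpegCodec);
        if (candidate.Length <= maxBytes)
        {
            best = candidate;   // fits: keep it and try a higher quality
            lo = quality + 1;
        }
        else
        {
            hi = quality - 1;   // too big: try a lower quality
        }
    }
    return best;                // null if even quality 1 exceeds maxBytes
}

static byte[] EncodeJpeg(Bitmap bitmap, long quality, ImageCodecInfo codec)
{
    using (var ms = new MemoryStream())
    using (var parameters = new EncoderParameters(1))
    {
        parameters.Param[0] = new EncoderParameter(
            System.Drawing.Imaging.Encoder.Quality, quality);
        bitmap.Save(ms, codec, parameters);
        return ms.ToArray();
    }
}
This converges in at most seven encodes for a 1-100 quality range, regardless of the starting image.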
You could also use some typical images for "training". Encode them using different quality settings (e.g. 100, 90, ..., 20, 10) and see how big they get relative to their original size. This can give a good first estimate in most cases, although you will still have to adjust when encountering images with much more or less detail in them.
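A small sketch of that calibration, reusing the EncodeJpeg helper from the sketch above; the lookup maps each quality level to the average compressed bytes per pixel (all names here are illustrative):
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;

static Dictionary<long, double> CalibrateQuality(
    IList<Bitmap> samples, ImageCodecInfo jpegCodec)
{
    long[] levels = { 100, 90, 80, 70, 60, 50, 40, 30, 20, 10 };
    var bytesPerPixel = new Dictionary<long, double>();
    foreach (long q in levels)
    {
        double totalBytes = 0, totalPixels = 0;
        foreach (Bitmap s in samples)
        {
            totalBytes += EncodeJpeg(s, q, jpegCodec).Length;
            totalPixels += (double)s.Width * s.Height;
        }
        // average compressed bytes per pixel at this quality level
        bytesPerPixel[q] = totalBytes / totalPixels;
    }
    return bytesPerPixel;
}
At broadcast time you could pick the highest quality whose ratio times the screen's pixel count stays under 300 KB and start the binary search there.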
Alternatively, have a look at JPEG 2000 encoders; those have the option to set a file size instead of a quality.
EDIT: I don't know of JPEG 2000 encoding libraries for C#; there only seem to be decoders floating around, so this could get more complicated than I thought at first. You might give CSJ2K a try, but the description doesn't sound like it's ready to use.

Related

Saving JPG file from stream increases size in C#

A little bit of background:
I'm writing a bar code image scanner desktop app using WPF that can take input either from a file location (a previously scanned image) or directly from a scanner (using NTWAIN). In both cases I create or get a stream.
Now when I create a new Bitmap from the stream and save it as a JPEG file using an Encoder
using (var bmp = Image.FromStream(rawStream))
{
    EncoderParameter ratio = new EncoderParameter(Encoder.Quality, 100L);
    EncoderParameter depth = new EncoderParameter(Encoder.ColorDepth, 8L);
    EncoderParameters codecParams = new EncoderParameters(2);
    codecParams.Param[0] = ratio;
    codecParams.Param[1] = depth;
    ImageCodecInfo jpegCodecInfo = ImageCodecInfo.GetImageEncoders().FirstOrDefault(x => x.FormatID == ImageFormat.Jpeg.Guid);
    bmp.Save(file.FileFullPath, jpegCodecInfo, codecParams); // Save to JPG
}
or the built-in
bmp.Save(file.FileFullPath, ImageFormat.Jpeg);
I tend to end up with much larger file sizes. Of course, this isn't always the case, but it's definitely true when I'm loading a small black-and-white TIFF file into memory and encoding it as JPEG.
My knowledge of image handling is rudimentary, but I think it is because the JPEG files are saved with a color depth of 24 bits while the TIFF images are originally stored at 1 bit (black and white).
No matter what I do, I can't get the jpg files to match the original file's bit depth.
The only workaround I found is simply renaming the file to "filename.jpg" and saving like so
using (Bitmap bmp = new Bitmap(rawStream))
{
    bmp.Save(file.FileFullPath);
}
But this feels like a solution that won't work indefinitely (as a side question, can one simply rename any *.bmp or *.tiff file to *.jpg and have it still work?)
Based on my initial research it seems like
bmp.Save()
doesn't honor the encoding parameter for bit depth in JPEG images. Understandably, my clients won't be happy having files grow from 16 KB to 200 KB for "no reason".
Is there a known work around for this problem or am I missing something obvious when it comes to working with streams and images?
JPEG works best for photographs with a multitude of colors, shades and gradients. Typical bit depths: 8 (for greyscale) or 24 (for full color).
If you want monochrome (1-bit), I'd recommend against using JPEG, not least because JPEG will introduce encoding artifacts that may not matter for photographs, but which will look like added salt and pepper if your original source is 1-bit. And the more you compress, the more of it there will be.
You should try using PNG instead; it has no such artifacts and is better suited for digital sources with sharp edges.
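A minimal sketch of that swap using the question's own variables; building the target path with Path.ChangeExtension is my assumption about how the file name is derived:
using (var bmp = Image.FromStream(rawStream))
{
    // save losslessly as PNG instead of lossy JPEG
    string pngPath = Path.ChangeExtension(file.FileFullPath, ".png");
    bmp.Save(pngPath, ImageFormat.Png);
}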
You could also try making the image smaller by 50% or 75% using a smart resize algorithm (with e.g. 8-bit output) that converts micro-dots in the original into small gradients in the output. I did this long ago with 1-bit fax/scanner images, with quite good results, but it was too long ago to still have the sources.
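A rough sketch of that resize with System.Drawing, assuming a 50% reduction; the interpolation mode is my choice, not the answer's:
using (var src = (Bitmap)Image.FromStream(rawStream))
using (var dst = new Bitmap(src.Width / 2, src.Height / 2,
                            PixelFormat.Format24bppRgb))
{
    using (var g = Graphics.FromImage(dst))
    {
        // bicubic filtering turns isolated black micro-dots into grey gradients
        g.InterpolationMode =
            System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
        g.DrawImage(src, 0, 0, dst.Width, dst.Height);
    }
    dst.Save(Path.ChangeExtension(file.FileFullPath, ".png"), ImageFormat.Png);
}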

Stitching Together Thousands Of Bitmaps

I have a List as such:
List<Bitmap> imgList = new List<Bitmap>();
I need to scroll through that list very quickly and stitch all the images into one, like so:
using (Bitmap b = new Bitmap(imgList[0].Width, imgList[0].Height * imgList.Count, System.Drawing.Imaging.PixelFormat.Format24bppRgb))
{
    using (Graphics g = Graphics.FromImage(b))
    {
        for (int i = 0; i < imgList.Count; i++)
        {
            g.DrawImage(imgList[i], 0, i * imgList[i].Height);
        }
    }
    b.Save(fileName, ImageFormat.Bmp);
}
imgList.Clear();
The problem that I'm running into is that the images are 2000 wide by 2 high, and there could be 30,000-100,000 images in the list. When I try to make a blank bitmap that size, I get a "Parameter is not valid" error. Any help would be GREATLY appreciated.
The size of the block of memory you need is 2 x 100,000 rows x 2,000 pixels wide x # bytes per pixel, which is going to be either 1,200,000,000 bytes for 24 bits per pixel or 1,600,000,000 bytes for 32 bits per pixel. So in other words ~1.2 GB or ~1.6 GB of contiguous memory. A 32 bit address space is just too darn small to reliably hand out a single block that big, so sucks to be you.
Or does it?
Since you want to create a file from this, you should be concerned with the file format's limits rather than your own memory limits. Lucky for you, the integer type used for image dimensions in a BMP is 32-bit, which means that a 2,000 x 200,000 image is totally within those limits. So whether or not you can make that image in memory, you can make the file.
This involves writing your own version of a BMP encoder. It's not that bad - it's mostly writing a BMP file header and a DIB header (54 bytes total with the classic BITMAPINFOHEADER) and then raw pixel data. Of course, once done you'll be hard-pressed to find code that will open it, but that's someone else's problem, right?
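For illustration, a minimal sketch of such an encoder (all names are mine). It assumes 24bpp strips of equal width and streams them straight to disk, so the full image never exists in memory; the negative height marks the bitmap as top-down, which is valid for uncompressed DIBs though not every viewer handles it:
using System;
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Runtime.InteropServices;

static void WriteStitchedBmp(string fileName, IList<Bitmap> strips,
                             int width, int totalHeight)
{
    int rowBytes = width * 3;             // 24bpp
    int stride = (rowBytes + 3) & ~3;     // BMP rows are padded to 4 bytes
    long pixelBytes = (long)stride * totalHeight;

    using (var fs = new FileStream(fileName, FileMode.Create))
    using (var w = new BinaryWriter(fs))
    {
        // BITMAPFILEHEADER, 14 bytes
        w.Write((ushort)0x4D42);          // "BM"
        w.Write((uint)(54 + pixelBytes)); // file size (fits in uint here)
        w.Write((uint)0);                 // reserved
        w.Write((uint)54);                // offset to pixel data

        // BITMAPINFOHEADER, 40 bytes
        w.Write((uint)40);                // header size
        w.Write(width);
        w.Write(-totalHeight);            // negative = top-down row order
        w.Write((ushort)1);               // color planes
        w.Write((ushort)24);              // bits per pixel
        w.Write((uint)0);                 // BI_RGB, no compression
        w.Write((uint)pixelBytes);        // image data size
        w.Write(0); w.Write(0);           // x/y pixels per meter (unused)
        w.Write((uint)0); w.Write((uint)0); // palette / important colors

        byte[] row = new byte[stride];    // padding bytes stay zero
        foreach (Bitmap strip in strips)
        {
            BitmapData data = strip.LockBits(
                new Rectangle(0, 0, width, strip.Height),
                ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            try
            {
                for (int y = 0; y < strip.Height; y++)
                {
                    Marshal.Copy(IntPtr.Add(data.Scan0, y * data.Stride),
                                 row, 0, rowBytes);
                    w.Write(row, 0, stride);
                }
            }
            finally
            {
                strip.UnlockBits(data);
            }
        }
    }
}
With 2000-pixel-wide strips the stride works out to exactly 6000 bytes, so no padding is actually added, and peak memory stays at one strip plus one row buffer.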

File size increases after reading png file from disk and saving it back

I'm currently working on a small program to read PNG files from disk, do some modifications and save them back. Everything is running smoothly except for one small problem: after I save the file back to disk, its size always increases; for example, a 27.1 MB file will become 33.3 MB.
After some debugging I finally narrowed it down to my reading and saving code. This is the code I'm currently using:
Bitmap img = new Bitmap(<path to file>);
//omitted
img.Save(<path to new file>, ImageFormat.Png);
I've verified that whether or not I make any modifications, simply reading and saving the image causes its size to change. Furthermore, if I open the saved file with Paint and save from there, the file shrinks back to its original size.
How do I read and save the image without changing its size?
Apart from the color depth and how many channels (with/without alpha) are used, saved PNG file size depends mainly on two factors:
1. How the pre-processing of image lines (called filtering) is done.
2. The compression level for the deflate algorithm (0-9).
These two factors greatly affect the output file size. Filtering is empirical: you can use one of the filter types defined by the spec (None, Sub, Up, Average, Paeth) for all image lines, different types for different lines, or even adaptively try each type on every line and keep whichever compresses best. The adaptive way is the most time-consuming and impractical for most image writers.
After filtering, the image data is deflate-compressed. The compression level for the deflate algorithm usually ranges from 0 to 9, lowest to highest compression. The higher the level, the slower the compression process. Usually 4 works well for most images.
The filtering step plays an important, sometimes crucial, role in PNG compression. Different filter choices can produce large differences in saved image size. File size is, on the other hand, less sensitive to the compression level.
You can use tools like TweakPNG to check the color depth and number of channels the image contains. If the original and the re-saved image have the same color depth and channels, then most probably the filtering and compression level are the culprits for the increased file size.
The truth is, if the encoder is not optimized, more often than not the file size will increase. There are, however, a lot of PNG optimization tools out there if you don't mind post-processing your resulting images.
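As one illustration of that post-processing route, a sketch that shells out to the external optipng tool, which re-tries filter strategies and deflate settings on the saved file. It assumes optipng is installed and on PATH; the -o2 optimization level is an arbitrary choice:
using System.Diagnostics;

static void OptimizePng(string path)
{
    // re-runs filter selection and deflate compression in place
    var psi = new ProcessStartInfo("optipng", "-o2 \"" + path + "\"")
    {
        UseShellExecute = false,
        CreateNoWindow = true
    };
    using (Process p = Process.Start(psi))
    {
        p.WaitForExit();
    }
}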
Have you tried playing with the Encoder.ColorDepth field? PNG also supports transparency and might be saving some information not needed by your image.
ImageCodecInfo pngCodec = ImageCodecInfo.GetImageEncoders().Where(codec => codec.FormatID.Equals(ImageFormat.Png.Guid)).FirstOrDefault();
if (pngCodec != null)
{
    EncoderParameters parameters = new EncoderParameters();
    parameters.Param[0] = new EncoderParameter(Encoder.ColorDepth, 24L); // 8, 16, 24 or 32, based on your format
    image.Save(stream, pngCodec, parameters);
}
Additional info here: https://msdn.microsoft.com/en-us/library/system.drawing.imaging.encoder.colordepth(v=vs.110).aspx
I think you are missing the compression part.
Add to your code like this -
Bitmap img = new Bitmap(<path to file>);
here is what you missed -
ImageCodecInfo myImageCodecInfo = GetEncoderInfo("image/jpeg");
EncoderParameter myEncoderParameter = new EncoderParameter(Encoder.Quality, 25L);
EncoderParameters myEncoderParameters = new EncoderParameters(1);
myEncoderParameters.Param[0] = myEncoderParameter;
and save like this -
img.Save(<path to file>, myImageCodecInfo, myEncoderParameters);
Here is the MSDN link. Hope it helps.

How to decrease memory usage from multiple images?

My app downloads six images from here and plays them back in a loop. I download the images in GIF format, convert them to PNG using .NET Image Tools, and store each one as a BitmapImage in a List<BitmapImage>.
The code I use to add the downloaded image to the list of images is:
List<BitmapImage> images = new List<BitmapImage>();
//WebClient used for download
...
GifDecoder decoder = new GifDecoder();
ExtendedImage eim = new ExtendedImage();
decoder.Decode(eim, DOWNLOADEDIMAGESTREAM);
using (MemoryStream ms = new MemoryStream())
{
    PngEncoder encoder = new PngEncoder();
    encoder.Encode(eim, ms);
    ms.Flush();
    ms.Position = 0;
    BitmapImage bmp = new BitmapImage();
    bmp.SetSource(ms);
    ms.Close();
    images.Add(bmp);
}
e.Result.Dispose();
Each converted image is about 10-20 KB, at a size of 600 x 550 px. (The original GIFs are about 2/3 that size.)
After downloading the images, my memory usage is around 80 MB. Without downloading the images, the memory usage is around 50 MB. 30 MB seems like a lot of memory to use for storing six images with a total size of around 90 KB. In addition, it cuts my framerate down to about 5 or 6, which causes performance issues when the user zooms or moves my image. (I am not currently displaying the images, just storing them in memory. The image I am using to zoom and move is a test, and was included during both of my memory measurements.)
I also wanted to increase the size of the images downloaded, but the amount of memory they already use makes this unreasonable.
Forget about how big the compressed image is. Once you create a bitmap from it, it's going to be 600 x 550 x (3 or 4, probably) bytes per pixel - about 1.3 MB per image at 32bpp, or roughly 8 MB for all six. In memory they're stored as uncompressed bitmaps. That doesn't account for all 30 MB, but if you're really concerned about the details of your memory usage, use something like SciTech's .NET Memory Profiler (trial available here: http://memprofiler.com/) and you can find out for sure where the memory is being taken up.
I'm not affiliated with SciTech. I used the profiler a few times over the past decade (including a stretch of a few years where I used it regularly on a project). I've found it to be one of the more accurate methods of determining how memory is used in .NET. Otherwise I find it's a lot of guessing with frequently wrong assumptions.
From my viewpoint, there is a workaround on WP7: because a phone screen is small, we cannot display the whole image at full size anyway. We can still download the original file but, instead of keeping it at its original resolution, decode it scaled down to the phone screen's width and height. Just my two cents.
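A sketch of that idea using BitmapImage's DecodePixelWidth property; it exists in WPF and, as far as I know, was added to the phone's Silverlight in Windows Phone 7.1 - treat that availability as an assumption:
BitmapImage bmp = new BitmapImage();
// decode directly at screen width so the backing bitmap is allocated small;
// 480 is the typical WP7 portrait width (set before SetSource)
bmp.DecodePixelWidth = 480;
bmp.SetSource(ms);   // ms = the PNG MemoryStream from the question's code
images.Add(bmp);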

Creating a high quality ico file programmatically

I'm trying to create a high-quality icon (meaning: suitable for Windows Vista/7/8) from a PNG file programmatically in C#, for use as shortcut icons. Since the Bitmap.GetHicon() function doesn't support this kind of icon, and I want to avoid external dependencies or libraries, I'm currently using a slightly modified ICO writer I found here on SO.
I have working code but I'm experiencing some glitches in the way Windows displays these icons.
The relevant code is:
// ImageFile contains the path to PNG file
public static String IcoFromImageFile(String ImageFile) {
    //...
    Image iconfile = Image.FromFile(ImageFile);
    // Returns a correctly resized Bitmap
    Bitmap bm = ResizeImage(256, 256, iconfile);
    SaveAsIcon(bm, NewIconFile);
    return NewIconFile;
}
// From: https://stackoverflow.com/a/11448060/368354
public static void SaveAsIcon(Bitmap SourceBitmap, string FilePath) {
    FileStream FS = new FileStream(FilePath, FileMode.Create);
    // ICO header
    FS.WriteByte(0); FS.WriteByte(0);
    FS.WriteByte(1); FS.WriteByte(0);
    FS.WriteByte(1); FS.WriteByte(0);
    // Image size
    // Set to 0 for 256 px width/height
    FS.WriteByte(0);
    FS.WriteByte(0);
    // Palette
    FS.WriteByte(0);
    // Reserved
    FS.WriteByte(0);
    // Number of color planes
    FS.WriteByte(1); FS.WriteByte(0);
    // Bits per pixel
    FS.WriteByte(32); FS.WriteByte(0);
    // Data size, will be written after the data
    FS.WriteByte(0);
    FS.WriteByte(0);
    FS.WriteByte(0);
    FS.WriteByte(0);
    // Offset to image data, fixed at 22
    FS.WriteByte(22);
    FS.WriteByte(0);
    FS.WriteByte(0);
    FS.WriteByte(0);
    // Writing actual data
    SourceBitmap.Save(FS, System.Drawing.Imaging.ImageFormat.Png);
    // Getting data length (file length minus header)
    long Len = FS.Length - 22;
    // Write it in the correct place
    FS.Seek(14, SeekOrigin.Begin);
    FS.WriteByte((byte)Len);
    FS.WriteByte((byte)(Len >> 8));
    FS.Close();
}
This compiles and works, but with one problem: Windows displays the icon on the shortcut incorrectly. I assign it programmatically, but the problem occurs even if I do it manually (via File Properties, Change Icon). The icon is cut off (the image itself displays correctly). It depends on the image, but usually only around 20% of the actual icon is shown. If I open the file in an image viewer like XnView it displays completely and correctly, but MS Paint doesn't.
I made this screenshot, along with a correctly displayed icon for comparison
I suspect the error lies in the ICO saving method, but even after comparing the output to correctly displayed ICOs in a hex editor, the header appears to be written correctly; only the PNG image part seems different. Does anyone have an idea? I also welcome better, less hacky solutions.
Your ico file is set to save the length of the embedded bitmap with only 16-bit precision, but the PNG file is too large (larger than 65,535 bytes), so the length record overflows.
I.e. the following lines are incomplete:
// Write it in the correct place
FS.Seek(14, SeekOrigin.Begin);
FS.WriteByte((byte)Len);
FS.WriteByte((byte)(Len >> 8));
You could add these lines:
FS.WriteByte((byte)(Len >> 16));
FS.WriteByte((byte)(Len >> 24));
As a matter of cleanliness and performance, I'd generally avoid all those separate writes and just use the Write overload with the byte array parameter. Also, instead of the somewhat tricky save-to-file-then-seek, you might consider a save-to-MemoryStream, then a single Write for the header (which can now use the PNG's length in bytes) and a single write to copy the PNG data from the memory stream to the file.
Another point you really should address is disposing IDisposable resources. Even if you don't need to yet since you haven't encountered any problems, it will bite you someday and if you have even a fairly small codebase with all kind of undisposed disposables you'll have a very hard time finding the source of your leak and/or deadlock. In general: Never call Close unless you really can't avoid it - instead wrap your FileStream in a using block. Similarly, Image and Bitmap are disposable and allocate native resources, though at least you can't get any locking issues with those (AFAIK - but better to be safe than sorry).
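Putting those suggestions together, a minimal sketch (not the only way to do it): the PNG is rendered to a MemoryStream first so its real length is known, the 22-byte header is written in one call with all four length bytes, and every stream sits in a using block. The field offsets match the layout the question's code already uses:
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

public static void SaveAsIcon(Bitmap sourceBitmap, string filePath)
{
    using (var png = new MemoryStream())
    {
        // render the PNG first so its length is known up front
        sourceBitmap.Save(png, ImageFormat.Png);
        int len = (int)png.Length;

        byte[] header = new byte[22];
        header[2] = 1;                    // type: icon
        header[4] = 1;                    // image count
        // width/height bytes (offsets 6, 7) stay 0, meaning 256 px
        header[10] = 1;                   // color planes
        header[12] = 32;                  // bits per pixel
        header[14] = (byte)len;           // data size, all four bytes this time
        header[15] = (byte)(len >> 8);
        header[16] = (byte)(len >> 16);
        header[17] = (byte)(len >> 24);
        header[18] = 22;                  // offset to image data

        using (var fs = new FileStream(filePath, FileMode.Create))
        {
            fs.Write(header, 0, header.Length);
            png.Position = 0;
            png.CopyTo(fs);               // single copy of the PNG body
        }
    }
}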
