Setting options to images in Magick .NET collection - c#

I'm trying to reduce the file size of the GIF animations I'm exporting, and I've read up on how to do it. Another thread suggested reducing the quality, adding compression, and slightly blurring the picture, which is what I'm trying to do like so:
using (MagickImageCollection col = new MagickImageCollection(@"C:/PathToGif"))
{
    for (int i = 0; i < col.Count; i++)
    {
        col[i].Quality = 85;
        col[i].CompressionMethod = CompressionMethod.LZW;
        col[i].Strip();
    }

    col.Write(@"C:/Path/To/Output");
}
The code runs, but the settings seem to be ignored, while setting AnimationDelay the same way does work. I verify this by checking the quality and file size of the output, which are the same as when I don't apply any of the settings. Even setting the quality to 20 gives the same result.
I've also attempted to use QuantizeSettings, passing a value of 255 to the Colors property, which just seemed to lock my application up at 50% CPU. (I gave it about 5 minutes before forcefully closing the application.)
My application processes a GIF of about 950 kB and turns it into 5.3 MB, which is unacceptable. (Disclaimer: I add roughly 20 frames to the GIF and draw an overlay on it.)
Could someone with experience with the Magick.NET library tell me if I'm doing something wrong and point me in the right direction? I was unable to find a different way of applying these settings.

The GIF coder does not use the Quality setting, and the CompressionMethod will always be CompressionMethod.LZW. You should do the following if you want to optimize the output file:
using (MagickImageCollection col = new MagickImageCollection(@"C:/PathToGif"))
{
    col.Coalesce();

    AddOtherImages(col);

    col.Optimize();
    col.OptimizeTransparency();

    col.Write(@"C:/Path/To/Output");
}
Make sure you upgrade to the latest version; the Optimize/OptimizeTransparency methods were bugged in previous versions.
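If you also want to cap the palette (the QuantizeSettings attempt from the question), the collection has a Quantize method that remaps all frames to a single shared color map. A minimal sketch, assuming a recent Magick.NET where MagickImageCollection.Quantize(QuantizeSettings) is available; the paths and the color count are placeholders:
using (MagickImageCollection col = new MagickImageCollection(@"C:/PathToGif"))
{
    col.Coalesce();

    // Remap every frame to a shared 128-color palette; fewer colors
    // usually means a smaller GIF at the cost of some banding.
    QuantizeSettings quantizeSettings = new QuantizeSettings
    {
        Colors = 128
    };
    col.Quantize(quantizeSettings);

    col.Optimize();
    col.OptimizeTransparency();

    col.Write(@"C:/Path/To/Output");
}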

Related

Saving the Tiff Image file using JPEG compression is time consuming

I am using Aspose.Imaging 19.11.0.0 to manipulate TIFF images with JPEG compression.
With TIFF files of 10 MB+ (around 50 pages), it takes 30 to 40 minutes to rotate all the pages, and the application goes into a "not responding" state.
In my code, if a TIFF file has 50 pages, the client application iterates over the pages in a foreach loop and calls the corresponding rotate method on the server side for each page.
I know that one factor in the time consumed is sending each page individually instead of all pages at once, but when I debugged the code I found that tiffImage.Save(Stream, tiffOptions) also takes a long time for each page.
Below is the server-side code for rotating a page using JPEG compression.
RotatePageUsingAspose() is called once per selected page; for example, if I select only the 3rd page out of 50, it is called a single time with pageNumber = 3 and a rotation of 90 degrees.
Even in that case, rotating and saving the 3rd page takes almost a minute, which is far too slow.
Server side code for rotation:
private void RotatePageUsingAspose(int pageNo, RotationDegrees rotationDegree)
{
    float angleOfRotation = (float)rotationDegree;

    // Auto mode is flexible and efficient.
    Cache.CacheType = CacheType.Auto;

    // The default cache max value is 0, which means that there is no upper limit.
    Cache.MaxDiskSpaceForCache = 1073741824; // 1 gigabyte
    Cache.MaxMemoryForCache = 1073741824; // 1 gigabyte

    // Changing the following property will greatly affect performance.
    Cache.ExactReallocateOnly = false;

    TiffOptions tiffOptions = new TiffOptions(TiffExpectedFormat.TiffJpegRgb);

    // Set RGB color mode.
    tiffOptions.Photometric = TiffPhotometrics.Rgb;
    tiffOptions.BitsPerSample = new ushort[] { 8, 8, 8 };

    try
    {
        using (TiffImage tiffImage = (TiffImage)Image.Load(Stream))
        {
            TiffFrame selectedFrame = tiffImage.Frames[pageNo - 1];
            selectedFrame.Rotate(angleOfRotation);
            tiffImage.Save(Stream, tiffOptions);
        }
    }
    finally
    {
        tiffOptions.Dispose();
    }
}
I have raised the same question with the Aspose.Imaging team, but they have not provided a solution yet.
Please suggest improvements to the code above so the pages can be saved more efficiently, and if possible outline an approach for doing so.
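For illustration only, the batching idea described above (load the TIFF once, rotate every requested frame, save once, instead of a load/save round trip per page) could look roughly like the sketch below. It reuses only the Aspose.Imaging calls already shown in the question; the inputStream/outputStream parameters and the pageNumbers list are placeholder names, and whether this actually removes the bottleneck would need to be measured:
private void RotatePagesInOnePass(Stream inputStream, Stream outputStream,
    IEnumerable<int> pageNumbers, RotationDegrees rotationDegree)
{
    float angleOfRotation = (float)rotationDegree;

    TiffOptions tiffOptions = new TiffOptions(TiffExpectedFormat.TiffJpegRgb);
    tiffOptions.Photometric = TiffPhotometrics.Rgb;
    tiffOptions.BitsPerSample = new ushort[] { 8, 8, 8 };

    try
    {
        // Load the multi-page TIFF a single time.
        using (TiffImage tiffImage = (TiffImage)Image.Load(inputStream))
        {
            // Rotate every selected frame in memory.
            foreach (int pageNo in pageNumbers)
            {
                tiffImage.Frames[pageNo - 1].Rotate(angleOfRotation);
            }

            // Save the whole document once, instead of once per page.
            tiffImage.Save(outputStream, tiffOptions);
        }
    }
    finally
    {
        tiffOptions.Dispose();
    }
}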

Displaying Thumbnails of very high resolution images Fast with Minimal Delay

I need to show preview thumbnails of high-resolution images in a control for user selection. I currently use ImageListView to load the images.
This works fine for low- to medium-resolution images, but for very high-resolution images there is a noticeable delay. A sample image can be downloaded from https://drive.google.com/open?id=1Qgu_aVXBiMlbHluJFU4fBvmFC45-E81C
The image is around 5000x3000 pixels and about 12 MB. The issue can be reproduced by using 1000 copies of this image.
A screen capture of the issue is uploaded here:
https://giphy.com/gifs/ZEH3T3JTfN42OL3J1A
The images are loaded using a background worker
foreach (var f in filepaths)
{
    imageListView1.Items.Add(f);
}
1. To solve this I tried resizing the large images and adding the resized copy to ImageListView, but the resizing itself is time-consuming and thumbnail generation is slow.
Bitmap x = UpdatedResizeImage2(new Bitmap(f), new Size(1000, 1000));
string q = Path.GetTempPath() + Path.GetFileName(f);
x.Save(q);
x.Dispose();
imageListView1.Items.Add(q);
2. I have also tried the Image.CreateThumbnail method, but it is also quite slow.
Is there a better way to solve this issue?
I would suggest using an image processing library such as ImageMagick.
ImageMagick has optimized this feature, and Magick.NET is available as a NuGet package for .NET.
It is simple and straightforward:
var file = new FileInfo(@"c:\temp\input.jpg");
using (MagickImage image = new MagickImage(file))
{
    image.Thumbnail(new MagickGeometry(100, 100));
    image.Write(@"C:\temp\thumbnail.jpg");
}
Here is some documentation and references that might be useful:
https://imagemagick.org/Usage/thumbnails/#creation
http://www.imagemagick.org/Usage/thumbnails/
https://github.com/dlemstra/Magick.NET
https://www.smashingmagazine.com/2015/06/efficient-image-resizing-with-imagemagick/
https://devblogs.microsoft.com/dotnet/net-core-image-processing/
https://weblogs.asp.net/bleroy/resizing-images-from-the-server-using-wpf-wic-instead-of-gdi
Alternatives to System.Drawing for use with ASP.NET?
You could use WPF interop and the DecodePixelWidth/Height properties. They use the underlying Windows imaging layer ("Windows Imaging Component") to create an optimized thumbnail, saving a lot of memory (and possibly CPU): How to: Use a BitmapImage (XAML)
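As a rough sketch of that DecodePixelWidth approach (the file path and target width below are placeholders; this assumes a project referencing PresentationCore and WindowsBase):
// using System;
// using System.Windows.Media.Imaging;
var bmp = new BitmapImage();
bmp.BeginInit();
bmp.UriSource = new Uri(@"C:\temp\input.jpg");
// Decode straight to thumbnail width; the full-size bitmap is never materialized.
bmp.DecodePixelWidth = 256;
bmp.CacheOption = BitmapCacheOption.OnLoad;
bmp.EndInit();
bmp.Freeze(); // safe to hand over to the UI thread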
You can also use WPF/WIC directly from code, as in the following (adapted from the article "The fastest way to resize images from ASP.NET. And it's (more) supported-ish."). You just need to add references to PresentationCore and WindowsBase, which shouldn't be an issue for a desktop app.
// needs System.Windows.Media & System.Windows.Media.Imaging (PresentationCore & WindowsBase)
public static void SaveThumbnail(string absoluteFilePath, int thumbnailSize)
{
    if (absoluteFilePath == null)
        throw new ArgumentNullException(nameof(absoluteFilePath));

    var bitmap = BitmapDecoder.Create(new Uri(absoluteFilePath), BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.None).Frames[0];

    int width;
    int height;
    if (bitmap.Width > bitmap.Height)
    {
        width = thumbnailSize;
        height = (int)(bitmap.Height * thumbnailSize / bitmap.Width);
    }
    else
    {
        width = (int)(bitmap.Width * thumbnailSize / bitmap.Height);
        height = thumbnailSize;
    }

    var resized = BitmapFrame.Create(new TransformedBitmap(bitmap, new ScaleTransform(width / bitmap.Width * 96 / bitmap.DpiX, height / bitmap.Height * 96 / bitmap.DpiY, 0, 0)));
    var encoder = new PngBitmapEncoder();
    encoder.Frames.Add(resized);

    var thumbnailFilePath = Path.ChangeExtension(absoluteFilePath, thumbnailSize + Path.GetExtension(absoluteFilePath));
    using (var stream = File.OpenWrite(thumbnailFilePath))
    {
        encoder.Save(stream);
    }
}
Otherwise there are lots of tools out there, like MagicScaler, FreeImage, ImageSharp, ImageMagick, Imazen, etc. Most were written for ASP.NET/web server scenarios (for which WPF is officially not supported, but works; read the article) and are also cross-platform, which you don't seem to need. I'm not sure they're generally faster or use less memory than the built-in Windows technology, but you should test all this in your context.
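For example, MagicScaler (the PhotoSauce.MagicScaler NuGet package) makes a thumbnail roughly a one-liner. A sketch with placeholder paths, assuming the package's MagicImageProcessor.ProcessImage(string, Stream, ProcessImageSettings) overload:
// using System.IO;
// using PhotoSauce.MagicScaler;
var settings = new ProcessImageSettings { Width = 256 }; // height follows the aspect ratio
using (var output = File.OpenWrite(@"C:\temp\thumbnail.jpg"))
{
    MagicImageProcessor.ProcessImage(@"C:\temp\input.jpg", output, settings);
}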
PS: otherwise there's no magic bullet; bigger images take more time.
There's also NetVips, the C# binding for libvips.
It's quite a bit quicker than Magick.NET: between 3x and 10x faster, depending on the benchmark.
Thumbnailing is straightforward:
using NetVips;
var image = Image.Thumbnail("some-image.jpg", 128);
image.WriteToFile("x.jpg");
There's an introduction in the documentation.
Most of the answers here resize the bitmap and then save it. That is of course a bit slow, especially for very high resolution images.
Why not use the thumbnail already created by Windows Explorer? This is the fastest option of all (especially if you use smaller thumbnails).
// https://stackoverflow.com/a/1751610
using Microsoft.WindowsAPICodePack.Shell;

var shellFile = ShellFile.FromFilePath(pathToYourFile);
Bitmap image = shellFile.Thumbnail.LargeBitmap;
NuGet: https://www.nuget.org/packages/WindowsAPICodePack-Shell (around 600 KB)
Note: if the thumbnail isn't already cached, this is as slow as the other approaches.

Increase fps in record video ffmpeg

I need to record two videos from two cameras in full HD at 30 fps.
I use ffmpeg and the AForge wrapper for C#.
Device initialization:
_videoCaptureDevice = new VideoCaptureDevice(deviceName);
_videoCaptureDevice.VideoResolution = _videoCaptureDevice.VideoCapabilities[0];
_videoCaptureDevice.DesiredFrameRate = _fps;
_videoSourcePlayer.VideoSource = _videoCaptureDevice;
_videoCaptureDevice.NewFrame += _videoCaptureDevice_NewFrame;
_videoSourcePlayer.Start();
Saving frames:
if (_videoRecordStatus == VideoRecordStatus.Recording)
{
    _videoFileWriter.WriteVideoFrame(eventArgs.Frame);
}
And the file writer initialization:
_videoFileWriter = new VideoFileWriter();
_videoFileWriter.Open(_fileName, _videoCaptureDevice.VideoResolution.FrameSize.Width,
    _videoCaptureDevice.VideoResolution.FrameSize.Height, 30, VideoCodec.MPEG4, 10 * 1000 * 1000);
_videoCaptureDevice.VideoResolution.FrameSize is currently 1280x720 (and 640x480 for the second device), but I already have problems with recording. The maximum fps I get is 24 at 480p and 13-14 at 720p when I try to record from both cameras at the same time.
How can I increase it?
Or is it not possible? Maybe a more powerful computer would solve the problem (I have a Pentium(R) Dual-Core CPU at 2.50 GHz, an ordinary video card (GeForce 8500 GT) for driving two displays, an ordinary HDD, and USB 2.0)?
I would be glad of any help (maybe another library, but not another language; it has to be C#).
PS
I have already used Emgu.CV and ran into similar problems.
The framerate is limited by the hardware.
Use AMCap or GraphEdit to check what your camera really supports. It will depend on the chosen resolution and output format (higher resolution -> lower framerate).
Be aware that AForge always uses the highest value across all resolutions, which can lead to oversampling (e.g. AForge produces frames at 60 Hz, but the camera only supports 15 Hz at the given resolution, so the images will mostly be duplicates). See here
Also use Process Explorer and a profiler to see how busy your CPU really is and what it is doing.
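To check from code what the camera reports (rather than relying on VideoCapabilities[0]), you can enumerate the capabilities and pick a mode explicitly. A sketch, assuming AForge.Video.DirectShow; note that the framerate field on VideoCapabilities is AverageFrameRate in newer AForge builds and FrameRate in older ones, so adjust to your version:
// using AForge.Video.DirectShow;
// using System.Linq;
var device = new VideoCaptureDevice(deviceName);

// List every resolution/framerate combination the driver reports.
foreach (var cap in device.VideoCapabilities)
{
    Console.WriteLine(cap.FrameSize.Width + "x" + cap.FrameSize.Height +
        " @ " + cap.AverageFrameRate + " fps");
}

// Pick the first mode that actually supports the wanted framerate,
// instead of blindly taking VideoCapabilities[0].
var wanted = device.VideoCapabilities.FirstOrDefault(c => c.AverageFrameRate >= 30);
if (wanted != null)
{
    device.VideoResolution = wanted;
}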

Resize page with ABCPdf before rendering (huge images in the pdf)

I have a problem with ABCPdf when I try to convert a PDF file into separate image files as a fallback for old browsers.
I have working code that renders the page correctly and resizes the rendering to the wanted size. My problem occurs when the PDF page is huge, 7681 px wide by 10978 px high. It nearly kills my development machine, and the deployment machine cannot even chew the file.
I normally render the page 1-to-1 with the PDF page and then use other algorithms to resize the resulting image. This is not efficient, since ABCPdf takes a lot of power to output the image at that size.
I have the following code:
private byte[] GeneratePng(Doc pdfDoc, int dpi)
{
    var useDpi = dpi;
    pdfDoc.Rendering.DotsPerInch = useDpi;
    pdfDoc.Rendering.SaveQuality = 100;
    pdfDoc.Rect.String = pdfDoc.CropBox.String;
    pdfDoc.Rendering.ResizeImages = true;

    int attemptCount = 0;
    for (;;)
    {
        try
        {
            return pdfDoc.Rendering.GetData("defineFileTypeDummyString.png");
        }
        catch
        {
            if (++attemptCount == 3) throw;
        }
    }
}
I have tried the following solutions:
Resizing the page:
pdfDoc.SetInfo(pdfDoc.Page, "/MediaBox:Rect", "0 0 200 300");
Resizing the page and outputting it doesn't seem to make any difference at all.
Resizing the images before rendering:
foreach (IndirectObject io in pdfDoc.ObjectSoup)
{
    if (io is PixMap)
    {
        PixMap pm = (PixMap)io;
        pm.Realize(); // eliminate indexed color images
        pm.Resize(pm.Width / 4, pm.Height / 4);
    }
}
This didn't do anything either and still resulted in a long load time.
Running the reduce-size operation before rendering:
using (ReduceSizeOperation op = new ReduceSizeOperation(pdfDoc))
    op.Compact(true);
This didn't do anything either; it went directly to rendering and took a long time.
Can anyone help me here? Maybe point me to some ABCPdf resizing algorithm or something.
OK, so I talked to customer support at ABCPdf and they gave me the following:
doc1.Read(originalPDF);
// Specify size of output page. (This example scales the page, maintaining the aspect ratio,
// but you could set the MediaBox Height and Width to any desired value.)
doc2.MediaBox.Height = doc1.MediaBox.Height / 8;
doc2.MediaBox.Width = doc1.MediaBox.Width / 8;
doc2.Rect.SetRect(doc2.MediaBox);
doc2.Page = doc2.AddPage();
// Create the output image
doc2.AddImageDoc(doc1, 1, null);
doc2.Rendering.Save(savePath);
This is supposed to be used with single-page PDFs, so if you have a PDF full of large pictures, you should chop it up first, which you can do following my other Q/A: Chop PDFs into single pages.
The rendering algorithm used in the above code is auto-detected by ABCPdf and you cannot control it yourself (and they told me I didn't want to), so I put my faith in their code. I did a test and the quality looks quite similar to InterpolationMode.HighQualityBicubic, differing only when zoomed in, so I wouldn't be too concerned about it either.
In the end, the above code gave me a speed boost of about 10x compared to rendering and then resizing, so it is really worth something if you do this operation a lot.

ImageMagick.NET PDF to JPG conversion - insufficient memory

I'm using ImageMagick.NET to convert PDFs to JPGs. Here's my code:
MagickReadSettings settings = new MagickReadSettings();
settings.Density = new MagickGeometry(300, 300);

using (MagickImageCollection images = new MagickImageCollection())
{
    images.Read(pdfFilePathString, settings);

    MagickImage image = images.AppendVertically();
    image.Format = MagickFormat.Jpg;

    //image.Quality = 70;
    //if (image.Width > 1024)
    //{
    //    int heightRatio = Convert.ToInt32(Math.Round((decimal)(image.Height / (image.Width / 1024)), 0));
    //    image.Resize(1024, heightRatio);
    //}

    image.Write(tempFilePathString);
    image.Dispose();
}
The problem is, I keep getting insufficient-memory exceptions, which occur on image.Write(). It is clearly related to file size: a small PDF works, but a multi-page PDF does not. The particular file I'm testing with is a 12-page text PDF. I can get it to work if I set the density low, for example (100, 100), but then the quality is terrible.
The commented-out lines were other solutions I was trying to implement, but with those enabled the code runs for a long time (several minutes) without finishing, at least as far as my patience is concerned. One reduces the quality and the other reduces the image size. The PDFs always come out very large, much larger than necessary.
If I could reduce the image size and/or quality before the file is written, that would be great, or at least produce an image of decent quality without running into memory issues. It doesn't seem like it should be hitting memory limits here, as the file size isn't enormous, although it is probably still bigger than desired for an image. The 12-page PDF, when I could get it to render, came out at around 6-7 MB.
I'm using 32-bit ImageMagick. I wonder if 64-bit would solve the issue, but there have been problems getting that version to run in the local environment, which is another issue entirely.
Does anybody have any thoughts on anything else I can try?
Thanks
