I am trying to compress image quality/size (to a JPEG quality of around 5-30) with the ImageSharp.Web library, and I can't really understand how to do that or what I am missing here.
Can I reuse the same memory stream / IFormFile object to save the mutated image, or do I need to create a new Image from the current image object?
To work with a MemoryStream, do I also need to use a specific JpegDecoder()?
I am not sure whether this line is correct: item.SaveAsJpeg(memoryStream);
Maybe someone can help me out with the logic; any tips or tricks would be really helpful. Thanks!
Simple code example:
private byte[] ConvertImageToByteArray(IFormFile inputImage)
{
    byte[] result = null;
    // file stream from the uploaded IFormFile
    using (var fileStream = inputImage.OpenReadStream())
    // memory stream
    using (var memoryStream = new MemoryStream())
    {
        fileStream.CopyTo(memoryStream);
        memoryStream.Position = 0; // The position needs to be reset.
        var before = memoryStream.Length;
        using (var item = Image.Load(memoryStream)) // do I need to use JpegDecoder here?
        {
            var beforeMutations = item.Size();
            // dummy resize options
            int width = 50;
            int height = 100;
            IResampler sampler = KnownResamplers.Lanczos3;
            bool compand = true;
            ResizeMode mode = ResizeMode.Stretch;
            // init resize object
            var resizeOptions = new ResizeOptions
            {
                Size = new Size(width, height),
                Sampler = sampler,
                Compand = compand,
                Mode = mode
            };
            // mutate image
            item.Mutate(x => x
                .Resize(resizeOptions)
                .Rotate(35));
            var afterMutations = item.Size();
            // try to save the mutated image back to the memory stream / overwrite
            // this is not overwriting the memory stream
            item.SaveAsJpeg(memoryStream);
            // prepare result as byte[]
            result = memoryStream.ToArray();
        }
        var after = fileStream.Length; // kind of not needed.
    }
    return result;
}
I know this post is a bit old, and I'm sure you have fixed your issue by now, but hopefully it'll help someone else out in the future.
Can I reuse the same memory stream / IFormFile object to save the mutated image, or do I need to create a new Image from the current image object?
You have two streams going on here. You can use the one from the IFormFile to load the image and do all of your manipulation, and then use your MemoryStream to save to.
To work with a MemoryStream, do I also need to use a specific JpegDecoder()?
You can save your image to the MemoryStream with .SaveAsJpeg(memoryStream); no specific decoder is needed, since Image.Load detects the format for you.
Not sure whether this line is correct: item.SaveAsJpeg(memoryStream);
You are on the right track.
Maybe someone can help me out with the logic; any tips or tricks would be really helpful. Thanks!
Here is my rewrite based on what you have. I'm sure some things can be simplified; I tried to keep it close to the OP's format. Hope it helps:
private byte[] ConvertImageToByteArray(IFormFile inputImage)
{
    byte[] result = null;
    // memory stream to hold the encoded result
    using (var memoryStream = new MemoryStream())
    // load the image straight from the IFormFile stream
    using (var image = Image.Load(inputImage.OpenReadStream()))
    {
        //var before = memoryStream.Length; Removed this, assuming you were using it for debugging?
        var beforeMutations = image.Size();
        // dummy resize options
        int width = 50;
        int height = 100;
        IResampler sampler = KnownResamplers.Lanczos3;
        bool compand = true;
        ResizeMode mode = ResizeMode.Stretch;
        // init resize object
        var resizeOptions = new ResizeOptions
        {
            Size = new Size(width, height),
            Sampler = sampler,
            Compand = compand,
            Mode = mode
        };
        // mutate image
        image.Mutate(x => x
            .Resize(resizeOptions)
            .Rotate(35));
        var afterMutations = image.Size();
        // encode here for quality
        var encoder = new JpegEncoder()
        {
            Quality = 30 // use a variable to set between 5-30 based on your requirements
        };
        // this saves to the memoryStream with the encoder
        image.Save(memoryStream, encoder);
        memoryStream.Position = 0; // The position needs to be reset.
        // prepare result as byte[]
        result = memoryStream.ToArray();
        var after = memoryStream.Length; // kind of not needed.
    }
    return result;
}
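For completeness, here is a minimal, hypothetical sketch of how the helper above might be called from an ASP.NET Core controller action; the Upload action name and the choice to return the bytes as a file are assumptions, not part of the original post:

[HttpPost]
public IActionResult Upload(IFormFile inputImage)
{
    if (inputImage == null || inputImage.Length == 0)
        return BadRequest("No image was uploaded.");

    // Re-encode the upload as a smaller JPEG using the helper above.
    byte[] compressed = ConvertImageToByteArray(inputImage);

    // Return the compressed bytes; you could just as well persist them instead.
    return File(compressed, "image/jpeg");
}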
Hi, I'm getting the following exception, 'System.Runtime.InteropServices.ExternalException at System.Drawing.Image.Save', when trying to save a JPEG file to a memory stream.
images = new List<Bitmap>(seriesDataList.Count());
foreach (var seriesData in seriesDataList)
{
    using (chart = new Chart { Width = ChartWidth, Height = ChartHeight })
    {
        chart.ChartAreas.Add(new ChartArea(seriesData.Key));
        var series = new Series(seriesData.Key)
        {
            ChartType = SeriesChartType.Doughnut,
            Font = ChartTextFont,
            LegendText = "#VALX (#PERCENT)",
        };
        BindSeriesData(seriesData.ToList(), series);
        series["PieLabelStyle"] = seriesData.Count() <= MaxValuesForShowingLabels ? "Outside" : "Disabled";
        series["PieLineColor"] = "Black";
        chart.Series.Add(series);
        var legend = new Legend(seriesData.Key)
        {
            Enabled = true,
            Font = ChartTextFont,
            Docking = Docking.Right,
            Title = seriesData.Key,
            TitleFont = ChartTitleFont,
            TitleAlignment = StringAlignment.Near
        };
        chart.Legends.Add(legend);
        using (var memoryStream = new MemoryStream())
        {
            chart.SaveImage(memoryStream, ChartImageFormat.Jpeg); // Working fine
            images.Add(new Bitmap(memoryStream));
        }
    }
}
using (var fullImage = MergeImages(images))
{
    using (var memoryStream = new MemoryStream())
    {
        fullImage.Save(memoryStream, System.Drawing.Imaging.ImageFormat.Jpeg); // ERROR HERE
        return Convert.ToBase64String(memoryStream.ToArray());
    }
}
In other cases, where the full image is not built from many images, the flow works. It fails at a low rate, and only for this image type.
I saw another related question that suggested wrapping the memoryStream in a 'using' block, which I'm already doing.
Any suggestions?
As mentioned by Klaus Gütter in the comments, you are not allowed to dispose the stream before the bitmap.
One option to fix this would be to keep a clone of the bitmap; that ensures all the pixel data is copied to a separate buffer, allowing the original stream to be freed.
using var bmp = new Bitmap(memoryStream);
images.Add(bmp.Clone(new Rectangle(0, 0, bmp.Width, bmp.Height), PixelFormat.Format32bppArgb));
Another option could be to just not dispose the memory stream. Since a MemoryStream just represents a chunk of managed memory, it is safe to rely on the garbage collector for cleanup. A good rule of thumb is to always dispose disposable objects, but this is less important when you know the object only owns managed resources, like memory.
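A minimal sketch of that second option, keeping the question's chart-saving loop but dropping the using block around the stream (the stream then lives as long as the bitmap that wraps it):

// No 'using' on the stream on purpose: the Bitmap keeps reading from it,
// so it must stay alive until the bitmap itself is disposed.
var memoryStream = new MemoryStream();
chart.SaveImage(memoryStream, ChartImageFormat.Jpeg);
memoryStream.Position = 0;
images.Add(new Bitmap(memoryStream)); // the GC reclaims the stream later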
I have 27,000 images stored in a folder that need to be added to the database using Entity Framework. Here is my code:
var files = Directory.GetFiles(path, "*", SearchOption.AllDirectories);
foreach (var file in files)
{
    using (ApplicationContext db = new ApplicationContext())
    {
        Image img = Image.FromFile(file);
        var imgRes = ResizeImage(img, ImageSettings.Width, ImageSettings.Height);
        MemoryStream memoryStream = new MemoryStream();
        img.Save(memoryStream, ImageFormat.Png);
        var label = Directory.GetParent(file).Name;
        var bytes = memoryStream.ToArray();
        memoryStream.Close();
        db.Add(new ImageData { Image = bytes, Label = label });
        img.Dispose();
        memoryStream.Dispose();
        imgRes.Dispose();
    }
}
It only works when there are fewer than 10,000 images; otherwise I get an OutOfMemoryException.
How can I upload all 27,000 images to the database?
First of all, this code doesn't deal with entities or objects, so using an ORM doesn't help at all. That doesn't cause the OOM, though; it only makes the code a lot slower.
The real problem is that a MemoryStream is just a wrapper around a byte buffer. Once the buffer is full, a new one twice the size is allocated, the original data is copied over, and the old buffer is discarded. Growing a 50 MB buffer this way takes roughly log2(50M) ≈ 26 reallocations. This fragments the free memory to the point where the runtime can no longer allocate a large enough contiguous buffer, which causes OOMs with List&lt;T&gt; objects too, not just MemoryStreams.
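As a small illustration of that growth pattern (the exact capacity values come from the default .NET implementation and may differ across runtimes):

var ms = new MemoryStream();              // Capacity starts at 0
for (int i = 0; i < 1_000_000; i++)
    ms.WriteByte(0);                      // buffer doubles as it fills: 256, 512, 1024, ...
Console.WriteLine(ms.Capacity);           // prints 1048576 on the default implementation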
The quick fix would be to pass the expected size as the stream's capacity through the MemoryStream(Int32) constructor. This cuts down on reallocations and saves a lot of CPU cycles. The number doesn't have to be exact, just large enough to avoid too many reallocations:
using (Image img = Image.FromFile(file))
using (var imgRes = ResizeImage(img, ImageSettings.Width, ImageSettings.Height))
using (var memoryStream = new MemoryStream(10_000_000))
{
    img.Save(memoryStream, ImageFormat.Png);
    var label = Directory.GetParent(file).Name;
    var bytes = memoryStream.ToArray();
    db.Add(new ImageData { Image = bytes, Label = label });
}
There's no need to close a MemoryStream; it's just a wrapper over an array. This still allocates a big buffer for each file, though.
If we know the maximum file size, we can allocate a single buffer and reuse it in all iterations. In this case the size matters, as it's no longer possible to resize the buffer:
var buffer = new byte[100_000_000]; // allocated once, outside the loop

using (Image img = Image.FromFile(file))
using (var imgRes = ResizeImage(img, ImageSettings.Width, ImageSettings.Height))
using (var memoryStream = new MemoryStream(buffer))
{
    img.Save(memoryStream, ImageFormat.Png);
    var label = Directory.GetParent(file).Name;
    // ToArray() would copy the entire fixed buffer, so copy only the bytes actually written.
    var bytes = new byte[(int)memoryStream.Position];
    Array.Copy(buffer, bytes, bytes.Length);
    db.Add(new ImageData { Image = bytes, Label = label });
}
Try to avoid creating the ApplicationContext inside the foreach loop:
using (ApplicationContext db = new ApplicationContext())
{
    foreach (var file in files)
    {
        using (var memoryStream = new MemoryStream())
        using (Image img = Image.FromFile(file))
        using (var imgRes = ResizeImage(img, ImageSettings.Width, ImageSettings.Height))
        {
            img.Save(memoryStream, ImageFormat.Png);
            var label = Directory.GetParent(file).Name;
            var bytes = memoryStream.ToArray();
            db.Add(new ImageData { Image = bytes, Label = label });
        }
    }
}
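One more thing worth noting here: with a single long-lived context, the change tracker keeps every added entity in memory, so saving and clearing in batches is a common way to keep memory flat. A rough sketch, assuming an EF Core DbContext (the batch size and the placeholder comments are illustrative only):

const int batchSize = 500;
var pending = 0;

using (var db = new ApplicationContext())
{
    foreach (var file in files)
    {
        // ... build the ImageData entity exactly as in the snippet above ...
        // db.Add(new ImageData { Image = bytes, Label = label });

        if (++pending % batchSize == 0)
        {
            db.SaveChanges();
            db.ChangeTracker.Clear(); // EF Core 5+; detaches tracked entities
        }
    }
    db.SaveChanges(); // flush the final partial batch
}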
I am trying to compress image quality/size (to a quality of around 5-30) with the Magick.NET library, and I can't really understand how to use the ImageOptimizer class and call the LosslessCompress() method with a stream.
Do I need to use a FileStream or a MemoryStream?
Do I need to save/create a temp file on the server for each image and then proceed with the compression flow? (What about performance?)
Anything else?
Simple code example:
private byte[] ConvertImageToByteArray(IFormFile image)
{
    byte[] result = null;
    // file stream
    using (var fileStream = image.OpenReadStream())
    // memory stream
    using (var memoryStream = new MemoryStream())
    {
        var before = fileStream.Length;
        ImageOptimizer optimizer = new ImageOptimizer();
        optimizer.LosslessCompress(fileStream); // what & how can I pass a stream here?
        var after = fileStream.Length;
        // convert to byte[]
        fileStream.CopyTo(memoryStream);
        result = memoryStream.ToArray();
    }
    return result;
}
You cannot use the fileStream because the stream needs to be both readable and writable. If you first copy the data to a MemoryStream, you can then compress the image in that stream. Your code should be changed to this:
private byte[] ConvertImageToByteArray(IFormFile image)
{
    byte[] result = null;
    // file stream
    using (var fileStream = image.OpenReadStream())
    // memory stream
    using (var memoryStream = new MemoryStream())
    {
        fileStream.CopyTo(memoryStream);
        memoryStream.Position = 0; // The position needs to be reset.
        var before = memoryStream.Length;
        ImageOptimizer optimizer = new ImageOptimizer();
        optimizer.LosslessCompress(memoryStream);
        var after = memoryStream.Length;
        // convert to byte[]
        result = memoryStream.ToArray();
    }
    return result;
}
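Since the question mentions quality targets around 5-30, it may also be worth noting that ImageOptimizer has a lossy Compress(Stream) counterpart (assuming a reasonably recent Magick.NET version; check yours). Swapping it in follows the same pattern:

// Same flow as above, but with lossy optimization for bigger size savings.
var optimizer = new ImageOptimizer();
memoryStream.Position = 0;          // reset before the optimizer reads the stream
optimizer.Compress(memoryStream);   // lossy counterpart of LosslessCompress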
I've searched here for help with this, but nothing quite matches what I need. I have an image that gets uploaded, and I'd like to change its size before it gets saved to Azure.
So currently my code is:
public ActionResult UserDetails(HttpPostedFileBase photo)
{
    var inputFile = new Photo()
    {
        FileName = photo.FileName,
        Data = () => photo.InputStream
    };
    // then I save to Azure
How would I resize the photo.InputStream to 100 x 100 px, for example?
Here is how I do it:
byte[] imageBytes;
// Of course, imageBytes is set to the byte array of your image
using (MemoryStream ms = new MemoryStream(imageBytes, 0, imageBytes.Length))
{
    using (Image img = Image.FromStream(ms))
    {
        int h = 100;
        int w = 100;
        using (Bitmap b = new Bitmap(img, new Size(w, h)))
        {
            using (MemoryStream ms2 = new MemoryStream())
            {
                b.Save(ms2, System.Drawing.Imaging.ImageFormat.Jpeg);
                imageBytes = ms2.ToArray();
            }
        }
    }
}
From there, I use a MemoryStream to upload. I use blob storage and call UploadFromStreamAsync to upload to the blob.
This is a basic view of it.
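A minimal sketch of that upload step, assuming the classic WindowsAzure.Storage client inside an async method (the container reference and blob name are placeholders):

// Wrap the resized bytes in a stream and push them to blob storage.
using (var uploadStream = new MemoryStream(imageBytes))
{
    CloudBlockBlob blob = container.GetBlockBlobReference("photos/resized.jpg");
    blob.Properties.ContentType = "image/jpeg";
    await blob.UploadFromStreamAsync(uploadStream);
}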
I have a byte[] for an image, which was read directly from the image itself, and I am trying to convert this byte[] into a Bitmap object.
I am using this code:
var provider = new MultipartMemoryStreamProvider();
var multipart = Request.Content.ReadAsMultipartAsync(provider).ContinueWith(t =>
{
    foreach (var item in provider.Contents)
    {
        var filename = item.Headers.ContentDisposition.FileName.Trim('\"');
        var buffer = item.ReadAsByteArrayAsync();
        MemoryStream mss = new MemoryStream(buffer.Result);
        Bitmap bmpImage = (Bitmap)Image.FromStream(mss);
        // bmpImage.GetPixel(10,10) returns ARGB values of 255,255,255,255
    }
});
However, when I call bmpImage.GetPixel(10,10), the ARGB values are 255,255,255,255. This makes no sense to me. Does anyone have any ideas why this conversion could be causing a loss of my pixel information?
The above code is wrapped in the ApiController Post() method:
public async Task<IHttpActionResult> Post()
Replacing:
var buffer = item.ReadAsByteArrayAsync();
MemoryStream mss = new MemoryStream(buffer.Result);
Bitmap bmpImage = (Bitmap)Image.FromStream(mss);
with:
Stream stream = item.ReadAsStreamAsync().Result;
Bitmap bmpImage = (Bitmap)Image.FromStream(stream);
did the trick. I don't initially see why the second works and the first doesn't, but the ReadAsStreamAsync() call works and the ReadAsByteArrayAsync() call doesn't.