I have a console application written in C# on top of .NET Core 2.2.
I want to create an asynchronous Task that writes a full-size image to storage. Additionally, the process needs to create a thumbnail and write it to the default storage as well.
Following is the method that implements this logic. I documented each line to explain what I believe is happening:
// This method accepts FileObject and returns a task
// The generated task will write the file as is to the default storage
// Then it'll create a thumbnail of that image and store it in the default storage
public async Task ProcessImage(FileObject file, int thumbnailWidth = 250)
{
// The name of the full-size image
string filename = string.Format("{0}{1}", file.ObjectId, file.Extension);
// The name along with the exact path to the full-size image
string path = Path.Combine(file.ContentId, filename);
// Write the full-size image to the storage
await Storage.CreateAsync(file.Content, path)
.ContinueWith(task =>
{
// Reset the stream to the beginning since this will be the second time the stream is read
file.Content.Seek(0, SeekOrigin.Begin);
// Create original Image
Image image = Image.FromStream(file.Content);
// Calculate the height of the new thumbnail
int height = (thumbnailWidth * image.Height) / image.Width;
// Create the new thumbnail
Image thumb = image.GetThumbnailImage(thumbnailWidth, height, null, IntPtr.Zero);
using (MemoryStream thumbnailStream = new MemoryStream())
{
// Save the thumbnail to the memory stream
thumb.Save(thumbnailStream, image.RawFormat);
// The name of the new thumbnail
string thumbnailFilename = string.Format("thumbnail_{0}", filename);
// The name along with the exact path to the thumbnail
string thumbnailPath = Path.Combine(file.ContentId, thumbnailFilename);
// Write the thumbnail to storage
Storage.CreateAsync(thumbnailStream, thumbnailPath);
}
// Dispose the file object to ensure the Stream is disposed
file.Dispose();
image.Dispose();
thumb.Dispose();
});
}
Here is my FileObject class, if needed:
public class FileObject : IDisposable
{
public string ContentId { get; set; }
public string ObjectId { get; set; }
public ContentType ContentType { get; set; }
public string Extension { get; set; }
public Stream Content { get; set; }
private bool IsDisposed;
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
if (IsDisposed)
return;
if (disposing && Content != null)
{
Content.Close();
Content.Dispose();
}
IsDisposed = true;
}
}
The above code writes the correct full-size image to the storage drive. It also writes the thumbnail to storage. However, the thumbnail is always corrupted. In other words, the generated thumbnail file is always written with 0 bytes.
How can I correctly create my thumbnail from file.Content stream after writing the same stream to the storage?
I figured out the cause of the issue. The line thumb.Save(thumbnailStream, image.RawFormat); leaves thumbnailStream positioned at the end, so when it is written to storage nothing gets written.
The fix was to reset the stream position after writing to it, like this:
using (MemoryStream thumbnailStream = new MemoryStream())
{
// Save the thumbnail to the memory stream
thumb.Save(thumbnailStream, image.RawFormat);
// Reset the seek position to the beginning
thumbnailStream.Seek(0, SeekOrigin.Begin);
// The name of the new thumbnail
string thumbnailFilename = string.Format("thumbnail_{0}", filename);
// The name along with the exact path to the thumbnail
string thumbnailPath = Path.Combine(file.ContentId, thumbnailFilename);
// Write the thumbnail to storage
Storage.CreateAsync(thumbnailStream, thumbnailPath);
}
I am not sure what benefit is gained by thumb.Save(...) not resetting the position to 0 after writing into a new stream. I feel it should do that, since it always writes a new stream rather than appending to an existing one.
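For completeness, here is a rough sketch of how the whole method could be restructured to avoid mixing await with ContinueWith: both writes are awaited and each stream is rewound before it is read a second time. This assumes the same Storage.CreateAsync(Stream, string) helper and FileObject shown above and has not been fine-tuned:

public async Task ProcessImage(FileObject file, int thumbnailWidth = 250)
{
    string filename = string.Format("{0}{1}", file.ObjectId, file.Extension);
    string path = Path.Combine(file.ContentId, filename);

    // First write: reads file.Content to its end.
    await Storage.CreateAsync(file.Content, path);

    // Rewind before reading the same stream a second time.
    file.Content.Seek(0, SeekOrigin.Begin);

    using (Image image = Image.FromStream(file.Content))
    using (MemoryStream thumbnailStream = new MemoryStream())
    {
        int height = (thumbnailWidth * image.Height) / image.Width;
        using (Image thumb = image.GetThumbnailImage(thumbnailWidth, height, null, IntPtr.Zero))
        {
            thumb.Save(thumbnailStream, image.RawFormat);
        }

        // Save() leaves the position at the end of the written data,
        // so rewind before the second storage write as well.
        thumbnailStream.Seek(0, SeekOrigin.Begin);

        string thumbnailPath = Path.Combine(file.ContentId, string.Format("thumbnail_{0}", filename));
        await Storage.CreateAsync(thumbnailStream, thumbnailPath);
    }

    // Dispose the file object to release the content stream.
    file.Dispose();
}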
Related
What is the correct way to get thumbnails of images in C#? There must be some built-in system method for that, but I seem to be unable to find it anywhere.
Right now I'm using a workaround, but it seems to be much heavier on the computing side: generating the thumbnails for 50 images with parallel processing takes about 1 to 1.5 seconds, and during that time my CPU is 100% loaded. Not to mention that it builds up quite a bit of garbage, which later needs to be collected.
This is what my class currently looks like:
public class ImageData
{
public const int THUMBNAIL_SIZE = 160;
public string path;
private Image _thumbnail;
public string imageName { get { return Path.GetFileNameWithoutExtension(path); } }
public string folder { get { return Path.GetDirectoryName(path); } }
public Image image { get
{
try
{
using (FileStream stream = new FileStream(path, FileMode.Open, FileAccess.Read))
using (BinaryReader reader = new BinaryReader(stream))
{
var memoryStream = new MemoryStream(reader.ReadBytes((int)stream.Length));
return new Bitmap(memoryStream);
}
}
catch (Exception e) { }
return null;
}
}
public Image thumbnail
{
get
{
if (_thumbnail == null)
LoadThumbnail();
return _thumbnail;
}
}
public void LoadThumbnail()
{
if (_thumbnail != null) return;
Image img = image;
if (img == null) return;
float ratio = (float)image.Width / (float)image.Height;
int h = THUMBNAIL_SIZE;
int w = THUMBNAIL_SIZE;
if (ratio > 1)
h = (int)(THUMBNAIL_SIZE / ratio);
else
w = (int)(THUMBNAIL_SIZE * ratio);
_thumbnail = new Bitmap(image, w, h);
}
}
I am saving the thumbnail once it is generated, to save some computing time later on. Meanwhile, I have an array of 50 picture boxes that I inject the thumbnails into.
Anyway... when I open a folder containing images, my PC certainly doesn't use 100% CPU for the thumbnails, so I am wondering what the correct way to generate them is.
Windows pre-generates the thumbnails and stores them in the hidden thumbs.db file for later use.
So unless you either access the thumbs.db file (and are fine with relying on it being available) or cache the thumbnails yourself somewhere, you will always have to render them in one way or another.
That being said, you can probably rely on whatever UI framework you are using to display the images scaled down, seeing as you load them into memory anyway.
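If you do keep rendering them yourself, here is a rough sketch with plain System.Drawing that decodes each file exactly once and draws a scaled copy with a cheaper interpolation mode. It is still CPU-bound, just with less redundant work than re-reading the image on every access; the method name and size parameter are placeholders:

private static Bitmap CreateThumbnail(string path, int maxSize)
{
    // Decode the source image exactly once.
    using (var source = new Bitmap(path))
    {
        float ratio = (float)source.Width / source.Height;
        int w = ratio > 1 ? maxSize : (int)(maxSize * ratio);
        int h = ratio > 1 ? (int)(maxSize / ratio) : maxSize;

        var thumb = new Bitmap(w, h);
        using (var g = Graphics.FromImage(thumb))
        {
            // Trade quality for speed here if needed: HighQualityBicubic
            // looks best, Bilinear is cheaper.
            g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.Bilinear;
            g.DrawImage(source, new Rectangle(0, 0, w, h));
        }
        return thumb;
    }
}

Cache the returned Bitmap (as you already do with _thumbnail) so each file is only rendered once.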
I'm using the ImageSharp library to rescale my images before they are uploaded to Azure. The application hangs with no errors when it reaches the UploadBlob call, and I think the stream is causing it. When the image is uploaded, the information is gathered from the image stream: I create an empty MemoryStream, resize the image using ImageSharp, write the newly scaled image into the MemoryStream, and try to upload that MemoryStream to Azure. I don't think Azure likes it, as that is where it hangs.
Is MemoryStream the correct thing to use in this instance or is it something else?
CarController.cs
[HttpPost]
[ValidateAntiForgeryToken]
public IActionResult Create(Car car)
{
// Define the cancellation token.
CancellationTokenSource source = new CancellationTokenSource();
CancellationToken token = source.Token;
if (ModelState.IsValid)
{
//Access the car record
_carService.InsertCar(car);
//Get the newly created ID
int id = car.Id;
//Give it a name with some virtual directories within the container
string fileName = "car/" + id + "/car-image.jpg";
string strContainerName = "uploads";
//I create a memory stream ready for the rescaled image, not sure this is right.
Stream outStream = new MemoryStream();
//Access my storage account
BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);
//Open the image read stream
var carImage = car.ImageFile.OpenReadStream();
//Rescale the image, save as jpeg.
using (Image image = Image.Load(carImage))
{
int width = 250;
int height = 0;
image.Mutate(x => x.Resize(width, height));
image.SaveAsJpeg(outStream);
}
var blobs = containerClient.UploadBlob(fileName, outStream);
return RedirectToAction(nameof(Index));
}
return View(car);
}
It's got nothing to do with the ImageSharp library.
You need to reset your outStream position after saving. BlobContainerClient is trying to read from the end of the stream.
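For example, reusing the outStream, carImage, fileName and containerClient variables from your code, the minimal change looks something like this:

// Rescale the image and write it into the MemoryStream as JPEG.
using (Image image = Image.Load(carImage))
{
    image.Mutate(x => x.Resize(250, 0));
    image.SaveAsJpeg(outStream);
}

// SaveAsJpeg leaves the stream positioned at its end; rewind it so
// UploadBlob reads the image bytes instead of nothing.
outStream.Position = 0;
containerClient.UploadBlob(fileName, outStream);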
I am writing an image viewer in C# and WPF for TIFF images stored in a SQL Server database image column. I have coded the retrieval of the images into a memory stream using a GetBytes loop and that works. What is not working is creating a TiffBitmapDecoder from the memory stream and using that as the BitmapSource for a WPF/XAML Image control. Here is my function to return the BitmapSource with the memory stream as input:
namespace ViewDBImages
{
public static class Utility
{
public static BitmapSource StreamToImage(MemoryStream imageMem)
{
//
// Decode the Memory Stream argument into a Bitmap with TIFF format
// First we have to set the Stream seek location to the origin
//
imageMem.Seek(0, SeekOrigin.Begin);
//
// Decode the stream into a TiffBitmap and return it as the image source
//
TiffBitmapDecoder decoder = new TiffBitmapDecoder(imageMem, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.OnDemand);
BitmapSource source = decoder.Frames[0];
return source;
}
}
}
I believe the images are being set as the source for the image control, because I can see the scroll bars changing as they are retrieved, but they are not visible. I also see that the position of the memory stream is "8" after this function finishes, but the size of the stream is 63,083 bytes.
To help debug this, I copied the memory stream to a TIFF file and used that as the stream input for the decoder. The images are displayed correctly that way. So I suspect there must be some kind of control information that is available when the image is stored as a file, but is not found in the memory stream. Here is that code:
namespace ViewDBImages
{
public static class Utility
{
public static BitmapSource StreamToImage(MemoryStream imageMem)
{
//
// Copy the Memory Stream argument into a filestream and save as a TIF file
// First we have to set the Stream seek location to the origin
//
imageMem.Seek(0, SeekOrigin.Begin);
FileStream imageFile = new FileStream(@"C:\Image Test\Testfile.tif", FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite);
imageMem.CopyTo(imageFile);
//
// Decode the file to a TiffBitmap and return it as the image source
//
TiffBitmapDecoder decoder = new TiffBitmapDecoder(imageFile, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.OnDemand);
BitmapSource source = decoder.Frames[0];
return source;
}
}
}
Thank you for any advice.
Try to set BitmapCacheOption.OnLoad to make the decoder immediately load the bitmap:
var decoder = new TiffBitmapDecoder(
imageMem, BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
return decoder.Frames[0];
As a note, you could also directly create a BitmapFrame from a stream. The appropriate decoder is chosen automatically.
return BitmapFrame.Create(
imageMem, BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
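Put together with your original helper (keeping the PreservePixelFormat option from your question), a sketch of the function would look like this:

public static BitmapSource StreamToImage(MemoryStream imageMem)
{
    // Rewind the stream before handing it to the decoder.
    imageMem.Seek(0, SeekOrigin.Begin);

    // OnLoad decodes the whole frame immediately, so the resulting
    // BitmapSource no longer depends on the stream being read later.
    var decoder = new TiffBitmapDecoder(
        imageMem, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.OnLoad);
    return decoder.Frames[0];
}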
.NET 4.6.2
I read some files (which I know are valid) from a database and then try to combine them into a single ZIP file using the following function:
public static byte[] CompressData(IList<ZipFileData> zipFileDatas)
{
var buffer = new byte[(int)zipFileDatas.Sum(z => z.Length)];
using (var ms = new MemoryStream(buffer))
{
using (var zip = new ZipFile())
{
foreach (var zipFileData in zipFileDatas)
zip.AddEntry($"{zipFileData.FileName}.{zipFileData.FileType}", zipFileData.Data);
zip.Save(ms);
}
return ms.ToArray();
}
}
The parameter is a collection of these:
public class ZipFileData
{
public string FileName { get; set; }
public string FileType { get; set; } // Eg: PDF, JPG, XSLX
public byte[] Data { get; set; }
public long? Length { get; set; } // Length of the data
}
The function appears to work correctly, but later when I save the returned byte[] as "my.zip" and try to open it (from Windows 10), I get the error "The Compressed (zipped) Folder C:...\my.zip is invalid".
I'm trying to determine if this function (or some other code) is the cause of the issue.
Has anyone done something similar before or could verify that the function is (in)correct?
You allocate a buffer that is too small:
var buffer = new byte[(int)zipFileDatas.Sum(z => z.Length)];
A non-compressed zip file will be slightly larger than the sum of the files inside: each zip entry has a header (and sometimes a footer), and there is a "table of contents" at the end of the zip file (the central directory).
var ms = new MemoryStream(buffer)
creates a non-resizable memory stream that is a bit too small. Unfortunately for you, the last bytes you lose are the most important ones: that's where the offset of the central directory is stored. Without it, you have a corrupted zip file.
To fix this, use a resizable memory stream:
var ms = new MemoryStream()
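A sketch of the corrected method with that one change applied (keeping the DotNetZip ZipFile usage from your question):

public static byte[] CompressData(IList<ZipFileData> zipFileDatas)
{
    // A parameterless MemoryStream grows as needed, so the central
    // directory at the end of the archive is never truncated.
    using (var ms = new MemoryStream())
    {
        using (var zip = new ZipFile())
        {
            foreach (var zipFileData in zipFileDatas)
                zip.AddEntry($"{zipFileData.FileName}.{zipFileData.FileType}", zipFileData.Data);
            zip.Save(ms);
        }
        return ms.ToArray();
    }
}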
I have spent the last 10-12 hours trying to figure out how to correctly make a downloaded web image smaller in file size and dimensions in C#, in a Windows Store app under development.
Whatever I do, I keep getting artifacts in the final images, such as a "half picture" or gray/same-colored areas. It is as if the stream had not been flushed correctly, although I believe I have done so (it is not done in the code below, since that part works without it...).
This is my approach for retrieving the image - this part works, but is included here to make sure all info is here (see code below):
Get URL of image
Use HttpWebRequest to get response
Create stream to get response stream
Create empty StorageFile and open for writing
Copy the response stream to the storage file.
Close everything
From there, I need to do the following:
Determine the size (e.g. using BitmapDecoder)
If the width of the image is above a certain amount (e.g. 700 px), it must be resized.
No matter what, the files are always too big and need to be compressed further
The image need to be saved as a jpg with image quality set to a medium/semi-high setting
I have tried many things, including messing around quite a bit with BitmapEncoder/BitmapDecoder, but no matter what, I still get half-processed images.
Can somebody please help me find the correct way to compress and resize images?
My code in the current state:
using (var response = await HttpWebRequest.CreateHttp(internetUri).GetResponseAsync())
{
using (var stream = response.GetResponseStream())
{
var imageFolder = await localFolder.CreateFolderAsync(
CachedImagesFolderEndFolderPath, CreationCollisionOption.OpenIfExists);
string fileName = string.Format("{0}.jpg",
Path.GetFileNameWithoutExtension(Path.GetRandomFileName()));
var file = await imageFolder.CreateFileAsync(fileName,
CreationCollisionOption.ReplaceExisting);
using (var filestream = await file.OpenStreamForWriteAsync())
{
await stream.CopyToAsync(filestream);
}
}
}
The following solution was provided by StefanDK in this edit:
It seems that the problem with my former solution was that I did not properly close the streams and that I did not have the correct settings.
Basically the solution incorporates elements from these articles:
How to resize Image in C# WinRT/winmd?
https://stackoverflow.com/questions/15481126/windows-store-app-resize-bitmapimage-c-sharp
http://msdn.microsoft.com/en-us/library/windows/apps/jj709942.aspx
http://msdn.microsoft.com/en-us/library/windows/apps/hh465076.aspx
From the main part of the code I make these calls for each image that needs downloading, resizing and compressing:
Main code
Note that I am well aware of the "not best practice" in assigning a string value and then setting it again. This is prototype code that has not been fine-tuned yet.
var img = await ArticleStorage.GetLocalImageAsync(src);
img = await ArticleStorage.ResizeAndCompressLocalImage(img);
Source code of the methods in ArticleStorage
public const string CachedImagesFolderFullPath = "ms-appdata:///local/cache/";
public const string CachedImagesFolderEndFolderPath = "cache";
public const string OfflinePhotoImgPath = "ms-appx:///Assets/OfflinePhoto.png";
public const int MaximumColumnWidth = 700;
public static async Task<string> GetLocalImageAsync(string internetUri)
{
if (string.IsNullOrEmpty(internetUri))
{
return null;
}
// Show default image if local folder does not exist
var localFolder = ApplicationData.Current.LocalFolder;
if (localFolder == null)
{
return OfflinePhotoImgPath;
}
// Default to offline photo
string src = OfflinePhotoImgPath;
try
{
using (var response = await HttpWebRequest.CreateHttp(internetUri)
.GetResponseAsync())
{
using (var stream = response.GetResponseStream())
{
// New random filename (e.g. x53fjtje.jpg)
string fileName = string.Format("{0}.jpg",
Path.GetFileNameWithoutExtension(Path.GetRandomFileName()));
var imageFolder = await localFolder.CreateFolderAsync(
CachedImagesFolderEndFolderPath,
CreationCollisionOption.OpenIfExists);
var file = await imageFolder.CreateFileAsync(fileName,
CreationCollisionOption.ReplaceExisting);
// Copy bytes from stream to local file
// without changing any file information
using (var filestream = await file.OpenStreamForWriteAsync())
{
await stream.CopyToAsync(filestream);
// Send back the local path to the image
// (including 'ms-appdata:///local/cache/')
return string.Format(CachedImagesFolderFullPath + "{0}",
fileName);
}
}
}
}
catch (Exception)
{
// Is implicitly handled with the setting
// of the initialized value of src
}
// If not successful, return the default offline image
return src;
}
public static async Task<string> ResizeAndCompressLocalImage(string imgSrc)
{
// Remove 'ms-appdata:///local/cache/' from the path ...
string sourcepathShort = imgSrc.Replace(
CachedImagesFolderFullPath,
string.Empty);
// Get the cached images folder
var folder = await ApplicationData.Current
.LocalFolder
.GetFolderAsync(
CachedImagesFolderEndFolderPath);
// Get a new random name (e.g. '555jkdhr5.jpg')
var targetPath = string.Format("{0}.jpg",
Path.GetFileNameWithoutExtension(
Path.GetRandomFileName()));
// Retrieve source and create target file
var sourceFile = await folder.GetFileAsync(sourcepathShort);
var targetFile = await folder.CreateFileAsync(targetPath);
using (var sourceFileStream = await sourceFile.OpenAsync(
Windows.Storage.FileAccessMode.Read))
{
using (var destFileStream = await targetFile.OpenAsync(
FileAccessMode.ReadWrite))
{
// Prepare decoding of the source image
BitmapDecoder decoder = await BitmapDecoder.CreateAsync(
sourceFileStream);
// Find out if image needs resizing
double proportionWidth = (double)decoder.PixelWidth /
MaximumColumnWidth;
double proportionImage = decoder.PixelHeight /
(double)decoder.PixelWidth;
// Get the new sizes of the image whether it is the same or should be resized
var newWidth = proportionWidth > 1 ?
(uint)(MaximumColumnWidth) :
decoder.PixelWidth;
var newHeight = proportionWidth > 1 ?
(uint)(MaximumColumnWidth * proportionImage) :
decoder.PixelHeight;
// Prepare set of properties for the bitmap
BitmapPropertySet propertySet = new BitmapPropertySet();
// Set ImageQuality
BitmapTypedValue qualityValue = new BitmapTypedValue(0.75,
PropertyType.Single);
propertySet.Add("ImageQuality", qualityValue);
// BitmapEncoder enc = await BitmapEncoder.CreateForTranscodingAsync(
//     destFileStream, decoder);
BitmapEncoder enc = await BitmapEncoder.CreateAsync(
BitmapEncoder.JpegEncoderId,
destFileStream, propertySet);
// Set the new dimensions
enc.BitmapTransform.ScaledHeight = newHeight;
enc.BitmapTransform.ScaledWidth = newWidth;
// Get image data from the source image
PixelDataProvider pixelData = await decoder.GetPixelDataAsync();
// Copy in all pixel data from source to target
enc.SetPixelData(
decoder.BitmapPixelFormat,
decoder.BitmapAlphaMode,
decoder.PixelWidth,
decoder.PixelHeight,
decoder.DpiX,
decoder.DpiY,
pixelData.DetachPixelData()
);
// Make the encoder process the image
await enc.FlushAsync();
// Write everything to the filestream
await destFileStream.FlushAsync();
}
}
try
{
// Delete the source file
await sourceFile.DeleteAsync();
}
catch(Exception)
{
}
// Return the new path
// including "ms-appdata:///local/cache/"
return string.Format(CachedImagesFolderFullPath + "{0}",
targetPath);
}