Exception creating BitmapDecoder from WriteableBitmap for Windows Phone UWP - c#

In my UWP Windows 10 Mobile app, I am attempting to access and manipulate the transparency of individual pixels in the PixelBuffer for a given WriteableBitmap. The issue I'm having is that BitmapDecoder.CreateAsync() is throwing
"The component cannot be found. (Exception from HRESULT:
0x88982F50)".
I have spent WAY too much time searching, refactoring and debugging this to no avail; any hints, direction or help of any kind would be much appreciated.
// img is a WriteableBitmap that contains an image
var stream = img.PixelBuffer.AsStream().AsRandomAccessStream();
BitmapDecoder decoder = null;
try
{
decoder = await BitmapDecoder.CreateAsync(stream);
}
catch(Exception e)
{
// BOOM: The component cannot be found. (Exception from HRESULT: 0x88982F50)
}
// Scale image to appropriate size
BitmapTransform transform = new BitmapTransform()
{
ScaledWidth = Convert.ToUInt32(img.PixelWidth),
ScaledHeight = Convert.ToUInt32(img.PixelHeight)
};
PixelDataProvider pixelData = await decoder.GetPixelDataAsync(
BitmapPixelFormat.Bgra8, // WriteableBitmap uses BGRA format
BitmapAlphaMode.Straight,
transform,
ExifOrientationMode.IgnoreExifOrientation, // This sample ignores Exif orientation
ColorManagementMode.DoNotColorManage
);
// An array containing the decoded image data, which could be modified before being displayed
byte[] pixels = pixelData.DetachPixelData();
UPDATE: In case this helps spark some ideas, I found that if I use the CreateAsync overload that takes a codec ID along with the stream, it throws a different exception:
Specified cast is not valid.
Guid BitmapEncoderGuid = BitmapEncoder.PngEncoderId;
decoder = await BitmapDecoder.CreateAsync(BitmapEncoderGuid, stream);
It throws the same exception no matter which codec ID I supply (e.g. Png, Jpeg, Gif, Tiff, Bmp, JpegXR).

I don't understand why you would want to use a BitmapDecoder at all. The pixel data in a WriteableBitmap is not encoded in any way. You would only need a BitmapDecoder if you were loading the image from a file stream in a particular compression format - that is what would require the correct codec.
You can just read the pixel data directly from the stream:
byte[] pixels;
using (var stream = img.PixelBuffer.AsStream())
{
pixels = new byte[(uint)stream.Length];
await stream.ReadAsync(pixels, 0, pixels.Length);
}
This will give you a byte array containing 4 bytes for each pixel, corresponding to its B, G, R and A components (the WriteableBitmap pixel buffer is in BGRA order).
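For example, to halve the opacity of every pixel you could tweak the alpha byte of each 4-byte group and write the result back. A rough sketch, assuming img is the WriteableBitmap from the question (AsStream comes from System.Runtime.InteropServices.WindowsRuntime, and keep in mind the buffer may hold premultiplied alpha):
using (var stream = img.PixelBuffer.AsStream())
{
    var pixels = new byte[(uint)stream.Length];
    await stream.ReadAsync(pixels, 0, pixels.Length);

    // Each pixel is 4 bytes in B, G, R, A order; the alpha byte sits at offset 3
    for (int i = 3; i < pixels.Length; i += 4)
    {
        pixels[i] = (byte)(pixels[i] / 2);
    }

    // Write the modified bytes back into the same buffer
    stream.Seek(0, SeekOrigin.Begin);
    await stream.WriteAsync(pixels, 0, pixels.Length);
}
img.Invalidate(); // redraw the WriteableBitmap with the updated pixels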

Related

Unity UWP load image bytes is missing color channels

I'm loading an image's bytes and trying to apply them to a Texture2D.
Don't worry about async/await/thread problems...
UWP code:
StorageFile storageFile = StorageFile.GetFileFromPathAsync(filePath).AsTask().GetAwaiter().GetResult();
// get image size
IRandomAccessStreamWithContentType random = storageFile.OpenReadAsync().AsTask().GetAwaiter().GetResult();
BitmapDecoder decoder = BitmapDecoder.CreateAsync(random).AsTask().GetAwaiter().GetResult();
BitmapFrame bitmapFrame = decoder.GetFrameAsync(0).AsTask().GetAwaiter().GetResult();
PixelDataProvider pixelData = bitmapFrame.GetPixelDataAsync().AsTask().GetAwaiter().GetResult();
return new Dictionary<string, object>
{
{"bytes", pixelData.DetachPixelData()},
{"width", (int) decoder.PixelWidth},
{"height", (int) decoder.PixelHeight}
};
Unity code:
Texture2D texture = new Texture2D(textureSizeStruct.width, textureSizeStruct.height, TextureFormat.RGBA32, false);
texture.LoadRawTextureData(textureBytes);
texture.Apply();
This is how the images show up (original vs. in the app - sorry for the big white square; screenshots not included):
Your image channels are not missing, they are simply in a different order.
Check the docs of Texture2D.LoadRawTextureData:
Passed data should be of required size to fill the whole texture according to its width, height, data format and mipmapCount; otherwise a UnityException is thrown.
Solution:
Pass TextureFormat.BGRA32 instead to your Texture2D constructor.
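For example (the same loading code as in the question, only the texture format changed - a sketch, not tested):
Texture2D texture = new Texture2D(textureSizeStruct.width, textureSizeStruct.height, TextureFormat.BGRA32, false);
texture.LoadRawTextureData(textureBytes);
texture.Apply();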
Alternatively, on the UWP side you need to get the pixels from the decoder with the right parameters. Here is that solution:
StorageFile storageFile = StorageFile.GetFileFromPathAsync(filePath).AsTask().GetAwaiter().GetResult();
IRandomAccessStreamWithContentType random = storageFile.OpenReadAsync().AsTask().GetAwaiter().GetResult();
BitmapDecoder decoder = BitmapDecoder.CreateAsync(random).AsTask().GetAwaiter().GetResult();
// here is the catch
PixelDataProvider pixelData = decoder.GetPixelDataAsync(
BitmapPixelFormat.Rgba8, // <--- you must request the pixels in this format
BitmapAlphaMode.Straight,
new BitmapTransform(),
ExifOrientationMode.RespectExifOrientation,
ColorManagementMode.DoNotColorManage // <--- and you must set this too
).AsTask().GetAwaiter().GetResult();

Convert a System.Drawing.Bitmap to Windows.Graphics.Imaging.SoftwareBitmap

I have a WPF project and capture an image from a USB camera into a System.Drawing.Bitmap (I can also capture a System.Windows.Media.Imaging.BitmapSource). I need to convert it to a Windows.Graphics.Imaging.SoftwareBitmap to make a "VideoFrame" to compare against an ONNX model.
The camera driver is a .net assembly and will not bind to a uwp project. I've tried creating a .net standard assembly to bridge the gap with no success. I simply need a bitmap converted to a SoftwareBitmap. Please help!
I'm using this code as the basis for the comparison of the Bitmap image from the camera - https://github.com/Azure-Samples/cognitive-services-onnx12-customvision-sample
There is no direct conversion. You'll need to extract the image data from the System.Drawing.Bitmap and then create the new SoftwareBitmap from that data.
For example, you could use the Save(Stream, ImageFormat) method to save the image to a stream in the specified format.
Then, you could try to call BitmapDecoder.CreateAsync method to create the decoder from the stream.
After that you could call GetSoftwareBitmapAsync to get a SoftwareBitmap object.
The following is a simple code sample:
Bitmap bitmap = getyourbitmap();
using (var stream = new Windows.Storage.Streams.InMemoryRandomAccessStream())
{
    bitmap.Save(stream.AsStream(), ImageFormat.Jpeg); // choose the image format that matches your own bitmap source
    stream.Seek(0); // rewind the stream before handing it to the decoder
    Windows.Graphics.Imaging.BitmapDecoder decoder = await Windows.Graphics.Imaging.BitmapDecoder.CreateAsync(stream);
    SoftwareBitmap softwareBitmap = await decoder.GetSoftwareBitmapAsync();
}
I found that you can use Windows.Security.Cryptography to create an IBuffer from the image's byte array. Then you can copy the IBuffer into the SoftwareBitmap.
using Windows.Security.Cryptography;
IBuffer buffer = CryptographicBuffer.CreateFromByteArray(ImageByteArray);
SoftwareBitmap softwareBitmap = new SoftwareBitmap(BitmapPixelFormat.Gray8, 800, 600);
softwareBitmap.CopyFromBuffer(buffer);
VideoFrame inputImage = VideoFrame.CreateWithSoftwareBitmap(softwareBitmap);
This worked for me:
Bitmap bitmap = ...;
var memoryStream = new MemoryStream();
bitmap.Save(memoryStream, ImageFormat.Bmp);
memoryStream.Position = 0; // rewind before decoding
var decoder = await Windows.Graphics.Imaging.BitmapDecoder.CreateAsync(memoryStream.AsRandomAccessStream());
var softwareBitmap = await decoder.GetSoftwareBitmapAsync();

Using SharpAvi to save screenshots to an AVI produces 100 frames of blank video

My game takes a screenshot each game loop and stores it in memory. The user can then press "print screen" to trigger "SaveScreenshots" (see code below), which stores each screenshot as a PNG and also compiles them into an AVI using SharpAvi. The saving of images works fine, and a ~2 sec AVI is produced, but it doesn't show any video when played - it's just the placeholder VLC Player icon. I think this is very close to working, but I can't determine what's wrong. Please see my code below. If anyone has any ideas, I'd be very appreciative!
private Bitmap GrabScreenshot()
{
try
{
Bitmap bmp = new Bitmap(this.ClientSize.Width, this.ClientSize.Height);
System.Drawing.Imaging.BitmapData data =
bmp.LockBits(this.ClientRectangle, System.Drawing.Imaging.ImageLockMode.WriteOnly,
System.Drawing.Imaging.PixelFormat.Format24bppRgb);
GL.ReadPixels(0, 0, this.ClientSize.Width, this.ClientSize.Height, PixelFormat.Bgr, PixelType.UnsignedByte,
data.Scan0);
bmp.UnlockBits(data);
bmp.RotateFlip(RotateFlipType.RotateNoneFlipY);
return bmp;
} catch(Exception ex)
{
// occasionally getting GDI generic exception when rotating the image... skip that one.
return null;
}
}
private void SaveScreenshots()
{
var directory = "c:\\helioscreenshots\\";
var rootFileName = string.Format("{0}_", DateTime.UtcNow.Ticks);
var writer = new AviWriter(directory + rootFileName + ".avi")
{
FramesPerSecond = 30,
// Emitting AVI v1 index in addition to OpenDML index (AVI v2)
// improves compatibility with some software, including
// standard Windows programs like Media Player and File Explorer
EmitIndex1 = true
};
// returns IAviVideoStream
var aviStream = writer.AddVideoStream();
// set standard VGA resolution
aviStream.Width = this.ClientSize.Width;
aviStream.Height = this.ClientSize.Height;
// class SharpAvi.KnownFourCCs.Codecs contains FOURCCs for several well-known codecs
// Uncompressed is the default value, just set it for clarity
aviStream.Codec = KnownFourCCs.Codecs.Uncompressed;
// Uncompressed format requires to also specify bits per pixel
aviStream.BitsPerPixel = BitsPerPixel.Bpp32;
var index = 0;
while (this.Screenshots.Count > 0)
{
Bitmap screenshot = this.Screenshots.Dequeue();
var screenshotBytes = ImageToBytes(screenshot);
// write data to a frame
aviStream.WriteFrame(true, // is key frame? (many codecs use concept of key frames, for others - all frames are keys)
screenshotBytes, // array with frame data
0, // starting index in the array
screenshotBytes.Length); // length of the data
// save it!
// NOTE: compared jpeg, gif, and png. PNG had smallest file size.
index++;
screenshot.Save(directory + rootFileName + index + ".png", System.Drawing.Imaging.ImageFormat.Png);
}
// save the AVI!
writer.Close();
}
public static byte[] ImageToBytes(Image img)
{
using (var stream = new MemoryStream())
{
img.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
return stream.ToArray();
}
}
From what I see, you're providing the byte array in PNG encoding, yet the stream is configured as KnownFourCCs.Codecs.Uncompressed.
Furthermore, from the manual:
AVI expects uncompressed data in format of standard Windows DIB, that is bottom-up bitmap of the specified bit-depth. For each frame, put its data in byte array and call IAviVideoStream.WriteFrame()
Next, all encoders expect input image data in specific format. It's BGR32 top-down - 32 bits per pixel, blue byte first, alpha byte not used, top line goes first. This is the format you can often get from existing images. [...] So, you simply pass an uncompressed top-down BGR32
I would retrieve the byte-array directly from the Bitmap using LockBits and Marshal.Copy as described in the manual.
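A rough sketch of that approach (GetBgr32Bytes is just a name I'm using here; it locks the bitmap as 32bpp and copies the raw bytes out, which gives the top-down BGR32 layout the manual describes):
private static byte[] GetBgr32Bytes(Bitmap bmp)
{
    var rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
    // Lock the whole bitmap as 32bpp; GDI+ exposes it as a top-down B, G, R, unused buffer
    BitmapData data = bmp.LockBits(rect, System.Drawing.Imaging.ImageLockMode.ReadOnly,
        System.Drawing.Imaging.PixelFormat.Format32bppRgb);
    try
    {
        var buffer = new byte[data.Stride * data.Height];
        System.Runtime.InteropServices.Marshal.Copy(data.Scan0, buffer, 0, buffer.Length);
        return buffer;
    }
    finally
    {
        bmp.UnlockBits(data);
    }
}
// ...and in SaveScreenshots, feed that straight to the stream:
// var screenshotBytes = GetBgr32Bytes(screenshot);
// aviStream.WriteFrame(true, screenshotBytes, 0, screenshotBytes.Length);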

Image size is drastically increasing after applying a simple watermark

I have a set of images on which I'm programmatically drawing a simple watermark using System.Windows and System.Windows.Media.Imaging (yes, not GDI+), following a tutorial.
Most of the images are no more than 500 kB, but after applying a simple watermark - text on a transparent background - the image size increases drastically.
For example, a 440 kB image becomes 8.33 MB after applying the watermark with the method below, which is shocking me.
private static BitmapFrame ApplyWatermark(BitmapFrame image, string waterMarkText) {
const int x = 5;
var y = image.Height - 20;
var targetVisual = new DrawingVisual();
var targetContext = targetVisual.RenderOpen();
var brush = (SolidColorBrush)(new BrushConverter().ConvertFrom("#FFFFFF"));
brush.Opacity = 0.5;
targetContext.DrawImage(image, new Rect(0, 0, image.Width, image.Height));
targetContext.DrawRectangle(brush, new Pen(), new Rect(0, y, image.Width, 20));
targetContext.DrawText(new FormattedText(waterMarkText, CultureInfo.CurrentCulture, FlowDirection.LeftToRight,
new Typeface("Batang"), 13, Brushes.Black), new Point(x, y));
targetContext.Close();
var target = new RenderTargetBitmap((int)image.Width, (int)image.Height, 96, 96, PixelFormats.Default);
target.Render(targetVisual);
var targetFrame = BitmapFrame.Create(target);
return targetFrame;
}
I've noticed that the image quality is improved compared to the original image. The image is smoother and the colors are lighter. But I don't really want this; I want the image to stay as it is, just with the watermark added - no quality increase and, of course, no drastic change in image size.
Are there any settings I'm missing here to tell my program to keep the quality the same as the source image? How can I prevent the significant increase in image size caused by my ApplyWatermark method?
Edit
1. This is how I convert the BitmapFrame to a Stream. I then use that Stream to save the image to Amazon S3
private Stream EncodeBitmap(BitmapFrame image) {
BitmapEncoder enc = new BmpBitmapEncoder();
enc.Frames.Add(BitmapFrame.Create(image));
var memoryStream = new MemoryStream();
enc.Save(memoryStream);
return memoryStream;
}
2. This is how I get the BitmapFrame from a Stream
private static BitmapFrame ReadBitmapFrame(Stream stream) {
var photoDecoder = BitmapDecoder.Create(
stream,
BitmapCreateOptions.PreservePixelFormat,
BitmapCacheOption.None);
return photoDecoder.Frames[0];
}
3. This is how I read the file from the local directory
public Stream FindFileInLocalImageDir() {
try {
var path = @"D:\Some\Path\Image.png";
return !File.Exists(path) ? null : File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read);
} catch (Exception) {
return null;
}
}
The problem is that when you edit the image, the compression is gone. A 730x1108 JPEG with a 433 kB size on disk, at 32 bits per pixel (you mentioned transparency, so ARGB), will need at least 730 * 1108 * 4 bytes = 3.09 MB once uncompressed. Of course you can compress it again afterwards (for disk, a network stream or whatever else).
This is the reason why image software always needs a lot of memory, even when working with compressed data.
Conclusion: you will need that much free memory to work with the image; there is no way around it.
The reason I asked my question in the comments earlier is that I noticed there are several different encoders available. A bitmap usually has a significantly larger file size, due to the amount of information it stores about your image.
I haven't tested this myself, but have you tried a different encoder?
var pngEncoder = new PngBitmapEncoder();
pngEncoder.Frames.Add(ApplyWatermark(null, null));
Stream stm = File.Create(image); // File.Create returns a FileStream, not a MemoryStream
pngEncoder.Save(stm);
return stm;
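If the originals are JPEGs, another option along the same lines is a JpegBitmapEncoder, which lets you trade size for quality explicitly. A minimal sketch mirroring the EncodeBitmap method from the question (not tested code):
private Stream EncodeBitmapAsJpeg(BitmapFrame image) {
    // QualityLevel is 1-100; higher means better quality and a larger file
    var encoder = new JpegBitmapEncoder { QualityLevel = 90 };
    encoder.Frames.Add(BitmapFrame.Create(image));
    var memoryStream = new MemoryStream();
    encoder.Save(memoryStream);
    memoryStream.Position = 0; // rewind so the caller can read from the start
    return memoryStream;
}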

BitmapEncoder SetPixelData memory allocation

I'm working on a Metro application where I need to generate an animated GIF image.
I've found this tutorial, which seems to be the one and only resource on animated GIFs for Metro apps.
When running this code, an Exception is thrown on the SetPixelData method, telling me that the allocated buffer memory is insufficient (the message is in my OS language even though my Visual Studio environment is in English - I think that might be relevant).
I've reduced the image size (source and output) and the number of frames, but I still get this error. (I manipulate far bigger images and byte arrays in the same application.)
Any idea where this memory problem could come from? A problem with my StorageFile, maybe?
I was seeing this exception when the frame width/height passed into SetPixelData didn't match the pixel data. I ended up with the example below; the exception you mention appeared whenever the dimensions didn't match the pixelData.
I think this is more stable in Windows 8.1, as it doesn't repro there.
BitmapDecoder decoder = await BitmapDecoder.CreateAsync(sourceStream);
BitmapTransform transform = new BitmapTransform()
{
ScaledHeight = 900,
ScaledWidth = 600
};
PixelDataProvider pixelData = await decoder.GetPixelDataAsync(BitmapPixelFormat.Rgba8,
BitmapAlphaMode.Straight,
transform,
ExifOrientationMode.RespectExifOrientation,
ColorManagementMode.DoNotColorManage);
StorageFile destinationFile = await ApplicationData.Current.LocalFolder.CreateFileAsync(Path.Combine(Database.rootMoviesFoldersPaths, movie.LocalId + ".jpg"));
using (var destinationStream = await destinationFile.OpenAsync(FileAccessMode.ReadWrite))
{
BitmapEncoder encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, destinationStream);
encoder.SetPixelData(BitmapPixelFormat.Rgba8, BitmapAlphaMode.Premultiplied, 600, 900, 96, 96, pixelData.DetachPixelData());
await encoder.FlushAsync();
movie.HasFolderImage = true;
return true;
}
Multiply the buffer size by the bit depth.
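Put differently, the buffer handed to SetPixelData must be exactly width * height * bytes-per-pixel long. A quick sanity check along those lines, using the hypothetical 600x900 Rgba8 frame from the example above:
uint width = 600, height = 900;
uint bytesPerPixel = 4; // Rgba8 and Bgra8 both use 4 bytes per pixel
byte[] pixels = pixelData.DetachPixelData();
if (pixels.Length != width * height * bytesPerPixel)
    throw new InvalidOperationException("Pixel buffer size does not match the dimensions passed to SetPixelData.");
encoder.SetPixelData(BitmapPixelFormat.Rgba8, BitmapAlphaMode.Premultiplied, width, height, 96, 96, pixels);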
