Changing resolution in XNA - screen just black afterwards - C#

I'm writing a 2D game in the latest XNA (C#). I'm doing a bit of an overhaul of old code and have just put back in buttons to change resolution. However, after toggling fullscreen mode or changing the window size/resolution, the screen is just black (the color of my clear-screen call) forever, although the game still runs, as I can quit it with a keypress.
This didn't happen in my previous version, where it all worked fine, and there's no difference in the relevant code that I can find. However, I did redo my graphics loading to use my own texture loader instead of the ContentManager. Could this be the issue?
If there's no other option, is there a simple way to make the game restart, as the graphics are fine and in the selected format after a restart?
My code is:
public override void Click()
{
base.Click();
Settings.Default.screenWidth = resolution[0];
Settings.Default.screenHeight = resolution[1];
Settings.Default.Save();
Variables.screenWidth = Settings.Default.screenWidth;
Variables.screenHeight = Settings.Default.screenHeight;
Game1.graphics.PreferredBackBufferWidth = Variables.screenWidth;
Game1.graphics.PreferredBackBufferHeight = Variables.screenHeight;
Game1.graphics.ApplyChanges();
}
Thanks!
Edit: My texture loading code - it loads all files whose names appear in an enum as Texture2Ds.
public static void LoadAll(string subFolder)
{
List<string> s = Directory.GetFiles(path + "\\" + subFolder, "*.png", SearchOption.AllDirectories).ToList<string>();
foreach (string S in s)
{
if (Enum.IsDefined(typeof(T), Path.GetFileNameWithoutExtension(S)))
{
FileStream stream = new FileStream(S, FileMode.Open);
Texture2D t = Texture2D.FromStream(Game1.graphics.GraphicsDevice, stream);
RenderTarget2D result = null;
//Set up a render target to hold our final texture, which will have premultiplied alpha values
result = new RenderTarget2D(Game1.graphics.GraphicsDevice, t.Width, t.Height);
Game1.graphics.GraphicsDevice.SetRenderTarget(result);
Game1.graphics.GraphicsDevice.Clear(Microsoft.Xna.Framework.Color.Black);
//Multiply each color by the source alpha, and write in just the color values into the final texture
BlendState blendColor = new BlendState();
blendColor.ColorWriteChannels = ColorWriteChannels.Red | ColorWriteChannels.Green | ColorWriteChannels.Blue;
blendColor.AlphaDestinationBlend = Blend.Zero;
blendColor.ColorDestinationBlend = Blend.Zero;
blendColor.AlphaSourceBlend = Blend.SourceAlpha;
blendColor.ColorSourceBlend = Blend.SourceAlpha;
Game1.spriteBatch.Begin(SpriteSortMode.Immediate, blendColor);
Game1.spriteBatch.Draw(t, t.Bounds, Microsoft.Xna.Framework.Color.White);
Game1.spriteBatch.End();
//Now copy over the alpha values from the PNG source texture to the final one, without multiplying them
BlendState blendAlpha = new BlendState();
blendAlpha.ColorWriteChannels = ColorWriteChannels.Alpha;
blendAlpha.AlphaDestinationBlend = Blend.Zero;
blendAlpha.ColorDestinationBlend = Blend.Zero;
blendAlpha.AlphaSourceBlend = Blend.One;
blendAlpha.ColorSourceBlend = Blend.One;
Game1.spriteBatch.Begin(SpriteSortMode.Immediate, blendAlpha);
Game1.spriteBatch.Draw(t, t.Bounds, Microsoft.Xna.Framework.Color.White);
Game1.spriteBatch.End();
//Release the GPU back to drawing to the screen
Game1.graphics.GraphicsDevice.SetRenderTarget(null);
t = result;
textureDictionary.Add(Path.GetFileNameWithoutExtension(S), t);
}
// else
// Console.WriteLine("Did not load -- " + Path.GetFileNameWithoutExtension(S) + " -- (add to enum to enable loading)");
}
}
Edit: The working code after following the advice below - probably not the most efficient, but it works!
public static void LoadAll(string subFolder)
{
List<string> s = Directory.GetFiles(path + "\\" + subFolder, "*.png", SearchOption.AllDirectories).ToList<string>();
foreach (string S in s)
{
if (Enum.IsDefined(typeof(T), Path.GetFileNameWithoutExtension(S)))
{
FileStream stream = new FileStream(S, FileMode.Open);
Texture2D t = Texture2D.FromStream(Game1.graphics.GraphicsDevice, stream);
RenderTarget2D result = null;
Texture2D resultTexture;
//Set up a render target to hold our final texture, which will have premultiplied alpha values
result = new RenderTarget2D(Game1.graphics.GraphicsDevice, t.Width, t.Height);
Game1.graphics.GraphicsDevice.SetRenderTarget(result);
Game1.graphics.GraphicsDevice.Clear(Microsoft.Xna.Framework.Color.Black);
//Multiply each color by the source alpha, and write in just the color values into the final texture
BlendState blendColor = new BlendState();
blendColor.ColorWriteChannels = ColorWriteChannels.Red | ColorWriteChannels.Green | ColorWriteChannels.Blue;
blendColor.AlphaDestinationBlend = Blend.Zero;
blendColor.ColorDestinationBlend = Blend.Zero;
blendColor.AlphaSourceBlend = Blend.SourceAlpha;
blendColor.ColorSourceBlend = Blend.SourceAlpha;
Game1.spriteBatch.Begin(SpriteSortMode.Immediate, blendColor);
Game1.spriteBatch.Draw(t, t.Bounds, Microsoft.Xna.Framework.Color.White);
Game1.spriteBatch.End();
//Now copy over the alpha values from the PNG source texture to the final one, without multiplying them
BlendState blendAlpha = new BlendState();
blendAlpha.ColorWriteChannels = ColorWriteChannels.Alpha;
blendAlpha.AlphaDestinationBlend = Blend.Zero;
blendAlpha.ColorDestinationBlend = Blend.Zero;
blendAlpha.AlphaSourceBlend = Blend.One;
blendAlpha.ColorSourceBlend = Blend.One;
Game1.spriteBatch.Begin(SpriteSortMode.Immediate, blendAlpha);
Game1.spriteBatch.Draw(t, t.Bounds, Microsoft.Xna.Framework.Color.White);
Game1.spriteBatch.End();
//Release the GPU back to drawing to the screen
Game1.graphics.GraphicsDevice.SetRenderTarget(null);
//Copy the render target's pixels into a plain Texture2D so the data survives a device reset
resultTexture = new Texture2D(Game1.graphics.GraphicsDevice, result.Width, result.Height);
Color[] textureColor = new Color[result.Width * result.Height];
result.GetData<Color>(textureColor);
resultTexture.SetData(textureColor);
textureDictionary.Add(Path.GetFileNameWithoutExtension(S), resultTexture);
}
// else
// Console.WriteLine("Did not load -- " + Path.GetFileNameWithoutExtension(S) + " -- (add to enum to enable loading)");
}
}

As per your updated code, you're copying textures into a render target (RenderTarget2D).
Unlike regular textures, these are not backed by CPU-side memory. Regular textures will automatically re-set their data on the GPU when the graphics device is lost. A render target, however, will raise its ContentLost event and set IsContentLost - you are then expected to reset its content yourself!
There are a few solutions. You could simply respond to ContentLost and refresh the data.
I prefer to use GetData and SetData at load time to copy the contents of the render target into a regular Texture2D, because this "just works". For simple cases like converting a texture to premultiplied alpha, though, I prefer to keep everything on the CPU.
I've got a very detailed answer about RenderTarget2D usage for fixed textures over here. It includes code to do premultiplication directly on the CPU.
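For reference, here is a minimal CPU-only sketch (not the exact code from that answer): load the PNG with FromStream as before, then premultiply each pixel in managed memory. No render target is involved, so the texture keeps its CPU-side backing and survives a device reset.
Texture2D t = Texture2D.FromStream(Game1.graphics.GraphicsDevice, stream);
Color[] pixels = new Color[t.Width * t.Height];
t.GetData(pixels);
for (int i = 0; i < pixels.Length; i++)
{
    // Color.FromNonPremultiplied multiplies R, G and B by A for us
    pixels[i] = Color.FromNonPremultiplied(pixels[i].R, pixels[i].G, pixels[i].B, pixels[i].A);
}
t.SetData(pixels);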
In your case, though, I'd try to use the content manager. It is possible to have it process all the files in a directory and then load them dynamically.
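A rough sketch of that route, assuming the PNGs are added to the Content project under a folder here called "Textures" with asset names matching the enum values (content below stands for the game's ContentManager instance):
// Load one texture per enum value straight from the content pipeline
foreach (T name in Enum.GetValues(typeof(T)))
{
    textureDictionary.Add(name.ToString(), content.Load<Texture2D>("Textures/" + name));
}
Textures loaded this way are managed by the ContentManager, so they are restored automatically after a device reset.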

Related

How can I make an offline PNG sequence recorder at 60fps?

I want to capture a sequence of images of the Unity Camera at 60 fps, in 4K, in non-realtime (offline).
So far I am able to capture each frame successfully, but I can't figure out how to make it offline and conform to 60 fps.
I am capturing the frames like this:
// get main camera and manually render scene into rt
Camera camera = this.GetComponent<Camera>(); // NOTE: added because there was no reference to camera in original script; must add this script to Camera
camera.targetTexture = renderTexture;
camera.Render();
this.lastRenderTime = Time.realtimeSinceStartup;
// read pixels will read from the currently active render texture so make our offscreen
// render texture active and then read the pixels
RenderTexture.active = renderTexture;
screenShot.ReadPixels(rect, 0, 0);
// reset active camera texture and render texture
camera.targetTexture = null;
RenderTexture.active = null;
// get our unique filename
string filename = uniqueFilename((int)rect.width, (int)rect.height);
// pull in our file header/data bytes for the specified image format (has to be done from main thread)
byte[] fileHeader = null;
byte[] fileData = null;
// create new thread to save the image to file (only operation that can be done in background)
new System.Threading.Thread(() =>
{
// create file and write optional header with image bytes
var f = System.IO.File.Create(filename);
if (fileHeader != null) f.Write(fileHeader, 0, fileHeader.Length);
f.Write(fileData, 0, fileData.Length);
f.Close();
Debug.Log(string.Format("Wrote screenshot {0} of size {1}", filename, fileData.Length));
}).Start();
Can someone point me in the right direction?
In the Time settings, set all the timestep values to 1/60 (approximately 0.0166667).
Then capture the screen in the OnPostRender event.
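Roughly like this (a sketch, assuming the script is attached to the rendering Camera; Time.captureFramerate makes Unity advance game time by exactly 1/60 s per rendered frame regardless of how long rendering takes, which is what makes the capture offline):
using UnityEngine;
using System.IO;

public class OfflineRecorder : MonoBehaviour
{
    int frameIndex;

    void Start()
    {
        // Decouple game time from wall-clock time: each frame advances time by 1/60 s
        Time.captureFramerate = 60;
    }

    void OnPostRender()
    {
        // Read the back buffer for this frame and write it out as a PNG
        var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();
        File.WriteAllBytes(string.Format("frame_{0:D5}.png", frameIndex++), tex.EncodeToPNG());
        Destroy(tex);
    }
}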

UWP Masking an image using another image as a mask

Does anybody know of any ways to use an image as a mask for another image in UWP, the only masking function I can see is CompositionMaskBrush which I don't believe can achieve what I want.
An example of what I'm looking to achieve is the following.
I have a solid black PNG in the shape of a mobile phone case; the user adds their own image, which is then clipped and masked to the dimensions of the solid black PNG, resulting in the image below.
Any help whatsoever would be greatly appreciated. I've spent quite a while browsing for a solution.
Example Image Here
Just posting for anybody else who needs an answer to this, but I finally managed to find a solution using Win2D and an ImageLoader.
Here is a link to the ImageLoader. Note that I had to roll back a few versions in order to make it work as the documentation states. The link below is to the version that I'm using. Anything later than this version will not work with the sample code I'm going to post.
https://www.nuget.org/packages/Robmikh.Util.CompositionImageLoader/0.4.0-alpha
private Compositor _compositor;
private IImageLoader _imageLoader;
private CompositionEffectFactory _effectFactory;
private async void InitMask()
{
// Store our Compositor and create our ImageLoader.
_compositor = ElementCompositionPreview.GetElementVisual(this).Compositor;
_imageLoader = ImageLoaderFactory.CreateImageLoader(_compositor);
// Setup our effect definition. First is the CompositeEffect that will take
// our sources and produce the intersection of the images (because we selected
// the DestinationIn mode for the effect). Next we take our CompositeEffect
// and make it the source of our next effect, the InvertEffect. This will take
// the intersection image and invert the colors. Finally we take that combined
// effect and put it through a HueRotationEffect, where we can adjust the colors
// using the Angle property (which we will animate below).
IGraphicsEffect graphicsEffect = new HueRotationEffect
{
Name = "hueEffect",
Angle = 0.0f,
Source = new InvertEffect
{
Source = new CompositeEffect
{
Mode = CanvasComposite.DestinationIn,
Sources =
{
new CompositionEffectSourceParameter("image"),
new CompositionEffectSourceParameter("mask")
}
}
}
};
// Create our effect factory using the effect definition and mark the Angle
// property as adjustable/animatable.
_effectFactory = _compositor.CreateEffectFactory(graphicsEffect, new string[] { "hueEffect.Angle" });
// Create ManagedSurfaces for both our base image and the mask we'll be using.
// The mask is a transparent image with a white circle in the middle. This is
// important since the CompositeEffect will use just the circle for the
// intersection, since the rest is transparent.
var managedImageSurface = await _imageLoader.CreateManagedSurfaceFromUriAsync(new Uri("http://sendus.pics/uploads/" + ImagePass + "/0.png", UriKind.Absolute));
//var managedImageSurface = await _imageLoader.CreateManagedSurfaceFromUriAsync(new Uri("ms-appx:///Assets/colour.jpg", UriKind.Absolute));
var managedMaskSurface = await _imageLoader.CreateManagedSurfaceFromUriAsync(new Uri("ms-appx:///" + MaskImage, UriKind.Absolute));
// Create brushes from our surfaces.
var imageBrush = _compositor.CreateSurfaceBrush(managedImageSurface.Surface);
var maskBrush = _compositor.CreateSurfaceBrush(managedMaskSurface.Surface);
// Create and set up our effect brush. Assign both the base image and mask image
// brushes as source parameters in the effect (with the same names we used in
// the effect definition). If we wanted, we could create many effect brushes
// and use different images in all of them.
var effectBrush = _effectFactory.CreateBrush();
effectBrush.SetSourceParameter("image", imageBrush);
effectBrush.SetSourceParameter("mask", maskBrush);
// All that's left is to create a visual, assign the effect brush to the Brush
// property, and attach it into the tree...
var visual = _compositor.CreateSpriteVisual();
visual.Size = new Vector2(MaskH, MaskW);
visual.Offset = new Vector3(0, 300, 0);
visual.Brush = effectBrush;
ElementCompositionPreview.SetElementChildVisual(this, visual);
}

Google Maps image distorted (Xamarin)

I have been following this tutorial on how to make a custom marker for Google maps, and I have got this part to work. However, I had to change some code in order to resize the image I am using on the marker.
This is the method I am using to do resize:
private void UpdateMarkers(float zoom)
{
// Max zoom out => zoom = 3
// Max zoom in => zoom = 21
int dimension = (int)zoom * 10;
if (dimension == currentDimension)
return;
currentDimension = dimension;
map.Clear();
foreach (var pin in customPins)
{
var immutableBitmap = BitmapFactory.DecodeResource(Context.Resources, Resource.Drawable.icon);
var mutableBitmap = immutableBitmap.Copy(Bitmap.Config.Argb8888, true);
mutableBitmap.Height = dimension;
mutableBitmap.Width = dimension;
BitmapDescriptorFactory.FromBitmap(mutableBitmap);
var img = BitmapDescriptorFactory.FromBitmap(mutableBitmap);
var marker = new MarkerOptions();
marker.SetPosition(new LatLng(pin.Pin.Position.Latitude, pin.Pin.Position.Longitude));
marker.SetTitle(pin.Pin.Label);
marker.SetSnippet(pin.Pin.Address);
marker.SetIcon(img);
map.AddMarker(marker);
}
}
And here is a picture of how it looks:
In the original code, I would do something like this
var img = BitmapDescriptorFactory.FromResource(Resource.Drawable.icon);
And this works. But I want to resize the image, so I found a way to do it, sort of.
Any idea what's going on?
Once you have a Bitmap, you can scale it via Bitmap.CreateScaledBitmap:
var bitmap = BitmapFactory.DecodeResource(Context.Resources, Resource.Drawable.icon);
var scaledBitmap = Bitmap.CreateScaledBitmap(bitmap, dimension, dimension, false);
var img = BitmapDescriptorFactory.FromBitmap(scaledBitmap);
Note: Make sure you Recycle and Dispose your Bitmaps to avoid leaking memory.
Note: BitmapFactory.DecodeResource and Bitmap.CreateScaledBitmap are heavy API calls; you might want to cache the results instead of calling them over and over in your foreach (var pin in customPins) loop.
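For example, a sketch of caching one descriptor per dimension (the _iconCache field and GetIcon helper are hypothetical), so the decode and scale happen once per zoom level rather than once per pin:
private readonly Dictionary<int, BitmapDescriptor> _iconCache = new Dictionary<int, BitmapDescriptor>();

private BitmapDescriptor GetIcon(int dimension)
{
    BitmapDescriptor icon;
    if (!_iconCache.TryGetValue(dimension, out icon))
    {
        var bitmap = BitmapFactory.DecodeResource(Context.Resources, Resource.Drawable.icon);
        var scaledBitmap = Bitmap.CreateScaledBitmap(bitmap, dimension, dimension, false);
        icon = BitmapDescriptorFactory.FromBitmap(scaledBitmap);
        // Assuming the descriptor holds its own copy, the intermediates can be released here
        bitmap.Recycle();
        scaledBitmap.Recycle();
        _iconCache.Add(dimension, icon);
    }
    return icon;
}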

iOS hangs when calling Texture2D.readPixels

I'm attempting to draw a RenderTexture into a Texture2D with the goal of saving it to disk. This approach has been working in the OSX editor, as well as on Android.
I don't see any errors in the Xcode console, and my app becomes completely frozen when I call Texture2D.ReadPixels().
Here's a summary of the code:
// declaring variables...
RenderTexture outputTexture;
RenderTextureFormat RTFormat = RenderTextureFormat.ARGB32;
// use an appropriate format for render textures
if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBFloat)){
RTFormat = RenderTextureFormat.ARGBFloat;
}else if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBHalf)){
RTFormat = RenderTextureFormat.ARGBHalf;
}
// create instance of output texture
outputTexture = new RenderTexture (res.x, res.y, 0, RTFormat);
// in Update, draw stuff to outputTexture
Graphics.Blit (outputTexture, canvasTexture);
Graphics.Blit (canvasTexture, outputTexture, material);
// later... user wants to save the image
// draw rendertexture to a Texture2D so we can write to disk
RenderTexture.active = outputTexture;
tmpTexture = new Texture2D (outputTexture.width, outputTexture.height, TextureFormat.ARGB32, false);
tmpTexture.ReadPixels (new Rect (0, 0, outputTexture.width, outputTexture.height), 0, 0, false);
tmpTexture.Apply ();
RenderTexture.active = null;
I have tried using a variety of RenderTextureFormat and TextureFormat, but nothing seems to work!
I believe this is being caused by your render texture format call. Something similar happened to me before.
This bit of code assigns a default texture format, then alters the default format if the currently executing environment supports it (my comments are added):
//set a default render texture of RenderTextureFormat.ARGB32;
RenderTextureFormat RTFormat = RenderTextureFormat.ARGB32;
// if my system supports it, switch to either ARGBFloat or ARGBHalf
// use an appropriate format for render textures
if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBFloat)){
RTFormat = RenderTextureFormat.ARGBFloat;
}else if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBHalf)){
RTFormat = RenderTextureFormat.ARGBHalf;
}
However, when you actually define your temp texture later to fill with ReadPixels(), you only ever define it one way (again, my comments added):
//define a new tmpTexture container, ALWAYS with a TextureFormat of ARGB32
tmpTexture = new Texture2D (outputTexture.width, outputTexture.height, TextureFormat.ARGB32, false);
Thus on some systems (whichever support those formats), you're trying to ReadPixels() from one texture format into another. This is probably causing your issue.
You can fix this by also dynamically changing the format of the destination texture. So in the first section, you would change it to:
RenderTextureFormat RTFormat = RenderTextureFormat.ARGB32;
//add another variable here for the destination Texture format
var destinationFormat = TextureFormat.ARGB32;
// use an appropriate format for render textures
if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBFloat)){
RTFormat = RenderTextureFormat.ARGBFloat;
//also set destination format
destinationFormat = TextureFormat.RGBAFloat;
}else if(SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBHalf)){
RTFormat = RenderTextureFormat.ARGBHalf;
//also set destination format
destinationFormat = TextureFormat.RGBAHalf;
}
and then of course, later consume the dynamically set format when declaring the destination object:
//define a new tmpTexture container, with a dynamically set destination format that always matches the input texture
tmpTexture = new Texture2D (outputTexture.width, outputTexture.height, destinationFormat, false);
Let me know if you still have issues in the comments.

Windows Phone generating Tile - can't seem to make it transparent

I am generating a tile from within my application, and when it's displayed the background image that it's using as the basis for the tile has lost its transparency (and therefore it's not picking up the theme color).
The background image has an icon on it and is transparent. When I use it as the standard tile (i.e. I don't generate an image with it) then it's fine and the transparency is all good.
But when I use it as the background image and add my own container over it, it's not transparent; the background shows as black.
The relevant code is as follows:
// [...]
var container = new Grid();
if (isWide)
{
container = CreateContainerWide(tileInfo);
}
else
{
container = CreateContainerMedium(tileInfo);
}
// Add the background
container.Background = new ImageBrush
{
ImageSource = background,
Opacity = opacity
};
// Force the container to render itself
container.Arrange(new Rect(0, 0, width, height));
// Write the image to disk and return the filename
return WriteShellTileUIElementToDisk(container, baseFileName);
}
static string WriteShellTileUIElementToDisk(UIElement element, string baseFileName)
{
var wb = new WriteableBitmap(element, null);
// All content must be in this sub-folder of IsoStore
string fileName = SharedImagePath + baseFileName + ImageExtension;
var stream = new IsolatedStorageFileStream(fileName, System.IO.FileMode.Create, Isf);
// Write the JPEG using the standard tile size
// Sometimes the bitmap has (0,0) size and this fails for unknown reasons with an argument exception
if (wb.PixelHeight > 0)
wb.SaveJpeg(stream, wb.PixelWidth, wb.PixelHeight, 0, JpegQuality);
else
{
Debug.WriteLine("Can't write out file because bitmap had 0,0 size; not sure why");
// indicate that there is an issue
fileName = null;
}
stream.Close();
// Return the filename
return fileName;
}
It doesn't seem to make any difference what I set the Opacity of the ImageBrush to.
If I use a solid color rather than a transparent layer then it's all fine. Somehow the creation of the PNG is losing the transparency.
Any ideas?
Thanks
This answer might be helpful. Instead of saving a JPEG, you can save the image as a PNG, which does support transparency. The library mentioned in the answer is quite useful.
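A sketch of what WriteShellTileUIElementToDisk would look like going the PNG route (WritePNG below is a placeholder for whatever encoding call the linked library actually exposes; the key change is replacing SaveJpeg):
static string WriteShellTileUIElementToDiskAsPng(UIElement element, string baseFileName)
{
    var wb = new WriteableBitmap(element, null);
    // Use a .png extension so the file really is stored as a PNG
    string fileName = SharedImagePath + baseFileName + ".png";
    using (var stream = new IsolatedStorageFileStream(fileName, System.IO.FileMode.Create, Isf))
    {
        // PNG keeps the alpha channel, so the theme color can show through on the tile
        wb.WritePNG(stream); // placeholder for the library's PNG-encoding method
    }
    return fileName;
}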
