Why do my Unity3D sprites look pixelated when rendered on my iPhone? - c#

I'm a newbie in Unity3D making a 2D game. I'm working on the menu, and it's all pixelated! On my PC it looks fine, but when I run it on my iPhone, it pixelates.
I have the "compress assets on import" option turned off, and the images' original size is bigger than the size they have in Unity!
This is how it looks on my iPhone:
And this is how it looks on my PC:
PS: All of the components are buttons with PNG images; I don't know if that affects anything.

In your sprite import settings, change "Filter Mode" to "Point".
I believe this prevents the blurring that happens on small screens as displayed items are enlarged, and lets things stay sharp (pixelated).
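If you have many sprites to fix, a small editor script can apply this automatically on import. This is just a sketch, assuming your pixel-art sprites live under a folder whose path contains "Sprites" (adjust the check for your project):

    // Editor/PixelArtImportSettings.cs
    // Applies Point filtering (and disables compression and mipmaps) to every
    // texture imported from a path containing "Sprites". The folder name is an
    // example; change it to match your project layout.
    using UnityEditor;
    using UnityEngine;

    public class PixelArtImportSettings : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            if (!assetPath.Contains("Sprites"))
                return;

            TextureImporter importer = (TextureImporter)assetImporter;
            importer.filterMode = FilterMode.Point;                              // no bilinear blur
            importer.textureCompression = TextureImporterCompression.Uncompressed;
            importer.mipmapEnabled = false;                                      // mipmaps soften small UI sprites
        }
    }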

One thing you can try is to set your font size high, scale your objects up, and then reduce the size back down (not by scaling). Use Point filtering for images.

Related

Unity VR Render Textures, Scene space rendering, and blending layers

[using unity 2020.3]
I'm trying to slowly blend different layers in and out in VR, with both layers being visible while the fade between them occurs. Right now, I am using two cameras, one as the main camera and one rendering to a render texture (each only rendering its respective layers). Then I use UI to fade the render texture in and out. This looks and works great in 2D view (including builds), but UI components do not render in VR.
I am aware that rendering this in VR will require 4 sets of rendering (two for each eye), but I'd still like to know how to generate and display a render texture for each eye using unity.
This effect can be done in other ways and I'm open to suggestions. There are a lot of different types of elements I wish to fade in and out, so while I'm aware of one solution (adding transparent shaders and fading particles), that can be tedious and requires a lot of setup (I'd like more of a permanent solution for any project). That being said, I'd still like to know how to manipulate what is being rendered out to the VR headset.
I'm fairly certain that the "Screen space effects" section of the Unity doc on Single Pass Stereo rendering (Double-Wide rendering) ( https://docs.unity3d.com/Manual/SinglePassStereoRendering.html ) is what I'm looking for; however, this still doesn't answer how to get the render texture for each eye (and I'm a little confused about how to use what they have written).
I'm happy to elaborate more and test some things out! Thank you in advance!
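For reference, a rough sketch of the two-camera setup described above, as it works in 2D view. The field names (overlayCamera, overlayImage) and the fade duration are placeholders; the UI part is exactly what fails to render in VR:

    // Second camera renders only the fading layers into a RenderTexture,
    // which a RawImage on a screen-space canvas displays and fades via its alpha.
    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;

    public class LayerFade : MonoBehaviour
    {
        public Camera overlayCamera;   // culling mask limited to the layers being faded
        public RawImage overlayImage;  // UI element showing the render texture
        public float fadeDuration = 2f;

        RenderTexture overlayTexture;

        void Start()
        {
            overlayTexture = new RenderTexture(Screen.width, Screen.height, 24);
            overlayCamera.targetTexture = overlayTexture;
            overlayImage.texture = overlayTexture;
        }

        public void FadeOut() { StartCoroutine(Fade(1f, 0f)); }
        public void FadeIn()  { StartCoroutine(Fade(0f, 1f)); }

        IEnumerator Fade(float from, float to)
        {
            for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
            {
                Color c = overlayImage.color;
                c.a = Mathf.Lerp(from, to, t / fadeDuration);
                overlayImage.color = c;
                yield return null;
            }
        }
    }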

Setting Screen to Format on all Windows PC sizes

The game looks fine inside the Unity screen view, but if I build and choose a different resolution screen size, the game looks goofy and disproportioned, especially when I go into full screen mode. I'm using Unity 2019.1.1a (I think it's still in beta?). I'm developing a top-down 2D game for Windows PC.
How can I fix it to where the game will look the same on any screen size?
Unity has some good tutorials on this topic:
https://unity3d.com/es/learn/tutorials/topics/user-interface-ui/ui-tools-resolution-device-independence
https://docs.unity3d.com/Manual/HOWTO-UIMultiResolution.html
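In short, for UI the usual starting point is a Canvas Scaler set to "Scale With Screen Size". A minimal sketch, where the 1920x1080 reference resolution is only an example value:

    // Configure the Canvas Scaler so the UI scales with the actual screen size.
    using UnityEngine;
    using UnityEngine.UI;

    [RequireComponent(typeof(CanvasScaler))]
    public class ConfigureCanvasScaler : MonoBehaviour
    {
        void Awake()
        {
            CanvasScaler scaler = GetComponent<CanvasScaler>();
            scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
            scaler.referenceResolution = new Vector2(1920, 1080); // design resolution (example)
            scaler.screenMatchMode = CanvasScaler.ScreenMatchMode.MatchWidthOrHeight;
            scaler.matchWidthOrHeight = 0.5f;                     // blend between matching width and height
        }
    }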

Resolutions in a XNA Game

I am going to play around with making an XNA game.
The Windows Store has two base resolutions it recommends you support: 1024x768 and 1366x768.
But after that there are no restrictions.
The common advice is to use a ViewBox that will scale your content for you.
But an XNA game does not have a viewbox. It has a draw method where you render your content.
What is the common way for Games (XNA or DirectX) to adapt to different resolutions?
I would rather not have to make images for each and every resolution out there. It would be a lot of work and I am bound to miss some.
Is there a better way?
GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Width and .Height will give you the current desktop resolution.
Then you can update the Game.GraphicsDevice.Viewport variables to use these settings.
The above code usually goes in the Game1.cs constructor.
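A minimal sketch of what that constructor code might look like, assuming the standard Game1 template with a GraphicsDeviceManager field named graphics (here the desktop resolution is applied via the preferred back buffer size):

    public Game1()
    {
        graphics = new GraphicsDeviceManager(this);
        Content.RootDirectory = "Content";

        // Match the back buffer to the current desktop resolution.
        graphics.PreferredBackBufferWidth = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Width;
        graphics.PreferredBackBufferHeight = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Height;
        graphics.IsFullScreen = true; // optional
    }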
The link provided below documents one technique that makes your sprites, backgrounds, etc. look correct independently of the resolution (the theory should be sound even if the code is not 100% up to date):
http://msdn.microsoft.com/en-us/library/bb447674%28v=xnagamestudio.10%29.aspx
There are two different approaches I read about in a tutorial:
Resize everything to the new viewport, even if that changes the aspect ratio (see the sketch below).
Just draw more of the surroundings around the scene.
Of course you can mix both: Resize everything as long as it can still be in the same aspect ratio (e.g. 4:3 or 16:9) and then show more or less background / surroundings.
You can also decide to display black bars instead of more surroundings, if it is important for everyone to have exactly the same view (e.g. for fairness), but in such a case it might be a better idea to use fog of war to reduce sight.
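For the "resize everything" approach, one common trick is to design against a fixed virtual resolution and pass a scale matrix to SpriteBatch.Begin. A rough sketch, where the 1280x720 virtual size is just an example (this stretches if the aspect ratio changes):

    const int VirtualWidth = 1280;
    const int VirtualHeight = 720;

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Black);

        // Scale from the virtual resolution to whatever the back buffer actually is.
        float scaleX = (float)GraphicsDevice.Viewport.Width / VirtualWidth;
        float scaleY = (float)GraphicsDevice.Viewport.Height / VirtualHeight;
        Matrix scale = Matrix.CreateScale(scaleX, scaleY, 1f);

        spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                          null, null, null, null, scale);
        // ... draw everything in virtual-resolution coordinates ...
        spriteBatch.End();

        base.Draw(gameTime);
    }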

is there a way to make a 3D game in winforms or WPF in C#?

I'm thinking about making a 3D point-and-click game. Is it possible to make one in WinForms or WPF? I don't need any physics or anything; all I need is for the application to render 3D objects. I know that I can use XNA, but if I do then I will have to relearn almost everything again. My third approach would be to make the scenes in a 3D game engine, take a screenshot, and then load it as an image. Any suggestion would be appreciated.
There's a big difference between a 3D game, and just letting players interact with a rendered image.
Your approach of loading a pre-rendered image is possible in both WinForms and WPF. You would just need to capture click events on the image and check the location against your list of active areas, then handle whatever needs to be done, i.e. move to the next area, activate an item, etc.
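A rough WinForms sketch of that idea, assuming a PictureBox showing the pre-rendered scene; the image file name and the hotspot rectangles are placeholders:

    using System.Collections.Generic;
    using System.Drawing;
    using System.Windows.Forms;

    public class SceneForm : Form
    {
        // Named "active areas" in the pre-rendered image (placeholder values).
        readonly Dictionary<string, Rectangle> hotspots = new Dictionary<string, Rectangle>
        {
            { "door",  new Rectangle(400, 150, 120, 260) },
            { "chest", new Rectangle(100, 320,  80,  60) },
        };

        public SceneForm()
        {
            var picture = new PictureBox
            {
                Dock = DockStyle.Fill,
                Image = Image.FromFile("scene01.png") // pre-rendered scene
            };
            picture.MouseClick += OnSceneClick;
            Controls.Add(picture);
        }

        void OnSceneClick(object sender, MouseEventArgs e)
        {
            foreach (var hotspot in hotspots)
            {
                if (hotspot.Value.Contains(e.Location))
                {
                    // e.g. move to the next area, activate an item, etc.
                    MessageBox.Show("Clicked: " + hotspot.Key);
                    return;
                }
            }
        }
    }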
Edit from comment:
It's not so much which is friendlier. You can host an XNA viewport in Winforms/WPF. It's more about how you want your game to work. If you never have moving 3D scenes, XNA is overkill, and images will work just fine.
If you want dynamic scenes, you'll need to be able to render them on the fly. Then, XNA makes more sense. It is a lot more work though compared to just displaying images.
If you just want to show pre-rendered 3d images in your game, why not create them using a real 3d graphics tool, such as 3D Studio Max or Maya (or a free one, such as Blender)? It sounds like there's no need for actually rendering the 3d scenes in a game engine at all.

XNA Game, Full Screen Animation / Video Playback

I have an XNA game (its a slot machine).
I have some really cool animations my artist made for me that are more or less 1600x1000 and over 50 frames.
For all of the animations so far I have been using sprite sheets (where all the frames are in one image file and, when it's rendered, it chooses what part of the image to show).
The problem is that you can only load an image up to a certain size, 2k x 2k or 4k x 4k depending on your video card. Obviously putting every frame into one file is out of the question for this large animation.
Can you just load each image individually and display them in order? (That is what I used to do for the smaller animations anyway before I found out that isn't how you were supposed to do it)
My Questions:
What if any is a good way to play these large animations?
Is there a benefit to having a spritesheet instead of loading the frames in individually as Texture2D's?
Is there a (free) way to play fullscreen videos in XNA?
Apparently, XNA 3.1 "now supports the ability to play back video that can be used for such purposes as opening splash and logo scenes, cut scenes, or in-game video displays." That is what you'll want to use - the sizes you're talking about are far too big for conventional animation techniques. Some sample code is here.
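A minimal sketch of that video playback API (Video and VideoPlayer from Microsoft.Xna.Framework.Media), assuming a video asset named "intro" in the content project:

    Video introVideo;
    VideoPlayer videoPlayer;

    protected override void LoadContent()
    {
        spriteBatch = new SpriteBatch(GraphicsDevice);
        introVideo = Content.Load<Video>("intro");   // asset name is a placeholder
        videoPlayer = new VideoPlayer();
        videoPlayer.Play(introVideo);
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Black);

        if (videoPlayer.State != MediaState.Stopped)
        {
            // Draw the current video frame stretched over the whole viewport.
            Texture2D frame = videoPlayer.GetTexture();
            Rectangle screen = new Rectangle(0, 0,
                GraphicsDevice.Viewport.Width, GraphicsDevice.Viewport.Height);

            spriteBatch.Begin();
            spriteBatch.Draw(frame, screen, Color.White);
            spriteBatch.End();
        }

        base.Draw(gameTime);
    }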
