Screen and World space UI in the same scene - c#

I have a Unity project where I need to have both screen space and world space UI elements (it's an AR project, so I need to have some buttons in world space and others attached to the screen).
As I understand it, I shouldn't have more than one UI canvas in the scene, so how can I have both screen space and world space UI elements in the same scene?

You can have more than one Canvas in a Unity scene. Just right click in the Hierarchy and add two Canvases; make one World Space and the other Screen Space. Check the official tutorial here.
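As a rough sketch (the names and values here are my own assumptions, not from the question), the same setup can be created from a script; the key point is simply that each Canvas has its own Render Mode:

using UnityEngine;
using UnityEngine.UI;

public class CanvasSetup : MonoBehaviour
{
    public Camera arCamera; // the camera used by the AR rig (assumption)

    void Start()
    {
        // Screen space canvas: buttons attached to the screen
        var screenGO = new GameObject("ScreenCanvas", typeof(Canvas), typeof(CanvasScaler), typeof(GraphicRaycaster));
        screenGO.GetComponent<Canvas>().renderMode = RenderMode.ScreenSpaceOverlay;

        // World space canvas: buttons placed in the AR world
        var worldGO = new GameObject("WorldCanvas", typeof(Canvas), typeof(GraphicRaycaster));
        var worldCanvas = worldGO.GetComponent<Canvas>();
        worldCanvas.renderMode = RenderMode.WorldSpace;
        worldCanvas.worldCamera = arCamera;                  // event camera for world space UI
        worldGO.transform.position = arCamera.transform.position + arCamera.transform.forward * 2f;
        worldGO.transform.localScale = Vector3.one * 0.001f; // world space canvases are huge by default
    }
}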

Related

Why is Unity Canvas Image always rendering over my gameObjects even though it is definitely behind them?

In my 2D Unity project, I have a Canvas with an Image that I want to use as a background.
I have two GameObjects in front of this background, but no matter how much fiddling I do with Pos Z, sorting layers, or hierarchy order, the Image is always in front of the objects.
The gif above shows, in 3D mode, that even though the Image is clearly behind these objects, it always appears over them when they overlap.
Hierarchy:
Main Camera (Inspector: https://i.imgur.com/Q5a52cf.png)
BackgroundCanvas (Inspector: https://i.imgur.com/m9Pxr6B.png)
    BackgroundImage (Inspector: https://i.imgur.com/jTx7pEW.png)
Object1 (Inspector: https://i.imgur.com/YcClEhk.png)
Object2
Any advice to rescue me from this madness is much appreciated.
Set the sprite renderer's transform Z value to 0 instead of 100.
If that does not solve it, please also specify the camera properties so I can try to recreate the exact setup.
Try clicking Layers -> Edit Layers; under Sorting Layers you can change the order by dragging a layer, and everything higher in the list appears behind in the camera.
You could create a layer called Object and assign it to the game objects.
Then create an object camera:
Culling Mask -> Object layer
Depth bigger than your current main camera
Projection -> Orthographic
Clear Flags -> Solid Color
Finally, set the canvas Render Mode to Screen Space - Camera and assign the Render Camera to be the Object Camera.
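One way to wire up the second-camera idea from a script, as a sketch under my own assumptions: the layer is literally named "Object", and I've used Depth Only clear flags instead of a solid color so the main camera's image stays visible behind the objects:

using UnityEngine;

public class ObjectCameraSetup : MonoBehaviour
{
    public Camera mainCamera;    // your current main camera
    public Camera objectCamera;  // the extra camera for the objects

    void Start()
    {
        // Main camera renders everything except the "Object" layer
        mainCamera.cullingMask &= ~LayerMask.GetMask("Object");

        // Object camera renders only the "Object" layer, on top of the main camera
        objectCamera.cullingMask = LayerMask.GetMask("Object");
        objectCamera.depth = mainCamera.depth + 1; // higher depth draws later
        objectCamera.orthographic = true;
        objectCamera.clearFlags = CameraClearFlags.Depth;
    }
}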
In the Inspector tab of the object or background: Sprite Renderer -> Additional Settings -> Sorting Layer -> change it to a different layer.
Had this same issue and was able to fix it with these steps:
In the Canvas settings, change Screen Space - Overlay to Screen Space - Camera.
Set the camera variable to the one you are using for your scene.
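In script form, those two steps amount to roughly this (yourSceneCamera is a placeholder for whatever camera you use for the scene):

Canvas canvas = GetComponent<Canvas>();
canvas.renderMode = RenderMode.ScreenSpaceCamera; // was Screen Space - Overlay
canvas.worldCamera = yourSceneCamera;             // the camera you are using for your scene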
I figured out a workaround. I basically created a VisualElement inside the UI Builder and set a render texture as its background. Then I added an extra camera to my project to view all the sprites that needed to be on top. That camera feeds the render texture, so now everything that camera sees is forced to be on top of the UI Document, as the background of that VisualElement. If you want control over the whole screen, just set the VisualElement's position to absolute and max out its dimensions. If your game doesn't have a fixed aspect ratio, it might cause some stretching, but other than that I can't really tell the difference. Sorting layers for UI Documents are broken and Unity needs to work on that. This is the best option I've found. Hope this helps.
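A minimal sketch of that workaround (the element name "overlay", the render texture asset, and the camera reference are all assumptions on my part):

using UnityEngine;
using UnityEngine.UIElements;

public class OverlaySetup : MonoBehaviour
{
    public UIDocument document;       // the UI Document containing the overlay VisualElement
    public Camera spriteCamera;       // the extra camera that renders the sprites on top
    public RenderTexture overlayRT;   // the render texture fed by that camera

    void OnEnable()
    {
        spriteCamera.targetTexture = overlayRT; // the camera should clear to a color with zero alpha

        // "overlay" is the VisualElement created in UI Builder (name assumed)
        var overlay = document.rootVisualElement.Q<VisualElement>("overlay");
        overlay.style.backgroundImage = new StyleBackground(Background.FromRenderTexture(overlayRT));

        // Absolute position and maxed-out dimensions so it covers the whole screen
        overlay.style.position = Position.Absolute;
        overlay.style.left = 0;
        overlay.style.top = 0;
        overlay.style.right = 0;
        overlay.style.bottom = 0;
    }
}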
I had the same problem, and I fixed it by attaching the camera to the canvas (which is screen space) and then changing the sorting layer of my object to -1.

Spawn prefab in front of UI Image

I spawn a prefab in a GameObject, and that GameObject is a child of a UI Image. What I'm trying to accomplish is to display the prefab in front of the UI Image. I've tried making the Z position of the GameObject negative, but nothing happens. I can't set the sorting layer of the UI Image because it doesn't exist. What should I do? Is this even possible?
Update
I tried adding a second camera. I set the GameObject's layer to SecondCam, changed the second camera's settings to Depth Only and its culling mask to SecondCam only, and set its layer to SecondCam as well. I changed the main camera's culling mask to everything except SecondCam. But this doesn't work. Have I done something wrong?
Second Camera settings
Main Camera settings
If your GameObject is not one of the UI components, you may need to give it another layer and render it with another camera whose depth is higher than the UI camera's.
You can change the sorting layer on prefabs or any object created at runtime.
If your UI canvas is set to Screen Space - Overlay, then it will ALWAYS render on top. You have to use a Screen Space - Camera or World Space UI, along with Dave's answer.
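Combining these answers, a rough sketch (prefab, parentImage, the layer name, and the sorting values are placeholders of mine): the Canvas must not be Screen Space - Overlay, and the spawned prefab's sorting can then be bumped at runtime:

// assumes: public GameObject prefab; public Image parentImage; Canvas set to Screen Space - Camera
GameObject go = Instantiate(prefab, parentImage.transform);
go.transform.localPosition = Vector3.zero;

// sorting can be changed at runtime on the spawned object
var sr = go.GetComponentInChildren<SpriteRenderer>();
sr.sortingLayerName = "Foreground"; // assumed sorting layer name
sr.sortingOrder = 10;               // draw after the UI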

How to let an EventSystem raycast through a part of UI

I'm using a modern (Event System) UI approach in Unity. I have a screen space Canvas, some interactable world space elements, and my camera is properly set up with a Physics Raycaster. I want some of my screen space Canvas elements to let rays through and hit the world space elements.
I set the Event Mask on the Physics Raycaster to just the UI layer, and the elements I want to ignore are on another layer, but that doesn't seem to do anything.
Here's a picture:
The panel itself and the labels on top and bottom are set to the Ignore Raycast layer.
I'm using Unity 2017.4 LTS.
Turn off Raycast Target on the UI elements that you want to be visible only, with no interaction.
https://docs.unity3d.com/ScriptReference/UI.Graphic-raycastTarget.html
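As a small sketch, the same thing can be done for a whole panel at once (the script itself is my own; Graphic.raycastTarget is the standard uGUI property linked above):

using UnityEngine;
using UnityEngine.UI;

public class ClickThroughPanel : MonoBehaviour
{
    void Awake()
    {
        // Turn off Raycast Target on this panel and all of its labels,
        // so pointer events fall through to the world space objects behind it
        foreach (var graphic in GetComponentsInChildren<Graphic>())
            graphic.raycastTarget = false;
    }
}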

How to convert coordinates between cameras with Unity3D

I have two cameras.
Why do I have two cameras?
Because I want some game objects in front of the Canvas so that they can move around, and maybe have some particles if they feel happy, etc. So:
I set up the main camera with the culling mask set to UI and made it in charge of all the UI stuff; to make sure it does not overlap everything, I set the canvas mode to "Screen Space - Camera".
I set up another camera and made it capture everything else except the UI stuff.
I set their depths so that the main camera with the UI renders behind the other camera.
Things go well now; I can see the game objects in front of the UI stuff. Cheers!
But in some cases I want to convert a UI element's position to a point in world space, so that I can spawn a GameObject near the UI element and maybe tween it to move towards another UI element; let's take collecting gems as an example.
Usually, I could just do that with the following code:
GameObject go = Instantiate(eff_gem);
// place the gem at the button's position converted into world space
go.transform.position = Camera.main.ScreenToWorldPoint(p1.GetComponent<RectTransform>().position);
// tween it towards the count panel with DOTween, then clean up
go.transform.DOMove(p2.GetComponent<RectTransform>().position, 1f).OnComplete(() => {
    Destroy(go, 1f);
});
The main idea is to instantiate a gem near a button and make it fly to the count panel.
But since I made the canvas "Screen Space - Camera" instead of "Screen Space - Overlay", all of the above code becomes a mess. Furthermore, I think I have to convert the screen point of the canvas element into world space via the other camera, but I can't even get the actual pixel position of the canvas element from
p1.GetComponent<RectTransform>().position now.
So how can I do that?
I have made a demo project to demonstrate the issue.
var p1R = p1.GetComponent<RectTransform>();
// TransformPoint converts the rect's center from the element's local space into world space,
// which, for a Screen Space - Camera canvas, is a point on the canvas plane
Vector3 p1WorldPos = p1R.TransformPoint(p1R.rect.center);
// use p1WorldPos with your other camera
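For the full gem-fly effect, a hedged sketch along these lines could work; uiCamera, worldCamera, the depth value, and DOTween (as in the question's code) are all assumptions:

using UnityEngine;
using DG.Tweening; // DOTween, as in the question

public class GemFly : MonoBehaviour
{
    public Camera uiCamera;      // camera assigned to the Screen Space - Camera canvas
    public Camera worldCamera;   // camera that renders the gem effect
    public RectTransform p1;     // source UI element (the button)
    public RectTransform p2;     // target UI element (the count panel)
    public GameObject eff_gem;   // gem prefab
    public float depth = 5f;     // distance in front of the world camera

    Vector3 UiToWorld(RectTransform ui)
    {
        // UI world position -> screen pixels (via the UI camera) -> world camera space
        Vector3 screen = uiCamera.WorldToScreenPoint(ui.TransformPoint(ui.rect.center));
        screen.z = depth;
        return worldCamera.ScreenToWorldPoint(screen);
    }

    public void Fly()
    {
        GameObject go = Instantiate(eff_gem, UiToWorld(p1), Quaternion.identity);
        go.transform.DOMove(UiToWorld(p2), 1f).OnComplete(() => Destroy(go, 1f));
    }
}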

Samsung Gear VR - create Menus

How can I create a menu in my application? I use a Canvas, but the Gear VR camera doesn't see it.
Is there a way to use buttons in a Gear VR application?
3D text appears, but Canvas text does not.
Never use screen space UI with VR; switch to world space and either place it somewhere in the scene (as a "real" object) or parent it to the camera so it is always in the center of the viewport.
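A minimal sketch of the parenting approach (vrCamera, the distance, and the scale are assumptions):

using UnityEngine;

public class AttachMenuToCamera : MonoBehaviour
{
    public Canvas menuCanvas;  // the menu canvas
    public Camera vrCamera;    // the Gear VR camera

    void Start()
    {
        menuCanvas.renderMode = RenderMode.WorldSpace;
        menuCanvas.worldCamera = vrCamera;

        // Parent the canvas to the camera so it stays centered in the viewport
        var t = menuCanvas.transform;
        t.SetParent(vrCamera.transform, false);
        t.localPosition = new Vector3(0f, 0f, 2f); // a couple of meters in front of the eyes
        t.localRotation = Quaternion.identity;
        t.localScale = Vector3.one * 0.002f;       // shrink the canvas to a sensible world size
    }
}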
ChanibaL is right, a screen space canvas can not be used in VR.
Last time I designed a menu in VR, I added a sphere around the camera with a solid black color and put the UI buttons inside the sphere as a menu.
I think this may help you.
For more about UI design in VR, I advise reading this post: UI for VR
