Touch and Canvas element location do not match - C#

I'm trying to create a game like WordCookie. I want to use a LineRenderer, so I cannot use the Screen Space - Overlay canvas render mode. When using either Screen Space - Camera or World Space, my touch position doesn't match the position I get from my RectTransforms.
I access my buttons' transform position by logging:
foreach (RectTransform child in buttons.transform)
{
    Debug.Log("assigning" + child.transform);
}
For the touch coordinates, I simply log the touch.position.
To clarify: I want to trigger my LineRenderer when the distance between the two position vectors is smaller than a certain float. However, whenever I tap on my button to test this, the button logs at (1.2, -2.6) and my touch at (212.2, 250.4).
What could be causing this?

touch.position returns a value in pixel (screen) coordinates.
In order to convert it to world coordinates you need to use Camera.ScreenToWorldPoint.
It should look something like this:
Touch touch = Input.GetTouch(0);
// the z passed to ScreenToWorldPoint is the distance from the camera to the target plane;
// using the button's world z works if the camera sits at z = 0 looking along +Z
float buttonPositionZ = button.transform.position.z;
Vector3 worldPosition = camera.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, buttonPositionZ));
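As a rough sketch of the distance check described in the question, building on the conversion above (untested; "cam", "buttons" and "triggerDistance" are placeholder names for your own camera, button container and threshold):

if (Input.touchCount > 0)
{
    Touch touch = Input.GetTouch(0);
    foreach (RectTransform child in buttons.transform)
    {
        // Distance from the camera to the button along the camera's view direction,
        // so the converted touch lands on the same plane as the button.
        float depth = Vector3.Dot(child.position - cam.transform.position, cam.transform.forward);
        Vector3 touchWorld = cam.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, depth));

        if (Vector3.Distance(touchWorld, child.position) < triggerDistance)
        {
            // Both points are now in world space, so the comparison is meaningful;
            // trigger your LineRenderer logic here.
        }
    }
}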

Related

How do I rotate my player based on my current world mouse position 3D (isometric-like viewing angle)

This game is 3D, but the view is 'orthographic-like'; please see the illustration below for more clarity on the viewing angle.
I'm trying to set my player's rotation to always face the mouse position in the game world, but only rotating the player around the Y rotational axis.
My pseudocode is something like:
get mouse pos on screen.
get player pos in world.
convert mouse pos on screen to mouse pos in world.
determine distance from mouse pos to player pos.
use atan2 on x, z to calculate angle.
use angle to continually adjust player rotation to mouse pos.
see: illustration
Starting from your pseudocode, you could find some of these by simply looking into the API:
get mouse pos on screen.
Unity already provides this: Input.mousePosition
get player pos in world.
Once you have a reference to the corresponding GameObject, or better yet its Transform directly, you can simply access its position.
convert mouse pos on screen to mouse pos in world.
There are multiple solutions, such as Camera.ScreenToWorldPoint. In this case, however, it is easier to create a mathematical Plane, use Camera.ScreenPointToRay to get a ray for your mouse, and pass that ray into Plane.Raycast.
determine distance from mouse pos to player pos.
use atan2 on x, z to calculate angle.
use angle to continually adjust player rotation to mouse pos
These are not necessary since Unity already does all of this for you ;)
Instead you can simply calculate the direction vector from your player to the point where the mouse ray hits the plane, remove the difference in the Y axis, and then use Quaternion.LookRotation to rotate the player so it looks in that direction.
So it could look like this, for example:
// drag in your player object here via the Inspector
[SerializeField] private Transform _player;
// if possible, already drag your camera in here via the Inspector
[SerializeField] private Camera _camera;

private Plane plane;

void Start()
{
    // create a mathematical plane where the ground would be,
    // e.g. lying flat on the XZ axes at Y = 0;
    // if your ground is placed differently you'd have to adjust this here
    plane = new Plane(Vector3.up, Vector3.zero);

    // as a fallback use the main camera
    if (!_camera) _camera = Camera.main;
}

void Update()
{
    // only rotate the player while the mouse is pressed;
    // change/remove this according to your needs
    if (Input.GetMouseButton(0))
    {
        // create a ray from the mouse position into the scene
        var ray = _camera.ScreenPointToRay(Input.mousePosition);

        // use this ray to raycast against the mathematical floor plane;
        // "enter" will be a float holding the distance from the camera
        // to the point where the ray hit the plane
        if (plane.Raycast(ray, out var enter))
        {
            // get the 3D world point where the ray hit the plane
            var hitPoint = ray.GetPoint(enter);

            // project the player position onto the plane so you get the position
            // only in XZ and can directly compare it to the mouse ray hit
            // without any difference in the Y axis
            var playerPositionOnPlane = plane.ClosestPointOnPlane(_player.position);

            // now there are multiple options, but you could simply rotate the player so it faces
            // the same direction as the one from playerPositionOnPlane -> hitPoint
            _player.rotation = Quaternion.LookRotation(hitPoint - playerPositionOnPlane);
        }
    }
}
I suggest two ways:
1) Use a raycast that hits the surface your character stands on, with something like ScreenPointToRay. Get the hit from the ray and rotate your character towards the hit point, computing the angle between the character position and the hit point.
2) Convert the character position to a screen point with Camera.WorldToScreenPoint, and then compute the angle between the mouse position and the character's screen position.
Also be aware of the LookAt function; it may come in handy.
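A minimal sketch of the first option (untested; assumes the ground has a collider, and "cam"/"player" are placeholder references to your camera and character):

Ray ray = cam.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
// Hit the surface the character is standing on.
if (Physics.Raycast(ray, out hit))
{
    Vector3 toHit = hit.point - player.position;
    toHit.y = 0f; // keep the rotation around the Y axis only
    if (toHit.sqrMagnitude > 0.0001f)
    {
        player.rotation = Quaternion.LookRotation(toHit);
    }
}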

Calculate object position in screen space - camera canvas

I need to calculate the correct position of a 3D object which I'm displaying over a RawImage in my UI. If I put my UI image in the center of the screen I can calculate it perfectly, but if I move the image around my UI canvas I get results which are off by an offset, depending on which side I move it to. I've taken a couple of screenshots of what I mean: the orange shape is my 3D quad, and the white square is just a debug image showing where my calculated point lands.
The setup I have is this:
- a world camera pointing on a 3D quad
- a ui canvas with a dedicated perspective camera (Screen space - Camera)
- a panel in my ui canvas displaying the 3D quad
The code:
var worldPoint = t.Value.MeshRenderer.bounds.min; //t.Value is the 3D quad
var screenPoint = worldCamera.WorldToScreenPoint(worldPoint);
screenPoint.z = (baseCanvas.transform.position - uiCamera.transform.position).magnitude; //baseCanvas is the UI canvas
var pos = uiCamera.ScreenToWorldPoint(screenPoint);
debugger.transform.position = pos; //debugger is just the square image used to see where my calculated point is landing
I've tried multiple ways, like this:
var screenPoint = worldCamera.WorldToScreenPoint(t.Value.MeshRenderer.bounds.min);
Vector2 localPoint;
RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPoint, uiCamera, out localPoint); //rectTransform is my UI panel
debugger.transform.localPosition = localPoint;
But I always get the same result. How can I make the calculation correct, taking the offset into account?

How to use Graphic Raycaster with WorldSpace UI?

I'm trying to figure out how the Graphic Raycaster works, but the documentation doesn't help. I want to use it to cast a ray from some position at a certain angle and hit the UI. The other thing is that I don't know how to make it interact with the UI (drag, click, etc.). I know that's a broad subject, but I just can't find any good explanation of how to use it, so I would be grateful for any explanation.
From Unity docs:
The Graphic Raycaster is used to raycast against a Canvas. The Raycaster looks at all Graphics on the canvas and determines if any of them have been hit.
You can use EventSystem.RaycastAll to raycast against graphic (UI) elements.
Here is a short example for your case:
// requires: using System.Collections.Generic; using UnityEngine.EventSystems;
void Update()
{
    // Example: get controller's current orientation:
    Quaternion ori = GvrController.Orientation;

    // If you want a vector that points in the direction of the controller
    // you can just multiply this quat by Vector3.forward:
    Vector3 vector = ori * Vector3.forward;

    // ...or you can just change the rotation of some entity in your scene
    // (e.g. the player's arm) to match the controller's orientation
    playerArmObject.transform.localRotation = ori;

    // Example: check if the touchpad was just touched
    // (TouchDown is true for 1 frame after the touchpad is touched)
    if (GvrController.TouchDown)
    {
        PointerEventData pointerData = new PointerEventData(EventSystem.current);
        // use the position from the controller as the start of the raycast instead of mousePosition
        pointerData.position = Input.mousePosition;

        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);

        if (results.Count > 0)
        {
            // "WorldUI" is my layer name
            if (results[0].gameObject.layer == LayerMask.NameToLayer("WorldUI"))
            {
                string dbg = "Root Element: {0} \n GrandChild Element: {1}";
                Debug.Log(string.Format(dbg, results[results.Count - 1].gameObject.name, results[0].gameObject.name));
                //Debug.Log("Root Element: " + results[results.Count - 1].gameObject.name);
                //Debug.Log("GrandChild Element: " + results[0].gameObject.name);
                results.Clear();
            }
        }
    }
}
I have not tested the above script myself, so there might be some errors.
Here are some other references to help you understand more:
Graphics Raycaster of Unity; How does it work?
Raycast against UI in world space
How to raycast against uGUI objects from an arbitrary screen/canvas position
How do you perform a Graphic Raycast?
GraphicRaycaster
Hope it helps.
Umair M's current suggestion doesn't handle the fact that the ray originates in world space and travels at an angle.
It doesn't look to me like you can do a GUI raycast in world space at an angle, even if your canvas is in world space. This page suggests the technique of creating a non-rendering camera, moving it around in 3D space along the ray you want to cast, and then doing the GUI raycast relative to that camera. I haven't tried it yet, but it sounds promising.
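A rough sketch of that helper-camera idea, just to make it concrete (untested, like the technique itself; all field names are placeholders, with the disabled camera and the world-space canvas' GraphicRaycaster assigned in the Inspector):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class WorldSpaceUIRaycaster : MonoBehaviour
{
    [SerializeField] private Camera raycastCamera;             // disabled camera, used only as a reference frame
    [SerializeField] private Canvas worldCanvas;               // the world-space canvas to hit
    [SerializeField] private GraphicRaycaster canvasRaycaster; // GraphicRaycaster on that canvas

    public List<RaycastResult> RaycastUI(Vector3 origin, Vector3 direction)
    {
        // Place the helper camera on the ray, looking along it.
        raycastCamera.transform.SetPositionAndRotation(origin, Quaternion.LookRotation(direction));
        // The GraphicRaycaster evaluates hits from the viewpoint of the canvas' event camera.
        worldCanvas.worldCamera = raycastCamera;

        // A pointer at the centre of the helper camera's viewport lies exactly on the ray.
        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = new Vector2(raycastCamera.pixelWidth * 0.5f, raycastCamera.pixelHeight * 0.5f)
        };

        var results = new List<RaycastResult>();
        canvasRaycaster.Raycast(pointerData, results);
        return results;
    }
}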

How do I convert point to local coordinates?

When a user taps a button, I want the tap to be ignored if a transparent pixel is hit, and the button beneath it should receive the tap instead.
I can do this sort of behaviour in Objective-C within a few minutes; however, trying to do it in Unity in C# has proven impossible, as there is no way to convert a point from the screen to a local point. Obviously either the center, the bottom left, or the top left corner should be the origin point (I don't know which, because Unity continuously changes where the origin is depending on the phase of the moon).
I've tried looking at IsRaycastLocationValid from ICanvasRaycastFilter, and inside it I've used both RectTransformUtility.ScreenPointToLocalPointInRectangle and RectTransformUtility.ScreenPointToWorldPointInRectangle, and the result is always the same. The RectTransform I pass in is the one belonging to the button.
eg:
public bool IsRaycastLocationValid(Vector2 screenPosition, Camera raycastEventCamera) // uGUI callback
{
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPosition, null, out localPoint);

    Vector2 pivot = rectTransform.pivot - new Vector2(0.5f, 0.5f);
    Vector2 pivotScaled = Vector2.Scale(rectTransform.rect.size, pivot);
    Vector2 realPoint = localPoint + pivotScaled;

    Debug.Log(screenPosition + " " + localPoint);
    return false;
}
Which I got from someone else's unanswered attempt here
Unity 3d 4.6 Raycast ignore alpha on sprite
I've found this link
http://forum.unity3d.com/threads/alpha-area-on-round-gui-button-return-click-event.13608/
where someone tries to determine if the mousePosition is within the image bounds. The obvious problem is that the image has to have a corner at (100, 100). Magic numbers are bad, and trying to figure out how to get those coordinates is next to impossible.
The following properties on RectTransform are always the same no matter where you place the button:
anchoredPosition
anchoredPosition3D
anchorMax
anchorMin
offsetMax
offsetMin
pivot
rect
sizeDelta
As you can see, those are all the properties a RectTransform has. Therefore it is impossible to tell where a button is, and impossible to tell where a click is in the button's coordinate space.
Can someone please tell me how to do what I need to do?
Should do the trick:
bool isScreenPointInsideRectTransform(Vector2 screenPoint, RectTransform rectTransform, Camera canvasCamera)
{
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPoint, canvasCamera, out localPoint);
    return rectTransform.rect.Contains(localPoint);
}
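As a usage sketch, the local point from this helper can then be mapped into the sprite's texture to check the pixel's alpha, which is what the question is after. This assumes a Simple (non-sliced, non-atlased) sprite that fills the whole rect, with Read/Write enabled on its texture; "image" and "rectTransform" are the button's own Image and RectTransform:

public bool IsRaycastLocationValid(Vector2 screenPosition, Camera raycastEventCamera)
{
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPosition, raycastEventCamera, out localPoint);

    // Normalise the local point to 0..1 across the rect (origin at the rect's lower-left corner).
    Rect rect = rectTransform.rect;
    float u = (localPoint.x - rect.xMin) / rect.width;
    float v = (localPoint.y - rect.yMin) / rect.height;

    // Sample the sprite texture; transparent pixels let the tap fall through to whatever is underneath.
    Color pixel = image.sprite.texture.GetPixelBilinear(u, v);
    return pixel.a > 0.1f;
}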
That's exactly why you don't find the location of the point inside the button itself... instead you compare the location of the point inside the button's parent rect with the button's localPosition.
Alternatively, you can translate the point to screen space and then compare it with the mouse position.
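A sketch of that screen-space variant (assuming "canvasCamera" is the canvas' render camera, or null for an overlay canvas):

// Convert the button's position to screen space and compare against the pointer in pixels.
Vector2 buttonScreenPoint = RectTransformUtility.WorldToScreenPoint(canvasCamera, rectTransform.position);
float pixelDistance = Vector2.Distance(buttonScreenPoint, (Vector2)Input.mousePosition);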

Rotate an object according to terrain in Unity (C#)

I currently have an item placement system for building. It works by instantiating a "ghost" of the object that shows where it can be placed; this semi-transparent ghost object is attached to the camera, and the real object is instantiated in its place when the player clicks.
I get the position at which to keep the ghost object like so:
var pos = transform.position + transform.forward * placeDistance; // A position 'someDistance' in front of the player
pos.y = Terrain.activeTerrain.SampleHeight(pos); // Get the position at the surface of the terrain to place the object
firePlaceable.transform.position = pos + Vector3.up * 0.001f; // 'halfHeight' is used in case the pivot is not on the base
Now... I need the object to rotate according to the terrain so that the fireplace ends up more or less correctly rotated. Any ideas? What would be the best approach?
Use the terrain's normal vector at the placement position.
For example, you could do a raycast straight down from the fireplace. The resulting hit contains a normal that is your fireplace's up vector.
Come to think of it... I assume you are already doing a raycast to get the position to place the fireplace, right?
Use that placement raycast to get the up vector instead of making a new one.
So basically do:
fireplace.transform.up = clickPlaceHit.normal;
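For completeness, a rough sketch combining this with the placement code from the question (assuming the terrain has its default TerrainCollider; the ray origin offset and length are arbitrary):

var pos = transform.position + transform.forward * placeDistance;
RaycastHit hit;
// Cast straight down onto the terrain to get both the surface point and its normal.
if (Physics.Raycast(pos + Vector3.up * 100f, Vector3.down, out hit, 200f))
{
    firePlaceable.transform.position = hit.point + Vector3.up * 0.001f;

    // Keep the ghost's facing direction but tilt it so its up axis matches the terrain normal.
    Vector3 forwardOnSlope = Vector3.ProjectOnPlane(transform.forward, hit.normal);
    firePlaceable.transform.rotation = Quaternion.LookRotation(forwardOnSlope, hit.normal);
}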
