Calculate object position in screen space - camera canvas - c#

I need to calculate the correct position of a 3D object which I'm displaying over a RawImage in my UI. If I put my UI image in the center of the screen I can calculate it perfectly, but if I move the image around my UI canvas the results are off by an offset that depends on which side I move it towards. I've taken a couple of screenshots of what I mean: the orange shape is my 3D quad, and the white square is just a debug image showing where my point is being calculated.
The setup I have is this:
- a world camera pointing at a 3D quad
- a UI canvas with a dedicated perspective camera (Screen Space - Camera)
- a panel in my UI canvas displaying the 3D quad
The code:
var worldPoint = t.Value.MeshRenderer.bounds.min; //t.Value is the 3D quad
var screenPoint = worldCamera.WorldToScreenPoint(worldPoint);
screenPoint.z = (baseCanvas.transform.position - uiCamera.transform.position).magnitude; //baseCanvas is the UI canvas
var pos = uiCamera.ScreenToWorldPoint(screenPoint);
debugger.transform.position = pos; //debugger is just the square image used to see where my calculated point is landing
I've tried multiple ways, like this:
var screenPoint = worldCamera.WorldToScreenPoint(t.Value.MeshRenderer.bounds.min);
Vector2 localPoint;
RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPoint, uiCamera, out localPoint); //rectTransform is my UI panel
debugger.transform.localPosition = localPoint;
But I always get the same result. How can I make the calculation correctly account for the offset?
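For what it's worth, a minimal sketch of the RectTransformUtility route, assuming the debug square is a direct child of the rect you pass in (here the canvas itself); worldCamera, uiCamera, baseCanvas, and debugger are the names from the snippets above, and quad stands in for t.Value.MeshRenderer:
// Project the world point into screen pixels with the camera that renders the quad.
Vector3 screenPoint = worldCamera.WorldToScreenPoint(quad.bounds.min);
// Convert screen pixels into the canvas rect's local space using the canvas camera.
RectTransform canvasRect = (RectTransform)baseCanvas.transform;
Vector2 localPoint;
RectTransformUtility.ScreenPointToLocalPointInRectangle(canvasRect, screenPoint, uiCamera, out localPoint);
// localPoint is relative to canvasRect's pivot, so this only lines up when
// debugger is parented directly under canvasRect.
debugger.transform.localPosition = localPoint;
A common source of a side-dependent offset is passing one RectTransform (e.g. the panel) to ScreenPointToLocalPointInRectangle while the marker is parented under a different one; the local point is only meaningful inside the rect it was computed for.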

Related

Unity - drag to resize / scale plane based on handle position?

Alright, I can't find any example of this already done, and my attempt is yielding odd results. I need to drag to resize a flattened cube (like a plane, but it must have thickness, so it's a cube) using a handle in the corner, like a window on your desktop.
So far I've created my handle plane and gotten a reference to it via a script attached to my cube plane. The cube plane and the handle are children of an empty to which the scale is applied, so that the cube plane scales from left to right as desired:
That works; however, my attempt at using the delta of the scale handle's position to scale the parent empty scales either way too much or in odd directions:
void Awake()
{
    scaleHandle = GameObject.FindGameObjectWithTag("ScaleHandle");
    scaleHandleInitialPos = scaleHandle.transform.localPosition;
}

void Update()
{
    width += -1 * (scaleHandleInitialPos.x - scaleHandle.transform.localPosition.x);
    height += -1 * (scaleHandleInitialPos.y - scaleHandle.transform.localPosition.y);
    transform.parent.localScale = new Vector3(width, height, thickness);
}
What is wrong here?
Hierarchy and transform of the child: (screenshots omitted; the white square is the scaleHandle)
The result is the same when updating scaleHandleInitialPos every Update(), or when doing width = scaleHandle.transform.localPosition.x / scaleHandleInitialPos.x; instead.
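For comparison, a minimal sketch of an absolute (rather than accumulated) mapping from handle position to scale; initialScale is a hypothetical field added here, and the other names follow the snippet above:
Vector3 initialScale; // hypothetical: parent's scale captured once at startup

void Awake()
{
    scaleHandle = GameObject.FindGameObjectWithTag("ScaleHandle");
    scaleHandleInitialPos = scaleHandle.transform.localPosition;
    initialScale = transform.parent.localScale;
}

void Update()
{
    // Compute the total offset from the starting handle position each frame,
    // instead of adding a (compounding) delta to width/height every frame.
    float dx = scaleHandle.transform.localPosition.x - scaleHandleInitialPos.x;
    float dy = scaleHandle.transform.localPosition.y - scaleHandleInitialPos.y;
    transform.parent.localScale = new Vector3(initialScale.x + dx, initialScale.y + dy, thickness);
}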

Touch and Canvas element location do not match

I'm trying to create a game like WordCookie. I want to use a LineRenderer, so I cannot use the Screen Space - Overlay canvas render mode. When using either Screen Space - Camera or World Space, my touch position doesn't match the position I get from my RectTransforms.
I access my buttons' transform position by logging:
foreach (RectTransform child in buttons.transform)
{
Debug.Log("assigning" + child.transform);
}
For the touch coordinates, I simply log touch.position.
To clarify: I want to trigger my LineRenderer when the distance between the position vectors is smaller than a certain float. However, whenever I tap on my button to test this, the button logs at (1.2, -2.6) and my touch at (212.2, 250.4).
What could be causing this?
touch.position returns a value in pixel (screen) coordinates.
In order to convert it to world coordinates you need to use Camera.ScreenToWorldPoint.
It should be something like this:
Touch touch = Input.GetTouch(0); // Input.GetTouch returns a Touch, not a Vector2
float buttonPositionZ = button.transform.position.z;
Vector3 worldPosition = camera.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, buttonPositionZ));
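From there, the distance check described in the question could look like this (a sketch; the 0.5f threshold is an assumption, and button and camera are the variables from the snippet above):
// Trigger the LineRenderer once both points are in the same (world) space.
if (Vector3.Distance(worldPosition, button.transform.position) < 0.5f)
{
    // start or extend the line here
}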

ARCore 1.4.1 Align rectangle parallel to floor

How can I set the rotation of an object placed on a vertical plane so that its bottom side is parallel to the floor? At the moment, vertically placed objects stick to the wall but have a random rotation. I have tried placing an object on a horizontal plane and then aligning the vertical object's forward vector with the horizontal object's up vector. That seems to work somehow, but the multiple objects, apart from being parallel to each other, are rotated into a Christmas-tree shape. Important thing: I want to rotate relative to the real world, not to the bottom of the screen.
What I want vs. what I get: (screenshots omitted)
VerticallyPlacedObject.transform.forward = HorizontallyPlacedObject.transform.up;
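A minimal sketch of one way to get a gravity-aligned rotation on a wall, assuming wallNormal is the vertical plane's outward normal (e.g. taken from the hit pose) and that the session's Y axis is gravity-aligned, so Vector3.up corresponds to real-world up:
// Hypothetical helper: orient an object placed on a vertical plane so its
// bottom stays parallel to the floor, independent of screen orientation.
void AlignToWall(Transform placedObject, Vector3 wallNormal)
{
    // Project world up onto the wall plane to get an "up" direction lying in the wall.
    Vector3 upOnWall = Vector3.ProjectOnPlane(Vector3.up, wallNormal).normalized;

    // Forward points out of the wall; flip the sign if your model faces the other way.
    placedObject.rotation = Quaternion.LookRotation(wallNormal, upOnWall);
}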

Camera units in Unity5

I'm currently programming a 2D top-view Unity game, and I want to set up the camera so that only a specific area is visible. That means I know the size of my area, and when the camera, which is currently following the player, reaches the border of the area, I want it to stop so nothing outside the area becomes visible.
So here is my question: I know where the camera is and how it can follow the player, but I don't know how to calculate the distance between the border of the field and the border of what the camera sees. How can I do that?
Essentially, treat your playable area as a rectangle. Then make a smaller rectangle within it that accounts for the camera's orthographic size. Don't forget to include the camera's aspect ratio when calculating the horizontal bounds.
Rect myArea; // this stores the bounds of your playable area
Camera cam; // this is your orthographic camera, probably Camera.main
GameObject playerObject; // this is your player
float newX = Mathf.Clamp(
playerObject.transform.position.x,
myArea.xMin + cam.orthographicSize * cam.aspect,
myArea.xMax - cam.orthographicSize * cam.aspect
);
float newY = Mathf.Clamp(
playerObject.transform.position.y,
myArea.yMin + cam.orthographicSize,
myArea.yMax - cam.orthographicSize
);
cam.transform.position = new Vector3(newX,newY,cam.transform.position.z);
If you're using an alternative plane (say xz instead of xy), just swap out the corresponding dimensions in all the calculations.
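For example, a sketch of the same clamp on the xz plane (an assumption: a top-down camera looking along -y, with myArea's y range standing in for z); newX is computed as above, and the y-clamp is replaced by a z-clamp:
float newZ = Mathf.Clamp(
    playerObject.transform.position.z,
    myArea.yMin + cam.orthographicSize,
    myArea.yMax - cam.orthographicSize
);
// Keep the camera's height (y) fixed and move it in the xz plane instead.
cam.transform.position = new Vector3(newX, cam.transform.position.y, newZ);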

Translate coordinates to another plane

My main plane is Rectangle(0,0,10000,10000) for example.
My screen plane (i.e. virtual position) is Rectangle(1000,1000,1920,1080).
My Texture2D is Rectangle(1500,1200,200,100) in main plane.
I need to translate my Texture2D coordinates to my screen plane. I tried with Matrix.Translate without success.
I must get Texture2D = Rectangle(500,200,200,100) in screen plane.
In order to get the Texture2D from (1500, 1200) to (500, 200) you have to use a translation of (-1000, -1000), which is the negation of your screen plane's position. In code, your translation would be something like this:
Matrix transform = Matrix.CreateTranslation(-screenPlane.X, -screenPlane.Y, 0);
The theory is that you want to move the texture as if your camera were at (0, 0) instead of (1000, 1000), so you have to move the texture by (-1000, -1000).
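Applying that matrix to the texture's position reproduces the expected result from the question (a sketch; screenPlane is a hypothetical Rectangle built from the numbers above):
Rectangle screenPlane = new Rectangle(1000, 1000, 1920, 1080);
Matrix transform = Matrix.CreateTranslation(-screenPlane.X, -screenPlane.Y, 0);

// Transform the texture's main-plane position into screen-plane coordinates.
Vector2 texturePosition = Vector2.Transform(new Vector2(1500, 1200), transform);
// texturePosition is now (500, 200); the width and height (200, 100) are unchanged.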
Check the web for 2D camera classes; it's always useful to know how cameras work :)
This one for example: http://www.david-amador.com/2009/10/xna-camera-2d-with-zoom-and-rotation/
