How do I convert point to local coordinates? - c#

When a user taps a button, I want the tap to be ignored if a transparent pixel is hit, and the button beneath it should receive the tap instead.
I can do this sort of behaviour in Objective-C within a few minutes; however, doing it in Unity in C# has proven impossible, as there seems to be no way to convert a point from the screen to a local point. Either the center, the bottom left, or the top left corner should be the origin point (I don't know which, because Unity keeps changing where the origin is depending on the phase of the moon).
I've tried looking at IsRaycastLocationValid from ICanvasRaycastFilter,
and inside it I've used both RectTransformUtility.ScreenPointToLocalPointInRectangle and RectTransformUtility.ScreenPointToWorldPointInRectangle, and the result is always the same. The RectTransform I pass is the one belonging to the button.
eg:
public bool IsRaycastLocationValid(Vector2 screenPosition, Camera raycastEventCamera) // uGUI callback
{
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPosition, null, out localPoint);

    Vector2 pivot = rectTransform.pivot - new Vector2(0.5f, 0.5f);
    Vector2 pivotScaled = Vector2.Scale(rectTransform.rect.size, pivot);
    Vector2 realPoint = localPoint + pivotScaled;

    Debug.Log(screenPosition + " " + localPoint);
    return false;
}
I adapted this from someone else's unanswered attempt here:
Unity 3d 4.6 Raycast ignore alpha on sprite
I've found this link
http://forum.unity3d.com/threads/alpha-area-on-round-gui-button-return-click-event.13608/
where someone tries to determine whether the mousePosition is within the image bounds. The obvious problem is that the image is assumed to have a corner at (100,100). Magic numbers are bad, and figuring out how to get those coordinates programmatically is next to impossible.
The following properties on RectTransform are always the same no matter where you place the button:
anchoredPosition
anchoredPosition3D
anchorMax
anchorMin
offsetMax
offsetMin
pivot
rect
sizeDelta
As you can see, those are all the properties a RectTransform has. Therefore it is impossible to tell where a button is, and impossible to tell where a click lands in the button's coordinate space.
Can someone please tell me how to do what I need to do?

This should do the trick:
bool isScreenPointInsideRectTransform(Vector2 screenPoint, RectTransform rectTransform, Camera canvasCamera)
{
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPoint, canvasCamera, out localPoint);
    return rectTransform.rect.Contains(localPoint);
}

That's exactly why you don't look for the location of the point inside the button itself; instead you compare the location of the point inside the button's parent rect against the button's localPosition.
Alternatively, you can translate the pixel point to a screen position and then compare it with the mouse position.
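For the original question (ignoring taps on transparent pixels), the same local-point conversion can feed an alpha test inside IsRaycastLocationValid. The following is a minimal sketch rather than a drop-in solution: it assumes the button's Image uses a sprite whose texture has Read/Write enabled and is not packed into an atlas, and alphaThreshold is a made-up cutoff.

using UnityEngine;
using UnityEngine.UI;

// Sketch: reject raycasts that land on (nearly) transparent pixels of this Image.
// Assumes the sprite's texture is readable and occupies the whole texture (no atlas).
public class AlphaRaycastFilter : MonoBehaviour, ICanvasRaycastFilter
{
    [Range(0f, 1f)] public float alphaThreshold = 0.1f; // hypothetical cutoff

    public bool IsRaycastLocationValid(Vector2 screenPosition, Camera eventCamera)
    {
        RectTransform rectTransform = (RectTransform)transform;

        Vector2 localPoint;
        if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(
                rectTransform, screenPosition, eventCamera, out localPoint))
            return false;

        // Normalize the pivot-relative local point to a 0..1 UV across the rect.
        Rect rect = rectTransform.rect;
        float u = (localPoint.x - rect.xMin) / rect.width;
        float v = (localPoint.y - rect.yMin) / rect.height;
        if (u < 0f || u > 1f || v < 0f || v > 1f)
            return false;

        Texture2D texture = GetComponent<Image>().sprite.texture;
        return texture.GetPixelBilinear(u, v).a >= alphaThreshold;
    }
}

If your Unity version has it, Image.alphaHitTestMinimumThreshold achieves much the same thing without a custom filter, with the same readable-texture requirement.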

Related

Unity 3D interaction with UI Image

I am creating a 3D game, and in the UI there is an image; I want to print different things when the user puts the cursor on different parts of the image. In a 2D game, I would add child objects to the image, add polygon colliders to them, and use the OnMouseOver() method, but as I understand it, this doesn't work on UI. I also tried the OnPointerEnter() method, and it works, but I can't split the image into different parts with it. I tried splitting the image into small parts with an external tool and putting them side by side to make up the whole image, but Unity treats all of them as rectangular images (I am trying to do this for the Girl with a Pearl Earring painting, so the shapes are irregular). How can I do this?
While this is not the best solution, you can get the current mouse position, first check whether it is "inside" the picture, and then check the different parts.
(For example, if the image spans [100,100] to [300,300] and one part spans [100,100] to [150,150], you can check whether the mouse is between those coordinates.)
So you would get the position:
Vector2 mousePosition = new Vector2(Input.mousePosition.x, Input.mousePosition.y);

// Image boundaries in screen pixels (values from the example above).
float x1 = 100f, x2 = 300f;
float y1 = 100f, y2 = 300f;

if (mousePosition.x >= x1 && mousePosition.x <= x2 &&
    mousePosition.y >= y1 && mousePosition.y <= y2)
{
    // Check which part of the image the cursor is in, then decide what to print.
}
Or you can raycast and check the tag, and if it matches the image's tag, check for the specific part:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
if (Physics.Raycast(ray, out hit))
{
    if (hit.collider.gameObject.CompareTag(YOUR_TAG))
    {
        // Check for inner boundaries.
    }
}
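To make the "check which part" step concrete, one option is to describe each part as a Rect in screen coordinates and test the cursor against them. A small sketch under that assumption, reusing the [100,100] to [300,300] example image (the region names and numbers are placeholders):

using UnityEngine;

// Sketch: name the rectangular "parts" of the image in screen pixels and
// look up which one the cursor is currently in.
public class ImagePartHover : MonoBehaviour
{
    // Hypothetical regions inside an image spanning (100,100) to (300,300).
    private readonly Rect earringRect = new Rect(100f, 100f, 50f, 50f);
    private readonly Rect faceRect = new Rect(150f, 150f, 100f, 100f);

    void Update()
    {
        Vector2 mousePosition = Input.mousePosition;

        if (earringRect.Contains(mousePosition))
            Debug.Log("Cursor over the earring");
        else if (faceRect.Contains(mousePosition))
            Debug.Log("Cursor over the face");
    }
}

For irregular shapes this is only an approximation; several small rects per part get closer to the real outline.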

Unity - drag to resize / scale plane based on handle position?

Alright, I can't find any existing example of this, and my attempt is yielding odd results. I need to drag-to-resize a flattened cube (like a plane, but it must have thickness, so it is a cube) using a handle in the corner, like a window on your desktop.
So far I've created my handle plane and fetched it via a script attached to my cube plane. The cube plane and the handle are children of an empty object to which the scale is applied, so that the cube plane scales from left to right as desired.
That works; however, my attempt at using the scale handle's position delta to scale the parent empty scales either far too much or in odd directions:
void Awake()
{
    scaleHandle = GameObject.FindGameObjectWithTag("ScaleHandle");
    scaleHandleInitialPos = scaleHandle.transform.localPosition;
}

void Update()
{
    width += -1 * (scaleHandleInitialPos.x - scaleHandle.transform.localPosition.x);
    height += -1 * (scaleHandleInitialPos.y - scaleHandle.transform.localPosition.y);
    transform.parent.localScale = new Vector3(width, height, thickness);
}
What is wrong here?
(Screenshots of the Hierarchy and of the child's Transform omitted.)
It works either by updating scaleHandleInitialPos every Update(), or by computing the ratio instead: width = scaleHandle.transform.localPosition.x / scaleHandleInitialPos.x; (the white square in the screenshots is the scaleHandle).
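A minimal sketch of the first option (refreshing the stored handle position each frame), keeping the field names from the question; the starting values for width, height, and thickness are placeholders:

using UnityEngine;

// Sketch: accumulate only the per-frame movement of the handle by refreshing
// scaleHandleInitialPos at the end of every Update(), instead of re-adding the
// total offset since startup each frame.
public class ResizeFromHandle : MonoBehaviour
{
    private GameObject scaleHandle;
    private Vector3 scaleHandleInitialPos;
    private float width = 1f, height = 1f, thickness = 0.1f; // placeholder starting size

    void Awake()
    {
        scaleHandle = GameObject.FindGameObjectWithTag("ScaleHandle");
        scaleHandleInitialPos = scaleHandle.transform.localPosition;
    }

    void Update()
    {
        // Per-frame delta of the handle, not the total offset since startup.
        Vector3 delta = scaleHandle.transform.localPosition - scaleHandleInitialPos;
        width += delta.x;
        height += delta.y;
        transform.parent.localScale = new Vector3(width, height, thickness);

        // Remember this frame's handle position for the next frame.
        scaleHandleInitialPos = scaleHandle.transform.localPosition;
    }
}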

How do I prevent the player from dragging an image outside the specified area?

I am trying to drag an Image within a certain area. For that I am using IDragHandler. To prevent the image from going outside the area, I put four box colliders in a square shape. The box was still moving out, so I set the fixed timestep to 0.0001. Now, when the image goes out of bounds, it gets pushed back into the specified area, which is fine, but I want the image to stop moving the moment it touches the edge of the boundary.
Here's my code:
using UnityEngine;
using UnityEngine.EventSystems;

public class Draggable : MonoBehaviour, IDragHandler
{
    public GameObject box;

    private void OnCollisionEnter2D(Collision2D collision)
    {
        Debug.Log("Triggered");
    }

    public void OnDrag(PointerEventData eventData)
    {
        box.transform.position = eventData.position;
    }
}
Don't use physics and the fixed timestep to solve this problem; reducing the timestep is a performance killer.
One way to do it: use Physics.OverlapBox (see the documentation: https://docs.unity3d.com/ScriptReference/Physics.OverlapBox.html).
Do this test between your draggable object and the limits of your draggable zone in your OnDrag method. Calculate the "wanted" position based on the event you receive; if your object would overlap your borders, don't move it, otherwise you are safe and can move it.
Try to express your boundaries as simple numbers. Maybe create a rectangle and take its corners. Then you get the best user experience by using min and max functions, effectively "clamping" the allowed coordinates before they are actually set.
It makes sense to clamp x and y separately. That way the user can still drag along the Y axis if there is room, despite the mouse being outside the boundary on the X axis. A sketch of this approach follows.
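Here is a minimal sketch of that idea for a uGUI image. It assumes the allowed area is itself a RectTransform, the draggable's pivot is centered, and boundary / canvasCamera are fields you assign in the Inspector (pass null for the camera with a Screen Space - Overlay canvas):

using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: clamp the dragged image inside a boundary RectTransform, no physics needed.
public class ClampedDraggable : MonoBehaviour, IDragHandler
{
    public RectTransform boundary;   // the allowed area
    public Camera canvasCamera;      // null for Screen Space - Overlay

    private RectTransform rectTransform;

    void Awake()
    {
        rectTransform = (RectTransform)transform;
    }

    public void OnDrag(PointerEventData eventData)
    {
        // Convert the pointer position into the boundary's local space.
        Vector2 localPoint;
        if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(
                boundary, eventData.position, canvasCamera, out localPoint))
            return;

        // Clamp x and y separately so the image can still slide along one axis
        // while the pointer is outside the boundary on the other axis.
        Rect limits = boundary.rect;
        Vector2 half = rectTransform.rect.size * 0.5f; // assumes a centered pivot
        localPoint.x = Mathf.Clamp(localPoint.x, limits.xMin + half.x, limits.xMax - half.x);
        localPoint.y = Mathf.Clamp(localPoint.y, limits.yMin + half.y, limits.yMax - half.y);

        rectTransform.position = boundary.TransformPoint(localPoint);
    }
}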

Touch and Canvas element location do not match

I'm trying to create a game like WordCookie. I want to use a LineRenderer, so I cannot use the Screen Space - Overlay canvas render mode. When using either Screen Space - Camera or World Space, my touch position doesn't match the position I get from my RectTransforms.
I access my buttons' transform position by logging:
foreach (RectTransform child in buttons.transform)
{
    Debug.Log("assigning" + child.transform);
}
For the touch coordinates, I simply log touch.position.
To clarify: I want to trigger my LineRenderer when the distance between the position vectors is smaller than a certain float. However, whenever I tap on my button to test this, the button logs at (1.2, -2.6) and my touch at (212.2, 250.4).
What could be causing this?
touch.position returns a value in pixel (screen) coordinates.
In order to convert it to world coordinates, you need to use Camera.ScreenToWorldPoint.
It should be something like this:
Touch touch = Input.GetTouch(0);
float buttonPositionZ = button.transform.position.z;
// "camera" here is the Camera rendering the Canvas.
Vector3 worldPosition = camera.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, buttonPositionZ));
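From there, the comparison the question describes is a plain distance check between the two world-space points; triggerDistance below is a placeholder threshold:

// Sketch: continue from the snippet above, which produced worldPosition.
float triggerDistance = 0.5f; // hypothetical threshold in world units
if (Vector3.Distance(worldPosition, button.transform.position) < triggerDistance)
{
    // Close enough: start or extend the LineRenderer here.
}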

Discontinuity in RaycastHit.textureCoord values (getting coordinates on texture right)

Hi, I'm trying to get a particular coordinate on a texture (under the mouse cursor). So on a mouse event I'm performing:
Ray outRay = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit clickRayHit = new RaycastHit();
if (!Physics.Raycast(outRay, out clickRayHit))
{
    return;
}

Vector2 textureCoordinate = clickRayHit.textureCoord;
Texture2D objectTexture = gameObject.GetComponent<Renderer>().material.mainTexture as Texture2D;
int xCord = (int)(objectTexture.width * textureCoordinate.x);
int yCord = (int)(objectTexture.height * textureCoordinate.y);
But the problem is that the coordinate I'm getting is not precisely under the cursor but "somewhat near it". The coordinates are not consistently shifted in one direction, but the shift is not random either. They are shifted differently at different points of the texture, but:
they remain somewhere in the area of the real cursor,
and the coordinates are shifted in the same way whenever the cursor is above the same point.
Here is part of coordinates log: http://pastebin.ca/3029357
If I haven't described the problem well enough, I can record a short screencast.
The GameObject is a Plane.
If it is relevant: the mouse event is generated by a Windows mouse hook (an application-specific thing).
What am I doing wrong?
UPD: I've decided to record a screencast - https://youtu.be/LC71dAr_tCM?t=42. Here you can see a Paint window image through my application. In the bottom left corner you can see that Paint is displaying the coordinates of the mouse (I'm getting these coordinates in the way I've described earlier - the location of a point on the texture). So as I move the mouse cursor you can see how the coordinates change.
UPD2:
I just want to emphasize one more time that this shift is not constant or linear. There could be "jumps" around the mouse coordinates (but not only jumps). The video above explains it better.
So it was a scaling problem after all. One of the scale values was negative and was causing this behaviour.
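If you want to catch this class of problem early, a small guard before using textureCoord can flag a negative scale anywhere up the hierarchy; this is only a diagnostic sketch:

// Sketch: warn when the hit object (or any parent) carries a negative scale,
// which was the cause of the skewed texture coordinates here.
Vector3 lossy = clickRayHit.transform.lossyScale;
if (lossy.x < 0f || lossy.y < 0f || lossy.z < 0f)
{
    Debug.LogWarning("Negative scale on " + clickRayHit.transform.name + ": " + lossy);
}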
