Mouse up vs. touch up in Unity - C#

I have a bit of an issue. I can't seem to find a good way to distinguish whether Unity is firing a mouse up vs. a touch up event. By touch up I mean when the user releases their finger from the screen after touching something on a touch display.
The problem is this: in my game logic, the user can release something they've selected with the mouse while SIMULTANEOUSLY touching and releasing the screen to click a UI button. In other words, I need to detect only a mouse release, or only a touch release. How can I do that? I've tried both Input.GetMouseButtonUp(0) and Input.GetMouseButton(0), but both change state whether it's a touch up or a mouse button up event.
So how do you properly distinguish between the two? I even tried the Touch.fingerId property, which works well to track and distinguish ONLY touch events, but I still cannot detect ONLY a mouse up with Input.GetMouseButton(0), since it fires even in the event of a touch up.
Does anyone out there happen to know what the proper way would be to detect just a mouse release, or just a touch release, separately, in Unity?
Edit:
Someone didn't understand the problem at hand, so let's assume for a second you have a desktop device with touch support. Add this script:
void Update()
{
    if (Input.GetMouseButtonDown(0))
        Debug.Log("Mouse button down.");
    if (Input.GetMouseButtonUp(0))
        Debug.Log("Mouse button up.");
}
If you click and HOLD with your mouse, then, while holding, touch and release with your finger, it will log "Mouse button up." even though you did not release the mouse button. How can I distinguish a mouse up from a touch release?

For desktops:
Mouse Down - Input.GetMouseButtonDown
Mouse Up - Input.GetMouseButtonUp
Example:
if (Input.GetMouseButtonDown(0))
{
    Debug.Log("Mouse Pressed");
}

if (Input.GetMouseButtonUp(0))
{
    Debug.Log("Mouse Lifted/Released");
}
For Mobile devices:
Touch Down - TouchPhase.Began
Touch Up - TouchPhase.Ended
Example:
if (Input.touchCount >= 1)
{
    if (Input.touches[0].phase == TouchPhase.Began)
    {
        Debug.Log("Touch Pressed");
    }

    if (Input.touches[0].phase == TouchPhase.Ended)
    {
        Debug.Log("Touch Lifted/Released");
    }
}
Now, if you are having issues of clicks going through UI Objects to hit other Objects then see this post for how to properly detect clicks on any type of GameObject.

I figured it out.
I was missing this call: Input.simulateMouseWithTouches = false;
Setting it once in Start() seems to be enough (I could be wrong; it may need to be set before the first Input access in Update, so test it out). Otherwise, Unity simulates mouse events from touch events. I can only speculate that they do this to make it easier for game developers to write once and play cross-platform.
Note: Once it's set, it's set globally, so you'll need to unset it even in a different scene you load later on.
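
Putting the two pieces together, a minimal sketch (untested; it assumes Input.simulateMouseWithTouches behaves as described above):

public class SeparateInputExample : MonoBehaviour
{
    void Start()
    {
        // Stop Unity from synthesizing mouse events from touches.
        Input.simulateMouseWithTouches = false;
    }

    void Update()
    {
        // With simulation off, this should now fire only for a real mouse release.
        if (Input.GetMouseButtonUp(0))
            Debug.Log("Mouse released.");

        // Touch releases are read separately through the touch API.
        for (int i = 0; i < Input.touchCount; i++)
        {
            if (Input.GetTouch(i).phase == TouchPhase.Ended)
                Debug.Log("Touch released: finger " + Input.GetTouch(i).fingerId);
        }
    }
}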

Related

Unity - Identify escape gesture

This is a simplified version of my code. The basic idea is to check for input: if the user presses escape, the scene changes; if the user taps or clicks in an empty area, an item is placed in that area.
void Update()
{
    if (Input.GetKeyDown(KeyCode.Escape))
    {
        CheckEscape();
    }
    else if (Input.GetMouseButtonUp(0))
    {
        PlaceItemOnTheBoard();
    }
}
This all works very well when I'm running the game on my laptop, but when I run it on my phone (Pixel 4a), escape is triggered by a swipe gesture. Performing this gesture places an item on the board and then triggers the CheckEscape() method.
Does anyone know a way to react to the escape gesture without placing an item on the board?

Can you get finger position from OnMouseDown and OnMouseUp?

On a 2D layout, I have a fixed-in-place, button-like GameObject with a script containing the OnMouseDown and OnMouseUp functions. If the user touches it, moves their finger away, and releases, can that OnMouseUp get the corresponding finger release position on screen?
The catch: it's multiplayer, with many other fingers on screen!
You shouldn't be using the OnMouseDown and OnMouseUp callback functions for this, because you would then have to read the position with Input.mousePosition on desktop and Input.touches on mobile devices.
This should be done with the OnPointerDown and OnPointerUp functions, which give you PointerEventData as a parameter. You can access the pointer position there with PointerEventData.position. It will work on both mobile and desktop devices without having to write different code for each platform. Note that you must implement the IPointerDownHandler and IPointerUpHandler interfaces in order for these functions to be called.
public void OnPointerDown(PointerEventData eventData)
{
    Debug.Log("Mouse Down: " + eventData.position);
}

public void OnPointerUp(PointerEventData eventData)
{
    Debug.Log("Mouse Up: " + eventData.position);
}
If you run into issues with it, you can always visit the troubleshooting section on this post to see why.
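For context, a minimal sketch of how these callbacks are wired up (the class name here is illustrative, not from the question):

using UnityEngine;
using UnityEngine.EventSystems;

public class PointerReporter : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Pointer Down: " + eventData.position);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        // eventData.position is the screen position where this pointer was released.
        Debug.Log("Pointer Up: " + eventData.position);
    }
}

Note that the GameObject needs something raycastable (e.g. a UI Graphic, or a physics raycaster on the camera for world objects), and an EventSystem must be present in the scene, or the callbacks will never fire.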
Short answer: yes. However, this becomes a little muddled with multi-touch. Basically, you can get the mouse or finger position at any time; Input.mousePosition is easy to grab, and you can track and display it in real time if you want. The problem comes when you have more than one pointer or finger: you then need to track which finger is up and which ones are still down. Still, if you set it up right it will work, it just takes more effort. My advice, if you're handling more than one touch, is to use the Standalone MultiTouch Input Module; it's free in the Asset Store and pretty straightforward.

Unity game doesn't detect every click

I'm trying to create a 2D game with a player (a ball) that jumps when I touch the screen. I accomplish this with:
if (Input.GetMouseButtonDown(0) || Input.GetKeyDown("space"))
{
    gameObject.GetComponent<Rigidbody2D>().velocity = Vector2.up * speed;
    gameObject.GetComponent<AudioSource>().Play();
}
(The code is in Update())
There isn't any problem in the Editor, but when I debug the game on my Android phone, the ball doesn't jump every time I touch the screen and, due to gravity, it falls down as if I had not touched. In particular, I noticed the problem is more evident after some minutes of play, or every time I reload the level. I've tried many things but none helped. What am I doing wrong?
Neither Input.GetMouseButtonDown() nor Input.GetKeyDown() is meant for mobile touch input detection. For touch detection, check this reference.
Input.GetMouseButtonDown(0) does not officially work with touches, however, it sometimes still gets called, which is undocumented behavior and should therefore not be relied on.
If you want to check for a touch you can use:
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
As a sidenote (this is not part of the touch input problem you're having): try adding forces to the Rigidbody instead of setting the velocity directly, as the latter can otherwise "break" the physics simulation.
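Combining both suggestions, a minimal sketch (untested; speed is assumed to be a field on the script):

void Update()
{
    bool jumpPressed = Input.GetKeyDown("space")
        || (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began);

    if (jumpPressed)
    {
        Rigidbody2D rb = GetComponent<Rigidbody2D>();
        // Apply an impulse instead of overwriting the velocity,
        // so the physics simulation stays consistent.
        rb.AddForce(Vector2.up * speed, ForceMode2D.Impulse);
        GetComponent<AudioSource>().Play();
    }
}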

How to Handle Swipe on Oculus Gear VR touchpad avoiding Tap event

I have a Scroll View containing few buttons as child elements under content-panel. Hierarchy looks like this:
I have implemented OVRTouchpad.TouchHandler event like this on my script attached to ScrollView:
void Start()
{
#if OVR && !UNITY_EDITOR
    OVRTouchpad.Create();
    OVRTouchpad.TouchHandler += HandleTouchHandler;
#endif
}

void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Left || touchArgs.TouchType == OVRTouchpad.TouchEvent.Up)
    {
        // Code to scroll UP for swipe in directions LEFT/UP
    }
    else if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Right || touchArgs.TouchType == OVRTouchpad.TouchEvent.Down)
    {
        // Code to scroll DOWN for swipe in directions RIGHT/DOWN
    }
}
Problem :
As I am using the OVR Input Module, it processes a Tap input even when I try to swipe. So every time I swipe in any direction while gazing at a button (a child of the scroll view), the button is clicked, taking me to some other menu. I am not sure if this is the intended behaviour of the Gear VR input system; in the Oculus Home app (and other apps on the store), scrolling does not trigger clicks on child elements.
Is there any way to prevent click/tap if swipe is detected?
Any kind of help is highly appreciated.
Have you tried inserting an IF statement to handle the singletap trigger first?
if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
{
    // TODO: Nest IF statements here checking conditions which indicate a swipe instead of a tap.
    // If nothing further is detected, process the single tap; otherwise continue with swipe direction handling.
}
But that being said, using the VRInput script mentioned in this tutorial would probably be your best bet, as it can handle swipes with directional input too.
https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr
Snippet:
public event Action<SwipeDirection> OnSwipe; // Called every frame passing in the swipe, including if there is no swipe.

Unity2D: Panning a map using touch still registers even when a button in the canvas is tapped

I have a problem cancelling the pan feature of my app when a button in the HUD is pressed.
We're using a canvas for the UI, so I can't use a raycast to detect whether a touch hit a button, since the button is not directly in the world where the map being panned is located.
I already added colliders to the buttons and tried printing the names of the objects hit by the raycast, but it just passes through the buttons as if they are not there.
How am I supposed to detect whether a button in the canvas is touched, so I can cancel the pan feature?
Check EventSystem's IsPointerOverGameObject method. You can use it like this:
public bool IsPointerOverUI
{
    get
    {
        return EventSystem.current.IsPointerOverGameObject();
    }
}
Don't forget to add using UnityEngine.EventSystems; at the top of the script.
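
Used in the pan logic, it might look like this (PanMap() is a stand-in for your own panning code):

void Update()
{
    // Skip panning while the pointer/touch is over any canvas UI element.
    if (Input.GetMouseButton(0) && !IsPointerOverUI)
    {
        PanMap();
    }
}

Note that for touches you may need the overload EventSystem.current.IsPointerOverGameObject(Input.GetTouch(0).fingerId) to get reliable results on mobile.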
