Unity - Identify escape gesture - C#

This is a simplified version of my code. The basic idea is to check for input: if the user presses Escape, the scene changes; if the user taps or clicks in an empty area, an item is placed in that area.
void Update()
{
    if (Input.GetKeyDown(KeyCode.Escape))
    {
        CheckEscape();
    }
    else if (Input.GetMouseButtonUp(0))
    {
        PlaceItemOnTheBoard();
    }
}
This all works very well when I'm running the game on my laptop, but when I run it on my phone (Pixel 4a), Escape is triggered by the back swipe gesture. Performing this gesture first places an item on the board and then triggers the CheckEscape() method.
Does anyone know a way to react to the escape gesture without placing an item on the board?
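One hedged workaround (not from the original thread) is to treat the pointer-up as a tap only if the pointer barely moved since it went down, so the back swipe no longer counts as a placement. A minimal sketch, where the 20-pixel threshold is an assumption to tune:

private Vector3 pointerDownPosition;

void Update()
{
    if (Input.GetKeyDown(KeyCode.Escape))
    {
        CheckEscape();
    }
    else if (Input.GetMouseButtonDown(0))
    {
        // Remember where the press started.
        pointerDownPosition = Input.mousePosition;
    }
    else if (Input.GetMouseButtonUp(0))
    {
        // Only place an item if the pointer stayed near its starting point,
        // so a swipe (including Android's back gesture) places nothing.
        if ((Input.mousePosition - pointerDownPosition).magnitude < 20f)
        {
            PlaceItemOnTheBoard();
        }
    }
}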

Related

Mobile: How to check what is being touched?

I am currently trying to make a clicker game in Unity and I have the following situation:
I have multiple buttons on the screen. Currently (like in pretty much every clicker game) you press the screen and that's it. The problem is that if someone clicks on a button, it also runs the logic that should only be performed when the "regular" screen is pressed.
Currently I am using:
if (Input.GetMouseButtonDown(0))
but I need to know how I can "filter out" when buttons are being touched.
You can have a touch condition that is not fulfilled if you press in a certain area, defined via the camera with ScreenToWorldPoint. I use touches and check whether there are touches on the screen; if they are below a given y-coordinate in the world, the condition is not fulfilled, and that is where you can have your buttons:
public void Update()
{
    TouchInput();
}

public void TouchInput()
{
    // Guard against frames with no touches before reading one.
    if (Input.touchCount == 0)
        return;

    Touch touch = Input.GetTouch(0);
    if (touch.tapCount == 1 && TouchCondition(touch))
    {
        /* Here you put the function that runs when no buttons are pressed.
           In my example, the game starts. */
        StartGame();
    }
}

public bool TouchCondition(Touch touch)
{
    /* Put your condition in the statement below.
       In my example, if the touch is above the y-coordinate 2 in world space,
       the condition is fulfilled and the game starts.
       Otherwise it is not fulfilled, and you can have all the buttons below y = 2. */
    return Camera.main.ScreenToWorldPoint(touch.position).y >= 2;
}
You can check whether the input point is in a valid firing region (outside the buttons) by making a rectangle, checking if the point is inside it, and only firing the event if it is. Try Rect.Contains.
https://docs.unity3d.com/ScriptReference/Rect.Contains.html
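A minimal sketch of that idea, assuming a hypothetical button strip along the bottom of the screen:

Rect buttonArea;

void Start()
{
    // Hypothetical button region: the bottom 200 pixels of the screen.
    buttonArea = new Rect(0, 0, Screen.width, 200);
}

void Update()
{
    // Fire the "regular screen" logic only when the press lands outside the buttons.
    if (Input.GetMouseButtonDown(0) && !buttonArea.Contains(Input.mousePosition))
    {
        StartGame();
    }
}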

Mouse up vs. touch up in Unity

I have a bit of an issue. I can't seem to find a good way to distinguish whether Unity is firing a mouse up vs. a touch up event. By touch up I mean when the user releases their finger from the screen after touching something on a touch display.
The problem is this: I have an issue with game logic where the user both lets go of something they've selected with their mouse and SIMULTANEOUSLY touches / releases their touch to click a UI button (in other words, I'm trying to detect a mouse release only, or a touch release only - how can I do that?). I've tried using both Input.GetMouseButtonUp(0) and Input.GetMouseButton(0), but both change their state whether it's a touch up or a mouse button up event.
So how do you properly distinguish between the two? I even tried the Touch.fingerId property, which works well to track / distinguish ONLY touch events, but the problem is that I still cannot distinguish ONLY a mouse up with Input.GetMouseButton(0), since it fires even in the event of a touch up.
Does anyone out there happen to know what the proper way would be to detect just a mouse release, or just a touch release, separately, in Unity?
Edit:
Someone didn't understand the problem at hand, so let's assume for a second you have a desktop device with touch support. Add this script:
void Update()
{
    if (Input.GetMouseButtonDown(0))
        Debug.Log("Mouse button down.");
    if (Input.GetMouseButtonUp(0))
        Debug.Log("Mouse button up.");
}
If you click and HOLD with your mouse, then, while holding, touch and release with your finger, it will log "Mouse button up." despite the fact that you did not release the mouse button. How can I distinguish between a mouse up and a touch release?
For desktops:
Mouse Down - Input.GetMouseButtonDown
Mouse Up - Input.GetMouseButtonUp
Example:
if (Input.GetMouseButtonDown(0))
{
    Debug.Log("Mouse Pressed");
}
if (Input.GetMouseButtonUp(0))
{
    Debug.Log("Mouse Lifted/Released");
}
For Mobile devices:
Touch Down - TouchPhase.Began
Touch Up - TouchPhase.Ended
Example:
if (Input.touchCount >= 1)
{
    if (Input.touches[0].phase == TouchPhase.Began)
    {
        Debug.Log("Touch Pressed");
    }
    if (Input.touches[0].phase == TouchPhase.Ended)
    {
        Debug.Log("Touch Lifted/Released");
    }
}
Now, if you are having issues with clicks going through UI objects and hitting other objects, see this post on how to properly detect clicks on any type of GameObject.
I figured it out.
I was missing this call: Input.simulateMouseWithTouches = false;
It seems to work if you set this in Start() only (I could be wrong; it may be that you need to set it before the first Input access in Update - test it out, I guess). Otherwise, Unity simulates mouse events from touch events. I can only speculate that they do this to make it easier for game developers to write once and play cross-platform.
Note: Once it's set, it's set globally, so you'll need to unset it even in a different scene you load later on.
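A minimal sketch of how this fits together; the flag comes from the answer above, the rest is assumed wiring:

void Start()
{
    // Stop Unity from synthesizing mouse events out of touches, so
    // GetMouseButtonUp(0) reports only the physical mouse.
    Input.simulateMouseWithTouches = false;
}

void Update()
{
    if (Input.GetMouseButtonUp(0))
        Debug.Log("Mouse released.");

    // Touches are now reported only through the Touch API.
    for (int i = 0; i < Input.touchCount; i++)
    {
        Touch touch = Input.GetTouch(i);
        if (touch.phase == TouchPhase.Ended)
            Debug.Log("Touch released (fingerId " + touch.fingerId + ").");
    }
}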

How to Handle Swipe on Oculus Gear VR touchpad avoiding Tap event

I have a Scroll View containing a few buttons as child elements under a content panel. The hierarchy looks like this (screenshot not included):
I have implemented the OVRTouchpad.TouchHandler event like this in my script attached to the Scroll View:
void Start()
{
#if OVR && !UNITY_EDITOR
    OVRTouchpad.Create();
    OVRTouchpad.TouchHandler += HandleTouchHandler;
#endif
}

void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Left || touchArgs.TouchType == OVRTouchpad.TouchEvent.Up)
    {
        // Code to scroll UP for swipes in directions LEFT/UP
    }
    else if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Right || touchArgs.TouchType == OVRTouchpad.TouchEvent.Down)
    {
        // Code to scroll DOWN for swipes in directions RIGHT/DOWN
    }
}
Problem:
As I am using the OVR Input Module, it processes a tap input even when I try to swipe. So every time I swipe in any direction while gazing at a button (a child of the scroll view), the button gets clicked, taking me to some other menu. I am not sure if this is the desired behaviour of the Gear VR input system; as far as I have seen in the Oculus Home app (and other apps on the store), it only scrolls, without triggering clicks on child elements.
Is there any way to prevent click/tap if swipe is detected?
Any kind of help is highly appreciated.
Have you tried inserting an IF statement to handle the single-tap trigger first?
if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
{
    // TODO: Nest IF statements here checking conditions that indicate a swipe
    // instead of a tap. If nothing further is detected, process the single tap.
    // Otherwise continue with the swipe direction handling.
}
But that being said, using the VRInput script mentioned in this tutorial would probably be your best bet, as it can handle swipes with directional input too.
https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr
Snippet:
public event Action<SwipeDirection> OnSwipe; // Called every frame passing in the swipe, including if there is no swipe.
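A hedged usage sketch, assuming the VRInput component from that tutorial sits on an accessible GameObject and that SwipeDirection is the enum it declares (the NONE value and the handler name are assumptions based on the Unity VR samples):

[SerializeField] private VRInput vrInput; // VRInput script from the tutorial

void OnEnable()
{
    vrInput.OnSwipe += HandleSwipe;
}

void OnDisable()
{
    vrInput.OnSwipe -= HandleSwipe;
}

void HandleSwipe(VRInput.SwipeDirection direction)
{
    // OnSwipe is raised every frame; ignore frames with no swipe.
    if (direction == VRInput.SwipeDirection.NONE)
        return;

    Debug.Log("Swiped: " + direction);
}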

Unity2D: Panning a map using touch still registers even when a button in the canvas is tapped

I have a problem cancelling the pan feature of my app when a button in the HUD is pressed.
We're using a canvas for displaying UIs, so I can't use a raycast to detect whether a touch hit a button, since the buttons are not directly in the world where the map to be panned is located.
I already added colliders to the buttons and tried printing the names of the objects hit by the raycast, but it just passes through the buttons as if they weren't there.
How am I supposed to detect that a button in the canvas is touched, so I can cancel executing the pan feature?
Check EventSystem's IsPointerOverGameObject method. You can use it like this:
public bool IsPointerOverUI
{
    get
    {
        return EventSystem.current.IsPointerOverGameObject();
    }
}
Don't forget to import UnityEngine.EventSystems.
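A minimal sketch of how the property might gate the pan logic; PanMap is a hypothetical method, and on touch devices IsPointerOverGameObject is usually given the touch's fingerId:

using UnityEngine;
using UnityEngine.EventSystems;

public class MapPanner : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);

            // Skip panning when the touch is over a canvas element.
            if (EventSystem.current.IsPointerOverGameObject(touch.fingerId))
                return;

            PanMap(touch.deltaPosition);
        }
    }

    void PanMap(Vector2 delta)
    {
        // ... move the camera or map by delta ...
    }
}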

Difference between XNA TouchLocationState enum values

I'm working on a Windows app using XNA.
Actually, I've succeeded in moving my sprite, but I want to add a different action when the user touches the sprite but doesn't move it. I know the TouchLocationState enum exists, but I don't understand the difference between Moved and Pressed.
For now I use Released and that's enough: I update the sprite position while it's not released, and then I check collision.
So how can I add a touch method for only a single click? I mean, when the user taps the sprite but doesn't move it.
Some code:
TouchPanelCapabilities touchCap = TouchPanel.GetCapabilities();
if (touchCap.IsConnected)
{
    TouchCollection touches = TouchPanel.GetState();
    if (touches.Count >= 1)
    {
        Vector2 PositionTouch = touches[0].Position;
        if (touches[0].State == TouchLocationState.Released)
        {
            // Pause button click and other buttons
            Mouseclik((int)PositionTouch.X, (int)PositionTouch.Y);
        }
        if (!PausePopUp)
        {
            CheckMoove(PositionTouch);
            if (touches[touches.Count - 1].State == TouchLocationState.Released)
            {
                // This is where I try to check if it is only a "click" on my sprite
                if (touches[0].Position == touches[touches.Count - 1].Position)
                {
                    TempoRectangle = ListSprite[save].ShapeViser;
                    isclicked = true;
                }
            }
        }
    }
}
My goal is to add a picturebox above the sprite to display information; then, if another sprite is touched while the picturebox is displayed, I want to draw a line between these two sprites.
The difference is simple:
TouchLocationState.Pressed means that a new location is detected.
TouchLocationState.Moved means that the position of an existing touch location was updated, or that it remained pressed at the same position.
This means that for each touch action you will get a single Pressed state, then a sequence of Moved states every cycle while the touch is held, and finally a Released when the finger is lifted.
This is explained in the link you provided, too.
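A short sketch of using that sequence to detect a tap without movement; the 10-pixel threshold and the handler are assumptions, not the asker's code:

private Vector2 pressPosition;
private bool dragged;

private void HandleTouches()
{
    TouchCollection touches = TouchPanel.GetState();
    foreach (TouchLocation touch in touches)
    {
        switch (touch.State)
        {
            case TouchLocationState.Pressed:
                // First contact of this touch.
                pressPosition = touch.Position;
                dragged = false;
                break;

            case TouchLocationState.Moved:
                // Raised every cycle while the finger is down; only count
                // movement beyond a small threshold as a drag.
                if (Vector2.Distance(touch.Position, pressPosition) > 10f)
                    dragged = true;
                break;

            case TouchLocationState.Released:
                if (!dragged)
                    OnSpriteTapped(touch.Position); // hypothetical tap handler
                break;
        }
    }
}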
