Multi-touch events in XNA - C#

I need to handle multi-touch gesture events together (vertical drag, horizontal drag) in XNA. If anyone knows how, please help.

I use gestures for horizontal and vertical drags; here is an example:
while (TouchPanel.IsGestureAvailable)
{
    GestureSample gesture = TouchPanel.ReadGesture();
    if (gesture.GestureType == GestureType.VerticalDrag)
    {
        // insert the code here that executes when the gesture is detected
    }
}
This code goes in your Update method (or your input thread); however, you must first declare this in your class:
TouchPanel.EnabledGestures = GestureType.VerticalDrag;
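To handle both drags together, as the question asks, the gesture flags can be combined when enabling gestures. A minimal sketch, assuming you want to react to each drag's movement (GestureSample.Delta is a Vector2 describing the drag since the last sample):

// Enable both gestures at once, e.g. in your game's constructor or Initialize.
TouchPanel.EnabledGestures = GestureType.HorizontalDrag | GestureType.VerticalDrag;

// Then, in Update, handle whichever gesture arrives.
while (TouchPanel.IsGestureAvailable)
{
    GestureSample gesture = TouchPanel.ReadGesture();
    switch (gesture.GestureType)
    {
        case GestureType.HorizontalDrag:
            // gesture.Delta.X is the horizontal movement since the last sample.
            break;
        case GestureType.VerticalDrag:
            // gesture.Delta.Y is the vertical movement since the last sample.
            break;
    }
}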

Related

Mobile, How to check what is being touched?

I am currently trying to make a clicker game in Unity and I have the following situation:
I have multiple buttons on the screen. Currently (like in pretty much every clicker game) you press the screen and that's it. The problem is that if someone clicks on a button, it also runs the logic that should only be performed when the "regular" screen is being pressed.
Currently I am using:
if (Input.GetMouseButtonDown(0))
but I need to know how I can "filter out" when buttons are being touched.
You can have a touch condition which is not fulfilled if you press in a certain area, defined via the Camera with ScreenToWorldPoint. I use touches and check if there are touches on the screen. If they are below a y-coordinate in the world, then the condition is not fulfilled, and that is where you can have your buttons:
public void Update()
{
    TouchInput();
}

public void TouchInput()
{
    // Check each current touch; 'touch' was undeclared in the original snippet.
    foreach (Touch touch in Input.touches)
    {
        if (touch.tapCount == 1 && TouchCondition(touch))
        {
            /* Here you put the function that runs when no buttons are pressed.
               In my example, the game starts. */
            StartGame();
        }
    }
}

public bool TouchCondition(Touch touch)
{
    /* You put your condition into the if statement below.
       In my example, if the touch is higher than the y-coordinate 2,
       the condition is fulfilled and you start the game.
       Otherwise it is not fulfilled, and you can have all the buttons below y = 2. */
    if (Camera.main.ScreenToWorldPoint(touch.position).y >= 2)
    {
        return true;
    }
    else
    {
        return false;
    }
}
You can check whether the input point is in a valid firing region (outside of the buttons) by making a rectangle, checking if the point is inside it, and only firing the event if it is. Try Rect.Contains:
https://docs.unity3d.com/ScriptReference/Rect.Contains.html
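A minimal sketch of that idea; the region's coordinates below are made up for illustration, and Rect.Contains is checked in the same space as the point you pass in (screen pixels, when using Input.mousePosition):

using UnityEngine;

public class FiringRegion : MonoBehaviour
{
    // Hypothetical screen-space region that contains no buttons:
    // Rect(x, y, width, height), measured in pixels from the bottom-left.
    public Rect fireRegion = new Rect(0f, 100f, 800f, 380f);

    void Update()
    {
        if (Input.GetMouseButtonDown(0) && fireRegion.Contains(Input.mousePosition))
        {
            // Only runs when the click lands inside the region, i.e. not over a button.
            Debug.Log("Valid firing region clicked.");
        }
    }
}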

unity: Add UI button to input manager [duplicate]

I'm writing a script to put on a button that will detect a drag direction to move a player:
void OnGUI()
{
    if (buttonRect.Contains(Event.current.mousePosition))
    {
        if (Event.current.type == EventType.MouseDown)
        {
            buttonPressed = true;
        }
        if (Event.current.type == EventType.MouseUp)
        {
            buttonPressed = false;
        }
    }
    if (buttonPressed && Event.current.type == EventType.MouseDrag)
    {
        // handle the drag here
    }
}
If this script were to be placed on a button, how could I get the button's bounds as a rectangle?
Also, if anyone has a better solution for controlling movement by drag, I would be open to suggestions.
You can implement your own visual joystick by using Unity's new event callback functions such as OnBeginDrag, OnDrag, and OnEndDrag. There is an already-made visual joystick package out there for Unity, though, so implementing your own is like reinventing the wheel.
All you have to do is import the CrossPlatformInputManager package from Unity's UnityStandardAssets, then use CrossPlatformInputManager.GetAxis("Horizontal") and CrossPlatformInputManager.GetAxisRaw("Horizontal") to read the direction of the image/thumb.
To make one from scratch, you can follow this video tutorial.
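As a starting point, here is a minimal sketch of the drag-callback approach, assuming the script sits on a UI element under a Canvas; the class name and log messages are made up for illustration, and PointerEventData carries the pointer's positions:

using UnityEngine;
using UnityEngine.EventSystems;

public class DragDirectionButton : MonoBehaviour, IBeginDragHandler, IDragHandler, IEndDragHandler
{
    private Vector2 pressPosition;

    public void OnBeginDrag(PointerEventData eventData)
    {
        // Remember where the drag started.
        pressPosition = eventData.position;
    }

    public void OnDrag(PointerEventData eventData)
    {
        // Direction from the start of the drag to the current pointer position.
        Vector2 direction = eventData.position - pressPosition;
        if (Mathf.Abs(direction.x) > Mathf.Abs(direction.y))
            Debug.Log(direction.x > 0f ? "Drag right" : "Drag left");
        else
            Debug.Log(direction.y > 0f ? "Drag up" : "Drag down");
    }

    public void OnEndDrag(PointerEventData eventData)
    {
        // Stop moving the player here.
    }
}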
If you want a mobile solution that is really accurate and natural like a physical joystick, take a look at this asset:
Ultra Precise D-Pad Joystick

Mouse up vs. touch up in Unity

I have a bit of an issue: I can't seem to find a good way to distinguish whether Unity is firing a mouse-up vs. a touch-up event. By touch up I mean when the user releases their finger from the screen after touching something on a touch display.
The problem is this: I have an issue with game logic where the user both lets go of something they've selected with their mouse and SIMULTANEOUSLY touches / releases their touch to click a UI button. In other words, I'm trying to detect a mouse release only, or a touch release only; how can I do that? I've tried using both Input.GetMouseButtonUp(0) and Input.GetMouseButton(0), but both change their state whether it's a touch up or a mouse button up event.
So how do you properly distinguish between the two? I even tried using the Touch.fingerId property, which works well to track / distinguish ONLY touch events, but the problem is that I still cannot distinguish ONLY mouse up with Input.GetMouseButton(0), since it fires even in the event of a touch up.
Does anyone out there happen to know the proper way to detect just a mouse release, or just a touch release, separately, in Unity?
Edit:
Someone didn't understand the problem at hand, so let's assume for a second you have a desktop device with touch support. Add this script:
void Update()
{
    if (Input.GetMouseButtonDown(0))
        Debug.Log("Mouse button down.");
    if (Input.GetMouseButtonUp(0))
        Debug.Log("Mouse button up.");
}
If you click and HOLD with your mouse, then, while holding, touch and release with your finger, this will log "Mouse button up." despite the fact that you did not release the mouse button. How can I distinguish between a mouse up and a touch release?
For desktops:
Mouse Down - Input.GetMouseButtonDown
Mouse Up - Input.GetMouseButtonUp
Example:
if (Input.GetMouseButtonDown(0))
{
    Debug.Log("Mouse Pressed");
}
if (Input.GetMouseButtonUp(0))
{
    Debug.Log("Mouse Lifted/Released");
}
For Mobile devices:
Touch Down - TouchPhase.Began
Touch Up - TouchPhase.Ended
Example:
if (Input.touchCount >= 1)
{
    if (Input.touches[0].phase == TouchPhase.Began)
    {
        Debug.Log("Touch Pressed");
    }
    if (Input.touches[0].phase == TouchPhase.Ended)
    {
        Debug.Log("Touch Lifted/Released");
    }
}
Now, if you are having issues with clicks going through UI objects to hit other objects, see this post for how to properly detect clicks on any type of GameObject.
I figured it out.
I was missing this call: Input.simulateMouseWithTouches = false;
It seems to work if you set this in a Start() event only (I could be wrong; it may be that you need to set it before the first Input accessor in Update, so test it out). Otherwise, Unity simulates mouse events from touch events. I can only speculate they do this to make it easier for game developers to write once and play cross-platform.
Note: once it's set, it's set globally, so you'll need to unset it even in a different scene you load later on.
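A minimal sketch of that fix combined with the per-device checks above; Input.simulateMouseWithTouches is the setting named in this answer, and the rest is illustrative:

using UnityEngine;

public class InputSetup : MonoBehaviour
{
    void Start()
    {
        // Stop Unity from synthesizing mouse events from touches.
        // Note: this is a global setting, so it persists across scene loads.
        Input.simulateMouseWithTouches = false;
    }

    void Update()
    {
        // Now this should only fire for a real mouse release...
        if (Input.GetMouseButtonUp(0))
            Debug.Log("Mouse button up.");

        // ...and this only for a real touch release.
        for (int i = 0; i < Input.touchCount; i++)
        {
            if (Input.GetTouch(i).phase == TouchPhase.Ended)
                Debug.Log("Touch up, fingerId " + Input.GetTouch(i).fingerId);
        }
    }
}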

How to Handle Swipe on Oculus Gear VR touchpad avoiding Tap event

I have a Scroll View containing a few buttons as child elements under a content panel.
I have implemented the OVRTouchpad.TouchHandler event like this in my script attached to the Scroll View:
void Start()
{
#if OVR && !UNITY_EDITOR
    OVRTouchpad.Create();
    OVRTouchpad.TouchHandler += HandleTouchHandler;
#endif
}

void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Left || touchArgs.TouchType == OVRTouchpad.TouchEvent.Up)
    {
        // Code to scroll UP for swipes in the LEFT/UP directions
    }
    else if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Right || touchArgs.TouchType == OVRTouchpad.TouchEvent.Down)
    {
        // Code to scroll DOWN for swipes in the RIGHT/DOWN directions
    }
}
Problem:
As I am using the OVR Input Module, it processes a tap input even when I try to swipe. So every time I swipe in any direction while gazing at a button (a child of the scroll view), the button is clicked, taking me to some other menu. I am not sure if this is the desired behaviour of the Gear VR input system. As far as I have seen in the Oculus Home app (and other apps on the store), it only scrolls, without triggering clicks on child elements.
Is there any way to prevent the click/tap when a swipe is detected?
Any kind of help is highly appreciated.
Have you tried inserting an if statement to handle the single-tap trigger first?
if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
{
    // TODO: Nest if statements here checking conditions which indicate a swipe instead of a tap.
    // If nothing further is detected, process the single tap.
    // Otherwise continue with the swipe direction handling.
}
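For instance, here is a sketch of that structure applied to the handler from the question; it assumes OVRTouchpad.TouchEvent exposes SingleTap alongside the directional values already used above:

void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;

    // Handle the explicit tap case first, so swipes never fall through to it.
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
    {
        // Process the tap here.
        return;
    }

    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Left ||
        touchArgs.TouchType == OVRTouchpad.TouchEvent.Up)
    {
        // Scroll UP for swipes in the LEFT/UP directions.
    }
    else if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Right ||
             touchArgs.TouchType == OVRTouchpad.TouchEvent.Down)
    {
        // Scroll DOWN for swipes in the RIGHT/DOWN directions.
    }
}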
That being said, using the VRInput script mentioned in this tutorial would probably be your best bet, as it can handle swipes with direction input too:
https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr
Snippet:
public event Action<SwipeDirection> OnSwipe; // Called every frame passing in the swipe, including if there is no swipe.
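A sketch of subscribing to that event, assuming the VRInput component and its nested SwipeDirection enum from the linked tutorial; the field wiring here is an assumption for illustration:

using UnityEngine;

public class SwipeScroller : MonoBehaviour
{
    // Reference to the VRInput component from the tutorial's VR sample (assumed API).
    [SerializeField] private VRInput vrInput;

    private void OnEnable()  { vrInput.OnSwipe += HandleSwipe; }
    private void OnDisable() { vrInput.OnSwipe -= HandleSwipe; }

    private void HandleSwipe(VRInput.SwipeDirection direction)
    {
        // Called every frame, including when there is no swipe, so filter that out.
        if (direction == VRInput.SwipeDirection.NONE)
            return;

        Debug.Log("Swipe: " + direction);
    }
}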

How to make gameplay ignore clicks on UI Button in Unity3D?

I have a UI Button (using UnityEngine.UI).
However, clicking on the Button seems to click through onto the scene (in my case, clicking a nav mesh).
How do I solve this problem?
I've been using typical Unity3D code to get user input in gameplay, such as:
if (Input.GetMouseButtonDown(0))
{
The same happens if I try the approach:
if (Input.touches.Length > 0)
{
    if (Input.touches[0].phase == TouchPhase.Began)
    {
and it seems to be the case on iOS, Android, and desktop.
It seems to be a basic problem that clicks on the UI (UnityEngine.UI.Button etc.) fall through to the gameplay.
Here's how you do it in Unity today:
Naturally you'll have an EventSystem in the hierarchy; just check that you do. (You get one automatically when, for example, you add a Canvas; usually every scene in a Unity project already has an EventSystem, but do check.)
Add a physics raycaster to the camera (that takes one click).
Then do this:
using UnityEngine.EventSystems;

public class Gameplay : MonoBehaviour, IPointerDownHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Bingo();
    }
}
Basically, that is all there is to it. Quite simply: that is how you handle touch in Unity. Add a raycaster, and have that code.
It looks easy and it is easy. However, it can be complicated to do well.
(Footnote: some horrors of doing drags in Unity: Horrors of OnPointerDown versus OnBeginDrag in Unity3D )
Unity's journey through touch technology has been fascinating:
"Early Unity" ... was extremely easy. Utterly useless. Didn't work at all.
"Current 'new' Unity" ... Works beautifully. Very easy, but difficult to use in an expert manner.
"Coming future Unity" ... Around 2025 they will make it BOTH actually work AND be easy to use. Don't hold your breath.
(The situation is not unlike Unity's UI system. At first the UI system was laughable. Now, it is great, but somewhat complex to use in an expert manner. As of 2019, they are about to again totally change it.)
(The networking is the same. At first it was total trash. The "new" networking is/was pretty good, but has some very bad choices. Just recently 2019 they have changed the networking again.)
Handy related tip!
Remember: when you have a full-screen invisible panel which holds some buttons, you must turn off raycasting on the full-screen invisible panel itself! It's easy to forget.
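A minimal sketch of that tip, assuming the invisible panel carries an Image component (Image.raycastTarget can equally be unticked in the Inspector, and CanvasGroup.blocksRaycasts does the same for a whole subtree):

using UnityEngine;
using UnityEngine.UI;

public class PanelSetup : MonoBehaviour
{
    // Hypothetical reference to the full-screen invisible panel's Image.
    [SerializeField] private Image invisiblePanel;

    void Start()
    {
        // The panel stops swallowing clicks; its child buttons still receive them.
        invisiblePanel.raycastTarget = false;
    }
}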
As a historic matter: here is the rough-and-ready quick-fix for "ignoring the UI", which you used to be able to use in Unity years ago...
if (Input.GetMouseButtonDown(0)) // doesn't really work...
{
    if (UnityEngine.EventSystems.EventSystem.current.IsPointerOverGameObject())
        return;
    Bingo();
}
You cannot do this any more; it has not worked for some years now.
I had this problem too and couldn't find much useful info on it; this is what worked for me:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class SomeClickableObject : MonoBehaviour
{
    // Keep a reference to the UI to detect against;
    // for me this was a panel with some buttons.
    public GameObject ui;

    void OnMouseDown()
    {
        if (!this.IsPointerOverUIObject())
        {
            // do normal OnMouseDown stuff
        }
    }

    private bool IsPointerOverUIObject()
    {
        // Get the current pointer position and raycast it.
        PointerEventData eventDataCurrentPosition = new PointerEventData(EventSystem.current);
        eventDataCurrentPosition.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventDataCurrentPosition, results);

        // Check whether any hit object is part of the UI.
        foreach (RaycastResult r in results)
        {
            bool isUIClick = r.gameObject.transform.IsChildOf(this.ui.transform);
            if (isUIClick)
            {
                return true;
            }
        }
        return false;
    }
}
Essentially each click checks if the click occurred over a UI target.
Unfortunately, the first recommendations didn't help me, so I just gave all panels the tag "UIPanel" and, when the mouse is clicked, check whether any panel is currently active:
void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        if (isPanelsOn())
        {
            // do nothing, because a panel is on
        }
        else
        {
            // do what we need to do
        }
    }
}

public static bool isPanelsOn()
{
    GameObject[] panels = GameObject.FindGameObjectsWithTag("UIPanel");
    foreach (var panel in panels)
    {
        if (panel.activeSelf)
        {
            return true;
        }
    }
    return false;
}
