I'm writing a script to put on a button that will detect a drag direction to move a player
void OnGUI()
{
    if (buttonRect.Contains(Event.current.mousePosition))
    {
        if (Event.current.type == EventType.MouseDown)
        {
            buttonPressed = true;
        }
        if (Event.current.type == EventType.MouseUp)
        {
            buttonPressed = false;
        }
    }
    if (buttonPressed && Event.current.type == EventType.MouseDrag)
    {
        // drag handling (player movement) would go here
    }
}
If this script was to be placed on a button, how could I get the button's bounds as a rectangle?
Also, if anyone has a better solution for controlling movement by drag, I would be open to suggestions.
You can implement your own visual joystick by using Unity's newer event callback functions such as OnBeginDrag, OnDrag, and OnEndDrag. There is an already-made visual joystick package out there for Unity, so implementing your own is like reinventing the wheel.
All you have to do is import the CrossPlatformInputManager package from Unity's UnityStandardAssets, then use CrossPlatformInputManager.GetAxis("Horizontal") and CrossPlatformInputManager.GetAxisRaw("Horizontal") to read the direction of the image/thumb.
To make one from scratch, you can follow this video tutorial.
If you want a mobile solution that is really accurate and natural like a physical joystick, take a look at this asset:
Ultra Precise D-Pad Joystick
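For reference, a minimal sketch of that drag-handler approach might look like the following (the class name, the static InputDirection property, and attaching it to a UI Image used as the thumb are all illustrative assumptions, not part of any package):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal drag-joystick sketch: attach to a UI Image used as the thumb.
// "JoystickThumb" and "InputDirection" are illustrative names only.
public class JoystickThumb : MonoBehaviour, IBeginDragHandler, IDragHandler, IEndDragHandler
{
    public static Vector2 InputDirection { get; private set; }

    private Vector2 startPosition;

    public void OnBeginDrag(PointerEventData eventData)
    {
        startPosition = eventData.position;
    }

    public void OnDrag(PointerEventData eventData)
    {
        // Normalized direction from where the drag began to the pointer now.
        InputDirection = (eventData.position - startPosition).normalized;
    }

    public void OnEndDrag(PointerEventData eventData)
    {
        InputDirection = Vector2.zero;
    }
}
```

A movement script could then read JoystickThumb.InputDirection each frame to drive the player.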
I'm developing a 2D mobile game in Unity and I'm trying to detect whether a player's current finger tap position is on an Object.
Basically I am developing a game similar to Insaniquarium Deluxe by PopCap Games, where you can spawn fish every time you tap on a button, in exchange for in-game currency. Here's how the game looks.
Game picture example
There's a button on the upper-left side of the screen where if you click on it, another fish would spawn in to the aquarium.
I'm trying to recreate this for a mobile game and I'm stuck trying to figure out the way to trace where the user clicks. I've come up with something that works if you tap anywhere on the screen and not just the button.
void Update()
{
    if (Input.touchCount > 0)
    {
        Touch touch = Input.GetTouch(0);
        Vector2 touchPosition = Camera.main.ScreenToWorldPoint(touch.position);
        if (touch.phase == TouchPhase.Ended)
        {
            Collider2D touchedCollider = Physics2D.OverlapPoint(selectionEffect.transform.position);
            Debug.Log(touchPosition.x);
            if (col == touchedCollider)
            {
                spawnFish();
            }
        }
    }
}
What's the code to detect a tap on a Game Object?
You should use Unity's UI system for button creation, as it will automatically handle being interacted with; then you simply assign the code you want to execute to that button.
Here's a video to help introduce you to Unity's UI system:
https://www.youtube.com/watch?v=_RIsfVOqTaE
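As a sketch of that approach (the FishSpawner class and its fields are illustrative, not from the question's project), the spawn logic becomes an ordinary public method that you wire up to the Button's OnClick() list in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: attach this to any object, then drag that object
// onto the Button's OnClick() list in the Inspector and pick SpawnFish.
public class FishSpawner : MonoBehaviour
{
    public GameObject fishPrefab;   // assumed prefab reference
    public Transform spawnPoint;    // assumed spawn location

    public void SpawnFish()
    {
        Instantiate(fishPrefab, spawnPoint.position, Quaternion.identity);
    }
}
```

Alternatively you can wire it in code with GetComponent<Button>().onClick.AddListener(SpawnFish).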
I have a Scroll View containing a few buttons as child elements under a content panel. The hierarchy looks like this:
I have implemented the OVRTouchpad.TouchHandler event like this in my script attached to the ScrollView:
void Start()
{
#if OVR && !UNITY_EDITOR
    OVRTouchpad.Create();
    OVRTouchpad.TouchHandler += HandleTouchHandler;
#endif
}

void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Left || touchArgs.TouchType == OVRTouchpad.TouchEvent.Up)
    {
        // Code to scroll UP for swipe in directions LEFT/UP
    }
    else if (touchArgs.TouchType == OVRTouchpad.TouchEvent.Right || touchArgs.TouchType == OVRTouchpad.TouchEvent.Down)
    {
        // Code to scroll DOWN for swipe in directions RIGHT/DOWN
    }
}
Problem:
As I am using the OVR Input Module, it processes a tap even when I try to swipe. So every time I swipe in any direction while gazing at a button (a child of the scroll view), the button gets clicked, taking me to some other menu. I am not sure if this is the desired behaviour of the Gear VR input system. From what I have seen in the Oculus Home app (and other apps on the store), it only scrolls, without triggering clicks on child elements.
Is there any way to prevent click/tap if swipe is detected?
Any kind of help is highly appreciated.
Have you tried inserting an if statement to handle the single-tap trigger first?
if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
{
    // TODO: Nest if statements here checking conditions which indicate swipe instead of tap.
    // If nothing further is detected, process the single tap.
    // Otherwise continue with swipe direction handling.
}
That being said, using the VRInput script mentioned in this tutorial would probably be your best bet, as it can handle swipes with direction input too.
https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr
Snippet:
public event Action<SwipeDirection> OnSwipe; // Called every frame passing in the swipe, including if there is no swipe.
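Assuming the VRInput script from that tutorial (where SwipeDirection is an enum defined by the sample code, not by the Unity API itself), subscribing might look roughly like this:

```csharp
using UnityEngine;

// Sketch only: VRInput and its nested SwipeDirection enum come from the
// Unity VR Samples tutorial code, not from the engine itself.
public class ScrollViewSwiper : MonoBehaviour
{
    [SerializeField] private VRInput m_VRInput; // assumed reference to the tutorial's VRInput

    private void OnEnable()  { m_VRInput.OnSwipe += HandleSwipe; }
    private void OnDisable() { m_VRInput.OnSwipe -= HandleSwipe; }

    private void HandleSwipe(VRInput.SwipeDirection swipe)
    {
        if (swipe == VRInput.SwipeDirection.UP)
        {
            // scroll up
        }
        else if (swipe == VRInput.SwipeDirection.DOWN)
        {
            // scroll down
        }
    }
}
```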
I'm making an Android app with Unity3D and it works with click detection already, but not with touch. I need touch though for multitouch detection.
What I want: I have my player and 2 images of arrows. One arrow points right and one left. When I touch the left arrow the player should start moving left, when I touch the right arrow the player should start moving right.
But how do I detect which arrow is touched (and held)? All the code I found on Google is too old and not working anymore.
I'm working with C# scripts and it's a 2D game.
Klausar, what you're looking for is .........
A BUTTON (!!!)
it's that easy!
1. Click "Add Canvas". (You very likely want "Scale With Screen Size" - select that option.)
2. Click "Add Button".
3. There is no "three" .. go drinking.
In your code, have a routine like this for the right-side button
public void UserClickedRight()
{
    Debug.Log("click right whoo!");
}
and have a similar routine for the left-side button.
1. Go to your Button in the editor.
2. Simply drag the script to it and select UserClickedRight !!
You actually do not have to program buttons from scratch :)
This is a basic mechanism of Unity - drag a game object to a "slot" for it in the Editor.
(The game object you drag, has the script in question on it.)
You DO NOT need to go to the level of touch handling to achieve a button!
Now you ask, what about "when holding down on the button"?
It's very easy, you just need to know a couple of things about the Unity "event" system.
In Button, Unity has done all the work for "OnClick":
Since "click" is common, they put that one right in the Inspector panel for you for convenience.
The good news is that Button has many more events you can use.
http://docs.unity3d.com/ScriptReference/UI.Button.html
My guess is you want to use OnPointerDown and OnPointerUp in your case.
Using the other events (which Unity did not bother putting in the Inspector panel) is very simple: you just (1) make a script that references the events you want, and (2) put that script ON the button in question ... it's that easy.
Step by step explanation:
You're going to be using Unity's event system, so:
using UnityEngine.EventSystems;
Next. You know that a script normally starts like this...
public class FancyButton:MonoBehaviour
The "MonoBehaviour" part just means that it's a c# script which will be "driving a GameObject".
In this case we have to further alert the engine that you will be using those click events. So you add this.
,IPointerDownHandler,IPointerUpHandler
So far we have this
using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;
public class FancyButton:MonoBehaviour,IPointerDownHandler,IPointerUpHandler
{
Now the easy part.
You just type the two routines which Unity will run for you, when, those things happen.
using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;

public class FancyButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public void OnPointerDown(PointerEventData data)
    {
        Debug.Log("holy! someone put the pointer down!");
    }

    public void OnPointerUp(PointerEventData data)
    {
        Debug.Log("whoa! someone let go!");
    }
}
Now all you have to do is drop that script on the Button. Done!
You can put that script on any button, where, you need that functionality.
Finally, it sounds like you want to do something "when the button is being held down". That's just a general programming issue - you'd have a boolean which you turn on and off as the button goes up and down. Let's do it.
using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;

public class FancyButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    [System.NonSerialized] public bool buttonIsDownRightNow;

    public void OnPointerDown(PointerEventData data)
    {
        buttonIsDownRightNow = true;
    }

    public void OnPointerUp(PointerEventData data)
    {
        buttonIsDownRightNow = false;
    }
}
You could access that variable from another script, or whatever you want.
Add the following if you want to run a routine while the button is down:
void Update()
{
    if (buttonIsDownRightNow) WhileButtonIsDown();
}

private void WhileButtonIsDown()
{
    Debug.Log("THE BUTTON IS DOWN! WHOA!");
}
Try that and watch the console as you hold the button down and up.
Here's an example of something like continually increasing a value while the button is down:
using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;

public class FancyButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    [System.NonSerialized] public bool buttonIsDownRightNow;
    private int countSomething;

    public void OnPointerDown(PointerEventData data)
    {
        buttonIsDownRightNow = true;
    }

    public void OnPointerUp(PointerEventData data)
    {
        buttonIsDownRightNow = false;
    }

    private void WhileButtonIsDown()
    {
        ++countSomething;
    }

    void Update()
    {
        if (buttonIsDownRightNow) WhileButtonIsDown();
        Debug.Log("value is now " + countSomething.ToString());
    }
}
That's all there is to it. Once you really understand events in Unity, there is not much more to learn about Unity.
The only other important topics are Mecanim, shader writing, touch, coroutines, threading, PhysX, native plugins, sprites, networking, dynamic mesh, navigation, VR, AR, animation, inverse kinematics, particles, terrain, IAP, lighting, baking, shadows, MMP, character controllers, and audio. Enjoy!
You should first read this.
What you are trying to do is actually simple:
foreach (Touch touch in Input.touches)
{
    if (touch.phase == TouchPhase.Moved || touch.phase == TouchPhase.Stationary)
    {
        if (guiTexture1.HitTest(touch.position))
        {
            // Maybe move left
        }
        if (guiTexture2.HitTest(touch.position))
        {
            // Maybe move right
        }
    }
}
Edit 1: this code now checks which image is being pressed
Edit 2: Added multitouch support
I have a UI Button (using UnityEngine.UI).
However, clicking on the Button seems to be clicking through onto the scene (in my case clicking a nav mesh).
How to solve this problem?
I've been using typical Unity3D code to get user input in gameplay, such as
if (Input.GetMouseButtonDown(0))
{
same if I try the approach
if (Input.touches.Length > 0)
{
    if (Input.touches[0].phase == TouchPhase.Began)
    {
and it seems to be the case on iOS, Android, and desktop.
It seems to be a basic problem that clicks on the UI (UnityEngine.UI.Button etc) seem to fall through to the gameplay.
Here's how you do it in Unity today:
Naturally you'll have an EventSystem in the hierarchy - just check that you do. (You get one automatically when, for example, you add a Canvas; usually every scene in a Unity project already has an EventSystem.)
Add a physics raycaster to the camera (that takes one click)
Do this:
using UnityEngine.EventSystems;

public class Gameplay : MonoBehaviour, IPointerDownHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Bingo();
    }
}
Basically, that is all there is to it. Quite simply, that is how you handle touch in Unity today.
Add a raycaster, and have that code.
It looks easy and it is easy. However, it can be complicated to do well.
(Footnote: some horrors of doing drags in Unity: Horrors of OnPointerDown versus OnBeginDrag in Unity3D )
Unity's journey through touch technology has been fascinating:
"Early Unity" ... was extremely easy. Utterly useless. Didn't work at all.
"Current 'new' Unity" ... Works beautifully. Very easy, but difficult to use in an expert manner.
"Coming future Unity" ... Around 2025 they will make it BOTH actually work AND be easy to use. Don't hold your breath.
(The situation is not unlike Unity's UI system. At first the UI system was laughable. Now, it is great, but somewhat complex to use in an expert manner. As of 2019, they are about to again totally change it.)
(The networking is the same. At first it was total trash. The "new" networking is/was pretty good, but has some very bad choices. Just recently 2019 they have changed the networking again.)
Handy related tip!
Remember! When you have a full-screen invisible panel which holds some buttons, you must turn off raycasting on the full-screen invisible panel itself! It's easy to forget.
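If you would rather enforce that from code than rely on the Inspector checkbox, the panel's Graphic component exposes a raycastTarget flag (here assuming the panel is an Image; the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: disable raycasting on a full-screen panel so pointer events
// pass through the panel itself to the buttons (and scene) behind it.
public class PanelSetup : MonoBehaviour
{
    void Awake()
    {
        GetComponent<Image>().raycastTarget = false;
    }
}
```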
As a historic matter: here is the rough-and-ready quick-fix for "ignoring the UI", which you used to be able to use in Unity years ago...
if (Input.GetMouseButtonDown(0)) // doesn't really work any more...
{
    if (UnityEngine.EventSystems.EventSystem.current.IsPointerOverGameObject())
        return;
    Bingo();
}
You cannot do this any more, for some years now.
I had this problem too, and I couldn't find much useful info on it; this is what worked for me:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class SomeClickableObject : MonoBehaviour
{
    // Keep a reference to the UI to detect;
    // for me this was a panel with some buttons.
    public GameObject ui;

    void OnMouseDown()
    {
        if (!this.IsPointerOverUIObject())
        {
            // do normal OnMouseDown stuff
        }
    }

    private bool IsPointerOverUIObject()
    {
        // Get the current pointer position and raycast it.
        PointerEventData eventDataCurrentPosition = new PointerEventData(EventSystem.current);
        eventDataCurrentPosition.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventDataCurrentPosition, results);

        // Check if the target is in the UI.
        foreach (RaycastResult r in results)
        {
            bool isUIClick = r.gameObject.transform.IsChildOf(this.ui.transform);
            if (isUIClick)
            {
                return true;
            }
        }
        return false;
    }
}
Essentially each click checks if the click occurred over a UI target.
Unfortunately, the first recommendations didn't help me, so I just gave all panels the tag "UIPanel" and, when the mouse is clicked, check whether any panel is currently active:
void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        if (isPanelsOn())
        {
            // do nothing because a panel is on
        }
        else
        {
            // do what we need to do
        }
    }
}
public static bool isPanelsOn()
{
    GameObject[] panels = GameObject.FindGameObjectsWithTag("UIPanel");
    foreach (var panel in panels)
    {
        if (panel.activeSelf)
        {
            return true;
        }
    }
    return false;
}