I am trying to apply force to a Rigidbody by touching it, dragging away from it, and then releasing the touch. But the TouchPhase.Ended case just never runs, and I can't find the cause. I read input from a single touch until it is released; on release, the distance between the starting and final positions is calculated and a corresponding force is applied to the Rigidbody to make it move. The script is attached to the same object that needs to move.
// Update is called once per frame
void Update()
{
//Update the Text on the screen depending on current TouchPhase, and the current direction vector
// Track a single touch as a direction control.
if (Input.GetMouseButton(0))
{
Touch touch = Input.GetTouch(0);
_touchPosWorld = Camera.main.ScreenToWorldPoint(touch.position); // get the world position where the screen was touched
RaycastHit2D hitInformation = Physics2D.Raycast(_touchPosWorld, Vector2.zero);
FirstTouch = hitInformation.collider != null && hitInformation.collider.CompareTag("ball");
if (FirstTouch || IsInTouch)
{
// Handle finger movements based on TouchPhase
switch (touch.phase)
{
//When a touch has first been detected, change the message and record the starting position
case TouchPhase.Began:
// Record initial touch position.
IsInTouch = true;
_startPos = touch.position;
GameManager.GetInstance().ChangeAccordinglyText.text = "clicked Inside";//test text
//Movement started
break;
//Determine if the touch is a moving touch
case TouchPhase.Moved:
// Determine direction by comparing the current touch position with the initial one
_direction = touch.position - _startPos;
GameManager.GetInstance().ChangeAccordinglyText.text = "MovingTouch to " + touch.position; //test text
//moving
break;
case TouchPhase.Ended:
// Report that the touch has ended when it ends
//end of movement of touch
GameManager.GetInstance().EndedText.text = "EndOfTouch";
float force = 0;
force = _direction.x > _direction.y ? _direction.x : _direction.y;
_rigidbody.AddForce(_direction * force);
FirstTouch = false;
IsInTouch = false;
break;
}
}
}
}
Your TouchPhase.Ended case will never be reached: the moment you release the touch, GetMouseButton(0) becomes false as well, so the entire block is skipped!
To avoid the errors you mentioned, before trying to access a certain touch, first check whether there is any touch to access at all using Input.touchCount:
if(Input.touchCount > 0)
{
var touch = Input.GetTouch(0);
// ...
}
In general, for development you should rather check Input.touchSupported and implement an alternative mouse-based system for simulating touches:
if(Input.touchSupported)
{
/* Implement touches */
if(Input.touchCount > 0)
{
var touch = Input.GetTouch(0);
// ...
}
}
// Optional
else
{
/* alternative mouse implementation for development */
if(Input.GetMouseButtonDown(0))
{
// simulates touch begin
}
else if(Input.GetMouseButton(0))
{
// simulates touch moved
}
// On rare occasions, down and up can both be reported within the same frame, so check up separately
if(Input.GetMouseButtonUp(0))
{
// simulates touch end
}
}
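For your concrete case, the release logic could then live in the TouchPhase.Ended case, guarded by Input.touchCount instead of GetMouseButton(0). This is only a minimal sketch of the idea (it assumes a Rigidbody2D, since you already use Physics2D.Raycast; the class and field names are illustrative, not taken from your project):
using UnityEngine;

public class DragAndRelease : MonoBehaviour // hypothetical name for your component
{
    [SerializeField] private Rigidbody2D _rigidbody;
    private Vector2 _startPos;
    private Vector2 _direction;
    private bool _isInTouch;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            switch (touch.phase)
            {
                case TouchPhase.Began:
                    // Only start tracking if the touch actually hit the ball
                    Vector2 touchPosWorld = Camera.main.ScreenToWorldPoint(touch.position);
                    RaycastHit2D hit = Physics2D.Raycast(touchPosWorld, Vector2.zero);
                    if (hit.collider != null && hit.collider.CompareTag("ball"))
                    {
                        _isInTouch = true;
                        _startPos = touch.position;
                    }
                    break;

                case TouchPhase.Moved:
                    if (_isInTouch)
                    {
                        _direction = touch.position - _startPos;
                    }
                    break;

                case TouchPhase.Ended:
                    // Reached now, because the block is no longer gated on GetMouseButton(0)
                    if (_isInTouch)
                    {
                        float force = _direction.x > _direction.y ? _direction.x : _direction.y;
                        _rigidbody.AddForce(_direction * force);
                        _isInTouch = false;
                    }
                    break;
            }
        }
    }
}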
Related
I am developing a game where objects fall from the top of the screen and you try to tap them before they reach the bottom. The code works fine when I play the game in the editor (with the touch controls swapped for mouse controls), but when I run it on a phone, the game only registers a hit if you tap slightly in front of the object in the direction it is traveling, and does not register a hit if you tap the back end or center of the object. I have built and run the game over 10 times now, trying to fix this each time, but nothing seems to help. My current theory is that my touch-control code has too much going on and/or has redundancies, and by the time it checks whether an object is at the touch position, the object has already moved. Any thoughts on why the hit boxes are off, and is there a better way to do hit detection with a touch screen?
void FixedUpdate()
{
if (IsTouch())
{
CheckTouch(GetTouchPosition());
}
}
// Returns true if the screen is touched
public static bool IsTouch()
{
if (Input.touchCount > 0)
{
if (Input.GetTouch(0).phase == TouchPhase.Began)
{
return true;
}
}
return false;
}
// Gets the position of the touch
private Vector2 GetTouchPosition()
{
Vector2 touchPos = new Vector2(0f, 0f);
Touch touch = Input.GetTouch(0);
if (Input.GetTouch(0).phase == TouchPhase.Began)
{
touchPos = touch.position;
}
return touchPos;
}
// Checks the position of the touch and destroys the ball if the ball is touched
private void CheckTouch(Vector2 touchPos)
{
    RaycastHit2D hit = Physics2D.Raycast(
        Camera.main.ScreenToWorldPoint(Input.GetTouch(0).position), Vector2.zero);
    if (hit.collider != null)
    {
        Destroy(hit.collider.gameObject);
    }
}
I am trying to get touch controls to work for iOS. The result I want is: drag left to move left, drag right to move right, and tap to jump. At the moment the functions do work, but if I only want to drag and not jump, the character still jumps as soon as I touch the screen. Is there a way to get the result I want? I have added my code below. I want the first touch to register only for the drag, and the character to jump only when the player lifts the finger and then taps (or maybe double-taps). Thanks in advance.
//Mobile Touch Drag Controls
{
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Moved)
{
// Get movement of the finger since last frame
Vector2 touchDeltaPosition = Input.GetTouch(0).deltaPosition;
// Move object across X plane
transform.Translate(touchDeltaPosition.x * speed, 0, 0);
}
}
//Mobile Touch To Jump
{
if (Input.GetMouseButtonDown(0))
{
transform.Translate(Vector3.up * jumpForce * Time.deltaTime, Space.World);
}
}
When the button goes down, there is no way to know what the player wants to do. You should jump when the finger goes up instead.
An easy solution is to use a boolean that records whether the player moved before the finger was released; if they did, you don't want to jump:
public class PlayerController : MonoBehaviour // Or whatever name your class has
{
private bool _moved;
private void Update()
{
//Mobile Touch Drag Controls
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Moved)
{
// Get movement of the finger since last frame
Vector2 touchDeltaPosition = Input.GetTouch(0).deltaPosition;
// Move object across X plane
transform.Translate(touchDeltaPosition.x * speed, 0, 0);
_moved = true; // Remember that the player moved
}
////Mobile Touch To Jump
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Ended)
{
if(!_moved) // Only jump if we didn't move
{
transform.Translate(Vector3.up * jumpForce * Time.deltaTime, Space.World);
}
_moved = false;
}
}
}
I assume this code goes in your Update function. It will work fine, but here is a more elegant solution in case you want to try it:
Your class can implement IDragHandler and IEndDragHandler (among others; look at the interfaces that extend IEventSystemHandler).
When you implement these, you define methods that get called when the user interacts with the screen, such as public void OnDrag(PointerEventData eventData).
With this you don't have to use the Update function anymore.
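A rough sketch of that interface-based approach (untested; it assumes an EventSystem in the scene, a PhysicsRaycaster on the camera and a collider on the player object, and the class/field names are just placeholders) could look like this:
using UnityEngine;
using UnityEngine.EventSystems;

public class PlayerTouchController : MonoBehaviour, IDragHandler, IEndDragHandler, IPointerClickHandler
{
    [SerializeField] private float speed = 0.01f;
    [SerializeField] private float jumpForce = 5f;

    private bool _dragged;

    // Called every frame while the pointer is dragged over this object
    public void OnDrag(PointerEventData eventData)
    {
        // eventData.delta is the pointer movement since the last frame
        transform.Translate(eventData.delta.x * speed, 0f, 0f);
        _dragged = true;
    }

    // Called once when the drag ends
    public void OnEndDrag(PointerEventData eventData)
    {
        _dragged = false;
    }

    // Called when the pointer is pressed and released on this object
    public void OnPointerClick(PointerEventData eventData)
    {
        if (!_dragged) // Only jump if this was a tap, not a drag
        {
            transform.Translate(Vector3.up * jumpForce * Time.deltaTime, Space.World);
        }
    }
}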
When I run my app on Android, the first finger touch calls Input.GetMouseButtonDown(0)
and the second touch calls Input.GetMouseButtonDown(1).
In some cases I want to override GetMouseButtonDown(0) so that the second finger's touch (1) becomes the first (0), but I don't know how to do it.
Alternatively, I would like to force a mouse-button-up on the first finger's touch: I want to remove that first "click" from the system so that Input.mousePosition is not affected when there are two touches.
Why?
I'm creating a paint app where the user can draw lines.
There is an area where the user can paint (a rectangle) and an area where they shouldn't; I know how to detect a press on the unwanted area.
But sometimes the palm of my hand creates an unwanted first touch, Input.GetMouseButtonDown(0), on the unwanted area (without a matching Input.GetMouseButtonUp(0)). Then, when I start to draw a line, Input.mousePosition is the average of both touches. So I just want a way to remove a "down" touch/click from the system, or another way to solve my problem.
Here is my code:
if (Input.touchCount == 0)
{
    screenPoint = new Vector3(0, 0, 0);
    currentTouch = 4; // currentTouch is for GetMouseButtonUp(currentTouch)
}
for (int i = 0; i < Input.touchCount; i++)
{
    touch = Input.GetTouch(i);
    screenPointTemp = touch.position;
    screenPointTemp3 = new Vector3(screenPointTemp.x, screenPointTemp.y, zCam);
    // if the touch is in a "good" zone
    if (Camera.main.ScreenToWorldPoint(screenPointTemp3).z > BottomNod.transform.position.z - nodeScale)
    {
        screenPoint = touch.position;
        currentTouch = i;
    }
}
if (Input.GetMouseButtonUp(currentTouch))
{...}
When working on mobile devices, to detect a touch on the screen without clicking on any object, you should use Input.touchCount together with Input.GetTouch or Input.touches. I highly recommend Input.GetTouch, since it doesn't allocate temporary arrays the way Input.touches does. To get the position of a touch, use Input.GetTouch(index).position.
Each of these returns a Touch, so you can use Touch.fingerId to detect and keep track of as many touches as you want at the same time. You can also use the index passed to Input.GetTouch to keep track of a touch; that's totally up to you.
This detects every touch down, move and up on mobile devices:
for (int i = 0; i < Input.touchCount; ++i)
{
//Touch Down
if (Input.GetTouch(i).phase == TouchPhase.Began)
{
}
//Touch Moved
if (Input.GetTouch(i).phase == TouchPhase.Moved)
{
}
//Touch Up
if (Input.GetTouch(i).phase == TouchPhase.Ended)
{
}
}
Limit to one touch only (use index 0):
if (Input.touchCount == 1)
{
//Touch Down
if (Input.GetTouch(0).phase == TouchPhase.Began)
{
}
//Touch Moved
if (Input.GetTouch(0).phase == TouchPhase.Moved)
{
//Draw?
}
//Touch Up
if (Input.GetTouch(0).phase == TouchPhase.Ended)
{
}
}
Like I said, you can limit this with fingerId; the exact implementation depends on what you want.
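For instance, here is a minimal sketch (class and field names are made up for illustration) of tracking a single drawing finger by its fingerId, so that a later touch, such as a resting palm, is simply ignored:
using UnityEngine;

public class SingleFingerTracker : MonoBehaviour // hypothetical example component
{
    private int _drawFingerId = -1; // -1 means no finger is currently tracked

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);

            // Start tracking the first finger that goes down
            if (touch.phase == TouchPhase.Began && _drawFingerId == -1)
            {
                _drawFingerId = touch.fingerId;
            }

            // Ignore every touch that isn't the tracked finger
            if (touch.fingerId != _drawFingerId)
                continue;

            if (touch.phase == TouchPhase.Moved)
            {
                // Draw at touch.position here
            }
            else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
            {
                _drawFingerId = -1; // finger released, stop tracking
            }
        }
    }
}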
We are building an example that is similar to the "Kitten - Placing Virtual objects in AR" as shown here:
https://developers.google.com/tango/apis/unity/unity-howto-placing-objects.
Basically when you touch the screen, a kitten appears on the real world plane (floor).
In our app we have a side menu, with a few buttons and each shows a different game object. We want to detect touch anywhere on the screen except where there is UI. We want the UI to block touches in Tango, and only allow touches to instantiate the related game objects on areas of the screen without UI elements.
The touch specific code is here:
void Update() {
if (Input.touchCount == 1) {
// Trigger placepictureframe function when single touch ended.
Touch t = Input.GetTouch(0);
if (t.phase == TouchPhase.Ended) {
PlacePictureFrame(t.position);
}
}
}
(The PlacePictureFrame() places a picture frame object at the touch position.)
I can't find any Tango examples that combine touch and UI. I've tried an asset called LeanTouch to block touches behind UI elements, but it doesn't seem to work with Tango specifically. Please help!
I have tried using method 5 from this:
How to detect events on UI and GameObjects with the new EventSystem API
and while it does add a PhysicsRaycaster to the TangoARCamera (which is tagged as MainCamera), the OnPointerDown method produces no debug logs no matter where you touch the screen. Tango is a special case so this is not a duplicate question. See below:
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
public class PictureFrameUIController : MonoBehaviour, IPointerClickHandler {
public GameObject m_pictureFrame;
private TangoPointCloud m_pointCloud;
void Start() {
m_pointCloud = FindObjectOfType<TangoPointCloud>();
addPhysicsRaycaster();
}
void addPhysicsRaycaster() {
PhysicsRaycaster physicsRaycaster = GameObject.FindObjectOfType<PhysicsRaycaster>();
if (physicsRaycaster == null) {
Camera.main.gameObject.AddComponent<PhysicsRaycaster>();
}
}
public void OnPointerClick(PointerEventData eventData) {
Debug.Log("Clicked: " + eventData.pointerCurrentRaycast.gameObject.name);
PlacePictureFrame(eventData.pointerCurrentRaycast.screenPosition);
}
//void Update() {
// if (Input.touchCount == 1) {
// // Trigger placepictureframe function when single touch ended.
// Touch t = Input.GetTouch(0);
// if (t.phase == TouchPhase.Ended) {
// PlacePictureFrame(t.position);
// }
// }
//}
void PlacePictureFrame(Vector2 touchPosition) {
// Find the plane.
Camera cam = Camera.main;
Vector3 planeCenter;
Plane plane;
if (!m_pointCloud.FindPlane(cam, touchPosition, out planeCenter, out plane)) {
Debug.Log("cannot find plane.");
return;
}
// Place picture frame on the surface, and make it always face the camera.
if (Vector3.Angle(plane.normal, Vector3.up) > 60.0f && Vector3.Angle(plane.normal, Vector3.up) < 140.0f) {
Vector3 forward = plane.normal;
// Vector3 right = Vector3.Cross(plane.normal, cam.transform.forward).normalized;
// Vector3 forward = Vector3.Cross(right, plane.normal).normalized;
Instantiate(m_pictureFrame, planeCenter, Quaternion.LookRotation(forward, Vector3.up));
} else {
Debug.Log("surface is not steep enough for picture frame to be placed on.");
}
}
public void DeleteAllFrames() {
GameObject[] frames = GameObject.FindGameObjectsWithTag("Frame");
if (frames == null) {
return;
}
foreach (GameObject frame in frames) {
Destroy(frame);
}
}
}
If you want to detect a click anywhere on the screen except for where there is a UI control/component, you have to check if the pointer is over the UI with EventSystem.current.IsPointerOverGameObject(Input.GetTouch(0).fingerId).
On desktop you would use EventSystem.current.IsPointerOverGameObject(). Since you are using Tango (a touch device), EventSystem.current.IsPointerOverGameObject(Input.GetTouch(0).fingerId) should be used.
void Update()
{
if (Input.touchCount == 1)
{
//Trigger placepictureframe function when single touch ended.
Touch t = Input.GetTouch(0);
if (t.phase == TouchPhase.Ended)
{
//Make sure that pointer is not over UI before calling PlacePictureFrame
if (!EventSystem.current.IsPointerOverGameObject(Input.GetTouch(0).fingerId))
{
PlacePictureFrame(t.position);
}
}
}
}
Edit:
It seems like this works with TouchPhase.Began only.
Change t.phase == TouchPhase.Ended to t.phase == TouchPhase.Began and this should work as expected. Make sure to test on a mobile device/Tango instead of with your mouse.
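In other words, the check from above becomes something like:
if (t.phase == TouchPhase.Began)
{
    // Make sure that the pointer is not over UI before calling PlacePictureFrame
    if (!EventSystem.current.IsPointerOverGameObject(t.fingerId))
    {
        PlacePictureFrame(t.position);
    }
}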
How can you read two gestures at the same time? I'm currently developing a game where two players should each use the FreeDrag gesture.
What happens now is:
When player A starts dragging, it works perfectly. If player B then also starts a FreeDrag gesture, TouchPanel.ReadGesture() doesn't register it until player A's gesture has finished.
I use the following code:
In Initialize()
TouchPanel.EnabledGestures = GestureType.FreeDrag;
In Update()
if (TouchPanel.IsGestureAvailable)
{
GestureSample touch = TouchPanel.ReadGesture();
if (touch.GestureType == GestureType.FreeDrag)
{
if (touch.Position.Y > GraphicsDevice.Viewport.Height/2)
{
//logic Player A here
}
else
{
//logic Player B there
}
}
}
You have an example in the MSDN documentation:
http://msdn.microsoft.com/en-us/library/ff827740.aspx
// get any gestures that are ready.
while (TouchPanel.IsGestureAvailable)
{
GestureSample gs = TouchPanel.ReadGesture();
switch (gs.GestureType)
{
case GestureType.VerticalDrag:
// move the poem screen vertically by the drag delta
// amount.
poem.offset.Y -= gs.Delta.Y;
break;
case GestureType.Flick:
// add velocity to the poem screen (only interested in
// changes to Y velocity).
poem.velocity.Y += gs.Delta.Y;
break;
}
}
You can't: FreeDrag isn't a multitouch gesture. You should instead use TouchLocation and TouchCollection, which let you handle multitouch directly. Unfortunately, you then can't use the built-in gestures you've declared in TouchPanel.EnabledGestures.
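A minimal sketch of reading the raw touch state (namespace Microsoft.Xna.Framework.Input.Touch), placed in Update() and reusing the half-screen split from your code, could look like this:
// Each TouchLocation keeps the same Id while the finger stays on the screen,
// so both players' fingers can be tracked independently.
TouchCollection touches = TouchPanel.GetState();
foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed ||
        touch.State == TouchLocationState.Moved)
    {
        if (touch.Position.Y > GraphicsDevice.Viewport.Height / 2)
        {
            // logic Player A here (key on touch.Id if you need continuity between frames)
        }
        else
        {
            // logic Player B here
        }
    }
}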