I'm making a simple game in Unity and decided to make it playable on mobile. To move a cube I call AddForce inside the Update method, and I've now added two UI buttons to move the cube. The problem is that the action needs to run while a button is held rather than on a single click. How is that done in Unity?
Something like this, but triggered by a button hold:
if (Input.GetKey("a") || Input.GetKey("[4]") || Input.GetKey("left"))
{
    rb.AddForce(-sideWaysForce * Time.deltaTime, 0, 0, ForceMode.VelocityChange);
}
And I'm using Unity 2020.
What you can do is add a Rigidbody to the cube and make a script for the UI canvas. Once you've done that, you can reference the cube like this: public GameObject cube;. Then, in the Update function, add some if statements checking whether either button is pressed. If, for example, the button that moves the cube left is pressed, add a force to the cube on the x axis. Here's a hint on how to do this: cube.GetComponent<Rigidbody>().AddForce(15, 0, 0). I'm sure you can figure the rest out by yourself.
I hope you found this useful! :D
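To make the action fire while a UI Button is held rather than on a single click, one common approach is to implement the IPointerDownHandler and IPointerUpHandler interfaces on a script attached to the button and apply the force while a flag is set. A minimal sketch, reusing the rb and sideWaysForce names from the question; the class name, the direction field and the Inspector-assigned references are assumptions, so adapt them to your setup:

using UnityEngine;
using UnityEngine.EventSystems;

public class HoldButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public Rigidbody rb;               // the cube's Rigidbody, assigned in the Inspector (assumption)
    public float sideWaysForce = 10f;  // same name as in the question
    public int direction = -1;         // -1 for the left button, 1 for the right button

    private bool isHeld;

    public void OnPointerDown(PointerEventData eventData) { isHeld = true; }
    public void OnPointerUp(PointerEventData eventData)   { isHeld = false; }

    void FixedUpdate()
    {
        // Apply the force for as long as the button is held, mirroring the Input.GetKey branch above.
        if (isHeld)
        {
            rb.AddForce(direction * sideWaysForce * Time.deltaTime, 0, 0, ForceMode.VelocityChange);
        }
    }
}

Attach one instance of this script to each arrow button and set direction to -1 or 1 in the Inspector.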
I would like to understand how to create a script to climb objects of different heights, like in this video.
I know you have to use a LayerMask to detect whether you are in front of an object, but I don't understand the script that moves the character from below to above the object, or to the other side (if it is a wall to climb over).
For a ladder, I thought I'd set gravity to false and use transform.up * the vertical input to go up or down. But what script is used to position yourself on top of a wall in this way?
And how can the same animation be used on walls of different heights, as in the video?
Use an Animator and set up your animations in your state machine. In the Animator, tick 'Apply Root Motion' and the animation will drive your character's position.
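This isn't part of the answer above, just a minimal sketch of the LayerMask check mentioned in the question combined with a root-motion climb state; the climbable layer, the "Climb" trigger name and the key used to start the climb are all assumptions about your project:

using UnityEngine;

public class ClimbCheck : MonoBehaviour
{
    public LayerMask climbableMask;   // assumed: the layer(s) assigned to climbable walls/objects
    public float checkDistance = 1f;

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Cast a ray forward from roughly chest height to see if a climbable object is ahead.
        Vector3 origin = transform.position + Vector3.up;
        if (Physics.Raycast(origin, transform.forward, out RaycastHit hit, checkDistance, climbableMask))
        {
            if (Input.GetKeyDown(KeyCode.Space))
            {
                // With "Apply Root Motion" ticked, the climb animation itself moves the character up and over,
                // so the same state works for different wall heights if the animation is authored that way.
                animator.SetTrigger("Climb"); // assumed trigger name in your state machine
            }
        }
    }
}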
I know this is a really simple question, but I can't figure out how to achieve this:
I have a UI button in my scene and I want Vuforia to instantiate one AR Model ONLY when I press a button.
Following a tutorial on the net I was able to instantiate a model on the screen when I touch it, but I need to know how to set up Vuforia to achieve the same result only when I press a button.
Do I have to disable the "Anchor Input Listener Behaviour"?
And then?
I want to call PositionContentAtPlaneAnchor, but I can't figure out how to call it in the right way in the OnClick field of the button. Do I need to make a custom script for this?
Thanks for any answer.
Ok, sorry for the delay.
I deduce that you are working with the Ground Plane; if you have the Ground Plane Stage and the Plane Finder in the scene and they work, we're at a good point.
Now you only need to add a button to the scene and, in a script, add something like this:
public PlaneFinderBehaviour plane;
public Button buttonOnTheScene; // the UI Button added to the scene (requires using UnityEngine.UI)

void Start()
{
    ...
    buttonOnTheScene.onClick.AddListener(TaskOnClick);
    ...
}

void TaskOnClick()
{
    Vector2 aPosition = new Vector2(0, 0);
    ...
    plane.PerformHitTest(aPosition);
}
What does this mean?
First of all, you must drag the Plane Finder from the Hierarchy onto the script's plane variable, so the script has a reference to the plane.
Then, when you click (or tap) the button, you simulate a click (or tap) on the display with PerformHitTest.
If you're wondering why I asked my question in the comment, it's because the Plane Finder Behaviour script has two modes: Interactive and Automatic. Interactive intercepts the tap on the display and shows the object (on the ground plane) at the exact position of the tap; Automatic shows the object in the centre of the plane.
So if you want the object at an exact position you can pass a Vector2 position to PerformHitTest, and if you want to show an object programmatically, or do something when the object is shown, you can hook a custom method to the OnInteractiveHitTest event.
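For example, to simulate a tap in the middle of the screen instead of the bottom-left corner at (0,0), you could pass the screen centre to the same call:

void TaskOnClick()
{
    // Simulate a tap at the centre of the screen rather than at (0,0).
    Vector2 centre = new Vector2(Screen.width / 2f, Screen.height / 2f);
    plane.PerformHitTest(centre);
}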
That's all.
(I already followed tutorial 002 from VRTK to use the custom trigger events on the right and left controllers. It does work, but not for what I want.)
I have a character instantiated from a prefab with its own AnimatorController. I attached my own "animator script" to it, which plays a hand gesture when I press the 0, 1, 2, 3... keys, and it works like this:
void Update()
{
    // LEFT HAND //
    if (Input.GetKey(KeyCode.Keypad0))
        m_Animator.SetInteger("HandGestureLeft", 0); // FIST
    else if (Input.GetKey(KeyCode.Keypad1))
        m_Animator.SetInteger("HandGestureLeft", 1); // PALM
    else if (Input.GetKey(KeyCode.Keypad2))
        m_Animator.SetInteger("HandGestureLeft", 2); // POINT
}
Now I want to do this from my VR controller button inputs instead of the 0, 1, 2, 3 keypad keys, and I learnt that VRTK has some functions for that. So I went into the VRTK_ControllerEvents_ListenerExample script from the 002 example scene and put it on the left and right controllers (which are inside the VRTK GameObject in the scene), not on my prefab avatar, because if I do that nothing works anymore; as the tutorial says, the 002 example script must be on the controllers instantiated by VRTK.
I can put animations inside it and have them working, but they only work for my prefab if I put it in the scene already and don't instantiate it. So as a player I can't have my gestures working, because when I hit Play my character must be instantiated for my game to work. (My default script handles that fine when I press the 0, 1, 2, 3... keys, because I don't need it on the hand controllers; I can have it on my prefab.)
So I have a problem, and from what I see there may be two solutions.
I could call the VRTK controller functions directly inside my old "animator script". It could look like turning:
if (Input.GetKey(KeyCode.Keypad0))
    m_Animator.SetInteger("HandGestureLeft", 0); // FIST
into
if (GetComponent<VRTK_ControllerEvents>().TriggerPressed == true) (but it tells me that TriggerPressed isn't available)
    m_Animator.SetInteger("HandGestureLeft", 0); // FIST
(But I have no idea how to grab the VR controller input from where VRTK is listening.)
Or there could be a way to call my prefab's AnimatorController component from inside the VRTK_ControllerEvents_ListenerExample script on the left and right controllers. But here too, I have no idea how to call the prefab's component; I can only get the Animator component if my avatar prefab is already in the scene.
In the end, all I need is a way to check from my prefab's script whether the left or right controller is pressing the trigger, just like Input.GetKey(KeyCode.Keypad0) but using VRTK to get the trigger input instead.
I'm totally lost, could someone help me? :D
Here are two images. Right now my working code looks like this:
![1]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD4YD6/code.png "Code0"
![2]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD6DR6/code1.png "Code1"
Alright, I finally found it. It looks like this:
if (RightHand.GetComponent<VRTK_ControllerEvents>().IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress) == true)
{
    m_Animator.SetInteger("HandGestureLeft", 0);
}
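Putting it together, here is a minimal sketch of that polling approach on the avatar's own script, in the style of the original keypad version. It assumes LeftHand is assigned (in the Inspector or at runtime) to the left controller object that carries VRTK_ControllerEvents, and it uses the VRTK 3.x ButtonAlias names; the class name and the gesture mapping are just illustrations:

using UnityEngine;
using VRTK;

public class HandGestureFromController : MonoBehaviour
{
    public GameObject LeftHand;   // assumed: the left controller alias object with VRTK_ControllerEvents on it
    private Animator m_Animator;

    void Start()
    {
        m_Animator = GetComponent<Animator>();
    }

    void Update()
    {
        VRTK_ControllerEvents events = LeftHand.GetComponent<VRTK_ControllerEvents>();

        if (events.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress))
            m_Animator.SetInteger("HandGestureLeft", 0); // FIST
        else if (events.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.TriggerPress))
            m_Animator.SetInteger("HandGestureLeft", 2); // POINT
    }
}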
I have two scripts like this:
void Update()
{
    if (Input.GetMouseButton(0))
    {
        if (GameObject.FindGameObjectWithTag("Object").transform.position.x > -1.9)
        {
            GameObject.FindGameObjectWithTag("Object").transform.Translate(Vector2.left * szybkosc);
        }
    }
}
The second script is similar; the only difference is "Vector2.right". If I have just one of the scripts it works perfectly, but when I add the second script, neither of them works. I add these scripts to two different GameObjects.
"The second script is similar; the only difference is 'Vector2.right'"
That's the problem right there.
You are moving the object left with GameObject.FindGameObjectWithTag("Object").transform.Translate(Vector2.left * szybkosc);,
and then you have another script moving the same object right with GameObject.FindGameObjectWithTag("Object").transform.Translate(Vector2.right * szybkosc);.
What do you expect to happen? The "Object" GameObject will either stay still or behave strangely; you can't move one object in two opposite directions at the same time.
Maybe the second script is supposed to move another GameObject, GameObject.FindGameObjectWithTag("AnotherObject"), instead of the one the first script is already moving....
Not related to your problem, but you should call GameObject.FindGameObjectWithTag once, in the Start function:
GameObject objToMove;

void Start()
{
    objToMove = GameObject.FindGameObjectWithTag("Object");
}

void Update()
{
    if (Input.GetMouseButton(0))
    {
        if (objToMove.transform.position.x > -1.9)
        {
            objToMove.transform.Translate(Vector2.left * szybkosc);
        }
    }
}
EDIT:
"I'm making a control system. I have two arrows, a left arrow and a right arrow. When I touch and hold the left arrow, the object moves left; when I touch and hold the right arrow, the object moves right."
It's good to mention what you are doing in your question. This is what the EventSystems interfaces are for: OnPointerClick, OnDrag and OnEndDrag are used to build this kind of control. You can find more about this here.
In fact, what you need is a visual joystick. Use Unity's CrossPlatformInputManager from the Standard Assets package on the Asset Store and that should do it; it will save you a lot of the time of making your own with OnPointerClick and OnDrag.
The value from CrossPlatformInputManager.GetAxisRaw("Horizontal") can then be used to move your Object left or right.
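A minimal sketch of that last step, assuming the Standard Assets CrossPlatformInput package is imported and your on-screen arrows feed the "Horizontal" axis (the class name is just a placeholder):

using UnityEngine;
using UnityStandardAssets.CrossPlatformInput;

public class TouchMove : MonoBehaviour
{
    public float szybkosc = 5f; // "speed", matching the variable name from the question

    void Update()
    {
        // -1 while the left arrow is held, +1 while the right arrow is held, 0 otherwise.
        float h = CrossPlatformInputManager.GetAxisRaw("Horizontal");
        transform.Translate(Vector2.right * h * szybkosc * Time.deltaTime);
    }
}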
Basically, I'm trying to make a game based on automatically hopping as soon as the ground is hit (the controller I'm using is the pre-made one that can be imported). In order to do this, I removed the jump function from the controller script and added a script called "Cube", which reads as follows:
using UnityEngine;
using System.Collections;

public class Cube : MonoBehaviour
{
    void OnCollisionStay(Collision col)
    {
        if (col.gameObject.name == "Blue")
        {
            Rigidbody rig = GetComponent<Rigidbody>();
            rig.velocity = new Vector3(0, 8, 0);
            print("collision detected");
        }
    }
}
After doing this, I expected a controllable character that jumps as soon as it hits the ground, because of the OnCollisionStay() trigger. Instead, I get a rapid jump that happens even when I'm in the air, which looks like this:
https://youtu.be/ILtRac_RgLg
First of all, undo every modification you made to the RigidbodyFirstPersonController script. If possible, delete it and re-import a clean one from Unity.
Select your RigidBodyFPSController GameObject and look at the RigidbodyFirstPersonController script attached to it in the Inspector. Under it there is a section called Advanced Settings, which contains a variable called Shell Offset. Change Shell Offset from its default value of 0 to 0.5, then play again and the problem should be gone. If that doesn't work, bump it up a bit more.
You are not staying in the air because gravity is acting on the Rigidbody. As soon as you exit the collision, the upward y velocity you set is no longer applied and gravity pulls the object back down.
Disable gravity in the FPSController's Rigidbody component if you don't want to use gravity.
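For reference, gravity can also be disabled from code; a small sketch, equivalent to unticking Use Gravity in the Inspector:

// Disable gravity on this object's Rigidbody (same as unticking "Use Gravity" in the Inspector).
Rigidbody rig = GetComponent<Rigidbody>();
rig.useGravity = false;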