Vuforia + Unity : Instantiate model in AR when UI Button pressed - c#

I know this is a really simple question but I can't figure out how to achieve this:
I have a UI button in my scene and I want Vuforia to instantiate one AR model ONLY when I press that button.
Following a tutorial on the net I was able to instantiate a model on the screen when I touch it, but I need to know how to set up Vuforia so it achieves the same result only when I press a button.
Do I have to disable the "Anchor Input Listener Behaviour"?
And then?
I want to call PositionContentAtPlaneAnchor but I can't figure out how to call it the right way from the OnClick field of the button. Do I need to make a custom script for this?
Thanks for any answer.

Ok, sorry for the delay.
I deduce that you are working with the ground plane; if you have the Ground Plane Stage and the Plane Finder in the scene and they work, we're at a good point.
Now you only have to add a button to the scene and, in your script, add something like this:
public PlaneFinderBehaviour plane;   // drag the Plane Finder here in the Inspector
public Button buttonOnTheScene;      // drag the UI Button here in the Inspector

void Start()
{
    ...
    buttonOnTheScene.onClick.AddListener(TaskOnClick);
    ...
}

void TaskOnClick()
{
    // Screen position of the simulated tap.
    Vector2 aPosition = new Vector2(0, 0);
    ...
    plane.PerformHitTest(aPosition);
}
What does this mean?
First of all, you must drag the Plane Finder from the Hierarchy onto the script variable, so we have a reference to the plane inside the script.
Then, when you click (or tap) the button, you simulate a click (or tap) on the display with PerformHitTest.
If you're wondering about my question in the comments, it's because the Plane Finder Behaviour script has two mode types: Interactive and Automatic. Interactive intercepts the tap on the display and shows the object (on the ground plane) at the exact position of the tap; Automatic shows the object in the center of the plane.
So if you want the object at an exact position you can pass that Vector2 position to PerformHitTest, and if you want to show an object programmatically, or do something when it shows the object, you can hook up a custom OnInteractiveHitTest method, as in the sketch below.
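For reference, here is a minimal sketch of such a handler, wired to the Plane Finder's On Interactive Hit Test event. It is only an assumption of how this could look: the contentPositioning field is a hypothetical Inspector reference to the Content Positioning Behaviour, not something from the original answer.
// These lines could be added to the same script as above (requires "using Vuforia;").
public ContentPositioningBehaviour contentPositioning; // assumed: a reference to the Content Positioning Behaviour

// Wire this method to the Plane Finder's "On Interactive Hit Test" event.
public void OnInteractiveHitTest(HitTestResult result)
{
    if (result == null)
        return;
    // Anchor the Ground Plane Stage content at the position of the hit.
    contentPositioning.PositionContentAtPlaneAnchor(result);
}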
That's all.

Related

Get VRTK input to set a value inside an animator controller

(I already followed tutorial 002 from VRTK to use the custom trigger events on the right and left controllers. It does work, but not for what I want.)
I have a character instantiated from a prefab with its own AnimatorController. I attached my own "animator script" to it, which plays hand gestures when I press the 0, 1, 2, 3... keys. It works, and it looks like this:
void Update()
{
    // LEFT HAND //
    if (Input.GetKey(KeyCode.Keypad0))
        m_Animator.SetInteger("HandGestureLeft", 0); // FIST
    else if (Input.GetKey(KeyCode.Keypad1))
        m_Animator.SetInteger("HandGestureLeft", 1); // PALM
    else if (Input.GetKey(KeyCode.Keypad2))
        m_Animator.SetInteger("HandGestureLeft", 2); // POINT
}
Now I want to do it from my VR controller button inputs instead of the 0, 1, 2, 3 keypad, and I learnt that VRTK has some functions for that. So I went into the VRTK_ControllerEvents_ListenerExample script from the 002 example scene and put it on the left and right controllers (which are inside the VRTK GameObject of the scene), not on my prefab avatar, because if I do that nothing works anymore; the 002 example script must be on the controllers instantiated by VRTK, as the tutorial says.
I can put animations inside and have them working, but they only work for my prefab if I put it in the scene already and do not instantiate it. So as a player I cannot have my gestures working, because when I hit play my character must be instantiated for my game to work (and my default script handles that fine when I press the 0, 1, 2, 3... keys, because I do not need to have it on the hand controllers; I can have it on my prefab).
So I have a problem, and there seem to be two possible solutions.
I could call the VRTK controller functions directly inside my old "animator script", so it could turn:
if (Input.GetKey(KeyCode.Keypad0))
m_Animator.SetInteger("HandGestureLeft", 0); // FIST
into
if (GetComponent().TriggerPressed == true) (but it tells me that TriggerPressed isn't available)
m_Animator.SetInteger("HandGestureLeft", 0); // FIST
(But I have no idea how to grab the VR controller input from where VRTK is listening)
Or there could be a way to call my prefab's Animator component from inside the VRTK_ControllerEvents_ListenerExample script on the left and right controllers. But here too I have no idea how to reach the prefab's component; I can only get the Animator component if my avatar prefab is already in the scene.
In the end, all I really need is a way to check from my prefab's script whether the left or right controller is pressing the trigger, just like Input.GetKey(KeyCode.Keypad0) but using the VRTK way of getting the trigger input instead.
I'm totally lost, could someone help me? :D
Here are two images. Right now my working code looks like this:
![1]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD4YD6/code.png "Code0"
![2]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD6DR6/code1.png "Code1"
Alright, I finally found it. It was like this:
if (RightHand.GetComponent<VRTK_ControllerEvents>().IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress) == true)
{
    m_Animator.SetInteger("HandGestureLeft", 0);
}
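To put this back in terms of the original keypad script, here is a minimal sketch of the same check driving the animator. It is only a sketch under assumptions: the class name and the LeftHand/RightHand fields (Inspector references to the VRTK controller objects) are mine, not from the original code.
using UnityEngine;
using VRTK;

public class HandGestureVRTK : MonoBehaviour
{
    public GameObject LeftHand;   // assumed: the VRTK left controller object
    public GameObject RightHand;  // assumed: the VRTK right controller object
    private Animator m_Animator;

    void Start()
    {
        m_Animator = GetComponent<Animator>();
    }

    void Update()
    {
        VRTK_ControllerEvents leftEvents = LeftHand.GetComponent<VRTK_ControllerEvents>();
        if (leftEvents == null)
            return;

        // Same idea as Input.GetKey(KeyCode.Keypad0), but driven by the VRTK trigger/grip.
        if (leftEvents.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.TriggerPress))
            m_Animator.SetInteger("HandGestureLeft", 0); // FIST
        else if (leftEvents.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress))
            m_Animator.SetInteger("HandGestureLeft", 1); // PALM
    }
}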

Create objects with ground plane detection only once with Vuforia & Unity

I am trying to create an AR app using Unity & Vuforia. I have a 3D model that needs to be spawned when the ground plane is detected, but this needs to happen only once.
The way Vuforia works, it keeps spawning objects whenever a new plane is detected. So what I need to do is either detect the plane only once or spawn the object only once. As I am new to Unity, I need help doing this. It would be great if someone could tell me what I need to do to achieve this.
Vuforia has been updated; there is no DeploymentStageOnce script anymore. In order to stop duplicating on every touch, we have to turn off Duplicate Stage on the Content Positioning Behaviour (Script). Check the Inspector when you click the Plane Finder.
In your app you should have a Plane Finder object somewhere, with the following properties set by default.
The Plane Finder object has a Behaviour component attached that calls a Position Content method when a plane is found. That method belongs to the Content Positioning Behaviour, and it makes an instance (clone) of your Ground Plane Stage. In order to avoid more than one instance, you should import the Vuforia Deploy Stage Once script from here: https://library.vuforia.com/articles/Solution/ground-plane-guide.html and change the Plane Finder Behaviour as follows:
I struggled with this for a long time; in short, we must disable the AnchorInputListenerBehaviour after the first hit.
I attached a new script to the Plane Finder with the code below:
public void OnInteractiveHitTest(HitTestResult result)
{
    // Once a plane has been hit, stop listening for further anchor input.
    var listenerBehaviour = GetComponent<AnchorInputListenerBehaviour>();
    if (listenerBehaviour != null)
    {
        listenerBehaviour.enabled = false;
    }
}
I added the event on the Plane Finder Behaviour.
That's all, I hope it will be useful.
For updated versions:
Go to the "Advanced" settings and, for the "On Interactive Hit Test" entry, select the "Off" option for the script.
Most of the answers are correct but somewhat obsolete; the cleaner way to do this is in code.
Create, for example, a GameObject called GameManager, attach a script to it (for example GameManagerScript.cs), pass it the GroundPlaneStage and a prefab of the object you want to spawn, and create a small function called spawnObject which does the following:
public class GameManagerScript : MonoBehaviour {

    public GameObject objPrefab;   // prefab of the object to spawn
    public GameObject ground;      // the Ground Plane Stage
    private int count = 0;

    // Wired to the Plane Finder's OnAutomaticHitTest callback.
    public void spawnObject() {
        // The position is relative to the parent (the Ground Plane Stage).
        Instantiate(objPrefab, new Vector3(count, 0, 0), Quaternion.identity, ground.transform);
        count += 2; // shift each new object so they don't stack at the same position
    }
}
Then go to the Plane Finder, specifically to the PlaneFinderBehaviour.cs component; you will find callbacks for OnInteractiveHitTest and OnAutomaticHitTest. In your case you need OnAutomaticHitTest: click + and add a new callback (the spawnObject function from the code above, as in the image below).
Also, when you instantiate your object via the prefab, don't forget to update the position properly to prevent the objects from being added at the same position.
Also don't forget to make the GroundPlaneStage the parent of the object, and note that the position you pass to Instantiate() is relative to that parent (the GroundPlaneStage, represented in the code above by the variable ground).
Finally, don't forget to uncheck Duplicate Stage on the "Content Positioning Behaviour" component of the Plane Finder, as shown in the picture below:
I hope that helps
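Since the original question asks for the model to appear only once, a minimal variation of the script above would be to guard the spawn with a flag. This is only a sketch of that idea; the hasSpawned field and the spawnObjectOnce method are mine, not part of the answer.
private bool hasSpawned = false;

// Variation of spawnObject above: wire this to OnAutomaticHitTest instead
// if the model should appear only once.
public void spawnObjectOnce() {
    if (hasSpawned)
        return; // ignore every hit after the first one
    Instantiate(objPrefab, Vector3.zero, Quaternion.identity, ground.transform);
    hasSpawned = true;
}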
Please see the Vuforia documentation for this problem:
Introduction to Ground Plane in Unity

c# OnTriggerStay with Collider Variables

So I'm making a simple "puzzle" using lights. There are 3 buttons with lights on them (red, blue, green). Each button has its own trigger volume, but when I press play, nothing prints that I even enter, stay in, or leave the trigger. I've never used Collider variables before, so I feel like I'm doing something wrong (obviously, or it would be working!). I also tried just "Collider entity" as the parameter in the OnTriggerStay/Enter/Exit methods and it still didn't print to the console that my player was entering. Why are my triggers not working?
Click here for the code I'm trying
Click here to see how I have it in the Unity Scene
Triggers only respond to other colliders that have Rigidbodies attached.
Try adding a Rigidbody component to your player and setting it to kinematic.
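As a side note, the same setup can also be done from code. A small sketch, assuming this script sits on the player object:
void Awake()
{
    // Triggers need at least one Rigidbody in the colliding pair; a kinematic one
    // keeps the player from being pushed around by physics forces.
    Rigidbody rb = gameObject.AddComponent<Rigidbody>();
    rb.isKinematic = true;
}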
OnTriggerEnter/Stay/Exit only works when the object has a Collider component, and BluePuzzle2 doesn't have one.
Also, the OnTrigger functions take a Collider as a parameter; check the reference page.
So, in order to make it work, put a script on every light and copy this function into that script:
void OnTriggerEnter(Collider col) {
    if (col.CompareTag("Player")) {
        print("Entered the trigger");
    }
}
Hope it helps.

Unity2D: Panning a map using touch still registers even when a button in the canvas is tapped

I have a problem cancelling the pan feature of my app when a button in the HUD is pressed.
We're using a Canvas for displaying the UI, so I can't use a raycast to detect whether a touch hit a button or not, since the button is not directly in the world where the map to be panned is located.
I already added colliders to the buttons and tried printing the names of the objects hit by the raycast, but it just passes through the buttons as if they are not there.
How am I supposed to detect whether a button in the canvas is touched so I can cancel the pan feature?
Check EventSystem's IsPointerOverGameObject method. You can use it like this:
public bool IsPointerOverUI
{
    get
    {
        return EventSystem.current.IsPointerOverGameObject();
    }
}
Don't forget to import UnityEngine.EventSystems
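A sketch of how this could be used to skip panning, including the fingerId overload that is often needed for touch input on mobile; PanMap() here is a placeholder for your own pan code, not an existing method:
void Update()
{
    if (Input.touchCount > 0)
    {
        Touch touch = Input.GetTouch(0);

        // On touch devices, pass the finger id so the check works reliably.
        if (EventSystem.current.IsPointerOverGameObject(touch.fingerId))
            return; // the touch is over the canvas UI, so don't pan

        PanMap(touch); // placeholder for the existing pan logic
    }
}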

GameObject to appear when clicking a button in unity3d?

I need to create a UI button attached to a GameObject.
I'm making a board-based game, i.e. whenever a button is clicked, an object should be placed at that particular button.
I am using a C# script in Unity3D.
void UIBtn(GameObject BName)
{
    // here to write the Button click event.
}
I'm assuming you mean GUI.Button.
Reading your first sentence I understood that you wanted to create a button where a GameObject is, but reading your second sentence it seems that you want a GameObject to appear when clicking a button. Since I'm not sure, I'll answer both.
To make a GUI button appear at the place where the mouse is, use something like:
using UnityEngine;
using System.Collections;

public class Example : MonoBehaviour {
    void OnGUI() {
        Vector2 screenPos = Event.current.mousePosition;
        GUI.Button(new Rect(screenPos.x, screenPos.y, 100, 100), "Hello");
    }
}
Attaching a button to a GameObject requires first identifying the GameObject via Physics.Raycast, getting the GameObject from the hit collider, and then, in an OnGUI loop, constantly translating its world coordinates to screen coordinates so the button can be shown via GUI.Button, roughly as in the sketch below.
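A rough sketch of that approach; the class and field names are placeholders of mine, not from the question:
using UnityEngine;

public class ButtonOnClickedObject : MonoBehaviour
{
    private Transform target; // the GameObject that was clicked

    void Update()
    {
        // Identify the GameObject under the mouse with a raycast.
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
                target = hit.collider.transform;
        }
    }

    void OnGUI()
    {
        if (target == null)
            return;
        // Translate the object's world position to screen coordinates every frame.
        Vector3 screenPos = Camera.main.WorldToScreenPoint(target.position);
        // GUI space has its y axis inverted relative to screen space.
        GUI.Button(new Rect(screenPos.x, Screen.height - screenPos.y, 100, 30), "Hello");
    }
}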
