Unity ARCore simple object movement across network - C#

I'm currently developing an Android application using Google's ARCore and Cloud Anchors, and I'm having a problem getting objects to change location across the network on a button click. The problem has to be somewhere around the button itself: I know the button runs the function, but the function doesn't do what I want. If I call the same function from the player script it works fine, but I need it to work on button click, and I've been stuck on this for days now. Some enlightenment here would be very appreciated. If anyone's wondering, I'm simply editing (as I'm learning) the Cloud Anchors example code, so the source code belongs to Google.
[Code I'm stuck with][1]
What I'm trying to achieve is a simple transform.Translate on an object, with the move visible to clients.
Things to note:
The button has a Network Identity.
There are different functions for each kind of movement/rotation.
The player prefab has a Network Identity.
The object prefabs in question have a Network Transform component.
My code is presented below:
[Command]
public void CmdSpawnStar(Vector3 position, Quaternion rotation)
{
    // Stops further object placement until touchCounter is reset to 0.
    if (touchCounter == 0)
    {
        // The object at the hit pose.
        PlaceableModel = Instantiate(StarPrefab, position, rotation);
        // Compensate for the hit pose rotation facing away from the raycast (i.e. the camera).
        PlaceableModel.transform.Rotate(0, k_ModelRotation, 0, Space.Self);
        NetworkServer.Spawn(PlaceableModel);
        touchCounter++;
    }
}

[Command]
public void CmdFlyUp()
{
    _ShowAndroidToastMessage("UP button is pressed");
    PlaceableModel.transform.Translate(0, 0.1f, 0);
}
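One thing worth noting: in UNET, a [Command] is only executed if it is invoked from the player object the local client has authority over; a Command called from a UI object (even one with a Network Identity) is silently dropped. A minimal sketch of routing the button click through the local player, assuming the class holding CmdFlyUp() is called `Player` here (substitute the actual class name from the Cloud Anchors example):

```csharp
// Hypothetical, non-networked relay script attached to the button's Canvas.
using UnityEngine;

public class ButtonRelay : MonoBehaviour
{
    // Wire this method to the Button's OnClick event in the inspector.
    public void OnFlyUpClicked()
    {
        // Find the player object this client has authority over and let *it*
        // issue the Command; Commands sent from non-authority objects
        // (like a Button) never reach the server.
        foreach (var player in FindObjectsOfType<Player>())
        {
            if (player.isLocalPlayer)
            {
                player.CmdFlyUp();
                break;
            }
        }
    }
}
```

This keeps the button itself outside the networking layer; only the local player object talks to the server.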

Related

Changing player rotation to a camera rotation

I'm new to Unity and I'm currently working on a Portal-like game.
I wrote the whole teleportation script and it works, but I haven't implemented the player camera correction and I actually have no idea how to do it. The concept is that when you jump through a portal, the player (or player camera) rotation should be changed to the rotation of the portal/portal camera you came from, so the final effect is more 'realistic'.
I've tried some lines in the teleportation script like player.transform.rotation = portal.transform.rotation, but in the end it didn't work, and now I've ended up with nothing, deleting previous scripts and trying to write it all over and over again.
I'd be glad if someone could guide me on how to start coding it. Should I do it in OnTriggerEnter (when you jump through the portal) or in OnTriggerExit? Should the script be attached to the player or to the portals? Should I take the rotation only from the camera or from the whole GameObject (portal/player)? I'm also posting a couple of screenshots (with a video of how it currently works) and the entire teleportation script. If I missed something, just ask and I'll post it here.
https://imgur.com/a/pbqYnLD - screens with portals inspector
https://streamable.com/b14hk - video how it works
teleportation script:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Teleportation : MonoBehaviour
{
    [SerializeField] private GameObject otherPortal;
    [SerializeField] private GameObject player;

    void OnTriggerEnter(Collider col)
    {
        if (col.tag == "Player")
        {
            col.transform.position = new Vector3(
                otherPortal.transform.position.x + 1,
                otherPortal.transform.position.y + 1,
                otherPortal.transform.position.z + 1);
            Debug.Log("entered the portal");
        }
    }
}
Some information on how it is coded right now:
The portals are currently in the game behind 'the box'; I didn't instantiate them anywhere, I just change their position on LMB (blue portal) and RMB (orange portal).
The portals stick to the walls, just like in the original game.
The portals have a camera attached to them, and right now the cameras are static. (Off topic: I have a script to move them exactly when the player moves, and it mostly works, but it also has some problems, like the camera getting too far away from the portal and starting to render only the green outer side of the box; I don't know how to fix that either, so I'm currently not using that script.)
The player movement is the one from Unity Standard Assets (if it somehow matters).
The player has a Rigidbody but the portals don't; I'm not sure if I should attach that component to them.
The teleportation script is attached to both portals; the 'otherPortal' variable is assigned in the inspector, so on the orange portal 'otherPortal' is the blue portal and vice versa.
What you did is correct (setting the player rotation to the portal's).
You can do it in OnTriggerEnter after setting the position; it should look like
player.transform.rotation = otherPortal.transform.rotation
If you do that, the player will have the same rotation. You already have something that makes the camera follow the player, so it is likely that you don't need to set the camera rotation. I don't know how you implemented your camera follow, so I can't be sure, though. If the camera doesn't have the proper orientation, doing Camera.main.transform.rotation = otherPortal.transform.rotation will fix it.
The remaining thing that might be wrong is that your player (and camera) is not facing the right axis. In your video, I can see that the portal faces the x-axis (the red axis in the Unity editor). Check that when going forward, your player has the red axis pointing forward.
It is likely that your player has the z-axis (blue) facing forward, which is (by convention) more correct and fits the names Unity uses (the z-axis is also called the forward axis).
I would recommend creating the portal object (and all other objects, including the player) so that the forward axis is the blue one. It might require editing the objects. In any case, check that the player's forward axis is the same as the portal's, otherwise setting the rotation won't work.
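Putting the position and rotation steps together, a minimal sketch of the trigger handler could look like this (assuming the same otherPortal field as the script above, and that both player and portal use the z-axis as forward):

```csharp
using UnityEngine;

public class Teleportation : MonoBehaviour
{
    [SerializeField] private GameObject otherPortal;

    void OnTriggerEnter(Collider col)
    {
        if (col.CompareTag("Player"))
        {
            // Offset along the destination portal's forward axis so the player
            // appears in front of it rather than inside its collider.
            col.transform.position = otherPortal.transform.position
                                     + otherPortal.transform.forward;
            // Match the player's facing to the destination portal.
            col.transform.rotation = otherPortal.transform.rotation;
        }
    }
}
```

Using the portal's forward vector for the offset (instead of a fixed +1 on each axis) keeps the exit position correct no matter which wall the portal is on.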

Vuforia + Unity : Instantiate model in AR when UI Button pressed

I know this is a really simple question, but I can't figure out how to achieve this:
I have a UI button in my scene, and I want Vuforia to instantiate one AR model ONLY when I press the button.
Following a tutorial on the net I was able to instantiate a model on the screen when I touch it, but I need to know how to set up Vuforia to achieve the same result only when I press a button.
Do I have to disable the "Anchor Input Listener Behaviour"?
And then?
I want to call PositionContentAtPlaneAnchor, but I can't figure out how to call it the right way in the OnClick field of the button. Do I need to make a custom script for this?
Thanks for any answer.
OK, sorry for the delay.
I deduce that you are working with the ground plane; if you have the Ground Plane Stage and the Plane Finder in the scene and it works, we're at a good point.
Now you only need to add a button to the scene, and in your script add something like this:
public PlaneFinderBehaviour plane;
public Button buttonOnTheScene;

void Start()
{
    ...
    buttonOnTheScene.onClick.AddListener(TaskOnClick);
    ...
}

void TaskOnClick()
{
    Vector2 aPosition = new Vector2(0, 0);
    ...
    plane.PerformHitTest(aPosition);
}
What does this mean?
First of all, you must drag the Plane Finder from the Hierarchy onto the script variable, so we have a reference to the plane inside the script.
Then, when you click (or tap) the button, you simulate a click (or tap) on the display with PerformHitTest.
If you're wondering about my question in the comments, it's because the Plane Finder Behaviour script has two mode types: Interactive and Automatic. Interactive intercepts the tap on the display and shows the object (on the ground plane) at the exact position of the tap; Automatic shows the object in the center of the plane.
So if you want the object at an exact position, you can pass a Vector2 position to PerformHitTest, and if you want to show an object programmatically, or do something when the object is shown, you can call a custom method from OnInteractiveHitTest.
That's all.
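For completeness, a sketch of what such a custom handler wired to the Plane Finder's "On Interactive Hit Test" event might look like (the script and field names here are illustrative, and the prefab is assumed to be assigned in the inspector):

```csharp
using UnityEngine;
using Vuforia;

public class SpawnOnHit : MonoBehaviour
{
    public GameObject modelPrefab; // assign in the inspector

    // Add this method as a callback in the PlaneFinderBehaviour's
    // "On Interactive Hit Test" event; Vuforia passes in the hit result.
    public void OnInteractiveHitTest(HitTestResult result)
    {
        if (result == null) return;
        // Place the model at the pose returned by the hit test.
        Instantiate(modelPrefab, result.Position, result.Rotation);
    }
}
```

With this in place, PerformHitTest called from the button ends up triggering the handler, so the model only appears on a button press.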

Get VRTK input to set a value inside an animator controller

(I already followed tutorial 002 from VRTK to use the custom trigger event on the right and left controllers. It does work, but not for what I want.)
I have a character instantiated from a prefab with its own Animator Controller. I attached my own "animator script" to it, doing hand gestures when I press buttons 0, 1, 2, 3..., and it works like this:
void Update()
{
    // LEFT HAND //
    if (Input.GetKey(KeyCode.Keypad0))
        m_Animator.SetInteger("HandGestureLeft", 0); // FIST
    else if (Input.GetKey(KeyCode.Keypad1))
        m_Animator.SetInteger("HandGestureLeft", 1); // PALM
    else if (Input.GetKey(KeyCode.Keypad2))
        m_Animator.SetInteger("HandGestureLeft", 2); // POINT
}
Now I want to do it from my VR controller button inputs instead of the 0, 1, 2, 3 keypad keys, and I learnt that VRTK has some functions for this. So I went into the "VRTK_ControllerEvents_ListenerExample" script from the 002 example scene and put the script on the left and right controllers (which are inside the VRTK GameObject in the scene), not on my prefab avatar, because if I do that nothing works anymore; the 002 scene script must be on the controllers instantiated by VRTK, as the tutorial says.
I can put animations inside and have them working, but they only work for my prefab if I put it in the scene already and don't instantiate it. So as a player I cannot have my gestures working, because when I hit play my character must be instantiated for the game to work (and my default script does that well when I press the 0, 1, 2, 3... keys, because I don't need to have it on the hand controllers; I can have it on my prefab).
So I have a problem, and there may be two solutions from what I can see.
I could call the VRTK controller functions directly inside my old "animator script". It could look like turning
if (Input.GetKey(KeyCode.Keypad0))
    m_Animator.SetInteger("HandGestureLeft", 0); // FIST
into
if (GetComponent().TriggerPressed == true) // (but it tells me that TriggerPressed isn't available)
    m_Animator.SetInteger("HandGestureLeft", 0); // FIST
(But I have no idea how to grab the VR controller input from where VRTK is listening.)
Or there could be a way to call my prefab's Animator Controller component inside the "VRTK_ControllerEvents_ListenerExample" script on the left and right controllers. But here too, I have no idea how to reference the prefab's component; I can only reference the Animator Controller component if my avatar prefab is already in the scene.
So finally, all I need is a way to check from my prefab script whether the left or right controller is pressing the trigger, just like "Input.GetKey(KeyCode.Keypad0)" but with the VRTK way to get the trigger input instead.
I'm totally lost, could someone help me? :D
Here are two images. Right now my working code looks like this:
![1]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD4YD6/code.png "Code0"
![2]: https://vrtoolkit.slack.com/files/UA39CMQRF/FA3BD6DR6/code1.png "Code1"
Alright, I finally found it. It was like this:
if (RightHand.GetComponent<VRTK_ControllerEvents>().IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress) == true)
{
    m_Animator.SetInteger("HandGestureLeft", 0);
}
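Expanding that one-liner into the shape of the original keypad script, a sketch of an Update() driven by VRTK button queries could look like this (the field names and the button-to-gesture mapping are illustrative; assign the VRTK controller aliases in the inspector):

```csharp
using UnityEngine;
using VRTK;

public class HandGestures : MonoBehaviour
{
    public GameObject leftHand;  // VRTK left controller alias
    public GameObject rightHand; // VRTK right controller alias
    private Animator m_Animator;

    void Start()
    {
        m_Animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Query the controller's current button state each frame,
        // mirroring the Input.GetKey pattern from the keypad version.
        var left = leftHand.GetComponent<VRTK_ControllerEvents>();
        if (left.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.GripPress))
            m_Animator.SetInteger("HandGestureLeft", 0); // FIST
        else if (left.IsButtonPressed(VRTK_ControllerEvents.ButtonAlias.TriggerPress))
            m_Animator.SetInteger("HandGestureLeft", 1); // PALM
    }
}
```

Because this lives on the avatar prefab and only polls the controllers, it keeps working when the avatar is instantiated at runtime.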

Create objects with ground plane detection only once with Vuforia & Unity

I am trying to create an AR app using Unity & Vuforia. I have a 3D model that needs to be spawned when a ground plane is detected, but this needs to happen only once.
The way Vuforia works, it keeps spawning objects whenever a new plane is detected. So what I need to do is either detect the plane only once or spawn the object only once. As I am new to Unity, I need help doing this. It would be great if someone could tell me what I need to do to achieve this.
Vuforia has been updated, and there is no DeploymentStageOnce script anymore. In order to stop duplicating on touch, we have to turn off Duplicate Stage in the Content Positioning Behaviour (Script). Check the Inspector when you click the Plane Finder.
In your app you should have a Plane Finder object somewhere, with the following properties set by default.
The Plane Finder object has a Behaviour component attached that calls a Position Content method if a plane was found. That method belongs to the Content Positioning Behaviour, and it makes an instance (clone) of your Ground Plane Stage. In order to avoid more than one instance, you should import the Vuforia Deploy Stage Once script located here: https://library.vuforia.com/articles/Solution/ground-plane-guide.html and change the Plane Finder Behaviour as follows:
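If that script is unavailable in your Vuforia version, a minimal guard with the same idea is to honor only the first hit-test result; the script and field names below are illustrative, not Vuforia's:

```csharp
using UnityEngine;
using Vuforia;

public class DeployStageOnce : MonoBehaviour
{
    public GameObject anchorStage; // the Ground Plane Stage, assigned in the inspector
    private bool placed;

    // Wire this to the Plane Finder's hit-test event in place of Position Content.
    public void OnInteractiveHitTest(HitTestResult result)
    {
        if (placed || result == null) return;
        // Move the existing stage to the hit pose instead of cloning it.
        anchorStage.transform.SetPositionAndRotation(result.Position, result.Rotation);
        anchorStage.SetActive(true);
        placed = true; // ignore all subsequent plane hits
    }
}
```

Repositioning a single pre-existing stage avoids the clone-per-detection behavior entirely.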
I struggled a long time with this; in short, we must disable the AnchorInputListenerBehaviour after the hit.
I attached a new script to the PlaneFinder with the code below:
public void OnInteractiveHitTest(HitTestResult result)
{
    var listenerBehaviour = GetComponent<AnchorInputListenerBehaviour>();
    if (listenerBehaviour != null)
    {
        listenerBehaviour.enabled = false;
    }
}
I added the event on the Plane Finder Behaviour.
That's all, I hope it will be useful.
For updated versions:
Go to the "Advanced" settings and the "On Interactive Hit Test" event, then select the "Off" option for the script.
Most of the answers are correct but somewhat obsolete; the correct way to do this is in code.
Create, for example, a GameObject called GameManager, and pass the GroundPlaneStage and a prefab of the object you want to spawn to a script attached to that GameManager, for example GameManagerScript.cs. Then create a small function called spawnObject that does the following:
public class GameManagerScript : MonoBehaviour
{
    public GameObject objPrefab;
    public GameObject ground;
    private int count = 0;

    public void spawnObject()
    {
        Instantiate(objPrefab, new Vector3(count, 0, 0), Quaternion.identity, ground.transform);
        count += 2;
    }
}
Then go to the PlaneFinder, specifically the PlaneFinderBehaviour.cs component. You will have callbacks for OnInteractiveHitTest and OnAutomaticHitTest; in your case you need OnAutomaticHitTest. Click + and add a new callback (the spawnObject function from the code above, as in the image below).
Also, when you instantiate your object via the prefab, don't forget to update the position properly to prevent the objects from being added at the same position.
Don't forget to make the GroundPlaneStage the parent of the object either, and note that the position you pass to Instantiate() is relative to that parent (the GroundPlaneStage, represented in the code above by the variable ground).
Finally, don't forget to uncheck Duplicate Stage in the "Content Positioning Behaviour" component on the Plane Finder, as shown in the picture below:
I hope that helps.
Please try the Vuforia website for this problem:
Introduction to Ground Plane in Unity

Vuforia / Lean Touch and cancelling bg interference with objects

I've been working on a project for a while, and ultimately what I have is a Minority Report style setup with AR screens in front of me. I'm using a Unity plugin called Lean Touch, which allows me to tap on the mobile screen and select the 3D objects. I can then move them around and scale them, which is great.
I have an undesired side effect, though; in a different situation this would be fantastic, but in this case it's hurting my head. If you trigger the target using a user-defined target, the screen appears in front of you with the camera (real world) behind it. I then tap on the item and can drag on my mobile device to interact. The problem is, if someone walks past my mobile in the background, the tracking 'pushes' my item off my screen as they walk past.
In a different setup I could take advantage of this, as I can wave my hand behind the camera and interact with the item; unfortunately all I can do is push it from side to side, not scale it or anything else, and it's making the experience problematic. I would therefore like help finding out how to stop tracking my objects' position if the camera background changes, thus leaving my objects on screen until I touch the screen to interact.
I'm not sure exactly what needs turning on or off: whether it's a Vuforia setting, a Lean Touch Unity setting with layers etc., or a Rigidbody or something else to add. I need directional advice on which way to go, as I'm going in circles at the moment.
Screenshot below.
http://imgur.com/a/NYekP
If I wave my hand behind it or someone walks past, it moves the AR elements.
Ideas would be appreciated.
The only code that interacts with an object is below. It is on a plane used as a holder object for the interactive screen prefab, so Lean Touch will move the plane (the screen thinks it's a GUI element and is therefore not interactable). The last line is added to hide the original plane so the duplicate isn't visible when I Lean Touch drag the screen.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Vuforia;

public class activateChildObjects : MonoBehaviour
{
    public GameObject MyObjName;
    // Get the original GUI item to disable on drag of the cloned item
    public string pathToDeactivate;

    void Update()
    {
        if (DefaultTrackableEventHandler.lostFound == true)
        {
            foreach (Transform child in MyObjName.transform)
            {
                child.gameObject.SetActive(true);
            }
            Debug.Log("GOT IT");
        }
        else
        {
            foreach (Transform child in MyObjName.transform)
            {
                child.gameObject.SetActive(false);
            }
            Debug.Log("Lost");
        }
        // Remove original panel
        GameObject.Find("UserDefinedTarget/" + pathToDeactivate).SetActive(false);
    }
}
EDIT
OK, I haven't stopped it from happening, but after numerous code attempts I just ticked all the tracking options to try and prevent additional tracking once found. It's not right, but it's an improvement, and it will have to do for now.
