I am trying to find a configuration, or determine what additions I need to make to MRTK v2.1, to enable testing of HL1 in the Editor the way it worked with the prior workflow. Simplified: it is the ability to use Gaze and the mouse as we did prior to MRTK v2.
I have not found a way to map the left mouse button, in the profile, to behave as it does with the hand visible. With the hand visible, pressing the left mouse button enables manipulation. We simply need this with Gaze as well. Note: changing 'Hand Simulation' to Gestures does not produce the proper behavior; it only makes the Gaze cursor persistent, with no interaction.
I have asked this question in many places and it is often misunderstood, so here is some background. We are continuing to deliver applications to clients that require stringent QA. These applications must be tested under the new MRTK exactly as they were before. It is not reasonable to have our QA team use the articulated hands on a HL1 project, as it introduces human error, which is counterproductive to bug testing.
The needs are:
1. Have the cursor persistent in the Editor. (In the latest release, it disappears after using the hands via the space bar.) This should be a toggle, in our opinion. Yes, we know that you can press '1' to emulate Select, but as mentioned above, this is not acceptable for proper QA.
2. Have the left mouse button, without use of the hands, trigger 'Select' and 'Pinch'. I have written a service extension to handle 'Select', but adding the pinch, as it is tied to the hand articulation, has proven a challenge.
Proposed solutions so far: change the 'Hand Simulation' mode to Gestures. Unfortunately, Gestures eliminates the ability to use the hands for HL2 testing and does not provide 'Select' or 'Pinch' behavior on the left mouse button, so it is either not working correctly or not the proper solution.
We do not see a reason why HL1 and HL2 cannot be tested in the editor at the same time. We are just missing the use of the left mouse button when the hands are not in use.
This is the service extension I created for the left mouse button to send a 'Select' event. It triggers an OnClick() event on an 'Interactable'. What it is missing is the 'Pinch' ability, so that a 'ManipulationHandler' can be moved or manipulated with the left mouse button as well.
[MixedRealityDataProvider(
typeof(IMixedRealityInputSystem),
(SupportedPlatforms)(-1), // All platforms supported by Unity
"Unity Mouse Device Manager")]
public class TharynMouseService : BaseInputDeviceManager, ITharynMouseService
{
private TharynMouseServiceProfile tharynMouseServiceProfile;
public TharynMouseService(
IMixedRealityServiceRegistrar registrar,
IMixedRealityInputSystem inputSystem,
string name,
uint priority,
BaseMixedRealityProfile profile) : base(registrar, inputSystem, name, priority, profile)
{
tharynMouseServiceProfile = (TharynMouseServiceProfile)profile;
}
private IMixedRealityInputSource inputSource;
private IMixedRealityInputSystem inputSystem;
public override void Initialize()
{
// Do service initialization here.
inputSystem = Service as IMixedRealityInputSystem;
Debug.Log("inputSystem: " + inputSystem);
inputSource = inputSystem.RequestNewGenericInputSource("Tharyn Mouse Service");
Debug.Log("inputSource: " + inputSource);
inputSystem.RaiseSourceDetected(inputSource);
}
public override void Update()
{
// Raise input down on press and input up on release so the tap action
// spans the full click, mirroring a select gesture.
if (UnityEngine.Input.GetKeyDown(KeyCode.Mouse0))
{
inputSystem.RaiseOnInputDown(inputSource, Handedness.None, tharynMouseServiceProfile.TapAction);
Debug.Log("Down, TapAction: " + tharynMouseServiceProfile.TapAction);
}
if (UnityEngine.Input.GetKeyUp(KeyCode.Mouse0))
{
inputSystem.RaiseOnInputUp(inputSource, Handedness.None, tharynMouseServiceProfile.TapAction);
Debug.Log("Up");
}
}
}
Needs:
1. Trigger the same event that the '1' key does when the space bar is released, so the gaze cursor returns.
2. Extend the above service to include the 'Pinch' event/action so that objects can be manipulated with Gaze and the mouse as they are with the hands.
Cheers and thanks in advance!
Tharyn
You can simulate hand pinch and hand release events using just the mouse + cursor by creating a component that raises OnInputDown and OnInputUp events to the MRTK input system. You then need to attach this component to a GameObject in the scene.
The key code to simulate OnInputDown and OnInputUp events is the following:
public void Update()
{
if (Input.GetMouseButtonDown(0))
{
InputSystem?.RaiseOnInputDown(
InputSource,
Handedness.Right,
selectAction
);
}
if (Input.GetMouseButtonUp(0))
{
InputSystem?.RaiseOnInputUp(
InputSource,
Handedness.Right,
selectAction
);
}
}
Here is the full code for a component that will raise input down and up events, allowing you to simulate pinch and drag + move using just the mouse:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Utilities;
public class SimulateSelectOnClick : MonoBehaviour
{
private IMixedRealityInputSystem inputSystem = null;
protected IMixedRealityInputSystem InputSystem
{
get
{
if (inputSystem == null)
{
MixedRealityServiceRegistry.TryGetService<IMixedRealityInputSystem>(out inputSystem);
}
return inputSystem;
}
}
private IMixedRealityInputSource inputSource;
private MixedRealityInputAction selectAction;
private IMixedRealityInputSource InputSource
{
get
{
if (inputSource != null)
{
return inputSource;
}
inputSource = new BaseGenericInputSource("SimulateSelect",
new IMixedRealityPointer[] { InputSystem.GazeProvider.GazePointer }, InputSourceType.Hand);
return inputSource;
}
}
public void Start()
{
var inputActions = MixedRealityToolkit.Instance.ActiveProfile.InputSystemProfile.InputActionsProfile.InputActions;
selectAction = new MixedRealityInputAction();
for (int i = 0; i < inputActions.Length; i++)
{
if (inputActions[i].Description.Equals("select", StringComparison.CurrentCultureIgnoreCase))
{
selectAction = inputActions[i];
}
}
}
public void Update()
{
if (Input.GetMouseButtonDown(0))
{
InputSystem?.RaiseOnInputDown(
InputSource,
Handedness.Right,
selectAction
);
}
if (Input.GetMouseButtonUp(0))
{
InputSystem?.RaiseOnInputUp(
InputSource,
Handedness.Right,
selectAction
);
}
}
}
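Attach this component to any GameObject in the scene. While the left mouse button is held, the select action stays down on the gaze pointer, so components such as ManipulationHandler on the focused object should respond as they would to a held pinch, and you can drag the object with gaze until the button is released.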
Related
I'm working on a Mixed Reality app in Unity.
I'm trying to update my own hand mesh according to:
https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/features/input/hand-tracking?view=mrtkunity-2021-05#hand-mesh-prefab
public void OnHandMeshUpdated(InputEventData<HandMeshInfo> eventData)
{
print("Hand mesh update!");
if (eventData.Handedness == myHandedness)
{
myMesh.vertices = eventData.InputData.vertices;
myMesh.normals = eventData.InputData.normals;
myMesh.triangles = eventData.InputData.triangles;
if (eventData.InputData.uvs != null && eventData.InputData.uvs.Length > 0)
{
myMesh.uv = eventData.InputData.uvs;
}
}
}
However, the method OnHandMeshUpdated(InputEventData&lt;HandMeshInfo&gt; eventData) never triggers. I use Holographic Remoting. I made sure I set the Hand Mesh Visualization Modes to "Everything" in MRTK -> Articulated Hand Tracking.
What am I missing? How can I animate my own hand mesh?
Unity: 2020.3.26f1
MRTK: 2.7.3
Check whether you registered for global input events.
Maybe this link will provide some information.
https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/mrtk2/features/input/input-events?view=mrtkunity-2022-05
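For reference, a minimal sketch of global registration for hand mesh events (the class name is illustrative; RegisterHandler/UnregisterHandler are the standard MRTK calls):
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class HandMeshListener : MonoBehaviour, IMixedRealityHandMeshHandler
{
    private void OnEnable()
    {
        // Without global registration, hand mesh events are only delivered to
        // the focused object, so a plain MonoBehaviour never receives them.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandMeshHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandMeshHandler>(this);
    }

    public void OnHandMeshUpdated(InputEventData<HandMeshInfo> eventData)
    {
        Debug.Log("Hand mesh update!");
    }
}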
I tried the global input registration example from this link and it worked.
Long story short, I'm trying to create a UI panel where you hold and drag your mouse wheel button and select the option you want (while holding that button). So it is a pop-up menu that is triggered when you press the mouse wheel button. When you release the button, there are 2 possible situations:
You didn't move your cursor to a valid position. The pop-up menu is closed.
You moved your cursor to a valid position. You triggered an option from this pop-up menu.
To give you an example, the weapon-switch system in, say, Tomb Raider does this similarly. It looks like a roulette: you hold the button, then move your cursor to a certain location that "belongs" to, say, the shotgun. Then you release the button, the menu closes, and now you are equipped with a shotgun.
Right now, the pop-up panel kind of works. However, it's not a hold-release mechanism but a click mechanism: you click the button once, then the menu pops up and stays there.
How do you do that in Unity3D?
Here is my script so far:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
public class PosSelector : MonoBehaviour
{
Animator posSelectorAnimator;
public GameObject posSelector;
public Animator posButtonAnimator;
void Start ()
{
posSelectorAnimator = posSelector.GetComponent<Animator>();
}
void Update ()
{
if (Input.GetKeyDown(KeyCode.Mouse2))
{
Open();
}
if (Input.GetKeyUp(KeyCode.Mouse2))
{
Close();
}
}
void Open()
{
Vector3 mousePos = Input.mousePosition;
Debug.Log("Middle button is pressed.");
posSelector.transform.position = (mousePos);
posSelectorAnimator.SetBool("ButtonDown", true);
}
void Close()
{
if (posButtonAnimator.GetCurrentAnimatorStateInfo(0).IsName("Highlighted"))
{
Debug.Log("Position selected.");
Debug.Log(posButtonAnimator.GetCurrentAnimatorStateInfo(0).shortNameHash);
}
else
{
Debug.Log("Input not found.");
Debug.Log(posButtonAnimator.GetCurrentAnimatorStateInfo(0).shortNameHash);
}
}
}
First of all, what you are creating is called a pie menu.
You could create a script for the buttons inside your pie menu. Then let that script inherit from MonoBehaviour, IPointerEnterHandler and IPointerExitHandler.
These interfaces force you to implement the following methods:
public void OnPointerEnter(PointerEventData eventData)
{
PosSelector.selectedObject = gameObject;
}
public void OnPointerExit(PointerEventData eventData)
{
PosSelector.selectedObject = null;
}
Now create a static field in your PosSelector script called selectedObject:
public static GameObject selectedObject;
Then, in your Close() method, you can use selectedObject as your output:
void Close()
{
//Close your menu here using the animator
//Return if nothing was selected
if(selectedObject == null)
return;
//Select a weapon or whatever you want to do with your output
}
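With these pieces in place, Close() reads PosSelector.selectedObject to decide which option (if any) the cursor was over when the middle mouse button was released; the static field keeps the wiring between the button scripts and the menu simple.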
Also, consider renaming your question to something like "How do you create a pie menu in Unity?" so that other people with the same question have an easier time finding it.
I'm making a game menu in Unity using UI event systems.
My intention is that when I move the mouse pointer over a text item in the game menu area, the font size gets bigger, and when I move the pointer out, the font size goes back to its original size.
It works as I intended, but there's a problem: if I move the mouse pointer over the text, deactivate my game menu, and then activate it again using a key press, the font size doesn't go back to its original size; it just keeps getting bigger.
Here's my code
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;
public class HandleIngameMenu : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
private Text texts=null;
private void Start()
{
texts = GetComponentInChildren<Text>();
}
public void OnPointerEnter(PointerEventData data)
{
texts.fontSize += 3;
}
public void OnPointerExit(PointerEventData data)
{
if (this.gameObject.name.Equals("ReStartBtn"))
{
texts.fontSize = 30;
}
else
{
texts.fontSize = 37;
}
}
// Required by IPointerClickHandler; without it the class will not compile.
public void OnPointerClick(PointerEventData data)
{
}
}
I guess setting an object inactive is not the same as moving the mouse pointer out.
Is there any way to solve this problem?
When you activate your menu, set the defaults; that way it won't matter what happened while it was deactivated. Alternatively, when you deactivate your menu, run the defaults method first and then deactivate. That way, when the menu is enabled again, it will be back to normal.
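For example, a minimal sketch of the first approach, assuming each menu button has a Text child (the component name and default size are illustrative; use the 30/37 values from your script):
using UnityEngine;
using UnityEngine.UI;

public class ResetFontOnEnable : MonoBehaviour
{
    // The button's original font size; set per button in the Inspector.
    public int defaultFontSize = 30;

    private Text text;

    private void Awake()
    {
        text = GetComponentInChildren<Text>();
    }

    private void OnEnable()
    {
        // Runs every time the menu is activated, undoing any enlargement left
        // over from a pointer-enter that never got its matching pointer-exit.
        if (text != null) text.fontSize = defaultFontSize;
    }
}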
I'm trying to build a simple MQTT application. I want to display the received MQTT message in a 3D text with Unity.
I put the string assignment into the Update() function, but apparently when a message is received the Update() function is stopped. When I click somewhere in the Editor, the Update() function wakes up and updates the string.
C#
using UnityEngine;
using UnityEngine.UI;
using System.Collections;
using System.Net;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;
using uPLibrary.Networking.M2Mqtt.Utility;
using uPLibrary.Networking.M2Mqtt.Exceptions;
using System;
public class mqttTest : MonoBehaviour {
private MqttClient client;
public TextMesh mytext=null;
string msg;
// Use this for initialization
void Start () {
// create client instance
client = new MqttClient(IPAddress.Parse("192.168.83.128"), 1883, false, null);
// register to message received
client.MqttMsgPublishReceived += client_MqttMsgPublishReceived;
string clientId = Guid.NewGuid().ToString();
client.Connect(clientId);
client.Subscribe(new string[] { "test" }, new byte[] { MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE });
}
void client_MqttMsgPublishReceived(object sender, MqttMsgPublishEventArgs e)
{
msg = System.Text.Encoding.UTF8.GetString(e.Message);
Debug.Log("Received: " + msg );
}
// Update is called once per frame
void Update () {
Debug.Log("update");
mytext.text = msg;
}
}
Is this stopping due to the development environment or to the event mechanism?
That's probably because Unity only executes Play Mode (Update events etc.) while the Unity Editor window, and specifically the Game view, is focused. If another window has or gains focus at the moment your message is received, Unity freezes until the Game view of the Unity Editor gains focus again.
You can fix this by going to Player Settings → Resolution and Presentation → Resolution and enabling the option
Run In Background
Enable this option to make the game keep running (rather than pausing) if the app loses focus.
Apparently this also applies to the player within the Unity Editor itself.
However, the option only works/exists if your project target is set to Standalone or Web. For iOS and Android it does not exist, because on those devices an app cannot run in the background; Unity automatically applies the same pausing behaviour to the player within the Unity Editor.
But you can still get around it by setting Application.runInBackground (which is basically what the before-mentioned option does)
Should the player be running when the application is in the background?
Default is false (application pauses when it is in the background).
directly from a component (attached to any Scene GameObject) like
public class EditorRunInBackground : MonoBehaviour
{
private void Awake ()
{
if (Application.isEditor) Application.runInBackground = true;
}
}
which only sets the runInBackground for the UnityEditor itself.
Alternatively, if you don't want to attach it to anything, you could also use a script with [RuntimeInitializeOnLoadMethod]:
public static class EditorRunInBackground
{
[RuntimeInitializeOnLoadMethod]
private static void OnRuntimeMethodLoad()
{
if(Application.isEditor) Application.runInBackground = true;
}
}
and you could even place it within an Editor folder so it gets stripped out of a build.
Still learning here and starting off with the basics (just a plain Plane and moving around it in first person).
I can run the app and look around, up and down, etc., but I can't make the camera move when touching the touchpad on the Gear VR headset.
I have created this script (C#) and attached it to the camera in Unity:
using UnityEngine;
using System.Collections;
public class Moving : MonoBehaviour {
// Use this for initialization
void Start()
{
if (Input.GetButton("Tap"))
{
// Do something if tap starts
}
if (Input.GetButtonUp("Tap"))
{
// Do something if tap ends
}
}
// Update is called once per frame
void Update () {
if (Input.GetButton("Tap"))
{
// Do something if tap starts
}
if (Input.GetButtonUp("Tap"))
{
// Do something if tap ends
}
}
}
But it still doesn't seem to work. When I build and run the app, it does nothing. :-(
I know I am doing something wrong, but I'm not sure what.
Handling Single Tap in Oculus Gear VR:
void Start()
{
OVRTouchpad.Create();
OVRTouchpad.TouchHandler += OVRTouchpad_TouchHandler;
}
void OVRTouchpad_TouchHandler (object sender, System.EventArgs e)
{
OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
OVRTouchpad.TouchEvent touchEvent = touchArgs.TouchType;
if(touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
{
// Your response to Tap goes here.
}
}
Tap and Hold to Move:
1. Register the Tap key in your input settings, following this tutorial.
2. Add the OVRPlayerController prefab to move around the scene. This will let you move using the keyboard W/A/S/D keys in the Unity Editor.
3. Integrate the Tap key with the OVRPlayerController script to move in the forward direction; see the sketch after this list.
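As a rough illustration of step 3 (a minimal sketch, not the tutorial's exact code; the class name and speed value are illustrative): Unity reports Gear VR touchpad contact as mouse button 0, so holding a finger on the pad, or the left mouse button in the Editor, moves the object forward.
using UnityEngine;

public class HoldToMove : MonoBehaviour
{
    public float speed = 1.5f; // movement speed in metres per second; tune to taste

    void Update()
    {
        // Gear VR touchpad touches arrive as mouse button 0, so this also
        // responds to the left mouse button when testing in the Editor.
        if (Input.GetMouseButton(0))
        {
            // Move along the current facing (gaze) direction while held.
            transform.position += transform.forward * speed * Time.deltaTime;
        }
    }
}
Attach it to the OVRPlayerController (or camera rig) root so the forward vector follows the headset's gaze.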
Here are some useful links:
http://forum.unity3d.com/threads/gear-vr-touchpad-fps-script.373103/
http://rifty-business.blogspot.co.uk/2014/04/unity-pro-4-using-ovrcameracontroller.html