I'm new to C# and Unity, but I'm trying to code an app with Unity that plays my customer's 3D side-by-side VR video. I'm using the script from https://docs.unity3d.com/ScriptReference/Video.VideoPlayer.html to get the video running, and it works great, BUT it doesn't play as a side-by-side VR video, just as a flat video with the two views for both eyes shown next to each other.
UnityEngine.Video has the Video3DLayout.SideBySide3D enumeration value (https://docs.unity3d.com/ScriptReference/Video.Video3DLayout.html) that could solve the problem, but I don't know the right syntax for it. Here is the code I have tried:
using UnityEngine;
using System.Collections;
public class Movie : MonoBehaviour
{
    enum Video3DLayout { No3D, SideBySide3D, OverUnder3D };

    void Start()
    {
        Video3DLayout myLayout;
        myLayout = Video3DLayout.SideBySide3D;
        // Will attach a VideoPlayer to the main camera.
        GameObject camera = GameObject.Find("Main Camera");
        // VideoPlayer automatically targets the camera backplane when it is added
        // to a camera object, no need to change videoPlayer.targetCamera.
        var videoPlayer = camera.AddComponent<UnityEngine.Video.VideoPlayer>();
        // Play on awake defaults to true. Set it to false to avoid the url set
        // below to auto-start playback since we're in Start().
        videoPlayer.playOnAwake = false;
        // By default, VideoPlayers added to a camera will use the far plane.
        // Let's target the near plane instead.
        videoPlayer.renderMode = UnityEngine.Video.VideoRenderMode.CameraNearPlane;
        // Set the video to play.
        videoPlayer.url = "movie.mp4";
        // Start playback. This means the VideoPlayer may have to prepare (reserve
        // resources, pre-load a few frames, etc.). To better control the delays
        // associated with this preparation one can use videoPlayer.Prepare() along with
        // its prepareCompleted event.
        videoPlayer.Play();
    }
}
I have a feeling that I'm declaring the enumeration the wrong way, and I hope someone can point me in the right direction. I have tried to use this tutorial https://unity3d.com/learn/tutorials/topics/scripting/enumerations to understand more about enumerations, but my code still isn't working.
There are two reasons why it isn't working: you're declaring a new enum instead of using the UnityEngine.Video.Video3DLayout enum, and you're never assigning myLayout to the videoPlayer.
using UnityEngine;
using System.Collections;
using UnityEngine.Video;
public class Movie : MonoBehaviour
{
    // Set this to SideBySide3D in the Inspector (it defaults to No3D).
    public Video3DLayout myLayout;

    void Start()
    {
        // Will attach a VideoPlayer to the main camera.
        GameObject camera = GameObject.Find("Main Camera");
        // VideoPlayer automatically targets the camera backplane when it is added
        // to a camera object, no need to change videoPlayer.targetCamera.
        var videoPlayer = camera.AddComponent<UnityEngine.Video.VideoPlayer>();
        // Tell the VideoPlayer how the source video is laid out.
        videoPlayer.targetCamera3DLayout = myLayout;
        // Play on awake defaults to true. Set it to false to avoid the url set
        // below to auto-start playback since we're in Start().
        videoPlayer.playOnAwake = false;
        // By default, VideoPlayers added to a camera will use the far plane.
        // Let's target the near plane instead.
        videoPlayer.renderMode = UnityEngine.Video.VideoRenderMode.CameraNearPlane;
        // Set the video to play.
        videoPlayer.url = "movie.mp4";
        // Start playback. This means the VideoPlayer may have to prepare (reserve
        // resources, pre-load a few frames, etc.). To better control the delays
        // associated with this preparation one can use videoPlayer.Prepare() along with
        // its prepareCompleted event.
        videoPlayer.Play();
    }
}
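If you'd rather not rely on setting the field in the Inspector, a minimal variation (my suggestion, not part of the original answer) is to assign the layout directly from the UnityEngine.Video enum right after adding the component:

    // Hard-code the side-by-side layout instead of exposing it in the Inspector.
    videoPlayer.targetCamera3DLayout = UnityEngine.Video.Video3DLayout.SideBySide3D;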
So I am trying to play an audio clip on a game object when destroying it.
I can add the AudioSource component fine with the script, but I am unable to load the audio file onto it.
The AudioSource component just says "None (Audio Clip)".
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class BlockScript : MonoBehaviour
{
    void OnMouseDown()
    {
        if (this.gameObject.GetComponent<AudioSource>() == null)
        {
            AudioSource blockBreakSound = this.gameObject.AddComponent<AudioSource>();
            blockBreakSound.clip = Resources.Load("Assets/Sounds/BlockBreakingSound.wav") as AudioClip;
            blockBreakSound.Play();
            Destroy(this.gameObject, blockBreakSound.clip.length);
        }
        else
        {
            this.gameObject.GetComponent<AudioSource>().Play();
            Destroy(this.gameObject, this.gameObject.GetComponent<AudioSource>().clip.length);
        }
    }
}
I am not receiving any errors, so I am having a hard time figuring out what the problem is.
I can assign the audio file manually on the game object / prefab I am spawning, but since I am trying to optimize things somewhat, I don't want hundreds of game objects with the audio file on them to begin with; I'd rather add it to the object when it is about to be destroyed.
As @Hellium was kind enough to point out, I had missed the fact that I should keep my files in a folder called Resources for Resources.Load() to actually find them.
Thanks for the fast help :)
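For reference, a minimal sketch of the corrected call, assuming the clip is moved to Assets/Resources/Sounds/BlockBreakingSound.wav (Resources.Load takes a path relative to the Resources folder, without the file extension):

    // Load the clip from Assets/Resources/Sounds/BlockBreakingSound.wav;
    // the path is relative to the Resources folder and omits the extension.
    blockBreakSound.clip = Resources.Load<AudioClip>("Sounds/BlockBreakingSound");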
I'm making a game where players are put onto plates and random events happen to the plates/players. I've gone through a number of different networking solutions like UNET, Mirror, and Mirror + Steamworks P2P, but none of them really worked well (the main issue being not being able to join via IP), so I settled on Photon. After PUN v2 just wouldn't work in my project (an error in PhotonEditor.cs), I used PUN v1, which is working perfectly for the most part.
The issue right now is that I'm trying to spawn players on the plates they own, but they only spawn on the first plate, despite each plate being owned by a player (each plate has a script that specifies which player 'owns' it, and this is being set correctly). This only seems to happen when I test with a real player/client. If I create a fake player by just spawning in the prefab and then running it, both players are moved to their correct plates, so it seems to be a networking issue.
Here's the code responsible for moving the players to their plates.
foreach (GameObject plate in spawnedPlates)
{
    // Here we loop through each plate, get the player assigned to it and move the player to that plate.
    GameObject player = plate.GetComponent<Plate>().assignedTo;
    PlayerClientControl playerController = player.GetComponent<PlayerClientControl>(); // originally for UNET/Mirror. Left in in case I need it (had an RPC function that moved the player).
    player.transform.position = plate.transform.position + new Vector3(0, 4, 0);
}
What am I doing wrong?
EDIT:
It seems they ARE being moved to the correct positions (or at least, the code is trying to move them there), according to a debug print statement that logs where each player is being teleported to, which further cements that this is a networking issue. But I have no idea how to fix it.
I assume it's something to do with host/client synchronization? If anyone could shed some light on this, that'd be great, because I'm pulling my hair out over it.
If you inspect the player objects at runtime, it will probably give you a clue. I had the same problem, and in my case I had an InputAction attached to my players (taken from Unity's StarterAssets). The first player in the game was assigned the keyboard and the second a controller. I solved it by disabling every PlayerInput and then enabling only the local player's:
private PhotonView photonView;

private void Awake()
{
    photonView = GetComponent<PhotonView>();
    if (photonView.IsMine)
    {
        // Take control of the user input by disabling all and then enabling mine.
        var players = FindObjectsOfType<Player>();
        foreach (Player player in players)
        {
            player.GetComponent<PlayerInput>().enabled = false;
        }
        GetComponent<PlayerInput>().enabled = true;
    }
}
I am trying to display a player tutorial when the user first starts the game, using PlayerPrefs, and I want the game to be paused. The problem I am facing is that Time.timeScale = 0 is not pausing the game when placed inside Start() (the tutorialCanvas gets displayed), but it works when called by a button (the pause button).
The following is the code I use:
void Start()
{
    if (PlayerPrefs.HasKey("test4") == false)
    {
        tutorialCanvas.SetActive(true);
        Time.timeScale = 0;
    }
}
Time.timeScale = 0 has been really messed up for me, so what I'd recommend is that if you just want to pause something, like pausing a character's movement, you can try it like this:
void Update()
{
    if (Input.GetKey(KeyCode.P))
    {
        // Let's disable the PlayerMovement script only, not the whole object.
        GetComponent<PlayerMovement>().enabled = false;
    }
}
I've been stuck there too, so I made an alternative approach. You can also visit this one; if you want a free version of that asset you can get it here.
I had this issue recently, although it only happened in a build, not in the editor. Changing timeScale to something like 0.0001f would fix it, but that wasn't a great solution. Creating an input settings asset (under Camera -> Player Input -> Open Input Settings) seems to have fixed it for builds.
So, I'm making an FPS game in Unity 5, and a few days ago I started looking at the multiplayer tutorials on the official Unity site.
However, I changed some of the code to fit an FPS, and I ran into a problem when doing so. The client's camera moved fine, but the host's did not: the host used the client's camera to see, but its own player to move, giving a sort of third-person view.
All players have the same code, and the camera has a Network Identity with Local Player Authority turned on. It also has this C# script:
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;
public class VisionControl : NetworkBehaviour
{
    void Start()
    {
        Cursor.lockState = CursorLockMode.Locked;
    }

    void Update()
    {
        if (!isLocalPlayer)
        {
            return;
        }
        transform.Rotate(new Vector3(-Input.GetAxis("Mouse Y") * 5.0f, 0.0f, 0.0f));
    }
}
Your camera should not have a Network Identity.
It is used only by the local client to get a perspective on the game world, so it does not need to be networked.
Each camera exists on the current client, for the current client only.
Each game instance should have one camera running: the local player's.
Attach a simple non-networked camera to your player prefab and test again.
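Here is a minimal sketch of that setup, assuming the player prefab has a child Camera whose Camera component is disabled by default (the class name LocalPlayerCamera is just an illustrative placeholder, not from the original answer):

    using UnityEngine;
    using UnityEngine.Networking;

    public class LocalPlayerCamera : NetworkBehaviour
    {
        public override void OnStartLocalPlayer()
        {
            // Runs only on the client that owns this player object, so only the
            // local player's camera ever gets enabled.
            Camera cam = GetComponentInChildren<Camera>();
            if (cam != null)
            {
                cam.enabled = true;
            }
        }
    }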
Fixed! I removed the VisionControl script and put this in the player's movement script instead:
public override void OnStartLocalPlayer()
{
    if (isLocalPlayer)
    {
        FindObjectOfType<Camera>().gameObject.transform.position = this.transform.position + new Vector3(0f, 0.5f, 0f);
        FindObjectOfType<Camera>().gameObject.transform.SetParent(this.transform);
    }
}
public Transform OculusPlayerPrefab;
public Transform DefaultPlayerPrefab;
void Start()
{
    Transform player = OVRDevice.IsHMDPresent() ?
        (Transform)Instantiate(OculusPlayerPrefab) :
        (Transform)Instantiate(DefaultPlayerPrefab);
    player.position = transform.position;
}
This should detect whether the Oculus Rift HMD is connected and instantiate the Oculus player prefab if it is, otherwise the default. However, IsHMDPresent() returns false whether or not the Oculus Rift is connected. In the Unity/Oculus integration package, however, OVRMainMenu uses the IsHMDPresent() method with the expected results.
As of (at least) Unity 2018.2, using the Oculus Utilities, the following works:
if (OVRManager.isHMDPresent)
{
    // headset connected
}
I'll add that you can subscribe to the HMDMounted and HMDUnmounted events as well, which is somewhat related:
OVRManager.HMDMounted += MyOnHMDMountedFunction;
OVRManager.HMDUnmounted += MyOnHMDUnmountedFunction;
Those will fire when you put on (HMDMounted) and/or take off (HMDUnmounted) your headset.
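If it helps, a minimal sketch of matching handlers (the method names are just the placeholders used above; OVRManager's mount events take parameterless void handlers):

    void MyOnHMDMountedFunction()
    {
        // Called when the headset is put on.
    }

    void MyOnHMDUnmountedFunction()
    {
        // Called when the headset is taken off.
    }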
Unity now has a built-in way to detect this.
http://forum.unity3d.com/threads/simply-detecting-the-oculus-rifts-presence-solved.294089/#post-2368233
Docs: http://docs.unity3d.com/ScriptReference/VR.VRDevice-isPresent.html
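For reference, a minimal usage sketch of that built-in check as documented in the link above (Unity 5-era UnityEngine.VR namespace; later versions moved it, e.g. to UnityEngine.XR.XRDevice.isPresent; the class name HmdCheck is just illustrative):

    using UnityEngine;

    public class HmdCheck : MonoBehaviour
    {
        void Start()
        {
            if (UnityEngine.VR.VRDevice.isPresent)
            {
                // A VR headset is connected.
            }
        }
    }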
Edit: this answer is from 2014 and based on Unity 4. You probably want to use the other answers.
I found this method to work best:
Ovr.Hmd.Detect() > 0
Also, remember the HMDLost/HMDAcquired events, so you don't have to poll this every frame:
bool oculusPresent = false;

void CheckOculusPresence()
{
    oculusPresent = Ovr.Hmd.Detect() > 0;
}

void Start()
{
    CheckOculusPresence();
    OVRManager.HMDAcquired += CheckOculusPresence;
    OVRManager.HMDLost += CheckOculusPresence;
}
(oculus SDK 0.4.3/unity3d 4.5.5, OSX/Windows)