I want to play a stereo 360-degree video in virtual reality in Unity on Android. So far I have done some research, and I have two cameras for the right and left eye, each with a sphere around it. I also need a custom shader to make the image render on the inside of the sphere. I have the upper half of the image showing on one sphere by setting the y-tiling to 0.5, and the lower half showing on the other sphere with y-tiling 0.5 and y-offset 0.5. With this I can already display a 3D 360-degree image correctly. The whole idea is from this tutorial.
Now for video, I need control over the video playback speed, so it turns out I need the VideoPlayer from the new Unity 5.6 beta. My setup so far would require the VideoPlayer to play the same video on both spheres, with one sphere showing the upper part (one eye) and the other showing the lower part (other eye).
Here is my problem: I don't know how to get the VideoPlayer to play the same video on two different materials (since they have different tiling values). Is there a way to do that?
I got a hint that I could use the same material and achieve the tiling effect via UV, but I don't know how that works, and I haven't even gotten the VideoPlayer to play the video on two objects using the same material on both of them. I have a screenshot of that here. The right sphere just has the material videoMaterial, with no tiling since I'd have to do that via UV.
Which way should I go, and how do I do it? Am I on the right track here?
Am I on the right track here?
Almost, but you are currently using a Renderer and Material instead of a RenderTexture and Material.
Which way to go and how to do it?
You need to use a RenderTexture for this. Basically, you render the video to the RenderTexture, then you assign that texture to the materials of both spheres.
1. Create a RenderTexture and assign it to the VideoPlayer.
2. Create two materials for the spheres.
3. Set VideoPlayer.renderMode to VideoRenderMode.RenderTexture.
4. Set the texture of both spheres to the texture from the RenderTexture.
5. Prepare and play the video.
The code below does exactly that. It should work out of the box. The only thing you need to do is modify the tiling and offset of each material to your needs; see the sketch after the code.
You should also comment out:
leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));
then use a sphere imported from any 3D application. Those lines of code are only there for testing purposes, and it's not a good idea to play video on Unity's built-in sphere because it doesn't have enough vertices to make the video look smooth.
using UnityEngine;
using UnityEngine.Video;

public class StereoscopicVideoPlayer : MonoBehaviour
{
    RenderTexture renderTexture;

    Material leftSphereMat;
    Material rightSphereMat;

    public GameObject leftSphere;
    public GameObject rightSphere;

    private VideoPlayer videoPlayer;
    //Audio
    private AudioSource audioSource;

    void Start()
    {
        //Create Render Texture
        renderTexture = createRenderTexture();

        //Create Left and Right Sphere Materials
        leftSphereMat = createMaterial();
        rightSphereMat = createMaterial();

        //Create the Left and Right Spheres
        leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
        rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));

        //Assign material to the Spheres
        leftSphere.GetComponent<MeshRenderer>().material = leftSphereMat;
        rightSphere.GetComponent<MeshRenderer>().material = rightSphereMat;

        //Add VideoPlayer to the GameObject
        videoPlayer = gameObject.AddComponent<VideoPlayer>();

        //Add AudioSource
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        //We want to play from url
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "http://www.quirksmode.org/html5/videos/big_buck_bunny.mp4";

        //Set Audio Output to AudioSource
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;

        //Assign the Audio from Video to AudioSource to be played
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        //Set the mode of output to be RenderTexture
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;

        //Set the RenderTexture to store the images to
        videoPlayer.targetTexture = renderTexture;

        //Set the Texture of both Spheres to the Texture from the RenderTexture
        assignTextureToSphere();

        //Subscribe to prepareCompleted before preparing so the event can't be missed
        videoPlayer.prepareCompleted += OnVideoPrepared;

        //Prepare Video to prevent Buffering
        videoPlayer.Prepare();
    }

    RenderTexture createRenderTexture()
    {
        RenderTexture rd = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
        rd.Create();
        return rd;
    }

    Material createMaterial()
    {
        return new Material(Shader.Find("Specular"));
    }

    void assignTextureToSphere()
    {
        //Set the Texture of both Spheres to the Texture from the RenderTexture
        leftSphereMat.mainTexture = renderTexture;
        rightSphereMat.mainTexture = renderTexture;
    }

    GameObject createSphere(string name, Vector3 spherePos, Vector3 sphereScale)
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = spherePos;
        sphere.transform.localScale = sphereScale;
        sphere.name = name;
        return sphere;
    }

    void OnVideoPrepared(VideoPlayer source)
    {
        Debug.Log("Done Preparing Video");

        //Play Video
        videoPlayer.Play();

        //Play Sound
        audioSource.Play();

        //Change Play Speed
        if (videoPlayer.canSetPlaybackSpeed)
        {
            videoPlayer.playbackSpeed = 1f;
        }
    }
}
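For example, with an over/under (top/bottom) stereo layout, the tiling mentioned above could be set like this after the materials are created. This is a minimal sketch; which half belongs to which eye depends on your footage:

//Sketch: show the upper half of the video on one eye's sphere and the
//lower half on the other. Swap the offsets if your layout is reversed.
leftSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
leftSphereMat.mainTextureOffset = new Vector2(0f, 0.5f); //upper half
rightSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
rightSphereMat.mainTextureOffset = new Vector2(0f, 0f);  //lower half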
There is also a Unity tutorial on how to do this with a special shader, but it does not work for me and some other people. I suggest you use the method above until VR support is added to the VideoPlayer API.
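If you would rather avoid a special shader entirely, another option is to make the sphere render on its inside (which the question needs) by flipping the mesh normals and triangle winding at runtime. A sketch, assuming you call it once per sphere after creating it:

//Sketch: turn a generated or imported sphere inside-out so its material
//renders on the inner surface (an alternative to a Cull Front shader).
void FlipSphereInsideOut(GameObject sphere)
{
    Mesh mesh = sphere.GetComponent<MeshFilter>().mesh;

    //Invert every normal so lighting faces inward
    Vector3[] normals = mesh.normals;
    for (int i = 0; i < normals.Length; i++)
    {
        normals[i] = -normals[i];
    }
    mesh.normals = normals;

    //Reverse the winding order of each triangle so the inner faces are drawn
    int[] triangles = mesh.triangles;
    for (int i = 0; i < triangles.Length; i += 3)
    {
        int temp = triangles[i];
        triangles[i] = triangles[i + 1];
        triangles[i + 1] = temp;
    }
    mesh.triangles = triangles;
}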
Related
This is the first time I've asked for help on this forum. I hope you can help me :)
Well, the issue is that when the scene loads, I start an mp4 video with Unity's VideoPlayer. I use a Raw Image and a RenderTexture with it. In the editor and in the BlueStacks emulator everything is fine, but when I start the game on Android, right before the video starts I get a black screen for about a second, and then the video plays normally. It happens every time, right before the video starts. Does anyone know how to fix this so the video starts normally? I would greatly appreciate it.
I am new to the VideoPlayer and don't understand what could be wrong. I assigned the textures in my code.
[Image of component]
// Fields referenced below (inferred from usage; the VideoPlayer and RawImage
// references are assigned in the Inspector):
//   VideoPlayer _FirstVideo, _LoopVideo;
//   RawImage _FirstVideoRawImage, _LoopVideoRawImage;
//   CustomRenderTexture renderTextureFirstVideo, renderTextureLoopVideo;
//   double time;
// Timing.RunCoroutine / Timing.WaitForSeconds come from the MEC plugin.
void Start()
{
    renderTextureFirstVideo = new CustomRenderTexture(1080, 1920);
    renderTextureFirstVideo.initializationColor = new Color(0f, 0f, 0f, 0f);
    renderTextureLoopVideo = new CustomRenderTexture(1080, 1920);
    renderTextureLoopVideo.initializationColor = new Color(0f, 0f, 0f, 0f);

    time = _FirstVideo.clip.length;
    Timing.RunCoroutine(IShowVideo());
}

private IEnumerator<float> IShowVideo()
{
    yield return Timing.WaitForSeconds(1f);

    _FirstVideoRawImage.texture = renderTextureFirstVideo;
    _FirstVideo.targetTexture = _FirstVideoRawImage.texture as RenderTexture;
    _FirstVideo.gameObject.SetActive(true);
    _FirstVideo.Play();

    yield return Timing.WaitForSeconds((float)time);

    _LoopVideoRawImage.texture = renderTextureLoopVideo;
    _LoopVideo.targetTexture = _LoopVideoRawImage.texture as RenderTexture;
    _LoopVideo.gameObject.SetActive(true);
    _LoopVideo.Play();
}
I am trying to change this script, which I found online at https://pressstart.vip/tutorials/2018/09/25/58/spawning-obstacles.html, to work with an orthographic camera, because that is what my game uses. At the moment it only works with perspective cameras, and I don't really understand how it works because I have not really touched the camera matrix before. Here is the code for the script:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class deployAsteroids : MonoBehaviour {

    public GameObject asteroidPrefab;
    public float respawnTime = 1.0f;
    private Vector2 screenBounds;

    // Use this for initialization
    void Start () {
        screenBounds = Camera.main.ScreenToWorldPoint(new Vector3(Screen.width, Screen.height, Camera.main.transform.position.z));
        StartCoroutine(asteroidWave());
    }

    private void spawnEnemy(){
        GameObject a = Instantiate(asteroidPrefab) as GameObject;
        a.transform.position = new Vector2(screenBounds.x * -2, Random.Range(-screenBounds.y, screenBounds.y));
    }

    IEnumerator asteroidWave(){
        while(true){
            yield return new WaitForSeconds(respawnTime);
            spawnEnemy();
        }
    }
}
My goal is to change the script so that it works correctly with an orthographic camera.
For some reason the code uses Camera.main.transform.position.z for the camera depth component. This value will be negative in a typical camera setup for a 2D game.
So it's finding the left side of the screen by following the right edge of the frustum to a point behind the camera. Very odd. With an orthographic camera you can't follow the right side of the frustum to find where the left side of the screen is, so it is no surprise that it doesn't work.
Instead, just use the left side of the frustum and make the depth positive by negating that component:
screenBounds = Camera.main.ViewportToWorldPoint(
new Vector3(0f, 0f, -Camera.main.transform.position.z));
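In context, only Start() changes. A sketch (note that with the camera at x = 0, the existing screenBounds.x * -2 in spawnEnemy still spawns off the right edge, because the left edge is just the negated right edge):

void Start () {
    // World position of the bottom-left viewport corner, sampled at a
    // positive depth in front of the camera (position.z is negative).
    screenBounds = Camera.main.ViewportToWorldPoint(
            new Vector3(0f, 0f, -Camera.main.transform.position.z));
    StartCoroutine(asteroidWave());
}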
I am making a 2D game in Unity, and in this game the camera will not need to move. As such, I would like to constrain the player's movement within the camera border, preferably with collision rather than just clamping the player's transform. Honestly, I have no idea where to start with something like this, but I assume it will involve some scripting. I am pretty comfortable with scripting at this point, but if your answer includes scripting, I would appreciate thorough explanations of everything that is going on. By the way, I am using C#.
If the camera is in orthographic mode, you can use an EdgeCollider2D for this: find the world positions of the corners of the screen using ScreenToWorldPoint, then use those to set the shape of the EdgeCollider2D.
The Unity Community UnityLibrary GitHub has an example (copied below):
// adds EdgeCollider2D colliders to screen edges
// only works with orthographic camera
using UnityEngine;
using System.Collections;

namespace UnityLibrary
{
    public class ScreenEdgeColliders : MonoBehaviour
    {
        void Awake()
        {
            AddCollider();
        }

        void AddCollider()
        {
            if (Camera.main == null) { Debug.LogError("Camera.main not found, failed to create edge colliders"); return; }

            var cam = Camera.main;
            if (!cam.orthographic) { Debug.LogError("Camera.main is not Orthographic, failed to create edge colliders"); return; }

            var bottomLeft = (Vector2)cam.ScreenToWorldPoint(new Vector3(0, 0, cam.nearClipPlane));
            var topLeft = (Vector2)cam.ScreenToWorldPoint(new Vector3(0, cam.pixelHeight, cam.nearClipPlane));
            var topRight = (Vector2)cam.ScreenToWorldPoint(new Vector3(cam.pixelWidth, cam.pixelHeight, cam.nearClipPlane));
            var bottomRight = (Vector2)cam.ScreenToWorldPoint(new Vector3(cam.pixelWidth, 0, cam.nearClipPlane));

            // add or use existing EdgeCollider2D
            var edge = GetComponent<EdgeCollider2D>() == null ? gameObject.AddComponent<EdgeCollider2D>() : GetComponent<EdgeCollider2D>();
            var edgePoints = new[] { bottomLeft, topLeft, topRight, bottomRight, bottomLeft };
            edge.points = edgePoints;
        }
    }
}
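For the edge colliders to actually block the player with collision (as the question asks), the player object also needs a Collider2D and a non-kinematic Rigidbody2D. A minimal sketch, assuming a player GameObject reference you already have:

// Hypothetical player setup so the screen-edge colliders can stop it:
var rb = player.AddComponent<Rigidbody2D>();
rb.gravityScale = 0f; // typical for a game where the camera never moves
rb.constraints = RigidbodyConstraints2D.FreezeRotation; // optional: no spinning on impact
player.AddComponent<BoxCollider2D>(); // any Collider2D shape works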
I am making a brick breaker game, and I want to display particles with the same color as the brick that gets hit by the ball.
Here is my code:
GameObject smokePuff = Instantiate(smoke, transform.position, Quaternion.identity) as GameObject;
ParticleSystem ps = smokePuff.GetComponent<ParticleSystem>();
ParticleSystem.MainModule psmain = ps.main;
psmain.startColor = gameObject.GetComponent<SpriteRenderer>().color;
This is not working; the particles are displayed pink. How do I fix it?
I'm using Unity 5.6.
This is a bug in certain versions of Unity. It should be fixed in Unity 2017.2. What happens is that when you change the ParticleSystem color, it loses its material reference.
You can either update Unity to the latest version, or manually attach the material reference (or a new material) back to the ParticleSystem after setting the color.
public GameObject smoke;

void Start()
{
    GameObject smokePuff = Instantiate(smoke, transform.position, Quaternion.identity) as GameObject;
    ParticleSystem ps = smokePuff.GetComponent<ParticleSystem>();
    ParticleSystem.MainModule psmain = ps.main;
    psmain.startColor = gameObject.GetComponent<SpriteRenderer>().color;

    //Assign that material to the particle renderer
    ps.GetComponent<Renderer>().material = createParticleMaterial();
}

Material createParticleMaterial()
{
    //Create Particle Shader
    Shader particleShader = Shader.Find("Particles/Alpha Blended Premultiply");

    //Create new Particle Material
    Material particleMat = new Material(particleShader);
    Texture particleTexture = null;

    //Find the default "Default-Particle" Texture
    foreach (Texture pText in Resources.FindObjectsOfTypeAll<Texture>())
    {
        if (pText.name == "Default-Particle")
        {
            particleTexture = pText;
        }
    }

    //Add the "Default-Particle" Texture to the material
    particleMat.mainTexture = particleTexture;
    return particleMat;
}
EDIT:
Two more things to know about creating a Particle System and the pink-particle issue:
1. If you create your Particle System from the Component ---> Effects ---> Particle System menu, Unity will not attach a material to it, so it will be pink. You will have to use the code above to create a new material, or do it manually in the Editor.
Your problem is either this or the reference bug I described above.
2. If you create your Particle System from the GameObject ---> Effects ---> Particle System menu, Unity will create a new GameObject and attach both a Particle System and a material to it. You should not have the pink-particle issue unless you hit the bug I described above, where particles lose their material reference when the color is modified.
I'm trying to follow this "tutorial":
http://forum.unity3d.com/threads/ui-follow-scene-objects.364143/
but the author says that I should use RectTransformUtility.ScreenPointToLocalPointInRectangle.
I tried to do:
Vector2 localPoint;
RectTransformUtility.ScreenPointToLocalPointInRectangle(theRectTransformOfMyImage, transform.position, theCamera, out localPoint);
but the image goes to the corner of the screen instead of staying on the object...
Any idea?
Edit:
More info: I have a 3D scene with an enemy scene object and a health bar image that follows it:
In the script, the variables are:
private Camera _camera;
private RectTransform _healthBarReact;

...

void Update()
{
    // this works more or less, except the UI element moves at half the speed
    Vector2 screenPoint = RectTransformUtility.WorldToScreenPoint(_camera, transform.position);
    _healthBarReact.position = screenPoint;

    // this doesn't work, the health bar image goes to the screen corner
    Vector2 localPoint;
    RectTransformUtility.ScreenPointToLocalPointInRectangle(_healthBarReact, screenPoint, _camera, out localPoint);
    _healthBarReact.position = localPoint;
}
The script is attached to the Enemy game object.