I am trying to build a VR application with portals into other worlds. The portals do not have to be flat; they can be bent. Here is a video from an early version: https://www.youtube.com/watch?v=EpoKwEMtEWc
Right now, the portals are 2D planes with camera textures which stretch and move according to the user's position. For VR use, and to ease performance issues, I would like to define which area of the screen gets rendered by which camera. Sort of like a viewport in regular Unity, but not rectangular, and for VR with Windows Mixed Reality.
Is this somehow possible? Possibly even without Unity Pro?
Related
I want to design VR lenses for customized VR boxes, and I want to develop Android apps in Unity, but I cannot figure out how to change the distortion of the image on the screen to match the lenses.
[Image: screen image distorted for VR]
Source
All sources I can find about this issue are created about 5 years ago.
This mentions that vertex-displacement lens correction can be done with the Cardboard SDK for Unity.
The SDK contains a CG include file titled CardboardDistortion.cginc.
This file contains a method which will convert a world-space vertex into inverse-lens distorted screen space ('lens space'), using the Brown–Conrady model for radial distortion correction.
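For reference, the radial term of the Brown–Conrady model mentioned above can be sketched in plain C#. This is an illustrative sketch, not the actual contents of CardboardDistortion.cginc; the coefficient values k1 and k2 are placeholders, not Cardboard's calibration data.

```csharp
using UnityEngine;

public static class RadialDistortion
{
    // Radial distortion factor of the Brown–Conrady model:
    // factor = 1 + k1*r^2 + k2*r^4, where r2 = r^2 is the squared
    // distance of the point from the lens axis in lens space.
    public static float DistortionFactor(float r2, float k1, float k2)
    {
        return 1.0f + r2 * (k1 + r2 * k2);
    }

    // Applies the radial factor to a 2D point in lens space.
    public static Vector2 Distort(Vector2 p, float k1, float k2)
    {
        return p * DistortionFactor(p.sqrMagnitude, k1, k2);
    }
}
```

With k1 = k2 = 0 the factor is 1 and the point is unchanged; positive coefficients push points outward (pincushion pre-distortion), which the lens then cancels.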
Is there a simpler way to do VR Distortion Correction in Unity?
Can Unity distort the scene by changing some coefficients?
Are there any other alternative programs to solve that?
And also are there any alternatives to Google Cardboard SDK to develop VR apps on android phones?
Some video player apps (e.g. DeoVR) enable users to change settings like this in order to set the perfect angle and distortion coefficients. Can this be done visually in Unity?
[using Unity 2020.3]
I'm trying to slowly blend different layers in and out in VR, with both layers being visible while the fade occurs. Right now, I am using two cameras, one as the main camera and one rendering to a render texture (each rendering only its respective layers). Then I use UI to fade the render texture in and out. This looks and works great in 2D view (including builds), but UI components do not render in VR.
I am aware that rendering this in VR will require 4 sets of rendering (two for each eye), but I'd still like to know how to generate and display a render texture for each eye using unity.
This effect can be done in other ways and I'm open to suggestions. There are a lot of different types of elements I wish to fade in and out so I'm aware of one solution to add transparent shaders and fade particles but this can be tedious and requires a lot of setups (I'd like more of a permanent solution for any project). This being said, I'd still like to know how to manipulate what is being rendered out to the VR headset.
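One VR-friendly alternative to screen-space UI (which, as noted above, does not render in the headset) is a world-space quad parented to the camera, showing the second camera's render texture through a transparent material whose alpha is animated. The sketch below assumes such a quad already exists; the field names and timing are hypothetical.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: fade a world-space quad (displaying the second camera's
// RenderTexture) in front of the VR camera by animating material alpha.
public class LayerFade : MonoBehaviour
{
    public Renderer fadeQuad;        // quad parented to the camera, with a transparent material
    public float fadeDuration = 1f;  // seconds; placeholder value

    // Call e.g. StartCoroutine(Fade(0f, 1f)) to fade the layer in.
    public IEnumerator Fade(float from, float to)
    {
        Material mat = fadeQuad.material;
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            Color c = mat.color;
            c.a = Mathf.Lerp(from, to, t / fadeDuration);
            mat.color = c;
            yield return null;   // wait one frame
        }
    }
}
```

Because the quad is geometry in the scene rather than screen-space UI, it is rendered for both eyes by the normal stereo pipeline, sidestepping the per-eye render-texture question for this particular effect.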
I'm fairly certain that the "Screen space effects" section of the Unity documentation on Single Pass Stereo rendering (Double-Wide rendering), https://docs.unity3d.com/Manual/SinglePassStereoRendering.html , is what I'm looking for; however, it still doesn't answer how to get the render texture for each eye (and I'm a little confused about how to use what they have written).
I'm happy to elaborate more and test some things out! Thank you in advance!
I've just started developing for the Windows Mixed Reality headset in Unity, and it seemed to be going well until I built the program.
The point of my game is simple, one player navigates through a maze in VR and another watches the monitor and guides them through.
In the Unity editor, under the "Game" tab, the cameras work as expected. I used RenderTextures to display two cameras (one for the VR view and one for an overview of the entire maze) on a canvas, which was the game view.
However, when I build my game, the only thing that appears on the monitor is the VR perspective.
I have set the target eye for the VR camera to "Both" and the main camera to "None (Main Display)" as others have suggested, but no luck.
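For what it's worth, the same target-eye settings can also be applied from script, which makes it easier to verify they survive into the build. This is a sketch under the assumption that the spectator view should go to the desktop monitor; the field names are hypothetical.

```csharp
using UnityEngine;

// Sketch: configure a non-VR spectator camera so a standalone build
// shows it on the monitor instead of mirroring the headset view.
public class SpectatorSetup : MonoBehaviour
{
    public Camera overviewCamera;   // the maze overview, not the VR rig
    public Camera vrCamera;         // the headset camera

    void Start()
    {
        vrCamera.stereoTargetEye = StereoTargetEyeMask.Both;

        overviewCamera.stereoTargetEye = StereoTargetEyeMask.None; // treat as 2D camera
        overviewCamera.targetDisplay = 0;                          // main monitor
        overviewCamera.depth = vrCamera.depth + 1;                 // draw after the VR camera

        // If a second physical monitor is attached, it must be
        // activated explicitly in standalone builds.
        if (Display.displays.Length > 1)
            Display.displays[1].Activate();
    }
}
```

The camera depth matters here: if the spectator camera does not render after (on top of) the VR camera's mirrored output, the monitor will keep showing the headset view.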
Is this a small error I've overlooked, or is there a larger problem?
Alex.
I have a 2D plane that is generated through a script, and I need it to have a material that will work as a light.
I am making a side-scrolling game with a generated 2D torch, which is the plane. The environment color and the skybox are pitch black, and my 2D torch needs to light up the 3D background. So far I have gotten almost the desired effect with a particle additive material, but the contrast is weak and it will not help if the environment is black.
Will I need to develop my own shader? I have never done this before and would prefer not to. Or is there a simple material solution I can use?
I am using Unity 3.5.5 Free.
A particle system won't light anything up. Your only out-of-the-box solution is a pixel point light.
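A minimal sketch of that suggestion: add a point light to the generated torch plane from script and force it to render as a pixel light. The range, intensity, and color values here are placeholders to be tuned.

```csharp
using UnityEngine;

// Sketch: attach a pixel point light to the script-generated torch plane.
public class TorchLight : MonoBehaviour
{
    void Start()
    {
        Light torch = gameObject.AddComponent<Light>();
        torch.type = LightType.Point;
        torch.range = 5f;                              // tune to the scene scale
        torch.intensity = 1.5f;
        torch.color = new Color(1f, 0.8f, 0.5f);       // warm torch tint
        torch.renderMode = LightRenderMode.ForcePixel; // guarantee a pixel light
    }
}
```

Forcing pixel mode matters on the free license era mentioned above, since vertex lights on a flat black background would give very little visible falloff.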
I'm helping develop a game in Unity3D using C#. We're developing for the iPad using Unity's Apple plugin. Our game is a 2D platformer featuring gameplay that uses the rotation of the device (iPad). We're using physics for character movement.
The game is played mostly in LandscapeRight. We can change the orientation of the device easily enough using Screen.orientation = ScreenOrientation.LandscapeRight, etc. We want to be able to rotate the device from either LandscapeLeft or LandscapeRight to either Portrait or PortraitUpsideDown, which we can do.
When running our scene within Unity3D, rotating from landscape to portrait makes our character fall, because we rotate the world (what she's standing on) while she remains in the same orientation (physics stays the same). This is the behavior we want. However, when we build to Xcode and run the game on the iPad simulator (version 5.1) and rotate the device, only the device rotates; the game keeps the same orientation (landscape). The iPad is in portrait but the game is still in landscape. Are there other variables we need to set on the device besides ScreenOrientation? How do we keep our game state consistent with the device's orientation?
A quick summary: we want our game to default to Landscape. When we rotate to portrait we want the world to rotate with the camera and the player remaining constant. Physics is tied with the character meaning once the world rotates, she falls because she's no longer standing on the ground. Currently when we rotate the iPad, the orientation of the device changes but the game's orientation doesn't change. We want the world to shift on rotation but the character remain constant. It works in Unity3d but doesn't work correctly using the Xcode iPad simulator. How do we do this?
Thank you!
Kenneth
Very confusing question, I must say. Checking the orientation should be simple and straightforward.
First and foremost: forget the Xcode simulator! It is unreliable even if you're building directly in Xcode without Unity or anything else. Maybe this alone already solves your issue. Get an iPad and test on it.
Also, you'll need scripts in place to handle the physics behavior changing with rotation. It wasn't clear from the question whether you already have this; if not, it may not be trivial, although it sounds like you do.
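As a rough illustration of what such a script could look like: poll the physical device orientation each frame and rotate the level root to match, leaving the character and gravity fixed. "worldRoot" is a hypothetical parent transform of the level geometry, and the rotation angles assume LandscapeRight as the default.

```csharp
using UnityEngine;

// Sketch: rotate the world to follow the physical device orientation,
// so the character (whose physics stays in world space) falls as intended.
public class WorldRotator : MonoBehaviour
{
    public Transform worldRoot;  // hypothetical parent of all level geometry
    DeviceOrientation last = DeviceOrientation.LandscapeRight;

    void Update()
    {
        DeviceOrientation current = Input.deviceOrientation;
        if (current == last) return;  // no change this frame

        switch (current)
        {
            case DeviceOrientation.LandscapeRight:
                worldRoot.rotation = Quaternion.identity; break;
            case DeviceOrientation.Portrait:
                worldRoot.rotation = Quaternion.Euler(0, 0, -90); break;
            case DeviceOrientation.PortraitUpsideDown:
                worldRoot.rotation = Quaternion.Euler(0, 0, 90); break;
            case DeviceOrientation.LandscapeLeft:
                worldRoot.rotation = Quaternion.Euler(0, 0, 180); break;
        }
        last = current;
    }
}
```

Note that Input.deviceOrientation reports the physical orientation independently of Screen.orientation, which is exactly the distinction that matters on a real device (and one reason the simulator can mislead you here).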
Lastly, the most confusing part is where you talk about ScreenOrientation and Screen.orientation, and there may be a mistake there. When you write Screen.orientation = ScreenOrientation.LandscapeRight, you're forcing the actual screen orientation to change (it is not read-only), and this should be done only for testing purposes. You could even wrap all such code in #if UNITY_EDITOR and #endif preprocessor directives. On the device, you should only compare, e.g. Screen.orientation == ScreenOrientation.LandscapeRight, in if or switch statements, using the equality operator, the double equals sign (==).
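The distinction above in a minimal sketch, with the forced assignment fenced off to the editor:

```csharp
using UnityEngine;

// Assignment forces the orientation (testing only); comparison reads it.
public class OrientationCheck : MonoBehaviour
{
    void Start()
    {
#if UNITY_EDITOR
        // Editor-only: force an orientation to test landscape behavior.
        Screen.orientation = ScreenOrientation.LandscapeRight;
#endif
    }

    void Update()
    {
        // On device: read and compare with ==, never assign.
        if (Screen.orientation == ScreenOrientation.LandscapeRight)
        {
            // landscape-specific logic here
        }
    }
}
```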
Can't think of anything else that could go wrong there.