I am playing a 360 3D video using a skybox.
At first I had the video as an asset and it worked fine, without any lag.
The problem is that the video is 7 GB, so shipping it as an asset is not practical. Instead, I would prefer to store it on the headset (PICO) and read it through the url property of the VideoPlayer component:
// Add a second VideoPlayer and point it at the file on the headset's storage.
videoPlayer5 = gameObject.AddComponent<UnityEngine.Video.VideoPlayer>();
videoPlayer5.url = @"/storage/self/primary/Android/obb/com.Com.MyCompany.Namespace.Appname/VideoFile.mp4";
videoPlayer5.renderMode = UnityEngine.Video.VideoRenderMode.RenderTexture;
videoPlayer5.targetTexture = VideoTexture;
videoPlayer5.SetDirectAudioMute(0, true); // mute audio track 0
But by doing so, the video does not play smoothly; it "lags".
Is there any particular reason for this? Can anything be done to fix it?
Thanks.
The problem was indeed the video encoding. Re-encoding the video with optimized settings fixed the lag.
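For anyone hitting the same wall, a re-encode along these lines is a reasonable starting point. This is an illustration, not the exact command used here; the codec, quality (-crf), and keyframe interval (-g) are assumptions to tune for the headset's hardware decoder:
ffmpeg -i VideoFile.mp4 -c:v libx264 -crf 23 -g 30 -movflags +faststart -c:a aac VideoFile_optimized.mp4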
I am trying to record two audio inputs (webcam + microphone) and one video input (webcam) via MediaCapture from C#. One audio and one video input works like a charm, but the class itself does not allow specifying two audio input device IDs.
An example:
var captureDeviceSettings = new MediaCaptureInitializationSettings
{
    VideoDeviceId = videoDeviceId,   // webcam video input
    AudioDeviceId = audioDeviceId,   // webcam audio input
    StreamingCaptureMode = StreamingCaptureMode.AudioAndVideo,
};
I thought about using an AudioGraph with a submix node, but MediaCapture needs a device ID and a submix node does not provide one. Using the AudioGraph's output device does not seem like the solution either: I do not want to play the microphone through the default output device. I tried that, and it sounds horrible. I also thought about creating a virtual audio device, but I don't know how.
Any suggestions on that?
MediaCapture won't be able to do this. WASAPI may be able to, but it is not trivial.
Best option may be to utilize https://learn.microsoft.com/en-us/windows/win32/coreaudio/loopback-recording.
You will still have to mux in the video stream, though, if you get loopback recording to work.
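To make the loopback half concrete, here is a minimal sketch. It assumes the NAudio package (which wraps the raw WASAPI loopback API behind that link) rather than hand-written COM interop, and the output filename is a placeholder:
using System;
using NAudio.Wave;

// Capture whatever is playing on the default render device (loopback).
var capture = new WasapiLoopbackCapture();
var writer = new WaveFileWriter("loopback.wav", capture.WaveFormat);

capture.DataAvailable += (s, e) => writer.Write(e.Buffer, 0, e.BytesRecorded);
capture.RecordingStopped += (s, e) => { writer.Dispose(); capture.Dispose(); };

capture.StartRecording();
Console.ReadKey();        // record until a key is pressed
capture.StopRecording();  // fires RecordingStopped, which cleans up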
I am beginning to learn the LibVLCSharp library. Right now I am trying to play a streaming video that arrives via multicast UDP at 224.XX.XX.XX:PORT. The problem is that the stream arrives without a container format.
I can play it from the command line with:
vlc udp://@224.XX.XX.XX:PORT --demux=mp4v --rawvid-fps=12
This is my code:
public void PlayURLFile(string file)
{
    var media = new Media(_libVLC, "udp://@224.XX.XX.XX:XXXXX");
    media.AddOption(":demux=mp4v");
    media.AddOption(":rawvid-fps=12");
    _mp.Play(media);
    isPlaying = true;
}
When it runs, it does not show any error.
The VideoView that should display the video shows only a black screen.
I understand the problem may be that I am not passing the options correctly through AddOption, or that the options themselves should be different. But after fighting with the code and digging through the documentation, I can't find a clarifying answer.
Can someone help me?
Greetings and thank you.
Give the following options in the LibVLC ctor instead.
new LibVLC("--demux=mp4v", "--rawvid-fps=12");
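For context, a minimal end-to-end sketch with the options moved to the constructor might look like this (assuming LibVLCSharp 3.x; the multicast address and port are placeholders, as in the question):
using LibVLCSharp.Shared;

Core.Initialize();

// Raw-stream options belong to the LibVLC instance, not to Media.AddOption.
var libVLC = new LibVLC("--demux=mp4v", "--rawvid-fps=12");
var mediaPlayer = new MediaPlayer(libVLC);

// FromLocation marks the string as an MRL rather than a local file path.
using var media = new Media(libVLC, "udp://@224.XX.XX.XX:XXXXX", FromType.FromLocation);
mediaPlayer.Play(media);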
I'm working with the Google VR SDK for Unity trying to build a simple 360 Video viewer using the components that come with the SDK. I'm trying to extend their PanoVideoSample to dynamically change the source video when the user navigates from a menu.
I'm having trouble changing the URL for the GvrVideoPlayerTexture via code. In their demo scene (VideoDemo) there is a PanoVideoSample that contains a Video Sphere, whose GvrVideoPlayerTexture script you can edit in the Inspector panel to point at the proper video URL.
I'd like to set the video URL dynamically in C# rather than hard-code a bunch of individual video spheres and then hide/show them. I've almost got this working with the following code.
public void SwapVideo(int index)
{
    videoSphere.GetComponentInChildren<GvrVideoPlayerTexture>().videoURL = urls[index]; // my new url
    videoSphere.GetComponentInChildren<GvrVideoPlayerTexture>().ReInitializeVideo();
    videoSphere.SetActive(true);
}

public void ReturnToMainMenu()
{
    videoSphere.GetComponentInChildren<GvrVideoPlayerTexture>().CleanupVideo();
    videoSphere.SetActive(false);
    this.gameObject.SetActive(true);
}
The code above seems to work, but the problem is that the texture on the videoSphere turns white after the URL is set and the texture is re-initialized. I can see that the new video loads and I can hear its audio, but the scene just shows a white texture.
See the output here
I'm wondering if I'm missing a key step on the GvrVideoPlayerTexture, or perhaps additional calls to update the StereoPanoSphereMaterial that is used to render the scene. This SDK is pretty new and there doesn't seem to be much written about it, so any help is appreciated.
I eventually found the answer to my question in the Google VR docs (Streaming Video Support).
I'm still not totally clear what I was doing wrong in my first attempt, but this is the code that worked for me.
var texture = videoSphere.GetComponentInChildren<GvrVideoPlayerTexture>();
texture.videoURL = urls[index];
texture.videoType = GvrVideoPlayerTexture.VideoType.Other;
texture.videoProviderId = string.Empty;
texture.videoContentID = string.Empty;
texture.CleanupVideo();
texture.ReInitializeVideo();
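Presumably the decisive line is videoType: setting it to VideoType.Other (with the provider and content IDs cleared) makes the plugin treat videoURL as a plain stream URL instead of a provider-specific content ID, though the docs don't spell out why the texture went white without it.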
I'm recording sound via WasapiLoopbackCapture and writing it to an MP3 file via the NAudio.Lame lib:
LAMEPreset quality = LAMEPreset.ABR_320;
audiostream = new WasapiLoopbackCapture();
audiostream.DataAvailable += stream_DataAvailable;
audiostream.RecordingStopped += stream_RecordingStopped;
mp3writer = new LameMP3FileWriter(
    Environment.GetEnvironmentVariable("USERPROFILE") + @"\Music\record_temp.mp3",
    audiostream.WaveFormat, quality);
audiostream.StartRecording();
When the user presses the stop-recording-button, I save the MP3 and stop the recording:
mp3writer.Flush();
audiostream.Dispose();
mp3writer.Dispose();
All works fine, except that the output file has some disturbing crackling noises in it (see here for an example). I think my computer might be a bit too slow to compress and write the audio data in real time, so some samples get lost, but that is just my guess.
Edit: when recording to WAV, the errors don't appear.
What may be the problem here and how could I possibly solve it / work around it?
Start off by saving your audio to a WAV file. Does that have crackles in it? If so, the crackles are coming from the sound card. If not, they are coming from the MP3 encoding code.
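If the WAV file is clean, as the question's edit reports, the crackles are introduced on the MP3 path. One possible mitigation, following the asker's own hypothesis that real-time encoding can't keep up, is to queue the captured buffers and encode on a background thread. A sketch, assuming NAudio and NAudio.Lame, with the same placeholder output path as the question:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using NAudio.Lame;
using NAudio.Wave;

// Hand each captured buffer to a queue so a slow LAME encode
// can never stall the capture callback itself.
var queue = new BlockingCollection<byte[]>();
var capture = new WasapiLoopbackCapture();
var mp3writer = new LameMP3FileWriter(
    Environment.GetEnvironmentVariable("USERPROFILE") + @"\Music\record_temp.mp3",
    capture.WaveFormat, LAMEPreset.ABR_320);

capture.DataAvailable += (s, e) =>
{
    var copy = new byte[e.BytesRecorded];
    Buffer.BlockCopy(e.Buffer, 0, copy, 0, e.BytesRecorded);
    queue.Add(copy);
};
capture.RecordingStopped += (s, e) => queue.CompleteAdding();

var encoder = Task.Run(() =>
{
    foreach (var buffer in queue.GetConsumingEnumerable())
        mp3writer.Write(buffer, 0, buffer.Length);
    mp3writer.Dispose();   // finalizes the MP3 once the queue drains
});

capture.StartRecording();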
I'm trying to play a sound, but I just don't hear anything coming from the speakers. The code runs through without any errors. My code:
using (SoundPlayer player = new SoundPlayer("c:\\scifi.wav"))
{
    player.PlaySync();
}
The sound is actually there and exists; if I try to play it with VLC or some other player, it works.
I tried playing a system sound like this:
System.Media.SystemSounds.Asterisk.Play();
and it works.
Any ideas what is wrong?
The following code should work:
System.Media.SoundPlayer player = new System.Media.SoundPlayer();
player.SoundLocation = "c:\\scifi.wav";
player.Play();
Some other possibilities:
Are you sure that scifi.wav is actually a .wav file and not a .mp3?
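One quick way to check (my suggestion, not part of the original answer): a PCM .wav file starts with a RIFF/WAVE header, while an MP3 renamed to .wav will not, and SoundPlayer plays only WAV data.
using System;
using System.IO;
using System.Text;

// Bytes 0-3 of a WAV file spell "RIFF" and bytes 8-11 spell "WAVE".
byte[] header = new byte[12];
using (var fs = File.OpenRead("c:\\scifi.wav"))
{
    fs.Read(header, 0, header.Length);
}

bool looksLikeWav = Encoding.ASCII.GetString(header, 0, 4) == "RIFF"
                 && Encoding.ASCII.GetString(header, 8, 4) == "WAVE";
Console.WriteLine(looksLikeWav ? "Header looks like WAV" : "Not a RIFF/WAVE file");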