First question post here, please go easy on me.
I want to be able to record videos with the device camera in Unity.
I can play the camera feed as a texture using WebCamTexture (minimal sketch below), but I cannot save the video to the device as an .mp4 file or similar.
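Here is roughly what I have working for the preview, based on the standard WebCamTexture usage:

using UnityEngine;

// Shows the device camera feed on this object's material.
public class CameraPreview : MonoBehaviour
{
    WebCamTexture webcamTexture;

    void Start()
    {
        webcamTexture = new WebCamTexture();
        GetComponent<Renderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }
}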
Is there any way to do it inside Unity?
Thank you very much.
Basically, I want to record videos with the device camera and save them on the device as .mp4 files, just like a normal camera app, but in Unity.
You can use this plugin, available in the Unity Asset Store:
CameraShot plugin
Related
I am making a game in which I need to stream SOME of the audio to other players. Right now I am using OnAudioFilterRead() on the AudioListener to get ALL audio as a buffer (simplified sketch below), and I have managed to stream this using Photon Voice.
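Simplified, the capture component looks roughly like this (the OnSamples delegate that pushes the buffer into Photon Voice is just a placeholder):

using System;
using UnityEngine;

// Sits on the same GameObject as the AudioListener, so Unity hands it
// the final mixed buffer of ALL audio on the audio thread.
public class ListenerAudioCapture : MonoBehaviour
{
    // Placeholder hook; in my project this pushes the samples into Photon Voice.
    public Action<float[], int> OnSamples;

    void OnAudioFilterRead(float[] data, int channels)
    {
        // 'data' contains interleaved float samples of everything the listener hears.
        if (OnSamples != null)
            OnSamples(data, channels);
    }
}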
However, I only want to stream certain audio sources (about 18) which make up the music part of my music game. There will also be voice chat (which is streamed separately) and maybe other audio sources that I do not want to stream.
I could assign these audio clips to a mixer track, but I have not found a way to get the audio from a mixer track as a buffer. Does anyone have a solution?
Thanks!
Hi, I'm making a game about recording in-game audio, but all I can record is the internal microphone, i.e. the default audio input device. I tried listing the devices in Unity with Microphone.devices, but what shows up depends on the input devices configured in Windows, and I need a solution that works on any computer. Maybe something involving a custom FMOD DSP, faking a sound card, or something in the Unity API; I just need a direction. Here is how I'm using the code:
myAudioClip = Microphone.Start(null, false, 10, 44100); // null = default device, no looping, 10 seconds, 44.1 kHz
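And this is how I list the devices Unity can see (only the input devices configured in Windows show up, which is exactly my problem):

using UnityEngine;

public class ListInputDevices : MonoBehaviour
{
    void Start()
    {
        // Log every recording device Unity exposes.
        foreach (string device in Microphone.devices)
            Debug.Log("Input device: " + device);
    }
}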
To save an audio file I'm using this script: https://gist.github.com/darktable/2317063
And this line:
SavWav.Save("myfile", myAudioClip);
My question is: how can I also record my application's output audio as input data in Unity?
You can't do that in Unity. The furthest you can go is recording individual AudioClips.
To record the system output, use NAudio. It's a .NET library; I hope it will work on Mono. A rough example is sketched below.
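For example, something along these lines with NAudio's WasapiLoopbackCapture (a minimal sketch assuming a Windows desktop build; WASAPI loopback is Windows-only, so it may well not run under Mono on other platforms, and the file name is just a placeholder):

using System;
using NAudio.Wave;

class LoopbackRecorder
{
    static void Main()
    {
        // Captures whatever the default output device is currently playing.
        var capture = new WasapiLoopbackCapture();
        var writer = new WaveFileWriter("system_output.wav", capture.WaveFormat);

        capture.DataAvailable += (s, e) =>
            writer.Write(e.Buffer, 0, e.BytesRecorded);

        capture.RecordingStopped += (s, e) =>
        {
            writer.Dispose();
            capture.Dispose();
        };

        capture.StartRecording();
        Console.WriteLine("Recording system output... press Enter to stop.");
        Console.ReadLine();
        capture.StopRecording();
    }
}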
Good luck
I have a video file and I want to play it on one computer (preferably with C#), but stream the audio to an Android device and have it play there in sync with the video content over the network.
Do you have any tips how I can achieve that?
Any library or code examples are welcome :)
I want to play a video from Unity3D. The playback must be able to seek to a particular frame or time, and I also need to play the video from the web.
Following Unity's documentation, I am able to download an online video and play it as a movie texture (rough sketch below), but I don't have options to seek, fast-forward, etc. If you have any ideas, please let me know.
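For reference, this is roughly what I have working now, based on the legacy WWW/MovieTexture API (the URL is just a placeholder, and MovieTexture generally expects an Ogg Theora .ogv file):

using UnityEngine;
using System.Collections;

public class OnlineVideoPlayer : MonoBehaviour
{
    // Placeholder URL, just for illustration.
    public string url = "http://example.com/video.ogv";

    IEnumerator Start()
    {
        WWW www = new WWW(url);
        MovieTexture movie = www.movie;

        // Wait until enough data has buffered to start playback.
        while (!movie.isReadyToPlay)
            yield return null;

        GetComponent<Renderer>().material.mainTexture = movie;
        movie.Play();
        // No seeking or fast-forward here: MovieTexture only exposes Play/Pause/Stop and looping.
    }
}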
Or else suggest any libraries or plugins I can use to do this.
Thanks in advance.
I want to save the video stream captured by the Kinect's color camera as an .avi video. I have tried many ways of doing this, but nothing has succeeded. Has anyone done this successfully? I'm using the Kinect for Windows SDK and WPF for application development.
I guess the easiest workaround would be to use screen capture software like http://camstudio.org/.
There is also a post with the same question here:
Kinect recording a video in C# WPF
As far as I understand, you need to save the individual frames delivered by the Kinect into a video file. This post should explain how to do it: How to render video from raw frames in WPF?
You can use the AVIFile Windows API via interop:
http://msdn.microsoft.com/en-us/library/windows/desktop/dd756808(v=vs.85).aspx
or you can use a wrapper like this one by Corinna John:
http://www.codeproject.com/Articles/7388/A-Simple-C-Wrapper-for-the-AviFile-Library
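To give a rough idea of frame-by-frame writing with that wrapper, here is a sketch. The class and method names (AviManager, AddVideoStream, AddFrame) are from memory of that article and may not match the current download exactly, so treat them as assumptions and check the article's sample code; the frame file names and frame rate are placeholders:

using System.Drawing;
using AviFile; // Corinna John's AviFile wrapper

class KinectAviWriter
{
    static void Main()
    {
        // Create the output file ('false' = do not open an existing file).
        var aviManager = new AviManager("kinect_color.avi", false);

        // The first frame defines the size and format; 30 fps roughly matches the Kinect color stream.
        Bitmap firstFrame = new Bitmap("frame0000.png");
        VideoStream stream = aviManager.AddVideoStream(false, 30, firstFrame);

        // Append the remaining frames (here: bitmaps previously saved from the color stream).
        for (int i = 1; i < 300; i++)
        {
            using (var frame = new Bitmap(string.Format("frame{0:D4}.png", i)))
            {
                stream.AddFrame(frame);
            }
        }

        aviManager.Close();
    }
}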