I'm trying to play a sound, but I just don't hear anything coming from the speakers. All the code goes through without any errors. My code:
using (SoundPlayer player = new SoundPlayer("c:\\scifi.wav"))
{
player.PlaySync();
}
The sound file is actually there and exists; if I try to play it with VLC or some other player, it works.
I tried playing a system sound like this:
System.Media.SystemSounds.Asterisk.Play();
and it works.
Any ideas what is wrong?
The following code should work:
System.Media.SoundPlayer player = new System.Media.SoundPlayer();
player.SoundLocation = "c:\\scifi.wav";
player.Play();
Some other possibilities:
Are you sure that scifi.wav is actually a .wav file and not an .mp3 with the wrong extension? SoundPlayer only supports PCM wave data.
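One quick sanity check is to read the file header. A minimal sketch, assuming the "fmt " chunk comes directly after the RIFF header (true for most files):
using System.IO;
using System.Text;

static bool LooksLikePcmWav(string path)
{
    using (var reader = new BinaryReader(File.OpenRead(path)))
    {
        if (reader.BaseStream.Length < 22) return false;
        string riff = Encoding.ASCII.GetString(reader.ReadBytes(4)); // "RIFF"
        reader.ReadInt32();                                          // total chunk size
        string wave = Encoding.ASCII.GetString(reader.ReadBytes(4)); // "WAVE"
        string fmt  = Encoding.ASCII.GetString(reader.ReadBytes(4)); // "fmt "
        reader.ReadInt32();                                          // fmt chunk size
        short formatTag = reader.ReadInt16();                        // 1 = PCM
        return riff == "RIFF" && wave == "WAVE" && fmt == "fmt " && formatTag == 1;
    }
}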
I am playing a 360 3D video using a skybox.
At first I had the video as an asset and it worked fine, without any lag.
The problem is that the video is 7 GB, so it is not suitable to ship as an asset; it is preferable to store it on the headset (PICO) and read it through the url property of the VideoPlayer component:
videoPlayer5 = GetComponent<VideoPlayer>().gameObject.AddComponent<UnityEngine.Video.VideoPlayer>();
videoPlayer5.url = @"/storage/self/primary/Android/obb/com.Com.MyCompany.Namespace.Appname/VideoFile.mp4";
videoPlayer5.renderMode = UnityEngine.Video.VideoRenderMode.RenderTexture;
videoPlayer5.targetTexture = VideoTexture;
videoPlayer5.SetDirectAudioMute(0, true);
But by doing so, the video does not play smoothly; it "lags".
Is there any particular reason for this? Can anything be done to fix it?
Thanks
The problem was indeed the video encoding. Optimizing the video (re-encoding it so the headset's hardware decoder can keep up) fixed the problem.
I'm making a game music player in C# based on NAudio. The original PCM-16 four-channel audio file (Wii AST) is brought into the application, and the channels are split into two separate stereo wav files. I have a slider in my application to allow you to fade between the two files, because each pair of channels from the original AST file contains a different part/version of the song.
I'm using two instances of WaveOut and AudioFileReader for the two wave files to play them back, and I have a trackbar in the application to adjust the "channel bias" (in other words, the volume of the WaveOut instances). My problem is that a lot of the time (especially when seeking using the trackbar), the two audio files end up playing out-of-sync. I can't seem to get them to play at the exact same time. Does anyone know how to get the files to play in sync?
The only solution I can think of is to convert the two split files into a single 4-channel wave file so you could just adjust the volume of the channel pairs and not have to worry about syncing, but NAudio doesn't seem to have a way to do that.
Please keep in mind that I am new to asking questions on Stack Overflow, so if I need to provide more detail or if there is something I failed to explain, please let me know. And please, be constructive.
Thanks!
So, as it turns out, I completely overlooked a class in NAudio that mixes multiple audio streams together. What I ended up doing was creating an instance of MixingWaveProvider32 and initializing it once the decoder had written the audio files, to prepare for playback. Here is the final code that ended up working:
MixingWaveProvider32 mixingWaveProvider32;
WaveOut masterOut;
AudioFileReader audioFileReader;
AudioFileReader audioFileReader2;
...
void loadAstFile(string path)
{
    // The decoder writes each stereo pair of the AST to its own wav file.
    string stream1_Path = path + "_c1.wav";
    string stream2_Path = path + "_c2.wav";
    DisposeAudioFileReaders();
    AstReader astReader = new AstReader(path);
    if (astReader.Channels == 4)
    {
        astReader.Export(stream1_Path, 1.0f);
        astReader.Export(stream2_Path, 0.0f);
        audioFileReader = new AudioFileReader(stream1_Path);
        audioFileReader2 = new AudioFileReader(stream2_Path);
        // Feed both readers into one mixer so a single output drives them in sync.
        mixingWaveProvider32 = new MixingWaveProvider32(new IWaveProvider[] { audioFileReader, audioFileReader2 });
    }
    else
    {
        // Plain stereo source: only one reader is needed.
        astReader.Export(stream1_Path, 1.0f);
        audioFileReader = new AudioFileReader(stream1_Path);
        audioFileReader.Volume = 1.0f;
        mixingWaveProvider32 = new MixingWaveProvider32(new IWaveProvider[] { audioFileReader });
    }
    // One WaveOut plays the mixed stream, so the channel pairs can't drift apart.
    masterOut = new WaveOut();
    masterOut.Init(mixingWaveProvider32);
    masterOut.Volume = 1.0f;
    RefreshComponents();
    Play();
}
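With both readers feeding the same MixingWaveProvider32 and a single WaveOut, the "channel bias" trackbar only has to adjust the two readers' volumes; there is nothing to re-sync. A sketch of the idea (SetBias and its 0-to-1 bias argument are my additions, not part of the code above):
// Hypothetical helper: bias 0.0 plays only stream 1, bias 1.0 plays only stream 2.
void SetBias(float bias)
{
    if (audioFileReader != null) audioFileReader.Volume = 1.0f - bias;
    if (audioFileReader2 != null) audioFileReader2.Volume = bias;
}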
Hopefully this helps somebody out there!
I am beginning to learn how to use the LibVLCSharp library. Right now I am trying to play a video streamed over multicast UDP (224.XX.XX.XX:PORT). The problem is that the stream arrives without a container format (a raw elementary stream).
I can play it from the command line with:
vlc udp://@224.XX.XX.XX:PORT --demux=mp4v --rawvid-fps=12
This is my code:
public void PlayURLFile(string file)
{
    var media = new Media(_libVLC, "udp://@224.XX.XX.XX:XXXXX");
    media.AddOption(":demux=mp4v");
    media.AddOption(":rawvid-fps=12");
    _mp.Play(media);
    isPlaying = true;
}
When executed, it does not show any error, but the VideoView I use to display the video just shows a black screen.
I understand the problem may be that I am not passing the options to AddOption correctly, or that the option names are different. But after fighting with the code and looking at the documentation, I can't find a clarifying answer.
Can someone help me?
Greetings and thank you.
Give the following options in the LibVLC constructor instead:
new LibVLC("--demux=mp4v", "--rawvid-fps=12");
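For context, a minimal end-to-end sketch; the Core.Initialize call, field setup, and FromType.FromLocation are my assumptions about the surrounding code, not something from the question:
using LibVLCSharp.Shared;

Core.Initialize();
// Raw-stream options belong on LibVLC itself rather than on the Media.
var libVLC = new LibVLC("--demux=mp4v", "--rawvid-fps=12");
var mediaPlayer = new MediaPlayer(libVLC);
var media = new Media(libVLC, "udp://@224.XX.XX.XX:XXXXX", FromType.FromLocation);
mediaPlayer.Play(media);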
I'm trying to play a file located on the network at this address:
string filePath = @"\\192.168.xx.xx\folder\folder2\Audio\audio.wav";
and trying to play it with a MediaPlayer like this:
m_player = new MediaPlayer();
m_player.Stop();
m_player.Open(new Uri(filePath));
m_player.Play();
It doesn't throw any exception, but it also does not play the sound.
When I copy the file to a local disk and try to play it from there, it works fine.
Any ideas where the problem could be?
Some Googling suggests that you should try a relative Uri:
m_player = new MediaPlayer();
m_player.Stop();
m_player.Open(new Uri(filePath, UriKind.Relative));
m_player.Play();
Otherwise, have a look at this example, which opens a stream and hands that stream to the player.
The SoundPlayer class can do this: it looks like all you have to do is set its Stream property to the stream, then call Play.
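Since the file here is a .wav, a minimal sketch of that stream approach, reusing the UNC path from the question:
using System.IO;
using System.Media;

// Open the network file ourselves and hand the open stream to SoundPlayer.
using (var stream = File.OpenRead(@"\\192.168.xx.xx\folder\folder2\Audio\audio.wav"))
using (var player = new SoundPlayer())
{
    player.Stream = stream;
    player.PlaySync(); // blocks until done; Play() would return immediately
}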
I am trying to create a library with sounds in it, but I can't get the URIs to work. If I use an online URI like
new Uri("http://www.archive.org/download/BrahmsViolinConcerto-Heifetz/03Iii.AllegroGiocosoMaNonTroppoVivace.mp3")
it works fine, so the issue is linking correctly to the folders in my project.
In my WP game library's folder I have \Sounds\letters, and in that folder is a sound named a.wma.
My method for loading it is:
public void PlayLetter(string letter)
{
    try
    {
        // MediaPlayer needs the XNA dispatcher pumped when used outside a Game loop.
        Initialize();
        FrameworkDispatcher.Update();
        var uri = new Uri(@"/Sounds/letters/" + letter + ".wma", UriKind.Relative);
        var song = Song.FromUri("sound", uri);
        MediaPlayer.Play(song);
    }
    catch (Exception e)
    {
        Console.WriteLine(e.ToString());
    }
}
I pass it the string "a" as the parameter, of course, and it fails.
I have also included the sound file in my project.
All I get is:
A first chance exception of type 'System.InvalidOperationException' occurred in Microsoft.Xna.Framework.dll
But I'm certain it's a URI problem, as the online URI I tried worked just fine.
I am also in doubt about two things: is MediaPlayer the right thing to use in a game? And can a library play sounds (or even contain them)?
The typical thing in XNA would be to use a SoundEffectInstance:
http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.audio.soundeffectinstance.aspx
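A minimal sketch of that approach; it assumes the sound has been added through the content pipeline as a .wav with the asset name "Sounds/letters/a", which is different from how the .wma above is referenced:
using Microsoft.Xna.Framework.Audio;

// Content.Load<SoundEffect> requires the file to be built by the content pipeline.
SoundEffect effect = Content.Load<SoundEffect>("Sounds/letters/a");
SoundEffectInstance instance = effect.CreateInstance();
instance.Volume = 1.0f;
instance.Play();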
Unfortunately, SoundEffectInstance only works with wav files. If you want to play back longer music files, you can use a MediaElement, but that allows playback of only a single compressed audio file at a time. Another option might be to play compressed audio from the MediaLibrary using the MediaPlayer class. You could also save your own compressed audio file in the MediaLibrary and play it from there. See:
http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.media.medialibrary.songs.aspx