I have an app which receives wave data (raw PCM) over the network through a UDP port.
How can I set it up to play the received wave data using NAudio?
I have tried searching Google and reading the NAudio documentation, but so far I haven't had any success.
Any help or hint would be appreciated. Thanks in advance.
The NAudioDemo application demonstrates how to do this in the Network Chat demo. You use the BufferedWaveProvider to store the decompressed audio as it arrives, and use that to feed WaveOut. You might also want to automatically pause if there is not enough buffered audio, to prevent stuttering playback.
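For illustration, a minimal sketch of that pattern (not the NAudioDemo code itself) could look like the following; the port number and the 44.1 kHz, 16-bit, stereo format are assumptions and must match whatever the sender is transmitting:

```csharp
// A minimal sketch, assuming 16-bit, 44.1 kHz, stereo PCM arriving on UDP port 5000
// (the port and format are placeholders and must match the sender).
using System.Net;
using System.Net.Sockets;
using NAudio.Wave;

var waveFormat = new WaveFormat(44100, 16, 2);
var bufferedProvider = new BufferedWaveProvider(waveFormat)
{
    DiscardOnBufferOverflow = true   // drop data instead of throwing if playback falls behind
};

using var waveOut = new WaveOutEvent();
waveOut.Init(bufferedProvider);
waveOut.Play();

using var udpClient = new UdpClient(5000);
var remote = new IPEndPoint(IPAddress.Any, 0);
while (true)
{
    byte[] packet = udpClient.Receive(ref remote);          // one datagram of raw PCM
    bufferedProvider.AddSamples(packet, 0, packet.Length);  // queue it for playback
}
```

A real application would also watch BufferedWaveProvider.BufferedDuration and pause or resume playback around a threshold, as suggested above.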
Well, I did some work with NAudio a while back, and I'm sorry I might not be of much help as I'm afraid I hardly remember it...
But I think there was a WaveOut class and a WaveStream which contained your WAV data, and you call Play on the WaveOut instance after associating it with the WaveStream.
Try taking a look at the WaveOut class; you might get some clues. I was also quite new to the audio world when I worked on it, and my approach was to take their sample program that plays a WAV file and see how they do it... that was how I figured out what needed to be done.
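If it helps, the pattern described above looks roughly like this (a sketch from memory, with a placeholder file name):

```csharp
// Rough sketch of the WaveOut + WaveStream pattern described above.
using NAudio.Wave;

using var reader = new WaveFileReader("test.wav");   // WaveFileReader is a WaveStream
using var waveOut = new WaveOutEvent();
waveOut.Init(reader);                                 // associate the stream with the output device
waveOut.Play();
while (waveOut.PlaybackState == PlaybackState.Playing)
{
    System.Threading.Thread.Sleep(100);               // wait until playback finishes
}
```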
Good Luck...
Simple.
UDP Stream --> buffer --> NAudio WaveStream
First, check that the source PCM audio can be played correctly by NAudio. Do this "offline", before sending it via a socket.
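For that offline check, a minimal sketch could look like this, assuming the PCM was dumped to a file and is 16-bit, 44.1 kHz, stereo (the file name and format are placeholders):

```csharp
// Minimal sketch of the "offline" check: play a raw PCM dump from disk with NAudio
// before involving any sockets. The file name and format are assumptions.
using System.IO;
using NAudio.Wave;

var waveFormat = new WaveFormat(44100, 16, 2);   // must describe how the PCM was captured
using var pcmFile = File.OpenRead("capture.pcm");
using var rawStream = new RawSourceWaveStream(pcmFile, waveFormat);
using var waveOut = new WaveOutEvent();
waveOut.Init(rawStream);
waveOut.Play();
while (waveOut.PlaybackState == PlaybackState.Playing)
{
    System.Threading.Thread.Sleep(100);
}
```

If this sounds wrong, the WaveFormat (sample rate, bit depth, channels) is the first thing to revisit before blaming the network code.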
I will do some research and post some code later.
I need to broadcast audio content over the network, receive it, and play it in the browser "on the fly". The audio content is just a list of MP3 files, and on the client side it should look like an endless, stateless audio stream, something like YouTube live streams or an online radio station.
But I really don't know anything about this. Can anyone help me with it? How does it work, and which protocol is used for sending and receiving the data? Anything that points me in the right direction would help.
Ideally I'm looking for a .NET solution, but I will be glad for anything that helps, at least to understand how it works in general.
Thank you.
One way to do it would be with the help of ffmpeg.
You can use ffmpeg to create a DASH or HLS playlist: https://ffmpeg.org/ffmpeg-formats.html#hls-2
FFMPEG supports other streaming solutions too.
To invoke ffmpeg you have to find binaries that are compatible with the system your server is running on (Windows, Linux). Here is how you can start an external process from C#: How do I start a process from C#?
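As a rough illustration (the binary path, input file, output location, and segment settings are all assumptions), launching ffmpeg from C# to build an HLS playlist could look like this:

```csharp
// Rough sketch: start ffmpeg as an external process to segment an MP3 into an HLS playlist.
// The binary path, input file and output location are assumptions.
using System;
using System.Diagnostics;

var startInfo = new ProcessStartInfo
{
    FileName = "ffmpeg",   // or a full path to the ffmpeg binary you ship with the server
    Arguments = "-re -i track01.mp3 -c:a aac -f hls " +
                "-hls_time 6 -hls_list_size 5 -hls_flags delete_segments " +
                "wwwroot/stream/playlist.m3u8",
    UseShellExecute = false,
    RedirectStandardError = true   // ffmpeg writes its log to stderr
};

using var process = Process.Start(startInfo);
process.ErrorDataReceived += (s, e) => Console.WriteLine(e.Data);
process.BeginErrorReadLine();
process.WaitForExit();
```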
To play your playlist in the browser you can use VideoJS. It has built-in support for DASH and HLS: https://videojs.com/ (it can play audio too).
Build your logic to manage and update playlists, and then you just need to create an HTTP service that serves your playlist file. VideoJS will play it for you.
If you go with HLS then you probably should read this: https://developer.apple.com/streaming/
If you go with DASH then read this: https://mpeg.chiariglione.org/standards/mpeg-dash
Another way is to use out-of-the-box solutions, which often aren't free:
https://azure.microsoft.com/en-us/services/media-services/
https://www.wowza.com/
I am currently developing a C# Windows Forms application. My intention is to create a form that can make calls to other PCs on the same network.
So far I have found solutions that record the audio from my microphone, convert it to bytes, and send it using TCP sockets. The thing is, is there a way to convert the audio directly to bytes and send it through a socket, without first recording the audio to a file and then sending it?
Thanks in advance.
Would converting the recording to a MemoryStream be what you're looking for?
If so, you want this: How to record audio using NAudio onto byte[] rather than file.
You can then write the stream to a TCP socket. (You could write it directly to the NetworkStream, but I would consider that bad practice.)
It would be wise to write about sampleRate * 3 (a few seconds' worth of samples) just in case of latency issues.
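To make that concrete, here is a minimal sketch (the host, port, and format are assumptions) that captures the microphone with NAudio and writes each buffer to a TCP connection as it arrives; you could just as well stage the buffers through a MemoryStream first, as suggested above:

```csharp
// Rough sketch: capture microphone audio with NAudio and push the raw buffers
// over TCP without ever touching a file. Host, port and format are assumptions.
using System;
using System.Net.Sockets;
using NAudio.Wave;

using var client = new TcpClient("192.168.1.10", 9000);
NetworkStream netStream = client.GetStream();

using var waveIn = new WaveInEvent
{
    WaveFormat = new WaveFormat(44100, 16, 1)   // the receiver must use the same format
};
waveIn.DataAvailable += (s, e) =>
{
    // e.Buffer holds raw PCM; only the first e.BytesRecorded bytes are valid
    netStream.Write(e.Buffer, 0, e.BytesRecorded);
};
waveIn.StartRecording();

Console.WriteLine("Recording... press Enter to stop.");
Console.ReadLine();
waveIn.StopRecording();
```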
I am trying to create a radio app which contains a collection of MP3 stream URLs. However, I have run into some trouble streaming (or playing back) the audio.
I am trying to connect to a raw data stream like the ones you plug into VLC. An example URL is http://mp3.ht-stream.net/;80 (I just plugged this one and a few others into VLC and it worked perfectly). Basically your standard internet radio feed.
I have created the AudioPlaybackAgent, filled in what I think is needed to get this up and running, and everything works fine when I stream regular .mp3 files over the internet. But when I try to connect to these streams, it doesn't do (or play) anything.
I reckon it could be because I'm not using an AudioStreamingAgent (with a MediaStreamSource implementation), but that stuff is a bit too advanced for me, and as I understood after some hours of searching the web, MP3 streams can use the AudioPlaybackAgent instead.
Any advice on how I can make this work? Will I have to use an AudioStreamingAgent instead? Are there any open source examples of how to implement this mysterious MediaStreamSource class? Should I scrap my great idea? Any answers will be greatly appreciated.
My AudioPlayer.cs code is available here if you'd like a peek - but it's mostly the standard stuff.
You can't easily play audio from the provided link because it is not a regular audio file; it is a SHOUTcast stream.
You can check some open source implementations to figure out how to work with this kind of audio stream. For example: Shoutcast MediaStreamSource.
So I can say that you need to implement a lot of stuff in your app to play this stream. There is no quick and easy way.
Look at the Background Audio Streamer sample.
I'm looking at developing an application for the WP7 platform which accepts an audio stream from a computer and outputs that stream on the phone speaker. This involves either dealing with the audio encoding / decoding myself, or somehow passing off an audio stream to the WP7 platform.
I've struggled so far to find any raw audio output APIs, and I am not sure what I have to do on the server (computer) side to get the phone to just deal with the audio stream.
I have looked at a few MSDN articles, but I can't quite tell if they do what I want. If somebody could point me in the right direction that would be great!
I think the MediaStreamSource class does what I'm looking for, and the MediaStreamSource.ReportGetSampleCompleted method appears to confirm this, but nowhere does it say clearly that it can be used for raw audio.
If you need any information, or if you have any suggestions of better ways to do this that would also be appreciated!
You should be able to use the SoundEffect class in the Microsoft.Xna.Framework.Audio namespace to create a sound effect instance from a raw audio stream, which should usually be in standard PCM format.
This post on the microphone by Charles Petzold could be useful.
An excerpt:
In contrast, classes in the Microsoft.Xna.Framework.Audio namespace work with uncompressed audio data in the standard PCM format, which is the same method used for audio CDs and Windows WAV files.
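As a rough sketch (the buffer contents, sample rate, and channel count are placeholders), wrapping received PCM in a SoundEffect would look something like this:

```csharp
// Minimal sketch: wrap a buffer of raw 16-bit PCM in an XNA SoundEffect and play it.
// Here the buffer is just one second of silence standing in for data received from the PC.
using Microsoft.Xna.Framework.Audio;

byte[] pcmBuffer = new byte[44100 * 2 * 2];   // 1 s of 16-bit stereo samples (placeholder)
var effect = new SoundEffect(pcmBuffer, 44100, AudioChannels.Stereo);
effect.Play();
```

For a continuous stream rather than one-shot buffers, DynamicSoundEffectInstance and its SubmitBuffer method may be a better fit.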
I am trying to read bytes from a WAV file and send them across a stream, but the audio plays back slowly.
Could you please help me understand the right way to populate the byte[]?
Thanks for your help.
Are you using NAudio to both read the WAV file and play the data?
You need to make sure you use the same WaveFormat at both ends; if the sample rate, bit depth, or channel count don't match, the audio will play back at the wrong speed.
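For example, a minimal sketch (the file name is a placeholder) that reuses the reader's WaveFormat on the playback side:

```csharp
// Minimal sketch: reuse the WaveFormat from the WAV file when setting up playback,
// otherwise the audio plays at the wrong speed. The file name is a placeholder.
using NAudio.Wave;

using var reader = new WaveFileReader("input.wav");
WaveFormat format = reader.WaveFormat;   // e.g. 44100 Hz, 16 bit, 2 channels

// On the receiving end, build the playback provider with that exact format.
var provider = new BufferedWaveProvider(format);

// Simulate "sending" one second of audio across the stream.
byte[] buffer = new byte[format.AverageBytesPerSecond];
int bytesRead = reader.Read(buffer, 0, buffer.Length);
provider.AddSamples(buffer, 0, bytesRead);

using var output = new WaveOutEvent();
output.Init(provider);
output.Play();
System.Threading.Thread.Sleep(1000);   // let the buffered second play out
```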