I'm new to this forum (as a registered user) and trying to figure out how to properly get an audio stream from an IP camera via the RTSP protocol and transfer it, e.g., directly into a byte array.
It should be real time, so I want to avoid creating any intermediate audio files.
I've found the Ozeki SDK, but it's shareware.
My project is only for academic purposes.
Of course I'm not looking for an exact solution, just for a suitable library with the features to handle this.
Thank you very much for any answer in advance.
I need to broadcast audio content over the network, receive it, and play it in the browser "on the fly". The audio content is just a list of MP3 files, and on the client side it should look like an endless, stateless audio stream, something like a YouTube live stream or an online radio station.
But I really don't know anything about this. Can anyone help me? How does it work, and which protocols are used for sending and receiving the data? Anything that helps me understand would be appreciated.
Ideally I'm looking for a .NET solution, but I'd be glad for anything that helps, at least to understand how it works in general.
Thank you.
One way to do it would be with the help of ffmpeg.
You can use ffmpeg to create a DASH or HLS playlist: https://ffmpeg.org/ffmpeg-formats.html#hls-2
FFmpeg supports other streaming solutions too.
To invoke ffmpeg you have to find binaries that are compatible with the system your server is running on (Windows, Linux). Here is how you can start an external process from C#: How do I start a process from C#?
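As a rough sketch of that step (the ffmpeg flags and folder names here are assumptions, not a recommended configuration), transcoding one MP3 into a rolling HLS playlist from C# could look something like this:

```csharp
// Sketch only: assumes an "ffmpeg" binary on PATH and an existing output folder.
using System.Diagnostics;

class HlsEncoder
{
    public static Process StartHlsEncoding(string inputFile, string outputDir)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = $"-re -i \"{inputFile}\" -c:a aac -b:a 128k " +
                        "-f hls -hls_time 6 -hls_list_size 10 -hls_flags delete_segments " +
                        $"\"{outputDir}/stream.m3u8\"",
            UseShellExecute = false,
            RedirectStandardError = true   // ffmpeg writes its log to stderr
        };
        return Process.Start(psi);
    }
}
```

To make the result feel like one endless stream, you would feed ffmpeg a concatenated input (or restart it per track and keep appending segments), which is exactly the playlist-management logic mentioned below.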
To play your playlist in the browser you can use VideoJS. It has built-in support for DASH and HLS: https://videojs.com/ (it can play audio too).
Build your logic to manage/update playlists, and then you just need to create an HTTP service that can serve your playlist file. VideoJS will play it for you.
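For the serving part, one minimal sketch (assuming ASP.NET Core and that ffmpeg writes its output into an ./hls folder; the paths and MIME types here are assumptions) is to expose the playlist and segments as static files:

```csharp
// Sketch only: serves ./hls/stream.m3u8 and its .ts segments at /hls/... over HTTP.
using Microsoft.AspNetCore.StaticFiles;
using Microsoft.Extensions.FileProviders;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var contentTypes = new FileExtensionContentTypeProvider();
contentTypes.Mappings[".m3u8"] = "application/vnd.apple.mpegurl";
contentTypes.Mappings[".ts"] = "video/mp2t";

app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(
        Path.Combine(builder.Environment.ContentRootPath, "hls")),
    RequestPath = "/hls",
    ContentTypeProvider = contentTypes
});

app.Run();
```

VideoJS would then be pointed at /hls/stream.m3u8.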
If you go with HLS then you probably should read this: https://developer.apple.com/streaming/
If you go with DASH then read this: https://mpeg.chiariglione.org/standards/mpeg-dash
Another way is to use out-of-the-box solutions, which often aren't free:
https://azure.microsoft.com/en-us/services/media-services/
https://www.wowza.com/
I am trying to create a radio app that contains a collection of MP3 stream URLs. However, I have run into some trouble streaming (or playing back) the audio.
I am trying to connect to a raw data stream like the ones you plug into VLC. An example URL is http://mp3.ht-stream.net/;80 (I just plugged this one and a few others into VLC and it worked perfectly). Basically your standard internet radio feed.
I have created the AudioPlaybackAgent, filled in what I think is needed to get this up and running, and everything works fine when I stream regular .mp3 files over the internet. But when I try to connect to these streams, it doesn't do (or play) anything.
I reckon it could be because I'm not using an AudioStreamingAgent (with a MediaStreamSource implementation), but that stuff is a bit too advanced for me, and as I understood after some hours of searching the web, MP3 streams can use the AudioPlaybackAgent instead.
Any advice on how I can make this work? Will I have to use an AudioStreamingAgent instead? Are there any open source examples of how to implement this mysterious MediaStreamSource class? Should I scrap my great idea? Any answers will be greatly appreciated.
My AudioPlayer.cs code is available here if you'd like a peek - but it's mostly the standard stuff.
You can't easily play audio from the provided link because it is not a regular audio file; it is a SHOUTcast stream.
You can check some open source implementations to figure out how to work with this kind of audio stream, for example Shoutcast MediaStreamSource.
So I can say that you need to implement quite a lot of stuff in your app to play this stream. There is no quick and easy way.
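To give an idea of what makes it different, here is a rough desktop C# sketch (not WP7-specific; the header names are part of the ICY protocol, but error handling is omitted, and some SHOUTcast servers answer with a non-standard `ICY 200 OK` status line that HttpWebRequest rejects, which is glossed over here). The server interleaves a metadata block into the MP3 data every `icy-metaint` bytes, and a player has to strip those blocks before decoding:

```csharp
// Sketch only: reads one audio chunk and one ICY metadata block from a SHOUTcast stream.
using System;
using System.IO;
using System.Net;
using System.Text;

class ShoutcastReader
{
    public static void ReadOneBlock(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Headers["Icy-MetaData"] = "1";                 // ask the server for inline metadata

        using (var response = request.GetResponse())
        using (var stream = response.GetResponseStream())
        {
            // Number of audio bytes between metadata blocks.
            int metaInt = int.Parse(response.Headers["icy-metaint"]);

            var audio = new byte[metaInt];
            ReadFully(stream, audio, metaInt);                 // plain MP3 data, ready for a decoder

            int metaLength = stream.ReadByte() * 16;           // metadata length in 16-byte units
            var meta = new byte[metaLength];
            ReadFully(stream, meta, metaLength);
            Console.WriteLine(Encoding.UTF8.GetString(meta));  // e.g. "StreamTitle='...';"
        }
    }

    static void ReadFully(Stream stream, byte[] buffer, int count)
    {
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read <= 0) throw new EndOfStreamException();
            offset += read;
        }
    }
}
```

A real implementation (like the Shoutcast MediaStreamSource linked above) repeats this audio/metadata cycle continuously and hands the audio bytes to a MediaStreamSource, which is exactly the "a lot of stuff" part.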
Look at the Background Audio Streamer sample.
I'm looking at developing an application for the WP7 platform which accepts an audio stream from a computer and outputs that stream on the phone speaker. This involves either dealing with the audio encoding / decoding myself, or somehow passing off an audio stream to the WP7 platform.
I've struggled so far to find any raw audio output APIs, and I am not sure what I have to do on the server (computer) side to get the phone to just deal with the audio stream.
I have looked at a few MSDN articles, but I can't quite tell if they do what I want. If somebody could point me in the right direction that would be great!
I think the MediaStreamSource class does what I'm looking for, and the MediaStreamSource.ReportGetSampleCompleted method appears to confirm this, but nowhere does it say clearly that it can be used for raw audio.
If you need any information, or if you have any suggestions of better ways to do this that would also be appreciated!
You should be able to use the SoundEffect class from the Microsoft.Xna.Framework.Audio namespace to get a SoundEffect instance from a raw audio stream, which should usually be in standard PCM format.
This post on the microphone by Charles Petzold could be useful.
An excerpt:
In contrast, classes in the Microsoft.Xna.Framework.Audio namespace work with uncompressed audio data in the standard PCM format, which is the same method used for audio CDs and Windows WAV files.
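As a small sketch of that idea (the sample rate and channel count here are assumptions; they have to match whatever format the server actually sends):

```csharp
// Sketch only: plays a byte[] of raw 16-bit mono PCM at 16 kHz through XNA.
using Microsoft.Xna.Framework.Audio;

public static class RawAudioPlayback
{
    public static void Play(byte[] pcmData)
    {
        // SoundEffect accepts raw PCM directly: buffer, sample rate, channel count.
        var effect = new SoundEffect(pcmData, 16000, AudioChannels.Mono);
        effect.Play();
    }
}
```

For a continuous stream coming over the network, DynamicSoundEffectInstance with SubmitBuffer is usually a better fit than creating one-shot SoundEffect instances.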
In .NET 3.5, the System.Speech.Recognition SpeechRecognitionEngine.SetInputToAudioStream method doesn't seem to support real-time input.
I am developing a Windows application and I want to provide a real-time input stream over the network. How could I accomplish this? Could someone help me with a workaround?
Thank you.
Can you do something where you buffer 5-second samples and feed those in as a stream, rather than the live stream coming directly from the network? The problem with that approach is that you could break mid-word, and I'm not sure how that is dealt with programmatically.
According to that community post, if the internal wrapper doesn't support the required interface, there isn't much you can do except keep feeding it samples. You could also write your own custom input stream. What happens when you just feed the engine the network stream directly?
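One hedged sketch of that "custom input stream" idea: wrap the incoming network data in a Stream whose Read blocks until more bytes arrive, so the recognizer never sees a premature end of stream. The class below is an illustration, not a tested workaround, and the audio format values are assumptions.

```csharp
// Sketch only: a Stream that buffers incoming PCM and blocks in Read until data is available.
using System;
using System.IO;
using System.Threading;

class BlockingAudioStream : Stream
{
    private readonly MemoryStream _buffer = new MemoryStream();
    private readonly object _sync = new object();
    private long _readPos;

    // Call this from the networking code whenever a chunk of PCM arrives.
    public void Append(byte[] data, int count)
    {
        lock (_sync)
        {
            _buffer.Position = _buffer.Length;
            _buffer.Write(data, 0, count);
            Monitor.PulseAll(_sync);
        }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        lock (_sync)
        {
            while (_buffer.Length == _readPos)
                Monitor.Wait(_sync);              // block instead of returning 0 (end of stream)
            _buffer.Position = _readPos;
            int read = _buffer.Read(buffer, offset, count);
            _readPos = _buffer.Position;
            return read;
        }
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { return _readPos; }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}
```

It would then be handed to the engine with something like engine.SetInputToAudioStream(stream, new SpeechAudioFormatInfo(16000, AudioBitsPerSample.Sixteen, AudioChannel.Mono)) and engine.RecognizeAsync(RecognizeMode.Multiple), with 16 kHz / 16-bit / mono being an assumed format.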
I have a requirement to build a very simple streaming server. It needs to be able to capture video from a device and then stream that video via multicast to several clients on a LAN.
The capture part of this is pretty easy (in C#) thanks to a library someone wrote with DirectShow.Net (http://www.codeproject.com/KB/directx/directxcapture.aspx).
The question I have now is how to multicast this? This is the part I'm stuck on. I'm not sure what to do next, or what steps to take.
There are no off-the-shelf filters that you can just plug in and use.
You need to do three things here:
1. Compress the video into MPEG-2 or MPEG-4
2. Mux it into an MPEG Transport Stream
3. Broadcast it
There are lots of codecs available for part 1, and some devices can even output compressed video.
Part 3 is quite simple too.
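As a rough illustration of part 3 (assuming the muxer already hands you TS packets as byte arrays; the group address 239.1.1.1 and port 1234 are arbitrary examples):

```csharp
// Sketch only: pushes MPEG-TS packets to a UDP multicast group on the LAN.
using System.Net;
using System.Net.Sockets;

class MulticastSender
{
    private static readonly IPEndPoint Group = new IPEndPoint(IPAddress.Parse("239.1.1.1"), 1234);
    private readonly UdpClient _client = new UdpClient();

    public MulticastSender()
    {
        // Allow the datagrams to cross a few routers on the LAN if needed.
        _client.Client.SetSocketOption(SocketOptionLevel.IP,
                                       SocketOptionName.MulticastTimeToLive, 32);
    }

    // Typically 7 TS packets (7 * 188 = 1316 bytes) go into one UDP datagram.
    public void Send(byte[] tsPackets, int length)
    {
        _client.Send(tsPackets, length, Group);
    }
}
```

Clients such as VLC can then open udp://@239.1.1.1:1234 to receive the stream.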
The main problem is part 2, as MPEG Transport Stream is patented. It is licensed in such a way that you cannot develop free software based on it (VLC and FFmpeg violate that license), and you have to pay several hundred dollars just to obtain a copy of the specification.
If you have to develop it, you need to:
Obtain a copy of ISO/IEC 13818-1:2000 (you can download it as a PDF from their site); it describes the MPEG Transport Stream
Develop a renderer filter that takes MPEG elementary streams and muxes them into a Transport Stream
It has to be a renderer rather than a transform filter, because there is out-of-band data (program association tables and reference clocks) that needs to be sent on a regular basis, so you need to keep a worker thread to do that.
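For a sense of the format being muxed into, here is a rough sketch only; PAT/PMT generation, PCR insertion and PES packetization are the genuinely hard parts and are not shown, and the 0xFF stuffing used here is a simplification of the spec's adaptation-field padding:

```csharp
// Sketch only: building the 4-byte header of one fixed-size 188-byte MPEG-TS packet.
using System;

static class TsPacket
{
    public static byte[] Build(int pid, byte continuityCounter, byte[] payload, bool payloadStart)
    {
        var packet = new byte[188];
        packet[0] = 0x47;                                        // sync byte
        packet[1] = (byte)(((payloadStart ? 1 : 0) << 6)         // payload_unit_start_indicator
                           | ((pid >> 8) & 0x1F));               // top 5 bits of the 13-bit PID
        packet[2] = (byte)(pid & 0xFF);                          // low 8 bits of the PID
        packet[3] = (byte)(0x10 | (continuityCounter & 0x0F));   // payload only, 4-bit continuity counter

        int n = Math.Min(payload.Length, 184);
        Array.Copy(payload, 0, packet, 4, n);
        for (int i = 4 + n; i < 188; i++)
            packet[i] = 0xFF;                                    // simplified stuffing (see note above)
        return packet;
    }
}
```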
To achieve that you need to set up or write some kind of video streaming server.
I've used VideoCapX for the same purpose on my project. The documentation and support are not top notch, but they're good enough. It uses WMV streaming technology; the stream is an MMS stream, which you can view with most media players. I've tested it with Windows Media Player, Media Player Classic, and VLC. If you would like to see its capabilities without writing any code just yet, take a look at U-Broadcast, which uses VideoCapX to do the job behind the scenes.
I've been using DirectShow.Net for almost 2 years, and I still find it hard to write a streaming server myself, due to the complexity of DirectShow technology.
Other than WMV, you can take a look at Helix Server or the Apple Streaming Server. The latter is not free, and neither is the WMV streaming server from Microsoft.
You can also take a look at VLC or Windows Media Encoder to do the streaming straight from the application, but so far I find that U-Broadcast outdoes both of the above: VLC has some codec compatibility issues with playback from non-VLC players, and WME has problems starting up the capture device.
Good luck.
NOTE: I'm not associated with VideoCapX or its company; I'm just a happy user.
http://www.codeproject.com/KB/directx/DShowStreamingServer.aspx might help, and http://en.wikipedia.org/wiki/VLC_media_player#cite_note-14
VLC also "should" be able to stream from any device natively.