In .NET 3.5, the System.Speech.Recognition SpeechRecognitionEngine.SetInputToAudioStream method doesn't seem to support real-time input.
I am developing a Windows application and I want to feed the engine a real-time input stream over the network. How could I accomplish this? Could someone help me with a work-around?
Thank you.
Can you do something where you buffer 5-second samples and send those in as a stream, rather than the live stream coming directly from the network? The problem with that approach is that you could break mid-word, and I'm not sure how that is dealt with programmatically.
According to that community post, if the internal wrapper doesn't support the required interface, there isn't much you can do except keep feeding it samples. You could write your own custom input stream as well. What happens when you just feed the engine the network stream directly?
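If you do go the custom-stream route, the usual trick is a Stream whose Read() blocks until more audio arrives from the network, so the engine never sees an end-of-stream. Here is a minimal sketch, assuming 16 kHz 16-bit mono PCM; the class name, the queue, and the format are my own illustration rather than anything the API mandates:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

// A Stream whose Read() blocks until audio arrives, so the recognizer
// treats it as a live source instead of hitting end-of-stream.
class BlockingAudioStream : Stream
{
    private readonly Queue<byte[]> _chunks = new Queue<byte[]>();
    private byte[] _current = new byte[0];
    private int _pos;

    // Call this from your network receive loop.
    public void PushChunk(byte[] data)
    {
        lock (_chunks)
        {
            _chunks.Enqueue(data);
            Monitor.Pulse(_chunks);
        }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (_pos >= _current.Length)
        {
            lock (_chunks)
            {
                while (_chunks.Count == 0)
                    Monitor.Wait(_chunks);      // block until the next packet arrives
                _current = _chunks.Dequeue();
                _pos = 0;
            }
        }
        int n = Math.Min(count, _current.Length - _pos);
        Array.Copy(_current, _pos, buffer, offset, n);
        _pos += n;
        return n;                               // never return 0, or recognition ends
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Position { get; set; }
    public override void Flush() { }

    // Some engine builds probe the stream; if throwing here causes trouble,
    // return dummy values instead.
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}
```

Hooking it up would then look roughly like this:

```csharp
var stream = new BlockingAudioStream();
var engine = new SpeechRecognitionEngine();
engine.SetInputToAudioStream(stream,
    new SpeechAudioFormatInfo(16000, AudioBitsPerSample.Sixteen, AudioChannel.Mono));
engine.LoadGrammar(new DictationGrammar());
engine.RecognizeAsync(RecognizeMode.Multiple);
// Your network receive loop calls stream.PushChunk(packet) as data arrives.
```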
I need to broadcast audio content to the network, receive it, and play it in the browser "on the fly". The audio content is just a list of MP3 files, and on the client side it should look like an endless audio stream with no state, something like a YouTube live stream or an online radio station.
But I really don't know anything about this. Can anyone help me? How does it work, and which protocols are used for sending and receiving the data? Anything that can help me with this.
Ideally I'm looking for a .NET solution, but I would be glad for anything that helps, even if it's just understanding how this works in general.
Thank you.
One way to do it is with the help of ffmpeg.
You can use ffmpeg to create a DASH or HLS playlist: https://ffmpeg.org/ffmpeg-formats.html#hls-2
ffmpeg supports other streaming formats too.
To invoke ffmpeg you have to find binaries compatible with the system your server is running on (Windows, Linux). Here is how you can start an external process from C#: How do I start a process from C#?
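As a rough sketch, assuming the ffmpeg binary is on the PATH, and with the input file, segment length, and output folder as placeholders you would replace with your own:

```csharp
using System.Diagnostics;

// Launch ffmpeg to turn an MP3 into a live HLS playlist (index.m3u8 plus .ts segments).
// The input file, segment length, and output folder are placeholders for the example.
var psi = new ProcessStartInfo
{
    FileName = "ffmpeg",                                    // or the full path to the binary
    Arguments = "-re -i input.mp3 -c:a aac -b:a 128k " +
                "-f hls -hls_time 6 -hls_list_size 10 " +
                "-hls_flags delete_segments stream/index.m3u8",
    UseShellExecute = false,
    CreateNoWindow = true
};

using (var ffmpeg = Process.Start(psi))
{
    ffmpeg.WaitForExit();   // ffmpeg keeps writing segments until the input ends
}
```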
To play your playlist in the browser you can use VideoJS. It has built-in support for DASH and HLS: https://videojs.com/ (it can play audio too).
Build your logic to manage and update the playlists, and then you just need an HTTP service that can serve your playlist file; VideoJS will play it for you.
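The serving side can be as simple as handing out the playlist and segment files with the right MIME types. A minimal sketch with HttpListener; the port, folder, and URL prefix are arbitrary choices for the example:

```csharp
using System;
using System.IO;
using System.Net;

// Serves the HLS playlist and segments that ffmpeg writes into ./stream.
// Port and folder are arbitrary choices for this sketch.
var listener = new HttpListener();
listener.Prefixes.Add("http://+:8080/stream/");
listener.Start();

while (true)
{
    var ctx = listener.GetContext();
    string file = Path.Combine("stream", Path.GetFileName(ctx.Request.Url.AbsolutePath));

    if (File.Exists(file))
    {
        // VideoJS expects the right MIME types for HLS playlists and segments.
        ctx.Response.ContentType = file.EndsWith(".m3u8")
            ? "application/vnd.apple.mpegurl"
            : "video/mp2t";
        byte[] bytes = File.ReadAllBytes(file);
        ctx.Response.OutputStream.Write(bytes, 0, bytes.Length);
    }
    else
    {
        ctx.Response.StatusCode = 404;
    }
    ctx.Response.Close();
}
```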
If you go with HLS then you probably should read this: https://developer.apple.com/streaming/
If you go with DASH then read this:
https://mpeg.chiariglione.org/standards/mpeg-dash
Another way is to use out-of-the-box solutions, which often aren't free:
https://azure.microsoft.com/en-us/services/media-services/
https://www.wowza.com/
I am trying to create a radio app that contains a collection of MP3 stream URLs. However, I have run into some trouble streaming (or playing back) the audio.
I am trying to connect to a raw data stream like the ones you plug into VLC. An example URL is http://mp3.ht-stream.net/;80 (I just plugged this one and a few others into VLC and it worked perfectly). Basically your standard internet radio feed.
I have created the AudioPlaybackAgent, filled in what I think is needed to get this up and running, and everything works fine when I stream regular .mp3 files over the internet. But when I try to connect to these streams, it doesn't do (or play) anything.
I reckon it could be because I'm not using an AudioStreamingAgent (with a MediaStreamSource implementation), but that stuff is a bit too advanced for me, and as I understood after some hours of searching the web, MP3 streams can use the AudioPlaybackAgent instead.
Any advice on how I can make this work? Will I have to use an AudioStreamingAgent instead? Are there any open-source examples of how to implement this mysterious MediaStreamSource class? Should I scrap my great idea? Any answers will be greatly appreciated.
My AudioPlayer.cs code is available here if you'd like a peek - but it's mostly the standard stuff.
You can't easily play audio from the provided link because it is not a regular audio file; it is a SHOUTcast stream.
You can look at some open-source implementations to figure out how to work with this kind of stream, for example Shoutcast MediaStreamSource.
So I can say that you need to implement quite a lot in your app to play this stream. There is no quick and easy way.
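If you want to confirm for yourself that the URL is a SHOUTcast server rather than a plain MP3 file, you can probe its ICY headers. A quick diagnostic sketch (not playback code), with the caveat that older SHOUTcast servers answer "ICY 200 OK" instead of a normal HTTP status line, which HttpWebRequest may reject unless useUnsafeHeaderParsing is enabled:

```csharp
using System;
using System.Net;

// Ask the server for ICY metadata. A SHOUTcast server answers with icy-* headers
// and interleaves metadata blocks into the MP3 data every "icy-metaint" bytes,
// which is why a plain MP3 player chokes on the stream.
var request = (HttpWebRequest)WebRequest.Create("http://mp3.ht-stream.net/;80");
request.Headers["Icy-MetaData"] = "1";

using (var response = request.GetResponse())
{
    foreach (string key in response.Headers.AllKeys)
    {
        if (key.StartsWith("icy", StringComparison.OrdinalIgnoreCase))
            Console.WriteLine(key + ": " + response.Headers[key]);
    }
}
```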
Look at the Background Audio Streamer sample.
This is a bit of a weird question, but given what C++, C#, and Objective-C can do today, is there any possible way for video content to be uploaded while it is still being recorded? So as you record the video, it would be compressed and uploaded to a website.
Would this involve cutting the video into small parts as you record, with hardly noticeable stops and starts during the recording?
If anyone knows if this is at all possible, please let me know.
Sorry for the odd question.
You've just asked for streaming media -- something that's been done for over a decade (and, if you overlook "television", something that's probably been underway in research settings for several decades).
Typically, the video recorder will feed the raw data through filters of some sort -- correct white balance, sharpen or soften the video, image stabilize, and then compress the raw data using a codec. Most codec designs will happily take a block of input, work on it, and then produce a block of encoded data ready for writing. Instead of writing to disk, you could "write" to a socket opened to a remote machine.
Or, if you're working with an API that only writes to disk, you could easily re-read the data off disk as it is being written and send the data to a remote site. You'd have to "follow" the writing using something like tail -f's magic ability to follow the file as it is written. (Heck, if you're just bodging something together for a one-off, I'd even recommend using tail -f as part of your system.)
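A rough sketch of that "follow the file" idea in C#: open the growing file with a share mode that lets the recorder keep writing, and forward any new bytes to a remote machine. The host, port, file name, and poll interval are all placeholders:

```csharp
using System;
using System.IO;
using System.Net.Sockets;
using System.Threading;

// Tail a file that another process is still writing and forward new bytes
// to a remote machine. Host, port, and path are placeholders.
using (var client = new TcpClient("upload.example.com", 9000))
using (var net = client.GetStream())
using (var file = new FileStream("capture.mp4", FileMode.Open,
                                 FileAccess.Read, FileShare.ReadWrite))
{
    var buffer = new byte[64 * 1024];
    while (true)                       // stop condition (recorder finished) omitted
    {
        int read = file.Read(buffer, 0, buffer.Length);
        if (read > 0)
            net.Write(buffer, 0, read);
        else
            Thread.Sleep(250);         // no new data yet; wait and retry
    }
}
```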
It depends on whether the application recording to disk is locking the file. My guess is that, unless you wrote the recording software, the application locks the file (or doesn't even create the real file) until it stops recording. If you are writing the recording software as well, then yes, you can do this; you would just use synchronized threads.
I would like to emulate video input from a webcam for testing purposes.
So I need to be able to emulate a software video capture device in Windows and be able to dynamically generate its output.
How can I achieve this?
I would prefer a solution in C# or C++.
You can use a Virtual Webcam (old link, but there are others); it will take a video or image file and present it as a webcam device. Your system will think it's a normal device.
Then you will need to create something that generates the video/images; if you only need a static image, it's pretty easy to generate a BMP.
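For instance, a tiny sketch that generates a single test frame as a BMP with System.Drawing (GDI+); the resolution and contents are arbitrary:

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;

// Generate a single test frame and save it as a BMP.
// The resolution and contents are arbitrary for the example.
using (var bitmap = new Bitmap(640, 480))
using (var graphics = Graphics.FromImage(bitmap))
using (var font = new Font("Arial", 24))
{
    graphics.Clear(Color.DarkBlue);
    graphics.DrawString("Fake webcam frame " + DateTime.Now.ToLongTimeString(),
                        font, Brushes.White, 20, 20);
    bitmap.Save("frame.bmp", ImageFormat.Bmp);
}
```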
Old question with no selected answer... actually probably one of the oldest I've ever seen... but I came across this looking for an answer myself, and I remember the days when "Virtual Webcam" still existed (it's now just a Chinese ad site).
Fear not! There are new sources to solve your decade-long quest.
First of all, check out OBS; it's open source and does a LOT with video streams:
https://obsproject.com/
Second, check out this virtual webcam plugin for it. It does exactly what you're talking about, and it uses #qbeuek's suggestion of DirectShow:
https://obsproject.com/forum/resources/obs-virtualcam.949/
It is written in C++, so grabbing the bits you need and rewriting to C# is left as an exercise to the reader, but the capability is there.
As far as I know, there is a set of COM interfaces that govern the recording and playback of audio and video in Windows. It used to be called DirectShow, but maybe in the meantime the name has been changed. Those interfaces are used to construct a graph of audio and video filters, to encode / decode the data stream.
The way to go:
- read about the Microsoft DirectShow API,
- implement a COM object that implements the video source interface.
I have a requirement to build a very simple streaming server. It needs to be able to capture video from a device and then stream that video via multicast to several clients on a LAN.
The capture part of this is pretty easy (in C#) thanks to a library someone wrote with DirectShow.Net (http://www.codeproject.com/KB/directx/directxcapture.aspx).
The question I have now is how to multicast this? This is the part I'm stuck on. I'm not sure what to do next, or what steps to take.
There are no stock filters available that you can just plug in and use.
You need to do three things here:
1. Compress the video into MPEG-2 or MPEG-4.
2. Mux it into an MPEG Transport Stream.
3. Broadcast it.
There are lots of codecs available for part 1, and some devices can even output compressed video.
Part 3 is quite simple too.
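For part 3, here is a rough sketch of pushing already-muxed TS data to a multicast group with UdpClient; the group address and port are arbitrary, and GetNextTsChunk() is a stand-in for wherever your muxer output comes from:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class MulticastSender
{
    static void Main()
    {
        // 239.x.x.x is an administratively scoped multicast range; the port is arbitrary.
        var group = new IPEndPoint(IPAddress.Parse("239.1.1.1"), 1234);

        using (var udp = new UdpClient())
        {
            udp.Ttl = 1;                          // keep the traffic on the local LAN
            while (true)
            {
                // TS packets are 188 bytes; seven fit comfortably in one UDP datagram.
                byte[] chunk = GetNextTsChunk();  // stand-in for your muxer's output
                udp.Send(chunk, chunk.Length, group);
            }
        }
    }

    // Placeholder: in a real server this would pull data from the Transport Stream muxer.
    static byte[] GetNextTsChunk()
    {
        return new byte[188 * 7];
    }
}
```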
The main problem is part 2, as MPEG Transport Stream is patented. It is licensed such that you cannot develop free software based on it (VLC and FFmpeg violate that license), and you have to pay several hundred dollars just to obtain a copy of the specification.
If you have to develop it, you need to:
Obtain a copy of ISO/IEC 13818-1:2000 (you can download it as a PDF from their site); it describes the MPEG Transport Stream
Develop a renderer filter that takes MPEG Elementary Streams and muxes them into Transport Stream
It has to be a renderer because the Transport Stream muxer cannot be a simple transform filter: there is some out-of-band data (program association tables and reference clocks) that needs to be sent on a regular basis, and you need to keep a worker thread to do that.
To achieve that you need to set up or write some kind of video streaming server.
I've used VideoCapX for the same purpose on my project. The documentation and support are not top notch, but they're good enough. It uses WMV streaming technology; the stream is an MMS stream, and you can view it with most media players. I've tested with Windows Media Player, Media Player Classic, and VLC. If you would like to see its capabilities without writing any code just yet, take a look at U-Broadcast, which uses VideoCapX to do the job behind the scenes.
I've been using DirectShow.Net for almost 2 years, and I still find it hard to write a streaming server myself, due to the complexity of DirectShow technology.
Other than WMV, you can take a look at Helix Server or Apple's streaming server. The latter is not free, and neither is the WMV Streaming Server from Microsoft.
You can also take a look at VLC or Windows Media Encoder to do streaming straight from the application, but so far I find U-Broadcast outdoes both of the above. VLC has some compatibility issues with codecs and playback from non-VLC players, and WME has problems starting up the capture device.
Good Luck
NOTE: I'm not associated with VideoCapX or its company, I'm just a happy user of it.
http://www.codeproject.com/KB/directx/DShowStreamingServer.aspx might help, and http://en.wikipedia.org/wiki/VLC_media_player#cite_note-14
VLC also "should" be able to stream from any device natively.