.NET Webcam Media Server for HTML5 - C#

I am trying to create an HTML5 media server that can take in and stream out webcam video, and I need a bit more direction.
In what ways can webcam video be streamed to a server via HTML5, and how should the server stream it back out to clients?
Is there a way to do this via something like SignalR?
Are there any real-time, server-side .NET compression DLLs that can be used to take the incoming video and stream it to clients?

The HTML5 <video> tag is protocol and format agnostic: the specification does not mandate any particular video format or delivery protocol.
The issue you have is knowing which streaming formats each browser supports. Some browsers support RTP or ASF streams, but at the moment there is no streaming format that all browsers support. HTML5 isn't your best option for streaming today.
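As for the SignalR idea from the question: SignalR can relay individual frames (say, JPEG snapshots posted by the capturing browser) to other connected clients, though it is not a real video transport. A minimal sketch, assuming ASP.NET Core SignalR and a hypothetical "ReceiveFrame" client handler:

using Microsoft.AspNetCore.SignalR;
using System.Threading.Tasks;

public class WebcamHub : Hub
{
    // The capturing browser invokes this with each encoded frame
    // (e.g. a JPEG snapshot grabbed from a <canvas>).
    public async Task SendFrame(byte[] jpegFrame)
    {
        // Fan the frame out to every other connected client, which
        // renders it in a "ReceiveFrame" handler (hypothetical name).
        await Clients.Others.SendAsync("ReceiveFrame", jpegFrame);
    }
}

This works for low-rate snapshot relaying; for real compressed video you would still need a proper streaming pipeline.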

How to send RTMP video stream in UWP?

I'm working on live streaming to a server in UWP using MediaCapture, but I can't find any useful solution for it.
There is a Microsoft library, but it only supports Azure:
https://github.com/MicrosoftDX/AzureRTMPIngestLib
I can play RTMP live video streamed from a server, but I can't send a video stream to a server. Is there any solution or library that can send an RTMP live stream from UWP?
The example below uses STSP. I've tested it on a local network using the IPv4 addresses of two different computers; both machines transmit and receive data to each other at the same time. The client and server sides of your app have to support the same protocol. It also exposes many properties of the video recording and streaming processes.
Real-time communication sample
A simple end-to-end video call client that demonstrates the low-latency mode of the Windows Runtime capture engine. This is enabled using the msRealTime attribute on the video tag or the RealTimePlayback property on the MediaElement. The sample uses a custom network source and a custom sink extension to send and receive captured audio and video data between two computers.
A demonstration of the end-to-end latency of video captured using the Media Capture API and displayed using a video tag and a MediaElement with low-latency mode enabled. Two output windows are displayed. The first shows a camera preview window with the raw output from your camera. The second is a localhost client window that shows the video from the camera after it has been compressed, streamed, and received over the machine's loopback network interface. This window demonstrates the end-to-end latency of video captured, streamed to, and displayed by a remote client, minus network latency.
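For orientation, the capture side of such a sample boils down to the UWP MediaCapture API. A minimal sketch, assuming an in-memory destination where the sample would instead plug in its custom network sink:

using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage.Streams;

public static class CaptureSketch
{
    public static async Task StartAsync()
    {
        var capture = new MediaCapture();
        await capture.InitializeAsync(); // needs webcam/microphone capabilities in the app manifest

        // Encode to MP4 (H.264) and record into a stream; a streaming app
        // would hand StartRecordToStreamAsync a network-backed stream
        // (the sample's custom sink) instead of memory.
        var profile = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.Auto);
        var destination = new InMemoryRandomAccessStream();
        await capture.StartRecordToStreamAsync(profile, destination);
    }
}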
Now it's your turn. Please let us know your results.

Streaming video using raw sockets

I'm trying to generate a live video stream and surface it via a UPnP framework.
I'm using the UPnP framework that was originally developed by Intel, available here. It seems to have rolled its own lightweight web server. I'm using FFMPEG to generate my video stream from images; I can set it up to feed in frames on a timer. But how do I manage the data that's generated? How do I send an HTTP response that could be a stream of unlimited length?
Is there a well worn technology to do this that I'm not aware of?
Any input would be great.
See https://trac.ffmpeg.org/wiki/StreamingGuide. ffmpeg itself can possibly listen on a TCP port, though that would require you to restart ffmpeg each time a client exits, and it would only serve one client at a time. If you want more than one client at a time, you would have to use some type of real server.
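As for the "stream of unlimited length" part: with HttpListener you can enable chunked transfer encoding and keep writing until the client disconnects. A minimal sketch, assuming ffmpeg is on the PATH and using its test pattern source in place of your image feed:

using System.Diagnostics;
using System.Net;

class ChunkedStreamServer
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://+:8080/stream/"); // may need a URL ACL on Windows
        listener.Start();

        while (true)
        {
            var context = listener.GetContext();  // serves one client at a time, for brevity
            var response = context.Response;
            response.SendChunked = true;          // no Content-Length: stream until we close
            response.ContentType = "video/mp2t";

            // Test pattern stands in for the image-fed stream from the question.
            var ffmpeg = Process.Start(new ProcessStartInfo
            {
                FileName = "ffmpeg",
                Arguments = "-re -f lavfi -i testsrc=size=640x480:rate=25 -f mpegts pipe:1",
                RedirectStandardOutput = true,
                UseShellExecute = false
            });

            try { ffmpeg.StandardOutput.BaseStream.CopyTo(response.OutputStream); }
            catch (HttpListenerException) { /* client disconnected */ }
            finally { ffmpeg.Kill(); response.Close(); }
        }
    }
}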

Real time video streaming with ASP.net Web API

My task is to pick up a video stream from an AXIS M1011 IP-camera and relay it to multiple clients with a REST web service.
The reason for this setup is that the camera can hold only up to 15-20 connections, and we will have a few more clients than that. There are also some security and authentication issues, some additional features, and so on. Basically, the ideal thing is to have a "singleton" video stream from the camera relayed to multiple clients.
The camera can serve an MJPEG stream, an H.264 stream, and static images. I'm having trouble capturing the H.264 or MJPEG stream with the service: the H.264 stream is served via the RTSP protocol, and the MJPEG stream has poor compression and I'm not clear on how to resend it to clients.
So one of my ideas is to pull the current static image, for example 25 times a second, and somehow create an H.264 stream to serve to the users. Is there any way to create one H.264 stream object that would be shown to all clients? Otherwise, a client who connects later would see the stream from the start of capturing. Does that mean I should start a separate encoding process for every connected client? That sounds like a resource-hungry solution (reading and encoding images).
What would be the best path to take for this? Any advice would be a great help; code snippets even better.
Thanks in advance.
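For the "singleton stream, many clients" requirement, one common Web API pattern is PushStreamContent: a single loop reads from the camera and fans each frame out to every registered client stream. A rough sketch, assuming ASP.NET Web API 2 and an MJPEG feed (all type and member names are hypothetical, and per-part multipart headers are omitted for brevity):

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public static class CameraRelay
{
    // Every currently connected client's response stream.
    private static readonly ConcurrentDictionary<Guid, Stream> Clients =
        new ConcurrentDictionary<Guid, Stream>();

    public static void Register(Guid id, Stream stream)
    {
        Clients.TryAdd(id, stream);
    }

    // Called by the single camera-reading loop for every frame it pulls.
    public static void Broadcast(byte[] frame)
    {
        foreach (var client in Clients)
        {
            try
            {
                client.Value.Write(frame, 0, frame.Length);
                client.Value.Flush();
            }
            catch (Exception)
            {
                Stream dead;
                Clients.TryRemove(client.Key, out dead); // drop disconnected clients
            }
        }
    }
}

public class LiveController : ApiController
{
    [HttpGet]
    public HttpResponseMessage Get()
    {
        var response = Request.CreateResponse();
        // The delegate runs when the response starts; we keep the stream
        // open and let CameraRelay.Broadcast push frames into it.
        response.Content = new PushStreamContent(
            (stream, content, context) => CameraRelay.Register(Guid.NewGuid(), stream),
            new MediaTypeHeaderValue("multipart/x-mixed-replace"));
        return response;
    }
}

This way the camera is read exactly once regardless of how many clients are connected, and late joiners simply start receiving from the next frame.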

Re-publish video stream as streaming FLV

A client of ours has a mobile web cam placed in a forest that is streaming video on a public IP address. Since the web cam has limited bandwidth (and it streams in a format that often requires clients to install a codec), the stream needs to be re-broadcast by a server on a landline, preferably as streaming FLV.
What components can be used to write a client/server that can do this? It would be written using C#.
(Software solutions would be fine too, but we're on a limited budget so it can't be something very expensive...)
What's the format that the camera is sending you?
Rebroadcasting is easy using off-the-shelf servers - which means no programming as such, no C#.
camera -> ffserver -> flash players
ffserver is part of ffmpeg.
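For reference, a rough ffserver.conf sketch of that pipeline; the port, feed name, and encoding settings are illustrative assumptions:

# ffmpeg pushes the camera stream into the feed; any number of
# Flash/FLV clients pull from the stream defined below.
Port 8090
BindAddress 0.0.0.0
MaxClients 100

<Feed camera.ffm>
File /tmp/camera.ffm
FileMaxSize 20M
</Feed>

<Stream live.flv>
Feed camera.ffm
Format flv
VideoFrameRate 25
VideoSize 640x480
NoAudio
</Stream>

The camera's stream is then pushed into the feed with something along the lines of ffmpeg -i <camera-url> http://localhost:8090/camera.ffm, and clients play http://server:8090/live.flv.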

How does VLC Media player implement HTTP Streaming?

I am aware that VLC Media Player can be used as a streaming server and also as a streaming client.
My requirement is to stream (over HTTP) proprietary protocol data from the server to clients, and I am not comfortable with C/C++ code. I am comfortable with C# and Java. Can somebody point me to example implementations of HTTP streaming in either C# or Java?
"Streaming" in this context simply means sending a large binary HTTP response to a request. You can get a reference to the output stream in Java by calling HttpServletResponse.getOutputStream. You can then send whatever data you like through the stream.
You can review the VLC source.
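The C# equivalent of the Java advice above, assuming classic ASP.NET (System.Web): disable response buffering and write to the output stream in chunks. The file path here is a stand-in for your proprietary data source:

using System.IO;
using System.Web;

public class StreamHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.BufferOutput = false; // send bytes as soon as they are written
        context.Response.ContentType = "application/octet-stream";

        // Stand-in source; replace with your proprietary protocol reader.
        using (var source = File.OpenRead(@"C:\media\sample.bin"))
        {
            var buffer = new byte[64 * 1024];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                context.Response.OutputStream.Write(buffer, 0, read);
                context.Response.Flush();
            }
        }
    }
}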
Java Media Framework (link) provides video streaming. You can implement not only a client but also the server using this API.
If I remember correctly the SDK includes some examples that might help.
What about the CLI (command line interface)?
vlc --repeat /path_to/1.avi --sout '#standard{access=http,mux=ts,dst=:8000}'
