This is probably a complicated question, but I'll give it a shot.
I'm using Monodroid (Mono for Android) and am trying to implement media streaming to a UDP socket.
Best way to stream audio and video from an Android phone to a RTMP server
This was a useful pointer for me, since the approach there takes advantage of FileDescriptor.
However, FileDescriptor is fundamentally a UNIX-oriented concept while Mono (.NET) is Windows-oriented, and it appears that the .NET UdpClient class does not expose a FileDescriptor for its instances.
Is there any way to obtain a FileDescriptor with Mono (.NET)?
It seems Android doesn't natively support UDP streaming (see Network Protocols), so yes, you should look at the RTMP server option, and no, you probably can't use UdpClient.
The code in the answer to Best way to stream audio and video from an Android phone to a RTMP server is fairly easy to convert to C#. LocalServerSocket, LocalSocket and LocalSocketAddress are in the Android.Net namespace, and FileDescriptor is in Java.IO.
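For reference, a minimal Mono for Android (C#) sketch of that conversion might look like the following; the socket name, encoder settings and forwarding loop are illustrative assumptions rather than anything from the linked answer, and error handling is omitted.

```csharp
using Android.Media;
using Android.Net;
using System.Threading;

public class SocketRecorder
{
    public void StartRecordingToSocket()
    {
        // Pair of local sockets: the MediaRecorder writes into one end,
        // our code reads the encoded bytes back out of the other end.
        var server = new LocalServerSocket("camera_stream");   // hypothetical socket name
        var receiver = new LocalSocket();
        receiver.Connect(new LocalSocketAddress("camera_stream"));
        var sender = server.Accept();

        var recorder = new MediaRecorder();
        recorder.SetAudioSource(AudioSource.Mic);
        recorder.SetVideoSource(VideoSource.Camera);
        recorder.SetOutputFormat(OutputFormat.Mpeg4);
        recorder.SetAudioEncoder(AudioEncoder.AmrNb);
        recorder.SetVideoEncoder(VideoEncoder.H263);

        // Instead of a file path, hand the recorder the Java.IO.FileDescriptor
        // of the local socket, so the recorded bytes are written into the socket.
        recorder.SetOutputFile(sender.FileDescriptor);
        recorder.Prepare();
        recorder.Start();

        // Read the recorded bytes back from the other end of the local socket
        // and forward them wherever they need to go (e.g. an RTMP/RTSP server).
        new Thread(() =>
        {
            var input = receiver.InputStream;
            var buffer = new byte[4096];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                // forward buffer[0..read] to your server here
            }
        }).Start();
    }
}
```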
I'm trying to generate a live video stream and surface it via a UPnP framework.
I'm using the UPnP framework that was originally developed by Intel, available here. It seems to have rolled its own lightweight webserver. I'm using FFMPEG to generate my video stream from images; I can set it up to feed it frames on a timer. But how do I manage the data that's generated? How do I send an HTTP response that could be a stream of unlimited length?
Is there a well worn technology to do this that I'm not aware of?
Any input would be great.
See https://trac.ffmpeg.org/wiki/StreamingGuide. Possibly ffmpeg could listen on a TCP port, though that would require you to restart ffmpeg each time a client exits, and it would only serve one client at a time. If you want more than one client at a time you would have to use some type of real server...
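For illustration only (the frame rate and port are assumptions), that single-client setup could look roughly like this, with ffmpeg reading piped images and listening on a TCP port:

```
ffmpeg -f image2pipe -framerate 5 -i - -f mpegts "tcp://0.0.0.0:9000?listen"
```

A single client would then connect with something like ffplay tcp://your-server:9000, and ffmpeg would have to be restarted once that client disconnects.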
I'm looking for a method to send video and audio to an RTSP/RTMP server. It should be compatible with notebook webcams and USB webcams. I would like to do it in C#.
I found a library called rtmpclient, but it looks like it can't send a message to the server. I also found a framework for webcams that makes it possible to run an action on every captured frame. But I can't find any similar solution for sending voice along with the image.
How can I send the video and audio to the server?
Check out https://net7mma.codeplex.com/ (I am the author).
It supports Rtsp and Rtp but not Rtmp; however, it could support Rtmp with some changes. Depending on what you want to achieve it may be able to do it already... You could possibly fake the Rtmp packet as an RtpPacket and the library wouldn't care.
If you need Rtmp specifically, check out
http://www.broccoliproducts.com/softnotebook/rtmpclient/rtmpclient.php
or
http://www.fluorinefx.com/docs/fluorine/netconnectionrtmp.html
Let me know if you need anything else!
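If it helps to see what an RTP packet actually carries at the byte level, independent of any particular library, below is a minimal hand-rolled C# sketch that sends RTP-framed data with UdpClient. The endpoint, payload type, SSRC and timing values are placeholder assumptions, and real encoded audio would replace the empty payload.

```csharp
using System;
using System.Net.Sockets;

class RtpSketch
{
    static void Main()
    {
        // Hypothetical destination; replace with your RTP endpoint.
        var udp = new UdpClient();
        udp.Connect("192.0.2.10", 5004);

        var payload = new byte[160];     // e.g. 20 ms of 8 kHz G.711 audio (placeholder bytes)
        ushort sequence = 0;
        uint timestamp = 0;
        uint ssrc = 0x13579BDF;          // arbitrary stream identifier

        for (int i = 0; i < 50; i++)
        {
            var packet = new byte[12 + payload.Length];
            packet[0] = 0x80;                        // version 2, no padding/extension/CSRCs
            packet[1] = 0;                           // marker = 0, payload type 0 (PCMU)
            packet[2] = (byte)(sequence >> 8);       // sequence number, big-endian
            packet[3] = (byte)sequence;
            packet[4] = (byte)(timestamp >> 24);     // timestamp, big-endian
            packet[5] = (byte)(timestamp >> 16);
            packet[6] = (byte)(timestamp >> 8);
            packet[7] = (byte)timestamp;
            packet[8] = (byte)(ssrc >> 24);          // SSRC, big-endian
            packet[9] = (byte)(ssrc >> 16);
            packet[10] = (byte)(ssrc >> 8);
            packet[11] = (byte)ssrc;
            Buffer.BlockCopy(payload, 0, packet, 12, payload.Length);

            udp.Send(packet, packet.Length);

            sequence++;
            timestamp += 160;                        // 160 samples per 20 ms packet
        }
    }
}
```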
In ActionScript 3 I am able to create a P2P network and send arbitrary text data around. While this works fine in a network of Flash clients, I would like to connect a non-Flash client, written in C# instead, to the same P2P network.
Is anybody working on such a thing already? Are there any articles on how to do it?
I think the simplest thing really is to create a C# client that embeds an SWF for communication purposes... communication between C# and the SWF can be done easily through local connections...
Otherwise, you will have a hard time... I summarized how Flash P2P works in a related question... there is an awful lot of network code you would have to write (starting with the protocols that Stratus alone uses, and then reimplementing the whole P2P protocol on top of UDP, which secures data much like TCP does and would not be a two-liner)... I'm not saying it is not feasible, but it seems unnecessarily complicated...
greetz
back2dos
A client of ours has a mobile webcam placed in a forest that is streaming video on a public IP address. Since the webcam has limited bandwidth (and it streams in a format that often requires clients to install a codec), the stream needs to be re-broadcast by a server on a landline, preferably as streaming FLV.
What components can be used to write a client/server that can do this? It would be written using C#.
(Software solutions would be fine too, but we're on a limited budget so it can't be something very expensive...)
What's the format that the camera is sending you?
Rebroadcasting is easy using off-the-shelf servers - which means no programming as such, no C#.
camera -> ffserver -> flash players
ffserver is part of ffmpeg.
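For what it's worth, a minimal ffserver configuration along those lines might look like the sketch below; the port, feed name, codec, frame rate and size are assumptions that would need to match what the camera actually delivers.

```
# ffserver.conf (sketch)
Port 8090
BindAddress 0.0.0.0

<Feed forestcam.ffm>
  File /tmp/forestcam.ffm
  FileMaxSize 20M
</Feed>

<Stream forestcam.flv>
  Feed forestcam.ffm
  Format flv
  VideoCodec flv
  VideoFrameRate 15
  VideoSize 640x480
  NoAudio
</Stream>
```

An ffmpeg instance would then pull from the camera and push into the feed, for example ffmpeg -i http://camera-address/stream http://localhost:8090/forestcam.ffm, and Flash players would play http://your-server:8090/forestcam.flv.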
How does VLC Media player implement HTTP Streaming?
I am aware that VLC media player can be used as both a streaming server and a streaming client.
My requirement is to stream (over HTTP) proprietary protocol data from the server to clients, and I am not comfortable with C/C++ code. I am comfortable with C# and Java. Can somebody point me to example implementations of HTTP streaming in either C# or Java?
"Streaming" in this context simply means sending a large binary HTTP response to a request. You can get a reference to the output stream in Java by calling HttpServletResponse.getOutputStream. You can then send whatever data you like through the stream.
You can review the VLC source.
Java Media Framework (link) provides video streaming. You can implement not only a client but also the server using this API.
If I remember correctly the SDK includes some examples that might help.
What about the CLI (command line interface)?
vlc --repeat /path_to/1.avi --sout '#standard{access=http,mux=ts,dst=:8000}'