I'm trying to make a remote desktop app where a user controls their PC from a web app (as in LogMeIn).
I built the desktop part in C# and the web app in NodeJS; the communication is handled with Socket.IO.
My first attempt captured a screenshot (only 5 fps), compared it to the previous screenshot, and sent only the difference as an 8-bit color image. On an 800 * 600 virtual desktop this produced a 100 KB first image, then 5 KB to 60 KB per frame depending on how much the screen changed.
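Conceptually, the capture-and-compare step was like this (a simplified sketch; GetPixel is shown for clarity, a real implementation would use LockBits for speed, and the 8-bit palette reduction is omitted):

    // A simplified sketch of the capture-and-diff idea using System.Drawing.
    using System.Drawing;

    static Bitmap CaptureScreen(int width, int height)
    {
        var frame = new Bitmap(width, height);
        using (var g = Graphics.FromImage(frame))
            g.CopyFromScreen(0, 0, 0, 0, frame.Size); // grab the desktop
        return frame;
    }

    static Bitmap Diff(Bitmap previous, Bitmap current)
    {
        var diff = new Bitmap(current.Width, current.Height);
        for (int y = 0; y < current.Height; y++)
            for (int x = 0; x < current.Width; x++)
            {
                Color now = current.GetPixel(x, y);
                // Unchanged pixels become transparent, so only changes get encoded and sent.
                diff.SetPixel(x, y, now == previous.GetPixel(x, y) ? Color.Transparent : now);
            }
        return diff;
    }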
With my local machine controlling a VirtualBox VM, everything was perfect, but when I hosted the web app online the result was catastrophic: the lag was enormous.
After some research it turned out this kind of app is impossible to achieve my way, and that I have to use a real-time protocol and turn the client screen into a live stream.
My questions are:
Are there any free / open-source RTP libraries that are ready to use?
How would I transfer a live stream from the desktop app to the web app, since it comes from the client side, which has no open port? I was thinking of another desktop app running on the server (hosting the web app) that would re-stream the same content; the web app could then simply display the content by accessing the local IP on the RTP port. But this doesn't solve the mystery of transferring a live stream from the client to the server.
Are there any free / open-source RTP libraries that are ready to use?
live555 - I've used it and it is excellent, but it's C++, so you would have to interop.
gstreamer - also native, requiring interop.
Managed Media Aggregation - I've not used it, but it is completely managed.
How would I transfer a live stream from the desktop app to the web app, since it comes from the client side, which has no open port? I was thinking of another desktop app running on the server (hosting the web app) that would re-stream the same content; the web app could then simply display the content by accessing the local IP on the RTP port. But this doesn't solve the mystery of transferring a live stream from the client to the server.
This would be tricky. All the libraries above follow the strict RTSP/RTP specification, which requires opening a listening port on your host side, and that host is undoubtedly going to be behind a NAT'd address. I would stick with each end being a client and reaching 'up' to your web service. You also need to guarantee delivery of your frames (because you're delivering incremental deltas), so RTP (which is traditionally over UDP) would be challenging.
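A minimal sketch of that 'reach up' direction, assuming .NET's ClientWebSocket and a hypothetical endpoint on your web service:

    // The desktop app dials out to the web service as a client, so no
    // listening port is needed behind NAT (the endpoint URL is hypothetical).
    using System;
    using System.Net.WebSockets;
    using System.Threading;
    using System.Threading.Tasks;

    class FramePusher
    {
        static async Task PushFramesAsync(Func<byte[]> captureFrame)
        {
            using (var socket = new ClientWebSocket())
            {
                await socket.ConnectAsync(
                    new Uri("wss://myapp.example.com/desktop"), CancellationToken.None);

                while (socket.State == WebSocketState.Open)
                {
                    byte[] frame = captureFrame(); // compressed delta from the capture loop
                    await socket.SendAsync(new ArraySegment<byte>(frame),
                        WebSocketMessageType.Binary, true, CancellationToken.None);
                    await Task.Delay(200); // ~5 fps, matching the original capture rate
                }
            }
        }
    }

Because WebSockets run over TCP, the frames arrive complete and in order, which covers the delivery guarantee mentioned above.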
Some thoughts
At the end of the day RTP is just a standardized 12-byte header plus packetization rules for compressed media. It's not going to help with latency. The real benefit is that it would allow clients to connect to the endpoint in a standards-compliant way, for example with a VLC client.
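That 12-byte header is simple enough to sketch by hand; an illustrative packetizer per RFC 3550 (not a full implementation) would just prepend something like:

    // An illustrative hand-rolled RTP fixed header (RFC 3550), 12 bytes.
    static byte[] BuildRtpHeader(ushort seq, uint timestamp, uint ssrc,
                                 byte payloadType, bool marker)
    {
        var h = new byte[12];
        h[0] = 0x80;                                        // V=2, no padding/extension, CC=0
        h[1] = (byte)((marker ? 0x80 : 0x00) | (payloadType & 0x7F));
        h[2] = (byte)(seq >> 8);  h[3] = (byte)seq;         // sequence number (big-endian)
        h[4] = (byte)(timestamp >> 24); h[5] = (byte)(timestamp >> 16);
        h[6] = (byte)(timestamp >> 8);  h[7] = (byte)timestamp;
        h[8] = (byte)(ssrc >> 24); h[9] = (byte)(ssrc >> 16);
        h[10] = (byte)(ssrc >> 8); h[11] = (byte)ssrc;
        return h;                                           // payload follows these 12 bytes
    }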
You could tune your sockets, and that will help a bit, but to be honest what I would focus on is compression and screen-capture efficiency. What image compression are you using? VNC has traditionally used zlib, and some others use lossy codecs like JPEG. The smaller you get those frames the better.
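For instance, with System.Drawing you can trade image quality for size explicitly (a sketch; the quality value is something to tune for your network conditions):

    // JPEG-compressing a captured frame at a chosen quality with System.Drawing.
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Linq;

    static byte[] CompressFrame(Bitmap frame, long quality) // e.g. 40 as a trade-off
    {
        ImageCodecInfo jpeg = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");
        using (var parameters = new EncoderParameters(1))
        using (var stream = new MemoryStream())
        {
            parameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
            frame.Save(stream, jpeg, parameters);
            return stream.ToArray();
        }
    }

Lower quality settings shrink the frames considerably, at the cost of visible artifacts.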
Another thought which may help: Microsoft has an API for getting 'dirty screen areas'. It is called the Desktop Duplication API and it performs incredibly fast. It is Windows 8 and up, however.
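What acquiring a frame and its dirty rectangles looks like, assuming the SharpDX bindings (a sketch; exact type names vary between SharpDX versions):

    // One Desktop Duplication capture with its dirty rectangles, assuming the
    // SharpDX.DXGI / SharpDX.Direct3D11 NuGet packages (Windows 8+ only).
    using SharpDX.DXGI;
    using SharpDX.Mathematics.Interop;

    static void CaptureOnce()
    {
        using (var factory = new Factory1())
        using (var adapter = factory.GetAdapter1(0))                  // primary GPU
        using (var device = new SharpDX.Direct3D11.Device(adapter))
        using (var output = adapter.GetOutput(0))                     // primary monitor
        using (var output1 = output.QueryInterface<Output1>())
        using (var duplication = output1.DuplicateOutput(device))
        {
            OutputDuplicateFrameInformation frameInfo;
            Resource screenResource;
            duplication.AcquireNextFrame(500, out frameInfo, out screenResource); // throws on timeout

            // Only the rectangles that changed since the previous frame.
            var dirty = new RawRectangle[frameInfo.TotalMetadataBufferSize / 16];
            int bytesUsed;
            duplication.GetFrameDirtyRects(frameInfo.TotalMetadataBufferSize, dirty, out bytesUsed);
            int dirtyCount = bytesUsed / 16; // each RawRectangle is 16 bytes; encode these regions

            screenResource.Dispose();
            duplication.ReleaseFrame();
        }
    }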
All the best on your endeavor!
Related
I have been looking around the internet to see what protocols are available for client/server communication. I'm currently not able to find a durable solution for my problem; maybe I'm looking in the wrong places?
The problem in short is that today I have a desktop C# WPF application that communicates with a C# server application using WCF and TCP on dedicated ports. However, the use of TCP requires external users to open the specific ports in their company's firewall to be able to use the application. This is proving to be a bigger challenge than expected, and therefore I'm trying to find out whether there are other ways to perform the communication.
I looked at WebSockets and SignalR, which use HTTP(S), but I need a protocol that supports large amounts of data transfer (sometimes 10 MB at startup), which they don't seem to do well?
In short:
I need a protocol for client/server communication that uses the HTTP(S) channel and also performs well with larger amounts of data.
This might be an impossible mission, due to the lack of performance compared to raw TCP?
I'm working on live streaming to a server in UWP using MediaCapture, but I can't find any useful solution for it.
I found Microsoft's library, but it only supports Azure:
https://github.com/MicrosoftDX/AzureRTMPIngestLib
I can play RTMP live video streamed from a server, but I can't send a video stream to the server. I want to know whether there is any solution or library that can send an RTMP live stream from UWP?
The example below uses STSP. I've tested it on a local network using the IPv4 addresses of two different computers; the computers transmit and receive data to each other at the same time. The client and server sides of your app have to support the same protocol. It also gives you a lot of properties for the video recording and streaming processes.
Real-time communication sample
A simple end-to-end video call client that demonstrates the low-latency mode of the Windows Runtime capture engine. This is enabled using the msRealTime attribute on the video tag or RealTimePlayback on the MediaElement. The sample uses a custom network source and a custom sink extension to send and receive captured audio and video data between two computers.
A demonstration of the end-to-end latency of video captured using the Media Capture API and displayed using a video tag and MediaElement with low-latency mode enabled. Two output windows are displayed. The first shows a camera preview window of the raw output from your camera. The second is a local host client window that shows the video from the camera after it has been compressed, streamed, and received over the machine's loopback network interface. This window demonstrates the end-to-end latency of video captured, streamed to, and displayed by a remote client, minus network latency.
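On the playback side, the low-latency flag mentioned above is a single property on the MediaElement; a minimal sketch, assuming the sample's custom stsp:// scheme handler is registered (the peer address is illustrative):

    // Enabling low-latency playback on a UWP MediaElement.
    using Windows.UI.Xaml.Controls;

    public sealed partial class PlayerPage : Page
    {
        public PlayerPage()
        {
            InitializeComponent();

            var player = new MediaElement
            {
                RealTimePlayback = true, // the low-latency mode the sample relies on
                AutoPlay = true
            };
            player.Source = new System.Uri("stsp://192.168.1.10"); // hypothetical peer address
            RootGrid.Children.Add(player); // RootGrid assumed to exist in the XAML
        }
    }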
Now it's your turn. Please let us know about your results.
I have a situation where videos reside on an FTP server, and I need to stream them through my Website Project.
Using a very crude method of including the FTP username and password in the URL, I can just drop the formed URL in as a link in the HTML video player.
ftp://username:password@server
I am a bit stuck on how to proceed with consuming the video from this remote secured FTP site. There is no web server running to "serve" the videos over http. It is a dedicated FTP server.
Initially, I played around with making an FTP connection from code, but streaming seemed to be a problem using this method, so I just temporarily used the URL authentication method; now it is time to revisit it.
Unfortunately, I no longer have the original code from my attempt to make the FTP connection programmatically.
I need to revisit this, and would like some input before I proceed.
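One direction I'm considering is proxying the file through the site itself so the credentials stay server-side; a rough sketch of the idea, assuming ASP.NET MVC and FtpWebRequest (the server path and names are placeholders):

    // Proxying an FTP file through an MVC action so the HTML video player
    // never sees the FTP credentials.
    using System.Net;
    using System.Web.Mvc;

    public class VideoController : Controller
    {
        public ActionResult Stream(string fileName)
        {
            var request = (FtpWebRequest)WebRequest.Create(
                "ftp://server/videos/" + fileName);           // hypothetical layout
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = new NetworkCredential("username", "password");

            var response = (FtpWebResponse)request.GetResponse();
            // Hand the FTP response stream straight to the HTTP response.
            return new FileStreamResult(response.GetResponseStream(), "video/mp4");
        }
    }

A straight pass-through like this does not support HTTP range requests, so seeking in the player would be limited.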
Video streaming is quite a specialist area, especially if you want to use techniques like Adaptive Bit Rate streaming (ABR - see note below) to give your users the best possible user experience.
Given this, I think the best approach might be for you to FTP the videos to a streaming server where they can be prepared and made available for streaming properly.
There are open source and commercial streaming servers available, some cloud SaaS based, which means you don't have to invent and develop all this yourself. Some examples are:
https://gstreamer.freedesktop.org (opensource)
https://www.wowza.com (commercial with free trial)
Note: ABR - this essentially means you have multiple versions of your video available on the server. Each is a different bit rate, and all are broken into (for example) 10-second chunks. The client requests the next chunk of the video at whatever is the most appropriate bit rate for the current network conditions. Many clients will also request a low bit rate to start the video, to ensure a quick start, and then 'step up' through the bit rates to the most appropriate one. You can see this when you start a new video on sites like Netflix etc.
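For illustration, in HLS (one common ABR format) the list of available bit rates is just a master playlist the client picks from; the file names and bandwidths below are made up:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
    video_360p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
    video_720p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
    video_1080p.m3u8

Each variant playlist then lists the individual chunks at that bit rate.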
I have a server & a client. The client is my Android phone. The server is my PC running Windows.
I need to find the best programmatic method of sending a short string to the PC and having it displayed in real time.
Please keep in mind that the only languages I can program in on the server side are VB.NET, C#, and C++ (and my experience with them is in that order).
Edit:
I don't care about security or anything. Both devices will be on a private Wi-Fi network. I can't rely on third-party applications though, considering the computer running the server will have very little RAM.
If it's just a simple string, and you don't care too much about security or reliability, you could go about this using java.net.Socket. Create a Socket, giving it your PC's IP address, an open port, etc. Write up an application on the PC to listen on that port and handle the data as needed. I have no idea how you would go about doing the server-side part, as I've only really used Java, but it shouldn't be that hard if you're going with raw sockets. With the Socket on your Android device, create an output stream and pass the string through that stream.
If you need a more secure, more reliable, more standard protocol for delivering a string, I'd probably go with HTTP. You might have to read up more about it, but it's better overall. Sockets aren't a whole lot simpler, though.
You can write a .NET project which communicates with sockets. Your PC could listen on a given port and your phone would connect to that port. This article might help you learn more about .NET socket programming, and you will be able to use C# or Visual Basic as you want.
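A minimal sketch of that listener in C# (the port is arbitrary, and there is no security, which the question allows):

    // A listener that prints each string the phone sends, as it arrives.
    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;

    class StringReceiver
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 5000); // hypothetical port
            listener.Start();
            Console.WriteLine("Listening on port 5000...");

            while (true)
            {
                using (var client = listener.AcceptTcpClient())           // one phone at a time
                using (var reader = new StreamReader(client.GetStream()))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null) // phone sends newline-terminated strings
                        Console.WriteLine(line);               // displayed in real time
                }
            }
        }
    }

On the phone, writing the string plus a newline to the socket's output stream is enough.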
A client of ours has a mobile webcam placed in a forest that is streaming video on a public IP address. Since the webcam has limited bandwidth (and it streams in a format that often requires clients to install a codec), the stream needs to be re-broadcast by a server on a landline, preferably as streaming FLV.
What components can be used to write a client/server that can do this? It would be written using C#.
(Software solutions would be fine too, but we're on a limited budget so it can't be something very expensive...)
What's the format that the camera is sending you?
Rebroadcasting is easy using off-the-shelf servers - which means no programming as such, no C#.
camera -> ffserver -> flash players
ffserver is part of ffmpeg.
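For illustration, a minimal ffserver setup is a config file plus an ffmpeg process that pulls from the camera. The values below are assumptions (and note that ffserver was removed from FFmpeg in 2018, so this needs an older release):

    # ffserver.conf - illustrative values only
    HTTPPort 8090
    HTTPBindAddress 0.0.0.0

    <Feed webcam.ffm>
      File /tmp/webcam.ffm
      FileMaxSize 20M
    </Feed>

    <Stream webcam.flv>
      Feed webcam.ffm
      Format flv
      VideoFrameRate 15
      VideoSize 640x480
      NoAudio
    </Stream>

Then feed it from the camera (the camera URL is hypothetical):

    ffmpeg -i http://camera.example.com/stream http://localhost:8090/webcam.ffm

Flash players can then fetch http://yourserver:8090/webcam.flv directly, with only one connection ever hitting the bandwidth-limited camera.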