I'm trying to make a UWP app that will act as a client and run on a Raspberry Pi 3. The server is a C# WinForms app that runs on my Windows 10 computer, which I've found here: https://www.codeproject.com/Articles/482735/TCP-Audio-Streamer-and-Player-Voice-Chat-over-IP. The server can stream audio from a microphone device to all connected clients. The project includes its own client, and I can run both the server and that client on my local machine. Now I want to build a similar client app in UWP C#. Using the UWP StreamSocketActivity sample, I can connect to the server, but I don't know how to receive the audio data and play it in the UWP client. Could anyone give me a hand?
Below is a screenshot of the running server with one connection from the UWP client:
Client connects to the server
Thanks in advance!
As mentioned in the article, the protocol used to transfer the audio data is a custom one.
Note !!! This is a proprietary project. You can't use my servers or clients with any other standardized servers or clients. I don't use standards like RTCP or SDP.
You can find that code in TcpProtocols.cs, and you will need to port it to your UWP client app. This document shows how to build a basic TCP socket client in UWP, but you also need to modify the code to receive data from the server continuously. The following code may be helpful.
private async void StartClient()
{
    try
    {
        // Create the StreamSocket and establish a connection to the server.
        using (var streamSocket = new Windows.Networking.Sockets.StreamSocket())
        {
            // The server host name that we will be establishing a connection to.
            var hostName = new Windows.Networking.HostName(TxtHostName.Text);
            await streamSocket.ConnectAsync(hostName, TxtPortNumber.Text);

            // Create the DataReader once; disposing it inside the loop would close the socket's input stream.
            using (var reader = new DataReader(streamSocket.InputStream))
            {
                // Partial lets LoadAsync return as soon as some data has arrived.
                reader.InputStreamOptions = InputStreamOptions.Partial;

                while (true)
                {
                    // Load up to 8 KB per iteration (UnconsumedBufferLength is 0 before the first load, so a fixed size is used here).
                    uint numAudioBytes = await reader.LoadAsync(8192);
                    if (numAudioBytes == 0)
                    {
                        break; // the server closed the connection
                    }

                    byte[] audioBytes = new byte[numAudioBytes];
                    reader.ReadBytes(audioBytes);

                    // Parse the custom protocol (see TcpProtocols.cs in the article) to extract the raw audio payload.
                    audioBytes = Convert_Protocol_LH(audioBytes);

                    var pcmStream = audioBytes.AsBuffer().AsStream().AsRandomAccessStream();
                    MediaElementForAudio.SetSource(pcmStream, "audio/x-wav");
                    MediaElementForAudio.Play();
                }
            }
        }
    }
    catch (Exception ex)
    {
        Windows.Networking.Sockets.SocketErrorStatus webErrorStatus =
            Windows.Networking.Sockets.SocketError.GetStatus(ex.GetBaseException().HResult);
    }
}
UPDATE:
RTP Packet
RTSP is recommended and widely used for live audio. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. The protocol is used for establishing and controlling media sessions between end points, and it has several advantages. In this solution, you would need to build an RTSP server and then use the VLC.MediaElement library, or another library that is supported on Windows IoT Core, in your UWP app. But I am not sure whether this library supports RTP.
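For illustration, here is a minimal sketch of playing an RTSP stream from C#. It assumes the LibVLCSharp package rather than the VLC.MediaElement control mentioned above, the rtsp:// URL is a placeholder for your own RTSP server, and I have not verified this exact setup on Windows IoT Core:

using System;
using LibVLCSharp.Shared;

Core.Initialize();
var libVLC = new LibVLC();
var mediaPlayer = new MediaPlayer(libVLC);

// Placeholder URL; point it at the RTSP server you build.
var media = new Media(libVLC, new Uri("rtsp://<your-rtsp-server>:8554/audio"));
mediaPlayer.Play(media);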
In addition, this document shows supported codecs on Windows IoT Core.
Related
I have a UWP app that needs to use UdpClient to receive some data. The code looks very similar to this:
var udp = new UdpClient(port);
var groupEP = new IPEndPoint(IPAddress.Any, port);

while (true)
{
    Trace.WriteLine("Waiting for broadcast");
    byte[] bytes = udp.Receive(ref groupEP);
    Trace.WriteLine($"Received broadcast from {groupEP} :");
    Trace.WriteLine($"  {Encoding.ASCII.GetString(bytes, 0, bytes.Length)}");
}
When I run this code in the UWP app, it blocks at Receive(), never receives anything, and no exceptions are thrown.
If I run the exact same code in a .NET 5 console app, everything works fine.
How can I make this code run in UWP app?
A common reason for this kind of network issue is local network loopback. UWP apps run in a sandbox and are isolated from system resources such as the network and the file system. In other words, UWP apps are not allowed to access the local host address by default. Enabling local network loopback allows UWP apps to access local network resources.
Please also make sure that you've enabled the enterpriseAuthentication and privateNetworkClientServer capabilities in the manifest file.
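For reference, a minimal sketch of the relevant entries in Package.appxmanifest (only the two capabilities mentioned above; the loopback exemption itself is granted separately, for example with CheckNetIsolation.exe LoopbackExempt -a -n=<PackageFamilyName> from an elevated prompt):

<!-- Package.appxmanifest -->
<Capabilities>
  <Capability Name="privateNetworkClientServer" />
  <Capability Name="enterpriseAuthentication" />
</Capabilities>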
I'm pretty new to Android programming, which is why I need your advice.
Current Situation:
I built an Android application (C#) as well as a regular server application (C++) which runs on a Raspberry Pi. Both programs communicate via UDP. The moment the server application receives a signal, it sends out a broadcast message which the Android application is listening for. Everything works just fine until the Android device falls asleep/goes idle, which leads to my question.
Question:
How can I ensure that the Android application's listener keeps working when the device falls asleep? I don't expect complete solutions, just any kind of advice so I don't waste time on wrong approaches.
Research:
- I read about and tried services that keep running in the background, but the service also stopped when the device went to sleep.
- I read about Broadcast Receivers, which allow the application/service to get further information from the system.
- I read about WAKELOCK, which allows me to keep the CPU alive, but for my purpose it would have to be held 'all the time', and that would drain too much energy.
Code that I would like to run in the background:
public void AsyncReceive()
{
    // ...
    Task.Run(() =>
    {
        while (this.isActive)
        {
            byte[] buffer = new byte[1];
            DatagramPacket incoming = new DatagramPacket(buffer, buffer.Length);
            try
            {
                sock.Receive(incoming);
            }
            catch (Exception ex)
            {
                // Exception handling goes here...
            }

            // Communicate with the Android application
            this.FireBroadCastReceivedEvent();
        }
    });
}
Edit
I also need to notify the application about incoming messages (the 'FireBroadCastReceivedEvent()' part of the code). What would be a good way to do that?
I think you should read this link: https://developer.android.com/training/run-background-service/index.html
Hope you find what you are looking for.
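To make that concrete, below is a minimal sketch of a started service in Xamarin.Android (C#) that could host the receive loop from the question; the class name UdpListenerService and the BroadcastReceived event are made up for illustration, and a plain C# event like this is also one simple way to handle the notification asked about in the edit. Keep in mind that a plain started service can still be stopped when the device enters deep sleep, which matches what you observed, so the options in the linked guide (or a foreground service on newer Android versions) are still worth evaluating.

using System;
using Android.App;
using Android.Content;
using Android.OS;

[Service]
public class UdpListenerService : Service
{
    // Raised from the receive loop whenever a broadcast packet arrives,
    // so the rest of the app can subscribe and react (see the edit above).
    public static event EventHandler BroadcastReceived;

    public override IBinder OnBind(Intent intent)
    {
        return null; // not a bound service
    }

    public override StartCommandResult OnStartCommand(Intent intent, StartCommandFlags flags, int startId)
    {
        // Start the existing receive loop here, e.g. AsyncReceive(), and raise
        // BroadcastReceived where FireBroadCastReceivedEvent() is called today.
        return StartCommandResult.Sticky; // ask the system to restart the service if it gets killed
    }
}

// Started from an Activity:
// StartService(new Intent(this, typeof(UdpListenerService)));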
I currently have client socket code in Visual Studio, written in C++, that creates a Bluetooth client socket. Previously, I had written code in Java for a Bluetooth server socket and everything worked, but I want to have the server socket in Unity so that I can receive data into the Unity application and perform a function with it. I decided to rewrite the server socket code in C# to use in Unity, using the InTheHand.dll library. Is this doable in Unity? I don't have Pro, but I understand .NET libraries can be used in the free version. I am getting a lot of errors, but I am unsure whether it is because what I am trying to do is not possible with the free version of Unity or because I'm doing it wrong. Here is the piece of my code where I try to create the Bluetooth server socket in Unity. I have "using InTheHand.Net" and a couple of other includes for the other Bluetooth functionality:
// A full 128-bit UUID is required; this is the standard Serial Port Profile (SPP) UUID.
// Use the same UUID that the client connects with.
Guid UUID = new Guid("00001101-0000-1000-8000-00805F9B34FB");
bool ServerRunning = false;
Stream mStream;

BluetoothListener BTListener = new BluetoothListener(UUID);
BTListener.Start();
ServerRunning = true;

// Blocks until a client connects
BluetoothClient conn = BTListener.AcceptBluetoothClient();
mStream = conn.GetStream();
My C# console app runs on a different machine than my SQL Server 2014. I use ADO.NET to connect to it. How can I detect when the SQL Server machine automatically reboots after installing Windows updates? In my client application I use SystemEvents_SessionEnding, but this does not help me.
I read about connection resiliency, but that does not seem to solve this problem either.
Is there a specific ADO.NET event I can capture? Creating an app on the server that sends UDP is not my preferred solution, and I don't want to use ping etc. either.
I'm really looking for something like an event to react to.
e.g. the notification services: https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlnotificationinfo(v=vs.110).aspx
Thanks!
If you want to see whether the server is down or not, you can use the Ping class.
using System.Net.NetworkInformation;

var ping = new Ping();
var reply = ping.Send("SqlServerIP");

if (reply.Status == IPStatus.Success)
{
    // server is available
}
else
{
    // server is down
}
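Keep in mind that a successful ping only shows that the machine is reachable, not that the SQL Server service is accepting connections again after the reboot. As a complementary check, here is a minimal sketch that probes the service itself over ADO.NET (the connection string is a placeholder you would supply):

using System.Data.SqlClient;

static bool IsSqlServerAvailable(string connectionString)
{
    try
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Open() throws if the instance is unreachable or not yet accepting connections.
            connection.Open();
            return true;
        }
    }
    catch (SqlException)
    {
        return false;
    }
}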
I'm developing a VoIP server-client app using NAudio and sockets.
I've read NAudio's documentation and have tried a lot to get the data from the microphone and send it to the client. The thing is, you can get the data, but you have to save it to a byte array first and then send it, which is almost like sending a file over TCP.
How can I get data from NAudio and send it at the same time, i.e. "stream it", to the client using the UDP protocol?
Thanks in advance.
NAudio has a Network Chat Demo among the examples (if you download the source code), which does a good job of showing how to implement a very simple chat application.
Basically, though, what you want the client to do is this:
void Initialize()
{
    waveIn = new WaveIn();
    waveIn.BufferMilliseconds = 50;
    waveIn.DeviceNumber = inputDeviceNumber;
    waveIn.WaveFormat = codec.RecordFormat;
    waveIn.DataAvailable += waveIn_DataAvailable;
    waveIn.StartRecording();
    ...
}

void waveIn_DataAvailable(object sender, WaveInEventArgs e)
{
    // Encode and send e.Buffer
}
With this you get a byte array every 50 ms (or however long you set your buffer to) and send it to the server. You'll need to encode it, however, since sending the sound unencoded would take up too much bandwidth. NAudio has codecs of its own, so that shouldn't be much of a problem. See here for NAudio's network chat demo.
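As a rough sketch of the sending side (the codec step is left out for brevity, and udpSender, serverEndPoint, the IP address, and the port are placeholders you would set up yourself):

using System.Net;
using System.Net.Sockets;
using NAudio.Wave;

// Hypothetical fields, initialized once alongside waveIn:
// udpSender = new UdpClient();
// serverEndPoint = new IPEndPoint(IPAddress.Parse("192.168.0.10"), 5000);

void waveIn_DataAvailable(object sender, WaveInEventArgs e)
{
    // Only the first e.BytesRecorded bytes of e.Buffer contain valid audio.
    // In a real client you would run them through a codec first, as discussed above.
    udpSender.Send(e.Buffer, e.BytesRecorded, serverEndPoint);
}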
Another thing to consider, if you plan to implement client-to-client VoIP (either via P2P or streamed through the server itself), is a good networking library to handle all the communication. I've used Lidgren on a similar project and it worked pretty well. It's open source, and can easily be set up to fit your needs.
There is a demo in the NAudioDemo app called "Network Chat", which records from the microphone, compresses the audio with a codec, and sends it out via UDP. It also receives audio from UDP, decompresses it, and plays it. So looking at that code should point you in the right direction. What it doesn't show is the use of any protocol on top of UDP, so just the raw compressed audio is sent over the network, with no timestamps or any indication of which codec is being used.
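If it helps, here is a rough sketch of the receiving/playback side using NAudio's BufferedWaveProvider, again with the codec step omitted; the port and the WaveFormat are placeholders and must match whatever the sender actually transmits:

using System.Net;
using System.Net.Sockets;
using NAudio.Wave;

// The format must match the sender's audio after decoding; 16-bit mono 8 kHz is only an example.
var waveFormat = new WaveFormat(8000, 16, 1);
var bufferedProvider = new BufferedWaveProvider(waveFormat);

var waveOut = new WaveOutEvent();
waveOut.Init(bufferedProvider);
waveOut.Play();

var udp = new UdpClient(5000); // placeholder port
var remote = new IPEndPoint(IPAddress.Any, 0);

while (true)
{
    byte[] packet = udp.Receive(ref remote);
    // Decode the packet with the matching codec here, then queue the PCM for playback.
    bufferedProvider.AddSamples(packet, 0, packet.Length);
}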