NAudio proper way to stream MediaFoundationReader - C#

I'm trying to use NAudio to play shoutcast streams. This is doable (apparently) with a few lines of code:
var url = "http://dance.radiomonster.fm/320.mp3";
using (var radioStream = new MediaFoundationReader(url))
using (var wo = new WaveOutEvent())
{
    wo.Init(radioStream);
    wo.Play();
    while (wo.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(1000);
    }
}
Works fine for playback. However, I need to take that as float samples, converted to 48 kHz mono, to be sent off as a byte stream elsewhere. So I do this:
// Convert the wave to 48 kHz, mono
str1 = new WaveFormatConversionProvider(new WaveFormat(48000, 1), radioStream);
// Convert the converted 16-bit samples to floats
str = new Wave16ToFloatProvider(str1);
In previous iterations of this I would do something like the following to retrieve the data:
while ((readResultCount = str.Read(result, 0, result.Length)) > 0)
{
    float[] output = new float[result.Length / sizeof(float)];
    Buffer.BlockCopy(result, 0, output, 0, sizeof(float) * output.Length);
    au.EnqueueEncodeVoiceData(output);
}
However, this results in audio output that skips worse than a CD that's been run over.
I've tried to see whether there was any way to determine if there was enough data in the buffer and avoid reading the stream if not, but there don't seem to be any facilities for this.
What am I doing wrong?

Ah, here's a matter of hidden knowledge.
The loop itself was being called in a timer event tick, which meant that multiple instances of the loop could (and would) get called, since the Read operation apparently blocks until either the requested data is received or the stream ends for whatever reason.
Since the timer ticked regardless of whether the loop had exited or not, it was stepping on its own toes.
I resolved this by removing the timer tick and moving the routine into a Task.
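For reference, a minimal sketch of that shape (assuming str and au from the question; note the float buffer is sized from readResultCount rather than result.Length, so a short read never re-sends stale bytes from the tail of the buffer):
private Task _pumpTask;
private readonly CancellationTokenSource _cts = new CancellationTokenSource();

private void StartPump()
{
    _pumpTask = Task.Run(() =>
    {
        var result = new byte[str.WaveFormat.AverageBytesPerSecond / 10]; // ~100 ms
        int readResultCount;
        // Read blocks until data arrives, so a single long-running loop
        // replaces the timer and only one reader is ever in flight.
        while (!_cts.IsCancellationRequested &&
               (readResultCount = str.Read(result, 0, result.Length)) > 0)
        {
            var output = new float[readResultCount / sizeof(float)];
            Buffer.BlockCopy(result, 0, output, 0, output.Length * sizeof(float));
            au.EnqueueEncodeVoiceData(output);
        }
    });
}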

Related

Uploading Large Files to WCF from Xamarin Android App Crashes

I'm trying to upload a large video (1 GB+) from my Xamarin app, and it keeps crashing once it reaches about 0.5 GB of my file. The only way I've found to post the videos to my WCF service while sending data along with them is the multipart logic, but I'm not sure if I'm running out of memory or what, because even in debug mode it simply crashes without any real error message.
I'm trying to run it on a native device (not a sim) and it's a Samsung Galaxy S9 with Android 9.
Here's the upload code that I'm using. (P.S. As a test, I tried putting the WriteAsync into a for loop, thinking that maybe trying to write the whole gig at once was the problem, but the result was the same. That's why you'll see the MAXFILESIZEPART constant in there, which is just an int equal to 10000000.)
private async Task<byte[]> GetMultipartFormDataAsync(Dictionary<string, object> postParameters, string boundary)
{
    try
    {
        using (Stream formDataStream = new System.IO.MemoryStream())
        {
            bool needsCLRF = false;
            foreach (var param in postParameters)
            {
                // Thanks to feedback from commenters, add a CRLF to allow multiple parameters to be added.
                // Skip it on the first parameter, add it to subsequent parameters.
                if (needsCLRF)
                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes("\r\n"), 0, Encoding.UTF8.GetByteCount("\r\n"));

                needsCLRF = true;

                if (param.Value is FileParameter)
                {
                    FileParameter fileToUpload = (FileParameter)param.Value;

                    // Add just the first part of this param, since we will write the file data directly to the Stream
                    string header = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"; filename=\"{2}\"\r\nContent-Type: {3}\r\n\r\n",
                        boundary,
                        param.Key,
                        fileToUpload.FileName ?? param.Key,
                        fileToUpload.ContentType ?? "application/octet-stream");

                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(header), 0, Encoding.UTF8.GetByteCount(header));

                    // Write the file data directly to the Stream, rather than serializing it to a string.
                    if (fileToUpload.File.Length > MAXFILESIZEPART)
                    {
                        for (var i = 0; i < fileToUpload.File.Length; i += MAXFILESIZEPART)
                        {
                            var len = i + MAXFILESIZEPART > fileToUpload.File.Length
                                ? fileToUpload.File.Length - i
                                : MAXFILESIZEPART;
                            await formDataStream.WriteAsync(fileToUpload.File, i, len);
                        }
                    }
                    else
                    {
                        await formDataStream.WriteAsync(fileToUpload.File, 0, fileToUpload.File.Length);
                    }
                }
                else
                {
                    string postData = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"\r\n\r\n{2}",
                        boundary,
                        param.Key,
                        param.Value);
                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(postData), 0, Encoding.UTF8.GetByteCount(postData));
                }
            }

            // Add the end of the request. Start with a newline
            string footer = "\r\n--" + boundary + "--\r\n";
            await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(footer), 0, Encoding.UTF8.GetByteCount(footer));

            // Dump the Stream into a byte[]
            formDataStream.Position = 0;
            byte[] formData = new byte[formDataStream.Length];
            formDataStream.Read(formData, 0, formData.Length);
            return formData;
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
}
And it eventually fails on the following line
await formDataStream.WriteAsync(fileToUpload.File, i, len);
but only after a certain point (about 500 MB), so I'm assuming it's a memory issue, though it doesn't say so. Is there a better way to accomplish this task? I'm doing it this way so that it also records progress as the upload happens. I'm trying to accomplish something similar to uploading large videos via the Facebook app, where the upload runs in the background while you continue working. It works great with smaller files (i.e. < 500 MB), but this is the first time I've tried a file that is almost a gig in size.
NOTE: This happens BEFORE it starts posting anything to the server so it's not IIS or WCF related. This code crashes just writing the bytes to the memory stream.
Any suggestions?
Thanks!
According to your description, the upload stops at a certain point, and because the file you transfer is about 1 GB, it is likely a SendTimeout issue: if the transfer does not complete within the specified time, an exception is thrown. SendTimeout specifies how long a write operation has to complete before timing out. The default value is 1 minute.
I set SendTimeout to 15 seconds in my configuration file; if the transfer takes more than 15 seconds, an exception occurs. You can set it to a higher value to avoid the timeout and the exception.
For information about SendTimeout, please refer to the following link:
https://learn.microsoft.com/en-us/dotnet/api/system.servicemodel.channels.binding.sendtimeout?view=dotnet-plat-ext-3.1
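If you configure the binding in code rather than in a .config file, the equivalent would look roughly like this (a sketch only; BasicHttpBinding, the address, and IUploadService are assumptions, not taken from the question):
// Sketch: raising SendTimeout on a WCF client binding in code.
// BasicHttpBinding is an assumption; substitute your actual binding.
var binding = new BasicHttpBinding
{
    SendTimeout = TimeSpan.FromMinutes(10), // default is 1 minute
    MaxReceivedMessageSize = int.MaxValue   // allow large messages as well
};
var address = new EndpointAddress("https://example.com/UploadService.svc"); // hypothetical address
var factory = new ChannelFactory<IUploadService>(binding, address);         // IUploadService is hypothetical
var client = factory.CreateChannel();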
UPDATE
I think it might be a memory overflow problem. A large file may exhaust available memory if you try to hold it all at once.
You can refer to the following link for solutions:
https://learn.microsoft.com/en-us/archive/blogs/johan/are-you-getting-outofmemoryexceptions-when-uploading-large-files
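If it does turn out to be memory pressure on the device, a common mitigation is to stop materializing the whole multipart body as a byte[] and stream the file from disk instead. A rough sketch with HttpClient (not the poster's WCF contract; the URL and field name are hypothetical):
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static async Task UploadAsync(string filePath)
{
    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    using (var fileStream = File.OpenRead(filePath))
    {
        // StreamContent reads from disk as it sends, so the 1 GB file
        // never has to sit in memory all at once.
        content.Add(new StreamContent(fileStream), "file", Path.GetFileName(filePath));
        var response = await client.PostAsync("https://example.com/upload", content); // hypothetical URL
        response.EnsureSuccessStatusCode();
    }
}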

FFmpeg possible sws_scale memory leak

I'm decoding whatever codec the camera provides and then always encoding it to H264, more specifically QSV if it is supported. I currently have 2 cameras to test with. One produces H264 and one produces rawvideo. The problem comes with rawvideo: the pixel format is BGR24, and I'm scaling it to NV12.
I will simplify the code, because it is like any other example.
avcodec_send_packet()
// while
avcodec_receive_frame()
// if frame is not EAGAIN, convert BGR24 to NV12

if (_pConvertContext == null)
{
    _pConvertContext = CreateContext(sourcePixFmt, targePixFmt);
}
if (_convertedFrameBufferPtr == IntPtr.Zero)
{
    int buffSize = ffmpeg.av_image_get_buffer_size(targePixFmt, sourceFrame->width, sourceFrame->height, 1);
    _convertedFrameBufferPtr = Marshal.AllocHGlobal(buffSize);
    ffmpeg.av_image_fill_arrays(ref _dstData, ref _dstLinesize, (byte*)_convertedFrameBufferPtr, targePixFmt, sourceFrame->width, sourceFrame->height, 1);
}
return ScaleImage(_pConvertContext, sourceFrame, targePixFmt, _dstData, _dstLinesize);
And the ScaleImage method:
ffmpeg.sws_scale(ctx, sourceFrame->data, sourceFrame->linesize, 0, sourceFrame->height, dstData, dstLinesize);

AVFrame* f = ffmpeg.av_frame_alloc();
var data = new byte_ptrArray8();
data.UpdateFrom(dstData);
var linesize = new int_array8();
linesize.UpdateFrom(dstLinesize);
f->data = data;
f->linesize = linesize;
f->width = sourceFrame->width;
f->height = sourceFrame->height;
f->format = (int)targePixelFormat;
return f;
After that I send the scaled frame to the encoder, receive it back, and write the file. Then I call av_frame_free(&frame) on the frame returned from the method. But when I set a breakpoint I can see that the address of the frame is the same even after calling av_frame_alloc() and cleaning up every time, and I think this is the reason for the memory leak. If I do a deep clone of f before returning it, everything is fine. Why does that happen here, while the same logic works fine with the other camera?
Well, after 3 lost days of trying to figure out why it happens with certain cameras only, I made a memory dump using DebugDiag 2 with its memory leak tracker. It turned out there were tons of memory allocations by _aligned_offset_malloc, as you can see in the picture below. I was using ffmpeg 4.2.2; downgrading to 4.0.2 fixed the problem for me. No memory leak now. I'm using the 32-bit version.
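If downgrading isn't an option, another shape worth trying (a sketch with FFmpeg.AutoGen, mirroring the overloads the question's code already uses, not a verified fix for this particular leak) is to let the destination frame own its pixel buffers via av_frame_get_buffer, so that av_frame_free releases them along with the frame:
// Sketch: allocate the destination frame's buffers through FFmpeg itself,
// so av_frame_free(&dst) later releases the pixel data with the frame.
AVFrame* dst = ffmpeg.av_frame_alloc();
dst->format = (int)targePixFmt;
dst->width = sourceFrame->width;
dst->height = sourceFrame->height;
ffmpeg.av_frame_get_buffer(dst, 0); // frame now owns refcounted buffers
ffmpeg.sws_scale(ctx, sourceFrame->data, sourceFrame->linesize, 0,
                 sourceFrame->height, dst->data, dst->linesize);
return dst;
This removes the Marshal.AllocHGlobal buffer and the shared _dstData arrays entirely, so each frame's memory has a single, well-defined owner.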

Reusing TcpClient and NetworkStream results in wrong data

I'm writing a screen mirroring app in WPF. My original code sends a bitmap over TCP from a server to a client. It works fine, but it closes and recreates the TCP connection every time it sends a frame. That results in 30 socket opens and closes per second, which I assume isn't the ideal way to do it.
So I tried to rewrite the code to reuse the stream each time it sends data, but now the stream starts to spit out wrong data after a while.
public void SendStream(byte[] byteArray)
{
    /*
    _client = IsServer ? _server.AcceptTcpClient() : new TcpClient(IP.ToString(), Port);
    using (var clientStream = _client.GetStream())
    {
        var comp = Compress(byteArray);
        clientStream.Write(comp, 0, comp.Length);
    }
    */
    var comp = Compress(byteArray);
    _stream.Write(BitConverter.GetBytes(comp.Length), 0, 4);
    _stream.Write(comp, 0, comp.Length);
}

public byte[] ReceiveStream()
{
    /*
    _client = IsServer ? _server.AcceptTcpClient() : new TcpClient(IP.ToString(), Port);
    var stream = _client.GetStream();
    return Decompress(stream);
    */
    var lengthByte = new byte[4];
    _stream.Read(lengthByte, 0, 4);
    var length = BitConverter.ToInt32(lengthByte, 0);
    var data = new byte[length];
    _stream.Read(data, 0, length);
    return Decompress(new MemoryStream(data));
}
The Compress and Decompress functions are just wrappers around the built-in DeflateStream.
I have checked that the sent comp.Length and received length are the same when the error happens.
Any ideas on what's going on? Thanks. It always throws an exception after at least a few frames, never on the first one (at least in my tests so far).
(It seems to happen faster when the bitmaps are larger, i.e. when the compression algorithm can't shrink them as much because the screen is more complicated. Not 100% sure, though.)
Try doing the following:
int receivedBytesCount = _stream.Read(data, 0, length);
The length variable you pass to the Read method is a maximum: Read may return fewer bytes than length, and it returns the number of bytes it actually read. This happens when your data is fragmented into TCP packets.
You need to keep calling Read until you have received enough bytes, combining the pieces to get the full frame. You also need to advance the offset as you go, to avoid overwriting the start of the buffer; in the code you posted it is hardcoded to 0.
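A helper along these lines (a sketch; the name ReadExactly is mine, not from the question) does the looping for both the 4-byte length prefix and the payload:
static void ReadExactly(Stream stream, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        // Read returns as soon as *some* data is available,
        // so loop until the requested number of bytes has arrived.
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed mid-frame");
        offset += read;
    }
}
ReceiveStream would then call ReadExactly(_stream, lengthByte, 4) for the prefix and ReadExactly(_stream, data, length) for the payload.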

C# BroadCast Mp3 File To ShoutCast Server

I'm trying to make a radio-like auto DJ that plays a list of MP3 files in series, like a real radio station does.
I tried a lot of workarounds, and finally thought of sending the MP3 files to a SHOUTcast server and playing the output of that server. My problem is that I don't know how to do that.
I have tried bass.radio, using BASS.NET, and this is my code:
private int _recHandle;
private BroadCast _broadCast;
EncoderLAME l;
IStreamingServer server = null;

// Init Bass
Bass.BASS_Init(-1, 44100, BASSInit.BASS_DEVICE_DEFAULT, IntPtr.Zero);

// create the stream
int _stream = Bass.BASS_StreamCreateFile("1.mp3", 0, 0,
    BASSFlag.BASS_SAMPLE_FLOAT | BASSFlag.BASS_STREAM_PRESCAN);

l = new EncoderLAME(_stream);
l.InputFile = null; // STDIN
l.OutputFile = null;
l.Start(null, IntPtr.Zero, false);

// decode the stream (if not using a decoding channel, simply call "Bass.BASS_ChannelPlay" here)
byte[] encBuffer = new byte[65536]; // our dummy encoder buffer
while (Bass.BASS_ChannelIsActive(_stream) == BASSActive.BASS_ACTIVE_PLAYING)
{
    // getting sample data will automatically feed the encoder
    int len = Bass.BASS_ChannelGetData(_stream, encBuffer, encBuffer.Length);
}
//l.Stop(); // finish
//Bass.BASS_StreamFree(_stream);

// Server
SHOUTcast shoutcast = new SHOUTcast(l);
shoutcast.ServerAddress = "50.22.219.37";
shoutcast.ServerPort = 12904;
shoutcast.Password = "01008209907";
shoutcast.PublicFlag = true;
shoutcast.Genre = "Hörspiel";
shoutcast.StationName = "Kravis Server";
shoutcast.Url = "";
shoutcast.Aim = "";
shoutcast.Icq = "";
shoutcast.Irc = "";
server = shoutcast;
server.SongTitle = "BASS.NET";

// disconnect, if connected
if (_broadCast != null && _broadCast.IsConnected)
{
    _broadCast.Disconnect();
}
_broadCast = null;
GC.Collect();

_broadCast = new BroadCast(server);
_broadCast.Notification += OnBroadCast_Notification;
_broadCast.AutoReconnect = true;
_broadCast.ReconnectTimeout = 5;
_broadCast.AutoConnect();
But my file never gets streamed to the server, even though _broadCast is connected.
Any solution, in code or otherwise, would be appreciated.
I haven't used BASS in many years, so I can't give you specific advice on the code you have there. But I wanted to give you the gist of what needs to happen... it might help you get started.
As your file is in MP3, it is possible to send it directly to the server and hear it on the receiving end. However, there are a few problems with that. The first is rate control: if you simply transmit the file data, you'll send, say, 5 minutes of audio in perhaps a 10-second window. This will eventually cause failures, as the clients won't buffer much data and will disconnect. Another problem is that your MP3 files often carry extra data in the form of ID3 tags; some players will ignore this, others won't. Finally, some of your files might be in different sample rates than others, so even if you rate-limit your sending, players will break when they hit a file with a different sample rate.
What needs to happen is the generation of a fresh stream. The pipeline looks something like this:
[Source File] -> [Codec] -> [Raw PCM Audio] -> [Codec] -> [MP3 Stream] -> [SHOUTcast Server] -> [Clients]
Additionally, that raw PCM audio step needs to run at a realtime rate. While your computer can certainly decode and encode faster than realtime, the pipeline needs to be paced at realtime so that listeners can play it in realtime.
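To give a feel for the rate-control part, here is a rough pacing sketch (library-agnostic; pcm and send are hypothetical stand-ins for the decoded audio source and the encoder feed):
// Sketch: pace raw PCM at a realtime rate so the server receives
// one second of audio per second of wall-clock time.
static void PumpRealtime(Stream pcm, Action<byte[], int> send, int bytesPerSecond)
{
    var sw = System.Diagnostics.Stopwatch.StartNew();
    long bytesSent = 0;
    byte[] chunk = new byte[bytesPerSecond / 10]; // ~100 ms of audio
    int read;
    while ((read = pcm.Read(chunk, 0, chunk.Length)) > 0)
    {
        send(chunk, read);
        bytesSent += read;
        // Sleep until wall-clock time catches up with the audio time sent.
        double ahead = (double)bytesSent / bytesPerSecond - sw.Elapsed.TotalSeconds;
        if (ahead > 0)
            System.Threading.Thread.Sleep(TimeSpan.FromSeconds(ahead));
    }
}
For 44.1 kHz, 16-bit stereo PCM, bytesPerSecond would be 44100 * 2 * 2 = 176400.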

NetworkStream.Read() not correctly reading after first iteration of loop

I'm having a problem where my code works as expected when I'm stepping through it, but reads incorrect data when running normally. I thought the problem might be timing; however, NetworkStream.Read() should block, and I also tested this by putting the thread to sleep for 1000 ms (more than enough time, and more time than I was giving it while stepping through).
The purpose of the code (and what it does when stepping through) is to read a bitmap image into a buffer; the image is preceded by a string containing the image size in bytes, followed by a carriage return and a newline. I believe the problem lies in the read statements, but I can't be sure. The following code is contained within a larger loop that also performs Telnet reads; however, I have not had a problem with those, and they only read ASCII strings, no binary data.
List<byte> len = new List<byte>();
byte[] b = new byte[2];
while (!Encoding.ASCII.GetString(b).Equals("\r\n"))
{
    len.Add(b[0]);
    b[0] = b[1];
    b[1] = (byte)stream.ReadByte();
}
len = len.FindAll(x => x != 0);
len.Add((byte)0);
string lenStr = Encoding.ASCII.GetString(len.ToArray());
int imageSize = int.Parse(lenStr);
byte[] imageIn = new byte[imageSize];
stream.Read(imageIn, 0, imageSize);
using (MemoryStream g = new MemoryStream(imageIn))
{
    g.Position = 0;
    bmp = (Bitmap)Image.FromStream(g);
}
The actual problem is that the first time the code runs it correctly receives the length and the image, but on consecutive reads it does not seem to recognize the \r\n. That may only be a symptom, though, and not the problem itself.
Thanks in advance!
EDIT:
So I did narrow the problem down, and managed to fix it by adding an artificial delay between my Telnet call using NetworkStream.Write() to request the image and the NetworkStream.Read() to retrieve it. However, this solution is messy, and I would still like to know why the issue happens.
The Read() operation returns the number of bytes actually read. It only blocks when there is no data to read, and it can return fewer bytes than specified by the count parameter.
You can easily fix this by putting the read inside a loop:
byte[] imageIn = new byte[imageSize];
int remaining = imageSize;
int offset = 0;
while (remaining > 0)
{
    int read = stream.Read(imageIn, offset, remaining);
    if (read == 0) throw new Exception("Connection closed before expected data was read");
    offset += read;
    remaining -= read;
}
