Capturing digital TV and streaming it on the local network - C#

I need to capture digital TV signals and stream the TV channels on the local network (via HTTP, RTSP, or a similar protocol). I'm using CodeTV to find TV channels, capture them, and decode them. The project uses DirectShow.Net to do that. I found Vlc.DotNet helpful for streaming data on the local network. The problem is that I'm not familiar with DirectShow and I can't figure out how to get the video stream and hand it to the Vlc library.
I tried to replace the code that records the video stream with the following, but "bs" remains null.
IStream bs = this.currentGraphBuilder as IStream; // this cast is what ends up null

var currentDirectory = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
currentDirectory = Path.Combine(currentDirectory, "libvlc");
var libDirectory = new DirectoryInfo(Path.Combine(currentDirectory,
    IntPtr.Size == 4 ? "win-x86" : "win-x64"));

using (Vlc.DotNet.Core.VlcMediaPlayer mediaPlayer = new Vlc.DotNet.Core.VlcMediaPlayer(libDirectory))
{
    var mediaOptions = new[]
    {
        ":sout=#rtp{sdp=rtsp://127.0.0.1:554/}",
        ":sout-keep"
    };
    mediaPlayer.SetMedia(bs as Stream, mediaOptions);
    mediaPlayer.Play();
}
I don't know whether I need to create a filter and add it to the graph, or whether there is a simpler way.
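A side note on why "bs" stays null: the DirectShow graph builder is not a System.IO.Stream, so the as-cast can never succeed. One possible direction (my assumption, not something CodeTV provides out of the box) is to have the graph deliver raw MPEG-TS buffers, for example through a sample-grabber-style filter, and feed those bytes to Vlc.DotNet through a real Stream such as an anonymous pipe. A rough sketch:

var pipe = new System.IO.Pipes.AnonymousPipeServerStream(System.IO.Pipes.PipeDirection.Out);
var readEnd = new System.IO.Pipes.AnonymousPipeClientStream(
    System.IO.Pipes.PipeDirection.In, pipe.ClientSafePipeHandle);

// Placeholder: call this from wherever your graph hands you MPEG-TS buffers.
void OnTsBuffer(byte[] buffer, int length) => pipe.Write(buffer, 0, length);

using (var mediaPlayer = new Vlc.DotNet.Core.VlcMediaPlayer(libDirectory))
{
    mediaPlayer.SetMedia(readEnd,                // a real System.IO.Stream, unlike the COM IStream cast
        ":sout=#rtp{sdp=rtsp://0.0.0.0:554/tv}", // bind all interfaces so LAN clients can connect (loopback-only in the original)
        ":sout-keep");
    mediaPlayer.Play();
    Console.ReadLine(); // keep the process (and the stream) alive
}

This keeps the existing CodeTV graph in charge of tuning and decoding; whether it is simpler than letting libVLC open the tuner itself is something you would have to try.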

Related

How to record an input device with more than 2 channels to mp3 format

I am building recording software that records all devices connected to the PC into mp3 format.
Here is my code:
IWaveIn _captureInstance = inputDevice.DataFlow == DataFlow.Render ?
    new WasapiLoopbackCapture(inputDevice) : new WasapiCapture(inputDevice);

var waveFormatToUse = _captureInstance.WaveFormat;
var sampleRateToUse = waveFormatToUse.SampleRate;
var channelsToUse = waveFormatToUse.Channels;

if (sampleRateToUse > 48000) // LameMP3FileWriter doesn't support a rate higher than 48000 Hz
{
    sampleRateToUse = 48000;
}
else if (sampleRateToUse < 8000) // LameMP3FileWriter doesn't support a rate lower than 8000 Hz
{
    sampleRateToUse = 8000;
}
if (channelsToUse > 2) // LameMP3FileWriter doesn't support more than 2 channels
{
    channelsToUse = 2;
}

waveFormatToUse = WaveFormat.CreateCustomFormat(_captureInstance.WaveFormat.Encoding,
    sampleRateToUse,
    channelsToUse,
    _captureInstance.WaveFormat.AverageBytesPerSecond,
    _captureInstance.WaveFormat.BlockAlign,
    _captureInstance.WaveFormat.BitsPerSample);

_mp3FileWriter = new LameMP3FileWriter(_currentStream, waveFormatToUse, 32);
This code works properly, except when a connected device (including virtual ones, such as SteelSeries Sonar) has more than 2 channels.
With more than 2 channels, the recordings contain nothing but noise.
How can I solve this issue? I'm not required to use LameMP3FileWriter; I just need mp3 or any other format with good compression. Ideally this would work without saving intermediate files to disk (all processing in memory), producing only the final audio file.
My recording code:
// When the capturer receives audio, write the buffer into the mentioned file
_captureInstance.DataAvailable += (s, a) =>
{
    lock (_writerLock)
    {
        // Write buffer into the file of the writer instance
        _mp3FileWriter?.Write(a.Buffer, 0, a.BytesRecorded);
    }
};

// When the capturer stops, dispose the capturer and writer instances
_captureInstance.RecordingStopped += (s, a) =>
{
    lock (_writerLock)
    {
        _mp3FileWriter?.Dispose();
    }
    _captureInstance?.Dispose();
};

// Start audio recording
_captureInstance.StartRecording();
If LAME doesn't support more than 2 channels, you can't use this encoder for your purpose. Have you tried it with the Fraunhofer surround MP3 encoder?
Link: https://download.cnet.com/mp3-surround-encoder/3000-2140_4-165541.html
Also, here's a nice article discussing how to convert between most audio formats (with C# code samples): https://www.codeproject.com/articles/501521/how-to-convert-between-most-audio-formats-in-net
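Another option (not from the answer above) is to downmix the capture buffer to stereo before it reaches LameMP3FileWriter, so a plain stereo MP3 encoder is enough. A rough sketch of a replacement DataAvailable handler, assuming the WASAPI capture delivers 32-bit IEEE float samples (the usual shared-mode format) and that _mp3FileWriter was created with WaveFormat.CreateIeeeFloatWaveFormat(sampleRateToUse, 2); if NAudio.Lame rejects a float input format, convert the samples to 16-bit PCM first:

// Downmix an N-channel float32 WASAPI buffer to stereo before encoding.
_captureInstance.DataAvailable += (s, a) =>
{
    int inChannels = _captureInstance.WaveFormat.Channels;
    if (inChannels <= 2)
    {
        lock (_writerLock) { _mp3FileWriter?.Write(a.Buffer, 0, a.BytesRecorded); }
        return;
    }

    int frames = a.BytesRecorded / (4 * inChannels); // 4 bytes per float sample
    byte[] stereo = new byte[frames * 2 * 4];
    for (int f = 0; f < frames; f++)
    {
        float left = 0f, right = 0f;
        for (int c = 0; c < inChannels; c++)
        {
            float sample = BitConverter.ToSingle(a.Buffer, (f * inChannels + c) * 4);
            if (c % 2 == 0) left += sample; else right += sample; // crude even/odd channel split
        }
        // Scale down so summing several channels doesn't clip.
        left /= (inChannels + 1) / 2;
        right /= Math.Max(1, inChannels / 2);
        BitConverter.GetBytes(left).CopyTo(stereo, f * 8);
        BitConverter.GetBytes(right).CopyTo(stereo, f * 8 + 4);
    }

    lock (_writerLock) { _mp3FileWriter?.Write(stereo, 0, stereo.Length); }
};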

Playing from wav stream

var result = service.Synthesize(
    text: text,
    accept: "audio/wav",
    voice: "en-US_AllisonVoice"
    //voice: "en-US_HenryV3Voice"
);

using (FileStream fs = File.Create(@"C:\Users\nkk01\Desktop\voice.wav"))
{
    result.Result.WriteTo(fs);
    fs.Close();
    result.Result.Close();
}

var waveStream = new WaveFileReader(@"C:\Users\nkk01\Desktop\voice.wav");
var waveOut = new WaveOutEvent();
waveOut.Init(waveStream);

Console.WriteLine("Playing");
waveOut.Play();
Console.WriteLine("Finished playing");
Hi, this is my current code, which is inefficient because it has to save the audio stream to a file before playing it. I would like to pass the audio stream directly to my laptop's speakers using the NAudio library, but I still have not managed to find a solution. Any help would be greatly appreciated, thanks.
I'm not familiar with NAudio, but as far as I can see from their GitHub repo, the constructor for WaveFileReader also accepts a stream instead of a filename.
Simply try writing everything you have to a memory stream instead of a file, and don't forget to reposition the memory stream before you read it (seek to offset 0).
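Something like this (a sketch of that suggestion, reusing the names from the question; it assumes result.Result supports WriteTo(Stream), as in your existing code):

using (var ms = new MemoryStream())
{
    result.Result.WriteTo(ms);
    ms.Seek(0, SeekOrigin.Begin); // reposition before reading

    using (var waveStream = new WaveFileReader(ms)) // WaveFileReader also accepts a Stream
    using (var waveOut = new WaveOutEvent())
    {
        waveOut.Init(waveStream);
        waveOut.Play();

        // WaveOutEvent plays asynchronously; wait until playback finishes.
        while (waveOut.PlaybackState == PlaybackState.Playing)
        {
            System.Threading.Thread.Sleep(100);
        }
    }
}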

C# BroadCast Mp3 File To ShoutCast Server

I'm trying to make a radio-like auto DJ that plays a list of mp3 files in series, like what happens on the radio.
I tried a lot of workarounds, but finally I thought of sending the mp3 files to a SHOUTcast server and playing that server's output. My problem is that I don't know how to do that.
I have tried the BASS radio approach using BASS.NET, and this is my code:
private int _recHandle;
private BroadCast _broadCast;
EncoderLAME l;
IStreamingServer server = null;

// Init Bass
Bass.BASS_Init(-1, 44100, BASSInit.BASS_DEVICE_DEFAULT, IntPtr.Zero);

// create the stream
int _stream = Bass.BASS_StreamCreateFile("1.mp3", 0, 0,
    BASSFlag.BASS_SAMPLE_FLOAT | BASSFlag.BASS_STREAM_PRESCAN);

l = new EncoderLAME(_stream);
l.InputFile = null; // STDIN
l.OutputFile = null;
l.Start(null, IntPtr.Zero, false);

// decode the stream (if not using a decoding channel, simply call "Bass.BASS_ChannelPlay" here)
byte[] encBuffer = new byte[65536]; // our dummy encoder buffer
while (Bass.BASS_ChannelIsActive(_stream) == BASSActive.BASS_ACTIVE_PLAYING)
{
    // getting sample data will automatically feed the encoder
    int len = Bass.BASS_ChannelGetData(_stream, encBuffer, encBuffer.Length);
}
//l.Stop(); // finish
//Bass.BASS_StreamFree(_stream);

// Server
SHOUTcast shoutcast = new SHOUTcast(l);
shoutcast.ServerAddress = "50.22.219.37";
shoutcast.ServerPort = 12904;
shoutcast.Password = "01008209907";
shoutcast.PublicFlag = true;
shoutcast.Genre = "Hörspiel";
shoutcast.StationName = "Kravis Server";
shoutcast.Url = "";
shoutcast.Aim = "";
shoutcast.Icq = "";
shoutcast.Irc = "";
server = shoutcast;
server.SongTitle = "BASS.NET";

// disconnect, if connected
if (_broadCast != null && _broadCast.IsConnected)
{
    _broadCast.Disconnect();
}
_broadCast = null;
GC.Collect();

_broadCast = new BroadCast(server);
_broadCast.Notification += OnBroadCast_Notification;
_broadCast.AutoReconnect = true;
_broadCast.ReconnectTimeout = 5;
_broadCast.AutoConnect();
But my file doesn't get streamed to the server, even though _broadCast is connected.
Any code solution, or anything else I can try, would be appreciated.
I haven't used BASS in many years, so I can't give you specific advice on the code you have there. But, I wanted to give you the gist of the process of what you need to do... it might help you get started.
As your file is in MP3, it is possible to send it directly to the server and hear it on the receiving end. However, there are a few problems with that. The first is rate control. If you simply transmit the file data, you'll send say 5 minutes of data in perhaps a 10 second time period. This will eventually cause failures as the clients aren't going to buffer much data, and they will disconnect. Another problem is that your MP3 files often have extra data in them in the form of ID3 tags. Some players will ignore this, others won't. Finally, some of your files might be in different sample rates than others, so even if you rate limit your sending, the players will break when they hit a file in a different sample rate.
What needs to happen is the generation of a fresh stream. The pipeline looks something like this:
[Source File] -> [Codec] -> [Raw PCM Audio] -> [Codec] -> [MP3 Stream] -> [SHOUTcast Server] -> [Clients]
Additionally, that raw PCM audio step needs to run at a realtime rate. While your computer can certainly decode and encode faster than realtime, it needs to run at realtime so that the players can listen in realtime.
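To make the rate-control point concrete, here is a library-agnostic sketch of the pacing loop; DecodeNextPcmChunk and EncodeAndSendToShoutcast are placeholders (not BASS.NET calls), only the throttling logic is the point:

int sampleRate = 44100, channels = 2, bytesPerSample = 2;
int bytesPerSecond = sampleRate * channels * bytesPerSample; // raw PCM data rate
byte[] pcmChunk = new byte[bytesPerSecond / 10];             // roughly 100 ms of audio

var clock = System.Diagnostics.Stopwatch.StartNew();
long pcmBytesProcessed = 0;

while (DecodeNextPcmChunk(pcmChunk))     // placeholder: fill pcmChunk from the current file's decoder
{
    EncodeAndSendToShoutcast(pcmChunk);  // placeholder: MP3-encode the chunk and push it to the server
    pcmBytesProcessed += pcmChunk.Length;

    // On average, push only one second of audio per real second.
    double audioSeconds = (double)pcmBytesProcessed / bytesPerSecond;
    double aheadBySeconds = audioSeconds - clock.Elapsed.TotalSeconds;
    if (aheadBySeconds > 0)
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(aheadBySeconds));
}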

c# cscore how to instantly play loopback capture recorded bytes

Yo guys, it's me again with my noob questions. This time I've used CSCore to record Windows sounds, then send the recorded bytes to another PC over sockets and play them there.
I just could not figure out how to play the bytes received in the DataAvailable callback...
I've tried writing the received bytes to a file and playing that file; that worked, but the sound doesn't play correctly, as if some unexpected sounds can be heard along with it.
So here's my code:
WasapiCapture capture = new WasapiLoopbackCapture();
capture.Initialize();
capture.DataAvailable += (s, e) =>
{
    WaveWriter w = new WaveWriter("file.mp3", capture.WaveFormat);
    w.Write(e.Data, e.Offset, e.ByteCount);
    w.Dispose();

    MemoryStream stream = new MemoryStream(File.ReadAllBytes("file.mp3"));
    SoundPlayer player = new SoundPlayer(stream);
    player.Play();
    stream.Dispose();
};
capture.Start();
Any help would be highly appreciated ;-;.
If you want to hear how the sound comes out this way, I can record the result for you.
NOTE: if I just record the sounds to a file and open it later, it works perfectly, but if I write and play instantly, unexpected sounds can be heard.
Use the SoundInSource as an adapter.
var capture = new WasapiCapture(...);
capture.Initialize(); // initialize always first!!!!

var soundInSource = new SoundInSource(capture)
{
    // set FillWithZeros to true, to prevent WasapiOut from stopping
    // when WasapiCapture does not serve any data
    FillWithZeros = true
};

var soundOut = new WasapiOut();
soundOut.Initialize(soundInSource);
soundOut.Play();

capture.Start(); // don't forget to actually start the capture
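For the other half of the question, playing the bytes after they arrive on the second PC, the same pull-based idea can be used with a writable buffer instead of SoundInSource. A rough sketch, assuming CSCore's WriteableBufferingSource and that the receiver knows the sender's capture WaveFormat (hard-coded here as a placeholder; in practice send the format over the socket first):

// Receiving side: push incoming network bytes into a buffering source and let WasapiOut pull from it.
var format = new WaveFormat(48000, 32, 2, AudioEncoding.IeeeFloat); // placeholder: must match the sender's capture.WaveFormat
var networkSource = new WriteableBufferingSource(format) { FillWithZeros = true };

var soundOut = new WasapiOut();
soundOut.Initialize(networkSource);
soundOut.Play();

// In your socket receive loop:
// int read = socket.Receive(receiveBuffer);
// networkSource.Write(receiveBuffer, 0, read);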

How to convert any audio format to mp3 using NAudio

public void AudioConvert()
{
    FileStream fs = new FileStream(InputFileName, FileMode.Open, FileAccess.Read);
    NAudio.Wave.WaveFormat format = new NAudio.Wave.WaveFormat();
    NAudio.Wave.WaveStream rawStream = new RawSourceWaveStream(fs, format);
    NAudio.Wave.WaveStream wsDATA = WaveFormatConversionStream.CreatePcmStream(rawStream);
    WaveStream wsstream = wst.CanConvertPcmToMp3(2, 44100);
    .....
}

// Here is the class
public class WaveFormatConversionStreamTests
{
    public WaveStream CanConvertPcmToMp3(int channels, int sampleRate)
    {
        WaveStream ws = CanCreateConversionStream(
            new WaveFormat(sampleRate, 16, channels),
            new Mp3WaveFormat(sampleRate, channels, 0, 128000 / 8));
        return ws;
    }
}
Here, I am trying to convert any audio format to mp3, but my code throws an "ACMNotPossible" exception at the CanConvertPcmToMp3 call. I am using the NAudio 1.6 dll. Right now I am working on Windows 7. Please tell me where I went wrong in this code.
WaveFormatConversionStream is a wrapper around the Windows ACM APIs, so you can only use it to make MP3s if you have an ACM MP3 encoder installed. Windows does not ship with one of these. The easiest way to make MP3s is simply to use LAME.exe. I explain how to do this in C# in this article.
Also, if you are using the alpha of NAudio 1.7 and are on Windows 8 then you might be able to use the MP3 encoder which seems to come with Windows 8 as a Media Foundation Transform. Use the MediaFoundationEncoder (the NAudio WPF demo shows how to do this).
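For the Media Foundation route, a minimal sketch (assuming NAudio 1.7+ on Windows 8 or later, where an MP3 encoder MFT is available) could look like this:

using NAudio.MediaFoundation;
using NAudio.Wave;

// Decode the input with AudioFileReader and re-encode it as MP3 via Media Foundation
// (throws if no suitable MP3 encoder MFT is installed on the machine).
MediaFoundationApi.Startup();
using (var reader = new AudioFileReader(InputFileName))
{
    MediaFoundationEncoder.EncodeToMp3(reader, "output.mp3", 128000);
}
MediaFoundationApi.Shutdown();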
