SoundEffect class in a background agent - C#

I want to be able to play an audio file (a sound, basically) from the background agent. I am using the following two approaches:
1)
SoundEffectInstance ClockTickInstance;
StreamResourceInfo ClockTickStream;
SoundEffect ClockTickSound;
try
{
    ClockTickStream = Application.GetResourceStream(new Uri(
        @"AudioFiles/NewHighScore.wav", UriKind.Relative));
    ClockTickSound = SoundEffect.FromStream(ClockTickStream.Stream);
    ClockTickInstance = ClockTickSound.CreateInstance();
    ClockTickInstance.IsLooped = true;
    ClockTickInstance.Volume = 1.0f;
    ClockTickInstance.Pitch = 1.0f;
    ClockTickInstance.Play();
    ClockTickInstance.Stop();
}
catch (Exception) { }
OR
2)
var localFolder = Package.Current.InstalledLocation;
Stream fileStream = await localFolder.OpenStreamForReadAsync("NewHighScore.wav");
byte[] buffer = new byte[fileStream.Length];
fileStream.Read(buffer, 0, System.Convert.ToInt32(fileStream.Length));
fileStream.Close();
SoundEffect soundefct = new SoundEffect(buffer, 16000, AudioChannels.Mono);
FrameworkDispatcher.Update();
soundefct.Play();
When I run the code from the xaml.cs file (i.e. from the foreground app), everything works fine and the sound is played. But from the background agent the code runs, yet no sound is heard. What could be the problem?
The following article lists the APIs that can be used while the app is running in the background:
http://msdn.microsoft.com/en-us/library/windowsphone/develop/jj662941(v=vs.105).aspx
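One detail worth checking in approach 2 regardless of the background-agent restriction: `Stream.Read` is not guaranteed to fill the buffer in a single call, so a single `Read` can hand `SoundEffect` a partially filled buffer. A read-to-end loop avoids that; here is a minimal sketch using only standard streams (the random bytes stand in for the WAV file contents):

```csharp
using System;
using System.IO;

class ReadAllDemo
{
    // Read a stream fully even when Read returns fewer bytes than requested.
    static byte[] ReadFully(Stream input)
    {
        using (var ms = new MemoryStream())
        {
            byte[] chunk = new byte[4096];
            int read;
            // Read returns 0 only at end of stream; loop until then.
            while ((read = input.Read(chunk, 0, chunk.Length)) > 0)
                ms.Write(chunk, 0, read);
            return ms.ToArray();
        }
    }

    static void Main()
    {
        // Stand-in for the WAV file stream; any Stream behaves the same way.
        var data = new byte[10000];
        new Random(1).NextBytes(data);
        byte[] result = ReadFully(new MemoryStream(data));
        Console.WriteLine(result.Length); // 10000
    }
}
```

The same loop works for the `OpenStreamForReadAsync` result in the original code.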

Related

MediaStreamSource video streaming in UWP

I just started to experiment with MediaStreamSource in UWP.
I took the MediaStreamSource streaming example from MS and tried to rewrite it to support mp4 instead of mp3.
I changed nothing but the InitializeMediaStreamSource part; it now looks like this:
{
    var clip = await MediaClip.CreateFromFileAsync(inputMP3File);
    var audioTrack = clip.EmbeddedAudioTracks.First();
    var property = clip.GetVideoEncodingProperties();

    // initialize parsing variables
    byteOffset = 0;
    timeOffset = new TimeSpan(0);

    var videoDescriptor = new VideoStreamDescriptor(property);
    var audioDescriptor = new AudioStreamDescriptor(audioTrack.GetAudioEncodingProperties());

    MSS = new MediaStreamSource(videoDescriptor)
    {
        Duration = clip.OriginalDuration
    };

    // hooking up the MediaStreamSource event handlers
    MSS.Starting += MSS_Starting;
    MSS.SampleRequested += MSS_SampleRequested;
    MSS.Closed += MSS_Closed;

    media.SetMediaStreamSource(MSS);
}
My problem is that I cannot find a single example where video streams are used instead of audio, so I can't figure out what's wrong with my code. If I set the MediaElement's Source property to the given mp4 file, it works like a charm. If I pick an mp3 and leave the videoDescriptor out, it works as well. But if I try the same with a video, nothing happens. (I'm still not sure whether I should add the audioDescriptor as a second argument to the MediaStreamSource, but since I've got one mixed stream, I guess it's not needed.) The SampleRequested event is triggered and no error is thrown, which makes it really hard to debug. :S
I have a solution for building a working video MediaStreamSource from file bitmaps, but unfortunately I have not found a solution for an RGBA buffer.
First of all, read the MediaStreamSource class documentation: https://learn.microsoft.com/en-us/uwp/api/windows.media.core.mediastreamsource
I'm creating an MJPEG MediaStreamSource:
var MediaStreamSource = new MediaStreamSource(
    new VideoStreamDescriptor(
        VideoEncodingProperties.CreateUncompressed(
            CodecSubtypes.VideoFormatMjpg, size.Width, size.Height
        )
    )
);
Then initialize some buffer time:
MediaStreamSource.BufferTime = TimeSpan.FromSeconds(1);
Then subscribe to the SampleRequested event to supply each requested frame:
MediaStreamSource.SampleRequested += async (MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args) =>
{
    var deferral = args.Request.GetDeferral();
    try
    {
        var timestamp = DateTime.Now - startedAt;
        var file = await Windows.ApplicationModel.Package.Current.InstalledLocation.GetFileAsync(@"Assets\grpPC1.jpg");
        using (var stream = await file.OpenReadAsync())
        {
            args.Request.Sample = await MediaStreamSample.CreateFromStreamAsync(
                stream.GetInputStreamAt(0), (uint)stream.Size, timestamp);
        }
        args.Request.Sample.Duration = TimeSpan.FromSeconds(5);
    }
    finally
    {
        deferral.Complete();
    }
};
As you can see, my sample uses CodecSubtypes.VideoFormatMjpg and a hardcoded path to a JPEG file that I use for every MediaStreamSample. It remains to research which CodecSubtypes value to use for an RGBA (4 bytes per pixel) format bitmap like this:
var buffer = new Windows.Storage.Streams.Buffer(size.Width * size.Height * 4);
// latestBitmap is SoftwareBitmap
latestBitmap.CopyToBuffer(buffer);
args.Request.Sample = MediaStreamSample.CreateFromBuffer(buffer, timestamp);
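Whichever codec subtype turns out to be right for raw RGBA, two details of the sample above generalize: the buffer must hold exactly width × height × 4 bytes for a 4-byte-per-pixel format, and each sample needs a monotonically increasing timestamp (here derived from wall-clock time; for a fixed frame rate it is just frame index × frame duration). A self-contained sketch of both calculations, with placeholder dimensions and frame rate:

```csharp
using System;

class SampleTiming
{
    static void Main()
    {
        // Placeholder dimensions; substitute the clip's actual size.
        uint width = 640, height = 480;

        // A 4-bytes-per-pixel (RGBA/BGRA) frame needs width * height * 4 bytes,
        // so the buffer passed to CopyToBuffer must be at least this large.
        uint frameBytes = width * height * 4;
        Console.WriteLine(frameBytes); // 1228800

        // Each MediaStreamSample needs a monotonically increasing timestamp.
        // For a fixed 30 fps stream, sample N starts at N * frame duration.
        long frameTicks = TimeSpan.TicksPerSecond / 30; // 333333 ticks, ~33.3 ms
        TimeSpan thirdFrame = TimeSpan.FromTicks(3 * frameTicks);
        Console.WriteLine(thirdFrame.Ticks); // 999999
    }
}
```

A wall-clock timestamp like `DateTime.Now - startedAt` also works, as in the sample above, as long as it only ever increases.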

How to get ImageStream from MediaCapture?

In WP 8 I used PhotoCamera to build a camera app, and to save the image to the camera roll I used this method:
private void cam_CaptureImageAvailable(object sender, ContentReadyEventArgs e)
{
    string fileName = "photo.jpg";
    MediaLibrary library = new MediaLibrary();
    library.SavePictureToCameraRoll(fileName, e.ImageStream);
}
In WP Silverlight 8.1 I use MediaCapture, and I want to save the image to the camera roll in the same style, but I don't know how to retrieve an ImageStream from MediaCapture like e.ImageStream above. I am open to suggestions, even a different approach to saving to the camera roll.
var file = await Windows.Storage.KnownFolders.PicturesLibrary.CreateFileAsync(
    IMAGECAPTURE_FILENAME, Windows.Storage.CreationCollisionOption.ReplaceExisting);
await _exceptionHandler.Run(async () =>
{
    await _mediaCapture.CapturePhotoToStorageFileAsync(_imageEncodingProperties, file);
    var photoStream = await file.OpenAsync(Windows.Storage.FileAccessMode.Read);
    await bitmap.SetSourceAsync(photoStream);
});
The above is taken from a UWP app: it saves the image to storage, then reads it back from disk as a stream. I've never been able to capture the image directly as a stream.
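One way to soften the save-then-reopen round trip, once the photo exists in any stream: read it into memory once and serve both consumers (saving and display) from the same byte array. A platform-neutral sketch of that pattern, with simulated photo bytes standing in for the capture result:

```csharp
using System;
using System.IO;

class CaptureBufferDemo
{
    static void Main()
    {
        // Simulated photo bytes; in the app these would come from the captured file.
        byte[] photo = new byte[2048];
        new Random(7).NextBytes(photo);

        // Two independent streams over the same in-memory copy: one could feed
        // SavePictureToCameraRoll-style saving, the other bitmap display.
        using (var saveStream = new MemoryStream(photo))
        using (var displayStream = new MemoryStream(photo))
        {
            Console.WriteLine(saveStream.Length == displayStream.Length); // True
        }
    }
}
```

Each MemoryStream tracks its own position, so one consumer reading to the end does not disturb the other.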

Windows Phone 8.1 Load Sound Issue with StreamResourceInfo

I am converting an app from WP7 to WP8.1, and the WP7 code no longer works on WP8.1:
sfxLeft = new MediaElement();
sfxRight = new MediaElement();
StreamResourceInfo streamInfo = Application.GetResourceStream(new Uri(wav, UriKind.Relative));
var sfx = SoundEffect.FromStream(streamInfo.Stream);
sfxLeft = sfx.CreateInstance();
sfxRight = sfx.CreateInstance();
StreamResourceInfo no longer exists in WP8.1. Does anyone know how I can rewrite this code to make it work on WP8.1?
Updated code.
Here's the new code below, but now sfxLeft and sfxRight are always null. I thought the code below would set them, but they remain null.
protected override void OnNavigatedTo(NavigationEventArgs e)
{
    test();
}

private async Task test()
{
    Uri wav = new Uri("ms-appx:///Assets/eye_poke.wav", UriKind.RelativeOrAbsolute);
    StorageFile file = await StorageFile.GetFileFromApplicationUriAsync(wav);
    stream = await file.OpenStreamForReadAsync();
    sfxLeft = SoundEffect.FromStream(stream).CreateInstance();
    sfxRight = SoundEffect.FromStream(stream).CreateInstance();
}
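One likely culprit in the updated code, independent of the WP8.1 APIs: the same stream is handed to SoundEffect.FromStream twice. After the first call consumes it, the position sits at the end, so the second call sees no data. (OnNavigatedTo also calls test() without awaiting it, so the fields can still be null when the page first uses them.) The stream behavior can be shown with standard streams alone:

```csharp
using System;
using System.IO;

class DoubleReadDemo
{
    // Read the full remaining contents of a stream.
    static byte[] Drain(Stream s)
    {
        using (var ms = new MemoryStream())
        {
            s.CopyTo(ms);
            return ms.ToArray();
        }
    }

    static void Main()
    {
        var stream = new MemoryStream(new byte[] { 1, 2, 3, 4 });

        Console.WriteLine(Drain(stream).Length); // 4: first read consumes everything
        Console.WriteLine(Drain(stream).Length); // 0: second read sees an empty stream

        stream.Position = 0; // rewind before reading again
        Console.WriteLine(Drain(stream).Length); // 4
    }
}
```

Resetting `stream.Position = 0` between the two FromStream calls, or creating one SoundEffect and calling CreateInstance on it twice as in the original WP7 code, avoids the empty second read.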
To play sound using the XNA Framework in a Windows Phone 8.1 Silverlight app:
StorageFile file = await StorageFile.GetFileFromApplicationUriAsync(wav);
Stream stream = await file.OpenStreamForReadAsync();
SoundEffect Sound1 = SoundEffect.FromStream(stream);
FrameworkDispatcher.Update();
Sound1.Play();
Or you can use MediaElement for both RT and Silverlight, but note that they live in different namespaces:
MediaElement mediaElement1 = new MediaElement();
mediaElement1.Source = wav;
mediaElement1.AutoPlay = false;
rootGrid.Children.Add(mediaElement1);
Thanks. I figured it out by using a MediaElement in XAML and then adding C# code to play it.

BufferedWaveProvider doesn't work in Unity

I am trying to use NAudio in Unity. I managed to link the NAudio DLL, but I get strange behavior when I try to play music through a BufferedWaveProvider.
If I do this:
WaveOut player;
BufferedWaveProvider buf;
AudioFileReader reader;

void Start()
{
    reader = new AudioFileReader(@"..\music.mp3"); // some music
    player = new WaveOut();
    player.Init(reader);
    player.Play();
}
the music plays normally, without any problems.
But when I try to use BufferedWaveProvider:
WaveOut player;
BufferedWaveProvider buf;
AudioFileReader reader;

void Start()
{
    reader = new AudioFileReader(@"..\music.mp3"); // some music
    buf = new BufferedWaveProvider(reader.WaveFormat);
    byte[] tmp = new byte[50000];
    reader.Read(tmp, 0, tmp.Length);    // read 50000 bytes
    buf.AddSamples(tmp, 0, tmp.Length); // add the bytes to buf
    player = new WaveOut();
    player.Init(buf); // init the WaveOut with buf
    player.Play();
}
it doesn't play! I debugged quite a lot and found that the BufferedWaveProvider is consuming the samples (BufferedBytes keeps dropping), but I don't get any sound out of it.
I am using BufferedWaveProvider because of a more complex project, but it's already a problem in this simple example.
What am I missing?
Note: the same code WORKS in a C# Windows Forms app...
Try using WaveOutEvent instead of WaveOut; it worked for me in at least one project.
As Mark pointed out:
it works because WaveOut uses Windows message callbacks by default, so if you have no GUI thread (e.g. you are in a console app), then it can't be used and WaveOutEvent should be preferred
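For context, BufferedWaveProvider is essentially a thread-safe byte FIFO: the producer queues sample bytes, the output device drains them, and missing data is padded with silence, which is why playback never stops on its own when the buffer runs dry. A stripped-down, single-threaded sketch of that idea (not NAudio's actual implementation):

```csharp
using System;
using System.Collections.Generic;

class MiniBufferedProvider
{
    private readonly Queue<byte> queue = new Queue<byte>();

    // Producer side: queue raw sample bytes (like BufferedWaveProvider.AddSamples).
    public void AddSamples(byte[] data, int offset, int count)
    {
        for (int i = 0; i < count; i++) queue.Enqueue(data[offset + i]);
    }

    // Consumer side: the device reads; missing data is padded with silence (zeros).
    public int Read(byte[] buffer, int offset, int count)
    {
        for (int i = 0; i < count; i++)
            buffer[offset + i] = queue.Count > 0 ? queue.Dequeue() : (byte)0;
        return count; // always "succeeds", so playback never ends on its own
    }

    static void Main()
    {
        var provider = new MiniBufferedProvider();
        provider.AddSamples(new byte[] { 10, 20, 30 }, 0, 3);

        var outBuf = new byte[5];
        provider.Read(outBuf, 0, 5);
        Console.WriteLine(string.Join(",", outBuf)); // 10,20,30,0,0
    }
}
```

The question's code queues 50000 bytes once and never refills, so after roughly a second the device is reading pure silence; a real player keeps calling AddSamples from a feeding loop.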

Loading image in picturebox control repeatedly from local machine

I have some C# code that gets an image from a webpage and downloads it to my local machine. This runs in the background once per second. If I leave it running, it works fine and my pictures get updated correctly; the pictures are essentially frames from a camera feed. I want to show them in a PictureBox (or some other control) so they display like a live feed. However, when I tried this I got errors saying the image is in use, so I cannot load it into my PictureBox. Is there a better way to do this?
Thanks,
byte[] lnBuffer;
byte[] lnFile;

HttpWebRequest lxRequest = (HttpWebRequest)WebRequest.Create(uri);
lxRequest.Credentials = credentials;

using (HttpWebResponse lxResponse = (HttpWebResponse)lxRequest.GetResponse())
{
    using (BinaryReader lxBR = new BinaryReader(lxResponse.GetResponseStream()))
    {
        using (MemoryStream lxMS = new MemoryStream())
        {
            lnBuffer = lxBR.ReadBytes(1024);
            while (lnBuffer.Length > 0)
            {
                lxMS.Write(lnBuffer, 0, lnBuffer.Length);
                lnBuffer = lxBR.ReadBytes(1024);
            }
            lnFile = new byte[(int)lxMS.Length];
            lxMS.Position = 0;
            lxMS.Read(lnFile, 0, lnFile.Length);
        }
    }
}

using (FileStream lxFS = new FileStream("images/camppic1.jpg", FileMode.Create))
{
    lxFS.Write(lnFile, 0, lnFile.Length);
}
This is what I use to create the file. Then, in the same method, I do this:
image = Image.FromFile(@"C:\camppic1.jpg");
pictureBox23.Image = image;
If you need the file, load the file content, copy it to a MemoryStream, and use Image.FromStream; Image.FromFile keeps the file locked for the lifetime of the Image, which is why the next download fails. If you don't need the file, you can skip it and use the MemoryStream directly from the download (faster, since no disk access is needed).
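The file-lock part of that suggestion can be shown with standard streams alone: read the file's bytes once, then hand a MemoryStream over those bytes to Image.FromStream. The file handle is released as soon as the bytes are read, so the next download can overwrite the file while the previous image is still displayed. (The actual Image.FromStream call is left as a comment so the sketch stays platform-neutral; the stream handling is the relevant part.)

```csharp
using System;
using System.IO;

class UnlockedImageLoad
{
    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "camppic_demo.jpg");
        File.WriteAllBytes(path, new byte[] { 0xFF, 0xD8, 0xFF, 0xD9 }); // stand-in JPEG bytes

        // ReadAllBytes opens, reads, and closes the file; no lock is left behind.
        byte[] bytes = File.ReadAllBytes(path);

        // The file can now be rewritten while the old image is still in use.
        File.WriteAllBytes(path, new byte[] { 0x00 });

        using (var ms = new MemoryStream(bytes))
        {
            // In the app: pictureBox23.Image = Image.FromStream(ms);
            // reads from this in-memory copy and never touches the file again.
            Console.WriteLine(ms.Length); // 4, the original bytes survive the overwrite
        }
        File.Delete(path);
    }
}
```

Note that Image.FromStream requires the stream to stay open for the lifetime of the Image, so keep the MemoryStream alive (or clone the Bitmap) until you replace the picture.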
