How to properly dispose of resources in WPF - C#

Hi, I have an application in which I have to save images from three different IP cameras whenever a button is pressed.
I am using a class that holds all the members I need to save the images from an IP camera, namely the BitmapImage and the DateTime of when the photo was saved.
I have the following problem. I need to save a certain number of photos from each camera every couple of hundred milliseconds. I am currently testing this by saving 50 photos from each camera every 200 ms to a ConcurrentQueue, from which the items are then saved to file. After I have taken about 110 photos altogether across all three cameras, it just saves blank images.
I think my problem is that the program's memory fills up, so I need to release each item from memory whenever I save it after TryDequeue() on the ConcurrentQueue.
Can anyone please advise me or give me some links that can help me solve this problem, so that I can save as many photos of each camera as I want without running out of memory after a certain number of photos?
A button is pressed and then it goes into a for loop where it calls the following method.
private void EnqueuePhotos1()
{
    IPCamera1 ipCam1Enqueue = new IPCamera1();
    BitmapImage cam1Image = new BitmapImage();
    cam1Image.BeginInit();
    cam1Image.CacheOption = BitmapCacheOption.OnLoad;
    cam1Image.CreateOptions = BitmapCreateOptions.IgnoreImageCache;
    cam1Image.UriSource = null;
    cam1Image.UriSource = new Uri("http://" + ipCam1IP + "/image?res=full&x0=0&y0=0&x1=1600&y1=1200&quality=21&doublescan=0", UriKind.Absolute);
    while (cam1Image.IsDownloading) { ; } // busy-wait until the download completes
    cam1Image.EndInit();
    ipCam1Enqueue.IPCamImage = cam1Image;
    ipCam1Enqueue.TimeTook = DateTime.Now;
    ipCam1ConQ.Enqueue(ipCam1Enqueue);
}
for a certain number of times, depending on how many photos the user wants to take.
Just before the for loop I start my timer, which checks every 100 ms whether there is something on the ConcurrentQueue; if something is found, it calls the following method.
private void GetPhotos1()
{
    IPCamera1 ipCam1Dequeue; // no need to pre-allocate; TryDequeue overwrites the out argument
    while (ipCam1ConQ.TryDequeue(out ipCam1Dequeue))
    {
        cam1Photos++;
        cam1ImgLoc = cam1Location + "\\Image " + cam1Photos + ".jpg";
        FileStream cam1Stream = new FileStream(cam1ImgLoc, FileMode.Create);
        JpegBitmapEncoder cam1Encoder = new JpegBitmapEncoder();
        cam1Encoder.Frames.Add(BitmapFrame.Create(ipCam1Dequeue.IPCamImage));
        cam1Encoder.Save(cam1Stream);
        cam1Stream.Dispose();
    }
}

using (FileStream cam1Stream = new FileStream(cam1ImgLoc, FileMode.Create))
{
    // do stuff...
}
Resources declared this way are automagically disposed once the statements in the using block have executed, even if one of them throws.
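Applied to the GetPhotos1 method above, a minimal sketch could look like this (clearing the IPCamImage reference after saving is my own suggestion for releasing the decoded bitmap, not something from the original answer):

private void GetPhotos1()
{
    IPCamera1 ipCam1Dequeue;
    while (ipCam1ConQ.TryDequeue(out ipCam1Dequeue))
    {
        cam1Photos++;
        cam1ImgLoc = cam1Location + "\\Image " + cam1Photos + ".jpg";
        // the stream is closed even if the encoder throws
        using (FileStream cam1Stream = new FileStream(cam1ImgLoc, FileMode.Create))
        {
            JpegBitmapEncoder cam1Encoder = new JpegBitmapEncoder();
            cam1Encoder.Frames.Add(BitmapFrame.Create(ipCam1Dequeue.IPCamImage));
            cam1Encoder.Save(cam1Stream);
        }
        // drop the reference so the dequeued BitmapImage becomes collectable
        ipCam1Dequeue.IPCamImage = null;
    }
}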

Get actual upload progress on Azure Blob

I know that this has already been asked, but the marked solution is not correct. Usually this article is marked as the solution: https://learn.microsoft.com/en-us/archive/blogs/kwill/asynchronous-parallel-blob-transfers-with-progress-change-notification-2-0
It works and gives an actual progress, but not the real-time progress (and in some cases it gives a totally wrong value). Let me explain:
It reports progress on the local read buffer, so when I upload something, my first "uploaded value" is the read buffer's total size. In my case this buffer is 4 MB, so every file smaller than 4 MB shows as completed in 0 seconds on the progress bar, while it takes the real upload time to actually finish.
Also, if you kill your connection just before the upload starts, it reports the first buffer size as actual progress, so for my 1 MB file I get 100% progress while disconnected.
I found another article with another solution: it reads the HTTP response from Azure every time a single block upload completes, but I need my blocks to be 4 MB (since the max block count for a single file is 50,000), and it's not a perfect solution even with a low block size.
The first article overrides the Stream class and creates a ProgressStream class with a ProgressChanged event that is triggered every time a read is done. Is there some way to know the actual uploaded bytes when that ProgressChanged is triggered?
You can do this by using code similar to https://learn.microsoft.com/en-us/archive/blogs/kwill/asynchronous-parallel-block-blob-transfers-with-progress-change-notification (version 1.0 of the blog post you referenced), but instead of calling m_Blob.PutBlock you would upload the block with an HttpWebRequest object and track progress as you write bytes to the request stream. This introduces a lot more code complexity, and you would have to add some additional error handling.
The alternative would be to download the Storage Client Library source code from GitHub and modify the block upload methods to track and report progress. The challenge you will face here is that you will have to make the same changes to every new version of the SCL if you plan on staying up to date with the latest fixes.
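A minimal sketch of that approach (PutBlockWithProgress, the pre-built block URI with its SAS token and blockid query parameters, and the onProgress callback are all my assumptions, not code from the article; requires System, System.IO and System.Net):

private static void PutBlockWithProgress(Uri blockUri, byte[] block, Action<long> onProgress)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(blockUri);
    request.Method = "PUT";
    request.ContentLength = block.Length;
    request.AllowWriteStreamBuffering = false; // otherwise bytes are buffered locally and progress lies again
    using (Stream requestStream = request.GetRequestStream())
    {
        const int chunkSize = 64 * 1024;
        long sent = 0;
        while (sent < block.Length)
        {
            int toSend = (int)Math.Min(chunkSize, block.Length - sent);
            requestStream.Write(block, (int)sent, toSend);
            sent += toSend;
            onProgress(sent); // bytes actually handed to the network stack
        }
    }
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        // 201 Created means the block was stored; add retry/error handling here
    }
}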
I must admit I didn't check if everything is as you desired, but here are my 2 cents on uploading with progress indication.
public async Task UploadVideoFilesToBlobStorage(List<VideoUploadModel> videos, CancellationToken cancellationToken)
{
    var blobTransferClient = new BlobTransferClient();
    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
    //files
    _videoCount = _videoCountLeft = videos.Count;
    foreach (var video in videos)
    {
        var blobUri = new Uri(video.SasLocator);
        //create the sasCredentials
        var sasCredentials = new StorageCredentials(blobUri.Query);
        //get the URL without sasCredentials, so only path and filename
        var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path,
            UriFormat.UriEscaped));
        //get the URL without the filename (needed for BlobTransferClient; seems like an issue to me)
        var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));
        var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);
        //upload using a stream; the other overload of UploadBlob forces the online filename to match the local filename
        using (FileStream fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient,
                new NoRetry(), "video/x-msvideo");
        }
        _videoCountLeft -= 1;
    }
    blobTransferClient.TransferProgressChanged -= BlobTransferClient_TransferProgressChanged;
}
private void BlobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
{
    Console.WriteLine("progress, seconds remaining: " + e.TimeRemaining.Seconds);
    double bytesTransfered = e.BytesTransferred;
    double bytesTotal = e.TotalBytesToTransfer;
    double thisProcent = bytesTransfered / bytesTotal;
    double procent = thisProcent;
    //divide by the number of videos
    int videosUploaded = _videoCount - _videoCountLeft;
    if (_videoCountLeft > 0)
    {
        procent = (thisProcent + videosUploaded) / _videoCount;
    }
    procent = procent * 100; //to a real percentage
    UploadProgressChangedEvent?.Invoke((int)procent, videosUploaded, _videoCount);
}
Actually, Microsoft.WindowsAzure.MediaServices.Client.BlobTransferClient should be able to do concurrent uploads, but there is no method for uploading multiple files, even though it has NumberOfConcurrentTransfers and ParallelTransferThreadCount properties; I'm not sure how to use those.
There is a bug in this BlobTransferClient: when uploading using the localFile parameter, it will use the filename of that file, while I gave permissions on a specific file name in the SasLocator.
This example shows how to upload from a client (not from the server), so we don't need any CloudMediaContext, which is usually the case. Everything about SasLocators can be found here.
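For what it's worth, those two knobs appear to be plain settable properties, so wiring them up would just be property assignment (whether BlobTransferClient actually honors them across multiple files is exactly the open question above):

var blobTransferClient = new BlobTransferClient
{
    NumberOfConcurrentTransfers = 2,  // concurrent blob transfers
    ParallelTransferThreadCount = 2   // threads used per transfer
};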

Image loading time in Windows Phone apps

I'm creating an app in which I need to change the source of an image on a button click, e.g. if the images are:
sample1.png, sample2.png, sample3.png ...
I have written this code for the button click:
int count = 1;
imagename.Source = new BitmapImage(new Uri("/sample" + count + ".png", UriKind.Relative));
The problem is that when I run the app on a device, it takes some time to load the image source every time the button is clicked, whereas on the emulator it changes promptly. Is there any way to reduce the loading time on the device?
Is there any way to reduce the loading time on the device?
As far as I know: no. If performance is unsatisfactory, you may want to try some caching: basically, instead of creating a new BitmapImage each time, re-use the old ones.
First, pre-load the bitmaps. Don't forget to set the CreateOptions property, otherwise the picture won't be loaded until you assign it to an actual Image control:
var bitmaps = new List<BitmapImage>(count);
for (int i = 0; i < count; i++)
{
    var bitmap = new BitmapImage(new Uri("/sample" + i + ".png", UriKind.Relative));
    bitmap.CreateOptions = BitmapCreateOptions.None;
    bitmaps.Add(bitmap);
}
Then, re-use them as needed:
imagename.Source = bitmaps[1];
Please be aware that it will increase the memory usage of your app, so don't do that with large pictures. Performance is often a compromise between CPU time and memory usage.
Like KooKiz said, you can prefetch the images, but to force them to actually load I believe you will need to use SetSourceAsync. Here is an example:
StorageFile file = await StorageFile.GetFileFromApplicationUriAsync(new Uri("ms-appx:///sample" + i + ".png"));
using (var stream = await file.OpenReadAsync())
{
    await bitmap.SetSourceAsync(stream);
}
What you could also do is preload a thumbnail version of the image first (by using file.GetThumbnailAsync, for example) and then the full image later.
Finally, if the images you are loading are actually bigger than the resolution of the surface you are displaying them on, another parameter you can set on the BitmapImage object is DecodePixelHeight and DecodePixelWidth.
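For instance, a minimal sketch (the 480-pixel target is an assumed display width, not a value from the original answer):

var bitmap = new BitmapImage();
// decode straight to the size the image will be shown at, instead of the full source resolution
bitmap.DecodePixelWidth = 480; // set only one dimension so the aspect ratio is preserved
bitmap.UriSource = new Uri("/sample1.png", UriKind.Relative);
imagename.Source = bitmap;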
A complete snippet along those lines:
int count = 1;
BitmapImage bmp = new BitmapImage();
StorageFile sFile = await Windows.ApplicationModel.Package.Current.InstalledLocation.GetFileAsync(@"Assets\Images\img" + count + ".png");
var fileStream = await sFile.OpenReadAsync();
await bmp.SetSourceAsync(fileStream);

C# MemoryStream slowing program performance

I'm working on a project using WPF to display the Kinect ColorImageFrame and a skeleton representation. I also have to record those two videos.
I'm able to display and record (using EmguCV) those two images, but I have some performance issues. It seems that this part of my code is the reason for the loss of performance.
private void DrawSkeleton(Skeleton[] skeletons)
{
    using (System.Drawing.Bitmap skelBitmap = new System.Drawing.Bitmap(640, 480))
    {
        foreach (Skeleton S in skeletons)
        {
            if (S.TrackingState == SkeletonTrackingState.Tracked)
            {
                DrawBonesAndJoints(S, skelBitmap);
            }
            else if (S.TrackingState == SkeletonTrackingState.PositionOnly)
            {
            }
        }
        _videoArraySkel.Add(ToOpenCVImage<Bgr, Byte>(skelBitmap));
        BitmapSource source = ToWpfBitmap(skelBitmap);
        this.skeletonStream.Source = source;
    }
}
and more precisely from the ToWpfBitmap method, which allows me to display it in my window:
public static BitmapSource ToWpfBitmap(System.Drawing.Bitmap bitmap)
{
    using (MemoryStream stream = new MemoryStream())
    {
        bitmap.Save(stream, System.Drawing.Imaging.ImageFormat.Bmp);
        stream.Position = 0;
        BitmapImage result = new BitmapImage();
        result.BeginInit();
        // According to MSDN, "The default OnDemand cache option retains access to the stream until the image is needed."
        // Force the bitmap to load right now so we can dispose the stream.
        result.CacheOption = BitmapCacheOption.OnLoad;
        result.StreamSource = stream;
        result.EndInit();
        result.Freeze();
        return result;
    }
}
The loss of performance is characterized by:
- The videos displayed in the window are no longer fluid.
- The video recording seems to miss some frames, which leads to a video that plays faster/slower than normal.
Can you help me by telling me where this problem may come from?
Try using RecyclableMemoryStream instead of MemoryStream. It was designed for exactly this kind of scenario: frequent large MemoryStream allocations put pressure on the garbage collector and fragment the large object heap.
Check out this article for details - Announcing Microsoft.IO.RecyclableMemoryStream
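A minimal sketch of how it could slot into the ToWpfBitmap helper above (the shared manager instance and the "ToWpfBitmap" tag are my own additions):

private static readonly Microsoft.IO.RecyclableMemoryStreamManager _streamManager =
    new Microsoft.IO.RecyclableMemoryStreamManager();

public static BitmapSource ToWpfBitmap(System.Drawing.Bitmap bitmap)
{
    // GetStream hands out a pooled stream, so the large buffers are reused
    // between frames instead of being reallocated and garbage-collected
    using (MemoryStream stream = _streamManager.GetStream("ToWpfBitmap"))
    {
        bitmap.Save(stream, System.Drawing.Imaging.ImageFormat.Bmp);
        stream.Position = 0;
        BitmapImage result = new BitmapImage();
        result.BeginInit();
        result.CacheOption = BitmapCacheOption.OnLoad; // load now so the stream can return to the pool
        result.StreamSource = stream;
        result.EndInit();
        result.Freeze();
        return result;
    }
}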
Have you tried doing the file write I/O in a separate thread, while keeping the data in a buffer such as a queue?

Download file in chunks (Windows Phone)

In my application I can download some media files from the web. Normally I used the WebClient.OpenReadCompleted event to download, decrypt and save the file to IsolatedStorage. It worked well and looked like this:
private void downloadedSong_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e, SomeOtherValues someOtherValues) // delegate, uses additional values
{
    // Some preparations
    try
    {
        if (e.Result != null)
        {
            using (isolatedStorageFile = IsolatedStorageFile.GetUserStoreForApplication())
            {
                // working with the gained stream, decryption
                // saving the decrypted file to isolatedStorage
                isolatedStorageFileStream = new IsolatedStorageFileStream("SomeFileNameHere", FileMode.OpenOrCreate, isolatedStorageFile);
                // and use it for MediaElement
                mediaElement.SetSource(isolatedStorageFileStream);
                mediaElement.Position = new TimeSpan(0);
                mediaElement.MediaOpened += new RoutedEventHandler(mediaFile_MediaOpened);
                // and some other work
            }
        }
    }
    catch (Exception ex)
    {
        // try/catch stuff
    }
}
But after some investigation I found out that with large files (for me, more than 100 MB) I get an OutOfMemory exception while downloading. I suppose that's because WebClient.OpenReadCompleted loads the whole stream into RAM and chokes... and I will need even more memory to decrypt that stream.
After another investigation, I found how to split a large file into chunks after the OpenReadCompleted event when saving it to IsolatedStorage (or decrypting and then saving, in my case), but this only solves part of the problem... The primary problem is how to keep the phone from choking during the download itself. Is there a way to download a large file in chunks? Then I could use the solution I found to handle the decryption. (And I'd still need to find a way to load such a big file into a MediaElement, but that would be another question.)
Answer:
private WebHeaderCollection headers;
private int iterator = 0;
private int delta = 1048576;
private string savedFile = "testFile.mp3";
// some preparations
// Start downloading the first piece
using (IsolatedStorageFile isolatedStorageFile = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (isolatedStorageFile.FileExists(savedFile))
        isolatedStorageFile.DeleteFile(savedFile);
}
headers = new WebHeaderCollection();
headers[HttpRequestHeader.Range] = "bytes=" + iterator.ToString() + '-' + (iterator + delta).ToString();
webClientReadCompleted = new WebClient();
webClientReadCompleted.Headers = headers;
webClientReadCompleted.OpenReadCompleted += downloadedSong_OpenReadCompleted;
webClientReadCompleted.OpenReadAsync(new Uri(song.Link));
// song.Link was given earlier
private void downloadedSong_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
{
    try
    {
        if (e.Cancelled == false)
        {
            if (e.Result != null)
            {
                ((WebClient)sender).OpenReadCompleted -= downloadedSong_OpenReadCompleted;
                using (IsolatedStorageFile myIsolatedStorage = IsolatedStorageFile.GetUserStoreForApplication())
                {
                    using (IsolatedStorageFileStream fileStream = new IsolatedStorageFileStream(savedFile, FileMode.Append, FileAccess.Write, myIsolatedStorage))
                    {
                        int mediaFileLength = (int)e.Result.Length;
                        byte[] byteFile = new byte[mediaFileLength];
                        // Stream.Read may return fewer bytes than requested, so loop until the chunk is read
                        int read = 0, n;
                        while (read < mediaFileLength && (n = e.Result.Read(byteFile, read, mediaFileLength - read)) > 0)
                        {
                            read += n;
                        }
                        fileStream.Write(byteFile, 0, read);
                        // If there's something left, download it recursively
                        if (read > delta)
                        {
                            iterator = iterator + delta + 1;
                            headers = new WebHeaderCollection();
                            headers[HttpRequestHeader.Range] = "bytes=" + iterator.ToString() + '-' + (iterator + delta).ToString();
                            webClientReadCompleted.Headers = headers;
                            webClientReadCompleted.OpenReadCompleted += downloadedSong_OpenReadCompleted;
                            webClientReadCompleted.OpenReadAsync(new Uri(song.Link));
                        }
                    }
                }
            }
        }
    }
    catch
    {
        // error handling omitted in the original snippet
    }
}
To download a file in chunks you'll need to make multiple requests, one for each chunk.
Unfortunately it's not possible to say "get me this file and return it in chunks of size X".
Assuming that the server supports it, you can use the HTTP Range header to specify which bytes of a file the server should return in response to a request.
You then make multiple requests to get the file in pieces and then put it all back together on the device. You'll probably find it simplest to make sequential calls and start the next one once you've got and verified the previous chunk.
This approach also makes it simple to resume a download when the user returns to the app: you just look at how much was downloaded previously and then request the next chunk.
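A sketch of that resume logic, reusing the savedFile, headers and delta fields from the accepted answer above:

long alreadyDownloaded = 0;
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (store.FileExists(savedFile))
    {
        using (var existing = store.OpenFile(savedFile, FileMode.Open, FileAccess.Read))
        {
            alreadyDownloaded = existing.Length; // bytes saved by a previous session
        }
    }
}
// ask the server for the next chunk, starting where the last run stopped
headers = new WebHeaderCollection();
headers[HttpRequestHeader.Range] = "bytes=" + alreadyDownloaded + "-" + (alreadyDownloaded + delta);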
I've written an app which downloads movies (up to 2.6 GB) in 64K chunks and then plays them back from IsolatedStorage with the MediaPlayerLauncher. Playing via the MediaElement should work too, but I haven't verified it. You can test this by loading a large file directly into IsolatedStorage (via the Isolated Storage Explorer, or similar) and checking the memory implications of playing it that way.
I wonder if it's possible to use a BackgroundTransferRequest to download large files easily and let the phone worry about the implementation details? The documentation seems to suggest that file downloads over 100 MB are possible, and the "Range" header is reserved for the system's own use, so it probably uses it automatically behind the scenes when it can.
Confirmed: you can use BackgroundTransferRequest to download multi-GB files, but you must set TransferPreferences to None to force the download to happen while connected to external power and Wi-Fi, or the BackgroundTransferRequest will fail.
From the documentation regarding files over 100 MB:

For files larger than 100 MB, you must set the TransferPreferences property of the transfer to None or the transfer will fail. If you do not know the size of a transfer and it is possible that it could exceed this limit, you should set the value to None, meaning that the transfer will only proceed when the phone is connected to external power and has a Wi-Fi connection.

From the documentation regarding use of the "Range" header:

The Headers property of the BackgroundTransferRequest object is used to set the HTTP headers for a transfer request. The following headers are reserved for use by the system and cannot be used by calling applications. Adding one of the following headers to the Headers collection will cause a NotSupportedException to be thrown when the Add(BackgroundTransferRequest) method is used to queue the transfer request:

If-Modified-Since
If-None-Match
If-Range
Range
Unless-Modified-Since
Here's the documentation:
http://msdn.microsoft.com/en-us/library/windowsphone/develop/hh202955(v=vs.105).aspx

How do I save files to hard disk in a separate thread?

I have a camera and I'm reading the images in real time into an array.
I'm applying some algorithm to each image and displaying it. Then I get the next image and display it as well, so I'm streaming images from the camera to the display. However, I also want to save the images to the hard disk once I've displayed them. I tried using the main thread, but everything slowed down too much.
I then tried using the ThreadPool (see code below). This doesn't slow down the display, but I've found the images aren't being saved properly. They don't appear to be in the expected order, and after about 50 images have been saved, the subsequent image data looks garbled. I'm guessing too many threads are being started.
Is there a better way to do this? I think I only need one thread to save the images, perhaps with some kind of queue that saves each image sequentially, as long as it's done in the background and doesn't slow down the display. If someone could post a code snippet, that would be fantastic.
short[] image1 = new short[20000];
while (streaming)
{
    ReadImageFromCamera(ref image1);
    ImageData data;
    data.fileName = imageNumber;
    data.image = image1;
    ThreadPool.QueueUserWorkItem(WriteImageToFile, data); // send the writes to the queue
}
private void WriteImageToFile(object imageData)
{
    try
    {
        ImageData data = (ImageData)imageData;
        System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        string fName = myDirectory + @"/" + Convert.ToString(data.fileName) + @".spe";
        using (Stream myStream = new FileStream(fName, FileMode.Create))
        {
            bf.Serialize(myStream, data.image);
        }
    }
    catch (Exception) { }
}
I think you should avoid starting a new thread for each image. Since you have just a single hard drive and store all files in a single directory, you should use just one disk-writer thread. I'd then recommend using some concurrent queue to transfer jobs from the camera thread to the disk-writer thread. I don't show a "code snippet" because this is not something you can write well in a few lines of code.
Also, you definitely must allocate a new short[20000] for each image somewhere; otherwise it is overwritten by the next image before you save it to disk.
Also, I would expect it to be sufficient to write the files on the main thread, because Windows uses concurrency techniques (mainly the disk cache) automatically when you write data to disk. Are you sure your hardware is fast enough to write all that data in real time?
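A minimal sketch of that single-writer pattern, using .NET 4's BlockingCollection (requires System.Collections.Concurrent and System.Threading; the ImageData type, imageNumber and ReadImageFromCamera come from the question, while SaveToDisk stands in for whatever serialization you already use):

BlockingCollection<ImageData> queue = new BlockingCollection<ImageData>();

// one long-lived writer thread drains the queue sequentially, in order
Thread writer = new Thread(() =>
{
    foreach (ImageData item in queue.GetConsumingEnumerable())
    {
        SaveToDisk(item); // e.g. the BinaryFormatter code from the question
    }
});
writer.IsBackground = true;
writer.Start();

// camera thread: give each frame its own buffer, then enqueue it
while (streaming)
{
    short[] buffer = new short[20000]; // fresh array per frame, as noted above
    ReadImageFromCamera(ref buffer);
    ImageData data = new ImageData();
    data.fileName = imageNumber;
    data.image = buffer;
    queue.Add(data);
}
queue.CompleteAdding(); // lets GetConsumingEnumerable() finish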
When dealing with threads, ordering is no longer under your control. The thread pool can choose to schedule the threads in any order it likes. If you need things to happen sequentially in a specific order, threading does not make much sense anyway.
Regarding the corrupted images, it looks like the short[] image1 instance is being passed around. It is unclear what happens inside ReadImageFromCamera, but since you pass a pre-initialized array into it, chances are that the method will use that array and simply copy data into it (even though the ref keyword indicates that it might create a brand new array instance and assign that instead). Then you pass that array instance to WriteImageToFile on a separate thread.
Meanwhile, in parallel, you get the next image. Now you have a scenario where ReadImageFromCamera might write data into the array at the same time as WriteImageToFile is storing that data on disk. There you have your corrupted image. This can be avoided by passing a new array instance to WriteImageToFile:
ReadImageFromCamera(ref image1);
ImageData data;
data.fileName = imageNumber;
data.image = (short[])image1.Clone(); // create a new array instance, so that the next call
                                      // to ReadImageFromCamera will not corrupt the data
ThreadPool.QueueUserWorkItem(WriteImageToFile, data);
Still, as has been mentioned by Al Kepp, since you have only one hard drive, launching many threads might not be your best option here. You could look into having one long-running separate thread for storing data on disk, and putting the images into some sort of queue that the storage thread picks up data from and writes to disk. This comes with its own set of problems dealing with concurrency, limiting the size of the queue and what not.
You need to create a distinct buffer for the thread to read data from; otherwise the main thread will overwrite it when you dump it to a file. The way you are doing it copies only references (image1 in particular).
So in:
ThreadPool.QueueUserWorkItem(WriteImageToFile, data);
you'll want to send in a deep copy of data instead of data itself. Since it seems you are already making a copy - but in the worker thread - you just need to make the copy before queueing it.
HTH
Before thinking about threads, you have to check whether the speed of a normal disk is sufficient for your task, as you may create images faster than they can be written to disk. If image creation is faster than writing, I would look at using a RAM disk, but then you need to calculate whether its size is sufficient until you stop the camera, so that you can write out to the normal disk overnight.
If you use .NET 4.0, I would suggest using a ConcurrentQueue together with a normal thread (as the thread will run until the program finishes).
A quick and dirty way is to start a single new thread and work with global class members - the new thread reads them while the main thread updates them (note that List<T> is not thread-safe, so this really is quick and dirty).
First of all, declare these outside of any function:
private List<ImageData> arrGlobalData = new List<ImageData>();
private bool keepWritingImages = true;
Now change the code in the "main" thread to this:
short[] image1 = new short[20000];
ThreadPool.QueueUserWorkItem(WriteImageToFile, null);
while (streaming)
{
    ReadImageFromCamera(ref image1);
    ImageData data = new ImageData();
    data.fileName = imageNumber;
    data.image = image1;
    arrGlobalData.Add(data);
}
keepWritingImages = false;
And finally, have a function like this for the new thread:
private void WriteImageToFile(object imageData)
{
    while (keepWritingImages)
    {
        if (arrGlobalData.Count > 0)
        {
            ImageData data = arrGlobalData[0];
            try
            {
                System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
                string fName = myDirectory + @"/" + Convert.ToString(data.fileName) + @".spe";
                using (Stream myStream = new FileStream(fName, FileMode.Create))
                {
                    bf.Serialize(myStream, data.image);
                }
            }
            catch
            {
            }
            finally
            {
                arrGlobalData.Remove(data);
            }
        }
        Thread.Sleep(10);
    }
}
You can do the following.
public class AsyncFileWriter
{
    private readonly FileStream fs;
    private readonly AsyncCallback callback;
    public Action FinishedCallback;
    private IAsyncResult result;

    private class AsyncState
    {
        public FileStream Fs;
    }

    private void WriteCore(IAsyncResult ar)
    {
        if (result != null)
        {
            FileStream stream = ((AsyncState)ar.AsyncState).Fs;
            stream.EndWrite(result);
            if (this.FinishedCallback != null)
            {
                FinishedCallback();
            }
        }
    }

    public AsyncFileWriter(FileStream fs, Action finishNotification)
    {
        this.fs = fs;
        callback = new AsyncCallback(WriteCore);
        this.FinishedCallback = finishNotification;
    }

    public AsyncFileWriter(FileStream fs)
        : this(fs, null)
    {
    }

    public void Write(Byte[] data)
    {
        result = fs.BeginWrite(data, 0, data.Length, callback, new AsyncState() { Fs = fs });
    }
}
Later you can consume it like this:
static void Main(string[] args)
{
    FileStream fs = File.Create("D:\\ror.txt");
    ManualResetEvent evt = new ManualResetEvent(false);
    AsyncFileWriter writer = new AsyncFileWriter(fs, () =>
    {
        Console.Write("Write Finished");
        evt.Set();
    });
    byte[] bytes = File.ReadAllBytes("D:\\test.xml"); // getting some random bytes
    writer.Write(bytes);
    evt.WaitOne();
    Console.Write("Write Done");
}
