OutOfMemoryException handling and Windows Media Player SDK - C#

Here is how it is supposed to work:
WindowsMediaPlayer windowsMediaPlayer = new WindowsMediaPlayer();
IWMPMediaCollection collection = windowsMediaPlayer.mediaCollection;
IWMPMedia newMedia = collection.add(path); // causes OutOfMemoryException after a few thousand iterations of this method
I've tried to avoid it this way:
try
{
    newMedia = collection.add(path);
    return newMedia;
}
catch (Exception)
{
    collection = null;
    GC.Collect();
    GC.WaitForPendingFinalizers();
    WindowsMediaPlayer windowsMediaPlayer = new WindowsMediaPlayer();
    IWMPMediaCollection collectionNew = windowsMediaPlayer.mediaCollection;
    return CreateNewMedia(collectionNew, path);
}
But this does not work; I still get an infinite exception loop inside the catch block.

You cannot handle OutOfMemoryException like an ordinary exception. The only reason you might catch it is to save the application's state in some way, so that the consumer of your application has a way to recover lost data.
What I mean is that there is no point in calling GC.Collect or anything else, because the application is going to die anyway; the CLR just kindly notifies you before that happens.
So to resolve this issue, check the memory consumption of your application, which on a 32-bit machine is limited to roughly 1.2 GB of usable RAM, or control the number of elements in your collections, which for an ordinary list cannot exceed 2 GB of memory on 32-bit and 64-bit alike.
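As a rough illustration of that advice, here is a minimal sketch (the MemoryGuard class and the threshold value are made up for this example, not part of the original code) that checks the process's Private Bytes before adding more items, so the import can stop or persist progress before the 32-bit address space runs out:
// Hypothetical guard: stop adding items before the address space is exhausted,
// instead of trying to recover after OutOfMemoryException has already been thrown.
using System.Diagnostics;

static class MemoryGuard
{
    // Illustrative threshold, well below the ~1.2 GB practical limit mentioned above.
    private const long ThresholdBytes = 1L * 1024 * 1024 * 1024;

    public static bool IsNearLimit()
    {
        using (Process current = Process.GetCurrentProcess())
        {
            // PrivateMemorySize64 roughly corresponds to the "Private Bytes" counter.
            return current.PrivateMemorySize64 > ThresholdBytes;
        }
    }
}

// Sketch of use inside the import loop:
// if (MemoryGuard.IsNearLimit())
// {
//     // persist progress and stop calling collection.add(path) instead of waiting for OOM
// }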

Related

Memory leak analysis and help requested

I've been using the methodology outlined by Shivprasad Koirala to check for memory leaks from code running inside a C# application (VoiceAttack). It basically involves using Performance Monitor to track the application's private bytes as well as bytes in all heaps, and comparing these counters to assess whether there is a leak and, if so, what type (managed/unmanaged). Ideally I need to test outside of Visual Studio, which is why I'm using this method.
The following portion of code generates the memory profile below (bear in mind the code is formatted a little differently from how it appears in Visual Studio, because this is a function contained within the main C# application):
public void main()
{
    string FilePath = null;
    using (FileDialog myFileDialog = new OpenFileDialog())
    {
        myFileDialog.Title = "this is the title";
        myFileDialog.FileName = "testFile.txt";
        myFileDialog.Filter = "txt files (*.txt)|*.txt|All files (*.*)|*.*";
        myFileDialog.FilterIndex = 1;
        if (myFileDialog.ShowDialog() == DialogResult.OK)
        {
            FilePath = myFileDialog.FileName;
            var extension = Path.GetExtension(FilePath);
            var compareType = StringComparison.InvariantCultureIgnoreCase;
            if (extension.Equals(".txt", compareType) == false)
            {
                FilePath = null;
                VA.WriteToLog("Selected file is not a text file. Action canceled.");
            }
            else
                VA.WriteToLog(FilePath);
        }
        else
            VA.WriteToLog("No file selected. Action canceled.");
    }
    VA.WriteToLog("done");
}
You can see that after running this code the private bytes don't come back to the original count and the bytes in all heaps are roughly constant, which implies that there is a portion of unmanaged memory that was not released. Running this same inline function a few times consecutively doesn't cause further increases to the maximum observed private bytes or the unreleased memory. Once the main C# application (VoiceAttack) closes all the related memory (including the memory for the above code) is released. The bad news is that under normal circumstances the main application may be kept running indefinitely by the user, causing the allocated memory to remain unreleased.
For good measure I threw this same code into VS (with a pair of Thread.Sleep(5000) added before and after the using block for better graphical analysis) and built an executable to track with the Performance Monitor method, and the result is the same. There is an initial unmanaged memory jump for the OpenFileDialog and the allocated unmanaged memory never comes back down to the original value.
Does the memory and leak tracking methodology outlined above make sense? If YES, is there anything that can be done to properly release the unmanaged memory?
Does the memory and leak tracking methodology outlined above make sense?
No. You shouldn't expect unmanaged committed memory (Private Bytes) to always be released. For instance, processes have an unmanaged heap, which is kept around to serve subsequent allocations. And since Windows can page out your committed memory, it isn't critical to minimize each process's committed memory.
If repeated calls don't increase memory use, you don't have a memory leak; you have delayed initialization. Some components aren't initialized until you use them, so their memory usage isn't taken into account when you establish your baseline.
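To make that repeated-calls check concrete, here is a minimal sketch (the LeakCheck helper is illustrative, not part of the linked methodology) that runs a piece of code several times and logs Private Bytes plus the managed heap after each run; a real leak grows on every iteration, while delayed initialization only bumps the first reading:
using System;
using System.Diagnostics;

static class LeakCheck
{
    // Runs the action repeatedly and prints the counters after each run.
    public static void Measure(Action action, int iterations = 5)
    {
        for (int i = 0; i < iterations; i++)
        {
            action();
            GC.Collect();
            GC.WaitForPendingFinalizers();
            using (Process p = Process.GetCurrentProcess())
            {
                Console.WriteLine("run {0}: private bytes = {1:n0}, managed heap = {2:n0}",
                    i + 1, p.PrivateMemorySize64, GC.GetTotalMemory(true));
            }
        }
    }
}

// Usage sketch: LeakCheck.Measure(() => main());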

Release Memory After Loading Files

First off, I freely admit that I don't understand memory management very well, so I may be missing something obvious here. I've been looking through articles, etc. such as this, but haven't found anything helpful.
I'm looping through a collection of StorageFiles and creating a SlideTemplate for each one. (The SlideTemplate is just a UserControl containing an InkCanvas.) I'm then loading strokes into the InkCanvas from the file stream and adding the SlideTemplate to my ObservableCollection<SlideTemplate>. Here's the code:
foreach (var slide in slides)
{
    using (var stream = await slide.OpenSequentialReadAsync())
    {
        InkCanvas inkCanvas = new InkCanvas();
        SlideTemplate newSlide = new SlideTemplate();
        await inkCanvas.InkPresenter.StrokeContainer.LoadAsync(stream);
        newSlide.Set_Canvas(inkCanvas);
        newSlide.Set_PenSize(new Size(DataModel.Instance.CurrentProject.PenSize, DataModel.Instance.CurrentProject.PenSize));
        DataModel.Instance.CurrentProject.Slides.Add(newSlide);
    }
}
When it gets to about number 61 or 62 it throws an exception on the "new InkCanvas()", saying that it "could not create instance of... InkCanvas." Here's the memory usage from the VS Diagnostic tools when it throws the exception:
The memory usage just keeps climbing as it loops through them. Putting this:
inkCanvas = null;
GC.Collect();
GC.WaitForPendingFinalizers();
in the foreach actually loads all eighty slides successfully, but once it's done the memory usage stays put and won't drop again. And then going back out (which uses a similar method to save the ink strokes back to the file) and reloading the project throws the same error using about 150MB of memory. Can someone please explain what's at play here, or link to a helpful article?

Parallel Azure CloudBlockBlob-operations crashing (access violation)

The code sample below results once in a while in an access violation (1 out of 5,000 to 10,000 messages). Using a serial foreach instead of Parallel.ForEach seems to circumvent the problem.
public void DequeBatch<T>(int count)
{
    var messages = this.queueListen.ReceiveBatch(count);
    var received = new ConcurrentBag<KeyValuePair<Guid, T>>();
    Action<BrokeredMessage> UnwrapMessage = message =>
    {
        var blobName = message.GetBody<string>();
        var obj = Download<T>(blobName);
        received.Add(new KeyValuePair<Guid, T>(new Guid(blobName), obj));
    };
    // offending operation
    Parallel.ForEach(messages, new ParallelOptions { MaxDegreeOfParallelism = count }, UnwrapMessage);
}

public override T Download<T>(string blobName)
{
    CloudBlockBlob blob;
    lock (this.containerDownloadLock)
    {
        blob = this.containerDownload.GetBlockBlobReference(blobName);
    }
    T result;
    using (var stream = new MemoryStream())
    {
        blob.DownloadToStream(stream);
        stream.Position = 0;
        result = Decompress<T>(stream); // dehydrate an object of type T from a GZipStream
    }
    return result;
}
Q1: What is the offending part which makes the code above thread-unsafe?
Q2: What is the correct and safe approach to up- and download CloudBlockBlobs in parallel?
Edit
Today, the code outlined above ran into a deadlock. After hitting Break All in the debugger I observed that all of the worker threads executing blob.DownloadToStream(stream); were trapped in
System.Net.AutoWebProxyScriptEngine.EnterLock
except for one which was blocked (no exception or anything else) in
System.Net.WinHttpProxyFinder.WinHttpGetProxyForUrl
An exception of type System.AccessViolationException can only originate from unmanaged code or from unsafe managed code. What you have above is normal (i.e. safe) managed code, so you should not be scrutinizing that code at the moment, but instead focus on other possibilities:
1. Do you have any unmanaged or unsafe code in your app? If so, that might be a source of memory corruption, which in turn would cause an access violation. Test your app with page heap enabled via GFlags.
2. Execute your app under a debugger and collect a crash dump. Look at the crash dump and check whether there is familiar code in the call stack. WinDbg's !analyze command will produce the analysis for you automatically. You will have to know how to fix up symbols for your own and third-party libraries. An example is here.
3. It might be a bug in Microsoft's implementation of the blob client.
If you have reasonably excluded #1 and #2 and suspect #3 might be the issue, collect a crash dump and send it over to Microsoft; only they would be able to help in that case.

foreach mysteriously hangs on first item of a ResourceSet

Occasionally our site slows down and the RAM usage goes up massively. Then the app pool stops and I have to restart it. It's then OK for a few days before the RAM suddenly spikes again and the app pool soon stops. The CPU isn't high.
Before the app pool stops I've noticed that one of our pages always hangs. The line it hangs on is a foreach over a ResourceSet:
var englishLocations = Lang.Countries.ResourceManager.GetResourceSet(new CultureInfo("en-GB"),true,true);
foreach(DictionaryEntry entry2 in englishLocations) // THIS LINE HANGS
We have the same code deployed on a different box and this doesn't happen. The main differences between the two boxes are:
Bad box
Windows Server 2008 R2 Standard SP1
IIS 7.5.7600.16385
.NET 4.5
24GB RAM
Good box
Windows Server 2008 SP2
IIS 7.0.6000.16386 SP 2
.NET 4.0
24GB RAM
I've tried adding uploadReadAheadSize="0" to the web.config as described here :
http://rionscode.wordpress.com/2013/03/11/resolving-controller-blocking-within-net-4-5-and-asp-net-mvc/
Which didn't work.
Why would foreach hang? It's hanging on the very first item, actually on the foreach.
Thanks.
I know it is an old post, but nevertheless... There is the potential for a deadlock when iterating over a ResourceSet and at the same time retrieving some other object from the same Resources.
The problem is that when using a ResourceSet, the iterator takes out a lock on the internal resource cache of the ResourceReader (http://referencesource.microsoft.com/#mscorlib/system/resources/resourcereader.cs,1389) and then, in the method AllocateStringForNameIndex, takes out a lock on the reader itself (http://referencesource.microsoft.com/#mscorlib/system/resources/resourcereader.cs,447):
lock (_reader._resCache) {
key = _reader.AllocateStringForNameIndex(_currentName, out _dataPosition); // locks the reader
Getting an object takes out the same locks in the opposite order:
http://referencesource.microsoft.com/#mscorlib/system/resources/runtimeresourceset.cs,300
and http://referencesource.microsoft.com/#mscorlib/system/resources/runtimeresourceset.cs,335
lock(Reader) {
    ....
    lock(_resCache) {
        _resCache[key] = resLocation;
    }
}
This can lead to a deadlock. We had this exact issue recently.
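For anyone less familiar with the pattern, here is a stripped-down, self-contained sketch of that lock-ordering problem (the two lock objects only stand in for _resCache and the Reader; this is not the actual framework code):
using System;
using System.Threading;
using System.Threading.Tasks;

class LockOrderingDemo
{
    private static readonly object CacheLock = new object();  // stands in for _resCache
    private static readonly object ReaderLock = new object(); // stands in for the Reader

    static void Main()
    {
        // Thread 1: enumeration order (cache first, then reader).
        var t1 = Task.Run(() =>
        {
            lock (CacheLock)
            {
                Thread.Sleep(100); // widen the race window
                lock (ReaderLock) { }
            }
        });

        // Thread 2: single-resource lookup order (reader first, then cache).
        var t2 = Task.Run(() =>
        {
            lock (ReaderLock)
            {
                Thread.Sleep(100);
                lock (CacheLock) { }
            }
        });

        // With the sleeps in place each thread ends up waiting for the lock the other holds,
        // so this wait almost always times out.
        bool finished = Task.WaitAll(new[] { t1, t2 }, TimeSpan.FromSeconds(2));
        Console.WriteLine(finished ? "no deadlock this time" : "deadlocked, as described above");
    }
}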
I experienced a very similar problem.
Every once in a while IIS would hang, and I would see a number of requests just sitting there. They were all in the ExecuteRequestHandler state, with ManagedPipelineHandler as the module name.
After investigating with Process Explorer, I could see that all of them were sitting at mscorlib.dll!ResourceEnumerator.get_Entry; the rest of the stack trace suggested some NGen activity and then ntdll.dll!WaitForMultipleObjects.
My working hypothesis is that when multiple threads start enumerating those resources, we can run into a deadlock (possibly on some native code file generation), and all subsequent threads then just keep piling up.
To resolve it, I just created a critical section around this code block to ensure that it is executed sequentially; I haven't experienced the issue since.
private static readonly object ResourceLock = new object();

public static MvcHtmlString SerializeGlobalResources(this HtmlHelper helper)
{
    lock (ResourceLock)
    {
        // Existing code goes here ....
    }
}
Based upon another answer, and to give you some ideas: how about using a try/catch model?
Perhaps it hangs because that resource isn't available, is locked, has permission problems, etc.
var englishLocations = Lang.Countries.ResourceManager.GetResourceSet(new CultureInfo("en-GB"),true,true);
foreach(DictionaryEntry entry2 in englishLocations) // THIS LINE HANGS
ResourceManager CultureResourceManager = new ResourceManager("My.Language.Assembly", System.Reflection.Assembly.GetExecutingAssembly());
ResourceSet resourceSet = CultureResourceManager.GetResourceSet(new CultureInfo("sv-SE"), true, true);
try { resourceSet.GetString("my_language_resource"); }
catch (Exception ex) { /* from here, log your error ex to wherever you like */ }

Process Memory Size - Different Counters

I'm trying to find out how much memory my own .Net server process is using (for monitoring and logging purposes).
I'm using:
Process.GetCurrentProcess().PrivateMemorySize64
However, the Process object has several different properties that let me read the memory space used:
Paged, NonPaged, PagedSystem, NonPagedSystem, Private, Virtual, WorkingSet
and then the "peaks": which i'm guessing just store the maximum values these last ones ever took.
Reading through the MSDN definition of each property hasn't proved too helpful for me. I have to admit my knowledge regarding how memory is managed (as far as paging and virtual goes) is very limited.
So my question is obviously "which one should I use?", and I know the answer is "it depends".
This process will basically hold a bunch of lists in memory of things that are going on, while other processes communicate with it and query it for stuff. I'm expecting the server this will run on to require lots of RAM, so I'm querying this data over time to be able to estimate RAM requirements compared to the sizes of the lists it keeps inside.
So... Which one should I use and why?
If you want to know how much the GC uses try:
GC.GetTotalMemory(true)
If you want to know what your process uses from Windows (VM Size column in TaskManager) try:
Process.GetCurrentProcess().PrivateMemorySize64
If you want to know what your process has in RAM (as opposed to in the pagefile) (Mem Usage column in TaskManager) try:
Process.GetCurrentProcess().WorkingSet64
See here for more explanation on the different sorts of memory.
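Putting those three together, here is a minimal console sketch that logs the suggested values side by side so you can compare them in your own process:
using System;
using System.Diagnostics;

class MemoryReport
{
    static void Main()
    {
        using (Process p = Process.GetCurrentProcess())
        {
            // Managed heap only, after forcing a full collection.
            Console.WriteLine("GC total:      {0:n0} bytes", GC.GetTotalMemory(true));
            // Committed private memory (VM Size / Private Bytes).
            Console.WriteLine("Private bytes: {0:n0} bytes", p.PrivateMemorySize64);
            // Physical memory currently in use (Mem Usage / Working Set).
            Console.WriteLine("Working set:   {0:n0} bytes", p.WorkingSet64);
        }
    }
}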
OK, I found through Google the same page that Lars mentioned, and I believe it's a great explanation for people that don't quite know how memory works (like me).
http://shsc.info/WindowsMemoryManagement
My short conclusion was:
Private Bytes = The Memory my process has requested to store data. Some of it may be paged to disk or not. This is the information I was looking for.
Virtual Bytes = The Private Bytes, plus the space shared with other processes for loaded DLLs, etc.
Working Set = The portion of ALL the memory of my process that has not been paged to disk. So the amount paged to disk should be (Virtual - Working Set).
Thanks all for your help!
If you want the "Memory (Private Working Set)" value as shown in the Windows Vista Task Manager, which is the equivalent of Process Explorer's "WS Private Bytes", here is the code. It's probably best to run this infinite loop in a thread/background task for real-time stats.
using System.Threading;
using System.Diagnostics;

//namespace...class...method
Process thisProc = Process.GetCurrentProcess();
PerformanceCounter PC = new PerformanceCounter();
PC.CategoryName = "Process";
PC.CounterName = "Working Set - Private";
PC.InstanceName = thisProc.ProcessName;
while (true)
{
    String privMemory = (PC.NextValue()/1000).ToString()+"KB (Private Bytes)";
    //Do something with string privMemory
    Thread.Sleep(1000);
}
To get the value that Task Manager gives, my hat's off to Mike Regan's solution above. However, one change: it is not perfCounter.NextValue()/1000 but perfCounter.NextValue()/1024 (i.e. a real kilobyte). This gives the exact value you see in Task Manager.
Following is a full solution for displaying the 'memory usage' (Task manager's, as given) in a simple way in your WPF or WinForms app (in this case, simply in the title). Just call this method within the new Window constructor:
private void DisplayMemoryUsageInTitleAsync()
{
    origWindowTitle = this.Title; // set WinForms or WPF Window Title to field
    BackgroundWorker wrkr = new BackgroundWorker();
    wrkr.WorkerReportsProgress = true;
    wrkr.DoWork += (object sender, DoWorkEventArgs e) => {
        Process currProcess = Process.GetCurrentProcess();
        PerformanceCounter perfCntr = new PerformanceCounter();
        perfCntr.CategoryName = "Process";
        perfCntr.CounterName = "Working Set - Private";
        perfCntr.InstanceName = currProcess.ProcessName;
        while (true)
        {
            int value = (int)perfCntr.NextValue() / 1024;
            string privateMemoryStr = value.ToString("n0") + "KB [Private Bytes]";
            wrkr.ReportProgress(0, privateMemoryStr);
            Thread.Sleep(1000);
        }
    };
    wrkr.ProgressChanged += (object sender, ProgressChangedEventArgs e) => {
        string val = e.UserState as string;
        if (!string.IsNullOrEmpty(val))
            this.Title = string.Format("{0} ({1})", origWindowTitle, val);
    };
    wrkr.RunWorkerAsync();
}
Is this a fair description? I'd like to share this with my team so please let me know if it is incorrect (or incomplete):
There are several ways in C# to ask how much memory my process is using.
Allocated memory can be managed (by the CLR) or unmanaged.
Allocated memory can be virtual (stored on disk) or loaded (into RAM pages)
Allocated memory can be private (used only by the process) or shared (e.g. belonging to a DLL that other processes are referencing).
Given the above, here are some ways to measure memory usage in C#:
1) Process.VirtualMemorySize64: returns all the memory used by a process - managed or unmanaged, virtual or loaded, private or shared.
2) Process.PrivateMemorySize64: returns all the private memory used by a process - managed or unmanaged, virtual or loaded.
3) Process.WorkingSet64: returns all the private, loaded memory used by a process - managed or unmanaged.
4) GC.GetTotalMemory(): returns the amount of managed memory being tracked by the garbage collector.
Working set isn't a good property to use. From what I gather, it includes everything the process can touch, even libraries shared by several processes, so you're seeing double-counted bytes in that counter. Private memory is a much better counter to look at.
I'd also suggest monitoring how often page faults happen. A page fault occurs when you try to access data that has been moved from physical memory to the swap file, and the system has to read the page back from disk before you can access it.
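If you want to follow that suggestion from code, one option is the "Page Faults/sec" counter of the standard "Process" performance counter category; a rough sketch (console loop and interval chosen arbitrarily for illustration):
using System;
using System.Diagnostics;
using System.Threading;

class PageFaultMonitor
{
    static void Main()
    {
        using (Process p = Process.GetCurrentProcess())
        using (var counter = new PerformanceCounter("Process", "Page Faults/sec", p.ProcessName))
        {
            counter.NextValue(); // the first sample of a rate counter is always 0, so prime it
            for (int i = 0; i < 10; i++)
            {
                Thread.Sleep(1000);
                Console.WriteLine("page faults/sec: {0:n0}", counter.NextValue());
            }
        }
    }
}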
