How to decrease the USER objects and handles for a console application - C#

I have a console application that creates PDFs for a set of records. When the count goes above 9000, I get the error "Error creating window handle". At the application level I am using 6 threads.
As I observed in Task Manager, the handle count and the USER object count keep increasing.
I have called obj.Dispose() wherever I created an object. So my question is: how do I decrease the USER objects and handles?
I am using a console application targeting the .NET Framework 3.5 in C#.
Update:
Below is the code I have used:
Thread FirstTreadPDFs = new Thread(() => objPDFsProcess.DoGeneratePDFsProcess());
FirstTreadPDFs.Start();
//Thread2
Thread SecondTreadPDFs = new Thread(() => objPDFsProcess.DoGeneratePDFsProcess());
SecondTreadPDFs.Start();
//Thread3
Thread ThirdTreadPDFs = new Thread(() => objPDFsProcess.DoGeneratePDFsProcess2());
ThirdTreadPDFs.Start();
//Thread4
Thread FourthTreadPDFs = new Thread(() => objPDFsProcess.DoGeneratePDFsProcess());
FourthTreadPDFs.Start();
//Thread5
Thread FifthTreadPDFs = new Thread(() => objPDFsProcess.DoGeneratePDFsProcess1());
FifthTreadPDFs.Start();
FirstTreadPDFs.Join();
SecondTreadPDFs.Join();
ThirdTreadPDFs.Join();
FourthTreadPDFs.Join();
FifthTreadPDFs.Join();
DataSet dsHeader1 = new DataSet();
//Pending Customers we need to get to generate PDFs
dsHeader1 = objCustStatementDAL.GetCustStatementdetailsForPDF(IsEDelivery, 1);
if (dsHeader1 != null && dsHeader1.Tables.Count > 0)
{
    if (dsHeader1.Tables[0].Rows.Count > 0)
    {
        writerLog.WriteLine(DateTime.Now + " Trying to get Pending Records");
        objPDFsProcess.DoGeneratePDFsProcess2();
        writerLog.WriteLine(DateTime.Now + " Exit Trying to get Pending Records block");
    }
}
dsHeader1.Dispose();
After processing 9000+ records, the "Exit Trying" line executes and then the application stops.
Wherever I use an object I have placed a Dispose call.

From your question it is not really clear what you are doing, but if I'm guessing right, you are keeping too many file handles open.
So here it comes. If you open a StreamReader, for example, you open a file handle, which happens to be an unmanaged and limited resource. Unmanaged means the .NET runtime can't keep tabs on its use, and even if you lose the reference to the StreamReader object, the handle won't be closed. For that you need to call Dispose (and if you are creating a class that uses native resources, implement the IDisposable interface, which contains the Dispose method). You can make the call explicitly, but the best option for everyone is the using block. That way your handles will be closed properly every time you leave the scope of the block, whatever the means.
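A minimal sketch of the pattern (the file name here is just an illustration):
using System;
using System.IO;

class UsingExample
{
    static void ReadFirstLine()
    {
        // The using block guarantees Dispose() runs when the scope is left,
        // even if an exception is thrown, so the file handle is released right away.
        using (StreamReader reader = new StreamReader("records.txt"))
        {
            string firstLine = reader.ReadLine();
            Console.WriteLine(firstLine);
        }
        // At this point the underlying file handle is already closed.
    }
}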
Of course, if you really are trying to keep that many handles open and in use at once, you need to work around the limit somehow, and that would still involve closing the ones not currently in use.

Related

Is this behavior due to unpredictable timings of garbage collection when it cleans up? [duplicate]

When I change the selection in a list, the dialog gets updated, reading from the file associated with that item in the list. I fixed the problem by using a using statement in the code below.
private void ViewModel_SelectedNoteChanged(object sender, EventArgs e)
{
    try
    {
        contentRichTextBox.Document.Blocks.Clear();
        if (VM.SelectedNote != null)
        {
            if (!string.IsNullOrEmpty(VM.SelectedNote.FileLocation))
            {
                using (FileStream fileStream = new FileStream(VM.SelectedNote.FileLocation, FileMode.Open))
                {
                    var contents = new TextRange(contentRichTextBox.Document.ContentStart, contentRichTextBox.Document.ContentEnd);
                    contents.Load(fileStream, DataFormats.Rtf);
                }
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
If I don't use the using statement above, it's hit or miss: sometimes it reads the contents from the file and displays them, otherwise it throws an exception.
System.IO.IOException: 'The process cannot access the file
'D:\EClone\bin\Debug\netcoreapp3.1\3.rtf'
because it is being used by another process.'
My question is: why the unpredictable behavior without the using statement? Is it because of the known unpredictability of garbage collection (when it decides to clean up), since we can never be sure when the garbage collector will clean up the file object? Is this a good example of the non-deterministic nature of garbage collection? Or is it something else?
This is exactly due to incorrect usage of IDisposable and the GC.
Without using:
you open the file (and thus lock it) and store a reference in a variable;
the method exits, but the file is still open and locked;
if the GC happened to run before you try to open the file again, everything works fine, because the file wrapper's finalizer contains code to close (and unlock) the file;
if you try to access the same file before the GC has run, you get the exception.
With using, you close and unlock the file explicitly by calling Dispose (using calls it for you). Dispose closes and unlocks the file immediately, without waiting for the GC, so with using your code will always work as expected.
Moreover, since you can't control when the GC runs, the code without using is unstable by design: even if it works fine today, it may stop working tomorrow because the system has more free memory (so the GC runs less often than it does today), or because of a .NET update (with some GC changes/improvements), and so on.
So, in conclusion: if you have something that implements IDisposable, always call Dispose when you no longer need the object (with using or manually). You can omit calling Dispose only if the documentation or guidance explicitly allows it (as it does, for example, for the Task class).
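A minimal sketch of the difference (the path and the read logic are placeholders):
using System.IO;

class LockDemo
{
    static void WithoutUsing(string path)
    {
        // The stream is never disposed; the OS handle stays open (and the file
        // stays locked) until the GC happens to finalize the FileStream.
        var stream = new FileStream(path, FileMode.Open);
        // ... read from the stream ...
    }   // returning does not release the lock right away

    static void WithUsing(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open))
        {
            // ... read from the stream ...
        }   // Dispose() runs here and the lock is released immediately
    }
}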

Do I have to call Application.ExitThread()?

using System.Windows.Forms;

public class App
{
    [STAThread]
    public static void Main()
    {
        string fname;
        using (var d = new OpenFileDialog())
        {
            if (d.ShowDialog() != DialogResult.OK)
            {
                return;
            }
            fname = d.FileName;
        }
        //Application.ExitThread();
        for (;;)
            ;
    }
}
The above code shows me a file dialog. Once I select a file and press open, the for loop is executed, but the (frozen) dialog remains.
Once I uncomment Application.ExitThread() the dialog disappears as expected.
Does that work as intended? Why doesn't using make the window disappear? Where can I find more info about this?
You have discovered the primary problem with single-threaded applications... long running operations freeze the user interface.
Your DoEvents() call essentially "pauses" your code and gives other operations, like the UI, a chance to run, then resumes. The problem is that your UI is now frozen again until you call DoEvents() again. Actually, DoEvents() is a very problematic approach (some call it evil). You really should not use it.
You have better options.
Putting your long running operation in another thread helps to ensure that the UI remains responsive and that your work is done as efficiently as possible. The processor is able to switch back and forth between the two threads to give the illusion of simultaneous execution without the difficulty of full-blown multi-processes.
One of the easier ways to accomplish this is to use a BackgroundWorker, though they have generally fallen out of favor (for reasons I'm not going to get into in this post: further reading). They are still part of .NET, however, and have a lower learning curve than other approaches, so I'd still suggest that new developers play around with them in hobby projects.
The best approach currently is .NET's Task library. If your long-running operation is already in a thread (for example, it's a database query and you are just waiting for it to complete), and if the library supports it, then you can take advantage of Tasks using the async keyword and not have to think twice about it. Even if it's not already in a thread or in a supported library, you can still spin up a new Task and have it executed on a separate thread via Task.Run(). .NET Tasks have the advantage of baked-in language support and a lot more, like coordinating multiple Tasks and chaining Tasks together.
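For illustration, a rough sketch of the Task-based approach in a WinForms handler (the method and names here are invented, not from the question):
using System;
using System.Threading.Tasks;
using System.Windows.Forms;

public class MainForm : Form
{
    // Hypothetical button handler: the slow work runs on a thread-pool thread
    // while the UI message loop keeps pumping.
    private async void StartButton_Click(object sender, EventArgs e)
    {
        string result = await Task.Run(() => DoLongRunningWork());
        MessageBox.Show(result);    // back on the UI thread after the await
    }

    private string DoLongRunningWork()
    {
        System.Threading.Thread.Sleep(5000);    // stand-in for the real work
        return "done";
    }
}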
JDB already explained in his answer why (generally speaking) your code doesn't work as expected. Let me add a small bit to suggest a workaround (for your specific case, and for when you just need to use a system dialog and then go on as if it were a console application).
You're trying to use Application.DoEvents(); OK, it seems to work, and in your case you do not have re-entrant code. However, are you sure that all relevant messages are correctly processed? How many times should you call Application.DoEvents()? Are you sure you initialize everything correctly (I'm talking about the ApplicationContext)? The second problem is more pragmatic: OpenFileDialog needs COM, COM (here) needs STAThread, and STAThread needs a message pump. I can't tell you in which way it will fail, but it certainly may fail.
First of all, note that applications usually start the main message loop using Application.Run(). You don't expect to see new MyWindow().ShowDialog(), right? Your example is no different: let the Application.Run(Form) overload create the ApplicationContext for you (and handle the HandleDestroyed event when the form closes, which will finally call - surprise - Application.ExitThread()). Unfortunately OpenFileDialog does not inherit from Form, so you have to host it inside a dummy form to use Application.Run().
You do not need to explicitly call dlg.Dispose() (let WinForms manage object lifetimes) if you add the dialog to the form with the designer.
using System;
using System.Windows.Forms;

public class App
{
    [STAThread]
    public static void Main()
    {
        string fname = AskForFile();
        if (fname == null)
            return;

        LongRunningProcess(fname);
    }

    private static string AskForFile()
    {
        string fileName = null;

        var form = new Form() { Visible = false };
        form.Load += (o, e) =>
        {
            using (var dlg = new OpenFileDialog())
            {
                if (dlg.ShowDialog() == DialogResult.OK)
                    fileName = dlg.FileName;
            }

            ((Form)o).Close();
        };

        Application.Run(form);

        return fileName;
    }

    private static void LongRunningProcess(string fname)
    {
        // Placeholder for whatever long-running work follows the dialog.
    }
}
No, you don't have to call Application.ExitThread().
Application.ExitThread() terminates the calling thread's message loop and forces the destruction of the frozen dialog. Although "that works", it's better to unfreeze the dialog if the cause of the freeze is known.
In this case, pressing Open seems to fire a close event which doesn't get any chance to finish. Application.DoEvents() gives it that chance and makes the dialog disappear.

Parallel and COM (Outlook MAPI) acquiring and releasing?

I've already checked some answers, but I'm still not convinced what the right approach is for acquiring and releasing COM objects in parallel.
In particular, I use Parallel.ForEach to increase performance, and inside it I make calls to MS Outlook (2010, Exchange Server). However, when releasing the COM objects I occasionally get COMExceptions.
What is the right approach to working with COM objects with the Parallel library?
System.Threading.Tasks.Parallel.ForEach(myList, myItem =>
{
    String freeBusySlots = "";
    Outlook.Recipient myReceipient = null;
    try
    {
        myReceipient = namespaceMAPI.CreateRecipient(myItem.ToString());
    }
    catch (Exception ex)
    {
        ...
    }
    finally
    {
        if (myReceipient == null)
        {
            ...
        }
        Marshal.ReleaseComObject(myReceipient); // -> I get an exception here sometimes ... how to avoid this
        myReceipient = null;
    }
}); // Parallel.ForEach
Outlook Object Model cannot be used from secondary threads. Sometimes it works, but it tends to bomb out at the most inappropriate moment.
As of Outlook 2013, Outlook will immediately raise an error if an OOM object is accessed from a secondary thread.
If your code is running from another application, keep in mind that all calls will be serialized to the main Outlook thread anyway, so there is really no point in using multiple threads.
Also note that Extended MAPI (C++ or Delphi only) or Redemption (which wraps Extended MAPI and can be accessed from any language - I am its author) can be used from multiple threads, but your mileage will vary depending on the particular MAPI provider (IMAP4 store provider is the worst).
As a general rule you should never use ReleaseComObject. Instead, just wait for the RCW to be collected and let the GC do the work for you. Using this API correctly is extremely hard.
Additionally, it's very likely that all of Outlook's COM objects live in the STA. If that is the case, there is nothing gained by operating over them in parallel. Every call made from a background thread will simply be marshalled back to the foreground thread for processing. Hence the background thread adds no value, just confusion.
I'm not 100% certain these are STA objects but I'd be very surprised if they weren't.

Thread Monitor class in C#

In my C# application, multiple clients access the same server. To process one client at a time, the code below was written. In the code I used the Monitor class and also the Queue class. Will this code affect performance? If I use the Monitor class, should I remove the Queue class from the code?
Sometimes the remote server machine where my application runs as a service is completely down. Is the code below the reason for that? All the clients go into a queue, and when I check with the netstat -an command at the command prompt, for 8 clients it shows about 50 connections stuck in TIME_WAIT.
Below is my code where the client accesses the server:
if (Id == "")
{
System.Threading.Monitor.Enter(this);
try
{
if (Request.AcceptTypes == null)
{
queue.Enqueue(Request.QueryString["sessionid"].Value);
string que = "";
que = queue.Dequeue();
TypeController.session_id = que;
langStr = SessionDatabase.Language;
filter = new AllThingzFilter(SessionDatabase, parameters, langStr);
TypeController.session_id = "";
filter.Execute();
Request.Clear();
return filter.XML;
}
else
{
TypeController.session_id = "";
filter = new AllThingzFilter(SessionDatabase, parameters, langStr);
filter.Execute();
}
}
finally
{
System.Threading.Monitor.Exit(this);
}
}
Locking this is pretty wrong; it won't work at all if every thread uses a different instance of whatever class this code lives in. It isn't clear from the snippet whether that's the case, but fix that first. Create a separate object just to store the lock and make it static, or give it the same scope as the shared object you are trying to protect (also not clear), as in the sketch below.
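A rough sketch of what that can look like (the class and member names are made up):
public class RequestProcessor
{
    // One static lock object shared by every instance, so all threads
    // contend on the same lock no matter which instance they came in on.
    private static readonly object queueLock = new object();

    public void Process(string sessionId)
    {
        lock (queueLock)
        {
            // touch the shared queue / session state only inside this block
        }
    }
}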
You might still have trouble, since this sounds like a deadlock rather than a race. Deadlocks are pretty easy to troubleshoot with the debugger since the code is stuck and not executing at all. Debug + Break All, then Debug + Windows + Threads. Locate the worker threads in the thread list. Double-click one to select it and use Debug + Call Stack to see where it got stuck. Repeat for the other threads. Look back through the stack traces to see where one of them acquired a lock and compare with the other threads to see which lock they are blocking on.
That could still be tricky if the deadlock is intricate and involves multiple interleaved locks, in which case logging might help. Really hard-to-diagnose mandelbugs might require a rewrite that cuts back on the amount of threading.

Issue writing to single file in Web service in .NET

I have created a web service in .NET 2.0, C#. I need to log some information to a file whenever different methods are called by the web service clients.
The problem comes when one user process is writing to a file and another process tries to write to it. I get the following error:
The process cannot access the file because it is being used by another process.
The solutions that I have tried to implement in C#, and that failed, are listed below.
Implemented singleton class that contains code that writes to a file.
Used lock statement to wrap the code that writes to the file.
I have also tried to use open source logger log4net but it also is not a perfect solution.
I know about logging to system event logger, but I do not have that choice.
I want to know if there is a complete and reliable solution to this problem.
The locking is probably failing because your webservice is being run by more than one worker process.
You could protect the access with a named mutex, which is shared across processes, unlike the locks you get by using lock(someobject) {...}:
Mutex mutex = new Mutex(false, "mymutex");
mutex.WaitOne();
// access file
mutex.ReleaseMutex();
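If the file access can throw, it is safer to release in a finally block so the mutex is never left held; a sketch with the same made-up mutex name:
Mutex mutex = new Mutex(false, "mymutex");
mutex.WaitOne();
try
{
    // access file
}
finally
{
    mutex.ReleaseMutex();
}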
You don't say how your web service is hosted, so I'll assume it's in IIS. I don't think the file should be accessed by multiple processes unless your service runs in multiple application pools. Nevertheless, I guess you could get this error when multiple threads in one process are trying to write.
I think I'd go for the solution you suggest yourself, Pradeep, build a single object that does all the writing to the log file. Inside that object I'd have a Queue into which all data to be logged gets written. I'd have a separate thread reading from this queue and writing to the log file. In a thread-pooled hosting environment like IIS, it doesn't seem too nice to create another thread, but it's only one... Bear in mind that the in-memory queue will not survive IIS resets; you might lose some entries that are "in-flight" when the IIS process goes down.
Other alternatives certainly include using a separate process (such as a Service) to write to the file, but that has extra deployment overhead and IPC costs. If that doesn't work for you, go with the singleton.
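A rough sketch of that single-writer idea, sticking to .NET 2.0-era types (the file name and class name are invented):
using System.Collections.Generic;
using System.IO;
using System.Threading;

public static class AsyncFileLogger
{
    private static readonly Queue<string> queue = new Queue<string>();
    private static readonly object sync = new object();

    static AsyncFileLogger()
    {
        // A single background thread owns the file handle for the process lifetime.
        Thread writer = new Thread(WriteLoop);
        writer.IsBackground = true;
        writer.Start();
    }

    public static void Log(string message)
    {
        lock (sync)
        {
            queue.Enqueue(message);
            Monitor.Pulse(sync);    // wake the writer thread
        }
    }

    private static void WriteLoop()
    {
        using (StreamWriter file = new StreamWriter("service.log", true))
        {
            while (true)
            {
                string message;
                lock (sync)
                {
                    while (queue.Count == 0)
                        Monitor.Wait(sync);    // sleep until Log() pulses
                    message = queue.Dequeue();
                }
                file.WriteLine(message);
                file.Flush();
            }
        }
    }
}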
Maybe write a "queue line" of sorts for writing to the file, so when you try to write to the file it keeps checking to see if the file is locked, if it is - it keeps waiting, if it isn't locked - then write to it.
You could push the results onto an MSMQ Queue and have a windows service pick the items off of the queue and log them. It's a little heavy, but it should work.
Joel and charles. That was quick! :)
Joel: When you say "queue line" do you mean creating a separate thread that runs in a loop to keep checking the queue as well as write to a file when it is not locked?
Charles: I know about MSMQ and windows service combination, but like I said I have no choice other than writing to a file from within the web service :)
thanks
pradeep_tp
The trouble with all the approaches tried so far is that multiple threads can enter the code.
That is, multiple threads try to acquire and use the file handle, hence the errors. You need a single thread, outside of the worker threads, to do the work, with a single file handle held open.
Probably the easiest thing to do would be to create a thread during application start in Global.asax and have it listen on a synchronized in-memory queue (System.Collections.Generic.Queue). Have that thread open and own the lifetime of the file handle; only that thread can write to the file.
Client requests in ASP.NET will lock the queue momentarily, push the new logging message onto it, then unlock.
The logger thread polls the queue periodically for new messages; when messages arrive, it reads them and writes the data to the file.
To show what I am trying to do in my code, the following is the singleton class I have implemented in C#:
public sealed class FileWriteTest
{
    private static volatile FileWriteTest instance;
    private static object syncRoot = new Object();
    private static Queue logMessages = new Queue();
    private static ErrorLogger oNetLogger = new ErrorLogger();

    private FileWriteTest() { }

    public static FileWriteTest Instance
    {
        get
        {
            if (instance == null)
            {
                lock (syncRoot)
                {
                    if (instance == null)
                    {
                        instance = new FileWriteTest();
                        Thread MyThread = new Thread(new ThreadStart(StartCollectingLogs));
                        MyThread.Start();
                    }
                }
            }
            return instance;
        }
    }

    private static void StartCollectingLogs()
    {
        //Infinite loop
        while (true)
        {
            cdoLogMessage objMessage = new cdoLogMessage();
            if (logMessages.Count != 0)
            {
                objMessage = (cdoLogMessage)logMessages.Dequeue();
                oNetLogger.WriteLog(objMessage.LogText, objMessage.SeverityLevel);
            }
        }
    }

    public void WriteLog(string logText, SeverityLevel errorSeverity)
    {
        cdoLogMessage objMessage = new cdoLogMessage();
        objMessage.LogText = logText;
        objMessage.SeverityLevel = errorSeverity;
        logMessages.Enqueue(objMessage);
    }
}
When I run this code in debug mode (simulating just one user access), I get a "stack overflow" error at the line where the queue is dequeued.
Note: In the above code, ErrorLogger is a class that writes to the file, and objMessage is an entity class that carries the log message.
Alternatively, you might want to do error logging into the database (if you're using one)
Koth,
I have implemented a Mutex lock, which has removed the "stack overflow" error. I still have to do load testing before I can conclude whether it works fine in all cases.
I was reading about Mutex objects on one of the websites, which said that a Mutex affects performance. I want to know one thing about locking through a Mutex.
Suppose user Process1 is writing to a file and at the same time user Process2 tries to write to the same file. Since Process1 has put a lock on the code block, will Process2 keep trying, or just die after the first attempt itself?
thanks
pradeep_tp
It will wait until the mutex is released....
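If waiting forever is not acceptable, WaitOne also takes a timeout and returns false when the mutex could not be acquired in time; a sketch (same made-up mutex name as above):
using System.Threading;

class TimedWriter
{
    static void WriteWithTimeout()
    {
        using (Mutex mutex = new Mutex(false, "mymutex"))
        {
            // Returns false if the mutex was not acquired within 5 seconds.
            if (mutex.WaitOne(5000, false))
            {
                try
                {
                    // write to the file
                }
                finally
                {
                    mutex.ReleaseMutex();
                }
            }
            else
            {
                // give up (or retry later) instead of blocking forever
            }
        }
    }
}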
Joel: When you say "queue line" do you mean creating a separate thread that runs in a loop to keep checking the queue as well as write to a file when it is not locked?
Yeah, that's basically what I was thinking: have another thread with a while loop that waits until it can get access to the file, then saves and ends.
But you would have to do it in a way where the first thread to start looking gets access first, which is why I say queue.
