C# - CPU and memory issues when using smaller timer interval [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have a timer in my application that fires every 10 ms. I know this is a very small value for a Windows timer, and I am aware of the precision issues. In any case, it causes the CPU usage to increase to 10% on average, and the memory usage slowly increases but then eventually drops back down to a lower value. Without the timer, there are no CPU or memory issues. From what I've read, the memory increasing and then decreasing is normal, because Windows does not release memory unless it has to. However, is this going to cause any performance problems for my application? Is 10% CPU usage going to cause problems as well? When I increase the interval to 100 ms it seems a little better, but I still see a similar effect. I need the timer interval to be as small as possible.

A 10% CPU usage is (in my opinion) not a big deal. I mean, it's OK, but definitely not the best; it's acceptable if achieving better performance would require a lot of extra work.
I have written a lot of apps that use 20% CPU and they work fine. However, a timer set to 10 ms is kind of odd. I am guessing you want to use it to constantly check for something. If so, don't poll with a 10 ms timer; it's better to use events. If you don't know events, here is a simple guide.
You declare an event like this:
public event EventHandler SomethingHappened;
For the purpose of this example, I will put the event in a class called MyClass. When you want to raise the event, i.e., make it occur, do this (the null-conditional skips the call if nobody has subscribed yet):
SomethingHappened?.Invoke(this, EventArgs.Empty);
Now let's see how you subscribe to the event. Of course, you need to create an object:
MyClass obj = new MyClass();
Then write a method to execute when the event fires. Its return type and parameters must match this signature:
public void DoSomething(object sender, EventArgs e) {
}
Now you do the subscription:
obj.SomethingHappened += DoSomething;
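Putting the pieces together, here is a minimal self-contained sketch (DoWork and the wiring around it are just for illustration):

public class MyClass
{
    // Raised whenever something noteworthy happens.
    public event EventHandler SomethingHappened;

    public void DoWork()
    {
        // ... do the actual work here ...
        // Raise the event; ?.Invoke skips the call if nobody subscribed.
        SomethingHappened?.Invoke(this, EventArgs.Empty);
    }
}

// Elsewhere, subscribe once and react whenever the event fires:
MyClass obj = new MyClass();
obj.SomethingHappened += DoSomething;
obj.DoWork(); // DoSomething runs when the event is raised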
For more information, here's an MSDN tutorial:
https://msdn.microsoft.com/en-us/library/aa645739(v=vs.71).aspx

SOLVED. The issue was that I had some code in the timer event handler that was slowing everything down. After replacing a few lines of code, CPU usage went back down to 0% and memory is no longer increasing. Hopefully this helps someone else in the future.

Related

What is the reason for high % Time in GC? For our app pool, in the APM perf-monitor tool we see it crossing 99% and staying there for hours [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 5 years ago.
Does this mean a memory leak? The % Time in GC goes to 99% even when no one is using the application. Could you please help me understand why this % Time in GC counter is behaving so strangely? Could this be a code issue? The application is in ASP.NET and uses services to call some methods.
For disposing Oracle connections, we use the standard dispose pattern throughout the application.
Could someone give me insights into this?
It is hard to diagnose this kind of problem without very detailed analysis and direct observation of the measurements, but on the surface what this suggests is that you have a very large number of objects that have been allocated and are retained for a long time, combined with some form of memory pressure. The net performance of a full gen-2 garbage collection is essentially bound by the number of live / reachable objects in your system. So: what is the memory consumption? Is it in the GB range? A large memory footprint doesn't necessarily mean a leak, but it can. You can use memory analysis tools (usually against memory dump files) to investigate what objects exist and how they are "rooted", i.e. what is stopping them from being collected.
The most common things that cause this are:
a huge object model loaded into memory and retained for a long period - for example, loading a large chunk of your database into very large arrays/lists and keeping them globally to "help performance"
a common case of the above is reusing a single "data context" / "unit of work" / etc in your DAL between many requests
inappropriate use of events, especially registering objects to listen to events on a long-lived object, which causes objects to stay reachable forever via an event that may never actually fire; for example, doing globalObj.SomeEvent += row.SomeHandler; for every row: once you've done this, row is reachable from globalObj, so if globalObj doesn't die, neither will row
a common case of the above is subscribing temporary objects to static events (and not unsubscribing them); static events don't die (see the sketch below)
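To make the static-event case concrete, here is a hypothetical sketch (GlobalEvents, Row, and the member names are invented for illustration):

public static class GlobalEvents
{
    // Static event: its invocation list lives for the lifetime of the process.
    public static event EventHandler SomeEvent;
}

public class Row
{
    public Row()
    {
        // Subscribing makes this Row reachable from the static event,
        // so the GC can never collect it while it stays subscribed.
        GlobalEvents.SomeEvent += OnSomeEvent;
    }

    private void OnSomeEvent(object sender, EventArgs e) { /* ... */ }

    // The fix: unsubscribe when the object is done.
    public void Detach() => GlobalEvents.SomeEvent -= OnSomeEvent;
}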
As for what it is in your case - if there even is an actual problem: only deeper analysis will show what.

Making C# as accurate as possible (maybe with hacks)? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
From what I know, it is fairly well known that C# cannot be accurate when timing is critical. I can certainly understand that, but I was hoping there were known game-style hacks to help with my issue.
tech:
I'm using an API for USB that sends data over a control transfer. In the API I get an event when an interrupt transfer occurs (one every 8 ms). I then simply fire off my control transfer at that exact time. What I have noticed, though not often, is that it takes more than 8 ms to fire. Most of the time it fires in a timely manner (< 1 ms after the interrupt event). The issue is that control transfers cannot happen at the same time as an interrupt transfer, so the control transfer must be done within 5 ms of the interrupt transfer so that it completes before the next interrupt transfer takes place.
So, USB details aside, my issue is getting an event to fire < 5 ms after another event. I'm hoping there is a solution for this, as gaming would also suffer from this sort of thing. For example, some games can be put into a high-priority mode; I wonder if that can be done in code? I may also try a profiler to back up my suspicions; it may be something I can turn off.
For those who want to journey down the technical road, the API is https://github.com/signal11/hidapi
In case someone has a trick or idea that may work, here are some of the considerations in my case.
1) USB interrupt polls happen every 8 ms and are only a few hundred µs long
2) the control transfer should happen once every 8-32 ms (the faster the better)
3) this control transfer can take up to 5 ms to complete
4) skipping cycles is OK for the control transfer
5) this is USB 1.1
This is not even a C# problem: you are on a multitasking, non-realtime OS, so you don't know when your program is going to be active; the OS can give priority to other tasks.
That said, you can raise the priority of the program's thread, but I doubt it will solve anything:
System.Threading.Thread.CurrentThread.Priority = ThreadPriority.Highest;
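If the "high priority mode" you've seen in games refers to the whole process rather than one thread, that can also be set in code. A minimal sketch using the standard System.Diagnostics API:

using System.Diagnostics;

// Raise the scheduling priority of the whole process. RealTime also
// exists but can starve the OS, so High is usually the sane ceiling.
Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.High;

Note that even at high priority the scheduler gives no hard guarantees; worst-case latencies just become rarer.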
When such restrictive timings must be met, you must work at the kernel level, for example as a driver.

C# Garbage collection? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 7 years ago.
I'm writing a service right now to automate a couple of my routines. I've really only started learning C# within the last month or so, so I'm still fairly new (but really liking it so far). I've designed my service so that it just runs a method on a 5-minute timer (the interval is configurable via AppSettings), runs a few checks, organizes a few things if need be, and that's it until the next interval.
I quickly realized the initial way I was doing things had a pretty bad memory leak, so I've rewritten much of it to wrap things in "using" blocks so they are disposed when done. The "using" blocks were recommended to me by a developer at work who's been really helpful, but I don't like to bother him too much with my personal projects when he's got work to do.
Currently I'm not really having a problem with memory usage, as it's only using about 25 MB of RAM, but when it starts it's only using about 8 MB, and with each polling interval it climbs. Once it reaches that 25 MB threshold, I can see it dip a little lower, which I'm assuming is garbage collection doing its thing, and then it climbs back up to 25 MB; rinse and repeat. So my application's memory usage is stable, but it just seems higher than it needs to be, so I'm curious.
Now, if I call GC.Collect manually, the memory usage drops to half. I realize this isn't ideal, as I've already done some research on this. But my question really comes down to: is there some sort of default threshold in .NET for memory usage and garbage collection? I ask because it would explain what I'm seeing.
I did look at this page on the Process.MaxWorkingSet Property, but I'm not sure if it would make a difference at all or just potentially cause me problems.
I also tried running a profiler against it, but to be honest this is still new to me and I wasn't entirely clear on what I was looking for.
Conditions for a garbage collection
Garbage collection occurs when one of the following conditions is true:
- The system has low physical memory.
- The memory that is used by allocated objects on the managed heap surpasses an acceptable threshold. This threshold is continuously adjusted as the process runs.
- The GC.Collect method is called. In almost all cases, you do not have to call this method, because the garbage collector runs continuously. This method is primarily used for unique situations and testing.
When the garbage collector detects that the survival rate is high in a generation, it increases the threshold of allocations for that generation, so the next collection gets a substantial size of reclaimed memory. The CLR continually balances two priorities: not letting an application's working set get too big and not letting the garbage collection take too much time.
These are quotes from the MSDN GC article.
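If you want to watch that threshold-adjustment behavior from inside your service, the GC class exposes a few counters. A minimal sketch using only standard APIs (the logging method itself is just for illustration):

using System;

// Logs per-generation collection counts and the current managed heap size.
// Called at each polling interval, this shows how often gen 0/1/2 collections
// run and where the heap plateaus (e.g. the ~25 MB level described above).
static void LogGcStats()
{
    Console.WriteLine(
        "gen0={0} gen1={1} gen2={2} heap={3:N0} bytes",
        GC.CollectionCount(0),
        GC.CollectionCount(1),
        GC.CollectionCount(2),
        GC.GetTotalMemory(forceFullCollection: false));
}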

Can using many timers cause an application crash?

I have an application with 6 timers. Each timer has a different interval: 1 s, 1 s, 3 s, 3 s, 3 s, 3 s, respectively. CPU usage is always 2% to 3%.
On my PC it is fine, thanks to my PC's capability.
But I suspect it may cause the application problems if the PC's capability is low.
Is there a more effective way to use timers, or some other way to run this in the background?
The reason I use timers is that they query the database (to get a total amount) whenever the user adds, edits, or deletes a record, not just product records but any record.
One 1 s timer shows a date and time label.
The other 1 s timer interacts with a DataGridView, updating the whole column.
The other timers get data from the MySQL server. By my estimate, the maximum number of records would be 10.
Thanks
It's unclear why you think you need multiple timers here, and you don't even say which timer implementation you are using, which would likely make a difference.
Employing a single timer that ticks at a reasonable minimal interval (1 s, 100 ms, etc.) and dispatching the slower tasks from it would reduce the overall overhead and would likely serve your purpose better, as sketched below. Of course, that's said without any indication of what you're actually trying to achieve.
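Here is a minimal sketch of that consolidation. It assumes a WinForms app (since a DataGridView is mentioned) and uses System.Windows.Forms.Timer; the Update*/Refresh* methods are placeholder names:

// One 1-second timer drives everything; the slower jobs run on
// multiples of the base tick instead of having their own timers.
private readonly System.Windows.Forms.Timer _timer =
    new System.Windows.Forms.Timer { Interval = 1000 };
private int _ticks;

private void Form1_Load(object sender, EventArgs e)
{
    _timer.Tick += OnTick;
    _timer.Start();
}

private void OnTick(object sender, EventArgs e)
{
    _ticks++;
    UpdateClockLabel();           // every 1 s
    UpdateGridColumn();           // every 1 s
    if (_ticks % 3 == 0)
        RefreshTotalsFromMySql(); // every 3 s, replacing the four 3 s timers
}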
It sounds as if you may have multiple issues, but to answer your question: running multiple timers will not cause your application to crash. How you implement the timers, and whether you lock the code blocks that are called when a timer fires, are what matters. If you allow a code block to be executed again before a previous call to it has finished, your application can become unstable (see the re-entrancy guard sketched below). You should read up on timers and perhaps even threads. Without knowing more about what you are doing, it is difficult to give a more definitive answer to your question.
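A common way to implement that guard, sketched with System.Timers.Timer (OnElapsed and DoPolling are placeholder names):

using System.Threading;
using System.Timers;

private static readonly object _gate = new object();

private static void OnElapsed(object sender, ElapsedEventArgs e)
{
    // Skip this tick if the previous one is still running,
    // instead of letting calls pile up on top of each other.
    if (!Monitor.TryEnter(_gate)) return;
    try
    {
        DoPolling(); // placeholder for the real per-tick work
    }
    finally
    {
        Monitor.Exit(_gate);
    }
}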

Setting a timeout on a DLL usage [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
I am writing a DLL in C# and I need (not want) to limit the usage of the DLL within a specific time.
I want it to stop working N hours after the start of its usage, while notifying the user.
And if the DLL is not used for M hours, also stop the usage (as a sort of keep-alive).
By "usage" I mean calling one of the functions of the DLL.
Using exceptions always collides with the availability of the DLL (I can't keep it blocking).
I would very much appreciate any help I can get; I am not very experienced with C#.
A solution in any language would be great. Obviously I am running on a Windows computer, but its components (service pack etc.) can easily be modified if any modification is needed.
Your help is very much appreciated, thank you.
EDIT
I think I wasn't clear.
What I need is to notify the user when the usage period of the DLL's functions ends, so he can initialize those functions again if he wants.
The DLL is not unloaded.
And I don't want to interfere with the main program's process, the one using the DLL functions.
Use a timer (keep a reference to it so it isn't garbage-collected; note that N hours is N * 60 * 60 * 1000 ms):
System.Threading.Timer t = new System.Threading.Timer(End, null, N * 60 * 60 * 1000, Timeout.Infinite);
private static void End(object state)
{
    // Tell the user and end the program
}
For the "not used" case, use another timer:
System.Threading.Timer t2 = new System.Threading.Timer(End, null, M * 60 * 60 * 1000, Timeout.Infinite);
Now, I don't know what "used" means for you, but in any function that is considered a "use", reset it:
t2.Change(M * 60 * 60 * 1000, Timeout.Infinite);
To end the program you can use:
Environment.Exit(0);
Keep in mind that if you have more than one thread you might get a zombie, so make sure that all threads other than the main thread are background threads, or that you abort them or signal them to close.
Now I understand that you only want to notify the user...
so what I suggest is using:
MessageBox.Show("whatever message you want to give the user");
(The this owner argument was dropped because End is static and has no form instance in scope.)
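Putting both timers together, here is a minimal sketch of a wrapper that raises an event instead of killing the process (the class and member names are invented for illustration):

using System;
using System.Threading;

// Hypothetical helper: raises UsageExpired N hours after construction,
// or after M hours without any call to Touch().
public class UsageLimiter
{
    private readonly Timer _absolute;  // total-usage limit, kept as a field so it isn't collected
    private readonly Timer _idle;      // inactivity limit
    private readonly int _idleMs;

    public event EventHandler UsageExpired;

    public UsageLimiter(int nHours, int mHours)
    {
        _idleMs = mHours * 60 * 60 * 1000;
        _absolute = new Timer(OnExpired, null, nHours * 60 * 60 * 1000, Timeout.Infinite);
        _idle = new Timer(OnExpired, null, _idleMs, Timeout.Infinite);
    }

    // Call at the start of every DLL function that counts as a "use".
    public void Touch() => _idle.Change(_idleMs, Timeout.Infinite);

    private void OnExpired(object state) =>
        UsageExpired?.Invoke(this, EventArgs.Empty);
}

The host program subscribes to UsageExpired and decides what to do, which keeps the DLL from interfering with the main program's process, as the edit requires.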
