In a high-performance C# program (where performance is the number one concern), what sacrifices would be made in the code?
For example, how would exception handling change (fewer exceptions thrown, but the same number of exceptions caught)?
I ask because I used to work at a monitoring company where a collector was written.
Thanks
Usually you should throw exceptions only when strictly necessary, as throwing them can hurt performance.
What’s the alternative to exception handling then?
You write code that checks the input values and return values of each of your methods and passes a result back up to the caller, as sketched below.
http://codebetter.com/raymondlewallen/2004/12/13/performance-issues-with-exception-handling/
http://www.codeproject.com/Articles/11265/Performance-implications-of-Exceptions-in-NET
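A minimal sketch of that style, with hypothetical UserLookup/TryGetUserName names; failure is reported through the return value and checked by the caller rather than thrown:

using System;
using System.Collections.Generic;

public static class UserLookup
{
    private static readonly Dictionary<int, string> Users =
        new Dictionary<int, string> { { 1, "Alice" }, { 2, "Bob" } };

    // Validates the input and reports failure through the return value.
    public static bool TryGetUserName(int id, out string name)
    {
        name = string.Empty;
        if (id <= 0)
            return false; // invalid input: report it, don't throw
        return Users.TryGetValue(id, out name);
    }

    public static void Main()
    {
        // The caller checks the result and passes it up; no try/catch needed.
        if (TryGetUserName(2, out var name))
            Console.WriteLine("Found: " + name);
        else
            Console.WriteLine("User not found.");
    }
}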
You can measure the performance yourself and see how much exception handling slows the code down. Generally, throwing an exception is very wasteful, but if no exception is thrown, a try ... catch block slows the code down only slightly.
Another point: LINQ is slower than simple iteration.
Exceptions should only be thrown in exceptional cases. They should not be used for flow control. A good goal is to be able to run your application under the debugger with normal use and exceptions should not be thrown.
If that's the case, in a high performance app, the cost of exceptions shouldn't be as much of a concern if they are truly exceptional.
Exception handling is slow. By default, Exception instances are, as the name implies, exceptional.
However, if you expect them to occur a lot, you can decide to throw fewer Exception objects and handle the logical failures in another way. In other words, they become expected.
In the cases where they are actually expected, you can handle them earlier in the stack, for example by using bool or enum return values. Rethrowing exceptions also costs performance.
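A small sketch of the enum return-value idea; ParseResult, AgeParser, and ParseAge are hypothetical names:

public enum ParseResult { Success, Empty, NotANumber }

public static class AgeParser
{
    // The "expected" failure cases become ordinary return values
    // instead of thrown exceptions.
    public static ParseResult ParseAge(string input, out int age)
    {
        age = 0;
        if (string.IsNullOrWhiteSpace(input))
            return ParseResult.Empty;
        if (!int.TryParse(input, out age))
            return ParseResult.NotANumber;
        return ParseResult.Success;
    }
}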
Exceptions are the best way to handle exceptional circumstances. If some condition is expected then check for it with a conditional statement like if.
Apart from these generic rules, how much maintainability are you willing to sacrifice for performance? You could always write the code in assembler.
Related
It's often said that you shouldn't use exceptions for regular error handling because of bad performance. My guess is that that bad performance is caused by having to instantiate a new exception object, generate a stack trace, etc. So why not have lightweight exceptions? Code like this is logically sound:
string ageDescription = "Five years old";
try {
    int age = int.Parse(ageDescription);
}
catch (Exception) {
    // Couldn't parse age; handle parse failure
}
And yet we're recommended to use TryParse instead to avoid the overhead of the exception. But if the exception were just a static object that got initialized when the thread started, all the code throwing the exception would need to do is set an error code number and maybe an error string. No stack trace, no new object instantiation. It would be a "lightweight exception", and so the overhead for using exceptions would be greatly reduced. Why don't we have such lightweight exceptions?
The exception object instantiation is the smallest problem in the whole case. The real performance killer is that the control flow must stop executing your program, look up the call stack for handlers (catch blocks) that can catch the thrown exception, execute the matching ones (and the intervening finally blocks), rethrow exceptions when told to, and then continue executing the program at the right place, i.e. after the last handler. Your idea of "lightweight" exceptions would change none of this; it would even slow down the creation of threads, because each thread would have to create and store the exception object, and it would prevent filtering exceptions by type, which is possible now.
By using TryParse, you avoid all of this with a simple conditional clause; you also write less code, and it is much easier to read and reason about.
Exceptions are for exceptional cases and in such scenarios, they provide lots of useful information for logs/debugger.
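For comparison, here is a minimal sketch of the TryParse version of the snippet from the question; int.TryParse is the real BCL method, and the failure case is just a false return value:

using System;

string ageDescription = "Five years old";
if (int.TryParse(ageDescription, out int age)) {
    // Use the parsed value; no exception machinery involved.
    Console.WriteLine(age);
}
else {
    // Parse failure is an ordinary branch; nothing is thrown.
    Console.WriteLine("Couldn't parse age.");
}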
The performance hit isn't just because you're creating a new Exception object. A lot of it has to do with the conditional unwinding of the stack that needs to be done when an exception occurs.
For example, think about the work that would have to be done when you have exception handlers that catch different kinds of exceptions. At each point in the stack, as it's unwound from callee to caller, the language must do a type check to see not only if the exception can be handled, but what the most appropriate handler is. That's a significant amount of overhead in its own right.
If you really want to be lightweight, you should return a result from your functions -- that's what Int32.TryParse() does. No stack unwinding, no type checking, just a simple condition which can easily be optimized for.
EDIT: One somewhat interesting thing to note is that C# was created after Java. Java has a couple of interesting constructs which make exception handling more complicated than what we see in C#, namely checked exceptions and the throws keyword. Kind of interesting reads. I'm (personally) glad that C# didn't include this "feature". My guess is that Java's designers split exception handling this way hoping to boost performance. In the real world, as I understand it, a LOT of developers just end up specifying throws Exception in their function declarations anyway.
You should use int.TryParse in your case. It is faster and more readable to test a condition than to throw and catch an exception. Use exceptions for exceptional situations, not for regular validation.
The problem with exceptions isn't just generating the exception itself, that's honestly not even the most time consuming part. When you throw an exception (after it has been created) it needs to unwind the stack going through each scope level, determining if that scope is a try/catch block that would catch this exception, update the exception to indicate it went through that section of the stack, and then tearing down that section of the stack. And then of course there are all of the finally blocks that may need to be executed. Making the Exception itself store less information wouldn't really simplify any of that.
Because the utility offered by the "heavyweight" exceptions is exceptionally (ha ha) useful. I can't tell you how often I've wanted the ability to dump something like a stack trace in C land without having to ask people to yank out a debugger.
Generating things like the stack trace after the fact (i.e. after the exception has been caught on demand) are infeasible because once the exception has been caught, the stack has been unwound -- the information is gone. You want information about the point of failure; so the data must be collected at the point of failure.
As for the "new object instantiation" -- that is so cheap in comparison to other expensive exception features (unwinding the stack, stack trace, multiple function exit points, etc.) that it isn't worth worrying about.
The recommendation to use TryParse instead of Parse isn't about performance. If there's any chance a parse can fail (because it's user-generated input, for example), then the failure to parse is not exceptional; it's to be expected. As the name implies, exceptions are for exceptional circumstances: things that should have been caught earlier but weren't, and that are so unexpected that you can't really continue.
If a function expects an object but null is passed in instead, it's up to the designer of the method to decide what the right thing to do is. If the parameter is an optional override for a default, the program can continue and use the default or ignore the parameter. Otherwise, the program should simply throw an ArgumentNullException.
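A minimal sketch of that guard clause; Window, ApplyTheme, and Theme are hypothetical names, while ArgumentNullException is the real framework type:

using System;

public class Theme { /* hypothetical placeholder type */ }

public class Window
{
    // Null is not an acceptable input here, so the method throws immediately.
    public void ApplyTheme(Theme theme)
    {
        if (theme == null)
            throw new ArgumentNullException(nameof(theme));

        // ... apply the theme ...
    }
}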
Performance shouldn't be a consideration at all when deciding whether to use exceptions. It's a matter of intent and purpose. They're not even that slow; sure, they're many times slower than adding two integers, but I can still throw 50,000 exceptions per second on my aging Core 2 Duo. If the use of exceptions ever becomes a bottleneck, you're not using them in the right way.
I have a C# application. I continuously get a null reference exception, which I manage to catch and log. But I wonder whether this exception will affect the performance of my application. Please note that I am not trying to avoid the exception; I need to know whether it affects the performance of my application if it is continuously fired.
If you're getting a NullReferenceException, you should fix it rather than catching it. You should only ever catch it if it occurs in code in a way that you cannot fix (e.g. a broken third party library). A NullReferenceException always indicates a programming error somewhere.
As for performance - it depends on what you mean by "continuously". Exceptions can be horribly expensive if they're thrown when nothing's really wrong - they're fine when used properly. How many are you seeing per second, for example? Note that when running in a debugger, exceptions are often much more expensive than they would be when a debugger isn't attached.
As ever, when you're worried about performance, you should test the performance, so you can use hard data to make decisions.
It depends where you handle the exceptions and how often they happen. There is a good article on CodeProject regarding exceptions and performance, I suggest you read it.
Yes, it affects performance, depending on how frequently the exception is thrown.
Exceptions have been designed to affect performance as little as possible when they are not thrown: adding a try/catch block should have a very limited impact on your application's performance.
Therefore, I'd recommend adding as many try/catch blocks as required to catch any exception AT THE RIGHT LEVEL.
However, throwing and catching an exception may be very expensive: the application has to switch contexts, dispose of any elements in using blocks, and so on. That's why I agree with @Ramhound: you ought to fix the exception rather than catch it.
Possible Duplicate:
Performance Cost Of ‘try’
I am being told that adding a try/catch block adds a major performance cost, on the order of 1000 times slower than running without one, for example in a loop of a million iterations. Is this true?
Isn't it best to use try/catch blocks as much as possible?
From the MSDN site:
Finding and designing away exception-heavy code can result in a decent perf win. Bear in mind that this has nothing to do with try/catch blocks: you only incur the cost when the actual exception is thrown. You can use as many try/catch blocks as you want. Using exceptions gratuitously is where you lose performance. For example, you should stay away from things like using exceptions for control flow.
I could swear there was a question like this just a few days ago, but I can't find it...
Just adding a try/catch block is unlikely to change the performance noticeably when exceptions aren't being thrown, although it may prevent a method from being inlined. (Different CLR versions have different rules around inlining; I can't remember the details.)
The real expense is when an exception is actually thrown - and even that expense is usually overblown. If you use exceptions appropriately (i.e. only in genuinely exceptional or unexpected error situations) then they're unlikely to be a significant performance hit except in cases where your service is too hosed to be considered "working" anyway.
As for whether you should use try/catch blocks as much as possible - absolutely not! You should usually only catch an exception if you can actually handle it - which is relatively rare. In particular, just swallowing an exception is almost always the wrong thing to do.
I write far more try/finally blocks (effectively - almost always via using statements) than try/catch blocks. Try/catch is sometimes appropriate at the top level of a stack, so that a service can keep processing the next request even if one fails, but otherwise I rarely catch exceptions. Sometimes it's worth catching one exception in order to wrap it in a different exception - basically translating the exception rather than really handling it.
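A minimal sketch of that kind of exception translation; Settings and LoadSettings are hypothetical names, and the low-level IOException is wrapped (with the original preserved as the InnerException) rather than swallowed:

using System;
using System.IO;

public static class Settings
{
    public static string LoadSettings(string path)
    {
        try
        {
            return File.ReadAllText(path);
        }
        catch (IOException e)
        {
            // Translate rather than handle: wrap the low-level failure in an
            // exception that is meaningful at this layer.
            throw new InvalidOperationException(
                "Could not load settings from '" + path + "'.", e);
        }
    }
}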
You should definitely test claims like this (it's easy enough to do), but no, that isn't going to hurt you (it has a cost, but not thousands of times more).
Throwing exceptions and handling them is expensive. Having a try..catch..finally isn't bad.
Now with that said, If you are going to catch an exception, you need to have a plan for what you are going to do with it. There is no point in catching if you are just going to rethrow, and a lot of times, there's not much you can do if you get an exception.
Adding try/catch blocks helps you control how your application responds to exceptions you have no control over. The performance cost comes from throwing an exception when there are alternatives. For example, throwing an exception to bail out of a routine instead of simply returning from it causes a significant amount of overhead, which may be completely unnecessary.
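For instance, a sketch of bailing out with a plain return; Processor and ProcessItems are hypothetical names:

using System.Collections.Generic;

public static class Processor
{
    // Returning early costs almost nothing; throwing here would be pure overhead.
    public static void ProcessItems(List<string> items)
    {
        if (items == null || items.Count == 0)
            return; // nothing to do: just return, don't throw

        foreach (var item in items)
        {
            // ... process item ...
        }
    }
}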
I am being told that adding a try/catch block adds a major performance cost, on the order of 1000 times slower than running without one, for example in a loop of a million iterations. Is this true?
Using try/catch adds a performance cost, but it isn't a major one.
Isn't it best to use try/catch blocks as much as possible?
No, it is best to use a try/catch block when it makes sense.
Why guess at the performance costs, when you can benchmark and see if it matters?
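For example, a rough Stopwatch sketch (not a rigorous benchmark harness; numbers will vary by machine and runtime):

using System;
using System.Diagnostics;

public static class TryCatchBenchmark
{
    public static void Main()
    {
        const int iterations = 1_000_000;
        long sum = 0;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            sum += i; // bare loop
        }
        sw.Stop();
        Console.WriteLine("Bare loop:      " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            try
            {
                sum += i; // same work inside a try/catch that never throws
            }
            catch (Exception)
            {
                // Never reached; present only to measure the block's overhead.
            }
        }
        sw.Stop();
        Console.WriteLine("try/catch loop: " + sw.ElapsedMilliseconds + " ms");
        Console.WriteLine(sum); // keep sum observable so the loops aren't optimized away
    }
}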
It's true that throwing exceptions is a very expensive operation, and try..catch blocks clutter the code and make it harder to read. That said, exceptions are fine for errors that should crash the application most of the time.
I always run with "break on all exceptions" enabled, so as soon as an error happens the debugger breaks and I can pinpoint it fairly easily. If everything throws exceptions all the time, I get bad performance and I can't use "break on all exceptions", and that makes me sad.
IMO, don't use exceptions for normal program events (like the user typing in a non-number that you then try to parse as a number). Use the normal program flow constructs for that (i.e., if).
If you use functions that may throw you make a choice. Is the error critical => crash the app. Is the error non-critical and likely => catch it (and potentially log it).
I just found in a project:
try
{
    myLabel.Text = school.SchoolName;
}
catch
{
    myPanel.Visible = false;
}
I want to talk to the developer who wrote this and say that incurring the null reference exception (because school might theoretically be null, not myLabel) virtually makes the computer beep three times and sleep for two seconds. However, I wonder if I'm misremembering the rule about that. Obviously, this isn't the intended use of try/catch, but is this bad because it defies intention, or bad because of performance considerations? I feel like it's just bad, but I want to say more than "that's really bad".
You should not use exceptions for control flow simply because it is bad design. It doesn't make sense. Exceptions are for exceptional cases, not for normal flow. Performance probably won't be an issue in this situation because for most modern applications on modern hardware, you could throw exceptions all day long and the user wouldn't notice a performance hit. However, if this is a high performance application processing a lot of data or doing a lot of some sort of work, then yes, performance would be a concern.
In my opinion this is poor because it could be made much more clear with an if statement:
if (school != null) {
    myLabel.Text = school.SchoolName;
}
else {
    myPanel.Visible = false;
}
That will certainly avoid using exception handling unnecessarily and make the code's meaning very obvious.
I think this is bad because, for one, it is coding against an exception, and it also incurs unnecessary overhead. Exceptions should only be caught if they are going to be handled in some specific way.
Exceptions should be caught for exceptional cases that you cannot predict. Here, a simple check for whether school is null would do; in fact, the code expects that school might be null (the catch block exists precisely to handle that case). If school was null and it should not have been, then the code should throw its own ArgumentNullException.
Exceptions do incur runtime overhead, but it's probably negligible here. There will be a difference running in the debugger, but the built binaries should run at pretty much the same speed.
Tell your developer that any chimp can write code the machine can read. Good code is written for human beings, not machines. If a null exception is the only thing you're worried about, then it's probably a bug in the user's code; no one should ever try to assign null to anything that way. Use an Assert() statement instead.
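A sketch of the Assert() suggestion using the real Debug.Assert from System.Diagnostics, with the same school/myLabel objects as in the snippet above:

using System.Diagnostics;

// Debug.Assert breaks into the debugger in debug builds when the condition
// fails, and compiles away entirely in release builds.
Debug.Assert(school != null, "school must not be null when binding the label");
myLabel.Text = school.SchoolName;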
You are absolutely right that this is bad. It is bad because it defies intention and because it hurts performance.
I realize there is room for different programming styles, but personally, I think that even though this works, and I can see what the code is attempting to do, it also hurts readability and code clarity, making it that much more difficult for the maintenance programmers to follow. An if statement is much more appropriate here.
Throwing exceptions does have a negative impact on performance, see http://msdn.microsoft.com/en-us/library/ms229009(VS.80).aspx
I never like using exceptions for flow control. Exceptions are expensive, and it is difficult to follow the actual flow of a program when exceptions are thrown to reach other places in the code. To me this is like using GoTo. This doesn't mean you should avoid exceptions, but rather that an exception should be just that: an exception to what should normally happen in the program.
I think a worse part of the code is that it's not doing anything with the exception. There is no logging, or even a comment explaining why the exception is being swallowed.
I agree with everyone here--it's a horrible idea.
There are a few cases in Java (I think they are mostly gone now, but there may still be some in external libraries) where you were required to catch an exception for certain "non-exception" cases.
In general, when writing library code (well, any class actually), avoid using exceptions for ANYTHING that could possibly be avoided. If it's possible that a name field isn't set and that should cause an exception in a write() method, be sure to add an isValid() method so that you don't actually HAVE to catch the exception around the write to know there is a problem.
(Bad Java code addendum): this "good" programming style virtually eliminates any need for checked exceptions in Java, and checked exceptions in Java are the Suck.
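A C# sketch of that isValid()-before-write() idea; Record, IsValid, and Write are hypothetical names:

using System;
using System.IO;

public class Record
{
    public string Name { get; set; }

    // Lets callers check up front instead of catching an exception
    // around Write() to discover the problem.
    public bool IsValid()
    {
        return !string.IsNullOrEmpty(Name);
    }

    public void Write(TextWriter writer)
    {
        if (!IsValid())
            throw new InvalidOperationException("Name must be set before writing.");
        writer.WriteLine(Name);
    }
}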
Basically, the question is:
Do exceptions in C# affect performance a lot? Is it better to avoid rethrowing exceptions? If I generate an exception in my code, does it affect performance?
Sorry for the silliness of the question itself.
If you're worried about exception performance, you're using them wrong.
But yes, exceptions do affect performance.
Raising an exception is an expensive operation in C# (compared to other operations in C#) but not enough that I would avoid doing it.
I agree with Jared, if your application is significantly slower because of raising and throwing exceptions, I would take a look at your overall strategy. Something can probably be refactored to make exception handling more efficient rather than dismissing the concept of raising exceptions in code.
Microsoft's Design Guidelines for Developing Class Libraries is a very valuable resource. Here is a relevant article:
Exceptions and Performance
I would also recommend the Framework Design Guidelines book from Microsoft Press. It has a lot of the information from the Design Guidelines link, but it is annotated by people with MS, and Anders Hejlsberg, himself. It gives a lot of insight into the "why" and "how" of the way things are.
Running code through a try/catch statement barely affects performance on its own. The only performance hit comes if an exception is thrown, because then the runtime has to unwind the stack and gather other information in order to populate the exception object.
What most other folks said, plus:
Don't use exceptions as part of the programming flow. In other words, don't throw an exception for something like, account.withdrawalAmount > account.balance. That is a business case.
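A sketch of handling that business case through normal flow; Account, Withdraw, and WithdrawalResult are hypothetical names:

public enum WithdrawalResult { Success, InsufficientFunds }

public class Account
{
    public decimal Balance { get; set; }

    // An overdraft attempt is an expected business outcome, so it is
    // reported through the return value, not an exception.
    public WithdrawalResult Withdraw(decimal amount)
    {
        if (amount > Balance)
            return WithdrawalResult.InsufficientFunds;

        Balance -= amount;
        return WithdrawalResult.Success;
    }
}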
The other biggie to look out for regarding performance is swallowing exceptions. It's a slippery slope, and once you start allowing your app to swallow exceptions, you start doing it everywhere. Now you may be allowing your app to throw exceptions that you don't know about because you are swallowing them, your performance suffers and you don't know why.
This is not silly; I've just seen it asked elsewhere on SO as well.
Exceptions occur when, well, things are really exceptional. Most of the time you rethrow the exception (maybe after logging it) when there is not much chance of recovering from it. So it should not bother you during the normal course of execution of the program.
Exceptions, as the name implies, are intended to be exceptional. Hence you can't expect them to have been an important target for optimisation. More often than not they don't perform well, since they have other priorities, such as gathering detailed information about what went wrong.
Exceptions in .NET do affect performance. This is the reason why they should be used only in exceptional cases.