I'm looking for a way to find bottleneck methods in a solution (lots of projects).
Let's say I have a HUGE program (thousands of methods) and I want to improve performance by finding the methods that are actually called a lot at runtime and optimizing them.
I need this for a complex product that's written in C++, C#, and C++/CLI. (I can compile it all in debug and have the .pdb files.)
So, I'm looking for some kind of analyzer that will tell me how much cpu time each method is using.
What tool/add-on/feature can I use in Visual Studio to get that information?
I want to be able to run the program for a few minutes and then analyze each method's CPU usage. Or even better: CPU time divided by number of calls.
Would be even better if I could sort by namespace or dll/package/project.
The more expensive Visual Studio editions provide a built-in profiler: see this thread.
However, there are other ways to profile; this topic has been covered many times on Stack Overflow, here for example.
Following one of Christian Goltz's links, I found a program that might do what I want; it profiles both managed and unmanaged code:
AQTime Pro
I've had some good experiences with dotTrace by JetBrains. Not sure if it has the IDE integration or all the features you're looking for, but it definitely gets the job done.
This method is low-tech, but works perfectly well.
I also work on a huge application, and when we have performance problems this finds them quickly.
Related
I'm looking for a tool to tell me how long my code takes to run. Something that would be the equivalent of recording the DateTime.Now before and after each line of code in my program, then displaying the difference between the two times for each line (after my program finishes running).
For instance, if I have a program that calls four methods in its main, I want to know (after running this tool) how long each of those methods takes to run, and then if I stepped into each method, I'd want to know how long each line in there takes to run, and so on.
Do these tools exists? Of course I'd prefer a free one, but if all that exist are professional tools then please mention those as well.
edit: it appears these tools are called profiling tools. Thanks, this will definitely help me in my search. Unfortunately, I'm using Visual Studio 2010 Professional, so I believe the Microsoft profiling tool is out of my reach. Any good third-party profiling tools?
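Until you settle on a profiler, the DateTime.Now idea from the question can at least be mocked up by hand with Stopwatch, which is far more precise than DateTime.Now for short intervals. This is just a sketch; the method names below are placeholders, not from any real tool:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Time a single call the way the question describes, but with
// Stopwatch instead of DateTime.Now subtraction.
static long TimeIt(Action action)
{
    var sw = Stopwatch.StartNew();
    action();
    sw.Stop();
    return sw.ElapsedMilliseconds;
}

// Placeholder methods standing in for the calls in Main.
static void MethodA() => Thread.Sleep(20);
static void MethodB() => Thread.Sleep(5);

Console.WriteLine($"MethodA: {TimeIt(MethodA)} ms");
Console.WriteLine($"MethodB: {TimeIt(MethodB)} ms");
```

Of course this doesn't scale down to per-line timing across a whole program, which is exactly the tedium a profiler automates.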
You can use the CLR Profiler for .NET Framework 4:
The CLR Profiler includes a number of very useful views of the allocation profile, including a histogram of allocated types, allocation and call graphs, a time line showing GCs of various generations and the resulting state of the managed heap after those collections, and a call tree showing per-method allocations and assembly loads.
And even more profilers and tools can be found here...
More about profiling on Wikipedia.
If you use the profiler that comes with VS, it shows this very well. The one downside is that I think it only comes with Ultimate. :(
Are you looking for the performance profiler? It tells you how long each function takes.
I like dotTrace, it's by the same guys who make ReSharper: dotTrace
Give a try to Red Gate ANTS Performance Profiler. There's a free trial, and if you don't have access to the built-in VS2010 profiler, it does a good job.
I have written a WinForms application in C#. How can I check the performance of my code? By that I mean: how can I check which form references are still alive at a given time or event, so that I can release them if they are not required (make them available for garbage collection)? Is there a way to do it using VS 2005 or any free tool? Any tutorials or guides would be useful.
[Edit] Sorry if my question is confusing. I am not looking for a professional tool, but ways to know/understand the working of my code better and code more efficiently.
Thanks
Making code efficient is always a secondary step for me. First I write the code so that it works. Next, I profile it if I am unhappy with the performance. The truth is most applications run fast enough after the first time writing them. Sometimes, though, better performance is needed. Performance can be gained in many different ways; it all depends on your application. I write LOB apps mainly, so I deal with a lot of IO to databases, services and storage. These calls are all very expensive and need to be limited, so they are my first area to optimize. I optimize by lazy-loading, eager-loading, batching calls, making less frequent calls and so on. I recently had a WinForms app that created hundreds of controls dynamically and took a long time doing it; that's another kind of bottleneck I have to address. I use a profiler to measure the performance of my applications.
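The lazy-loading idea mentioned above is easy to sketch with Lazy&lt;T&gt;. This is only an illustration with made-up data, not code from the application described:

```csharp
using System;

int loadCount = 0;

// Pretend the factory below is an expensive database or service call.
var customers = new Lazy<string[]>(() =>
{
    loadCount++;                      // the expensive call runs at most once
    return new[] { "Ada", "Grace" };
});

Console.WriteLine(loadCount);         // 0: nothing has been loaded yet
var first = customers.Value;          // first access triggers the load
var second = customers.Value;         // cached: no second load
Console.WriteLine(loadCount);         // 1
```

The point is that the expensive call is deferred until it's actually needed, and never repeated, which is often enough to take an IO-bound hot path off a profiler's radar.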
Use the free EQATEC profiler. It will show you how long calls take and how many times each call is made. The profiler gives a nice report and a visual display that lets you drill down into the call stacks.
Red Gate Performance Profiler
...it's been said here a million times before. If you suspect performance issues, profile your application. It will tell you how long calls are taking and point out the bottlenecks in your code.
Kobra,
What you're looking for is called a memory profiler. There happens to be a (paid) one for .NET aptly named ".NET Memory Profiler". I've not used it extensively, but it should answer the questions you're asking. There are a few other ones that do basically the same thing: giving you instance counts of loaded types and helping you identify when instances are not being garbage collected for one reason or another (i.e. event handler references, static properties, etc.).
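The event-handler case deserves a concrete sketch: as long as a subscriber's handler is attached to a long-lived publisher's event, the publisher holds a reference to the subscriber and it can never be garbage collected. The Publisher/Subscriber names here are made up for illustration:

```csharp
using System;

var publisher = new Publisher();
var subscriber = new Subscriber();

publisher.SomethingHappened += subscriber.OnSomething;
publisher.Raise();
Console.WriteLine(subscriber.HandledCount);    // 1: the handler ran

// Without this line, the subscriber can never be collected
// while the publisher is still alive.
publisher.SomethingHappened -= subscriber.OnSomething;
Console.WriteLine(publisher.SubscriberCount);  // 0: subscriber now eligible for GC

// A long-lived object that raises events.
class Publisher
{
    public event EventHandler? SomethingHappened;
    public void Raise() => SomethingHappened?.Invoke(this, EventArgs.Empty);
    public int SubscriberCount =>
        SomethingHappened?.GetInvocationList().Length ?? 0;
}

class Subscriber
{
    public int HandledCount;
    public void OnSomething(object? sender, EventArgs e) => HandledCount++;
}
```

A memory profiler makes exactly this kind of forgotten subscription visible as an unexpected live instance.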
Hope this helps,
Dylan
I need to know how I can get every instruction's duration so I can tune the code to increase the performance of my program.
Use a profiler. If you have Visual Studio Team System, there's one included. Otherwise take a look at ANTS or dotTrace.
what you're looking for is a profiler I believe :)
see: Profiler and List of Performance Analysis Tools
You want an application profiler for that, it shows exactly what code takes how long.
You need to use a profiler to accomplish this. Several profilers exist; some are free.
My preference goes to Red Gate ANTS.
I don't think you should go down to the level of individual instructions to measure performance bottlenecks; micro-optimization can be harmful. You should stay at the level of function profiling. If you are using VS2005 or 2008 you can use
Performance wizard
CLR profiler
to profile your functions.
Alternatively, I personally recommend using ANTS Profiler.
Since ANTS and dotTrace are very good but commercial tools (I wouldn't call them expensive; they're worth the money), I recently heard about the EQATEC Profiler, which is free of charge. I have not tried it yet for lack of time, but maybe you want to give it a try.
(no, I am not affiliated with them)
If you have an application running and you want to improve its performance, using a profiler (.NET or database) is a must. dotTrace and ANTS are famous ones, for good reason.
If you are using SQL Server, SQL Server Profiler is a great tool to trace and watch what's going on on the server side of your application.
If you want to decide which approach is better, you can use ILDASM to disassemble your code to IL and see what's going on under the hood. It's not a simple task, but I think it's worth it.
You might want to take a look at FxCop as well; it might give you some more vague hints as to what could be improved. (Oh, and it's free!)
I'm surprised no one has mentioned this yet, but if you want to know the cost of individual instructions, look them up here or here.
The cost of individual instructions varies between CPUs, but both AMD and Intel (and every other CPU maker) document this.
The problem is that determining the cost of instructions is not straightforward. You have a lot of metrics to consider: There's the latency, whether it is pipelined (fully or partially), how big the instruction is (affects instruction cache) and so on. So this information is only really helpful if you're writing a single really performance-sensitive function where you're either writing assembly yourself, or closely reading the compiler-generated ASM to find and eliminate inefficiencies. And if you know a fair bit about how the CPU works.
But before you get to that point, you should use a profiler as everyone else has suggested. That helps you narrow down where the time is being spent and what needs optimizing.
I'm writing a plug-in for another program in C#.NET, and am having performance issues where commands take a lot longer than I would like. The plug-in reacts to events in the host program and also depends on utility methods of the host program's SDK. My plug-in has a lot of recursive functions because I'm doing a lot of reading and writing to a tree structure. Plus I have a lot of event subscriptions between my plug-in and the host application, as well as event subscriptions between classes within my plug-in.
How can I figure out what is taking so long for a task to complete? I can't use regular breakpoint-style debugging, not because it doesn't work, but because it's too slow. I have set up a static "LogWriter" class that I can reference from all my classes and that lets me write timestamped lines to a log file from my code. Is there another way? Does Visual Studio keep some kind of timestamped log that I could use instead? Is there some way to view the call stack after the application has closed?
You need to use a profiler. Here's a link to a good one: ANTS Performance Profiler.
Update: You can also write messages at control points using Debug.Write. Then run the DebugView application, which displays all your debug strings with precise timestamps. It is freeware and very good for quick debugging and profiling.
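A minimal version of that control-point logging might look like the following. The Checkpoint helper and its message format are just an illustration; DebugView (or the VS Output window) picks up the Debug.WriteLine output:

```csharp
using System;
using System.Diagnostics;

// Write a timestamped control-point message. Returning the formatted
// line is only a convenience so the output can be inspected.
static string Checkpoint(string message)
{
    var line = $"{DateTime.Now:HH:mm:ss.fff} {message}";
    Debug.WriteLine(line);   // shows up in DebugView with a precise time
    return line;
}

Checkpoint("before expensive call");
// ... the code you suspect goes here ...
Checkpoint("after expensive call");
```

Diffing the timestamps between two checkpoints gives you a rough duration without attaching a profiler at all.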
My Profiler List includes ANTS, dotTrace, and AQtime.
However, looking more closely at your question, it seems to me that you should do some unit testing at the same time you're doing profiling. Maybe start by doing a quick overall performance scan, just to see which areas need most attention. Then start writing some unit tests for those areas. You can then run the profiler while running those unit tests, so that you'll get consistent results.
In my experience, the best method is also the simplest. Get it running, and while it is being slow, hit the "pause" button in the IDE. Then make a record of the call stack. Repeat this several times. (Here's a more detailed example and explanation.)
What you are looking for is any statement that appears on more than one stack sample that isn't strictly necessary. The more samples it appears on, the more time it takes. The way to tell if the statement is necessary is to look up the stack, because that tells you why it is being done.
Anything that causes a significant amount of time to be consumed will be revealed by this method, and recursion does not bother it.
People seem to tackle problems like this in one of two ways:
Try to get good measurements before doing anything.
Just find something big that you can get rid of, rip it out, and repeat.
I prefer the latter, because it's fast, and because you don't have to know precisely how big a tumor is to know it's big enough to remove. What you do need to know is exactly where it is, and that's what this method tells you.
Sounds like you want a code 'profiler'. http://en.wikipedia.org/wiki/Code_profiler#Use_of_profilers
I'm unfamiliar with which profilers are best for C#, but I came across this link after a quick Google search; it has a list of free open-source offerings. I'm sure someone else will know which ones are worth considering :)
http://csharp-source.net/open-source/profilers
Despite the title of this topic I must argue that the "best" way is subjective, we can only suggest possible solutions.
I have had experience using Redgate ANTS Performance Profiler which will show you where the bottlenecks are in your application. It's definitely worth checking out.
Visual Studio Team System has a profiler baked in. It's far from perfect, but for simple applications you can kind of get it to work.
Recently I have had the most success with EQATEC's free profiler, or rolling my own tiny profiling class where needed.
Also, there have been quite a few questions about profilers in the past; see: http://www.google.com.au/search?hl=en&q=site:stackoverflow.com+.net+profiler&btnG=Google+Search&meta=&aq=f&oq=
Don't ever forget Rico Mariani's advice on how to carry out a good perf investigation.
You can also use performance counters for ASP.NET applications.
UPDATE: Focus your answers on hardware solutions please.
What hardware/tools/add-ins are you using to improve ASP.NET compilation and first-execution speed? We are looking at solid-state hard drives to speed things up, but the prices are really high right now.
I have two 7200 RPM hard drives in RAID 0 right now and I'm not satisfied with the performance anymore.
So my main question is what is the best cost effective way right now to improve ASP.NET compilation speed and overall development performance when you do a lot of debugging?
Scott Gu has a pretty good blog post about this; does anyone have anything else to suggest?
http://weblogs.asp.net/scottgu/archive/2007/11/01/tip-trick-hard-drive-speed-and-visual-studio-performance.aspx
One of the important things to do is to keep the projects of rarely-changed assemblies unloaded. When a change occurs, load the project, compile, and unload again. This makes a huge difference in large solutions.
First, make sure that you are using Web Application Projects (WAP). In our experience, compared to Website Projects, WAPs compile roughly 10x faster.
Then, consider migrating all the logic (including complex UI components) into separate library projects. The C# compiler is way faster than the ASP.NET compiler (at least for VS2005).
If you have lots of third-party "Referenced Assemblies", ensuring that CopyLocal=False is set on all projects except the web application project makes quite a big difference.
I blogged about the perf increases I managed to get by making this simple change:
Speeding Up your Desktop Build - Part 1
You can precompile the site, which will make the first run experience better
http://msdn.microsoft.com/en-us/library/ms227972.aspx
I would recommend adding as much memory to your PC as you can. If your solutions are super large, you may want to explore 64-bit so that your machine can address more than 3 GB of RAM.
Also Visual Studio 2008 SP1 seems to be markedly faster than previous versions, make certain you are running the latest release with the Service Packs.
Good Luck!
If you are looking purely at hardware you will need to search for benchmarks around the web. There are a few articles written just on the performance of hardware on Visual Studio compilation, I am just too lazy to find them and link them.
Hardware solutions can be endless because you can get some really high-end equipment if you have the money. Otherwise it's the usual: more memory, a faster processor, and a faster hard drive. Moving to a 64-bit OS also helps.
If you want to be even more specific? Just off the top of my head: 8 GB or more of memory, and if you can't afford solid-state drives I'd go for 10K to 15K RPM hard drives. There is a debate over whether quad-core makes any difference, but look up the benchmarks and see what works for you.
If you are looking at just hardware: multi-core processors, lots of RAM, and ideally 10K or 15K RPM hard drives.
I personally have noticed a huge improvement in performance with the change to 10K RPM drives.
The best way to improve ASP.NET compile time is to throw more hardware at it. An OCZ Vertex Turbo SSD drive and an Intel i7 960 gave me a huge boost. You can see my results here.
I switched from websites to web applications. Compile time went down by a factor of ten at least.
Also, I try not to use the debugger if possible ("Run without debugging"). This cuts down the time it takes to start the web application.