Time complexity of .NET BCL APIs or Framework methods - c#

Is there any way to know the exact time complexity of .NET's predefined methods? For example, suppose I want to know the complexity of
String.Contains()
or
Hashtable.ContainsKey()
Does Microsoft share this information?

Yes, in MSDN :)
Hashtable.ContainsKey Method:
This method is an O(1) operation.
Enumerable.Contains<TSource> Method (IEnumerable<TSource>, TSource):
If the type of source implements ICollection<T>, the Contains method in that implementation is invoked to obtain the result. Otherwise, this method determines whether source contains the specified element. Enumeration is terminated as soon as a matching element is found.
So, for String it would be O(n).
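For illustration, here is a minimal sketch contrasting the two (the collection size is arbitrary, and the comments restate the documented complexities rather than measured timings):

using System;
using System.Collections;
using System.Linq;

class ContainsDemo
{
    static void Main()
    {
        var table = new Hashtable();
        var list = Enumerable.Range(0, 1000000).ToList();
        for (int i = 0; i < 1000000; i++)
            table[i] = true;

        // O(1): Hashtable.ContainsKey hashes the key and probes a bucket.
        Console.WriteLine(table.ContainsKey(999999));

        // O(n): here Contains resolves to List<int>.Contains (the
        // ICollection<T> path from the quoted documentation), which
        // scans the list until a match is found.
        Console.WriteLine(list.Contains(999999));
    }
}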

A couple of posts have mentioned Reflector, which is a good tool but no longer free. A free tool that provides a similar service is ILSpy, worth a look if you don't want to buy Reflector.

One possibility is using a tool like Reflector, and looking at the implementation of the methods yourself. You should then be able to determine the complexity for most methods (if they rely on many other methods, it might be cumbersome to trace your way through all the calls to do the calculation).
I do not think there is a page that explicitly lists the complexity for all methods.

You can find this out yourself very easily using Reflector and the Code Metrics add-in.

Reflector is an obvious choice for finding the information you need, as highlighted in other answers. However, it is no longer free, although it is not too expensive.
A free alternative is to look at the source code for the .NET framework libraries. Scott Guthrie has a post which provides some information and links to accessing and debugging the source code.

Related

What's the value in removing and/or sorting Usings?

I've always run Remove and Sort Usings as a matter of course, because it seems the right thing to do.
But just now I got to wondering: Why do we do this?
Certainly, there's always a benefit to clean & compact code.
And there must be some benefit if MS took the time to have it as a menu item in VS.
Can anyone answer: why do this?
What are the compile-time or run-time (or other) benefits from removing and/or sorting usings?
As @craig-w mentions, there's a very small compile-time performance improvement.
When the compiler encounters a type, it looks in the current namespace and then searches each namespace imported with a using directive, in the order presented, until it finds the type it's looking for.
There's an excellent write-up on this in the book CLR via C# by Jeffrey Richter (http://www.amazon.com/CLR-via-4th-Developer-Reference/dp/0735667454/ref=sr_1_1?ie=UTF8&qid=1417806042&sr=8-1&keywords=clr+via+c%23).
As to why MS provided the menu option, I would imagine that enough internal developers were asking for it for the same reasons that you mention: cleaner, more concise code.
There's probably a teeny-tiny (i.e. minuscule/virtually unmeasurable) performance improvement during compilation because it doesn't have to search through namespaces that you aren't actually using for unqualified types. I do it because it's just neater and easier to read in the end. Also, I use the Productivity Power Tools and have them set to do the Remove and Sort when I save the file.
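For illustration, a minimal sketch of what Remove and Sort Usings leaves behind (the file contents here are hypothetical):

using System;                      // kept: Console below needs it
using System.Collections.Generic;  // kept: List<T> below needs it
// "using System.Text;" was removed: nothing in this file references it,
// so the compiler no longer considers that namespace when resolving
// unqualified type names.

class Demo
{
    static void Main()
    {
        var items = new List<string> { "a", "b" };
        Console.WriteLine(items.Count);
    }
}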

Dynamic code generation

I am currently developing an application with which you can create "programs" without writing source code; just click & play, if you like.
Now the question is how to generate an executable program from my data model. There are many possibilities, but I am not sure which one is best for me. I need to generate assemblies with classes, namespaces, and everything else that can be part of an application.
CodeDOM: I have heard of lots of limitations and bugs in this class. I need to create attributes on method parameters and return values. Is this supported?
Create C# source code programmatically and then call CompileAssemblyFromFile on it: This would work, since I can generate any code I want and C# supports most CLR features. But wouldn't this be slow?
Use the Reflection.Emit ILGenerator class: I think I can generate every possible kind of .NET code with this, but isn't it much more complicated and error-prone than the other approaches?
Are there other possible solutions?
EDIT:
The tool is for developing applications in general; it is not restricted to a specific domain. I don't know whether it can be considered a visual programming language. The user can create classes, methods, method calls, and all kinds of expressions. It won't be very limiting, because you should be able to do most things that are allowed in real programming languages.
At the moment a lot still has to be written by the user as text, but the goal in the end is that nearly everything can be clicked together.
You may find it rewarding to look at the Dynamic Language Runtime (DLR), which is more or less designed for creating high-level languages based on .NET.
It's perhaps also worth looking at some of the previous Stack Overflow threads on domain-specific languages, which contain some useful links to tools for working with DSLs. That sounds a little like what you are planning, although I'm still not absolutely clear from the question what exactly your aim is.
Most things "click and play" should be simple enough just to stick some pre-defined building-block objects together (probably using interfaces on the boundaries). Meaning: you might not need to do dynamic code generation - just "fake it". For example, using property-bag objects (like DataTable etc, although that isn't my first choice) for values, etc.
Another option for dynamic evaluation is the Expression class; especially in .NET 4.0, this is hugely versatile, and allows compilation to a delegate.
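For instance, a minimal sketch of compiling an expression tree to a delegate at runtime (the expression built here is arbitrary):

using System;
using System.Linq.Expressions;

class ExpressionDemo
{
    static void Main()
    {
        // Build the tree for (x, y) => x * y + 1, then compile it to a delegate.
        var x = Expression.Parameter(typeof(int), "x");
        var y = Expression.Parameter(typeof(int), "y");
        var body = Expression.Add(Expression.Multiply(x, y), Expression.Constant(1));
        var lambda = Expression.Lambda<Func<int, int, int>>(body, x, y);

        Func<int, int, int> compiled = lambda.Compile();
        Console.WriteLine(compiled(3, 4)); // prints 13
    }
}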
Do the C# source generation and don't care about speed until it matters. The C# compiler is quite quick.
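As a rough sketch of that approach, using the System.CodeDom.Compiler API from the full .NET Framework (CompileAssemblyFromSource is the in-memory sibling of the CompileAssemblyFromFile method mentioned in the question; the generated class is hypothetical):

using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class CompileDemo
{
    static void Main()
    {
        const string source = @"
            public static class Generated
            {
                public static int Add(int a, int b) { return a + b; }
            }";

        var parameters = new CompilerParameters { GenerateInMemory = true };
        using (var provider = new CSharpCodeProvider())
        {
            CompilerResults results = provider.CompileAssemblyFromSource(parameters, source);
            if (results.Errors.HasErrors)
                throw new InvalidOperationException("Compilation failed.");

            var add = results.CompiledAssembly.GetType("Generated").GetMethod("Add");
            Console.WriteLine(add.Invoke(null, new object[] { 2, 3 })); // prints 5
        }
    }
}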
When I wrote a dynamic code generator, I relied heavily on System.Reflection.Emit.
Basically, you programmatically create dynamic assemblies and add new types to them. These types are constructed using the Emit constructs (properties, events, fields, etc.). When it comes to implementing methods, you'll have to use an ILGenerator to pump out MSIL opcodes into your method (see the sketch after this list). That sounds super scary, but you can use a couple of tools to help:
A pre-built sample implementation
ILDasm to inspect the op-codes of the sample implementation.
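A minimal sketch of that workflow, assuming a runtime where the static AssemblyBuilder.DefineDynamicAssembly is available (on older .NET Framework versions the equivalent method lives on AppDomain):

using System;
using System.Reflection;
using System.Reflection.Emit;

class EmitDemo
{
    static void Main()
    {
        // Define an in-memory assembly, module, and type.
        var asm = AssemblyBuilder.DefineDynamicAssembly(
            new AssemblyName("DynamicDemo"), AssemblyBuilderAccess.Run);
        var module = asm.DefineDynamicModule("MainModule");
        var type = module.DefineType("Calculator", TypeAttributes.Public);

        // Define static int Add(int, int) and emit its body as MSIL.
        var add = type.DefineMethod("Add",
            MethodAttributes.Public | MethodAttributes.Static,
            typeof(int), new[] { typeof(int), typeof(int) });
        ILGenerator il = add.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0); // push the first argument
        il.Emit(OpCodes.Ldarg_1); // push the second argument
        il.Emit(OpCodes.Add);     // add the two values on the stack
        il.Emit(OpCodes.Ret);     // return the result

        var baked = type.CreateType();
        var result = baked.GetMethod("Add").Invoke(null, new object[] { 2, 3 });
        Console.WriteLine(result); // prints 5
    }
}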
It depends on your requirements; CodeDOM would certainly be the best fit for a "program" stored in a "data model".
However, it's unlikely that option 2 will be measurably slower in comparison with any other approach.
I would echo others in that 1) the compiler is quick, and 2) "Click and Play" things should be simple enough so that no single widget added to a pile of widgets can make it an illegal pile.
Good luck. I'm skeptical that you can achieve point (2) for anything but really toy-level programs.

Enhanced DynamicQuery?

I've recently started using the DynamicQuery API, and it quickly became apparent that it has numerous limitations. I've found at least one improvement online: support for enum arguments, but it's pretty clear that this API is not actively maintained (if at all).
In case I'm wrong and there is somebody maintaining an improved version - please post a link!
Alternatively, a separate, active project with similar goals would also be of interest.
(Clarification: I'm looking to parse strings at runtime.)
In the end we just implemented some of the features we missed by editing the source code. Added support for passing in a static class as an "external" (DynamicQuery's terminology), support for calling methods on this static class, and type inference if any such methods are generic.
I suspect there isn't much demand for this, so I didn't bother making it available anywhere. Let me know if you think otherwise.
Edit: due to a request, DynamicQuery Enhanced is now available on BitBucket. Expect to be underwhelmed; take a look at this Info and this list of tweaks.
I've seen PredicateBuilder mentioned before (here on Stack Overflow) as an alternative. I've not used it myself, but it might be useful to you.
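For reference, a minimal sketch of the PredicateBuilder style of composing predicates at runtime (PredicateBuilder ships with LINQKit; the Product type here is hypothetical):

using System;
using System.Collections.Generic;
using System.Linq;
using LinqKit; // provides PredicateBuilder and AsExpandable()

public class Product
{
    public string Description { get; set; }
}

class PredicateDemo
{
    static void Main()
    {
        var products = new List<Product>
        {
            new Product { Description = "red widget" },
            new Product { Description = "blue gadget" },
        }.AsQueryable();

        // Start from "false" and OR in one clause per keyword.
        var predicate = PredicateBuilder.False<Product>();
        foreach (var keyword in new[] { "widget", "gadget" })
        {
            var temp = keyword; // avoid capturing the loop variable
            predicate = predicate.Or(p => p.Description.Contains(temp));
        }

        foreach (var p in products.AsExpandable().Where(predicate))
            Console.WriteLine(p.Description);
    }
}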

Existing library to calculate code complexity of a block of code

I'm given a string which contains an arbitrary amount of code. I want to calculate a number which represents the code complexity of that string. Something like:
int complexity = Lib.FindComplexity(someString);
I realize there are a lot of tools out there that will do this for you. These tools will not work for me, because I want to do it programmatically. I'd love for the library to be in C#, but will work with anything at this point.
Thanks in advance!
Have you considered using one of those existing tools and wrapping it in a library? For instance, you might be able to use NDepend.Console.exe by calling it from your code with the parameters you want and parsing out the result.
NDepend is a great tool, although it wasn't cheap the last time I looked. If buying it isn't an option, I'd look into using reflection and cyclomatic complexity (http://en.wikipedia.org/wiki/Cyclomatic_complexity). That doesn't meet your requirement of handling an arbitrary string, but you could definitely analyze assemblies you have created.
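As a naive illustration of the cyclomatic complexity idea applied directly to a string (a real tool would parse the code; counting tokens is only a rough approximation, and Lib.FindComplexity is the hypothetical signature from the question):

using System;

static class Lib
{
    // Rough approximation: cyclomatic complexity = 1 + number of decision
    // points. A real implementation would parse the code instead of
    // counting substrings; this version also miscounts identifiers that
    // merely contain these tokens (e.g. "for" also matches inside "foreach").
    public static int FindComplexity(string code)
    {
        string[] decisionTokens = { "if", "while", "for", "case", "catch", "&&", "||", "?" };
        int complexity = 1;
        foreach (var token in decisionTokens)
        {
            int index = 0;
            while ((index = code.IndexOf(token, index, StringComparison.Ordinal)) >= 0)
            {
                complexity++;
                index += token.Length;
            }
        }
        return complexity;
    }
}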
You can also use Reflector and the code metrics plug-in available for it.
A library such as this must be able to parse an arbitrary language fragment, and then compute the complexity metrics over the parsed fragment. Most metrics tools have at best a parser for the entire language, not just a fragment, so you are likely to be hard pressed to find many solutions.
There is one system that can provide what you need: our DMS Software Reengineering Toolkit. It provides parsers for many languages (such as Java and C#; it is unclear which language you want to analyze). DMS has already been used to implement these kinds of metrics for several languages (Java, C#, JavaScript, COBOL), and the process of doing so is straightforward. DMS does parse language fragments, and the metrics implementation actually operates on such fragments. You could customize DMS to implement exactly what you want.
See http://www.semanticdesigns.com/Products/DMS/DMSToolkit.html and, for derived metrics tools, http://www.semdesigns.com/Products/Metrics/index.html
