I see more and more open source libraries using .NET 5's Source Generators, which improve their performance.
As far as I can understand from the docs, they are meant to replace System.Reflection because it comes at the expense of performance. Is that true? What I personally know about source generators is that when they were introduced in .NET 5, they were meant to generate C# code based on .proto data contract files.
There is a clone library of MediatR which uses Source Generators instead of System.Reflection.
Could you explain, in simple terms, the benefits and usage of source generators in that MediatR library and in general?
Source generators are really nothing magic - they are just custom pieces of code that generate text into files, which then get inserted into the compilation process and become part of the binary output (e.g. DLL or EXE), as if they had been typed by hand into some source file before you hit compile.
The only "magic" here is the formalized concept of analyzers-as-generators, which enables Visual Studio to automatically pass in the original source code into your custom generator routine and include the output whenever you build your project.
One application of source generators is to create specialized, type-specific code for some operation that would otherwise require run-time reflection. Run-time reflection is typically rather slow and CPU-intensive, yet often needed in order to centralize logic for common operations on unknown objects. A common example is the serialization and deserialization of objects. This could be done either through reflection (look up properties at run time, invoke the getters and setters, etc.) or through faster, type-specific code that directly references properties and reads/writes to and from data streams. However, creating such specialized code for every type is a lot of boring, repetitive work and so - enter source generators. They can do the "reflection" during build time and output slim, fast code into temporary .cs files, which get compiled with the product.
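As a minimal sketch of what such a generator looks like (the class name and generated content here are purely illustrative, using the .NET 5-era ISourceGenerator API): the generator just produces C# text and adds it to the compilation.

    using System.Text;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.Text;

    // Illustrative generator: emits a small helper class at build time,
    // which the rest of the project can use as if it had been written by hand.
    [Generator]
    public class GreeterGenerator : ISourceGenerator
    {
        public void Initialize(GeneratorInitializationContext context)
        {
            // No initialization needed for this sketch.
        }

        public void Execute(GeneratorExecutionContext context)
        {
            const string source = @"
    namespace Generated
    {
        public static class Greeter
        {
            public static string Greet(string name) => $""Hello, {name}!"";
        }
    }";
            // The hint name is just the file name the generated text appears under.
            context.AddSource("Greeter.g.cs", SourceText.From(source, Encoding.UTF8));
        }
    }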
Related
This question is related to How to detect static code dependencies in C# code in the presence of constants?
If type X depends on a constant defined in type Y, this dependency is not captured in the binary code, because the constant is inlined. Yet the dependency is there - try compiling X without Y and the compilation fails. So it is a compile time dependency, but not runtime.
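A minimal illustration of what I mean (type and member names are hypothetical):

    // Y defines a constant; X uses it.
    public static class Y
    {
        public const int Answer = 42;
    }

    public static class X
    {
        // The compiler inlines the value, so the compiled method body is just
        // "load 42" and the binary carries no reference to Y -- yet X will not
        // compile if Y is missing.
        public static int GetAnswer() => Y.Answer;
    }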
I need to be able to discover such dependencies and scanning all the source code is prohibitively expensive. However, I have full control over the build and if there is a way to instruct the C# compiler not to inline constants - that is good enough for me.
Is there a way to compile C# code without inlining the constants?
EDIT 1
I would like to respond to all the comments so far:
I cannot modify the source code. This is not a toy project. I am analysing a big code base - millions of lines of C# code.
I am already using the Roslyn API to examine the source code. However, I only do it when the binary code inspection (I use Mono.Cecil) of a method indicates the use of dynamic types. Analysing methods using dynamic with Roslyn is useful, because not all the dynamic usages are as bad as reflection. However, there is absolutely no way to figure out that a method uses a constant in general. Using a Roslyn analyser for that takes a really long time because of the code base size. Hence my "prohibitively expensive" statement.
I have an NDepend license and I used it at first. However, it only processes binary code. It does NOT see any dependencies introduced through constants. My analysis is better, because I drill down to methods that use dynamic and employ the Roslyn API to harvest as much as I can from such methods. NDepend does nothing of the kind. Moreover, it has bugs. For example, the latest version does not inspect generic method constraints and thus does not recognise any dependencies introduced by them.
I am building a multi-language MVC application and have a series of resource files with translated strings for messages that will be displayed to the user.
Is there any way of ensuring that any resource files added in the future have all required keys and are spelled correctly?
As an analogy, if the resource file was a regular class, you could provide an interface to ensure that all required method and properties were present in the implementing class. Is there a similar concept for resource files?
I've been unable to find a supported way to enforce an explicit contract upon a .resx file. Since your goal is ultimately to catch implementation errors before they show up at runtime (and compile time checking isn't possible), I recommend falling back to static code analysis. Luckily, .NET makes this trivially easy:
Use the System.Resources.ResXResourceReader class to read the contents of the resx files to be validated.
Implement a test that asserts against all required keys in the "contract" you'd like to enforce on the resx.
Test should run as part of an existing test suite, and failure will warn a developer of the implicit contract before encountering the problem at runtime.
Since your resource files will exist in a known location, you can trivially ensure that the tests run against all resx files in that directory. In this way, you don't even need to update the test when new resource files are added, only if the contract changes.
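A sketch of such a test (NUnit-style here; the directory path and key names are just placeholders for your own contract):

    using System.Collections;
    using System.IO;
    using System.Linq;
    using System.Resources;
    using NUnit.Framework;

    [TestFixture]
    public class ResourceContractTests
    {
        // The "contract": keys every resource file must define.
        private static readonly string[] RequiredKeys = { "WelcomeMessage", "GenericError" };

        [Test]
        public void EveryResourceFileContainsAllRequiredKeys()
        {
            foreach (var path in Directory.GetFiles("Resources", "*.resx"))
            {
                using (var reader = new ResXResourceReader(path))
                {
                    var keys = reader.Cast<DictionaryEntry>()
                                     .Select(entry => (string)entry.Key)
                                     .ToList();

                    foreach (var requiredKey in RequiredKeys)
                    {
                        Assert.IsTrue(keys.Contains(requiredKey),
                            $"{Path.GetFileName(path)} is missing required key '{requiredKey}'");
                    }
                }
            }
        }
    }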
I've used a similar approach to help with maintenance of stored procedure names kept in (an extensive number of) resx files. Since the resource files are spread across dozens of projects, manual maintenance is tedious and error-prone -- in other words, it doesn't get done. The static code analysis approach has yielded few downsides, and I think it would work well in your case as well.
Landing page for resource files on MSDN
ResXResourceReader on MSDN
System.Resources.ResXResourceReader requires a reference to System.Windows.Forms. It's available on both .NET and Mono.
I am reading the book "CLR Via C#", and in the Generics chapter it says:
Source code protection
The developer using a generic algorithm doesn't need to have access to the algorithm's source code. With C++ templates or Java's generics, however, the algorithm's source code must be available to the developer who is using the algorithm.
Can anyone explain what exactly is meant by this?
Well, generic classes are distributed in compiled form, unlike C++, where templates need to be distributed in full source code. So you do not need to distribute the C# source code of a library that contains generic classes.
This does not prevent the Receiver of your class from disassembling it though (as it is compiled to IL which can be rather easily decompiled again). To really protect the code, additional methods, such as obfuscation are required.
Behind the scenes: This distribution in compiled form is the reason why C# generics and C++ templates also differ in the way they need to be written. C# generic classes and their methods need to be fully defined at the time of compilation, and any error in the definition of the generic class or its methods, or any operation on a type parameter that cannot be resolved at compile time, will directly produce a compile error. In C++, a template is only compiled at the time of usage, and only the methods actually used are compiled. If you have an undefined operation or even a syntactical error in a template definition, you will only see the error when that function is actually instantiated and used.
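A small illustration of the C# side of that difference (the Sorter class is hypothetical): an operation on an unconstrained type parameter fails when the generic class itself is compiled, not when someone later uses it.

    using System.Collections.Generic;

    public static class Sorter<T>
    {
        public static bool IsGreater(T a, T b)
        {
            // return a.CompareTo(b) > 0;   // compile error right here:
            //                              // CompareTo is not known on an unconstrained T

            // The operation must be expressible for any T at compile time,
            // e.g. via a constraint (where T : IComparable<T>) or a comparer:
            return Comparer<T>.Default.Compare(a, b) > 0;
        }
    }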
It seems like the documentation around Roslyn is a bit lacking?
I am not able to find good comprehensive documentation.
What I am trying to do essentially is copy the public surface of an existing API (.dll)
into a new assembly (need to create source code .cs files!) and at the same time make a variety of transformations to the resulting code (think making wrapper classes).
Would really appreciate any help on how I can use Roslyn to load the initial SyntaxTree from an existing assembly and how to do those basic transforms (for example, excluding internal classes, etc.).
In the current Roslyn CTP there is a Roslyn.Services.MetadataAsSource namespace which can be used to convert a type's public interface to source code. This is what we implement the F12 "metadata as source" feature with. Now, it generates only a shell of source code which won't actually compile, so you'd have to use further APIs to munge the syntax tree into what you want. Alternatively, you could use the Roslyn.Services.CodeGeneration namespace to generate source from these symbols automatically. I should warn that the MetadataAsSource namespace may go away in future versions of the API.
You can import symbols from metadata by creating an otherwise empty compilation with the metadata references you care about added, and then from that compilation browsing the type hierarchy from the GlobalNamespace property, or calling Compilation.GetReferencedAssemblySymbol() and then digging through that. This is actually far better than using reflection, since it'll properly express the symbol model from the "C# perspective" instead of the "CLR perspective" -- reflection won't give you information for uses of dynamic, some default parameter values, etc.
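A rough sketch of that approach using the current Microsoft.CodeAnalysis API (the CTP-era names differ slightly; the assembly path is illustrative):

    using System;
    using System.Linq;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.CSharp;

    class MetadataBrowser
    {
        static void Main()
        {
            // Illustrative path to the assembly whose public surface we want to inspect.
            var reference = MetadataReference.CreateFromFile(@"SomeLibrary.dll");

            // An otherwise empty compilation that only carries the metadata reference.
            var compilation = CSharpCompilation.Create("Empty").AddReferences(reference);

            var assembly = (IAssemblySymbol)compilation.GetAssemblyOrModuleSymbol(reference);
            PrintPublicTypes(assembly.GlobalNamespace);
        }

        static void PrintPublicTypes(INamespaceSymbol ns)
        {
            foreach (var type in ns.GetTypeMembers()
                                   .Where(t => t.DeclaredAccessibility == Accessibility.Public))
            {
                Console.WriteLine(type.ToDisplayString());
            }

            foreach (var child in ns.GetNamespaceMembers())
            {
                PrintPublicTypes(child);
            }
        }
    }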
It seems like the documentation around Roslyn is a bit lacking? I am not able to find good comprehensive documentation.
Roslyn is at the Community Technology Preview stage, so it's not surprising that its documentation is lacking. You can find some resources in the Roslyn API documentation.
What I am trying to do essentially is copy the public surface of an existing API (.dll) into a new assembly (need to create source code .cs files!) and at the same time make a variety of transformations to the resulting code (think making wrapper classes).
Working with assemblies this way is not something Roslyn can do. But it seems that for what you want, reflection for reading the assembly combined with Roslyn for writing the new code would work. You would, however, need to write all the code to translate from the reflection model to Roslyn's model (e.g. Type → TypeDeclarationSyntax, MethodInfo → MethodDeclarationSyntax, etc.).
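A rough sketch of what that translation layer could look like (the wrapper shape and naming are purely illustrative; real code would also have to handle parameters, generics, forwarding bodies, and so on):

    using System;
    using System.Linq;
    using System.Reflection;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;
    using static Microsoft.CodeAnalysis.CSharp.SyntaxFactory;

    static class WrapperEmitter
    {
        // Reads a type via reflection and emits a wrapper class declaration with method stubs.
        public static string EmitWrapper(Type type)
        {
            var methodStubs = type
                .GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly)
                .Select(m => (MemberDeclarationSyntax)
                    MethodDeclaration(ToTypeSyntax(m.ReturnType), m.Name)
                        .AddModifiers(Token(SyntaxKind.PublicKeyword))
                        .WithBody(Block()))   // empty body; real code would forward the call
                .ToArray();

            var wrapper = ClassDeclaration(type.Name + "Wrapper")
                .AddModifiers(Token(SyntaxKind.PublicKeyword))
                .AddMembers(methodStubs);

            return wrapper.NormalizeWhitespace().ToFullString();
        }

        static TypeSyntax ToTypeSyntax(Type type) =>
            type == typeof(void)
                ? PredefinedType(Token(SyntaxKind.VoidKeyword))
                : ParseTypeName(type.FullName ?? type.Name);
    }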
I am writing an application that allows the user to create custom algorithms for computing values over a collection of objects. Simply put, I will have a string with the source code of a class with one method.
The solution I have implemented is to compile the string source code into a separate dll for each such custom algorithm and then load them using Assembly.Load and instantiate the class saved in the dll. From a maintainability point of view, this means that I have to store the source code in the db (for example) and also manage the existence of the compiled dlls (recreating them by compiling the source code again if they are missing).
Is there a better way to do this, considering the new features of .Net 4.0?
EDIT:
The input source code is C# and I am using CSharpCodeProvider to compile the code. The custom classes are all derived from a base class and they override the method that actually holds the computation logic. What I would really like to do is to get rid of the dll management and not lose (too much) performance in compiling all the classes every time my application starts up.
I would look at scripting languages; IronPython is easy to embed, or there are JavaScript engines for .NET. Simple, and usually fast enough.
If (per the comments) you need to use C#, I would:
build all the current methods at the same time into one assembly; solves a lot of problems
if the data changes during execution, make use of AppDomains so that I can unload them
I've done something similar where the model/rules were XML, running it through a transform to get C#, and compiling with CSharpCodeProvider (or whatever); and simply polling every minute or so to see if a new build is required.
The CSharpCodeProvider has been around for a while and should fit the bill. It can be used to generate the separate libraries like you have been doing (perhaps you are already using the CSharpCodeProvider), but it can also be used to generate class objects dynamically in memory. If the generated classes all implement an interface, you can cast the objects to that interface, or you can use reflection to invoke your logic. Here is a CodeProject article that achieves something similar:
http://www.codeproject.com/KB/dotnet/dynacodgen.aspx
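For reference, a minimal sketch of that pattern (the ICustomAlgorithm interface is just an illustrative stand-in for your own base class):

    using System;
    using System.CodeDom.Compiler;
    using Microsoft.CSharp;

    public interface ICustomAlgorithm
    {
        double Compute(double input);
    }

    static class AlgorithmCompiler
    {
        public static ICustomAlgorithm Compile(string source)
        {
            var parameters = new CompilerParameters
            {
                GenerateInMemory = true   // avoid managing .dll files on disk
            };
            parameters.ReferencedAssemblies.Add(typeof(ICustomAlgorithm).Assembly.Location);

            using (var provider = new CSharpCodeProvider())
            {
                CompilerResults results = provider.CompileAssemblyFromSource(parameters, source);
                if (results.Errors.HasErrors)
                    throw new InvalidOperationException("Compilation failed");

                // Find the type implementing the interface and instantiate it.
                foreach (var type in results.CompiledAssembly.GetTypes())
                {
                    if (typeof(ICustomAlgorithm).IsAssignableFrom(type))
                        return (ICustomAlgorithm)Activator.CreateInstance(type);
                }
                throw new InvalidOperationException("No ICustomAlgorithm implementation found");
            }
        }
    }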