Can I prevent the CLR from optimizing away debugging information? - c#

I've written an abstract base class for unit tests that sets up just enough environment for our tests to run. The class exposes some of the runtime environment bits as properties whose types vary test by test (the property types are type arguments specified in the inheriting, concrete test class).
This is all well and good, except a co-worker noticed that he can't view any of the class' properties in the debugger. Turns out the reason is that he had no fields defined in his inheriting class, and the CLR optimized something or other away, so the debugger couldn't display the properties. Is it possible to prevent this in the base class somehow, or do I have to resort to telling everyone they need to define at least one field which is used somewhere during the tests?
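Roughly, the shape of the base class is something like the following sketch (the class name, type parameter, member names, and MSTest attribute usage here are illustrative, not the actual code):

using Microsoft.VisualStudio.TestTools.UnitTesting;

// Illustrative sketch only; the real base class and member names differ.
public abstract class TestBase<TEnvironment>
{
    // The property type is supplied by the inheriting, concrete test class.
    public TEnvironment Environment { get; private set; }

    [TestInitialize]
    public void BaseInitialize()
    {
        Environment = CreateEnvironment();
    }

    protected abstract TEnvironment CreateEnvironment();
}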
Edit:
Sounds like the likely culprit is the optimization/debug settings. That said, I'm building the app from Visual Studio in Debug mode, I've double-checked that all projects are set for a debug build, and none of the projects in this solution have the Optimize flag set.
Perhaps it would also be relevant to note that I'm using MSTest and the Visual Studio test runner.
Edit 2:
By "can't view properties" I'm referring to when I evaluate the property in Quickwatch and get a red exclamation mark and a text "Could not evaluate expression" error text. And lest you think I'm entirely off base with my suspicions, adding an instance field that gets initialized in the test initialize method makes the problem go away...
Edit 3:
Checked the build output. I notice that the compiler is invoked with these options:
/debug+
/debug:full
/optimize-
/define:DEBUG,TRACE
I should think that would be enough to stop this from happening, but there you go. :)

I've encountered this same problem before, and it's invariably due to the fact that Debug mode has been turned off in some way. Try checking each of the following:
The current build configuration for the solution and the appropriate project(s) is Debug.
In the Build tab of the property pages, the Optimize code checkbox is unchecked.
If this is all correct, then I recommend you paste the text written to the Output window here so we can potentially spot a more unusual cause of the issue.

Make sure you aren't trying to debug your release build.
All of these compile settings sit behind the build configurations.
The debug version is the one for debugging ;-)

In my case, the project configuration was correct, but my code was the result of a decompilation from ILSpy, and it had assembly attributes like the following:
[assembly: CompilationRelaxations(8)]
[assembly: RuntimeCompatibility(WrapNonExceptionThrows = true)]
[assembly: Debuggable(DebuggableAttribute.DebuggingModes.IgnoreSymbolStoreSequencePoints)]
Removing these attributes fixed the debugger...
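For comparison, a normal Debug build typically emits a Debuggable attribute along the lines of the sketch below; the decompiled attribute above requests only IgnoreSymbolStoreSequencePoints, so it never asks the JIT to disable optimizations (the exact flag set can vary by compiler version, so treat this as illustrative):

using System.Diagnostics;

// Typical attribute emitted for a Debug build (/debug:full /optimize-):
[assembly: Debuggable(DebuggableAttribute.DebuggingModes.Default
    | DebuggableAttribute.DebuggingModes.DisableOptimizations
    | DebuggableAttribute.DebuggingModes.IgnoreSymbolStoreSequencePoints
    | DebuggableAttribute.DebuggingModes.EnableEditAndContinue)]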

Related

Always treat #nullable enable/disable as error

Presently in our build, some classes have the #nullable enable/disable/restore directive set in some of the source files.
As a way of improving our quality I would like to make the build fail if someone sets this pre-processor directive.
If the project is set to <Nullable>enable</Nullable> or <Nullable>disable</Nullable>, that is fine, but within a class file make it fail.
Is there a way to make this happen, or do I just need to go through the source files and resolve them? (But that doesn't stop someone from bypassing the project setting by doing this again in the future.)
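I'm not aware of a built-in compiler switch for this, so one option is a small custom Roslyn analyzer that reports an error for any #nullable directive. The sketch below is illustrative only (the rule ID, title, and message are made up); you would reference it from every project as an analyzer:

using System.Collections.Immutable;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.Diagnostics;

// Sketch only: rule ID, title, and message are placeholders.
[DiagnosticAnalyzer(LanguageNames.CSharp)]
public sealed class NoNullableDirectiveAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        id: "NN0001",
        title: "Do not use #nullable directives in source files",
        messageFormat: "Remove the #nullable directive; configure nullability via <Nullable> in the project file",
        category: "CodeQuality",
        defaultSeverity: DiagnosticSeverity.Error,
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.None);
        context.EnableConcurrentExecution();
        context.RegisterSyntaxTreeAction(AnalyzeTree);
    }

    private static void AnalyzeTree(SyntaxTreeAnalysisContext context)
    {
        var root = context.Tree.GetRoot(context.CancellationToken);

        // #nullable enable/disable/restore shows up as NullableDirectiveTrivia.
        foreach (var trivia in root.DescendantTrivia()
                     .Where(t => t.IsKind(SyntaxKind.NullableDirectiveTrivia)))
        {
            context.ReportDiagnostic(Diagnostic.Create(Rule, trivia.GetLocation()));
        }
    }
}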

"Internal error in the expression evaluator"

I've encountered a problem with the expression evaluator of Visual Studio 2015 that says "Internal error in the expression evaluator". After some investigation I found that it is caused by an assembly that is loaded using reflection. The assembly doesn't throw any exceptions, but after it is loaded the VS expression evaluator fails.
This is not the only assembly that I load; there are others that work fine and don't affect the evaluator.
To overcome this issue I had to check Menu > Options > Debugging > 'Use Managed Compatibility Mode', but this disables the 'Edit & Continue' feature. What can I do?
Properties of the offending assembly:
its name is the same as the main assembly's
(I changed its name but nothing happened)
all of my projects use .NET 4.5
all root namespaces are the same
(across all of the assemblies)
Thanks!
That sounds like a bug in the expression evaluator. For a better workaround, instead of checking "Use Managed Compatibility Mode", check "Use the legacy C# and VB expression evaluators". This should enable you to continue using Edit and Continue as well as other features added within the last few releases.
In terms of the "Internal error in expression evaluator", can you please open an issue by going to Help -> Send Feedback -> Report a problem? That will help us fix the problem in future releases.
Just extending the solution provided by Patrick Nelson. For Visual Studio 2015+, the steps are as follows.
If you're debugging at the moment, this option will be unavailable. Stop the debugger.
Go to Tools -> Options, then select Debugging -> General and scroll down to find the '...legacy C# and VB expression evaluators' option:
More information is provided here:
Switching to Managed Compatibility Mode
Note: there are also some serious drawbacks to switching to the legacy evaluator. In particular, runtime reflection over implemented interfaces becomes nearly impossible or extremely inconvenient in the debugger, and some other reflection calls will throw errors.
I finally figured out what created this problem in my Visual Studio!
The quick fix is to delete the debug object favorites from the "Documents/Visual Studio xx/Visualizers" folder and restart Visual Studio.
When you "pin" a variable in the debugger, Visual Studio saves a 'favorite' json object for it.
It appears that there is a bug in Visual Studio which corrupts the 'favorite' for some child variables that are dynamic in nature (not exactly sure of the conditions though).
For me, checking the "Use Managed Compatibility Mode" option worked. I was also seeing question marks instead of properties and values when hovering over variables.
I resolved this issue by simply resetting my Visual Studio settings: go to Tools -> Import and Export Settings and choose to reset to the default settings.
I had the same issue with VS2019. I ended up deleting my Documents/Visual Studio 2019 folder. Hope this might help someone, one day. Cost me a day.
PS. Probably not required to delete all, and of course not your projects (if they're in there), but in my case, everything in there was autogenerated by VS.
I of course tried all the solutions mentioned here, and even reinstalling VS didn't work. Renaming the class was the 'trigger' that made me suspect there must be some cache, even though cleaning symbols and the like hadn't worked.
In my case I was trying to evaluate a lambda expression on a List<> and got this error ("Internal error in the expression evaluator"). I was using VS2015, so lambda expressions were allowed in the debugger. It turned out the expression evaluator was missing the LINQ extension methods. I added
using System.Linq;
to my current class and voilà! The lambda evaluated.
I encountered the "internal error in the expression evaluator" error when I was debugging in release mode instead of in debug mode. I had changed it to Release when publishing to production and forgot to change it back to Debug.
Check your use of the [DebuggerBrowsable] attribute; I found a minimal case in Visual Studio 2017 15.5 and posted it here.
In this particular case, the expression evaluator (EE) crash appears related to the [DebuggerBrowsable] attribute applied to a property overriding a field of the same name. This will account for some percentage of the cases that people are experiencing out there, but there's no way of knowing how many are due to this specific issue until it gets fixed.
The full demonstration example is included below.
Machine-readable copy of the code:
using System;
using System.Diagnostics;

class Program { static void Main() => new _derived(); }

abstract class _base
{
    [DebuggerBrowsable(DebuggerBrowsableState.Never)]
    public Object trace;
}

class _derived : _base
{
    public _derived() => Debugger.Break(); // <-- vs2017 EE crash when stopped here

    [DebuggerBrowsable(DebuggerBrowsableState.Never)]
    new public Object trace => base.trace;
}
In my case, I had two copies of the same DLL in two different folders (it seems one of them was stale). Deleting the DLL and rebuilding the solution solved my issue.
In my case the data I was attempting to inspect was extremely large: a string which unexpectedly had hundreds of megabytes of data in it. The issue wasn't apparent when the amount of data being inspected was reasonable.

Build entire solution but add global Conditional Compilation Symbols for just one project

I have a fairly complex solution, containing 10 projects aside from test projects.
It is a network of distributed applications & services that communicate using remoting; therefore having the proper referenced assemblies (& versions) is crucial. That's why I want the whole thing to be compiled and shrink-wrapped in ONE build.
One of the applications is a demo/analysis tool that runs a subprocess of another - much bigger - application based on the user's input and displays the results; that way engineers have a tool to help tweak their settings for "the big computation". Obviously that subprocess is contained in another assembly, and a big part of the results presented to the engineers is generated by
#if ENABLE_TRACE_MATCHING
Trace.WriteLine("Some engineering output");
#endif
My problem is that conditional compilation symbols in the project settings are limited to that project's assembly and do not propagate to referenced assemblies.
How can I configure my build so that all projects are built without ENABLE_TRACE_MATCHING defined, except for the one debug/analysis-app project, where all referenced projects/assemblies must be compiled with ENABLE_TRACE_MATCHING defined?
I also cannot replace #if ENABLE_TRACE_MATCHING by #if DEBUG, since that would enable a whole lot of different output our engineers wouldn't know how to handle.
Thanks in advance.
PS: If you think my code smells, then I agree. Additionally: It's mostly not my code ;)
You need to learn more about MSBuild (Microsoft Build), the out-of-the-box .NET build tool present in any framework installation.
Using MSBuild you can define these "symbols" (properties) and batches of commands (targets).
That means you can create an MSBuild script that imports the default Visual Studio targets from all the projects in your solution and declares these properties ("symbols") in the script.
In fact, the property to set such symbols already exists: "DefineConstants".
So you can have that MSBuild script provide the property value, re-declaring it there, so that ALL MSBuild targets will know about these symbols.
EDIT:
Check this other question too:
msbuild, defining Conditional Compilation Symbols
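To illustrate the DefineConstants suggestion above (the project name here is just an example), you could run a separate build of the analysis tool with the extra symbol:
msbuild DemoAnalysisTool.csproj /p:DefineConstants="TRACE;ENABLE_TRACE_MATCHING"
Be aware that a DefineConstants value supplied on the command line is a global property, so it replaces (rather than appends to) the constants defined in each project file for that build.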

When to use preprocessor directives in .net?

I think this is a simple question so I assume I'm missing something obvious. I don't really ever use preprocessor directives but I was looking at someone's code which did and thought it was something I should be familiar with.
So I looked at the MSDN example here; it has this code:
#define DEBUG
// ...
#if DEBUG
Console.WriteLine("Debug version");
#endif
My two questions are:
in the example above, why do they define DEBUG? I was under the impression that it was set automatically when you compile in Debug vs. Release mode?
looking at the other example, which has #define MYTEST and then writes to the console depending on whether it is defined - how does this differ from just using a variable? What am I missing here?
I would actually recommend using the Conditional Attribute instead of inline #if statements.
[Conditional("DEBUG")]
private void DeleteTempProcessFiles()
{
}
Not only is this cleaner and easier to read, since you don't end up with #if/#else scattered through your code, but this style is also less prone to errors, both during normal code edits and in terms of logic-flow mistakes.
Generally, the optional/conditional compilation symbols will be provided by the build script. It is pretty rare to see #define, except for very debug-centric code (if you see what I mean).
Re using a variable: I often use such conditions to handle code that must run on different runtimes (Mono, CF, Silverlight, etc.). A variable would not suffice, because the code could not even be compiled against the wrong platform (missing types/methods, etc.).
In the example presented I would probably just have used Debug.WriteLine; since this is decorated with [Conditional("DEBUG")], all calls to it are automatically removed if DEBUG is not defined at build.
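As a quick sketch of what that substitution looks like:

using System.Diagnostics;

class Example
{
    static void Main()
    {
        // Debug.WriteLine is itself marked [Conditional("DEBUG")], so this call
        // (including evaluation of its arguments) is stripped from builds where
        // DEBUG is not defined.
        Debug.WriteLine("Debug version");
    }
}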
in the example above, why do they define DEBUG? I was under the impression that it was set automatically when you compile in Debug vs. Release mode?
Probably because it is example code. It is meant to demonstrate how #ifdef and friends work. I wouldn't expect you to define symbols like that in source files, unless it is for a quick test.
looking at the other example, which has "#define MYTEST" and then writes to the console depending on whether it is defined - how does this differ from just using a variable? What am I missing here?
If MYTEST is not defined at compile time, the compiler will not actually emit the code between the #if and #endif directives. Therefore the resulting IL will be smaller.
Also, note that strictly speaking these are not preprocessor directives in C#: there is no separate preprocessing pass; the directives are handled by the compiler itself.
If you use a variable, all of your code is compiled; when you use preprocessor directives, only part of the code is included in the executable/DLL.
I would like to give one example of where I have used a preprocessor directive in my project.
My program creates a lot of intermediate files on disk. I used the DEBUG symbol to delete those files only when the project is built in release mode; otherwise I keep them so that we can view the intermediate files and determine what's happening inside.
When my app runs on the production server, I build the project in release mode, so those files are deleted after processing is complete.
#if !DEBUG
    deleteTempFiles();
#endif
I have some code which needs different handling when running on Mono instead of the CLR, so I have a MONO symbol defined in some of my modules. I think this is a better example than DEBUG.
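A minimal sketch of that pattern, assuming a MONO symbol is supplied by the build (it is not defined automatically by the compiler):

public static class RuntimeInfo
{
    public static string Describe()
    {
#if MONO
        // Compiled only when the build defines MONO (e.g. via DefineConstants).
        return "Running under the Mono runtime";
#else
        return "Running under the CLR";
#endif
    }
}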
I've used it for a lot of things: debug messages that I only want in debug builds, cleaning up temp files, and including diagnostic functions or actions.

C# .NET Application Crashes Immediately After Starting

I was experimenting with the assembly and file version numbers. My program runs well from the IDE, but after creating a setup file and installing it, the application crashes with an InvalidDeploymentException.
What should I do to resolve the matter?
The [AssemblyVersion] and [AssemblyFileVersion] attributes play different roles. [AssemblyVersion] is only visible to managed code and is important for the GAC. Whenever you make a breaking change to the assembly's public interface you should bump this number up.
The compiler embeds an unmanaged Win32 resource in the assembly (you can also supply your own with the /win32res command-line option). This includes the VERSIONINFO resource, readable by all unmanaged code, including the shell. It determines what you see when you right-click the assembly in Explorer and look at the Details property page. The "File version" value shown there is set by the [AssemblyFileVersion] attribute. The [AssemblyVersion] value isn't visible there; Explorer doesn't (yet) know how to read that.
It is up to you to decide how to use this attribute. The crash indicates that there's some minimum sanity checking going on in the deployment code; I've never tried to get it wrong myself to see what would happen. Making them the same would, however, make a lot of sense.
Microsoft uses [AssemblyFileVersion] a different way, they automatically increment it for each build and nail [AssemblyVersion] down. That's a good idea and the strategy I use. What is however quite ironic is that the automatic version increment feature works exactly backwards, it can only auto-increment [AssemblyVersion]. Sigh.
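A sketch of that strategy in AssemblyInfo.cs (the version numbers are made up):

using System.Reflection;

// Visible to the CLR and used for binding/GAC identity; bump it manually on breaking changes.
[assembly: AssemblyVersion("2.0.0.0")]
// Shown as "File version" on the Explorer Details page; incremented on every build.
[assembly: AssemblyFileVersion("2.0.0.317")]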
Try using the Fusion log viewer (fuslogvw.exe) to see what's failing to load in your deployed app.
