Following on from this Microsoft tutorial, I have created a Roslyn analyzer.
According to the page, you can mark the rule as DiagnosticSeverity.Error, and this will cause the build to break:
In the line declaring the Rule field, you can also update the severity of the diagnostics you’ll be producing to be errors rather than warnings. If the regex string doesn’t parse, the Match method will definitely throw an exception at run time, and you should block the build as you would for a C# compiler error. Change the rule’s severity to DiagnosticSeverity.Error:
internal static DiagnosticDescriptor Rule =
new DiagnosticDescriptor(DiagnosticId, Title, MessageFormat,
Category, DiagnosticSeverity.Error, isEnabledByDefault: true, description: Description);
In my code, I have created the rule more or less as detailed here:
private static readonly DiagnosticDescriptor Rule =
new DiagnosticDescriptor(DiagnosticId, Title, MessageFormat, Category,
DiagnosticSeverity.Error, true, helpLinkUri: HelpUrl);
This rule works fine: it shows the red squiggles and displays the message in the Error List. However, the build succeeds, and I am able to run the application.
NB: I've created this rule to capture Thread.Sleep for this example.
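For context, the detection itself is the standard syntax-node pattern; a minimal sketch (everything here other than the Roslyn APIs is illustrative, not my exact code) looks like this:

```csharp
// Sketch of an analyzer that flags Thread.Sleep calls; "Rule" is the
// DiagnosticDescriptor declared above, other identifiers are illustrative.
public override void Initialize(AnalysisContext context)
{
    context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.None);
    context.EnableConcurrentExecution();
    context.RegisterSyntaxNodeAction(AnalyzeInvocation, SyntaxKind.InvocationExpression);
}

private static void AnalyzeInvocation(SyntaxNodeAnalysisContext context)
{
    var invocation = (InvocationExpressionSyntax)context.Node;
    if (context.SemanticModel.GetSymbolInfo(invocation).Symbol is IMethodSymbol method
        && method.Name == "Sleep"
        && method.ContainingType?.ToDisplayString() == "System.Threading.Thread")
    {
        // Reported with DiagnosticSeverity.Error, yet the build still succeeds.
        context.ReportDiagnostic(Diagnostic.Create(Rule, invocation.GetLocation()));
    }
}
```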
Is there additional setup required to ensure a rule breaks the build?
This is a feature of Analyzers running from a VSIX file.
If the IDE-installed rules ran as part of the in-IDE build, it would result in IDE builds and command line builds having potentially very different outputs. For example, a user with code-cracker installed as a VSIX could end up filing a bug report that an open source project does not build due to an analyzer error (or perhaps a warning when the project uses /warnaserror). They would be forced to either uninstall the analyzer extension or modify the rule set used by the project to disable some rule that only exists on one developer's machine.
In contrast, rules that are installed via NuGet become part of the project and part of the build. They run the same way across developer machines, and they run the same way in-IDE, on the command line, and in automated build environments.
Source: IDE rules don't fail builds
To make the build fail on these rules, you need to add the analyzer to the project as a NuGet package. This ensures that violations will fail the build as expected.
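For example, reference the analyzer package from the consuming project's .csproj ("MyAnalyzers" is a placeholder name here, not a real package):

```xml
<!-- In the consuming project's .csproj; the package name is a placeholder. -->
<ItemGroup>
  <!-- PrivateAssets="all" keeps the analyzer from flowing to projects
       that reference this one. -->
  <PackageReference Include="MyAnalyzers" Version="1.0.0" PrivateAssets="all" />
</ItemGroup>
```

Because the analyzer now travels with the project, it runs identically in the IDE, on the command line, and on the CI server.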
Related
IntelliSense reports errors because it disregards (does not see) the source produced by the source generator, even though the build reports no errors.
I've tried building/rebuilding multiple times. The generated sources exist and are fine: I can navigate into them with Ctrl+Click, and the build succeeds.
Still, the red underlines and IntelliSense errors are there...
Question
What am I missing?
Let me clarify the comment by @Hans Kesting, because I've recently been down this frustrating path:
There are typically multiple caches involved here.
One issue is that Visual Studio does not allow for analyzer assemblies to be unloaded once they are loaded. Once Visual Studio has loaded your analyzer for use in the IDE and Intellisense, it will keep using that version until you close Visual Studio, or at least until you increase the assembly version.
However, when you hit build/rebuild for your project Visual Studio will spawn a new msbuild process which will (typically) load a fresh version of your analyzer. Thus you may end up with a project that builds fine but doesn't update the IDE and Intellisense.
Another cache issue concerns incremental builds with IIncrementalGenerator. This newer version of the source generator will, if you play it right, cache the last execution and reuse the output for the IDE/Intellisense if nothing related has changed. This typically requires you to implement a custom equality comparer for the content of the source syntax node. However, if this comparison fails to take into account the relevant content (i.e. that which actually changed with the last keypress) the generator will not be executed and the IDE/Intellisense will not be updated. Again, msbuild may still run fine because each new build ignores any previous output cache and just feeds the analyzer every source node from the start.
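To illustrate the caching side, here is a sketch of the usual fix: project each syntax node down to a small equatable model, so the pipeline's built-in comparison sees real content changes (all identifiers here are made up for illustration):

```csharp
// A cheap value-type model of what the generator actually cares about.
// Records get value equality for free, so the incremental pipeline can
// tell whether the relevant content changed since the last keypress.
internal sealed record MethodModel(string ContainingType, string Name);

public void Initialize(IncrementalGeneratorInitializationContext context)
{
    var models = context.SyntaxProvider.CreateSyntaxProvider(
        static (node, _) => node is MethodDeclarationSyntax,
        static (ctx, _) =>
        {
            var decl = (MethodDeclarationSyntax)ctx.Node;
            return new MethodModel(
                ctx.SemanticModel.GetDeclaredSymbol(decl)?.ContainingType.Name ?? "",
                decl.Identifier.Text);
        });

    context.RegisterSourceOutput(models, static (spc, model) =>
    {
        // Generate source from the model here; this step only re-runs
        // when the model (not the raw syntax tree) changes.
    });
}
```

Comparing the whole syntax node (the default for reference types without a custom comparer) tends to defeat the cache or, worse, a faulty custom comparer makes the IDE skip re-generation when it shouldn't.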
I've got a solution containing two MVC 5 web applications with associated class libraries and the code analysis settings are causing the build to hang. If I try to interact with the UI during this time I get the "VS is busy" bubble. Leaving the build to complete overnight doesn't work either.
To troubleshoot this I turned off code analysis on all projects and the project builds just fine [0]. So I enabled the "Microsoft All Rules" on one of the MVC projects and the build process doesn't complete.
"Microsoft Managed Minimum Rules" builds fine. What I'd now like is some structured way of working through the rulesets, where the next one I try is a superset of the last successful one. Does such a hierarchy exist, and if so, is there a canonical reference for it?
Once I get to that level then I can start to isolate individual rules, perhaps by increasing the verbosity of the build output...
[0]
This statement should not be interpreted as "Building without code analysis is perfectly okay"
A general hierarchy is exposed via the Include elements in the .ruleset files located under the Visual Studio install directory (e.g.: C:\Program Files (x86)\Microsoft Visual Studio 14.0\Team Tools\Static Analysis Tools\Rule Sets for a typical VS 2015 installation). Broadly, it looks something like this (with "All Rules" not actually depending on any of the others):
All Rules
Extended Correctness Rules
Basic Correctness Rules
Minimum Recommended Rules
Extended Design Guideline Rules
Basic Design Guideline Rules
Minimum Recommended Rules (same as above)
Globalization Rules
Security Rules
It's also worth noting that this isn't a clean hierarchy without overlaps. For example, rules included in the "Globalization" and "Security" rulesets are also included in some of the others (including the "Minimum" set).
To inherit from a ruleset file, you can include it with:
<Include Path="MyOther.ruleset" Action="Default" />
Then you can override the action for specific rules.
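For example, a custom ruleset that inherits the built-in minimum set and then overrides a couple of rules (the rule IDs are just examples) might look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="My Rules" Description="Minimum rules plus overrides" ToolsVersion="14.0">
  <!-- Inherit everything from the built-in minimum ruleset... -->
  <Include Path="MinimumRecommendedRules.ruleset" Action="Default" />
  <!-- ...then override the action for individual rules. -->
  <Rules AnalyzerId="Microsoft.Analyzers.ManagedCodeAnalysis"
         RuleNamespace="Microsoft.Rules.Managed">
    <Rule Id="CA1001" Action="Error" />
    <Rule Id="CA2213" Action="None" />
  </Rules>
</RuleSet>
```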
I'm trying to run an analysis on SonarQube using a custom FxCop rule.
In SonarQube 4.5.7 I add the rule to the rule set, activate it, and then run the analysis.
To run the analysis I use the following sequence of commands:
1) MSBuild.SonarQube.Runner.exe begin /k:my.project.C-Sharp-ConsoleApp /n:C-Sharp-ConsoleApp /v:1.1
2) "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild" /T:Rebuild
3) MSBuild.SonarQube.Runner.exe end
I see that the rule is executed, because when I run the second command I read the following part of log:
...
(RunCodeAnalysis target) ->
MSBUILD : warning CR1000: MyRules : Field 'CSharpSortNames.m_variabile' is not in Hungarian notation. Field name should be prefixed with 's'. [C:\Users\Alessandro\Documents\Visual Studio 2015\Projects\C-Sharp-ConsoleApp\C-Sharp-ConsoleApp\C-Sharp-ConsoleApp.csproj]
My custom rule's CheckId is CR1000, and after the third command I can see that an issue for this rule has been found, but the web app doesn't show me where. For every other issue the web app shows the precise line of the violation via a link to the .cs file; for my rule it doesn't.
Can anyone help me with this?
A further problem: in SonarQube 5.4 the same rule is activated, but the web app does not show the issue at all.
The root cause here is that FxCop uses information from the PDB file to provide location information. However, the PDB only contains information that would be useful for debugging scenarios, which means that FxCop rule violations associated with non-executable code (e.g.: field declarations or interface definitions) will not have location information available. (FWIW, there is an open SonarQube issue for addressing this, but it would be non-trivial to accomplish unless SonarQube were to directly examine the source files to attempt to locate the field declaration. I rather suspect they might not bother, given that it is simpler to address via a Roslyn analyzer.)
Further problem is in SonarQube 5.4 the same rule is activated but web app does not show the error.
That's because older versions of the C# plugin for SonarQube completely ignored FxCop violations without location information. This was addressed in version 5.2 of the plugin, which only became available in early May 2016 (and is presumably what you used when you installed SonarQube 5.5). It is compatible with version 5.4 of SonarQube, so you should be able to use it with your older installs if you like.
Are there any performance costs when static code analysis is enabled in a production (release) build?
Our CI server runs code analysis on a debug build of our C# projects, whereas the release build has static code analysis disabled (i.e. CODE_ANALYSIS not defined). If there's no reason to disable code analysis on production builds, then I'm wasting time with the debug build.
Reflector shows me that SuppressMessage attributes are excluded if code analysis is disabled, but I don't expect the extra attribute to affect run-time performance. Is that the only effect of enabling static code analysis (in Visual Studio 2013)?
There are actual differences depending on whether you compile with the CODE_ANALYSIS constant defined. For example, when it is not defined, the compiler will remove all [SuppressMessage] attributes from the assembly (which may cause messages to show up when you later run FxCop from the command line, since the suppressions have been removed). If you're installing your binaries on an internal system, it may be fine to leave the suppressions in the binaries. Some companies want them removed from assemblies released to 3rd parties, since the presence of these attributes (and the contents of the Justification properties) might disclose sensitive information.
When running Code Analysis on a DEBUG build you might get stricter results: certain optimizations that occur in most RELEASE builds can cause specific FxCop violations to get lost. The optimizer may remove private methods (through inlining) or replace reads of constants with their values rather than references to the constant's definition. FxCop cannot validate these items, since they have been removed. This is to be expected.
For best results: Run Code Analysis in a Debug build. For least information disclosure, remove the CODE_ANALYSIS constant from Release builds.
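One common way to wire this up in the .csproj (the exact constants beyond CODE_ANALYSIS depend on your project; this is a sketch):

```xml
<!-- Debug: run analysis and keep [SuppressMessage] in the compiled IL. -->
<PropertyGroup Condition="'$(Configuration)' == 'Debug'">
  <DefineConstants>DEBUG;TRACE;CODE_ANALYSIS</DefineConstants>
  <RunCodeAnalysis>true</RunCodeAnalysis>
</PropertyGroup>
<!-- Release: CODE_ANALYSIS is not defined, so the conditional
     [SuppressMessage] attributes are compiled out of the assembly. -->
<PropertyGroup Condition="'$(Configuration)' == 'Release'">
  <DefineConstants>TRACE</DefineConstants>
  <RunCodeAnalysis>false</RunCodeAnalysis>
</PropertyGroup>
```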
On TFS 2010 I configured some CI builds that run MSTest unit tests too. This works fine, except for one solution, where I usually (but not always) get the following build (not test runner) error:
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\TeamTest\Microsoft.TeamTest.targets (14): Object reference not set to an instance of an object.
The Logging Verbosity of the Build Definition is set to Diagnostic; some social.msdn.com posts suggest this gets rid of what they describe as a very occasional error. If it really were occasional, by the way, we could work around it by scheduling another build whenever one breaks for the above reason. It isn't, however, and the solution takes a lot of time to build too.
Even though it is a build error, it can be avoided by setting Disable Tests to True. I do want to run the tests, however. Does anybody know how to fix this? The other (working) solutions are often subsets of the projects in All Projects, which is a rather big solution.
I have the shortened MSBuild Output here, in case it helps:
Run MSBuild for Project
Initial Property Values
AdditionalVCOverrides =
CommandLineArguments = /p:SkipInvalidConfigurations=true
Configuration = Release
GenerateVSPropsFile = True
LogFile =
LogFileDropLocation =
MaxProcesses = 1
OutDir = All Projects-CI\Binaries\Release
Platform = Any CPU
Project = All Projects-CI\Sources\Shared\All Projects.sln
ResponseFile =
RunCodeAnalysis = AsConfigured
Targets =
TargetsNotLogged = String[] Array
ToolPath =
ToolPlatform = Auto
Verbosity = Diagnostic
Built $/.../DataAccessLayer.Testing.csproj for default targets.
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\TeamTest\Microsoft.TeamTest.targets (14): Object reference not set to an instance of an object.
The BuildShadowTask custom task in MSBuild\Microsoft\VisualStudio\v10.0\TeamTest\Microsoft.TeamTest.targets is causing the error. The reason you don't see the error when you disable tests is that this build task doesn't run when tests are disabled.
I suggest adding MSBuild Message tasks to output the various values, to determine which one is causing the "Object reference not set to an instance of an object" error.
Before this line:
<BuildShadowTask
ExecuteAsTool="False"
CurrentResolvedReferences="@(ReferencePath)"
CurrentCopyLocalFiles="@(ReferenceCopyLocalPaths)"
Shadows="@(Shadow)"
ProjectPath="$(ProjectDir)"
IntermediatePath="$(IntermediateOutputPath)"
SignAssembly="$(SignAssembly)"
KeyFile="$(AssemblyOriginatorKeyFile)"
DelaySign="$(DelaySign)">
Add a Message task to output the values of each parameter passed to the BuildShadowTask to determine which one is in error:
<Message Text="AssemblyOriginatorKeyFile $(AssemblyOriginatorKeyFile)" Importance="High" />
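If it helps, here is the same pattern extended to every parameter the task receives (note that items use @(...) syntax while properties use $(...)):

```xml
<!-- Dump every input to BuildShadowTask; place these just before the task. -->
<Message Text="ReferencePath: @(ReferencePath)" Importance="High" />
<Message Text="ReferenceCopyLocalPaths: @(ReferenceCopyLocalPaths)" Importance="High" />
<Message Text="Shadow: @(Shadow)" Importance="High" />
<Message Text="ProjectDir: $(ProjectDir)" Importance="High" />
<Message Text="IntermediateOutputPath: $(IntermediateOutputPath)" Importance="High" />
<Message Text="SignAssembly: $(SignAssembly)" Importance="High" />
<Message Text="AssemblyOriginatorKeyFile: $(AssemblyOriginatorKeyFile)" Importance="High" />
<Message Text="DelaySign: $(DelaySign)" Importance="High" />
```

Whichever value prints empty or malformed right before the failure is the prime suspect.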