Code Coverage and Lines of Code are displayed as '-' in the SonarQube console - C#

I am using Azure DevOps to run tests and I am trying to integrate SonarQube with it. The issue I am facing is that in the summary of the Azure pipeline I can see code coverage as 22%, but in the SonarQube console the code coverage is shown only as '-'. There is a warning message that I see when the 'Run Code Analysis' task runs in the pipeline.
The warning message is WARN: The Code Coverage report doesn't contain any coverage data for the included files.
[Please find the image showing the code coverage displayed in the Azure pipeline][1]
This is the YAML for the dotnet test task:
- task: DotNetCoreCLI@2
  displayName: 'dotnet test'
  inputs:
    command: test
    projects: '**/*Test*.csproj'
    arguments: '--configuration $(BuildConfiguration) --collect "Code coverage"'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
This is the YAML for the copy files task that I run right after the dotnet test task:
steps:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Common.TestResultsDirectory)'
  inputs:
    SourceFolder: '$(Agent.WorkFolder)\_temp'
    TargetFolder: '$(Common.TestResultsDirectory)'
Please find the YAML for the 'Prepare analysis on SonarQube' task:
- task: SonarQubePrepare@4   # version assumed
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'CDA-Sonarqube'
    projectKey: Test
    projectName: Test
    extraProperties: sonar.cs.nunit.reportsPaths
Any help is appreciated.
[1]: https://i.stack.imgur.com/HbZfW.png

WARN: The Code Coverage report doesn't contain any coverage data for the included files.
For troubleshooting hints, please refer to https://docs.sonarqube.org/x/CoBh. The .coverage file will be converted to a coveragexml file during the SonarQube end-analysis ('Run Code Analysis') task.
Run the unit tests and save the results in the file "NUnitResults.xml":
packages\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe --result=NUnitResults.xml "NUnitTestProject1\bin\Debug\NUnitTestProject1.dll"
or, for older NUnit 2
"%ProgramFiles(x86)%\NUnit 2.6.4\bin \nunit-console.exe /result=NUnitResults.xml "NUnitTestProject1\bin\Debug\NUnitTestProject1.dll"
Meanwhile, there is a workaround explained in the VSTS extension documentation under "Analysing a .NET solution": in the Additional Properties text area, add the following property:
sonar.cs.vscoveragexml.reportsPaths=**/*.coveragexml
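For reference, a sketch of how the 'Prepare analysis on SonarQube' task from the question could carry this property (the service connection and project key are taken from the question; the NUnitResults.xml path is an assumption based on the NUnit step above and may need adjusting to your layout):
- task: SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'CDA-Sonarqube'   # service connection name from the question
    scannerMode: MSBuild
    projectKey: Test
    projectName: Test
    extraProperties: |
      sonar.cs.vscoveragexml.reportsPaths=**/*.coveragexml
      sonar.cs.nunit.reportsPaths=NUnitResults.xml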
You can also refer to these cases (case1, case2) for details.
Here is a blog about configuring code coverage for Dotnet Core based applications using SonarQube and Azure DevOps.

By default the test result files are in the agent's temp folder. Copy them with a Copy Files task; then the .coverage file will be analysed and a coveragexml file generated during the end-analysis step.
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.sourcesdirectory)\TestResults'
  inputs:
    SourceFolder: '$(Agent.TempDirectory)'
    TargetFolder: '$(build.sourcesdirectory)\TestResults'
On the other hand, you can refer to this article to call CodeCoverage.exe:
Configure Code Coverage for Dotnet Core 2.0 based applications using SonarQube and Azure DevOps

Related

Gitlab does not find csv files required for testing | CI/CD pipeline

I have set up a pipeline for a C# dotnet project. It builds correctly, however the tests do not run - GitLab can't seem to find the CSV files that are required for the tests. The CSV files are committed to GitLab and show up in the GitLab build folder; however, the error I get is:
Error Message: System.IO.DirectoryNotFoundException : Could not find a part of the path 'C:\Windows\TEMP\Data\xxxxxx.csv'.
I would have thought it would look in the GitLab build folder for the project, but instead it seems to look in C:\Windows\TEMP\Data for the files. Might this be a misconfiguration issue? The pipeline script looks like this:
variables:
  SOURCE_CODE_PATH: '"$CI_PROJECT_DIR"'
  TEST_DIRECTORY: '"$ProjectName.Test"'
cache:
  paths:
    - '$SOURCE_CODE_PATH$TEST_DIRECTORY/Data'
before_script:
  - nuget locals all -clear
  - echo "starting build for $CI_PROJECT_NAME"
  - echo "Restoring NuGet packages..."
  - nuget restore "$CI_PROJECT_DIR"
  - echo "NuGet Packages restored..."
stages:
  - build
  - test
build:
  tags:
    - ProjectName
  stage: build
  script:
    - echo "Release build..."
    - 'dotnet build /consoleloggerparameters:ErrorsOnly /maxcpucount /nologo /property:Configuration=Release /verbosity:normal "$CI_PROJECT_DIR"'
  artifacts:
    untracked: false
test:
  tags:
    - ProjectName
  script:
    - echo "starting tests"
    - 'dotnet test "$CI_PROJECT_DIR"'
Data is the folder where the files for the tests are stored, so I tried caching them. It doesn't seem to help. The testing framework in use is NUnit. Any ideas on what I may be missing or getting wrong here?

Optimizing C# code coverage collection and publishing in Azure DevOps Server 2020

We use the following command line (more or less) to collect C# code coverage in our pipeline:
dotnet test --no-build -l trx -r TheResultsDir --collect "Code coverage" -s CodeCoverage.runsettings
(We actually use the built-in DotNetCoreCLI@2 task for that.)
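Roughly, that command line maps onto the task like this (a sketch; the projects glob is an assumption, the arguments simply mirror the command above):
- task: DotNetCoreCLI@2
  displayName: Run tests with code coverage
  inputs:
    command: test
    projects: '**/*Tests*.csproj'   # assumption: adjust to your test projects
    arguments: '--no-build -l trx -r TheResultsDir --collect "Code coverage" -s CodeCoverage.runsettings'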
This produces a bunch of .coverage files. We want to do 2 things with them:
Send to our SonarQube server
Publish on the build itself using the PublishCodeCoverageResults@1 task.
As it turns out (surprise, surprise), the produced .coverage files are only understood by VS IDE.
As usual - Internet to the rescue. We figured out that:
Using the Microsoft.CodeCoverage tool we can convert the .coverage files to .xml, which is understood by SonarQube but not by the PublishCodeCoverageResults@1 task.
Using the reportgenerator tool we can convert the .xml files from (1) to the Cobertura format understood by the PublishCodeCoverageResults@1 task.
And that is what we do, and it is fine for small projects. However, now we are introducing code coverage to our big monolithic application (sigh, alas, it is all true) and the timings are awful. For one solution (out of several):
Conversion from .coverage to .xml takes about 20 minutes.
Conversion from .xml to Cobertura - 2h 33m.
So, this is really bad.
Is there a better solution if I want to both send to SQ and publish to the build?
Here is the actual code we use:
From .coverage to .xml
CodeCoverage.exe analyze /output:CoverageResult.xml CoverageResult.coverage
(there are several .coverage files, so the command is applied on each one)
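A minimal sketch of how that per-file loop can be scripted in the pipeline (assumptions: CodeCoverage.exe is on the PATH or you substitute its full path, and TheResultsDir is the results directory used by dotnet test above):
- powershell: |
    # Convert every .coverage file under the results directory to XML
    $coverageFiles = Get-ChildItem -Path 'TheResultsDir' -Filter *.coverage -Recurse
    foreach ($file in $coverageFiles) {
      $xml = [System.IO.Path]::ChangeExtension($file.FullName, '.xml')
      & CodeCoverage.exe analyze "/output:$xml" "$($file.FullName)"
    }
  displayName: Convert .coverage files to XML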
From .xml to Cobertura
reportgenerator.exe -reports:CoverageResults\*.xml -targetdir:CoberturaReport -reporttypes:Cobertura
Publishing to build
- task: PublishCodeCoverageResults@1
  displayName: Publish Coverage Results
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: CoberturaReport/Cobertura.xml
    failIfCoverageEmpty: true
Publishing to SonarQube is done with the standard SQ tasks:
Prepare SQ Analysis
Run SQ Analysis
Publish SQ Analysis
The prepare task is:
- task: SonarQubePrepare@4
  displayName: Prepare CI SQ Analysis
  inputs:
    SonarQube: SonarQube
    scannerMode: MSBuild
    projectKey: $(SonarQubeProjectKey)
    projectName: $(SonarQubeProjectName)
    projectVersion: $(SonarQubeProjectVersion)
    extraProperties: |
      sonar.cs.vscoveragexml.reportsPaths=$(Common.TestResultsDirectory)\vstest-coverage\*.xml
      sonar.cs.nunit.reportsPaths=$(Common.TestResultsDirectory)\tests\*.TestResult.xml
      sonar.inclusions=**/*.cs
      sonar.branch.name=$(SonarQubeSourceBranch)
      sonar.scm.disabled=true
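For completeness, the run and publish steps mentioned above are the standard extension tasks and might look like this (a minimal sketch; the @4 versions are assumed to match the prepare task above):
- task: SonarQubeAnalyze@4
  displayName: Run CI SQ Analysis

- task: SonarQubePublish@4
  displayName: Publish CI SQ Analysis
  inputs:
    pollingTimeoutSec: '300'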
Ideally I would like to find a way to eliminate the Cobertura conversion altogether. If it were possible to publish the .coverage files directly, that would be ideal.
This is a known issue on Azure DevOps. AFAIK, there is currently no out-of-the-box way to eliminate the Cobertura conversion; for now, the only way is to use the reportgenerator tool to convert the test results to the Cobertura format.
That is because the Publish Code Coverage Results task only supports coverage result formats such as Cobertura and JaCoCo, and currently only provides a download link for .coverage files.
Besides, this issue has already been raised in an earlier suggestion ticket:
support vstest .coverage "code coverage" build results tab
This feature request is marked On Roadmap. I believe it will be released soon; you can follow that thread for the latest feedback.

Displaying NUnit tests code coverage on Azure DevOps

I've set up a new pipeline on Azure DevOps which builds the projects and runs their tests. The tests are written with NUnit.
In the pipeline I'm using the VSTest@2 task to run the unit tests and I set codeCoverageEnabled to true.
In the end the pipeline runs, and when I go to the "Code Coverage" tab of the job, it allows me to download the .codecoverage file but it does not display its content in the tab. My understanding was that it should.
How can I fix this?
Thanks
By default, the code coverage for the VSTest task is output to a .codecoverage file, which Azure DevOps does not know how to interpret and only provides as a downloadable file. You'll need to use a few DotNetCoreCLI tasks and Coverlet to be able to display code coverage results on the Code Coverage tab in Azure Pipelines.
So, if you are on .NET Core, there is a way you can do that.
Step 1
Add the coverlet.collector NuGet package to your test project.
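For example (the test project path is a placeholder; run this locally and commit the change):
dotnet add MyProject.Tests/MyProject.Tests.csproj package coverlet.collector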
Step 2
Change your azure-pipelines.yml to include the following for your code coverage:
If you have any settings from a CodeCoverage.runsettings file, you can keep them too
- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    projects: '**/*.Tests/*.csproj'
    arguments: -c $(BuildConfiguration) --collect:"XPlat Code Coverage" -- RunConfiguration.DisableAppDomain=true
    testRunTitle: 'Run Test and collect Coverage'
  displayName: 'Running tests'
- task: DotNetCoreCLI@2
  inputs:
    command: custom
    custom: tool
    arguments: install --tool-path . dotnet-reportgenerator-globaltool
  displayName: Install ReportGenerator tool
- script: reportgenerator -reports:$(Agent.TempDirectory)/**/coverage.cobertura.xml -targetdir:$(Build.SourcesDirectory)/coverlet/reports -reporttypes:"Cobertura"
  displayName: Create reports
- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: $(Build.SourcesDirectory)/coverlet/reports/Cobertura.xml
One other thing to note about the above code is the ReportGenerator tool. Depending on which version of .NET Core you are using, you might need a different version of the tool.
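If you need to pin it, the install step accepts an explicit version (a sketch; the version number below is only an example):
- task: DotNetCoreCLI@2
  displayName: Install ReportGenerator tool (pinned version)
  inputs:
    command: custom
    custom: tool
    arguments: install --tool-path . dotnet-reportgenerator-globaltool --version 4.8.13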
More information can also be found on the Microsoft Docs

Azure pipeline not running NUnit tests

I've just been introduced to Azure Pipelines. My project is a .NET project and is linked up with Azure, but it does not run my unit tests before integrating (and therefore integrates everything, even with failing tests).
My .yaml file is:
# ASP.NET Core
# Build and test ASP.NET Core projects targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
variables:
  buildConfiguration: 'Release'
steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
My unit tests are in the solution under a project called MyProjectTests, in a file called ProjectTests.cs. Can anyone please advise what I need to add to my YAML file (or do in general) to get these to run? I have looked into this myself and can't seem to find a solution, and I want to avoid clogging up my commit history with failed attempts to run the unit tests.
Thanks so much.
UPDATE:
I have fixed it by adding the following:
- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration)'
Here you go.
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: 'PathToTestProject/TestProject.csproj'
    arguments: '--configuration Debug'
You can choose whatever configuration you like, of course. And the displayName is optional. If any of your tests fail, the Pipeline will abort subsequent steps.

How to release and publish downloadable binaries?

How do I configure Azure DevOps to publish binaries in a web downloadable form, and automatically update my readme.md or wiki page to reflect the latest released artifacts?
I know how to build release pipelines for artifacts I publish to Azure, e.g. I can publish webapps and functions.
But I can't figure out how to publish and release downloadable content.
I'd like the result to be similar to GitHub releases, where my users can browse releases, and click and download the files.
I'd like the project page (readme.md or wiki) to automatically be updated with the release data, similar to how I would create a build state link.
If you want to upload the artifacts to your shared path, I recommend adding a script task to upload the released artifacts to the shared path or an FTP server.
For example, if Azure Storage is acceptable, then you could publish your build artifacts to Azure Storage with the following script:
$source = "build file"
$azureStorageKey = "xxxxx"
$storage_context = New-AzureStorageContext -StorageAccountName "yourstorageAccount" -StorageAccountKey "$azureStorageKey"
Set-AzureStorageBlobContent -Context $storage_context -Container "containerName" -File $source -Blob "drop.zip" -Force
I'd like the project page (readme.md or wiki) to automatically be updated with the release data, similar to how I would create a build state link.
Yes, you could do that with the Azure Pipelines build status badge. You can copy the markdown link into your readme file.
Update:
I did a demo that uploads the build to Azure Storage with the following YAML file:
queue:
  name: Hosted VS2017
  demands:
  - msbuild
  - visualstudio
  - azureps
steps:
- task: NuGetCommand@2
  displayName: 'NuGet restore'
- task: VSBuild@1
  displayName: 'Build solution **\*.sln'
- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '$(build.sourcesdirectory)'
    TargetFolder: '$(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
- task: ArchiveFiles@2
  displayName: 'Archive $(Build.ArtifactStagingDirectory)'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
- task: AzureFileCopy@1
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    azureSubscription: xxxxx
    Destination: AzureBlob
    storage: $(storageAccountName)
    ContainerName: $(containerName)
what I'd like is for the page to have a download link that points to the latest build that passed.
We could use the Azure File Copy task to easily copy the build to Azure Blob Storage.
If an Azure Function is an option, you could use a blob trigger to create your customized page with your own script.
