Displaying NUnit test code coverage on Azure DevOps - c#

I've set up a new pipeline on Azure DevOps which builds the projects and runs the tests. The tests are written with NUnit.
In the pipeline I'm using the VSTest@2 task to run the unit tests, and I set codeCoverageEnabled to true.
The pipeline runs, and when I go to the "Code Coverage" tab of the job it lets me download the .codecoverage file, but it does not display its contents in the tab. My understanding was that it should.
How can I fix this?
Thanks

By default, the code coverage for the VSTest task is output to a .codecoverage file, which Azure DevOps does not know how to interpret and only offers as a download. To display code coverage results on the Code Coverage tab in Azure Pipelines, you'll need a few DotNetCoreCLI tasks and coverlet.
If you are on .NET Core, here is how you can do that.
Step 1
Add the coverlet.collector NuGet package to your test project.
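The usual approach is to run the command below once locally and commit the updated .csproj; purely as a sketch, the same command could also run as a pipeline script step. The project path MyProject.Tests/MyProject.Tests.csproj is a placeholder:

# Illustrative only: add the coverlet collector to the test project.
# The project path is a placeholder; committing the resulting <PackageReference> is the normal workflow.
- script: dotnet add MyProject.Tests/MyProject.Tests.csproj package coverlet.collector
  displayName: 'Add coverlet.collector (placeholder project path)'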
Step 2
Change your azure-pipelines.yml to include the following for your code coverage.
If you have any settings from a CodeCoverage.runsettings file, you can keep them too (see the sketch after the pipeline snippet below).
- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    projects: '**/*.Tests/*.csproj'
    arguments: '-c $(BuildConfiguration) --collect:"XPlat Code Coverage" -- RunConfiguration.DisableAppDomain=true'
    testRunTitle: 'Run Test and collect Coverage'
  displayName: 'Running tests'

- task: DotNetCoreCLI@2
  inputs:
    command: custom
    custom: tool
    arguments: 'install --tool-path . dotnet-reportgenerator-globaltool'
  displayName: 'Install ReportGenerator tool'

- script: reportgenerator -reports:$(Agent.TempDirectory)/**/coverage.cobertura.xml -targetdir:$(Build.SourcesDirectory)/coverlet/reports -reporttypes:"Cobertura"
  displayName: 'Create reports'

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: $(Build.SourcesDirectory)/coverlet/reports/Cobertura.xml
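If you do keep a CodeCoverage.runsettings file, a minimal sketch of passing it to the test task is below; the file name and repository-root location are assumptions, and dotnet test picks it up through the --settings switch:

# Sketch: run the tests with an existing runsettings file plus the coverlet collector.
# The path $(Build.SourcesDirectory)/CodeCoverage.runsettings is an assumption; adjust it to your repository.
- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    projects: '**/*.Tests/*.csproj'
    arguments: '-c $(BuildConfiguration) --settings $(Build.SourcesDirectory)/CodeCoverage.runsettings --collect:"XPlat Code Coverage"'
    testRunTitle: 'Run Test and collect Coverage'
  displayName: 'Running tests with a runsettings file'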
One other thing to note about the above pipeline is the ReportGenerator tool: depending on which version of .NET Core you are using, you might need a different version of the tool.
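If you need a specific version, dotnet tool install accepts a --version switch; the version number below is only an example:

# Sketch: pin dotnet-reportgenerator-globaltool to a fixed version (the version shown is illustrative).
- task: DotNetCoreCLI@2
  inputs:
    command: custom
    custom: tool
    arguments: 'install --tool-path . dotnet-reportgenerator-globaltool --version 4.8.13'
  displayName: 'Install ReportGenerator tool (pinned version)'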
More information can also be found in the Microsoft Docs.

Azure FunctionApp Dependency Injection Error only if deploying through Azure DevOps

When I deploy my function app with the command func azure functionapp publish '<name>' --dotnet, it successfully packages my code, publishes it to the function app, and everything works.
I'm now working on building out automation pipelines, and I created a pipeline with the following stages:
- stage: Build
  displayName: Build stage
  jobs:
    - job: Build
      displayName: Build
      pool:
        vmImage: $(vmImageName)
      steps:
        - task: UseDotNet@2
          displayName: 'Use .NET 6 Core sdk'
          inputs:
            packageType: 'sdk'
            version: '6.0.402'
        - task: DotNetCoreCLI@2
          displayName: Build
          inputs:
            command: 'build'
            projects: |
              $(workingDirectory)/*.csproj
            arguments: '--output $(System.DefaultWorkingDirectory)/publish_output --configuration Release'
        - task: ArchiveFiles@2
          displayName: 'Archive files'
          inputs:
            rootFolderOrFile: '$(System.DefaultWorkingDirectory)/publish_output'
            includeRootFolder: false
            archiveType: zip
            archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            replaceExistingArchive: true
        - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          artifact: drop
which packages the app, and
- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
    - deployment: Deploy
      displayName: Deploy
      environment: 'development'
      pool:
        vmImage: $(vmImageName)
      strategy:
        runOnce:
          deploy:
            steps:
              - task: AzureFunctionApp@1
                displayName: 'Azure functions app deploy'
                inputs:
                  azureSubscription: $(azureSubscription)
                  appType: functionApp
                  appName: $(functionAppName)
                  slotName: $(slotName)
                  package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
              - task: AzureAppServiceManage@0
                inputs:
                  azureSubscription: $(azureSubscription)
                  Action: 'Swap Slots'
                  WebAppName: $(functionAppName)
                  ResourceGroupName: 'group-name'
                  SourceSlot: $(slotName)
                  SwapWithProduction: true
It builds perfectly fine, and it also deploys and swaps the function app slots without any error.
The issue is that after it does this, everything is broken when making requests to the function endpoints. I either get 404 errors saying my function endpoints don't exist, or I get 500 dependency injection errors. To escape this chaos I locally run func azure functionapp publish '<name>' --dotnet again on the exact same code that was deployed through the pipeline, and then everything works fine.
I also tried deploying straight to the production slot instead of swapping, and the result through the pipeline is the same.
I am using the exact same version of dotnet locally as in the pipeline. Without any good errors to help me understand why it says it deployed without issue but everything is broken, it's hard for me to figure out what's going on.
Does anyone have any idea what's going on?
Here is the dependency injection error:
System.InvalidOperationException : Unable to resolve service for type 'Api.Request.Services.User' while attempting to activate 'Api.Request.Functions.Create'.
at Microsoft.Extensions.DependencyInjection.ActivatorUtilities.GetService(IServiceProvider sp, Type type, Type requiredBy, Boolean isDefaultParameterRequired)
at lambda_method331(Closure , IServiceProvider , Object[] )
at Microsoft.Azure.WebJobs.Host.Executors.DefaultJobActivator.CreateInstance[T](IServiceProvider serviceProvider) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\DefaultJobActivator.cs : 42
at Microsoft.Azure.WebJobs.Host.Executors.DefaultJobActivator.CreateInstance[T](IFunctionInstanceEx functionInstance) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\DefaultJobActivator.cs : 31
at Microsoft.Azure.WebJobs.Host.Executors.ActivatorInstanceFactory`1.<>c__DisplayClass1_1.<.ctor>b__0(IFunctionInstanceEx i) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\ActivatorInstanceFactory.cs : 20
at Microsoft.Azure.WebJobs.Host.Executors.ActivatorInstanceFactory`1.Create(IFunctionInstanceEx functionInstance) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\ActivatorInstanceFactory.cs : 26
at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2.CreateInstance(IFunctionInstanceEx functionInstance) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionInvoker.cs : 44
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ParameterHelper.Initialize() at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs : 791
at async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.TryExecuteAsync(IFunctionInstance functionInstance, CancellationToken cancellationToken) at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs : 104
but I am including it in Startup.cs:
services.AddScoped<Services.User>();
Also, I am using in-process function apps; could that be why everything is breaking in the pipeline? I just don't get why it works perfectly when I use the func command and doesn't work at all when I use the task Azure provides in the pipeline. Isn't this all MS, so shouldn't the underlying publish mechanisms be the same?
EDIT
One thing I've just found is that the files in /home/wwwroot are different when deploying through the pipeline and through the func azure functionapp publish command. When deploying through the pipeline there are a lot more files in it. I'm wondering if it's building the project incorrectly.
The issue was that I had thought I was building/zipping and deploying my code correctly, but it wasn't being packaged the same way as it was locally.
Using Kudu to view the deployed files showed me that everything was being placed in the /home/wwwroot directory instead of most of the built files being placed in the /bin directory. This explained the unexpected behavior. After I altered my build script, I was able to get it to properly build and deploy my code changes.
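For reference, a minimal sketch of one common fix, assuming the root cause was zipping raw dotnet build output instead of publish output: switch the build step to the publish command so the archived folder has the bin layout the Functions host expects. The project glob and output path mirror the question's pipeline and are assumptions:

# Sketch: publish (rather than build) the function project before archiving.
# publishWebProjects: false is needed because a function app is not a web project;
# zipAfterPublish: false leaves the existing ArchiveFiles step in charge of zipping.
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: 'publish'
    publishWebProjects: false
    projects: '$(workingDirectory)/*.csproj'
    arguments: '--output $(System.DefaultWorkingDirectory)/publish_output --configuration Release'
    zipAfterPublish: false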

How can I share a built .NET Core project between many jobs in Azure pipelines?

- job: buildAndTestJob
  steps:
    - task: DotNetCoreCLI@2
      displayName: 'dotnet restore'
      inputs:
        command: restore
        vstsFeed: $(vstsFeed)
    - task: DotNetCoreCLI@2
      displayName: 'dotnet build'
      inputs:
        arguments: '--configuration ${{ parameters.buildConfiguration }}'
    - task: DotNetCoreCLI@2
      displayName: 'dotnet test'
      inputs:
        command: test
        arguments: '--configuration ${{ parameters.buildConfiguration }}'
    - task: CopyFiles@2
      displayName: 'copy bin files'
      inputs:
        sourceFolder: '$(Build.SourcesDirectory)'
        contents: '**/bin/**/*'
        TargetFolder: '$(Build.ArtifactStagingDirectory)'
    - task: CopyFiles@2
      displayName: 'copy obj files'
      inputs:
        sourceFolder: '$(Build.SourcesDirectory)'
        contents: '**/obj/**/*'
        TargetFolder: '$(Build.ArtifactStagingDirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'publish build artifacts'
      inputs:
        pathtoPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: buildeffect
        publishLocation: 'Container'

- job: packAndPushJob
  steps:
    - task: DownloadBuildArtifacts@0
      displayName: 'download build artifacts'
      inputs:
        buildType: 'current'
        downloadType: 'single'
        artifactName: 'buildeffect'
        downloadPath: '$(Build.ArtifactStagingDirectory)'
    - task: CopyFiles@2
      displayName: 'copy files to source directory'
      inputs:
        sourceFolder: '$(Build.ArtifactStagingDirectory)/buildeffect'
        contents: '**/*'
        TargetFolder: '$(Build.SourcesDirectory)'
        allowPackageConflicts: true
    - task: DotNetCoreCLI@2
      displayName: 'dotnet pack'
      inputs:
        arguments: '--no-restore'
        nobuild: true
        command: pack
        projects: '$(Build.SourcesDirectory)'
        publishWebProjects: true
        publishVstsFeed: $(vstsFeed)
        includeNuGetOrg: true
    - task: NuGetCommand@2
      displayName: 'nuGet push'
      inputs:
        command: push
        projects: '$(Build.SourcesDirectory)'
        publishVstsFeed: $(vstsFeed)
        allowPackageConflicts: true
I have 2 jobs. The first restores, builds, tests, and shares the build files in a build artifact. The second packs and pushes NuGet packages. The first job finishes successfully, but the second job fails during the pack task. It has a problem with NuGet packages, for example:
/usr/share/dotnet/sdk/3.0.100/Sdks/Microsoft.NET.Sdk/targets/Microsoft.PackageDependencyResolution.targets(234,5): error NETSDK1064: Package Microsoft.CSharp, version 4.6.0 was not found. It might have been deleted since NuGet restore. Otherwise, NuGet restore might have only partially completed, which might have been due to maximum path length restrictions. [/home/vsts/work/1/s/src/Spotio.Leads.Client/Spotio.Leads.Client.csproj]
So, maybe we should share the built project in another way? Or maybe add some parameters with feeds to restore? I don't have any ideas, so please, if you have any suggestions, help us :)
How can I share a built .NET Core project between many jobs in Azure pipelines?
1. See this: projects using the PackageReference format always use packages directly from the global packages folder (%userprofile%\.nuget\packages).
2. A project that targets .NET Core uses the PackageReference format, so the restored packages are stored in %userprofile%\.nuget\packages on the first agent.
3. For your second agent job, DevOps actually starts another hosted agent to run your tasks. That means the second agent does not have the referenced packages in %userprofile%\.nuget\packages.
4. Something we should know is that even though we've copied all the files in bin and obj to the second agent, dotnet pack xx.csproj will still try to confirm that the referenced packages exist, and that is when the issue occurs.
So I suggest you add a dotnet restore task before the dotnet pack task in the second agent job, to make sure the missing packages can be found on the second agent (see the sketch below).
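A minimal sketch of that extra step, reusing the vstsFeed variable from the question:

# Sketch: restore on the second agent so dotnet pack can resolve the referenced packages.
- task: DotNetCoreCLI@2
  displayName: 'dotnet restore (before pack)'
  inputs:
    command: restore
    vstsFeed: $(vstsFeed)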
Note:
1. In one build pipeline with two agent jobs, even though both agent jobs use the hosted agent, the two agents are not the same instance.
2. Make sure the configuration you use to build is the same configuration you use to pack.
Hope it helps. If I misunderstand anything, feel free to let me know :)

Azure pipeline not running NUnit tests

I've just been introduced to Azure Pipelines. My project is a .NET project and is linked up with Azure, but Azure does not run my unit tests before integrating (and is therefore integrating everything, even with failing tests).
My .yaml file is:
# ASP.NET Core
# Build and test ASP.NET Core projects targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
My unit tests are in the solution under a project called MyProjectTests, in a file called ProjectTests.cs. Can anyone please advise what I need to add to my yaml file (or do in general) to get these to run? I have looked into this myself and can't seem to find a solution, and I want to avoid clogging up my commit history with failed attempts to run the unit tests.
Thanks so much.
UPDATE:
I fixed it by adding the following:
- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration)'
Here you go.
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: 'PathToTestProject/TestProject.csproj'
    arguments: '--configuration Debug'
You can choose whatever configuration you like, of course, and the displayName is optional. If any of your tests fail, the pipeline will abort subsequent steps.

Code Coverage and Lines of Code are displayed as '-' in the SonarQube console

I am using Azure DevOps to run a test and am trying to integrate SonarQube with it. The issue I am facing is that in the summary part of the Azure pipeline I can view code coverage as 22%, but in the SonarQube console the code coverage shows only as '-'. There is a warning message when I run the 'Run Code Analysis' task in the pipeline.
The warning message is: WARN: The Code Coverage report doesn't contain any coverage data for the included files.
[Please find the image to view the code coverage displayed in the Azure pipeline][1]
This is the yaml for the dotnet test task
- task: DotNetCoreCLI@2
  displayName: 'dotnet test'
  inputs:
    command: test
    projects: '**/*Test*.csproj'
    arguments: '--configuration $(BuildConfiguration) --collect "Code coverage"'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
This is the yaml for the copy files task that I run right after the dotnet test task
steps:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Common.TestResultsDirectory)'
  inputs:
    SourceFolder: '$(Agent.WorkFolder)\_temp'
    TargetFolder: '$(Common.TestResultsDirectory)'
Please find the yaml for the Prepare analysis on SonarQube task
- task: SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'CDA-Sonarqube'
    projectKey: Test
    projectName: Test
    extraProperties: sonar.cs.nunit.reportsPaths
Any help is appreciated.
[1]: https://i.stack.imgur.com/HbZfW.png
WARN: The Code Coverage report doesn't contain any coverage data for the included files.
For troubleshooting hints, please refer to https://docs.sonarqube.org/x/CoBh. The .coverage file will be converted to a coveragexml file during the SonarQube end analysis task.
Run the unit tests and save the results in the file "NUnitResults.xml":
packages\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe --result=NUnitResults.xml "NUnitTestProject1\bin\Debug\NUnitTestProject1.dll"
or, for the older NUnit 2:
"%ProgramFiles(x86)%\NUnit 2.6.4\bin\nunit-console.exe" /result=NUnitResults.xml "NUnitTestProject1\bin\Debug\NUnitTestProject1.dll"
Meanwhile, there is a workaround explained in the VSTS extension documentation under "Analysing a .NET solution": in the Additional Properties text area, add the following property:
sonar.cs.vscoveragexml.reportsPaths=**/*.coveragexml
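If the prepare step is defined in YAML rather than through the classic editor's Additional Properties box, the same property can go into extraProperties. This is a sketch based on the question's snippet; the SonarQubePrepare@4 task version and the service connection and project names are carried over from above and are assumptions:

# Sketch: tell the SonarQube analysis where to find the converted coverage reports.
- task: SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'CDA-Sonarqube'
    projectKey: Test
    projectName: Test
    extraProperties: |
      sonar.cs.vscoveragexml.reportsPaths=**/*.coveragexml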
You can also refer to these cases (case1, case2) for details.
Here is a blog about configuring code coverage for .NET Core based applications using SonarQube and Azure DevOps.
By default the test result files are in a temp folder, so copy the files with a Copy Files task; then the .coverage file will be analyzed and a coveragexml file generated.
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.sourcesdirectory)\TestResults'
  inputs:
    SourceFolder: '$(Agent.TempDirectory)'
    TargetFolder: '$(build.sourcesdirectory)\TestResults'
On the other hand, you can refer to this article on calling CodeCoverage.exe directly:
Configure Code Coverage for Dotnet Core 2.0 based applications using SonarQube and Azure DevOps
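For reference, a manual conversion step could look roughly like the sketch below. The CodeCoverage.exe path depends on the Visual Studio edition and version installed on the agent, and both the input and output paths are assumptions:

# Sketch: convert the binary .coverage file into a .coveragexml file that SonarQube can read.
# Tool path and file paths are illustrative; adjust them to your agent and test output location.
- script: >
    "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Team Tools\Dynamic Code Coverage Tools\CodeCoverage.exe"
    analyze
    /output:$(Build.SourcesDirectory)\TestResults\results.coveragexml
    $(Build.SourcesDirectory)\TestResults\results.coverage
  displayName: 'Convert .coverage to .coveragexml (illustrative paths)'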

How to release and publish downloadable binaries?

How do I configure Azure DevOps to publish binaries in a web-downloadable form, and automatically update my readme.md or wiki page to reflect the latest released artifacts?
I know how to build release pipelines for artifacts I publish to Azure, e.g. I can publish web apps and functions.
But I can't figure out how to publish and release downloadable content.
I'd like the result to be similar to GitHub releases, where my users can browse releases, then click and download the files.
I'd like the project page (readme.md or wiki) to be automatically updated with the release data, similar to how I would create a build status badge link.
If you want to upload the artifacts to your shared path, I recommend adding a script task that uploads the released artifacts to the shared path or an FTP server.
For example, if Azure Storage is acceptable, you could publish your build artifacts to Azure Storage with the following script:
# Upload the build output to Azure blob storage using the classic Azure PowerShell storage cmdlets
$source = "build file"          # path to the file produced by the build
$azureStorageKey = "xxxxx"      # storage account access key
$storage_context = New-AzureStorageContext -StorageAccountName "yourstorageAccount" -StorageAccountKey "$azureStorageKey"
Set-AzureStorageBlobContent -Context $storage_context -Container "containerName" -File $source -Blob "drop.zip" -Force
I'd like the project page (readme.md or wiki) to automatically be updated with the release data, similar to how I would create a build status badge link.
Yes, you can do that with an Azure Pipelines build status badge. You can copy the markdown link into your readme file.
Update:
I did a demo uploading the build to Azure Storage with the following YAML file.
queue:
  name: Hosted VS2017
  demands:
  - msbuild
  - visualstudio
  - azureps

steps:
- task: NuGetCommand@2
  displayName: 'NuGet restore'

- task: VSBuild@1
  displayName: 'Build solution **\*.sln'

- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '$(build.sourcesdirectory)'
    TargetFolder: '$(build.artifactstagingdirectory)'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'

- task: ArchiveFiles@2
  displayName: 'Archive $(Build.ArtifactStagingDirectory)'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'

- task: AzureFileCopy@1
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    azureSubscription: xxxxx
    Destination: AzureBlob
    storage: $(storageAccountName)
    ContainerName: $(containerName)
What I'd like is for the page to have a download link that points to the latest build that passed.
We could use the AzureBlob File Copy task to copy the build easily to Azure blob storage.
If an Azure Function is possible, you could use a blob trigger to create your customized page with your script.
