community.docker/.azure-pipelines/templates/coverage.yml
Felix Fontein a4539a309e
Move licenses to LICENSES/, use SPDX-License-Identifier, mention all licenses in galaxy.yml (#430)
* Move licenses to LICENSES/, use SPDX-License-Identifier, mention all licenses in galaxy.yml.

* ignore.txt lines cannot be empty or contain only a comment.

* Cleanup.

* This particular __init__.py seems to be crucial.

* Try extra newline.

* Markdown comments are a real mess. I hope this won't break Galaxy...

* More licenses.

* Add sanity test.

* Skip some files, lint.

* Make sure there is a copyright line everywhere.

* Also check for copyright line in sanity tests.

* Remove colon after 'Copyright'.

* Normalize lint script.

* Avoid colon after 'Copyright' in lint script.

* Improve license checker.

* Update README.md

Co-authored-by: Maxwell G <9920591+gotmax23@users.noreply.github.com>

* Remove superfluous space.

* Referencing target instead of symlink

Co-authored-by: Maxwell G <9920591+gotmax23@users.noreply.github.com>
2022-07-20 07:45:33 +02:00


# Copyright (c) Ansible Project
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

# This template adds a job for processing code coverage data.
# It will upload results to Azure Pipelines and codecov.io.
# Use it from a job stage that completes after all other jobs have completed.
# This can be done by placing it in a separate summary stage that runs after the test stage(s) have completed.

jobs:
  - job: Coverage
    displayName: Code Coverage
    container: default
    workspace:
      clean: all
    steps:
      - checkout: self
        fetchDepth: $(fetchDepth)
        path: $(checkoutPath)
      - task: DownloadPipelineArtifact@2
        displayName: Download Coverage Data
        inputs:
          path: coverage/
          patterns: "Coverage */*=coverage.combined"
      - bash: .azure-pipelines/scripts/combine-coverage.py coverage/
        displayName: Combine Coverage Data
      - bash: .azure-pipelines/scripts/report-coverage.sh
        displayName: Generate Coverage Report
        condition: gt(variables.coverageFileCount, 0)
      - task: PublishCodeCoverageResults@1
        inputs:
          codeCoverageTool: Cobertura
          # Azure Pipelines only accepts a single coverage data file.
          # That means only Python or PowerShell coverage can be uploaded, but not both.
          # Set the "pipelinesCoverage" variable to determine which type is uploaded.
          # Use "coverage" for Python and "coverage-powershell" for PowerShell.
          summaryFileLocation: "$(outputPath)/reports/$(pipelinesCoverage).xml"
        displayName: Publish to Azure Pipelines
        condition: gt(variables.coverageFileCount, 0)
      - bash: .azure-pipelines/scripts/publish-codecov.py "$(outputPath)"
        displayName: Publish to codecov.io
        condition: gt(variables.coverageFileCount, 0)
        continueOnError: true
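
For context: the header comment says this template should be consumed from a summary stage that runs only after all test stages have finished, and the "pipelinesCoverage" variable selects which coverage file gets uploaded to Azure Pipelines. A minimal sketch of how the collection's top-level azure-pipelines.yml might wire that up; the variable values, stage names, and dependsOn list below are illustrative assumptions, not taken from this file:

variables:
  # Assumed values; the real pipeline defines further variables referenced by this
  # template (e.g. outputPath) alongside these.
  - name: checkoutPath
    value: ansible_collections/community/docker
  - name: fetchDepth
    value: "0"
  - name: pipelinesCoverage
    value: coverage  # "coverage" uploads Python data; "coverage-powershell" would upload PowerShell data

stages:
  # ... test stages (e.g. Sanity, Units, Docker) defined earlier ...
  - stage: Summary
    condition: succeededOrFailed()
    dependsOn:
      # Illustrative: list every test stage so coverage processing runs last.
      - Sanity
      - Units
    jobs:
      - template: templates/coverage.yml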