Comments (5)
TL;DR: In my opinion, measuring code coverage separately for each type of test is worthwhile, because it enables a more detailed analysis of the test suite.
By defining test labels, we can output test coverage for each type of test.
To illustrate this benefit, I forked Autoware.Auto and added test labels. Here is my repository: https://gitlab.com/keisuke.shima/AutowareAuto/-/commit/c812f150906b63a176c685cf243cb9f8bc5f6876
By specifying labels when running the tests, we can measure code coverage separately for the unit tests and the integration tests (including smoke tests and component tests).
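As a minimal sketch of what such a per-label run could look like (the tool names, label names, and paths here are assumptions, not taken from the linked commit; the steps are echoed as a dry run rather than executed):

```shell
# Hypothetical dry run: print the commands a CI job could use to run one
# labeled test group and then capture that group's coverage with lcov.
run_labeled_coverage() {
  local label="$1"
  echo "colcon test --ctest-args -L ${label}"
  echo "lcov --capture --directory build --output-file coverage_${label}.info"
}

run_labeled_coverage gtest
run_labeled_coverage smoke_test
```

Running each label separately is what makes it possible to attribute a coverage number to each type of test instead of one aggregate figure.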
This is useful for reviewing pull requests. Knowing which tests cover the changes helps reviewers judge whether the changes are properly tested.
It is also possible to define merge criteria for each type of test. For example, the criteria can be set to 80% or more for the unit tests and 75% or more for the integration tests.
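Such a per-label gate could be scripted along these lines (the threshold values and the way the percentage is obtained are assumptions; in CI the number could come from something like `lcov --summary`):

```shell
# Hypothetical merge gate: fail when the coverage measured for a label is
# below that label's minimum percentage.
check_coverage() {
  local label="$1" percent="$2" minimum="$3"
  if awk -v p="$percent" -v m="$minimum" 'BEGIN { exit (p >= m) ? 0 : 1 }'; then
    echo "${label}: ${percent}% >= ${minimum}% (pass)"
  else
    echo "${label}: ${percent}% < ${minimum}% (fail)"
    return 1
  fi
}

check_coverage gtest 83.2 80                # unit-test criterion: 80%
check_coverage integration 71.0 75 || true  # integration criterion: 75%
```

The nonzero return value is what would mark the check as failed on the pull request.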
As an example, here is the coverage of costmap_generator_nodes.
Since this package has only launch_test-based tests, its gtest coverage and smoke_test coverage are zero. A coverage report calculated from all the tests together would not reveal this.
I would like to use this feature for the development of Autoware Core.
ros-metrics-reporter, developed by TIER IV, is a continuous-integration tool that measures software metrics, including code coverage reports by label.
Source code and example results can be found below.
- ros-metrics-reporter repository
- Example report for Autoware.Auto
from autoware-documentation.
Related to my previous comment: #2 (comment)
In my previous comment, I showed that detailed code coverage can be measured by running tests per label.
(As related work, I've added a test-by-label workflow here.)
Now that we are ready, I would like to discuss the specific test labels.
As shown here, Autoware.Auto had a testing framework in place: smoke tests and interface tests.
The benefits of these are explained in the link.
First, I would like to apply these two test types, which were accepted in Autoware.Auto, to Core/Universe.
These tests can be added to each node with only a few changes, and if there is a problem, the CI system will detect the error. The cost of fixing such problems is lower than if they were found by later stages such as scenario tests or real-vehicle tests. (This approach is called shift-left testing.)
Another advantage is that you do not have to write gtest cases for the lines these tests cover.
As a result, the following test labels would be created in Core/Universe:
- gtest
- smoke_test
- interface_test
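With labels like these in place, any subset can be selected in one run, since CTest's `-L` option accepts a regular expression (shown here as a dry run with echoed commands; the colcon invocation is an assumption about how the build would be driven):

```shell
# Hypothetical dry run: print commands selecting subsets of the labels above.
select_labels() {
  echo "colcon test --ctest-args -L '$1'"
}

select_labels gtest                        # unit tests only
select_labels 'smoke_test|interface_test'  # tests that need no gtest code
```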
If you have ideas about adding or removing test labels, please comment. I'm open to ideas here.
This pull request has been automatically marked as stale because it has not had recent activity.
I'll write some documents about this, but it's a low priority.
This pull request has been automatically marked as stale because it has not had recent activity.