Tests too verbose
See original GitHub issue

MoveIt! is using the following alias for running its tests on Travis via Docker:

```
run_tests: build --verbose --catkin-make-args run_tests --
```

However, the `--verbose` argument results in really long log files that get cut off by Travis' log limit. For example, from this build, a small snippet of the noise:
```
make[3]: Leaving directory '/root/ws_moveit/build/moveit_ikfast'
/usr/bin/make -f CMakeFiles/run_tests.dir/build.make CMakeFiles/run_tests.dir/build
make[3]: Entering directory '/root/ws_moveit/build/moveit_ikfast'
make[3]: Nothing to be done for 'CMakeFiles/run_tests.dir/build'.
make[3]: Leaving directory '/root/ws_moveit/build/moveit_ikfast'
Built target run_tests
make[2]: Leaving directory '/root/ws_moveit/build/moveit_ikfast'
/usr/bin/cmake -E cmake_progress_start /root/ws_moveit/build/moveit_ikfast/CMakeFiles 0
make[1]: Leaving directory '/root/ws_moveit/build/moveit_ikfast'
cd /root/ws_moveit/build/moveit_ikfast; catkin build --get-env moveit_ikfast | catkin env -si /usr/bin/make run_tests --jobserver-fds=6,7 -j; cd -
Finished << moveit_ikfast:make
Starting >> moveit_ikfast:symlink
Output << moveit_ikfast:symlink /root/ws_moveit/logs/moveit_ikfast/build.symlink.002.log
Linked: (/root/ws_moveit/devel/.private/moveit_ikfast/share/moveit_ikfast/cmake/moveit_ikfastConfig-version.cmake, /root/ws_moveit/devel/share/moveit_ikfast/cmake/moveit_ikfastConfig-version.cmake)
Linked: (/root/ws_moveit/devel/.private/moveit_ikfast/share/moveit_ikfast/cmake/moveit_ikfastConfig.cmake, /root/ws_moveit/devel/share/moveit_ikfast/cmake/moveit_ikfastConfig.cmake)
Linked: (/root/ws_moveit/devel/.private/moveit_ikfast/lib/pkgconfig/moveit_ikfast.pc, /root/ws_moveit/devel/lib/pkgconfig/moveit_ikfast.pc)
```
I tried removing the `--verbose` flag, but then I also lose the important test-results data; e.g. from this build it is missing output that looks like:
```
[==========] Running 6 tests from 2 test cases.
[----------] Global test environment set-up.
[----------] 1 test from TestPropagationDistanceField
[ RUN      ] TestPropagationDistanceField.TestAddRemovePoints
[==========] Running 2 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 2 tests from LoadPlanningModelsPr2
[ RUN      ] LoadPlanningModelsPr2.InitOK
[       OK ] TestPropagationDistanceField.TestAddRemovePoints (11 ms)
[----------] 1 test from TestPropagationDistanceField (11 ms total)
```
How can I hide the build noise but still show the test results? `catkin_test_results` will give you a one-line summary of failures, but it does not tell you which tests failed.
I suspect a new verb proposed in https://github.com/catkin/catkin_tools/issues/397 might address this issue.
Issue Analytics
- Created 7 years ago
- Reactions: 21
- Comments: 14 (3 by maintainers)
A workaround is to pipe the output through `sed`: this filters out everything besides the output produced by the commands executed via the `run_tests.py` script.

Another workaround that I find helpful for having more control over gtest is to first build the test target and then run the tests manually. So, assuming I have a test target defined in my CMakeLists.txt file:
```
catkin_add_gtest(${PROJECT_NAME}_test
  …)
```

I can build that target and then run the resulting test binary by hand. I like this because I can interact with gtest directly, for example to get a list of the available tests (`--gtest_list_tests`) or to run only a subset of them (`--gtest_filter`).
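A sketch of what both workarounds might look like; the exact `sed` pattern, the package name `my_package`, and the devel-space paths are assumptions for illustration, not taken from the original comment:

```shell
# A canned sample standing in for the verbose catkin/Travis log:
log="make[3]: Leaving directory '/root/ws_moveit/build/foo'
-- run_tests.py: execute commands
[==========] Running 6 tests from 2 test cases.
[  PASSED  ] 6 tests.
-- run_tests.py: verify result
Built target run_tests"

# Workaround 1: keep only the region between run_tests.py's own marker
# lines, dropping the surrounding make chatter. In practice you would
# pipe the real `catkin build ... run_tests` output into this sed call,
# and may need to adjust the pattern to match your logs.
printf '%s\n' "$log" | sed -n '/^-- run_tests.py: execute/,/^-- run_tests.py: verify/p'

# Workaround 2: build the test target without running it, then drive the
# gtest binary by hand (commented out here, since it needs a real catkin
# workspace; package and target names are hypothetical):
#   catkin build my_package --no-deps --make-args my_package_test
#   ./devel/lib/my_package/my_package_test --gtest_list_tests
#   ./devel/lib/my_package/my_package_test --gtest_filter='LoadPlanningModelsPr2.*'
```

The `--gtest_filter` pattern accepts `*` wildcards and `:`-separated lists, so individual failing tests can be rerun in isolation.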