twister: tests report: project execution failed without any testsuites failed #86611
Comments
Are you missing a ZTEST_EXPECT_SKIP? See https://docs.zephyrproject.org/latest/develop/test/ztest.html#test-result-expectations
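For reference, a minimal sketch of the pattern those docs describe, using the suite and test names from this thread; the guarding condition, message, and test body are illustrative assumptions rather than the actual test code:

```c
#include <zephyr/ztest.h>
#include <zephyr/sys/util.h>

/* Assumes ZTEST_SUITE(spi_loopback, ...) is defined elsewhere in the test. */

/* Declare up front that this test is expected to skip, so a failing
 * zassume_*() does not make the whole project execution fail. */
ZTEST_EXPECT_SKIP(spi_loopback, test_spi_rx_bigger_than_tx);

ZTEST(spi_loopback, test_spi_rx_bigger_than_tx)
{
	/* A failing assumption marks the test as skipped rather than failed. */
	zassume_true(IS_ENABLED(CONFIG_SPI_STM32_DMA),
		     "DMA support not enabled, skipping");

	/* ... rest of the loopback transfer checks ... */
}
```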
What if skip is not always expected, and it's just used as a way to skip in some cases? It's not clear to me from this documentation what the proper way to handle that is.
@yperess can you please take a look and give your input?
We don't have a feature for that, but it should be very easy to support. The quick way is to do this:

```c
#if !defined(CONFIG_SPI_STM32_DMA) && !defined(CONFIG_DSPI_MCUX_EDMA)
ZTEST_EXPECT_SKIP(spi_loopback, test_spi_rx_bigger_than_tx);
ZTEST_EXPECT_SKIP(spi_loopback, test_spi_rx_every_4);
#endif
#ifdef CONFIG_SPI_STM
ZTEST_EXPECT_SKIP(spi_loopback, test_spi_rx_half_end);
#endif
```

The better way to do it is to implement something like:

```c
ZTEST_EXPECT_SKIP_IF(
	!defined(CONFIG_SPI_STM32_DMA) && !defined(CONFIG_DSPI_MCUX_EDMA),
	spi_loopback,
	test_spi_rx_bigger_than_tx
);
```
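ZTEST_EXPECT_SKIP_IF does not exist in ztest today, so the following is only a hypothetical sketch of how it might be layered on top of Zephyr's existing utility macros, assuming the condition expands to a literal 0 or 1 (for example via IS_ENABLED()); the usage condition below is illustrative only:

```c
#include <zephyr/ztest.h>
#include <zephyr/sys/util_macro.h>

/* Hypothetical helper, not part of ztest: only register the skip expectation
 * when `cond` expands to a literal 1 (e.g. built from IS_ENABLED()). */
#define ZTEST_EXPECT_SKIP_IF(cond, suite, test) \
	COND_CODE_1(cond, (ZTEST_EXPECT_SKIP(suite, test)), ())

/* Usage sketch at file scope, with an illustrative condition. */
ZTEST_EXPECT_SKIP_IF(IS_ENABLED(CONFIG_SPI_STM32_DMA),
		     spi_loopback, test_spi_rx_bigger_than_tx);
```

Note that a preprocessor expression like !defined(CONFIG_SPI_STM32_DMA) cannot be passed into such a macro directly; it would need to be rewritten in terms of IS_ENABLED()-style 0/1 tokens.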
I mean, it seems redundant to say "expect a skip if these configs" and then also skip if these configs. Maybe for this test we should just put #if around the test cases, since these are statically defined macros. But the more general question is: why does a skip lead to a twister project execution failure here, or is it the assumption that makes it fail? I thought I have seen tests with skips before that don't cause a failure, and I think they don't have this ZTEST_EXPECT_SKIP, so I'm a bit confused about what the difference is here. Maybe they didn't use an assumption?
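For illustration, a minimal sketch of that #if-around-the-test alternative; the guarding condition and test body are assumptions, not the actual test code:

```c
#include <zephyr/ztest.h>

/* Compile the test out entirely when it can never run, instead of
 * registering it and expecting a skip. */
#if defined(CONFIG_SPI_STM32_DMA) || defined(CONFIG_DSPI_MCUX_EDMA)
ZTEST(spi_loopback, test_spi_rx_bigger_than_tx)
{
	/* ... DMA-only loopback transfer checks ... */
}
#endif
```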
This is not a twister issue; this is all in ztest. At the end of the test run, ztest concludes the test has failed, and twister just evaluates the output from ztest and determines that something went wrong, see:
We need ztest to mark this as passed when using the assume logic somehow.
Okay sure, but it's just that sometimes I have seen that when doing a build using twister, there are some hooks in the scripting and build system for twister that set some flags / change some behaviors that would not normally happen when building outside of twister.
no, ztest has 0 python in it :) |
Describe the bug
This PR #86383 (Convert SPI Loopback test to ZTEST) introduces in particular the use of zassume_false and zassume_true.
While running the test on STM32 boards, we realized that the project test report was marked failed even though no testsuite failed. It could be that twister considers "assumption failed" the same as "assertion failed" when reporting the test status.
In twister.json:
To Reproduce
Steps to reproduce the behavior:
Expected behavior
In the twister.json file, for an "assumption failed" in the log, we should have "status":"skipped" or "status":"passed".
Logs and console output
We can see in console:
Impact
The test is reported as failed even though that is not actually the case.
Environment (please complete the following information):