Issue with test execution report when multiple scenario outlines and surefirererun is set #2911
Comments
I'm having some trouble reproducing your problem with the information provided.

I updated the body: Execute the test from the command line (make sure you have surefire.rerunFailingTestsCount set).

Let me know if you can reproduce it with this information.

Unfortunately not. Could you make a minimal reproducer based on the cucumber-java-skeleton?

The line numbers in the stack trace in the dump stream don't match the sources.
Hi @mpkorstanje, I would just like to mention that I'm also facing this same issue. My bom version is `<cucumber.version>7.18.0</cucumber.version>`. This is my whole dump file:

PS: Just to add that this is how the XML file ends: an open testcase tag and nothing else after it, even though this is not the last test executed that should be printed.
@francislainy cheers! I still can't reproduce the problem, but I think I have a vague idea where it might be. One contributing factor seems to be that Surefire assumes test names to be unique. Can you try to reproduce your problem with all combinations of the following:
So the trick is to have a skipped test included with a flaky test. Reproducer:

```gherkin
Feature: Example Belly

  Scenario: a few cukes
    Given I have 0 cukes in my belly

  Scenario: a few cukes
    Given I have 21 cukes in my belly

  Scenario: a few cukes
    Given I have 42 cukes in my belly
```

```java
package io.cucumber.skeleton;

import io.cucumber.java.en.Given;
import org.junit.jupiter.api.Assumptions;

public class StepDefinitions {
    @Given("I have {int} cukes in my belly")
    public void I_have_cukes_in_my_belly(int cukes) {
        if (cukes == 0) {
            return;
        }
        if (cukes == 21) {
            Assumptions.abort("Not now");
        }
        throw new RuntimeException("Oops");
    }
}
```

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <rerunFailingTestsCount>2</rerunFailingTestsCount>
        <properties>
            <!-- Workaround. Surefire does not include enough
                 information to disambiguate between different
                 examples and scenarios. -->
            <configurationParameters>
                cucumber.junit-platform.naming-strategy=long
            </configurationParameters>
        </properties>
    </configuration>
</plugin>
```
@francislainy @dunja132015 as a workaround, ensure that each feature has a unique name and that within a feature all scenarios have unique names. Then also set:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.5.0</version>
    <configuration>
        <properties>
            <configurationParameters>
                cucumber.junit-platform.naming-strategy=long
            </configurationParameters>
        </properties>
    </configuration>
</plugin>
```
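For example, a feature in that shape could look like the sketch below (the feature and scenario names here are hypothetical, chosen only to illustrate uniqueness):

```gherkin
Feature: Belly digestion

  Scenario: digesting zero cukes
    Given I have 0 cukes in my belly

  Scenario: digesting many cukes
    Given I have 42 cukes in my belly
```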
@dunja132015 in addition, you'll also want to use JUnit 5 tag expressions to select your tests with Surefire. See https://github.com/cucumber/cucumber-jvm/tree/main/cucumber-junit-platform-engine#tags.
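A sketch of what that could look like, assuming a hypothetical `@XYZ` tag. Surefire's `groups` parameter accepts JUnit 5 tag expressions, and the Cucumber JUnit Platform engine exposes Cucumber tags as JUnit tags:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- JUnit 5 tag expression; selects only scenarios tagged @XYZ -->
        <groups>@XYZ</groups>
    </configuration>
</plugin>
```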
@mpkorstanje Thanks a lot!
You can also make your scenarios unique with the pickle name strategy:

```xml
<configurationParameters>
    cucumber.junit-platform.naming-strategy=short
    cucumber.junit-platform.naming-strategy.short.example-name=pickle
</configurationParameters>
```

But then you have to ensure that each of your scenario outlines uses a parameterized scenario name that is unique.
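For example, a scenario outline whose name interpolates its parameters (a hypothetical example) yields a unique pickle name per examples row:

```gherkin
Scenario Outline: eating <start> cukes leaves <left> cukes
  Given there are <start> cucumbers
  When I eat 5 cucumbers
  Then I should have <left> cucumbers

  Examples:
    | start | left |
    |    12 |    7 |
    |    20 |   15 |
```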
Created issue for Surefire: https://issues.apache.org/jira/browse/SUREFIRE-2260. But we'll probably have to add some suffixes to scenario names in Cucumber if they're not unique.
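A rough sketch of that kind of disambiguation (hypothetical, not Cucumber's actual implementation): append an occurrence counter to any scenario name that has already been seen.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: make repeated scenario names unique by appending
// an occurrence counter, so tools that key on test names (like Surefire's
// rerun bookkeeping) can tell them apart.
class NameDisambiguator {
    private final Map<String, Integer> seen = new HashMap<>();

    String disambiguate(String name) {
        // merge() returns the updated count: 1 on first sight, 2 on second, ...
        int count = seen.merge(name, 1, Integer::sum);
        return count == 1 ? name : name + " #" + count;
    }
}
```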
Thank you @mpkorstanje. I'm not sure that would be feasible in our case though, since a few of the scenarios require the same parameter name, such as `<error_status_code>`. I've also looked into the issue opened for the Surefire plugin, and I'm not entirely sure this is only to do with duplicated test names, since I've checked our feature file and there are no duplicates there. I'll try to spend some more time on it this week to see if I can spot anything that helps with this issue. Other than this, the number of tests run seems incorrect when there's a retry: it does not log the full amount of tests. But this would also require further investigation, and I'm not sure whether we'd want to treat it as a separate issue.
About the logging for the number of tests executed:
Then I get this for the TEST.xml file, which shows 1 test run (as there's only 1 test which needs rerunning), even though I have two tests in my feature file.
Scenario outlines are syntactic sugar for repeating the same scenario several times with some variables replaced.
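For example, the hypothetical outline below is equivalent to two separate scenarios, and both carry the same name unless that name is parameterized:

```gherkin
Scenario Outline: a few cukes
  Given I have <count> cukes in my belly

  Examples:
    | count |
    |    21 |
    |    42 |
```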
One other possibility would be to distinguish the different examples, but I am not sure if that would have any effect. The approach that @mpkorstanje suggested is probably better.
👓 What did you see?
Have a feature test file like:
Java step Implementation:
Execute the test from the command line (make sure you have surefire.rerunFailingTestsCount set):

```
mvn test -Dsurefire.rerunFailingTestsCount=2 -Dcucumber.filter.tags="@XYZ"
```
TEST OUTPUT:
IMPORTANT: If run in a CI pipeline, the test result is PASS (presumably because 0 failures are reported), and the next pipeline step is not blocked!
NOTE: The issue seems to be already reported (and appearing fixed) here: #2709, but I'm reproducing it with the steps above.
The RunCucumberTest.xml file is not well-formed XML; it ends with:

```xml
<testcase name="Example #1.1" classname="Examples" time="0.408"
```
✅ What did you expect to see?
2 failed tests, 2 passed
📦 Which tool/library version are you using?
cucumber.version: 7.18.1
maven-surefire-plugin version: 3.3.1
🔬 How could we reproduce it?
No response
📚 Any additional context?
No response