Add Apply Workflow End-to-End Verification Test Plan #7478
base: main
Conversation
doug-s-nava left a comment:
I've never worked with this sort of file before, so excuse my ignorance. It seems like it serves as a summary of a feature, written before automated tests or manual QA, that includes a rough outline of what will be tested and where. As such, it's a document to keep product, engineering, and testing on the same page as to expectations. Does that sound right?
If so, should product review as well?
I also wonder if this might belong more in an epic definition in the GitHub project rather than in the file system, but I'll need to think that over a little. @mdragon @ychoy - thoughts?
| @@ -0,0 +1,132 @@
| # Apply Workflow End-to-End Verification plan
can we take a quick step back and talk about what you see as the file structure we'll have set up here? Is it something like
- frontend
  - tests
    - e2e
      - <feature name>
        - features
          - feature files
        - testplan
          - test plan markdown files
        - specs
          - playwright test spec files
        - ???
          - any other file types that I'm missing?
You are right. The structure you outlined is what I had in mind. Conceptually:
frontend/tests/e2e/
|-- <feature>/ # Feature-level folder (e.g., apply, search)
|-----|--- <artifact>/ # Artifact folder (e.g., features, testplan, steps, specs)
|--------------|--- <files> # Actual files inside the artifact folder
With this structure in place, the E2E folder in our current GitHub repo would appear as follows:
frontend/tests/e2e/
|---apply/
|----|--- features/ (BDD-style feature files / high-level workflow definitions)
|----|--- testplan/ (test plan / verification markdown files - living documents that evolve as functionality grows)
|----|--- steps/ (optional - step definition files if we adopt a step-based / BDD approach)
|----|--- specs/ (optional / proposed - Playwright test spec files)
|----|--- (others as needed, e.g. helpers, test data)
|
|---search/
|----|--- features/ (BDD-style feature files / high-level workflow definitions)
|----|--- testplan/ (test plan / verification markdown files - living documents that evolve as functionality grows)
|----|--- steps/ (optional - step definition files if we adopt a step-based / BDD approach)
|----|--- specs/ (optional / proposed - Playwright test spec files)
|----|--- (others as needed, e.g. helpers, test data)
Currently, I see that Playwright spec files are maintained directly under the <feature> folder, so I followed the same pattern in the Login flow PR for consistency. That said, I like the idea (as in your comment above) of moving spec files into a dedicated folder, as it would make the structure more organized and easier to scale as we add more tests for each piece of functionality under the feature. This is a proposed approach; I'm open to aligning with the team's preferred direction.
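As a concrete sketch of how a dedicated specs/ folder could be wired up, assuming we adopt the layout above (the testMatch pattern here is illustrative, not our current Playwright config):

```ts
// playwright.config.ts (sketch; not the repo's actual config)
import { defineConfig } from "@playwright/test";

export default defineConfig({
  // Discover specs under each feature's specs/ folder,
  // e.g. frontend/tests/e2e/apply/specs/*.spec.ts
  testDir: "./tests/e2e",
  testMatch: "**/specs/**/*.spec.ts",
});
```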
This sounds good to me - in line with the other comment about file names, can this be testPlan instead of testplan?
| • Validates full user experience: Start → Fill → Upload → Review → Submit
| • Includes autosave, negative/edge case handling, network errors, browser compatibility checks
| • Location:
| o frontend/tests/e2e/apply/happypath.specs.ts
can we decide on a standard file naming convention for all e2e file names? I am fine with either camelCase, snake_case, or whatever-it-is-called-when-you-use-dashes, but we should be consistent. The rest of the frontend app uses camelCase, but there are examples of each across the code base.
Thanks for pointing this out! I agree - we should standardize E2E file names. I noticed that existing folders use camelCase; we can use the same for naming files to be consistent. We can create a quick guideline and update the existing files over time as part of a maintenance task. Let me know your thoughts.
From experience I'd recommend using dashes. E2E descriptive names can get long, and having dashes rather than camelCase makes them much easier to read.
I'd like more descriptive file names rather than things like happypath and edgecases, as it isn't clear what they are actually testing. Even in the table below you have to specify what each happy path and each edge case is in a separate column. I'd rather have that embedded in the name so when something fails you know exactly what went wrong.
I think we're also probably going to want to do form-by-form testing. So I'd suggest we have something similar to
frontend/tests/e2e/apply/form-cd511/add-minimal-fields-success.test.js
frontend/tests/e2e/apply/form-cd511/upload-two-attachments-success.test.js
frontend/tests/e2e/apply/form-cd511/organization-owned-requirements-success.test.js
Err, these aren't the use cases for form CD511, but you can see how much more readable and meaningful the filenames are.
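To make that concrete, here's a minimal sketch of what one such per-form spec might look like; the route, selectors, and fixture paths are hypothetical, not taken from the actual app:

```ts
// upload-two-attachments-success.spec.ts (sketch; route, selectors, and fixtures are hypothetical)
import { test, expect } from "@playwright/test";

test("applicant can upload two attachments to form CD511", async ({ page }) => {
  // Assumes an authenticated session and a seeded test application.
  await page.goto("/applications/test-app/forms/cd511");

  // Attach two files through the form's file input.
  await page.setInputFiles('input[type="file"]', [
    "fixtures/attachment-one.pdf",
    "fixtures/attachment-two.pdf",
  ]);

  // Both attachments should appear in the uploaded-files list.
  await expect(page.getByText("attachment-one.pdf")).toBeVisible();
  await expect(page.getByText("attachment-two.pdf")).toBeVisible();
});
```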
maybe a short meeting about this is in order?
| • Staging - QA verification in a production-like environment - Full Playwright end-to-end tests covering happy path, negative and edge cases, optional forms, attachment uploads, and submission confirmation
| o URL: https://staging.simpler.grants.gov/
| • Production - Happy Path E2E Check to ensure submission works and confirmation page displays correctly
Do you want to run automated e2e tests against production as well? I don't think that's in scope for right now, but we could talk about that for the future.
Yes, just the E2E happy path test after the latest code deployment to Prod, to make sure the code changes did not break anything in Prod from a user experience perspective.
That's a good idea. I've started a list of things we need to do, and added that to the list so we don't lose track: https://docs.google.com/spreadsheets/d/1UwOYG21f2YjHLBCrM1Nhck8WAX40biBDCN4UiV9nPuo/edit?usp=sharing
How are we managing different suites of test cases? For instance, we'll want to run all of them against localhost/CI, most against staging, and a smoke subset against production.
Note that our test competitions are different in all three of those environments, so we'll want to find a way to take that into account/work on creating some standard competitions across all environments.
@ErinPattisonNava beginning to figure that out is ticketed here, though the work of actually tagging tests and determining which groups to run when and where is maybe not ticketed yet? #7307
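For what it's worth, one common Playwright pattern for this is embedding suite tags in test titles and filtering with --grep; a minimal sketch, assuming we settle on tags like @smoke and @regression (the names are placeholders):

```ts
// Sketch: embed suite tags in test titles, then filter per environment with --grep.
import { test } from "@playwright/test";

test("submit minimal application succeeds @smoke", async () => {
  // ...full happy-path steps; included in the production smoke subset
});

test("missing required fields block submission @regression", async () => {
  // ...negative-case steps; run against localhost/CI and staging only
});
```

The production run would then be something like `npx playwright test --grep @smoke`, while CI runs the whole suite with no filter.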
doug-s-nava left a comment:
this looks good to me - I've tagged @ychoy and @chris-kuryak for review as well.
| ## 7. Entry & Exit Criteria
| Entry Criteria:
Entry Criteria for writing tests? Or executing them?
| The backend APIs that the Apply workflow depends on may be slow, unreliable, or return errors. This can cause tests to fail or the application to behave unpredictably during submission, form saving, or document uploads.
| ### Large file uploads
Not sure how this is a risk for writing tests. Feels like this could be its own performance suite of tests. Not sure if we do pen testing here either, and this might be better covered under that.
| ### 3.1. In Scope
| This Test Plan covers functional and non-functional aspects of the Apply Workflow:
I don't understand the non-functional aspect?
| ## 2. Preconditions
| Before executing Apply workflow tests, the following must be satisfied:
| • The user is able to login successfully with all required roles to create an application.
These dots don't format correctly when you view the file in github. Can you switch to using * so it becomes a bulleted list?
| | Workflow Step | Test Type | Location (Exact Path) | Notes |
| | --- | --- | --- | --- |
| | User login with necessary roles | E2E | `frontend/tests/e2e/apply/happypath.specs.ts`<br>`frontend/tests/e2e/apply/negativecases.specs.ts` | Happy: successful login<br>Negative: invalid credentials, missing roles |
| | Search and select Funding Opportunity | E2E | `frontend/tests/e2e/apply/happypath.specs.ts`<br>`frontend/tests/e2e/apply/negativecases.specs.ts` | Happy: valid opportunity<br>Negative: invalid or non-existent search |
These are search tests, not apply tests.
| The workflow involves multiple steps and forms. There’s a risk that form data could be lost if the user navigates away or a network request fails. This can disrupt the user experience and lead to submission errors.
| ## 9. Traceability Table
This feels fragmented into parts that are too small, which I'm concerned about from a performance perspective. It is difficult to right-size e2e tests to balance setup/teardown costs against intermittent failures, but this feels off. I'd like to view e2e tests as covering creating an application -> submission, to make sure we are getting a good quality guarantee.
I also don't think we should have 100% coverage. e2e tests are expensive, so we'll want to carefully pick what we invest in. For instance, I suspect that most of what you have below as negative cases (which would be named more descriptively) aren't worth our time/resources.
| • Post-submission editing
| • Reviewer/Admin workflows
| • Deep search or dashboard functionality
@Bhavna-Ramachandran I'm unsure what Deep Search or Dashboard functionality is referring to; can you clarify or give an example?
| • Review application
| • Submission flow and confirmation page
| • End-to-end workflows in a real environment
| • Mobile browser checks
Note that we are not currently optimizing (or checking) for mobile compatibility on our Apply workflow. We are optimizing for desktop experience. Therefore, I don't think this needs to be in-scope for now, assuming we would optimize for mobile later in our development.
| • Optional form selection for submission
| • Document upload via forms and Attachments section
| • Navigation between steps
| • Review application
There is not currently a feature around "reviewing an application". What is this referring to?
| • Submission flow and confirmation page
| • End-to-end workflows in a real environment
| • Mobile browser checks
After successfully submitting an application, there are some backend processes that take place to create a Zip file of the submission package:
- PDFs are created from the forms
- XML is generated from the forms
- Attachments are added
Will these tests also be checking to ensure that process is completed as expected?
The zip file/submission package is created asynchronously once an hour. We won't want to hold up the e2e tests for that long, so these aren't candidates to be tested through the e2e web-based framework. Those are better suited to unit tests and monitoring.
Summary
Fixes / Work for #6146
Changes proposed
Context for reviewers
This PR introduces the first formal E2E verification test plan for the Apply workflow.
The document defines how happy path, negative, and edge-case scenarios will be validated across the BDD and Playwright E2E test layers.
Reviewers should verify that:
Validation steps