
[New Feature Launch]: Fully Automated Voiceovers - Broad Launch #48

@tczerwinski726

Description


Product

Web (LaCE)

Brief description of the feature. Include links to PRD and TDD, where possible.

With this launch, we will roll out automated voiceovers more broadly, across most or all (TBD) of the site's content.

PRD: document link
TDD: document link

The feature will be launched unit by unit, as follows:

Unit 1 (Pre-launch review doc; status: not-yet-launched)
Key deliverables: Implement functionality that allows server admins to enable or disable the Azure text-to-speech service on Oppia.org, and that allows voiceover admins to configure and activate all required language–accent pairs for automatic voiceover support. Voice artists should be able to regenerate individual voiceovers in any supported language–accent code directly from the exploration editor for curated explorations. Learners should be able to play these automatic voiceovers in the lesson player with synchronised sentence highlighting, and voiceover admins should have access to a complete history of all voiceover regenerations through the voiceover admin dashboard. (A minimal illustration of the Azure synthesis call appears after the unit descriptions below.)

Unit 2 (Pre-launch review doc or CUJ sheet; status: not-yet-launched)
Key deliverables: Add functionality to automatically regenerate voiceovers for curated explorations when any of the following events happens: (a) the content is updated, (b) a translation is updated, (c) an exploration is added to a published story, or (d) the story containing the exploration is published.

Unit 3 (Pre-launch review doc; status: not-yet-launched)
Key deliverables: Enable server admins to regenerate voiceovers for every curated exploration in Oppia in all the supported language–accent pairs. At this point, all text and translations in curated explorations in the supported language–accent pairs should have associated voiceovers, and those voiceovers should match the corresponding English or translated text. This “full coverage” should persist indefinitely, including when explorations are reverted to earlier versions.

Unit 4 (Pre-launch review doc; status: not-yet-launched)
Key deliverables: Implement a way for voiceover admins to regenerate voiceovers for any curated exploration in any of the supported language–accent codes. Enrolling a new language–accent code in Oppia's automatic voiceover support should also result in voiceovers being generated for all curated explorations in that newly-added language–accent pair.
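
To make Unit 1's Azure dependency concrete, here is a minimal sketch of synthesising one piece of content with the Azure Speech SDK for Python (azure-cognitiveservices-speech) and collecting word-boundary offsets of the kind that could drive synchronised highlighting. The key, region, voice name, sample text and output path are placeholders; this only illustrates the underlying service, not Oppia's actual implementation.

  # Sketch only: synthesise text with Azure TTS and record word-boundary
  # offsets (in seconds) that could later drive sentence highlighting.
  # The key, region, voice and output path are placeholders.
  import azure.cognitiveservices.speech as speechsdk

  speech_config = speechsdk.SpeechConfig(
      subscription='AZURE_SPEECH_KEY', region='AZURE_REGION')
  speech_config.speech_synthesis_voice_name = 'en-IN-NeerjaNeural'

  audio_config = speechsdk.audio.AudioOutputConfig(filename='voiceover.wav')
  synthesizer = speechsdk.SpeechSynthesizer(
      speech_config=speech_config, audio_config=audio_config)

  boundaries = []  # (character offset in text, audio offset in seconds)

  def on_word_boundary(evt):
      # audio_offset is reported in 100-nanosecond ticks.
      boundaries.append((evt.text_offset, evt.audio_offset / 10_000_000))

  synthesizer.synthesis_word_boundary.connect(on_word_boundary)

  result = synthesizer.speak_text_async(
      'Welcome to this lesson on fractions.').get()
  if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
      print('Wrote voiceover.wav; word boundaries:', boundaries)
  else:
      print('Synthesis did not complete:', result.reason)

In the real feature, the synthesised audio and timing data would be stored and served through Oppia's voiceover infrastructure rather than written to a local file.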

Links

Key Contacts

Expected code-complete date

2026 Q1 (7th Feb)

Required roles for testing

  • Voiceover admin

Feature flag name

  • automatic_voiceover_regeneration_from_exp
  • show_regenerated_voiceovers_to_learners
  • enable_background_voiceover_synthesis
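
As a rough illustration of how behaviour is gated behind flags like these, here is a small self-contained Python sketch. The flag registry, the lookup helper, and the regeneration call are hypothetical stand-ins rather than Oppia's real services; only the flag name is taken from the list above.

  # Sketch only: everything except the flag name is a hypothetical stand-in
  # for Oppia's real feature-flag and voiceover services.
  ENABLED_FLAGS = {'automatic_voiceover_regeneration_from_exp'}

  def is_feature_flag_enabled(flag_name: str) -> bool:
      # Stand-in for a real per-server (or per-user) feature-flag lookup.
      return flag_name in ENABLED_FLAGS

  def regenerate_voiceovers_for_exploration(exploration_id: str) -> None:
      # Stand-in for the real regeneration pipeline (TTS calls + storage).
      print(f'Regenerating voiceovers for exploration {exploration_id}')

  def on_exploration_content_updated(exploration_id: str) -> None:
      # With the flag off, behaviour is unchanged from before the launch;
      # with it on, a content update triggers voiceover regeneration
      # (one of the Unit 2 events).
      if is_feature_flag_enabled('automatic_voiceover_regeneration_from_exp'):
          regenerate_voiceovers_for_exploration(exploration_id)

  on_exploration_content_updated('exp_123')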

Instructions for ProdOps Team

When you are assigned to this issue, please do the following, ticking each checkbox as you complete the corresponding step:

Tech-lead prep (to be done by a tech lead)

  • Ensure that this issue is filed correctly. Ask for any missing ("TO BE ADDED") info if needed.
  • Confirm that the Drive folder is in the ProdOps Feature Launch folder.
  • Confirm that the Drive folder contains the pre-launch review doc, PRD, and TDD (these should be the actual docs, not just shortcut links to them).
  • Ask the developer and the relevant dev team leads to subscribe to this issue thread for updates, and ensure that they've done so. (You should also subscribe to it for the same reason.)
  • Ask the developer and the relevant dev team leads to subscribe to comment/edit notifications on the pre-launch review doc for updates, and ensure that they've done so. (You should also subscribe to it for the same reason.)
  • Ensure that the "Information for Testers" and "Information for Server Admins" pages of the pre-launch review doc are fully filled in.
  • Ensure that the CUJs in the spreadsheet relating to this feature are updated. If any behaviour is hidden behind a flag, there should be a "Current" row and a separate row for the flag-hidden behaviour.
  • Ask the PM team to verify the current/updated CUJs, to ensure that the diff from the old ones is correct (e.g. we haven't removed stuff from the old ones that we would actually like to keep).
  • Verify that all the above CUJs are fully covered by acceptance tests, and that the implemented acceptance tests in the codebase match the ones documented in the spreadsheet. Fill in the "Evaluation of Automated Tests (by tech lead)" columns in the CUJ spreadsheet with your evaluation, and work with the developer to fix all remaining issues.
  • If the feature launch will include manual steps (in addition to deploying to PROD and turning the flag on), create a tech launch plan with all the steps needed, similar to this one. Ensure that it is added to the Drive folder. This should consider any migrations or data upgrades that need to happen, as well as any launch-time changes that are needed for reliability, deployability, performance, scalability, and stability.
  • Verify that the "test stage" PR has been merged (see the "Link to the PR which moved the flag to the TEST feature stage" field), and that it actually moves the feature flag to the TEST stage.

Tech-lead verification (to be done by the same tech lead)

  • Once the "test stage" PR has been merged, deploy it to the backup-server.
  • Enable the feature flag on the backup-server.
  • Complete any steps in the tech launch plan (if one exists) on the backup-server.
  • Verify that the feature's CUJs actually work technically on the backup-server (with the flag enabled). Clarify any CUJs that aren't clear with the developer.
  • Set up a Google Space group chat for this feature launch. Title it "[Launch] Name_Of_Feature" (where Name_Of_Feature is replaced by the name of the feature you're launching).
  • Invite the following people to the launch group chat: (a) the developer of the feature, (b) the relevant dev leads, (c) the ProdOps assignee, (d) the ProdOps leads, (e) any folks in UXD/UXR who were involved in the feature's design, (f) the QA assignee, (g) the QA leads, (h) the Marketing assignee, and (i) the Marketing leads.
  • Click on the "Board" icon in the Google Space, and add the following resources in this order: (a) TDD, (b) PRD, (c) Drive folder, (d) Launch Checklist (GitHub), (e) Pre-Launch Review Doc. (The resources will then appear in reverse order, so that the pre-launch review doc is at the top.)
  • Send a message to @all to introduce the group chat and ask everyone in it to subscribe to the space with "Notifications" set to "All". Explain that the launch chat is where all communication about the launch will happen, and that the launch checklist on GitHub must be kept up-to-date. Provide a link to the GitHub launch checklist.
  • Ask everyone in the launch chat (ProdOps, QA) to subscribe to comment/edit notifications on the pre-launch review doc for updates.
  • Ensure that the ProdOps assignee has the correct permissions on the backup-server to do a dry-run of the feature.
  • Hand off this issue to the ProdOps team (unassign yourself and tag/add the correct assignee from ProdOps). (Only mark this step as complete once the ProdOps assignee has acknowledged the handoff.)

Preparation and Dry-Run (ProdOps team)

  • Verify that the Marketing team's tracker has a sheet for this product launch.
  • Send an email to qa-leads@ to let them know that the feature is coming soon (subject to passing the ProdOps dry-run).
  • [LATER, once analytics pipeline is set up] Set up the Looker Studio dashboard for the feature, and supplement it with metrics obtained using BigQuery queries, as needed.
  • [LATER, once analytics pipeline is set up] Add a link to the backup-server Looker Studio dashboard in the instructions for testers.
  • Do a full run-through of the feature, and make updates to clarify the CUJ descriptions as needed. Then, post a new comment to this thread with the testing report, and cc the developer and relevant dev team leads. Ensure that any bugs found during this process are filed and assigned to the developer. If you see design issues, it's fine to loop in the design team. If issues were discovered in the run-through, repeat the above as needed until there are no more issues. (Only mark this step as complete once that stage is reached -- at this point, ProdOps is saying "I'm happy with the feature, it can go to full QA.")
  • [LATER, once analytics pipeline is set up] Verify that events from the testing were correctly recorded on the Looker Studio dashboard.

Full feature testing (ProdOps and QA teams)

  • Send the QA team (qa-team AT oppia DOT org) an email letting them know that a new feature is ready for testing. Include the pre-launch review doc (with CUJs) and request the list of tester usernames.
  • Ask the tech leads to provide the necessary permissions (see "Required roles for testing", above) to the tester usernames on the backup-server.
  • Ask QA leads to (a) start the feature testing process, (b) send updates to the group chat after each testing cycle, and (c) send a final update once they've confirmed that the full feature is working properly and is good to launch. If there are bugs found in the testing, they should inform the developer, who should fix the issues, add automated test coverage, and ask for a re-test. This continues until all issues have been fixed. (While this is happening, ProdOps should periodically check the Looker Studio dashboard to verify that metrics are being correctly recorded during feature testing, and adjust the metrics, if needed, to make them more useful.) (Note: Only mark this step as complete when the QA leads have notified the group chat that testing is complete.)
  • Ask the tech leads to move the feature to PROD stage, and add a link to the PR here once it is merged: ______. (Only mark this step as complete once the tech leads have done this.)
  • Perform a final Product review. Product should review the state of QA findings, including whether any bugs still exist, and look at the feature again if a lot of changes have been made. They should also understand any caveats (e.g. if anything has been disabled due to quality issues, etc.). They should also decide (in collaboration with the Campaign Strategy and UXR teams) whether the project needs a "trusted tester" phase.
  • If the decision is made to include a "trusted tester" phase, run through this with the UXR/Marketing teams following the guidance in the Product/Marketing Communication Process document. (Only mark this step as complete when the "trusted tester" phase is complete, or if there is no "trusted tester" phase.)
  • Formally greenlight the launch. (Only mark this step as complete when ProdOps has made the decision to greenlight.)

Rollout (ProdOps team)

  • Agree on launch date with Marketing team and tech leads, and document it here.
    • Deployment date of binary (with the flag enable-able in PROD mode): _________.
    • Date for completing all marketing collateral: _______.
    • Launch date to users (must be after the above): _______.
  • Turn on the feature flag for just the PM.
  • PM verifies that the feature works correctly.
  • [LATER, once analytics pipeline is set up] Check the ProdOps "CUJ health" analytics dashboard to verify that the new dashboards for this feature have been included in it.
  • Work with the server admins team to flip the feature flag per the rollout schedule before the launch date, and confirm that the feature works in production. (Do a partial rollout if desired, to make sure usage doesn't drop, support requests don't spike, and system health stays good.)
  • Ask the server admins to permanently flip the feature flag to ON in the test/backup servers. (Only mark this step as complete when they confirm they've done that.)
  • Once the feature is actually launched to all users:
    • Send out a congratulatory launch email, crediting all the folks involved in the launch. Show some pictures!
    • Update the rows for this feature in the master CUJ document to have a status of "Current" (rather than being gated behind a flag).
  • Monitor post-launch metrics through the feature's Looker Studio dashboard (constructed above), filed GitHub issues, and Google Groups forum.
  • 2 weeks after launch, compile a post-launch report and schedule a post-launch review with everyone on the command channel (design, developers, marketing, QA).
  • Ask the tech leads to (a) remove the feature flag, (b) drop obsolete journeys from the CUJ sheet. Link to PR that removes the flag: _____.
  • Close this bug. This concludes the feature launch, congratulations! 🎉

Metadata

Labels

No labels

Type

No type

Projects

Status

(Feature) Still in development

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
