
refactor(api): bump v3 prompt with albert support#970

Merged
valentinbourgoin merged 9 commits into feature/plateforme-engagement from valentin/feature/prompt-albert
Apr 30, 2026

Conversation

@valentinbourgoin
Collaborator

@valentinbourgoin valentinbourgoin commented Apr 29, 2026

Description

Introduces a multi-provider ai architecture to centralize the instantiation of the models used by the Vercel AI SDK.

The new api/src/services/ai service exposes a single application-level facade (ai.model(providerName, modelId)) and delegates model creation to registered providers. Albert becomes a custom provider under api/src/services/ai/providers/albert, while official Vercel AI SDK providers can be plugged in through a generic adapter (VercelAiProviderAdapter).
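The facade can be sketched as follows. Only AiService, AiProvider, and the model(providerName, modelId) entry point come from this PR description; every other name and detail below is an illustrative assumption, not the actual repo code.

```typescript
// Illustrative sketch of the facade described above, not the real implementation.
interface AiProvider {
  // Builds a model handle for the given model id.
  model(modelId: string): unknown;
}

class AiService {
  private readonly providers = new Map<string, AiProvider>();

  register(name: string, provider: AiProvider): void {
    this.providers.set(name, provider);
  }

  // Single application-level entry point: look up the provider by name
  // and delegate model creation to it.
  model(providerName: string, modelId: string): unknown {
    const provider = this.providers.get(providerName);
    if (!provider) {
      throw new Error(`Unknown AI provider: ${providerName}`);
    }
    return provider.model(modelId);
  }
}
```

Call sites then stay provider-agnostic, e.g. ai.model("albert", "mistralai/Mistral-Small-3.2-24B-Instruct-2506").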

This PR preserves the behavior of the existing prompts:

  • prompts/v1.ts and prompts/v2.ts instantiate Mistral via ai.model("mistral", "mistral-small-2603").
  • prompts/v3.ts instantiates Albert via ai.model("albert", "mistralai/Mistral-Small-3.2-24B-Instruct-2506").

The Albert implementation keeps the LanguageModelV3 contract, the low-level interface that AI SDK v6 expects from custom providers. On the application side, the ai service exposes the public LanguageModel type, compatible with both official and custom providers.
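The actual LanguageModelV3 interface is not reproduced in this PR description, so here is only a minimal, hypothetical skeleton of a non-streaming Albert call. It assumes Albert serves an OpenAI-compatible /v1/chat/completions route; class name, method names, and route are all assumptions.

```typescript
// Hypothetical skeleton only: this does NOT reproduce the real
// LanguageModelV3 interface from AI SDK v6, and the OpenAI-compatible
// endpoint shape is an assumption.
class AlbertChatModel {
  constructor(
    readonly modelId: string,
    private readonly baseUrl: string,
    private readonly apiKey: string,
  ) {}

  // Non-streaming generation via a plain fetch; streaming is deliberately
  // absent, matching the known limitations of the Albert provider.
  async generate(messages: Array<{ role: string; content: string }>): Promise<string> {
    const res = await fetch(`${this.baseUrl}/v1/chat/completions`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${this.apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model: this.modelId, messages, temperature: 0 }),
    });
    if (!res.ok) {
      throw new Error(`Albert API error: ${res.status}`);
    }
    const data = await res.json();
    return data.choices[0].message.content as string;
  }
}
```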

Type of change

  • New feature
  • Refactoring

Checklist

  • Code tested locally
  • Unit tests added/updated where needed
  • Code standards respected (ESLint)
  • Data migration required

Additional notes

Added architecture:

  • AiService: provider registry and the model(providerName, modelId) entry point.
  • AiProvider: minimal contract shared by all providers.
  • AlbertProvider: custom provider that builds an Albert LanguageModelV3.
  • VercelAiProviderAdapter: adapter that wraps the model factories of official Vercel AI SDK providers, such as mistral.
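The adapter idea can be sketched as follows: the Vercel AI SDK's official providers are callable factories such as mistral(modelId), so the adapter only needs to forward to one. The AiProvider interface mirrors the description above; the rest is an assumption about the real code.

```typescript
// Hedged sketch of VercelAiProviderAdapter: wrap any (modelId) => model
// factory behind the common AiProvider contract.
interface AiProvider {
  model(modelId: string): unknown;
}

class VercelAiProviderAdapter implements AiProvider {
  constructor(private readonly factory: (modelId: string) => unknown) {}

  model(modelId: string): unknown {
    return this.factory(modelId);
  }
}

// e.g. ai.register("mistral", new VercelAiProviderAdapter(mistral));
```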

Environment variables to provision:

  • ALBERT_API_KEY — access key for the Albert API (required)
  • ALBERT_BASE_URL — Albert endpoint (default: https://albert.api.etalab.gouv.fr)
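How these two variables are read is not shown in this PR; a minimal sketch, assuming plain process.env access and the default noted above (the helper name is hypothetical):

```typescript
// Minimal, hypothetical config reader for the two variables above; the
// repo's actual config helper is not part of this PR description.
function albertConfig(
  env: Record<string, string | undefined> = process.env,
): { apiKey: string; baseUrl: string } {
  const apiKey = env.ALBERT_API_KEY;
  if (!apiKey) {
    throw new Error("ALBERT_API_KEY is required"); // required variable
  }
  // ALBERT_BASE_URL is optional and falls back to the documented default.
  const baseUrl = env.ALBERT_BASE_URL ?? "https://albert.api.etalab.gouv.fr";
  return { apiKey, baseUrl };
}
```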

What does not change: the output schema (ENRICHMENT_SCHEMA), the temperature (0), the user messages, and the system prompt remain identical to v2. Rolling back to v2 is possible by changing CURRENT_PROMPT_VERSION in api/src/services/mission-enrichment/config.ts.

Known limitations: streaming and tool usage are not supported by the Albert provider (UnsupportedFunctionalityError).
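A sketch of how such a guard could look. A local stand-in error class keeps the snippet self-contained; the real provider would raise the AI SDK's UnsupportedFunctionalityError, and the guard function below is hypothetical.

```typescript
// Stand-in for the AI SDK's UnsupportedFunctionalityError so this snippet
// runs on its own; the real provider would import the SDK's error class.
class UnsupportedFunctionalityError extends Error {
  constructor(readonly functionality: string) {
    super(`Functionality not supported: ${functionality}`);
    this.name = "UnsupportedFunctionalityError";
  }
}

// Hypothetical guard: the Albert provider accepts plain generation but
// rejects streaming and tool calls, per the limitation above.
function assertAlbertSupports(feature: "generate" | "stream" | "tools"): void {
  if (feature === "stream" || feature === "tools") {
    throw new UnsupportedFunctionalityError(feature);
  }
}
```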

@valentinbourgoin valentinbourgoin marked this pull request as ready for review April 29, 2026 13:58

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 9b3611a13e

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment threads:

  • api/src/services/albert/index.ts (outdated)
  • terraform/containers.tf

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: cea1a66996

@@ -1,4 +1,4 @@
 export const CONFIDENCE_THRESHOLD = 0.3;
-export const CURRENT_PROMPT_VERSION = "v2" as const;
+export const CURRENT_PROMPT_VERSION = "v3" as const;

P2: Keep prompt version and mocked registry in sync

Changing CURRENT_PROMPT_VERSION to "v3" causes missionEnrichmentService.enrich to dereference an undefined prompt entry in the existing mission-enrichment unit tests, because their mocked PROMPT_REGISTRY only defines v2; this now throws before generateObject and breaks the success-path test coverage for enrichment chaining. Please update the related prompt registry mocks/tests to include v3 (or derive from CURRENT_PROMPT_VERSION) so the suite continues validating this workflow.

Useful? React with 👍 / 👎.
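One way to apply this suggestion is to key the mocked registry off CURRENT_PROMPT_VERSION instead of a hardcoded "v2", so a version bump cannot desynchronize the tests. The mock shape below is an assumption about the test fixtures, not the actual suite.

```typescript
// Hypothetical test fixture: derive the mocked PROMPT_REGISTRY entry from
// CURRENT_PROMPT_VERSION so bumping the version keeps the mock in sync.
// In the real suite this constant would be imported from
// api/src/services/mission-enrichment/config.ts.
const CURRENT_PROMPT_VERSION = "v3" as const;

const mockedPromptRegistry: Record<string, { systemPrompt: string }> = {
  [CURRENT_PROMPT_VERSION]: { systemPrompt: "mocked system prompt" },
};
```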

@valentinbourgoin valentinbourgoin merged commit 9d9d7a1 into feature/plateforme-engagement Apr 30, 2026
12 checks passed
@valentinbourgoin valentinbourgoin deleted the valentin/feature/prompt-albert branch April 30, 2026 14:25