refactor(api): bump v3 prompt with albert support #970
valentinbourgoin merged 9 commits into feature/plateforme-engagement
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 9b3611a13e
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: cea1a66996
```diff
@@ -1,4 +1,4 @@
 export const CONFIDENCE_THRESHOLD = 0.3;
-export const CURRENT_PROMPT_VERSION = "v2" as const;
+export const CURRENT_PROMPT_VERSION = "v3" as const;
```
Keep prompt version and mocked registry in sync
Changing `CURRENT_PROMPT_VERSION` to "v3" causes `missionEnrichmentService.enrich` to dereference an undefined prompt entry in the existing mission-enrichment unit tests, because their mocked `PROMPT_REGISTRY` only defines v2. The call now throws before `generateObject` and breaks the success-path test coverage for enrichment chaining. Please update the related prompt registry mocks/tests to include v3 (or derive them from `CURRENT_PROMPT_VERSION`) so the suite continues validating this workflow.
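One way to address this, sketched below under the assumption that the mocked registry is a plain record keyed by version string (the `PromptBuilder` shape and `fakeBuilder` are hypothetical test fixtures, not the real mock): key the mock off the constant itself rather than a hard-coded `"v2"`/`"v3"` literal, so a version bump can never leave the mock without a matching entry.

```typescript
// Hypothetical test-fixture sketch: derive the mocked registry entry from
// CURRENT_PROMPT_VERSION instead of hard-coding a version literal.
const CURRENT_PROMPT_VERSION = "v3" as const;

// Assumed shape of a registry entry; the real PROMPT_REGISTRY may differ.
type PromptBuilder = (input: string) => { system: string; user: string };

const fakeBuilder: PromptBuilder = (input) => ({
  system: "mock system prompt",
  user: input,
});

// Keyed off the constant, not a "v2"/"v3" literal, so bumping the version
// in config.ts cannot desynchronize the mock.
const mockedPromptRegistry: Record<string, PromptBuilder> = {
  [CURRENT_PROMPT_VERSION]: fakeBuilder,
};

// The enrich path can now always resolve the current version.
const entry = mockedPromptRegistry[CURRENT_PROMPT_VERSION];
console.log(entry !== undefined); // true
```

With this pattern the mock follows `CURRENT_PROMPT_VERSION` automatically, at the cost of no longer exercising a specific hard-coded version in the test.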
Merged commit 9d9d7a1 into feature/plateforme-engagement
Description
Introduces a multi-provider `ai` architecture to centralize instantiation of the models used by the Vercel AI SDK. The new service `api/src/services/ai` exposes a single application-level facade (`ai.model(providerName, modelId)`) and delegates model creation to registered providers. Albert becomes a custom provider under `api/src/services/ai/providers/albert`, while official Vercel AI SDK providers can be plugged in through a generic adapter (`VercelAiProviderAdapter`).

This PR keeps the existing prompts working as before:
- `prompts/v1.ts` and `prompts/v2.ts` instantiate Mistral via `ai.model("mistral", "mistral-small-2603")`.
- `prompts/v3.ts` instantiates Albert via `ai.model("albert", "mistralai/Mistral-Small-3.2-24B-Instruct-2506")`.

The Albert implementation keeps the `LanguageModelV3` contract, the low-level interface that AI SDK v6 expects from custom providers. On the application side, the `ai` service exposes the public `LanguageModel` type, which is compatible with both official and custom providers.

Type of change
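As a rough illustration of the facade and registry described in this PR, here is a minimal TypeScript sketch. The class and interface names follow the description, but all method bodies and the simplified `LanguageModel` shape are illustrative assumptions, not the real implementation:

```typescript
// Simplified stand-in for the public model type; the real LanguageModel
// type comes from the AI SDK and is far richer.
interface LanguageModel {
  readonly modelId: string;
}

// Minimal contract shared by all providers (AiProvider in the PR).
interface AiProvider {
  readonly name: string;
  model(modelId: string): LanguageModel;
}

// Provider registry and single entry point (AiService in the PR).
class AiService {
  private providers = new Map<string, AiProvider>();

  register(provider: AiProvider): void {
    this.providers.set(provider.name, provider);
  }

  model(providerName: string, modelId: string): LanguageModel {
    const provider = this.providers.get(providerName);
    if (!provider) throw new Error(`Unknown AI provider: ${providerName}`);
    return provider.model(modelId);
  }
}

// Adapter wrapping an official Vercel AI SDK provider factory
// (VercelAiProviderAdapter in the PR); the factory type is simplified here.
class VercelAiProviderAdapter implements AiProvider {
  constructor(
    readonly name: string,
    private factory: (modelId: string) => LanguageModel,
  ) {}

  model(modelId: string): LanguageModel {
    return this.factory(modelId);
  }
}

const ai = new AiService();
ai.register(new VercelAiProviderAdapter("mistral", (id) => ({ modelId: id })));
console.log(ai.model("mistral", "mistral-small-2603").modelId);
```

The design keeps call sites (the prompt files) ignorant of how each model is constructed; swapping Albert in or out is a registry change, not a prompt change.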
Checklist
Additional notes
Architecture added:
- `AiService`: provider registry and the `model(providerName, modelId)` entry point.
- `AiProvider`: minimal contract shared by all providers.
- `AlbertProvider`: custom provider that builds an Albert `LanguageModelV3`.
- `VercelAiProviderAdapter`: adapter that wraps the factories of official Vercel AI SDK providers, such as `mistral`.

Environment variables to provision:
- `ALBERT_API_KEY` — Albert API access key (required)
- `ALBERT_BASE_URL` — Albert endpoint (default: `https://albert.api.etalab.gouv.fr`)

What does not change: the output schema (`ENRICHMENT_SCHEMA`), the temperature (0), the user messages, and the system prompt remain identical to v2. Rolling back to v2 only requires changing `CURRENT_PROMPT_VERSION` in `api/src/services/mission-enrichment/config.ts`.

Known limitations: streaming and tool calls are not supported by the Albert provider (`UnsupportedFunctionalityError`).
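A sketch of how the provider might consume the two environment variables listed above. Only the variable names and the default URL come from the PR; the `resolveAlbertConfig` helper is a hypothetical illustration:

```typescript
// Default endpoint taken from the PR description.
const DEFAULT_ALBERT_BASE_URL = "https://albert.api.etalab.gouv.fr";

// Hypothetical helper: validates the required key and applies the
// documented default for the base URL.
function resolveAlbertConfig(env: Record<string, string | undefined>): {
  apiKey: string;
  baseUrl: string;
} {
  const apiKey = env.ALBERT_API_KEY;
  if (!apiKey) throw new Error("ALBERT_API_KEY is required");
  return {
    apiKey,
    baseUrl: env.ALBERT_BASE_URL ?? DEFAULT_ALBERT_BASE_URL,
  };
}

console.log(resolveAlbertConfig({ ALBERT_API_KEY: "secret" }).baseUrl);
// https://albert.api.etalab.gouv.fr
```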
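The known-limitations behavior above can be sketched as follows. This is illustrative only: the local `UnsupportedFunctionalityError` class and `AlbertModelSketch` are stand-ins mirroring the error named in the PR, not the AI SDK's or the PR's actual code:

```typescript
// Local stand-in mirroring the error name from the PR; the real class
// lives in the AI SDK.
class UnsupportedFunctionalityError extends Error {
  constructor(functionality: string) {
    super(`Functionality not supported: ${functionality}`);
    this.name = "UnsupportedFunctionalityError";
  }
}

// Hypothetical model sketch: generation works, streaming is rejected.
class AlbertModelSketch {
  doGenerate(prompt: string): string {
    return `generated for: ${prompt}`;
  }

  // Streaming (and, analogously, tool calls) are explicitly unsupported.
  doStream(): never {
    throw new UnsupportedFunctionalityError("streaming");
  }
}

const model = new AlbertModelSketch();
try {
  model.doStream();
} catch (err) {
  console.log((err as Error).name); // UnsupportedFunctionalityError
}
```

Throwing a dedicated error type lets callers distinguish "this provider cannot do this" from genuine runtime failures, which matters if a future prompt version wants streaming and needs to fall back to another provider.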