Port main branch updates to refactored api/common/conserver structure #149
Merged
pavanputhra merged 1 commit into optimize-docker-images on Apr 15, 2026
Conversation
Backport 7 commits from main (5f3350c..e98a3df) into the split layout, adapting all file paths and imports from the old server/ structure.

Changes ported:
- Add shared openai_client.py (common/lib/) with get_openai_client() and get_async_openai_client() supporting OpenAI, Azure, and LiteLLM proxy
- Refactor all OpenAI-using links and storage to use get_openai_client(): analyze, analyze_and_label, analyze_vcon, check_and_tag, detect_engagement, openai_transcribe, chatgpt_files, milvus
- deepgram_link: add LiteLLM proxy path (transcribe_via_litellm), fix fd leak in audio temp file handling, make confidence check optional
- wtf_transcribe: update for new vfun /wtf API: simplified create_wtf_analysis (pass response body directly), file-binary field, language option, diarize default→False, min-duration default→0, status 200 only
- api: /config endpoint uses Configuration.get_config() instead of reading the YAML file directly
- tests: add mock_get_client patches to analyze_and_label and detect_engagement tests; fix test_external_ingress to patch api.index_vcon instead of api.index_vcon_parties
- docs: add Langfuse integration and OTel Collector fan-out documentation
- .gitignore: add litellm_config.yaml (contains local credentials)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
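The shared client factory could be selected roughly as sketched below. This is a minimal illustration, not the actual conserver code: the config key names are assumptions, and a real implementation would construct `openai.OpenAI(...)` / `openai.AzureOpenAI(...)` instances from these kwargs instead of returning them. The key point is that a LiteLLM proxy speaks the OpenAI wire protocol, so the same client type works for all three backends; only the connection parameters change.

```python
def resolve_openai_client_kwargs(config: dict) -> dict:
    """Pick connection kwargs for whichever backend the config selects.

    Hypothetical sketch: config keys here (litellm_proxy_url,
    azure_endpoint, openai_api_key, ...) are illustrative names.
    """
    if config.get("litellm_proxy_url"):
        # LiteLLM proxy is OpenAI-compatible: reuse the plain client,
        # just point base_url at the proxy.
        return {
            "backend": "litellm",
            "base_url": config["litellm_proxy_url"],
            "api_key": config.get("litellm_api_key", "sk-litellm"),
        }
    if config.get("azure_endpoint"):
        return {
            "backend": "azure",
            "azure_endpoint": config["azure_endpoint"],
            "api_key": config["azure_api_key"],
            "api_version": config.get("azure_api_version", "2024-02-01"),
        }
    # Default: plain OpenAI
    return {"backend": "openai", "api_key": config["openai_api_key"]}
```

Centralizing this choice is what lets every link (analyze, detect_engagement, ...) stop constructing clients inline and pick up Azure/LiteLLM support for free.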
Summary
Backports 7 commits from main (up to 5f3350c) into the optimize-docker-images split layout, adapting all file paths and imports from the old server/ structure.

- common/lib/openai_client.py: shared get_openai_client()/get_async_openai_client() supporting OpenAI, Azure OpenAI, and LiteLLM proxy; all links/storage now call this instead of constructing clients inline
- deepgram_link: add a LiteLLM proxy path (transcribe_via_litellm), fix an fd leak in audio temp file handling, make the confidence check optional (not available on the LiteLLM path)
- wtf_transcribe: update for the new vfun /wtf API: simplified create_wtf_analysis (pass the response body directly), file-binary field name, language option, diarize default→False, min-duration default→0, accept status 200 only
- api: /config endpoint uses Configuration.get_config() instead of reading the YAML file directly
- tests: add mock_get_client patches to analyze_and_label and detect_engagement tests; fix test_external_ingress to patch api.index_vcon instead of api.index_vcon_parties
- docs: add Langfuse integration and OTel Collector fan-out documentation (monitoring.md)
- .gitignore: add litellm_config.yaml (local dev file with credentials, was untracked)

Test plan
- docker compose run --rm conserver poetry run pytest
- /config endpoint returns config correctly
- /wtf endpoint

🤖 Generated with Claude Code
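The fd-leak fix in the audio temp-file handling is not shown in this summary, but the standard pattern it presumably follows is sketched here (function and parameter names are illustrative, not the actual deepgram_link code): `tempfile.mkstemp` returns a raw file descriptor that must be closed explicitly, and forgetting that close on the error path is the classic leak.

```python
import os
import tempfile


def with_audio_temp_file(audio_bytes: bytes, handler):
    """Write audio to a temp file, call handler(path), always clean up.

    Illustrative sketch of fd-safe temp file handling. os.fdopen takes
    ownership of the descriptor, so the `with` block guarantees it is
    closed even if the write fails; the `finally` removes the file even
    if the handler (e.g. a transcription call) raises.
    """
    fd, path = tempfile.mkstemp(suffix=".wav")
    try:
        with os.fdopen(fd, "wb") as f:  # closes fd on exit, no leak
            f.write(audio_bytes)
        return handler(path)
    finally:
        os.remove(path)  # temp file removed on success and on error
```

An equivalent fix is to use `tempfile.NamedTemporaryFile` as a context manager; the essential point is that no code path leaves the descriptor open.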