A production-focused collection of OpenAI API examples built with Python.
This repository demonstrates how to build scalable, maintainable, and production-ready AI systems using modern OpenAI APIs, structured engineering patterns, and real-world architecture practices.
Chaitanya Dasadiya
- GitHub: https://github.com/cdasadiya
- Focus Areas: AI Engineering, Python Development, OpenAI APIs, Automation
This repository is designed to become a complete professional OpenAI engineering reference covering:
- OpenAI APIs
- AI agents
- Multi-modal systems
- Production AI architectures
- RAG pipelines
- Realtime systems
- Fine-tuning
- Deployment systems
- AI infrastructure engineering
The examples are grouped by API area so related scripts stay together:
01_core_platform/
02_responses_api/
03_realtime_apis/
04_audio_apis/
05_vision_apis/
utils/
- `01_core_platform/` — production platform foundations such as authentication, API keys, organizations, projects, usage tracking, billing, rate limits, models, tokens, and pricing optimization.
- `02_responses_api/` — all Responses API examples, including basic responses, structured output, prompting, streaming, function calling, tool calling, multi-turn conversations, and reasoning prompts.
- `03_realtime_apis/` — realtime API examples covering WebSocket connections, live streaming, voice systems, realtime transcription, interrupt handling, and low-latency architectures.
- `04_audio_apis/` — audio API examples for transcription, translation, text-to-speech, and voice workflows.
- `05_vision_apis/` — vision API examples for image understanding, OCR, and vision reasoning.
- `utils/` — shared configuration and OpenAI client utilities used by the runnable examples.
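The shared configuration in `utils/` might look roughly like the sketch below — the `client_kwargs` helper is illustrative, not the repository's actual API:

```python
import os


def client_kwargs() -> dict:
    """Resolve OpenAI client settings from the environment.

    OPENAI_ORG_ID and OPENAI_PROJECT_ID are optional and included
    only when set, mirroring the optional scoping described above.
    """
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    kwargs = {"api_key": api_key}
    for env_var, field in [("OPENAI_ORG_ID", "organization"),
                           ("OPENAI_PROJECT_ID", "project")]:
        value = os.environ.get(env_var)
        if value:
            kwargs[field] = value
    return kwargs
```

An example script would then construct its client with `OpenAI(**client_kwargs())`, keeping secret handling in one place.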
All implemented Core Platform examples live in 01_core_platform/.
- 01_core_platform/authentication.py
- 01_core_platform/api_keys.py
- 01_core_platform/organizations.py
- 01_core_platform/projects.py
- 01_core_platform/usage_tracking.py
- 01_core_platform/billing.py
- 01_core_platform/rate_limits.py
- 01_core_platform/models.py
- 01_core_platform/tokens.py
- 01_core_platform/pricing_optimization.py
Install dependencies:
pip install -r requirements.txt

Set OPENAI_API_KEY in your environment, GitHub Codespaces secret, or local .env file. Optionally set OPENAI_ORG_ID and OPENAI_PROJECT_ID when you need explicit organization or project scoping. Then run examples from the repository root:
python 01_core_platform/authentication.py
python 01_core_platform/api_keys.py
python 01_core_platform/organizations.py
python 01_core_platform/projects.py
python 01_core_platform/usage_tracking.py
python 01_core_platform/billing.py
python 01_core_platform/rate_limits.py
python 01_core_platform/models.py
python 01_core_platform/tokens.py
python 01_core_platform/pricing_optimization.py

Each example also supports running from inside its own folder because it adds the repository root to sys.path before importing shared utilities.
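That bootstrap pattern could be sketched as follows; the exact helper and module names in the repository may differ:

```python
import sys
from pathlib import Path

# Allow running this script from inside its own folder by making
# the repository root importable before loading shared utilities.
REPO_ROOT = Path(__file__).resolve().parents[1]
if str(REPO_ROOT) not in sys.path:
    sys.path.insert(0, str(REPO_ROOT))

# from utils.config import get_client  # shared helper (name assumed)
```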
All implemented Responses API examples live in 02_responses_api/.
- 02_responses_api/basic_response.py
- 02_responses_api/structured_json_output.py
- 02_responses_api/system_prompting.py
- 02_responses_api/streaming_responses.py
- 02_responses_api/function_calling.py
- 02_responses_api/tool_calling.py
- 02_responses_api/multi_turn_conversation.py
- 02_responses_api/reasoning_models.py
Install dependencies:
pip install -r requirements.txt

Set OPENAI_API_KEY in your environment, GitHub Codespaces secret, or local .env file. Then run examples from the repository root:
python 02_responses_api/basic_response.py
python 02_responses_api/structured_json_output.py
python 02_responses_api/system_prompting.py
python 02_responses_api/streaming_responses.py
python 02_responses_api/function_calling.py
python 02_responses_api/tool_calling.py
python 02_responses_api/multi_turn_conversation.py
python 02_responses_api/reasoning_models.py

Each example also supports running from inside its own folder because it adds the repository root to sys.path before importing shared utilities.
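To give a flavor of what the function-calling example passes to the API, here is an illustrative tool schema; the `get_weather` tool itself is made up for this sketch:

```python
# A tool schema of the shape the function-calling examples send to
# the Responses API; the name and parameters here are illustrative.
get_weather_tool = {
    "type": "function",
    "name": "get_weather",
    "description": "Return current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}
```

The model decides when to call the tool; the script then executes the matching Python function and feeds the result back into the conversation.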
All implemented Realtime API examples live in 03_realtime_apis/.
- 03_realtime_apis/websocket_connections.py
- 03_realtime_apis/live_streaming.py
- 03_realtime_apis/realtime_voice.py
- 03_realtime_apis/realtime_transcription.py
- 03_realtime_apis/interrupt_handling.py
- 03_realtime_apis/low_latency_systems.py
Install dependencies:
pip install -r requirements.txt

Set OPENAI_API_KEY in your environment, GitHub Codespaces secret, or local .env file. Then run examples from the repository root:
python 03_realtime_apis/websocket_connections.py
python 03_realtime_apis/live_streaming.py
python 03_realtime_apis/realtime_voice.py
python 03_realtime_apis/realtime_transcription.py
python 03_realtime_apis/interrupt_handling.py
python 03_realtime_apis/low_latency_systems.py

The voice and transcription scripts include Codespaces-safe placeholder modes for environments without microphone or speaker devices.
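A placeholder-mode check of the kind those scripts need might look like this sketch; the `sounddevice` backend is an assumption about the audio library used:

```python
import importlib.util


def audio_mode() -> str:
    """Pick a run mode based on audio hardware availability.

    Codespaces usually has no sound device, so scripts can fall back
    to a text-only placeholder mode instead of crashing on startup.
    The 'sounddevice' module name is an assumed audio backend.
    """
    if importlib.util.find_spec("sounddevice") is None:
        return "placeholder"  # no audio backend installed
    try:
        import sounddevice as sd
        if not sd.query_devices():
            return "placeholder"  # backend present but no devices
    except Exception:
        return "placeholder"  # device probing failed
    return "live"
```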
- Authentication
- API keys
- Organizations
- Projects
- Usage tracking
- Billing
- Rate limits
- Models
- Tokens
- Pricing optimization
- Responses API
- Streaming
- Structured outputs
- JSON schema outputs
- Function calling
- Tool calling
- Multi-turn conversations
- Reasoning models
- Chat Completions
- WebSocket connections
- Live streaming
- Realtime voice
- Realtime transcription
- Interrupt handling
- Low-latency systems
- Speech-to-text
- Transcription
- Translation
- Text-to-speech
- Voice synthesis
- Audio generation
- Image understanding
- OCR
- Multi-image analysis
- Vision reasoning
- Image generation
- Image editing
- Variations
- Inpainting
- Style transfer
- Embeddings
- Semantic search
- Similarity search
- RAG pipelines
- Vector databases
- Dataset preparation
- Training jobs
- Hyperparameters
- Evaluation
- Deployment
- Agents SDK
- Tool orchestration
- Memory
- Sessions
- MCP
- Multi-agent systems
- Files API
- Uploads
- Batch processing
- Vector stores
- Moderation API
- Safety systems
- Guardrails
- Prompt injection prevention
- Scaling
- Monitoring
- Observability
- Logging
- Retry systems
- Queue systems
- Caching
- Load balancing
- Cost optimization
- RAG systems
- AI workflows
- Agent systems
- Multi-modal systems
- Hybrid AI architectures
- Autonomous systems
- Docker
- Kubernetes
- Serverless
- Edge deployment
- CI/CD
- Cloud deployment
- LangChain
- LlamaIndex
- Pinecone
- Weaviate
- Supabase
- Vercel AI SDK
- Long-context systems
- AI memory systems
- Planning systems
- Reflection loops
- Self-improving agents
- Tool ecosystems
- Computer-use agents
- Browser agents
- Coding agents
| Category | Tools |
|---|---|
| Language | Python 3.12+ |
| AI Platform | OpenAI API |
| Environment | Virtualenv / dotenv |
| Patterns | Async IO, Structured Outputs, Streaming |
| Focus | Production-ready AI engineering |
git clone https://github.com/cdasadiya/openai-api-playground.git
cd openai-api-playground

This repository supports:
- Local machine development
- GitHub Codespaces development
python3 -m venv .venv
source .venv/bin/activate

Windows:

.venv\Scripts\activate

Install dependencies:

pip install -r requirements.txt

Create a local .env file containing:

OPENAI_API_KEY=sk-proj-your_api_key_here

Run a quick check:

python 02_responses_api/basic_response.py

For GitHub Codespaces, go to:
GitHub → Settings → Codespaces → Secrets
Create:
Name: OPENAI_API_KEY
Value: sk-proj-your_api_key_here
Grant access to:
openai-api-playground
Verify the secret inside the Codespace:

echo $OPENAI_API_KEY

python 02_responses_api/basic_response.py

❌ Never commit API keys to GitHub
❌ Never hardcode API keys inside Python files
❌ Never push .env files
❌ Never expose API keys in screenshots
❌ Never store secrets in public repositories
✅ Use .env locally
✅ Use GitHub Secrets for Codespaces and Actions
✅ Add .env to .gitignore
✅ Rotate compromised keys immediately
✅ Validate AI outputs before execution
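The last point — validating AI outputs before execution — can be sketched with a small stdlib-only guard; the function and key names are illustrative:

```python
import json


def safe_parse(raw: str, required: set) -> dict:
    """Validate a model's JSON output before acting on it.

    Rejects non-JSON text, non-object payloads, and payloads
    missing required keys, so downstream code never executes
    on malformed or unexpected model output.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"model output is not valid JSON: {exc}") from exc
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    missing = required - data.keys()
    if missing:
        raise ValueError(f"missing required keys: {sorted(missing)}")
    return data
```

For example, `safe_parse('{"action": "search", "query": "x"}', {"action", "query"})` succeeds, while plain text or a payload without `"action"` raises before anything downstream runs.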
.env
.env.*
__pycache__/
*.pyc
.venv/
venv/

This repository follows production-focused engineering standards:
- Centralized OpenAI client architecture
- Environment-based configuration
- Structured outputs
- Reusable utilities
- Debugging support
- Error handling
- Secure secret management
- Codespaces compatibility
- Production-safe patterns
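As one instance of the error-handling and production-safe patterns above, a generic retry helper might look like this sketch; the backoff parameters are illustrative, and real examples would catch the SDK's specific rate-limit and connection errors rather than bare `Exception`:

```python
import random
import time


def with_retries(fn, *, attempts: int = 4, base_delay: float = 0.5):
    """Retry a callable with exponential backoff and jitter.

    In the actual examples the retried callable would be an OpenAI
    API call; here it is any zero-argument function.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the real error
            # double the delay each attempt, plus a little jitter
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)
```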
- Responses API
- Structured Outputs
- Streaming
- Function Calling
- Embeddings
- RAG
- Agents
- Realtime APIs
- Fine-Tuning
- Production AI Systems
This repository can be used for:
- AI engineering interview preparation
- OpenAI API learning
- Production AI architecture references
- AI SaaS development
- Internal AI tooling
- Rapid AI prototyping
- Agent system experimentation
- Multi-modal AI systems
Contributions are welcome.
Suggested contribution areas:
- New OpenAI API examples
- RAG systems
- Realtime applications
- AI agent orchestration
- CI/CD automation
- Deployment examples
- Performance optimization
- Production monitoring
This repository is licensed under the MIT License. See the LICENSE file for details.