Weathery is an AI agent that can understand and respond to natural language weather-related questions using live data from OpenWeatherMap. The agent should provide intelligent, conversational responses about weather forecasts.
Weathery provides a web interface to chat with the agent. It also provides a Google Calendar integration to add weather information to calendar events.
Here are the specifications this project delivers:
- Answer user questions using natural language
  - Weathery can answer questions about weather forecasts, live weather data, and historical data
  - Weathery offers a simple web interface to chat with the agent
  - The chat session ID is held only in the browser session; refreshing the page deletes the chat history
 
- Weathery offers a Google Calendar integration that allows adding weather information to events in the calendar
  - The event must have a location and a time defined for the integration to work
 
The project will follow Clean Architecture principles. This makes it easy to swap implementations (change LLM providers, choose another weather API, etc.) and helps keep the code base clean.
I decided to go with OpenAI as the provider and its SDK, mainly because I already know it and it will be faster. Other options, like LangGraph, could work for a small project like this, and I plan to later add an LLM interface using LangGraph for learning purposes.
To get weather data, several options were possible:
- using a weather API and providing tools to the AI agent to retrieve weather data
- scraping weather webpages
 
I decided to go with an API, as it is the more reliable option, and since I have no experience in web scraping, an API would also be faster to implement.
After some investigation of the OpenWeatherMap API, the pattern seems to be (sketched below):
- use the geocoding endpoint to get latitude and longitude for a location
- use one of the weather data endpoints
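
A minimal sketch of this two-step pattern, using the `requests` library against the free-tier geocoding and current-weather endpoints (the helper names and error handling are illustrative, not the project's actual code):

```python
import os

import requests

API_KEY = os.environ["OPENWEATHERMAP_API_KEY"]

def geocode(location: str) -> tuple[float, float]:
    """Resolve a location name to latitude/longitude via the geocoding endpoint."""
    resp = requests.get(
        "https://api.openweathermap.org/geo/1.0/direct",
        params={"q": location, "limit": 1, "appid": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    match = resp.json()[0]  # take the best match
    return match["lat"], match["lon"]

def current_weather(location: str) -> dict:
    """Fetch current weather for a location using the lat/lon from the geocoder."""
    lat, lon = geocode(location)
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"lat": lat, "lon": lon, "units": "metric", "appid": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```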
 
Weather data payloads can be quite heavy and might need to be digested to make the work easier for the LLM and reduce the risk of errors or hallucinations.
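For instance, instead of handing the raw JSON to the LLM, a small digest step could keep only the fields the agent needs (the field names below match the current-weather response used in the previous sketch; the summary format is just an illustration):

```python
def digest_current_weather(raw: dict) -> str:
    """Condense a raw current-weather response into a short, LLM-friendly summary."""
    return (
        f"{raw['name']}: {raw['weather'][0]['description']}, "
        f"{raw['main']['temp']}°C (feels like {raw['main']['feels_like']}°C), "
        f"humidity {raw['main']['humidity']}%, wind {raw['wind']['speed']} m/s"
    )
```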
Google has a REST API to interact with Calendar, so it can be provided as a tool to the AI agent.
Using this API, the agent can retrieve upcoming events from the calendar, get their time and location, fetch the weather data, and add weather information to the events' descriptions. There are quite a few steps in this task, but it is still simple enough for a single AI agent.
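As a rough sketch with the official `google-api-python-client` (OAuth setup omitted, and the helper names are illustrative), listing upcoming events and appending a weather note could look like this:

```python
from datetime import datetime, timezone

from googleapiclient.discovery import build

def upcoming_events(creds, max_results: int = 10) -> list[dict]:
    """List the next events on the primary calendar, soonest first."""
    service = build("calendar", "v3", credentials=creds)
    now = datetime.now(timezone.utc).isoformat()
    result = service.events().list(
        calendarId="primary",
        timeMin=now,
        maxResults=max_results,
        singleEvents=True,
        orderBy="startTime",
    ).execute()
    return result.get("items", [])

def append_weather_note(creds, event: dict, note: str) -> None:
    """Append a weather summary to an event's description."""
    service = build("calendar", "v3", credentials=creds)
    description = (event.get("description") or "") + "\n" + note
    service.events().patch(
        calendarId="primary",
        eventId=event["id"],
        body={"description": description},
    ).execute()
```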
The AI agent will have to perform a few tasks, but that seems reasonable for a single agent. To increase reliability, a multi-agent architecture could be implemented, but I will start with a single agent to keep things simple.
- Language: Python (typed)
- Framework: FastAPI (Python)
- AI Agent: raw OpenAI SDK
- LLM provider: OpenAI
- Weather API: OpenWeatherMap (free tier)
- Calendar: Google Calendar API
- Data Validation: Pydantic

- Interface: Simple HTML/CSS/JavaScript
- HTTP Client: Fetch API
- Styling: Minimal responsive design

- Environment: Python 3.9+
- Package Manager: pip
- Testing: pytest
- Code Quality: black, Mypy
- Environment Variables: python-dotenv
 
- Domain Layer: Core business entities and rules
- Use Cases Layer: Application-specific business logic
- Interface Layer: Abstract contracts and services
- Infrastructure Layer: External integrations and frameworks (see the sketch after the directory layout below)
 
```
weathery/
├── backend/src/
│   ├── domain/entities/
│   ├── use_cases/
│   ├── interfaces/
│   └── infrastructure/
├── frontend/
├── tests/
└── docs/
```
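
To make the interface/infrastructure split concrete, here is a hypothetical sketch of an abstract weather-service contract and an OpenWeatherMap-backed implementation (class names are illustrative; `current_weather` is the helper from the API sketch above):

```python
from abc import ABC, abstractmethod

from pydantic import BaseModel

# domain/entities: a core business entity, independent of any external API
class WeatherReport(BaseModel):
    location: str
    temperature_celsius: float
    description: str

# interfaces: the abstract contract that the use cases depend on
class WeatherService(ABC):
    @abstractmethod
    def get_current_weather(self, location: str) -> WeatherReport: ...

# infrastructure: the concrete adapter that talks to OpenWeatherMap
class OpenWeatherMapService(WeatherService):
    def get_current_weather(self, location: str) -> WeatherReport:
        raw = current_weather(location)  # two-step lookup sketched earlier
        return WeatherReport(
            location=location,
            temperature_celsius=raw["main"]["temp"],
            description=raw["weather"][0]["description"],
        )
```

Swapping the weather provider (or the LLM provider, with an equivalent contract) then only means adding another implementation; the use cases never change.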
First, we will build a PoC with minimal features to start building the project structure (a minimal sketch of this loop follows the list):
- frontend: no frontend yet; instead, read the user message from the CLI
- backend: read the user question, send it to OpenAI via their API, and return the LLM response to the user
- Agent tools: no tools yet, no need to connect to any API
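
A minimal sketch of this PoC loop with the OpenAI Python SDK (the model name and system prompt are placeholders):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def main() -> None:
    messages = [{"role": "system", "content": "You are Weathery, a helpful weather assistant."}]
    while True:
        question = input("💬 You: ")
        if question.strip().lower() == "quit":
            break
        messages.append({"role": "user", "content": question})
        response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
        answer = response.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        print(f"🌤️  Weathery: {answer}")

if __name__ == "__main__":
    main()
```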
 
- Add an interface to retrieve data from the OpenWeatherMap API
- Add a tool for the Agent to get the weather data (see the tool-schema sketch below)
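
A sketch of how such a tool could be declared for the agent with OpenAI function calling (the schema below is an assumption about shape, not the project's actual definition):

```python
# Tool schema passed to chat.completions.create(..., tools=[WEATHER_TOOL]);
# when the model requests it, the host code runs the matching weather lookup
# and feeds the result back as a "tool" message.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City name, e.g. 'Paris, FR'",
                },
            },
            "required": ["location"],
        },
    },
}
```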
 
- Add all required OpenWeatherMap endpoint connections
- Add all required tools for the agent
 
- Add an API in the backend for the web interface (a minimal FastAPI sketch follows this list)
- Add a frontend: a simple web interface for interaction (text input + display of the chat history)
- Agent can request the current weather, weather forecast, and historical weather for any location via the tools provided for OpenWeatherMap
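
A minimal sketch of the backend chat endpoint with FastAPI and Pydantic (the route, field names, and agent stub are assumptions, not the actual API):

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    session_id: str  # held in the browser session by the frontend
    message: str

class ChatResponse(BaseModel):
    reply: str

def run_agent(session_id: str, message: str) -> str:
    """Placeholder: the real app would route the message to the OpenAI-backed agent."""
    return f"(echo for session {session_id}) {message}"

@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest) -> ChatResponse:
    # The frontend posts the user message along with its session ID via fetch().
    return ChatResponse(reply=run_agent(request.session_id, request.message))
```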
 
- Add the Google Calendar integration
 
Current features include:
- ✅ CLI interface for user interaction
- ✅ Web interface with chat functionality
- ✅ OpenAI integration
- ✅ OpenWeatherMap API integration
  - ✅ Current weather
  - ✅ Weather forecast
  - ✅ Historical weather data
- ✅ Google Calendar integration
  - ✅ Add weather information to upcoming events in your calendar
- ✅ Clean Architecture implementation
- ✅ Session management and conversation history (in cache only)
- ✅ Environment-based configuration
- ✅ LLM observability with Langfuse integration (optional)
 
- add Langfuse or an equivalent for observability (ongoing)
- handle long-running queries: need to decouple the HTTP request from the processing time (message queue)
- testing: complex queries, long conversations, many events in the calendar
- improve prompting based on the test results:
  - the system prompt can be improved
  - tool responses can be improved: for instance, instead of giving the full payload from the weather API, we could digest it and only provide the Agent with the data for the requested day or hour. We could also improve the formatting with a daily weather abstract followed by details
 
- agent memory for long-running tasks
- user management
- compatibility with other LLM providers
- Deployment:
  - build a proper CI/CD flow with GitHub Actions
  - use Docker containers for deployment
  - improve frontend server startup and config generation
 
- code testing:
  - check test coverage and add tests where there are blind spots
 
- monitoring: add an LLM observability framework, like LangSmith, to easily monitor LLM actions
- logging can be improved and cleaned up
- message queue between the frontend and backend for long-running queries
- for performance, we could add a cache for weather data
 
- Python 3.11 or newer
- OpenAI API key (Get one here)
- OpenWeatherMap API key (Get one here)
- Google OAuth credentials (optional, for calendar integration)
 
- Clone and navigate to the project:

  ```
  git clone <repository-url>
  cd weathery
  ```

- Create virtual environment:

  ```
  python3 -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install dependencies:

  ```
  pip install -r requirements.txt
  ```

- Configure environment:

  ```
  cp .env.example .env
  # Edit .env and add your API keys:
  OPENAI_API_KEY=your_actual_openai_api_key_here
  OPENWEATHERMAP_API_KEY=your_openweathermap_api_key_here
  API_HOST=127.0.0.1
  API_PORT=8000
  # Optional: Enable Langfuse for LLM Observability
  # Get your keys from https://cloud.langfuse.com
  LANGFUSE_PUBLIC_KEY=pk-lf-your_public_key_here
  LANGFUSE_SECRET_KEY=sk-lf-your_secret_key_here
  LANGFUSE_ENABLED=true
  ```

- Run the application:
 
Option A: Web Interface (Recommended)

```
# Terminal 1: Start the backend API server
source venv/bin/activate
python api_main.py

# Terminal 2: Start the frontend server
source venv/bin/activate
python webpage_main.py

# Then open your browser to: http://127.0.0.1:3000
```

Option B: CLI Interface

```
source venv/bin/activate
python main.py
```

Web Interface:

- Open your browser to http://127.0.0.1:3000
- Chat with Weathery using natural language
- Ask about current weather, forecasts, or historical data for any location
- Your conversation history is maintained during the session
 
CLI Interface:
```
🌤️  Welcome to Weathery - Your AI Weather Assistant
==================================================
You can ask me about:
- Current weather conditions anywhere in the world
- Weather forecasts up to 14 days ahead
- Historical weather data for the past 10 years
Type your weather questions or 'quit' to exit.
==================================================
💬 You: What's the weather like in Paris right now?
🤔 Weathery is thinking...
🌤️  Weathery: The current weather in Paris is...
```
Weathery includes optional integration with Langfuse for comprehensive LLM observability and analytics.
- Conversation Tracing: Track entire conversation flows across multiple messages
- Tool Call Monitoring: Observe weather API calls and calendar operations with detailed metrics
- Session Correlation: Link Langfuse traces with your existing session management
- Performance Analytics: Monitor response times, token usage, and costs
- Configurable Detail Levels: Choose between basic and detailed tracing
 
| Environment Variable | Description | Default |
|---|---|---|
| LANGFUSE_ENABLED | Enable/disable Langfuse integration | false |
| LANGFUSE_PUBLIC_KEY | Your Langfuse public key | - |
| LANGFUSE_SECRET_KEY | Your Langfuse secret key | - |
| LANGFUSE_HOST | Langfuse host URL | https://cloud.langfuse.com |
| LANGFUSE_TRACE_LEVEL | Tracing granularity: conversation or message | conversation |
| LANGFUSE_TOOL_TRACE_DETAIL | Tool call detail: basic or detailed | basic |
- Sign up for a free account at cloud.langfuse.com
- Create a new project and get your API keys
- Add the keys to your .env file and set LANGFUSE_ENABLED=true
- Restart Weathery - all LLM interactions will now be traced!
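
For reference, Langfuse offers a drop-in wrapper around the OpenAI client, so tracing can be enabled with essentially no code changes; a minimal sketch, which may differ from Weathery's actual wiring:

```python
import os

# Drop-in replacement for the OpenAI client: same chat.completions API,
# but calls are traced to Langfuse when the LANGFUSE_* variables are set.
from langfuse.openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather like in Paris?"}],
)
print(response.choices[0].message.content)
```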