An intelligent personal financial advisor powered by AI. Get expert financial guidance in your preferred language with support for multiple LLM providers, including local offline inference with Ollama.
- Multi-Provider AI Support - Choose from Ollama (local), Google Gemini, or OpenAI
- Interactive Conversations - Natural language financial discussions
- Portfolio Analysis - AI-generated portfolio recommendations based on your profile
- Historical Data - 10-year historical returns for analyzed assets
- Privacy First - Full offline support with Ollama
- Multi-Language - Communicate in your preferred language
- Profile Management - Load, save, and download financial profiles as JSON
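The JSON save/load flow for profiles can be sketched as follows. This is a minimal illustration only: the field names and `FinancialProfile` class are hypothetical, not the app's actual schema.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative profile shape -- the real app's schema may differ.
@dataclass
class FinancialProfile:
    age: int
    risk_tolerance: str
    investment_horizon_years: int
    monthly_savings: float

def save_profile(profile: FinancialProfile) -> str:
    """Serialize a profile to the JSON string a user would download."""
    return json.dumps(asdict(profile), indent=2)

def load_profile(payload: str) -> FinancialProfile:
    """Rebuild a profile from previously saved JSON."""
    return FinancialProfile(**json.loads(payload))
```

A round trip (`load_profile(save_profile(p))`) returns an equal profile, which is what lets a downloaded JSON file be re-uploaded later.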
- Python 3.11+
- Ollama (optional, for local LLM inference)
- Clone the repository and enter the directory:

  ```sh
  git clone https://github.com/merendamattia/personal-financial-ai-agent.git
  cd personal-financial-ai-agent
  ```

- Create and activate a Python virtual environment:

  ```sh
  conda create --name personal-financial-ai-agent python=3.11.13
  conda activate personal-financial-ai-agent
  ```

- Install dependencies:

  ```sh
  python -m pip install --upgrade pip
  pip install -r requirements.txt
  ```

- Configure environment variables:

  ```sh
  cp .env.example .env
  # Edit .env with your API keys and preferences
  ```

- Extract the dataset:

  ```sh
  cd dataset
  unzip ETFs.zip
  cd ..
  ```
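After copying `.env.example`, a minimal `.env` might look like the sketch below. The variable names and defaults come from the provider configuration documented later in this README; set only the providers you plan to use.

```sh
# Local Ollama (optional overrides; documented defaults shown)
OLLAMA_MODEL=qwen3:0.6b
OLLAMA_API_URL=http://localhost:11434/v1

# Google Gemini (only if you use this provider)
GOOGLE_API_KEY=your-google-api-key
GOOGLE_MODEL=gemini-2.5-flash

# OpenAI (only if you use this provider)
OPENAI_API_KEY=your-openai-api-key
OPENAI_MODEL=gpt-4.1-mini
```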
Local Streamlit:

```sh
streamlit run app.py
```

Then open http://localhost:8501 in your browser.
Docker Compose (recommended):

```sh
# Start (includes Ollama container)
docker compose up

# Stop
docker compose down
```

Access at http://localhost:8501.
Docker (without Ollama):

```sh
# Option 1: Build locally
docker build --no-cache -t financial-ai-agent:local .
docker run -p 8501:8501 --env-file .env financial-ai-agent:local

# Option 2: Use pre-built image from Docker Hub
docker pull merendamattia/personal-financial-ai-agent:latest
docker run -p 8501:8501 --env-file .env merendamattia/personal-financial-ai-agent:latest
```

Note: On first launch, you'll be prompted to select your preferred LLM provider.
Choose your preferred AI provider based on your needs:
Ollama (local):

- Cost: Free
- Privacy: 100% offline, no data sent to external servers
- Setup:
  - Download and install Ollama
  - Start Ollama:

    ```sh
    ollama serve
    ```

- Configuration:
  - `OLLAMA_MODEL` in `.env` (default: `qwen3:0.6b`)
  - `OLLAMA_API_URL` in `.env` (default: `http://localhost:11434/v1`)
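As a minimal sketch of how these two settings could be resolved, assuming the app falls back to the documented defaults when the variables are unset (the helper name is hypothetical, not the app's actual code):

```python
import os

def ollama_settings() -> tuple[str, str]:
    # Hypothetical helper: read the Ollama model and API URL from the
    # environment (populated from .env), falling back to the defaults
    # documented in this README.
    model = os.getenv("OLLAMA_MODEL", "qwen3:0.6b")
    api_url = os.getenv("OLLAMA_API_URL", "http://localhost:11434/v1")
    return model, api_url
```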
Google Gemini:

- Cost: Free tier available, then pay-as-you-go
- Privacy: Cloud-based processing
- Setup:
  - Get an API key from Google AI Studio
  - Set `GOOGLE_API_KEY` in `.env`
- Configuration:
  - `GOOGLE_MODEL` in `.env` (default: `gemini-2.5-flash`)
OpenAI:

- Cost: Pay-as-you-go, no free tier
- Privacy: Cloud-based processing
- Setup:
  - Create an account and get an API key from the OpenAI Platform
  - Set `OPENAI_API_KEY` in `.env`
- Configuration:
  - `OPENAI_MODEL` in `.env` (default: `gpt-4.1-mini`)
Provider Selection:
The app detects available providers from your .env configuration. Select your preferred provider when starting the app or switch anytime using the sidebar button.
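The detection logic can be sketched as follows. This is a hypothetical simplification (function name and return values are illustrative): a provider counts as available when its credential or endpoint variable is configured.

```python
def available_providers(env: dict[str, str]) -> list[str]:
    """Hypothetical sketch: list providers whose required .env
    variable is present and non-empty."""
    providers = []
    if env.get("OLLAMA_API_URL"):
        providers.append("Ollama")
    if env.get("GOOGLE_API_KEY"):
        providers.append("Google Gemini")
    if env.get("OPENAI_API_KEY"):
        providers.append("OpenAI")
    return providers
```

For example, a `.env` that sets only `GOOGLE_API_KEY` would surface Gemini as the single selectable provider.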
Run the test suite:

```sh
pytest -q  # Quick run
pytest -v  # Verbose output
```

See the `tests/` directory for available test files.
- Agent Selection → Choose your preferred LLM provider
- Conversation → Answer financial assessment questions
- Profile Extraction → AI extracts your financial profile from responses
- Portfolio Generation → RAG-enhanced advisor generates a personalized portfolio
- Analysis → View historical returns and investment recommendations
We welcome contributions! Please see CONTRIBUTING.md for:
- Development setup
- Pre-commit hooks configuration
- Commit convention guidelines
- Pull request process
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
For issues, feature requests, or questions:
- Open an issue on GitHub