💰 Personal Financial AI Agent

An intelligent personal financial advisor powered by AI. Get expert financial guidance in your preferred language with support for multiple LLM providers, including local offline inference with Ollama.

✨ Features

  • 🤖 Multi-Provider AI Support - Choose from Ollama (local), Google Gemini, or OpenAI
  • 💬 Interactive Conversations - Natural language financial discussions
  • 📊 Portfolio Analysis - AI-generated portfolio recommendations based on your profile
  • 📈 Historical Data - 10-year historical returns for analyzed assets
  • 🛡️ Privacy First - Full offline support with Ollama
  • 🌐 Multi-Language - Communicate in your preferred language
  • 📥 Profile Management - Load, save, and download financial profiles as JSON
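
For example, a saved profile can round-trip through JSON as in the sketch below. This is purely illustrative: the field names are hypothetical and do not reflect the app's exact schema.

import json

# Hypothetical profile fields (illustrative only, not the app's actual schema).
profile = {
    "age": 35,
    "risk_tolerance": "moderate",
    "investment_horizon_years": 15,
    "monthly_savings": 500,
    "goals": ["retirement", "house down payment"],
}

# Save the profile as JSON ...
with open("profile.json", "w") as f:
    json.dump(profile, f, indent=2)

# ... and load it back later.
with open("profile.json") as f:
    profile = json.load(f)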

📋 Requirements

  • Python 3.11+
  • Ollama (optional, for local LLM inference)

🚀 Getting Started

Installation

  1. Clone the repository and enter the directory:

    git clone https://github.com/merendamattia/personal-financial-ai-agent.git
    cd personal-financial-ai-agent
  2. Create and activate a conda environment:

    conda create --name personal-financial-ai-agent python=3.11.13
    conda activate personal-financial-ai-agent
  3. Install dependencies:

    python -m pip install --upgrade pip
    pip install -r requirements.txt
  4. Configure environment variables:

    cp .env.example .env
    # Edit .env with your API keys and preferences (see the sample after these steps)
  5. Extract the dataset:

    cd dataset
    unzip ETFs.zip
    cd ..
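
The .env edited in step 4 holds the provider settings described later in this README. A rough sample is shown below; the defaults come from the provider section, the keys are placeholders, and .env.example remains the authoritative template.

# Configure only the providers you plan to use; leave the others unset.

# Ollama (local)
OLLAMA_MODEL=qwen3:0.6b
OLLAMA_API_URL=http://localhost:11434/v1

# Google Gemini
GOOGLE_API_KEY=your-google-api-key
GOOGLE_MODEL=gemini-2.5-flash

# OpenAI
OPENAI_API_KEY=your-openai-api-key
OPENAI_MODEL=gpt-4.1-mini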

Running the Application

Local Streamlit:

streamlit run app.py

Then open http://localhost:8501 in your browser.

Docker Compose (recommended):

# Start (includes Ollama container)
docker compose up

# Stop
docker compose down

Access at http://localhost:8501

Docker (without Ollama):

# Option 1: Build locally
docker build --no-cache -t financial-ai-agent:local .
docker run -p 8501:8501 --env-file .env financial-ai-agent:local

# Option 2: Use pre-built image from Docker Hub
docker pull merendamattia/personal-financial-ai-agent:latest
docker run -p 8501:8501 --env-file .env merendamattia/personal-financial-ai-agent:latest

Note: On first launch, you'll be prompted to select your preferred LLM provider.

🤖 Supported LLM Providers

Choose your preferred AI provider based on your needs:

🦙 Ollama (Recommended for Privacy)

  • Cost: Free
  • Privacy: 100% offline, no data sent to external servers
  • Setup:
    1. Download and install Ollama (https://ollama.com)
    2. Start Ollama: ollama serve
  • Configuration:
    • OLLAMA_MODEL in .env (default: qwen3:0.6b)
    • OLLAMA_API_URL in .env (default: http://localhost:11434/v1)
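
The default OLLAMA_API_URL points at Ollama's OpenAI-compatible /v1 endpoint, so any OpenAI-style client can reach the local model. Below is a minimal sketch using the openai Python package; it is illustrative only, not the app's internal client code.

from openai import OpenAI

# Ollama speaks the OpenAI API; the api_key is required by the client
# but ignored by a local Ollama server.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="qwen3:0.6b",  # the OLLAMA_MODEL default
    messages=[{"role": "user", "content": "Name one rule of thumb for an emergency fund."}],
)
print(response.choices[0].message.content)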

🌐 Google Generative AI (Gemini)

  • Cost: Free tier available, then pay-as-you-go
  • Privacy: Cloud-based processing
  • Setup:
    1. Get an API key from Google AI Studio (https://aistudio.google.com)
    2. Set GOOGLE_API_KEY in .env
  • Configuration:
    • GOOGLE_MODEL in .env (default: gemini-2.5-flash)

✨ OpenAI (GPT Models)

  • Cost: Pay-as-you-go, no free tier
  • Privacy: Cloud-based processing
  • Setup:
    1. Create an account and get an API key from the OpenAI Platform (https://platform.openai.com)
    2. Set OPENAI_API_KEY in .env
  • Configuration:
    • OPENAI_MODEL in .env (default: gpt-4.1-mini)

Provider Selection: The app detects available providers from your .env configuration. Select your preferred provider when starting the app or switch anytime using the sidebar button.
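
Detection of this kind can be driven entirely by which variables are set in the environment. The snippet below is a simplified sketch of the idea, not the app's actual logic:

import os

def available_providers() -> list[str]:
    """Guess which LLM providers are configured, based on .env variables."""
    providers = []
    if os.getenv("OLLAMA_API_URL") or os.getenv("OLLAMA_MODEL"):
        providers.append("ollama")
    if os.getenv("GOOGLE_API_KEY"):
        providers.append("gemini")
    if os.getenv("OPENAI_API_KEY"):
        providers.append("openai")
    return providers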

🧪 Testing

Run the test suite:

pytest -q # Quick run
pytest -v # Verbose output

See the tests/ directory for the available test files.

🔧 How It Works

  1. Agent Selection → Choose your preferred LLM provider
  2. Conversation → Answer financial assessment questions
  3. Profile Extraction → AI extracts your financial profile from responses
  4. Portfolio Generation → RAG-enhanced advisor generates a personalized portfolio
  5. Analysis → View historical returns and investment recommendations (illustrated below)
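
As a concrete illustration of step 5, a 10-year annualized return can be computed from an asset's price history along these lines. This is a generic pandas sketch, not the app's implementation; the file name and CSV layout are assumptions.

import pandas as pd

# Hypothetical ETF price history with "Date" and "Close" columns.
prices = pd.read_csv("dataset/ETFs/example_etf.csv", parse_dates=["Date"])
prices = prices.sort_values("Date").set_index("Date")["Close"]

# Keep roughly the last 10 years of prices.
cutoff = prices.index.max() - pd.DateOffset(years=10)
last_decade = prices[prices.index >= cutoff]

# Total and annualized return over the period.
total_return = last_decade.iloc[-1] / last_decade.iloc[0] - 1
years = (last_decade.index[-1] - last_decade.index[0]).days / 365.25
annualized = (1 + total_return) ** (1 / years) - 1
print(f"10-year total: {total_return:.1%}, annualized: {annualized:.1%}")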

🀝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for:

  • Development setup
  • Pre-commit hooks configuration
  • Commit convention guidelines
  • Pull request process

📄 License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

📞 Support

For issues, feature requests, or questions, please open an issue on the GitHub repository.
