Tiny Chat

Ask DeepWiki

Installation

Tested with Python 3.10 or later

Development Installation

pip install -r requirements.txt

Package Installation

# Build the package
pip install build
python -m build

# Install the built package
pip install dist/*.whl

Web Interface Usage

Running from source (development)

streamlit run tiny_chat/main.py --server.address=127.0.0.1

Database only (development)

streamlit run tiny_chat/main.py --server.address=127.0.0.1 -- --database
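The standalone -- separates Streamlit's own options from arguments meant for the application, so --database is passed through to tiny_chat/main.py rather than being interpreted by Streamlit.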

Running the installed package

tiny-chat

Database only

tiny-chat --database

(Screenshot: img.png)

MCP Usage

Claude Desktop example:

{
  "mcpServers": {
    "tiny-chat": {
      "command": "/path/to/tiny_chat/.venv/bin/tiny-chat-mcp",
      "env": {
        "DB_CONFIG": "/path/to/tiny_chat/database_config.json"
      }
    }
  }
}
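This snippet goes in Claude Desktop's claude_desktop_config.json; the DB_CONFIG environment variable points the MCP server at the tiny_chat database configuration file. Adjust both paths to match your installation.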

OpenAI Chat API RAG Server Usage

tiny-chat-api

model: the name of the target Qdrant collection to search (the model can be changed during a conversation to switch collections).

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qdrant-collection-name", "messages": [{"role": "user", "content": "カレーライスの材料は?"}]}'
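Because the endpoint follows the OpenAI Chat Completions format, the official openai Python client can also be pointed at it. The sketch below assumes the server is listening on http://localhost:8080, that the API key is not validated (the placeholder value is arbitrary), and that qdrant-collection-name is replaced with an existing collection.

# Minimal sketch: query the local RAG server with the openai client.
# Assumptions: base URL http://localhost:8080/v1, API key not checked,
# and "qdrant-collection-name" replaced with a real Qdrant collection.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-used")

response = client.chat.completions.create(
    model="qdrant-collection-name",  # the Qdrant collection to search
    messages=[{"role": "user", "content": "カレーライスの材料は?"}],
)
print(response.choices[0].message.content)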

About

Tiny Chat is an LLM application featuring chat with RAG, a database, and MCP server capabilities. The UI is designed for Japanese users.