An intelligent database performance analyzer that uses AI to diagnose slow queries and provide actionable optimization recommendations.
✅ Production Ready: PostgreSQL slow query analysis with comprehensive AI-powered recommendations
✅ Production Ready: MongoDB slow query analysis with real-time profiler integration and multi-format reporting
✅ Multi-Cloud AI: Vertex AI, AWS Bedrock, OpenAI, Ollama
🚧 Traditional SQL: MySQL and SQL Server support in v0.4.0 (Q3 2026)
🚀 NEW in v0.2.6: Database-Direct EXPLAIN Analysis: run EXPLAIN against live PostgreSQL databases using the IQToolkit config file. No log files needed! Plus contextual AI recommendations that acknowledge already-efficient queries.
🚀 NEW in v0.2.4: Governance & version-sync patch — simplified PR/commit rules and aligned all version strings to 0.2.4. Multi-cloud AI support from v0.2.3 remains available (Vertex AI, Bedrock, OpenAI, Ollama).
🚀 NEW in v0.2.0: MongoDB support is now fully available! Use `iqtoolkit-analyzer mongodb` to analyze your MongoDB performance with real-time profiler integration, comprehensive indexing recommendations, and multi-format reports.
🚀 NEW in v0.3.0: contracts-first outputs, `config validate`, `--examples` flags, and refreshed shell completion scripts.
🚀 Current repo version: 0.3.0 (stable on PyPI). MongoDB support, 4 AI providers (Ollama + OpenAI + Vertex AI + Bedrock), and live database analysis available.
- Overview
- Key Features
- Installation
- Quick Start
- Sample Log Files
- Project Architecture
- Configuration
- Slow Query Log Setup
- Sample Output
- Command Line Options
- Troubleshooting
- Development
- System Requirements
- License
- Documentation
- Roadmap, Technical Debt & Contributing
IQToolkit Analyzer automatically analyzes your PostgreSQL and MongoDB slow query logs and provides intelligent, AI-powered optimization recommendations. It identifies performance bottlenecks, calculates impact scores, and generates detailed reports with specific suggestions for improving database performance.
Database Support:
| Database | Status | Version | Timeline |
|---|---|---|---|
| PostgreSQL | ✅ Fully Supported | v0.1.x+ | Available now |
| MongoDB | ✅ Fully Supported | v0.2.0+ | Available now |
| MySQL | 🚧 Planned | v0.4.0 | Q3 2026 |
| SQL Server | 🚧 Planned | v0.4.0 | Q3 2026 |
AI Provider Support:
| AI Provider | Status | Privacy | Speed | Cost |
|---|---|---|---|---|
| Ollama (Local) | ✅ Default | ⭐⭐⭐⭐⭐ | Fast | Free |
| Vertex AI (Gemini) | ✅ Supported | ⭐⭐⭐⭐ | Fast | GCP pricing |
| AWS Bedrock | ✅ Supported | ⭐⭐⭐⭐ | Medium | ~$3/1K |
| OpenAI GPT | ✅ Supported | ⭐⭐⭐ | Fast | ~$0.15/1K |
📢 Multi-Cloud AI: Use Ollama (default) for free local analysis, or switch to cloud providers (Vertex AI, Bedrock, OpenAI) when needed. Bedrock support is always included.
v0.1.6 Release Note: This is the final v0.1.x release with new features. It includes comprehensive architecture documentation and prepares the codebase for multi-database support coming in v0.4.0. All references have been updated from "PostgreSQL-specific" to "database log analyzer" to reflect our roadmap for MySQL and SQL Server support. Future v0.1.x releases (v0.1.7+) will contain bug fixes only - all new features move to v0.2.0+.
This repository now hosts a modular structure to support future services while keeping development fast:
iqtoolkit-analyzer/
├── iqtoolkit_analyzer/ # Current CLI package (to be service-ized)
├── iqtoolkit-contracts/ # Shared Pydantic models (Poetry package)
├── iqtoolkit-iqai/ # AI Copilot service (Poetry package)
├── iqtoolkithub/ # Orchestration gateway (Poetry package)
├── iqtoolkit-deployment/ # Helm charts and deployment assets
└── docs/ # Documentation and samples
See docs/repo/ROADMAP.md for the phase-by-phase plan.
- 🔍 Smart Log Parsing:
- PostgreSQL: Extracts slow queries from log files, supports multi-line queries and unusual characters
- MongoDB: Real-time profiler integration for live slow query detection
- 📊 Impact Analysis: Calculates query impact using duration × frequency scoring
- 🤖 AI-Powered Recommendations:
- 4 AI Providers: Ollama (default), Vertex AI, AWS Bedrock, OpenAI GPT
- Privacy Options: Local Ollama for sensitive data, cloud providers for convenience
- Enterprise Ready: AWS Bedrock and Vertex AI for cloud deployments
- 📝 Comprehensive Reports:
- PostgreSQL: Detailed Markdown reports with statistics and recommendations
- MongoDB: Multi-format reports (JSON, HTML, Markdown) with collection-level insights
- 📂 Sample Data Included: Ready-to-use sample log files for both PostgreSQL and MongoDB
- 🗂️ Multiple Formats:
- PostgreSQL: Plain, CSV, and JSON log formats
- MongoDB: Direct profiler integration with configurable thresholds
- ⚙️ Config File Support:
  - PostgreSQL: Use `.iqtoolkit-analyzer.yml` for analysis options
  - MongoDB: Use `.mongodb-config.yml` for connection and profiling settings
- 🔒 Privacy & Flexibility:
- Local AI: Ollama for privacy-first analysis (default)
- Cloud AI: Vertex AI, Bedrock, and OpenAI for enterprise deployments
- Your Choice: Switch providers based on your infrastructure and requirements
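The duration × frequency impact scoring mentioned in the feature list can be sketched in a few lines. This is illustrative only; the function and field names here are not the package's actual API:

```python
from collections import defaultdict

def impact_scores(queries):
    """Aggregate (normalized_sql, duration_ms) entries into impact scores.

    A pattern's impact is its total duration, i.e. mean duration x frequency,
    so a moderately slow query that runs constantly can outrank a one-off.
    """
    totals = defaultdict(lambda: {"count": 0, "total_ms": 0.0})
    for sql, duration_ms in queries:
        bucket = totals[sql]
        bucket["count"] += 1
        bucket["total_ms"] += duration_ms
    # Highest total duration first: the queries worth optimizing soonest
    return sorted(totals.items(), key=lambda kv: kv[1]["total_ms"], reverse=True)

entries = [
    ("SELECT * FROM orders WHERE user_id = ?", 1200.0),
    ("SELECT * FROM orders WHERE user_id = ?", 900.0),
    ("SELECT COUNT(*) FROM events", 5000.0),
]
ranked = impact_scores(entries)
print(ranked[0][0])  # -> SELECT COUNT(*) FROM events
```

Note that ranking by total rather than average duration is what lets frequency matter at all; a 5 ms query executed a million times would still float to the top.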
```bash
# PyPI - All platforms (macOS, Windows, Linux)
pip install iqtoolkit-analyzer

# Verify installation
iqtoolkit-analyzer --version
```

macOS (Homebrew):

```bash
brew tap iqtoolkit/iqtoolkit
brew install iqtoolkit-analyzer
```

Windows and Linux (pip):

```bash
pip install iqtoolkit-analyzer
```

Standalone binaries, downloadable from GitHub Releases:

- macOS: `iqtoolkit-analyzer-macos-universal.tar.gz`
- Windows: `iqtoolkit-analyzer-windows-x64.zip`
- Linux: `iqtoolkit-analyzer-linux-x64.tar.gz`

Docker:

```bash
docker pull iqtoolkit/iqtoolkit-analyzer:latest
```

📖 Full installation guide: docs/installation.md
IQToolkit Analyzer supports 4 AI providers. Choose based on your needs:
Ollama (local, default):

```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull recommended model
ollama pull arctic-text2sql-r1:7b

# Use (no API key needed)
iqtoolkit-analyzer pg analyze -c "postgresql://localhost/mydb" --use-ai --ai-provider ollama
```

Vertex AI (Gemini):

```bash
# Install dependencies
pip install iqtoolkit-analyzer[cloud-ai]

# Authenticate
gcloud auth application-default login

# Set project and region
export GCP_PROJECT="my-gcp-project"
export GCP_REGION="us-west1"

# Use
iqtoolkit-analyzer pg analyze -c "postgresql://localhost/mydb" \
  --use-ai --ai-provider vertex
```

AWS Bedrock:

```bash
# Install dependencies
pip install iqtoolkit-analyzer[cloud-ai]

# Configure AWS credentials
export AWS_ACCESS_KEY_ID="your-key"
export AWS_SECRET_ACCESS_KEY="your-secret"
export AWS_DEFAULT_REGION="us-east-1"

# Use
iqtoolkit-analyzer pg analyze -c "postgresql://localhost/mydb" \
  --use-ai --ai-provider bedrock
```

OpenAI:

```bash
# Set API key
export OPENAI_API_KEY="your-api-key"

# Use
iqtoolkit-analyzer pg analyze -c "postgresql://localhost/mydb" \
  --use-ai --ai-provider openai
```

- 🔧 Extensible: Future-ready architecture supports multiple databases and AI providers
⚡ Ready to analyze PostgreSQL or MongoDB slow queries right now? Follow the installation below.
🔮 Planning for MySQL/SQL Server? Join the early feedback program to shape v0.4.0 development!
```bash
git clone https://github.com/iqtoolkit/iqtoolkit-analyzer.git
cd iqtoolkit-analyzer

# Install Poetry (pick one)
## macOS (Homebrew)
brew update && brew install poetry

## macOS/Linux (official installer)
curl -sSL https://install.python-poetry.org | python3 -
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc && source ~/.zshrc

## Windows (PowerShell)
powershell -ExecutionPolicy Bypass -NoProfile -Command "(Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | py -"

## Cross-platform (pipx)
pipx install poetry

# Install dependencies for shared/contracts and services
cd iqtoolkit-contracts && poetry install && cd -
cd iqtoolkit-iqai && poetry install && cd -
cd iqtoolkithub && poetry install && cd -

# Analyzer CLI (root package)
poetry install --with dev,test
```

Option A: Ollama (recommended; local or remote, private, no API key needed) ⭐
```bash
# Local setup (see docs/5-minute-ollama-setup.md for details)
curl -LsSf https://ollama.com/install.sh | sh
ollama serve
ollama pull a-kore/Arctic-Text2SQL-R1-7B  # SQL-specialized model (recommended)

# Copy example config and customize
cp .iqtoolkit-analyzer.yml.example .iqtoolkit-analyzer.yml
# Edit: set llm_provider: ollama

# OR use a remote Ollama server
export OLLAMA_HOST=http://your-server-ip:11434
# Or add to .iqtoolkit-analyzer.yml:
# ollama_host: http://your-server-ip:11434
```

Option B: OpenAI (cloud, requires API key)

```bash
export OPENAI_API_KEY="your-openai-api-key-here"
# Config will use OpenAI by default if no .iqtoolkit-analyzer.yml exists
```

💡 Tip: Ollama can run locally or on a remote server, so your queries stay within your infrastructure. Perfect for sensitive production data. See the Ollama Setup Guide for local and remote configuration details.
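Before pointing the analyzer at a remote Ollama server, it can help to confirm the server is reachable and has a model pulled. A minimal, dependency-free sketch using Ollama's public `GET /api/tags` endpoint (the helper names are ours, not part of iqtoolkit-analyzer):

```python
import json
import os
import urllib.request

def resolve_host(env=None):
    """Pick the Ollama base URL from OLLAMA_HOST, defaulting to localhost."""
    env = os.environ if env is None else env
    return env.get("OLLAMA_HOST", "http://localhost:11434")

def list_ollama_models(host=None):
    """Return installed model names via Ollama's GET /api/tags endpoint."""
    host = host or resolve_host()
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]

if __name__ == "__main__":
    try:
        print(list_ollama_models())
    except OSError as exc:  # URLError subclasses OSError
        print(f"Ollama not reachable: {exc}")
```

An empty model list means the server is up but nothing has been pulled yet; run `ollama pull` before analysis.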
```bash
# Quick health check
iqtoolkit-analyzer pg health -c "postgresql://user:pass@localhost/mydb"

# Full analysis with AI recommendations
iqtoolkit-analyzer pg analyze -c "postgresql://user:pass@localhost/mydb" \
  --use-ai --ai-provider ollama

# Find slow queries (requires pg_stat_statements)
iqtoolkit-analyzer pg slow-queries -c "postgresql://user:pass@localhost/mydb" --limit 20

# Detect table bloat
iqtoolkit-analyzer pg bloat -c "postgresql://user:pass@localhost/mydb"

# Save analysis to JSON
iqtoolkit-analyzer pg analyze -c "postgresql://user:pass@localhost/mydb" -o results.json
```

```bash
# Test connection first
iqtoolkit-analyzer mongo test-connection -c "mongodb://localhost:27017"

# Enable profiling on a database
iqtoolkit-analyzer mongo profile -c "mongodb://localhost:27017" -d myapp --level 1

# Analyze a database
iqtoolkit-analyzer mongo analyze -c "mongodb://localhost:27017" -d myapp

# Generate reports in multiple formats
iqtoolkit-analyzer mongo analyze -c "mongodb://localhost:27017" -d myapp \
  --output ./reports --format json html markdown

# Continuous monitoring (every 5 minutes)
iqtoolkit-analyzer mongo monitor -c "mongodb://localhost:27017" -d myapp --interval 5
```

The docs/sample_logs/ directory contains database slow query log examples for testing and demonstration:
- PostgreSQL: Real sample logs from 100M record database operations with authentic slow queries → View samples
- MongoDB: Complete profiler integration with real-time slow query detection and comprehensive optimization recommendations → View samples
- MySQL: Placeholder directory with configuration examples and feedback collection → View samples
- SQL Server: Placeholder directory with Extended Events samples and configuration → View samples
🎯 Early Feedback Opportunities:
- MySQL Users: Share your slow query log formats and challenges
- SQL Server DBAs: Tell us about your Extended Events setup and pain points
`postgresql-2025-10-28_192816.log.txt`: Contains authentic slow queries from a 100M record database, including:
- Complex aggregation queries (15.5+ seconds): Statistical calculations across 40M records
- Expensive correlated subqueries (109+ seconds): Text pattern matching with per-row subqueries
- Mathematical operations with window functions (209+ seconds): Multiple window functions with trigonometric calculations
- Multiple query patterns that benefit from different optimization strategies (indexes, query rewrites, JOIN optimizations)
Sample log files use the .txt extension instead of .log to prevent them from being excluded by .gitignore patterns that typically ignore *.log files. This ensures the sample data remains available in the repository for testing and demonstration purposes.
- Real Performance Issues: Authentic slow queries from actual 100M record database operations
- Variety of Problems: Different types of performance bottlenecks (missing indexes, correlated subqueries, expensive window functions)
- AI-Ready: Perfect for testing AI recommendation quality with real optimization opportunities
- Educational: Great examples for learning PostgreSQL performance optimization techniques
- Range of Complexity: From 2-second queries to 209-second extreme cases
- Aggregation with Mathematical Functions (15.5s)
  - `AVG`, `STDDEV`, `COUNT` operations on large datasets
  - Range filtering across 40M records
  - Perfect for testing index recommendations
- Correlated Subqueries with Pattern Matching (109s)
  - `LIKE` operations with multiple patterns
  - Correlated subquery executing for each row
  - Demonstrates JOIN optimization opportunities
- Window Functions with Mathematical Operations (209s)
  - Multiple `ROW_NUMBER()`, `RANK()`, `LAG()`, `LEAD()` functions
  - Complex mathematical calculations (`SQRT`, `SIN`, `COS`, `LOG`)
  - Heavy sorting and partitioning operations
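For reference, entries in these sample files follow PostgreSQL's standard slow-query format, where each logged statement reads `duration: <ms> ms  statement: <sql>`. A minimal sketch of extracting such entries (illustrative only; the package ships its own parser):

```python
import re

# PostgreSQL logs slow statements as "duration: <ms> ms  statement: <sql>"
DURATION_RE = re.compile(
    r"duration:\s+(?P<ms>\d+(?:\.\d+)?)\s+ms\s+statement:\s+(?P<sql>.*)"
)

def extract_slow_queries(lines, min_duration_ms=1000.0):
    """Yield (duration_ms, sql) for log lines at or above the threshold."""
    for line in lines:
        m = DURATION_RE.search(line)
        if m and float(m.group("ms")) >= min_duration_ms:
            yield float(m.group("ms")), m.group("sql").strip()

sample = [
    "2025-10-28 19:31:23 UTC [81] LOG:  duration: 109234.020 ms  statement: SELECT DISTINCT l1.random_number FROM large_test_table l1;",
    "2025-10-28 19:31:24 UTC [81] LOG:  duration: 12.500 ms  statement: SELECT 1;",
]
print(list(extract_slow_queries(sample)))  # only the 109,234 ms entry survives
```

Real parsers also have to stitch together multi-line statements; this sketch only handles single-line entries.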
- Parse → Extract slow queries from PostgreSQL logs or the MongoDB profiler
- Analyze → Calculate impact scores and normalize queries
- AI Analysis → Generate optimization recommendations using AI models
- Report → Create comprehensive Markdown analysis report
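The four stages above can be sketched end-to-end in a few lines. Everything here is illustrative; in particular, the stubbed `ai_review` stands in for a real provider call:

```python
import re
from dataclasses import dataclass

@dataclass
class Finding:
    sql: str
    duration_ms: float
    recommendation: str = ""

def parse(log_text):    # 1. Parse: pull slow statements out of the log
    pat = re.compile(r"duration: ([\d.]+) ms\s+statement: (.+)")
    return [Finding(sql=m.group(2), duration_ms=float(m.group(1)))
            for m in pat.finditer(log_text)]

def analyze(findings):  # 2. Analyze: rank by impact (duration alone here)
    return sorted(findings, key=lambda f: f.duration_ms, reverse=True)

def ai_review(findings):  # 3. AI Analysis: stub in place of a model call
    for f in findings:
        f.recommendation = "Review indexes and rewrite hot paths."
    return findings

def report(findings):   # 4. Report: render a Markdown summary
    lines = ["# Slow Query Analysis Report"]
    for i, f in enumerate(findings, 1):
        lines.append(f"## Query #{i} ({f.duration_ms:.0f} ms)\n{f.sql}\n> {f.recommendation}")
    return "\n".join(lines)

log = "LOG:  duration: 2500.0 ms  statement: SELECT * FROM t;"
print(report(ai_review(analyze(parse(log)))))
```

Keeping the stages as plain functions over a shared `Finding` type is what makes it easy to slot in new databases or AI providers later.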
See docs/repo/ROADMAP.md for milestones and central-plan phases. For the exact current version, see the repo `VERSION` file.
See the full guide: docs/getting-started.md
Enable slow query logging in your `postgresql.conf`:

```ini
# Log queries taking longer than 1 second
log_min_duration_statement = 1000

# Enable logging collector
logging_collector = on

# Set log directory (relative to data_directory)
log_directory = 'log'

# Log file naming pattern
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'

# What to log
log_statement = 'none'
log_duration = off
```

Or configure dynamically:

```sql
-- Enable for current session
SET log_min_duration_statement = 1000;

-- Enable globally (a reload is enough; no restart needed)
ALTER SYSTEM SET log_min_duration_statement = 1000;
SELECT pg_reload_conf();
```

For a step-by-step guide to enabling slow query logging, running example queries, and analyzing logs, see:
This guide covers:
- Editing postgresql.conf
- Session-level logging
- Running example slow queries
- Collecting and analyzing logs with IQToolkit Analyzer
MongoDB analysis uses the built-in profiler to collect slow operation data. Enable profiling for your databases:
```javascript
// Enable profiling for operations slower than 100ms
db.setProfilingLevel(2, {slowms: 100})

// Check profiling status
db.getProfilingStatus()

// View recent slow operations
db.system.profile.find().limit(5).sort({ts: -1}).pretty()
```

Create a `.mongodb-config.yml` configuration file:
```yaml
# MongoDB Connection
connection:
  connection_string: "mongodb://localhost:27017"
  connection_timeout_ms: 5000

# Performance Thresholds
thresholds:
  slow_threshold_ms: 100.0
  very_slow_threshold_ms: 1000.0
  critical_threshold_ms: 5000.0

# Analysis Settings
databases_to_monitor: ["myapp", "analytics"]
exclude_databases: ["admin", "config", "local"]

# Report Settings
reporting:
  formats: ["json", "html", "markdown"]
  include_query_samples: true
  max_query_samples: 5
```

For complete MongoDB setup instructions, see: docs/mongodb-guide.md
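Under the hood, thresholds like these translate into a filter over each monitored database's `system.profile` collection. A dependency-free sketch of building that filter (the pymongo usage is left commented so the snippet runs without a server; `millis`, `op`, and `ts` are standard profiler fields):

```python
def profile_query(slow_ms=100.0, exclude_ops=("command",)):
    """Build a system.profile filter matching slow operations."""
    return {"millis": {"$gte": slow_ms}, "op": {"$nin": list(exclude_ops)}}

print(profile_query(100.0))

# With pymongo installed, the filter would be used roughly like this:
# from pymongo import MongoClient
# client = MongoClient("mongodb://localhost:27017")
# slow_ops = (client["myapp"]["system.profile"]
#             .find(profile_query(100.0))
#             .sort("ts", -1))
```

Excluding `command` operations is a common (but optional) way to keep administrative chatter out of slow-query reports.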
| Variable | Description | Default | Required |
|---|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | None | For OpenAI provider |
| `OPENAI_MODEL` | GPT model to use | `gpt-4o-mini` | Optional |
| `OPENAI_BASE_URL` | Custom OpenAI endpoint | `https://api.openai.com/v1` | Optional |
Create a `.iqtoolkit-analyzer.yml` file to customize behavior:

```yaml
# Default provider (matches keys in `providers`)
default_provider: ollama

# AI providers (both `providers` and legacy `llm_providers` keys are supported)
providers:
  ollama:
    host: http://localhost:11434
    model: llama3
  openai:
    api_key: ${OPENAI_API_KEY}
    model: gpt-4o-mini

# Analysis Options
log_format: csv
top_n: 10
output: reports/report.md
min_duration: 1000

# LLM Configuration
llm_temperature: 0.3
max_tokens: 300
llm_timeout: 30
```

See the Configuration Guide for all options and Ollama Local Setup for local AI setup.
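The `${OPENAI_API_KEY}` reference in the config is resolved from the environment when the file is loaded, which keeps secrets out of version control. A minimal sketch of that expansion (illustrative; the package's actual loader may differ):

```python
import os
import re

_ENV_REF = re.compile(r"\$\{(\w+)\}")

def expand_env(value, env=None):
    """Replace ${VAR} references in a config string with environment values."""
    env = os.environ if env is None else env
    # Unset variables expand to an empty string rather than raising
    return _ENV_REF.sub(lambda m: env.get(m.group(1), ""), value)

print(expand_env("Bearer ${OPENAI_API_KEY}", {"OPENAI_API_KEY": "sk-test"}))
# Bearer sk-test
```

Expanding to an empty string on a missing variable is one design choice; a stricter loader might raise instead so misconfiguration fails fast.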
# Slow Query Analysis Report
## Summary
- **Total queries analyzed**: 8
- **Slow queries found**: 4
- **Total duration**: 336,175.06 ms
- **Most impactful query**: Mathematical operations with window functions
## Top Slow Queries
### Query #1: Mathematical Operations with Window Functions (Impact Score: 209,297.06)
**Duration**: 209,297.06 ms | **Frequency**: 1 | **First seen**: 2025-10-28 20:04:57
```sql
SELECT id, random_number, random_text, created_at,
SQRT(ABS(random_number)::numeric) as sqrt_abs_number,
LOG(GREATEST(random_number, 1)::numeric) as log_number,
SIN(random_number::numeric / 180000.0 * PI()) as sin_degrees,
ROW_NUMBER() OVER (ORDER BY random_number) as row_num_asc,
AVG(random_number) OVER (ROWS BETWEEN 1000 PRECEDING AND 1000 FOLLOWING) as moving_avg
FROM large_test_table
WHERE random_number BETWEEN 250000 AND 750000
AND (id % 7 = 0 OR id % 11 = 0 OR id % 13 = 0)
ORDER BY SQRT(ABS(random_number)::numeric) DESC
LIMIT 200;
```

🤖 AI Recommendation:
This query suffers from expensive mathematical operations and multiple window functions. Create a composite index on (random_number, id) and consider materializing complex calculations. The multiple window functions could be optimized by combining operations. Expected improvement: 70-85% faster execution.
**Duration**: 109,234.02 ms | **Frequency**: 1 | **First seen**: 2025-10-28 19:31:23

```sql
SELECT DISTINCT l1.random_number, l1.random_text, l1.created_at,
       (SELECT COUNT(*) FROM large_test_table l2 WHERE l2.random_number = l1.random_number)
FROM large_test_table l1
WHERE l1.random_text LIKE '%data_555%' OR l1.random_text LIKE '%data_777%'
ORDER BY l1.random_number DESC LIMIT 30;
```
🤖 AI Recommendation:
Replace the correlated subquery with a JOIN or window function. Create indexes on random_text (consider GIN for pattern matching) and random_number. The LIKE operations with leading wildcards are expensive - consider full-text search if applicable. Expected improvement: 60-80% faster execution.
## 🔧 Command Line Reference
Run `iqtoolkit-analyzer --help` for the full command tree, or `iqtoolkit-analyzer <command> --help` for per-command options.
### PostgreSQL (`pg`)
| Command | Description |
|---------|-------------|
| `pg analyze -c <conn>` | Full analysis with optional AI (`--use-ai --ai-provider ollama\|openai\|bedrock\|vertex`) |
| `pg health -c <conn>` | Quick health check with severity summary |
| `pg slow-queries -c <conn>` | List slow queries from `pg_stat_statements` |
| `pg bloat -c <conn>` | Detect table bloat |
| `pg settings -c <conn>` | Generate an HTML settings report |
### MongoDB (`mongo`)
| Command | Description |
|---------|-------------|
| `mongo analyze -c <conn> -d <db>` | Analyze slow queries; `--output <dir> --format json\|html\|markdown` |
| `mongo monitor -c <conn> -d <db>` | Continuous monitoring (`--interval <mins>`, Ctrl+C to stop) |
| `mongo profile -c <conn> -d <db>` | Set profiling level (`--level 0\|1\|2`) |
| `mongo test-connection -c <conn>` | Test connectivity and optionally verify profiling |
| `mongo config create` | Create a sample `mongodb_config.yml` |
| `mongo config validate -c <file>` | Validate a MongoDB config file |
| `mongo config show -c <file>` | Show resolved config (redacted) |
### Global
| Command | Description |
|---------|-------------|
| `config validate` | Validate `.iqtoolkit-analyzer.yml` |
| `config show` | Show resolved global config |
| `config create` | Create a sample `.iqtoolkit-analyzer.yml` |
## 🐛 Troubleshooting
### Common Issues
#### PostgreSQL Issues
**"No slow queries found"**
```bash
# Check if log file contains duration entries
grep -i "duration:" your_log_file.log

# Verify PostgreSQL logging is enabled
psql -c "SHOW log_min_duration_statement;"
```

**"Permission denied on log file"**

```bash
# Fix file permissions
chmod 644 /path/to/postgresql.log
```

#### MongoDB Issues

**"Connection failed"**

```bash
# Test MongoDB connection
mongosh "mongodb://localhost:27017" --eval "db.adminCommand('ismaster')"

# Check if profiler is enabled
mongosh "mongodb://localhost:27017/mydb" --eval "db.getProfilingStatus()"
```

**"No profiler data found"**

```bash
# Enable MongoDB profiling for slow operations (>100ms)
mongosh "mongodb://localhost:27017/mydb" --eval "db.setProfilingLevel(2, {slowms: 100})"

# Check system.profile collection
mongosh "mongodb://localhost:27017/mydb" --eval "db.system.profile.count()"
```

**"OpenAI API Error"**

```bash
# Verify API key is set
echo $OPENAI_API_KEY

# Test API connectivity
curl -H "Authorization: Bearer $OPENAI_API_KEY" \
  https://api.openai.com/v1/models
```

💡 Alternative: If you prefer local AI processing for privacy, use the default Ollama provider instead of a cloud provider.
```bash
# Copy the log somewhere readable if direct access is blocked
cp /var/log/postgresql/postgresql.log ~/my_log.log
```
### Log File Locations
| Installation Method | Typical Log Location |
|-------------------|---------------------|
| **Homebrew (macOS)** | `/opt/homebrew/var/postgresql@*/log/` |
| **Ubuntu/Debian** | `/var/log/postgresql/` |
| **CentOS/RHEL** | `/var/lib/pgsql/*/data/log/` |
| **Docker** | `/var/lib/postgresql/data/log/` |
| **Windows** | `C:\Program Files\PostgreSQL\*\data\log\` |
## 🧪 Development
### Make Commands Reference
See [docs/make-commands.md](docs/make-commands.md) for the full list of Make targets and usage.
### Quick Development Setup
```bash
# Clone and setup (prefers Poetry, falls back to venv/pip)
git clone https://github.com/iqtoolkit/iqtoolkit-analyzer.git
cd iqtoolkit-analyzer
make setup

# Or traditional approach (manual)
python -m venv .venv
source .venv/bin/activate
pip install -e .[dev,test]

make test    # Run all tests
make lint    # Run linting
make format  # Format code

# Traditional approach
pytest tests/ -v
pytest tests/ --cov=iqtoolkit_analyzer --cov-report=html
```

See the full Make targets reference: docs/make-commands.md
htmlcov is the folder where the HTML coverage report is generated when you run tests with coverage reporting. In this project:
- How it’s generated:
- Pytest is configured in pyproject.toml to produce coverage reports, including HTML, via addopts: --cov=iqtoolkit_analyzer --cov-report=term-missing --cov-report=html --cov-report=xml
- The HTML output directory is configured under [tool.coverage.html] as directory = "htmlcov".
- You’ll typically get it by running make test (which runs pytest with those flags) or pytest ... --cov-report=html.
- Where to view it:
- Open htmlcov/index.html in your browser to see per-file and line-level coverage.
- Is it excluded from Git?
- Yes. .gitignore contains htmlcov/ so the generated report is not committed.
- How to clean it up:
- make clean removes htmlcov/ along with other build/test artifacts.
```bash
# With Makefile (recommended)
make format    # Format with ruff
make lint      # Lint with ruff + mypy
make validate  # Full validation suite

# Traditional approach
ruff format .
ruff check .
poetry run mypy iqtoolkit_analyzer
```

Mypy is a static type checker for Python. It analyzes your code without executing it to catch type-related errors early and to make the codebase easier to maintain.
In this repository, mypy helps to:
- Prevent common bugs by verifying function inputs/outputs match their annotations
- Enforce consistent, explicit types (useful in a data-heavy tool like this)
- Improve editor/IDE auto-completion and refactoring safety
How it’s configured here:
- Configuration lives in pyproject.toml under [tool.mypy]
- We enable a relatively strict set of options:
- `disallow_untyped_defs`, `disallow_incomplete_defs`, `disallow_untyped_decorators`
- `no_implicit_optional`, `warn_redundant_casts`, `warn_unused_ignores`, `warn_no_return`, `warn_unreachable`
- `strict_equality` and `check_untyped_defs`
- Third‑party modules with incomplete type hints (like openai, dotenv) are allowed via ignore_missing_imports overrides.
How to run it:
- Recommended: make lint (runs ruff check then mypy)
- Directly: poetry run mypy iqtoolkit_analyzer
Common fixes:
- Add or refine type hints: parameters, return types, and local variables when useful
- Use Optional[T] (or | None) when something can be None
- Narrow types with isinstance checks before using values
- For one-off unavoidable cases, use a targeted suppression: # type: ignore[code]
Type stubs:
- If a dependency lacks types, prefer installing its types (e.g., types-pyyaml)
- If none exist, consider adding minimal annotations around your usage or a local stub package later
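To make the `Optional` advice concrete, here is a small example that passes under a strict configuration like the one described (all names are illustrative, not from the codebase):

```python
from typing import Optional

def first_slow_query(durations_ms: list[float],
                     threshold_ms: float = 1000.0) -> Optional[float]:
    """Return the first duration over the threshold, or None if there is none."""
    for d in durations_ms:
        if d >= threshold_ms:
            return d
    return None

def describe(durations_ms: list[float]) -> str:
    hit = first_slow_query(durations_ms)
    # mypy forces this None check before `hit` may be used as a float
    if hit is None:
        return "no slow queries"
    return f"first hit: {hit:.1f} ms"

print(describe([120.0, 2500.0, 90.0]))  # first hit: 2500.0 ms
```

Without the `is None` narrowing, the f-string line would fail type checking under `no_implicit_optional` and friends, which is exactly the class of bug these options exist to catch.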
```bash
# Test the parser
python -c "from iqtoolkit_analyzer import parse_postgres_log; print(len(parse_postgres_log('sample_logs/postgresql-2025-10-28_192816.log.txt')))"

# Test full pipeline with sample data
python -m iqtoolkit_analyzer sample_logs/postgresql-2025-10-28_192816.log.txt --output test_report.md

# Verify AI recommendations are generated
grep -A 5 "🤖 AI Recommendation" test_report.md
```

- Python: 3.11 or higher
- Memory: 512MB+ available RAM
- Storage: 50MB+ free space
- Network: Internet connection for cloud AI providers (not required for local Ollama)
- Platforms: macOS, Linux, Windows
- `openai>=1.0.0`: OpenAI API client
- `python-dotenv>=0.19.0`: Environment variable management
- `pandas>=2.0.0`: Data analysis and CSV/JSON log support
- `pyyaml>=6.0.0`: YAML config file support
- `tqdm>=4.0.0`: Progress bars for large log analysis
- `pytest`, `pytest-cov`: Testing and coverage (dev)
- `ruff`, `mypy`, `pre-commit`: Code quality (dev)
- `argparse`: Command line parsing (built-in)
- `re`, `json`, `logging`: Standard library modules
MIT License - see LICENSE file for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
```bash
# Clone your fork
git clone https://github.com/yourusername/iqtoolkit-analyzer.git
cd iqtoolkit-analyzer

# Complete development environment setup
bash scripts/setup-dev-environment.sh

# Install git hooks for automated version management
bash scripts/setup-hooks.sh

# Verify everything works
make check-version
make test
```

Database Support Roadmap:
- v0.2.8 (Jan-Feb 2026): Stabilization release on PyPI
- v0.3.0rc3 (Feb 2026): Release candidate on PyPI
- v0.3.0a6 (Feb 2026): Alpha release on PyPI for validation
- v0.3.0 (Feb 2026): Modular architecture (contracts), analyzer refactor, CLI UX upgrades
- v0.4.0 (Q3 2026): MySQL and SQL Server support
AI Provider Evolution:
- v0.3.0 CURRENT: 4 providers supported (Ollama, OpenAI, Vertex AI, Bedrock); Ollama is the default
- v0.4.0 NEXT: Provider UX + contract-based integration cleanup
- v0.4.0+ FUTURE: Provider registry and multi-provider fallback
⚠️ Privacy Note: Default Ollama runs locally. Cloud providers send prompts to their APIs, so choose a provider that matches your data sensitivity.
When asked about new features:
- For v0.3.0: "Released: modular architecture and analyzer refactor."
- For MySQL/SQL Server: "Planned for v0.4.0 (Q3 2026) after v0.3.0 stabilization."
For complete documentation and guides, see our Documentation Index 📖
Quick Links:
- 🚀 Getting Started - New user tutorial
- 🤝 Contributing Guide - How to contribute
- ⚙️ Configuration - Setup and config options
- 💡 PostgreSQL Examples - Real usage examples
- ❓ FAQ - Common questions and troubleshooting
- See docs/repo/ROADMAP.md for the full project roadmap, timeline, and community requests.
- See docs/repo/TECHNICAL_DEBT.md for known limitations and areas for future improvement.
- See docs/repo/CONTRIBUTING.md for contribution guidelines and code standards.
- See docs/repo/VERSION_MANAGEMENT.md for automated version synchronization.
- See docs/repo/BRANCH_PROTECTION.md for repository governance and branch protection rules.
- See docs/repo/ARCHITECTURE.md for detailed system architecture and extension points.
Made with ❤️ for Database performance optimization