Book Translator
A platform for translating books and large text documents, built around a two-step process for better quality.
The tool processes text files using Ollama LLM models with a two-stage approach: a primary translation followed by AI self-reflection and refinement. It is suitable for translators, publishers, authors, researchers, and content creators who need to translate large text documents.
Features:
- Multiple languages, including English, Russian, Spanish, French, German, Italian, Portuguese, Chinese, and Japanese
- Genre-specific modes: fiction, technical, academic, business, poetry
- Real-time translation progress tracking for both stages
- Translation history and status monitoring
- Automatic error recovery and retry mechanisms
- Multi-format export: TXT, PDF, EPUB
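The two-stage translate-then-refine flow can be sketched roughly as follows. This is an illustrative sketch, not the actual `translator.py` implementation: the function names and prompts are assumptions, and only the Ollama `/api/generate` HTTP call reflects Ollama's standard API.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def ollama_generate(prompt, model="gpt-oss:20b"):
    """Send a single non-streaming generation request to a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def translate_two_stage(text, source="English", target="Spanish",
                        generate=ollama_generate):
    """Stage 1: draft translation. Stage 2: self-reflection and refinement."""
    draft = generate(
        f"Translate the following {source} text into {target}:\n\n{text}"
    )
    refined = generate(
        f"Review this {target} translation of a {source} text and return an "
        f"improved version, fixing any errors in meaning, grammar, or style.\n\n"
        f"Original:\n{text}\n\nDraft translation:\n{draft}"
    )
    return refined
```

Injecting `generate` as a parameter keeps the pipeline testable without a running Ollama server.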
- Install Ollama

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

- Clone the repository

```shell
git clone https://github.com/KazKozDev/book-translator.git
cd book-translator
```

- Install dependencies

```shell
pip install -r requirements.txt
```

- Pull an Ollama model (choose any you prefer)

```shell
# Example with gpt-oss:20b
ollama pull gpt-oss:20b

# Or use other models:
# ollama pull llama3.2
# ollama pull qwen2.5
# ollama pull gemma3:12b
# ollama pull phi3
```

- Start the application
Option 1: Quick Launch (macOS)

```shell
./Launch\ Book-Translator.command
```

This will automatically:
- Kill any process on port 5001
- Start the Flask server
- Open http://localhost:5001 in your browser
- Clear translation cache
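The port-cleanup step can also be done by hand. This is a sketch of what the launch script presumably does, not its exact contents:

```shell
# Free port 5001 if a previous server instance is still listening.
PID=$(lsof -ti tcp:5001 || true)
if [ -n "$PID" ]; then
  kill "$PID"
fi
```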
Option 2: Manual start

```shell
python translator.py
# Then open http://localhost:5001 in your browser
```

Default settings in translator.py:
- Port: 5001
- Chunk size: 1000 characters
- Temperature varies by genre (0.3-0.8)
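The defaults above can be illustrated with a small sketch. The chunking helper and the genre-to-temperature map here are assumptions for illustration; the source only documents the 1000-character chunk size and the 0.3-0.8 temperature range, not the per-genre values.

```python
CHUNK_SIZE = 1000  # characters per translation chunk (documented default)

# Hypothetical genre -> temperature map; only the 0.3-0.8 range is documented.
GENRE_TEMPERATURE = {
    "technical": 0.3,
    "academic": 0.4,
    "business": 0.4,
    "fiction": 0.7,
    "poetry": 0.8,
}

def split_into_chunks(text, size=CHUNK_SIZE):
    """Split text into chunks of at most `size` characters,
    preferring to break at paragraph boundaries."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        candidate = f"{current}\n\n{para}" if current else para
        if len(candidate) <= size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # A single paragraph longer than `size` is hard-split.
            while len(para) > size:
                chunks.append(para[:size])
                para = para[size:]
            current = para
    if current:
        chunks.append(current)
    return chunks
```

Breaking at paragraph boundaries keeps each chunk coherent, which matters when every chunk is translated independently.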
MIT License - see LICENSE
If you like this project, please give it a star ⭐. For questions, feedback, or support, open an issue or submit a PR.
Note: The previous version of this project is available in the archive-old-version branch.
