
TranslateBook with LLM (TBL)

Translate books, subtitles, and documents using AI - locally or in the cloud.

Formats: EPUB, SRT, TXT | Providers: Ollama (local), OpenRouter, OpenAI, Gemini

📊 Translation Quality Benchmarks — Find the best model for your target language.


Quick Start

Prerequisites: Python 3.8+, Ollama, Git

git clone https://github.com/hydropix/TranslateBookWithLLM.git
cd TranslateBookWithLLM
ollama pull qwen3:14b    # Download a model

# Windows
start.bat

# Mac/Linux
chmod +x start.sh && ./start.sh

The web interface opens at http://localhost:5000


Choosing a Model

| VRAM | Model | Best For |
|------|-------|----------|
| 8 GB | gemma3:12b | Spanish, Portuguese, and other European languages |
| 24 GB | mistral-small:24b | French |
| 24 GB | gemma3:27b | Japanese, Korean, Arabic, and most other languages |
| 24 GB | qwen3:30b-instruct | Chinese (Simplified/Traditional) |

📊 Full benchmarks — 11 models × 19 languages with accuracy, fluency & style scores.
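
For example, to translate an EPUB into Japanese with the 27B model from the table above (a sketch using only the documented flags; the model choice and output filename are up to you):

ollama pull gemma3:27b
python translate.py -i book.epub -o book_ja.epub -sl English -tl Japanese -m gemma3:27b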


LLM Providers

| Provider | Type | Setup |
|----------|------|-------|
| Ollama | Local | ollama.com |
| OpenAI-Compatible | Local | llama.cpp, LM Studio, vLLM, LocalAI... |
| OpenRouter | Cloud (200+ models) | openrouter.ai/keys |
| OpenAI | Cloud | platform.openai.com |
| Gemini | Cloud | Google AI Studio |

OpenAI-Compatible servers: use --provider openai with your server's endpoint (e.g., llama.cpp: http://localhost:8080/v1/chat/completions; LM Studio: http://localhost:1234/v1/chat/completions).
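
As a quick sanity check before translating, you can query the endpoint directly. A minimal sketch assuming a llama.cpp server on port 8080 (adjust the port and model name to whatever your server is actually running):

curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "your-model", "messages": [{"role": "user", "content": "Hello"}]}'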

See docs/PROVIDERS.md for detailed setup instructions.


EPUB Translation Modes

| Mode | Use When |
|------|----------|
| Standard (default) | Large model (>12B); formatting is important |
| Fast Mode (--fast-mode) | Small model (≤12B); reader compatibility issues; simpler output is preferable |

Fast Mode strips HTML and outputs EPUB 2.0 for maximum compatibility. See docs/FAST_MODE.md for details.


Command Line

# Basic
python translate.py -i book.epub -o book_zh.epub -sl English -tl Chinese

# With Fast Mode
python translate.py -i book.epub -o book_zh.epub --fast-mode

# With OpenRouter
python translate.py -i book.txt -o book_fr.txt --provider openrouter \
    --openrouter_api_key YOUR_KEY -m anthropic/claude-sonnet-4

# With OpenAI
python translate.py -i book.txt -o book_fr.txt --provider openai \
    --openai_api_key YOUR_KEY -m gpt-4o

# With Gemini
python translate.py -i book.txt -o book_fr.txt --provider gemini \
    --gemini_api_key YOUR_KEY -m gemini-2.0-flash

# With local OpenAI-compatible server (llama.cpp, LM Studio, vLLM, etc.)
python translate.py -i book.txt -o book_fr.txt --provider openai \
    --api_endpoint http://localhost:8080/v1/chat/completions -m your-model

Main Options

| Option | Description | Default |
|--------|-------------|---------|
| -i, --input | Input file | Required |
| -o, --output | Output file | Auto |
| -sl, --source_lang | Source language | English |
| -tl, --target_lang | Target language | Chinese |
| -m, --model | Model name | mistral-small:24b |
| --provider | ollama / openrouter / openai / gemini | ollama |
| --fast-mode | Fast Mode for EPUB | Off |

See docs/CLI.md for all options and examples.


Configuration (.env)

Copy .env.example to .env and edit:

# Provider
LLM_PROVIDER=ollama

# Ollama
API_ENDPOINT=http://localhost:11434/api/generate
DEFAULT_MODEL=mistral-small:24b

# API Keys (if using cloud providers)
OPENROUTER_API_KEY=sk-or-v1-...
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...

# Performance
MAIN_LINES_PER_CHUNK=25
REQUEST_TIMEOUT=900
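
MAIN_LINES_PER_CHUNK appears to set how many lines are sent to the model per request, and REQUEST_TIMEOUT how long to wait for each response (the default of 900 suggests seconds). If long chapters time out on slower hardware, smaller chunks and a longer timeout usually help; the values below are illustrative, not project defaults:

# Smaller chunks and a longer timeout for slower local models (illustrative values)
MAIN_LINES_PER_CHUNK=10
REQUEST_TIMEOUT=1800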

Docker

docker build -t translatebook .
docker run -p 5000:5000 -v $(pwd)/translated_files:/app/translated_files translatebook
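
If Ollama runs on the host rather than inside the container, localhost won't resolve to it from within Docker. A sketch that overrides the endpoint via an environment variable, assuming the container honors the same settings as .env and that host.docker.internal resolves in your Docker setup:

docker run -p 5000:5000 \
    -e API_ENDPOINT=http://host.docker.internal:11434/api/generate \
    -v $(pwd)/translated_files:/app/translated_files translatebook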

See DOCKER.md for more options.


Troubleshooting

| Problem | Solution |
|---------|----------|
| Ollama won't connect | Check that Ollama is running; test with curl http://localhost:11434/api/tags (see the check below) |
| Model not found | Run ollama list, then ollama pull model-name |
| Timeouts | Increase REQUEST_TIMEOUT or reduce MAIN_LINES_PER_CHUNK |
| EPUB won't open | Try --fast-mode |
| Placeholders in output (⟦TAG0⟧) | Use --fast-mode |
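
For connection or model problems, a direct request to Ollama's generate endpoint narrows things down quickly (shown with the model pulled in Quick Start; substitute your own). It should return JSON rather than an error:

curl http://localhost:11434/api/generate \
    -d '{"model": "qwen3:14b", "prompt": "Hello", "stream": false}'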

See docs/TROUBLESHOOTING.md for more solutions.


Documentation

| Guide | Description |
|-------|-------------|
| docs/PROVIDERS.md | Detailed provider setup (Ollama, LM Studio, OpenRouter, OpenAI, Gemini) |
| docs/FAST_MODE.md | EPUB Fast Mode explained |
| docs/CLI.md | Complete CLI reference |
| docs/TROUBLESHOOTING.md | Problem solutions |
| DOCKER.md | Docker deployment guide |

License: AGPL-3.0
