# StashHandler
A Docker service that bridges qBittorrent and Stash. It polls qBittorrent for completed torrents (filtered by category), hashes media files, deduplicates them against a SQLite database, and copies new files into Stash's import directory — preserving originals so seeding can continue.
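The hash-and-deduplicate step described above can be sketched in a few lines. This is an illustration of the approach, not the project's actual code: the `seen` table and function names are hypothetical, and the real schema may differ.

```python
import hashlib
import sqlite3


def file_hash(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash a file in chunks so large media files never load fully into memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def is_new(conn, digest):
    """Record the hash; return False if it was already known (duplicate or prior import)."""
    conn.execute("CREATE TABLE IF NOT EXISTS seen (hash TEXT PRIMARY KEY)")
    try:
        with conn:
            conn.execute("INSERT INTO seen (hash) VALUES (?)", (digest,))
        return True
    except sqlite3.IntegrityError:
        return False
```

Because the hash, not the filename, is the primary key, a file deleted from Stash stays recorded and is skipped on any future download.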
## Why
- No reimports — If you delete a file in Stash, StashHandler remembers its hash and won't import it again.
- Deduplication — Duplicate files are caught before they reach Stash.
- Bulk bootstrap — Record hashes of your existing Stash library so future downloads are checked against it.
## Quick Start

### Development

```bash
git clone <repo-url> && cd stashhandler
# Edit environment variables in docker-compose.yaml
docker compose up -d --build
```
### Production

```bash
# Copy the example and fill in your values
cp docker-compose.prod.example.yaml docker-compose.yaml
# Edit QBIT_PASSWORD, volume paths, etc.

# Docker Swarm
docker stack deploy -c docker-compose.yaml stash-handler

# Or plain Compose
docker compose up -d
```
The prod compose pulls the pre-built image from `git.fabledsword.com/bvandeusen/stashhandler:latest` and only includes the `stash-handler` service (qBittorrent and Stash are managed separately).
## Modes

### Daemon (default)

Runs continuously, polling qBittorrent for completed torrents.

```bash
docker run stash-handler   # default command is "daemon"
```
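Daemon mode amounts to a simple poll loop. A minimal sketch, in which `fetch_completed` and `handle_torrent` are hypothetical callables standing in for the qBittorrent client and the import pipeline (the `max_cycles` parameter exists only so the loop can be exercised in tests):

```python
import time


def run_daemon(fetch_completed, handle_torrent, poll_interval=60, max_cycles=None):
    """Poll for completed torrents and hand each one off for processing."""
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        for torrent in fetch_completed():   # completed torrents in the watched category
            handle_torrent(torrent)         # hash, dedup, place into the import dir
        cycles += 1
        if max_cycles is None or cycles < max_cycles:
            time.sleep(poll_interval)       # POLL_INTERVAL seconds between passes
```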
### Scan

Bulk-process files on disk. Useful for bootstrapping the dedup database or for one-off imports.

```bash
# Record hashes of existing files (no files moved)
docker exec stash-handler python stash_handler.py scan /staging

# Record and import new files to Stash
docker exec stash-handler python stash_handler.py scan --import /staging

# Record hashes from your existing Stash library (mount read-only)
docker exec stash-handler python stash_handler.py scan /stash-library
```
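Under the hood, a scan is just a recursive walk filtered by the configured media extensions. A rough sketch of that filtering (the function name is illustrative, not the project's actual API):

```python
import os

MEDIA_EXTENSIONS = {".mp4", ".mkv", ".avi", ".wmv", ".mov", ".webm", ".flv", ".m4v"}


def iter_media_files(root, extensions=MEDIA_EXTENSIONS):
    """Yield paths of media files under root, matched by extension (case-insensitive)."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in extensions:
                yield os.path.join(dirpath, name)
```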
## Configuration

All configuration is via environment variables. No config files are needed, so it works cleanly with Docker Swarm secrets and stack deploys.
| Variable | Default | Description |
|---|---|---|
| `QBIT_URL` | `http://qbittorrent:8080` | qBittorrent Web UI URL |
| `QBIT_USERNAME` | `admin` | qBittorrent username |
| `QBIT_PASSWORD` | `adminadmin` | qBittorrent password |
| `STAGING_DIR` | `/staging` | qBittorrent download directory mount |
| `IMPORT_DIR` | `/import` | Stash import directory mount |
| `DB_PATH` | `/data/seen.db` | SQLite database path |
| `POLL_INTERVAL` | `60` | Seconds between polls |
| `MEDIA_EXTENSIONS` | `.mp4,.mkv,.avi,.wmv,.mov,.webm,.flv,.m4v` | Comma-separated extensions |
| `HASH_ALGORITHM` | `sha256` | `sha256` or `blake3` |
| `MOVE_MODE` | `copy` | `copy`, `hardlink`, or `move` |
| `QBIT_CATEGORY` | `Stash` | qBittorrent category to filter torrents by |
| `LOG_LEVEL` | `INFO` | Python log level |
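Since everything is env-driven, a settings loader only has to read `os.environ` with the defaults listed above. A sketch under that assumption (not the project's actual config module):

```python
import os


def load_config(env=os.environ):
    """Read all settings from environment variables, falling back to the documented defaults."""
    default_exts = ".mp4,.mkv,.avi,.wmv,.mov,.webm,.flv,.m4v"
    return {
        "qbit_url": env.get("QBIT_URL", "http://qbittorrent:8080"),
        "qbit_username": env.get("QBIT_USERNAME", "admin"),
        "qbit_password": env.get("QBIT_PASSWORD", "adminadmin"),
        "staging_dir": env.get("STAGING_DIR", "/staging"),
        "import_dir": env.get("IMPORT_DIR", "/import"),
        "db_path": env.get("DB_PATH", "/data/seen.db"),
        "poll_interval": int(env.get("POLL_INTERVAL", "60")),
        "media_extensions": {
            ext.strip().lower()
            for ext in env.get("MEDIA_EXTENSIONS", default_exts).split(",")
        },
        "hash_algorithm": env.get("HASH_ALGORITHM", "sha256"),
        "move_mode": env.get("MOVE_MODE", "copy"),
        "qbit_category": env.get("QBIT_CATEGORY", "Stash"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }
```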
## Volumes

```yaml
volumes:
  - /data/staging:/staging:ro      # qBit downloads (read-only)
  - /data/stash-import:/import     # Stash import dir (read-write)
  - /data/stash-handler:/data      # SQLite database (persistent)
```
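`MOVE_MODE` decides how a new file lands in the import volume, and the three modes map onto standard filesystem operations. A sketch with a hypothetical function name:

```python
import os
import shutil


def place_file(src, dst, mode="copy"):
    """Place src at dst according to MOVE_MODE.

    copy      duplicates the data (safe across filesystems, uses extra space)
    hardlink  links the same inode (no extra space; same filesystem only)
    move      relocates the file (frees staging space, but breaks seeding)
    """
    if mode == "copy":
        shutil.copy2(src, dst)
    elif mode == "hardlink":
        os.link(src, dst)
    elif mode == "move":
        shutil.move(src, dst)
    else:
        raise ValueError(f"unknown MOVE_MODE: {mode!r}")
```

Note that `hardlink` requires the staging and import volumes to live on the same filesystem, while `copy` (the default) keeps the original in place so qBittorrent can continue seeding.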