Pinchflat
YouTube channel auto-archiver. Subscribes to channels, polls for new uploads, downloads via yt-dlp, stores locally with metadata/subtitles/chapters. Phoenix/Elixir app backed by SQLite.
Status
Production on LAN/Tailscale only (promoted 2026-04-24). No public hostname, no SSO, no NPM reverse proxy. Running under Portainer GitOps from main.
Design + rollout history: docs/superpowers/specs/2026-04-24-pinchflat-design.md, docs/superpowers/plans/2026-04-24-pinchflat.md.
Access
- Web UI: http://192.168.0.200:8945
- Host: Atlantis (atlantis.tail.vish.gg)
- Image: ghcr.io/kieraneglin/pinchflat:latest
- Container name: pinchflat
- Portainer stack: pinchflat-stack (ID 739, EndpointId 2)
- Compose path in repo: hosts/synology/atlantis/pinchflat/docker-compose.yml
Paths on Atlantis
- Config / SQLite DB: /volume2/metadata/docker2/pinchflat/config (NVMe)
- DB file: /volume2/metadata/docker2/pinchflat/config/db/pinchflat.db
- Cookies file: /volume2/metadata/docker2/pinchflat/config/extras/cookies.txt
- Downloads: /volume1/data/media/youtube/<Channel>/<YYYY-MM-DD> - <Title>.mkv (SATA)
Runtime defaults
Configured in the UI; current state of media profile Default (id 1):
| Setting | Value |
|---|---|
| Output template | {{ source_custom_name }}/{{ upload_yyyy_mm_dd }} - {{ title }}.{{ ext }} |
| Resolution cap | 2160p (4K) |
| Container format | mkv (required for VP9/AV1 4K streams) |
| Thumbnails | download + embed |
| Subtitles | download + embed, en + auto-subs |
| Metadata | download + embed |
| Chapters | on |
| NFO files | off |
| Shorts / Livestreams | include |
| SponsorBlock | disabled |
Integrations
Plex
- Library: "Youtube" (section id 8, type movie, agent com.plexapp.agents.none, scanner Plex Video Files Scanner, language xn)
- Mount: the plex container already has /volume1/data/media:/data/media; the library points at /data/media/youtube
- Manual refresh: curl "http://192.168.0.200:32400/library/sections/8/refresh?X-Plex-Token=$TOKEN" where the token is PlexOnlineToken from /volume2/metadata/docker2/plex/Library/Application Support/Plex Media Server/Preferences.xml
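Pulling the token out of Preferences.xml can be scripted. A minimal sketch: the sed pattern assumes the token sits in a PlexOnlineToken XML attribute (as the path above indicates); the demo parses a sample line so it runs anywhere.

```shell
# Extract PlexOnlineToken. Demo parses a sample line; on Atlantis, feed sed
# the real file instead (the path contains spaces, so quote it):
#   sudo sed -n 's/.*PlexOnlineToken="\([^"]*\)".*/\1/p' \
#     "/volume2/metadata/docker2/plex/Library/Application Support/Plex Media Server/Preferences.xml"
prefs='<?xml version="1.0"?><Preferences PlexOnlineToken="xyzToken123"/>'
TOKEN=$(echo "$prefs" | sed -n 's/.*PlexOnlineToken="\([^"]*\)".*/\1/p')
echo "$TOKEN"   # → xyzToken123
```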
Uptime Kuma
- Monitor: Pinchflat (id 129, HTTP http://192.168.0.200:8945/, 60s interval, parent group Atlantis, id 4)
Homarr
- Tile on board Homelab → Atlantis section (position 8,2)
- App id: 5h5g339iuh20lc7jfsklal4t
Cookie management (keeps auth alive)
YouTube session cookies let Pinchflat bypass age gates, rate limits, and "sign in to confirm you're not a bot" checks. They expire periodically and must be refreshed.
Current source
Extracted 2026-04-24 from Chromium on uqiyoe (Windows 11), 333 cookies. Chromium on Windows uses DPAPI (not ABE), so yt-dlp can read cookies directly when the browser is closed.
Per-source behaviour
Each source has a cookie_behaviour setting:
- disabled: no cookies
- when_needed (default choice): uses cookies only for operations that require auth
- all_operations: always use cookies
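If the UI is unreachable, the setting can also be flipped directly in SQLite. A sketch only (the UI is the supported path, and the live DB should be backed up first); the demo uses a throwaway DB with just the columns the inspect queries in this doc rely on.

```shell
# Flip a source's cookie_behaviour in SQLite. Throwaway demo DB; on Atlantis,
# point sqlite3 at /volume2/metadata/docker2/pinchflat/config/db/pinchflat.db.
db=$(mktemp)
sqlite3 "$db" "
  CREATE TABLE sources (id INTEGER PRIMARY KEY, custom_name TEXT, cookie_behaviour TEXT);
  INSERT INTO sources VALUES (1, 'Linus Tech Tips', 'disabled');
  UPDATE sources SET cookie_behaviour = 'when_needed' WHERE id = 1;
  SELECT custom_name, cookie_behaviour FROM sources;"
rm -f "$db"
```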
Refresh procedure (when auth breaks)
On the host with the signed-in YouTube session (uqiyoe; Edge is also possible but requires an extension because of App-Bound Encryption):
# Install yt-dlp if needed
ssh vish@uqiyoe 'py -m pip install --user --upgrade yt-dlp'
# Close Chromium first (the cookie DB is locked while it runs)
# Extract cookies to a temp file
ssh vish@uqiyoe 'py -m yt_dlp --cookies-from-browser chromium --cookies "C:\Users\Vish\yt-cookies.txt" --skip-download --no-warnings "https://www.youtube.com/feed/subscriptions"'
# Pipe to Atlantis, set perms, remove local copy
ssh vish@uqiyoe 'powershell -NoProfile -Command "Get-Content -Raw C:\Users\Vish\yt-cookies.txt"' | \
ssh vish@atlantis 'sudo tee /volume2/metadata/docker2/pinchflat/config/extras/cookies.txt > /dev/null && \
sudo chown 1029:100 /volume2/metadata/docker2/pinchflat/config/extras/cookies.txt && \
sudo chmod 600 /volume2/metadata/docker2/pinchflat/config/extras/cookies.txt'
ssh vish@uqiyoe 'del C:\Users\Vish\yt-cookies.txt'
Pinchflat re-reads the cookies file for each yt-dlp invocation, so no container restart is required.
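A quick sanity check after piping the file across: yt-dlp expects Netscape-format cookies, i.e. a `# Netscape HTTP Cookie File` header and tab-separated fields. The sketch below builds a minimal sample so it runs anywhere; on Atlantis, run the same two checks against the real cookies.txt path.

```shell
# Verify a cookies.txt looks like a valid Netscape cookie jar. On Atlantis,
# replace "$f" with /volume2/metadata/docker2/pinchflat/config/extras/cookies.txt.
f=$(mktemp)
printf '# Netscape HTTP Cookie File\n' > "$f"
printf '.youtube.com\tTRUE\t/\tTRUE\t1800000000\tSID\tabc\n' >> "$f"
head -1 "$f" | grep -q 'Netscape HTTP Cookie File' && echo 'header ok'
echo "youtube cookies: $(grep -c 'youtube\.com' "$f")"
rm -f "$f"
```

A file missing the header or with space-separated fields will be silently rejected by yt-dlp's cookie loader, which looks exactly like expired cookies.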
Alternative cookie sources
- Firefox anywhere: yt-dlp --cookies-from-browser firefox works cleanly (no DPAPI/ABE).
- Browser extension "Get cookies.txt LOCALLY" on any browser → manual export → ssh-pipe to Atlantis.
- Edge/Chrome recent versions: yt-dlp extraction is broken (App-Bound Encryption, yt-dlp issue #10927). Use the extension.
Operations
Docker on Synology DSM requires sudo and the full binary path.
Redeploy via Portainer (pulls latest compose from main)
curl -sk -X POST \
-H "X-API-Key: $PORTAINER_TOKEN" \
"https://192.168.0.200:9443/api/stacks/739/git/redeploy?endpointId=2" \
-d '{"pullImage": true}'
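To confirm the redeploy took, the stack object can be fetched back. A sketch under the assumption that the Portainer API returns the stack as JSON with a numeric Status field (1 = active, 2 = inactive); the demo parses a sample payload so it stays self-contained, and the live curl is in the comment.

```shell
# Check the stack is active after a redeploy. Demo parses a sample payload;
# live, fetch the real one with:
#   curl -sk -H "X-API-Key: $PORTAINER_TOKEN" "https://192.168.0.200:9443/api/stacks/739"
resp='{"Id":739,"Name":"pinchflat-stack","Status":1}'
status=$(echo "$resp" | sed -n 's/.*"Status":\([0-9]*\).*/\1/p')
if [ "$status" = "1" ]; then echo 'stack active'; else echo "stack status: $status"; fi
```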
Direct container control (emergency only; prefer Portainer)
ssh vish@atlantis
cd /data/compose/739 # Portainer's materialized compose dir
sudo /usr/local/bin/docker compose logs -f
sudo /usr/local/bin/docker compose restart
Inspect SQLite state
ssh vish@atlantis 'sudo /usr/bin/sqlite3 /volume2/metadata/docker2/pinchflat/config/db/pinchflat.db \
"SELECT id, custom_name, cookie_behaviour, download_cutoff_date FROM sources;"'
ssh vish@atlantis 'sudo /usr/bin/sqlite3 /volume2/metadata/docker2/pinchflat/config/db/pinchflat.db \
"SELECT COUNT(*) total, SUM(CASE WHEN media_filepath IS NOT NULL THEN 1 ELSE 0 END) downloaded FROM media_items;"'
Adding a new channel (source)
In the web UI: Sources → Add Source. Paste the channel URL (e.g. https://www.youtube.com/@ChannelName), set a custom name (becomes the folder name), pick media profile Default, cookie behaviour when_needed.
Set a reasonable download_cutoff_date for high-volume channels to avoid downloading the entire back catalog:
- 7 days (2026-04-17 as of writing)
- 14 days (2026-04-10) ← used for LinusTechTips
- 30 days (2026-03-25)
- 90 days (2026-01-24)
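The dates above go stale. They can be recomputed on any host with GNU date (a sketch; BSD/BusyBox date takes different flags):

```shell
# Print cutoff dates N days back, ready to paste into download_cutoff_date.
for days in 7 14 30 90; do
  printf '%3s days ago: %s\n' "$days" "$(date -d "-${days} days" +%F)"
done
```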
Tier limit: the free tier clamps index_frequency_minutes to 43200 (30 days) — regardless of what the form shows. Initial indexing is unaffected; only automatic re-polling for new uploads uses this cadence.
Current sources
| ID | Custom Name | URL | Cookie | Cutoff |
|---|---|---|---|---|
| 1 | Linus Tech Tips | https://www.youtube.com/@LinusTechTips | when_needed | 2026-04-10 |
Troubleshooting
- Container unhealthy: we use curl in the healthcheck, not wget (the image ships curl only). If you see an old compose with wget -qO /dev/null ..., it will always report unhealthy even when the UI serves 200s. Current compose uses curl -fsS.
- yt-dlp update rate-limit on boot: benign. Pinchflat tries yt-dlp --update at startup and can hit GitHub's unauthenticated rate limit (HTTP 403). It falls back to the bundled version and retries later. To silence, add a GITHUB_TOKEN env var.
- Downloads stop with auth errors: cookies expired. Follow the refresh procedure above.
- Missing files in Plex: trigger a library refresh against section 8 (see Plex section above). Also check for .temp partial files in the download folder; they rename to .mkv when yt-dlp finishes.
- Source indexing stale: the free tier clamps re-indexing to every 30 days. To force, edit the source in the UI and save (triggers a re-index).
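The partial-file check from the Plex bullet can be scripted. A sketch; the demo uses a scratch directory so it runs anywhere, and the real invocation on Atlantis is in the comment.

```shell
# List in-flight downloads: yt-dlp keeps *.temp files until a download
# completes. Demo in a scratch dir; on Atlantis run:
#   find /volume1/data/media/youtube -name '*.temp'
d=$(mktemp -d)
touch "$d/2026-04-20 - Finished.mkv" "$d/2026-04-24 - InFlight.mkv.temp"
find "$d" -name '*.temp'
rm -rf "$d"
```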
Deliberate scope decisions
No public hostname, no Authentik SSO, no NPM reverse proxy. Access is LAN + Tailscale only. Re-evaluate only if sharing with non-Tailscale users becomes a requirement.
Image is pinned to :latest with Watchtower auto-updates — acceptable given the service's non-critical nature. Pin to a digest if stability becomes an issue.