Paperless-NGX + AI
Document management system with AI-powered automatic tagging and categorization.
Deployment
- Host: Calypso (Synology NAS)
- Paperless-NGX URL: https://paperlessngx.vishconcord.synology.me
- Paperless-AI URL: http://calypso.local:3000
- Deployed via: Portainer Stacks
Stacks
1. Paperless-NGX (paperless-testing)
Main document management system with office document support.
File: docker-compose.yml
| Container | Port | Purpose |
|---|---|---|
| PaperlessNGX | 8777 | Main web UI |
| PaperlessNGX-DB | - | PostgreSQL database |
| PaperlessNGX-REDIS | - | Redis cache |
| PaperlessNGX-GOTENBERG | - | Office doc conversion |
| PaperlessNGX-TIKA | - | Document parsing |
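An abridged sketch of what docker-compose.yml likely looks like for this stack — service names, image tags, and environment values here are illustrative assumptions, not the deployed file (the container names, host port, and volume paths are taken from this page):

```yaml
services:
  webserver:
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
    container_name: PaperlessNGX
    ports:
      - "8777:8000"                  # host 8777 -> Paperless web UI
    environment:
      PAPERLESS_DBHOST: db
      PAPERLESS_REDIS: redis://broker:6379
      PAPERLESS_TIKA_ENABLED: 1
      PAPERLESS_TIKA_ENDPOINT: http://tika:9998
      PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
    volumes:
      - /volume1/docker/paperlessngx/media:/usr/src/paperless/media
      - /volume1/docker/paperlessngx/export:/usr/src/paperless/export
      - /volume1/docker/paperlessngx/consume:/usr/src/paperless/consume
  db:
    image: postgres:16
    container_name: PaperlessNGX-DB
    volumes:
      - /volume1/docker/paperlessngx/db:/var/lib/postgresql/data
  broker:
    image: redis:7
    container_name: PaperlessNGX-REDIS
  gotenberg:
    image: gotenberg/gotenberg:8
    container_name: PaperlessNGX-GOTENBERG
  tika:
    image: apache/tika:latest
    container_name: PaperlessNGX-TIKA
```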
2. Paperless-AI (paperless-ai)
AI extension for automatic document classification.
File: paperless-ai.yml
| Container | Port | Purpose |
|---|---|---|
| PaperlessNGX-AI | 3000 (host) | AI processing & web UI |
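A sketch of the shape of paperless-ai.yml — the image name and container-side paths are assumptions; only the container name, host port, and config volume come from this page:

```yaml
services:
  paperless-ai:
    image: clusterzx/paperless-ai:latest   # image name assumed
    container_name: PaperlessNGX-AI
    ports:
      - "3000:3000"                        # host 3000 -> AI web UI
    volumes:
      - /volume1/docker/paperlessngxai:/app/data   # container path assumed
    restart: unless-stopped
```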
Data Locations
| Data | Path |
|---|---|
| Documents | /volume1/docker/paperlessngx/media |
| Database | /volume1/docker/paperlessngx/db |
| Export/Backup | /volume1/docker/paperlessngx/export |
| Consume folder | /volume1/docker/paperlessngx/consume |
| Trash | /volume1/docker/paperlessngx/trash |
| AI config | /volume1/docker/paperlessngxai |
Credentials
Paperless-NGX
- URL: https://paperlessngx.vishconcord.synology.me
- Admin user: vish
- Admin password: "REDACTED_PASSWORD"
PostgreSQL
- Database: paperless
- User: paperlessuser
- Password: "REDACTED_PASSWORD"
Redis
- Password: "REDACTED_PASSWORD"
API Token
- Token: REDACTED_API_TOKEN
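The token authenticates requests against the Paperless-NGX REST API via an `Authorization: Token …` header. A minimal Python sketch — `/api/documents/` is a standard Paperless-NGX API endpoint; `build_request` and the token placeholder are illustrative:

```python
import json
import urllib.request

PAPERLESS_URL = "https://paperlessngx.vishconcord.synology.me"
API_TOKEN = "REDACTED_API_TOKEN"  # substitute the real token above

def build_request(path: str) -> urllib.request.Request:
    """Build an authenticated request against the Paperless-NGX REST API."""
    return urllib.request.Request(
        f"{PAPERLESS_URL}{path}",
        headers={"Authorization": f"Token {API_TOKEN}"},
    )

# Example call (requires network access to Calypso):
# with urllib.request.urlopen(build_request("/api/documents/")) as resp:
#     print(json.load(resp)["count"])  # number of indexed documents
```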
AI Integration (Ollama)
Paperless-AI connects to Ollama on Atlantis for LLM inference.
- Ollama URL: https://ollama.vishconcord.synology.me
- Model: neural-chat:7b (recommended)
Configuring AI
- Access Paperless-AI web UI: http://calypso.local:3000
- Complete initial setup wizard
- Configure:
- AI Provider: Ollama
- Ollama URL: https://ollama.vishconcord.synology.me
- Model: neural-chat:7b (or llama3.2:latest)
- Set up tags and document types to auto-assign
- Restart container after initial setup to build RAG index
Available Ollama Models
| Model | Size | Best For |
|---|---|---|
| neural-chat:7b | 7B | General documents |
| llama3.2:3b | 3.2B | Fast processing |
| mistral:7b | 7.2B | High quality |
| phi3:mini | 3.8B | Balanced |
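The `curl …/api/tags` check under Troubleshooting returns JSON shaped like `{"models": [{"name": …}]}` (per Ollama's API); a small Python sketch to list what Atlantis actually serves — the sample payload is illustrative:

```python
import json
import urllib.request

OLLAMA_URL = "https://ollama.vishconcord.synology.me"

def installed_models(tags_json: str) -> list[str]:
    """Extract model names from an Ollama GET /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json)["models"]]

# Live check (requires network access to Atlantis):
# with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
#     print(installed_models(resp.read().decode()))

# Offline example with an illustrative payload:
sample = '{"models": [{"name": "neural-chat:7b"}, {"name": "phi3:mini"}]}'
print(installed_models(sample))
```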
Backup
Manual Export
```shell
# SSH into Calypso or use Portainer exec
docker exec PaperlessNGX document_exporter ../export -c -d
```
Backup Location
Exports are saved to: /volume1/docker/paperlessngx/export/
Restore
```shell
docker exec PaperlessNGX document_importer ../export
```
Troubleshooting
Paperless-AI not connecting to Ollama
- Verify Ollama is running on Atlantis
- Check URL is correct: https://ollama.vishconcord.synology.me
- Test connectivity:

```shell
curl https://ollama.vishconcord.synology.me/api/tags
```
Documents not being processed
- Check Paperless-AI logs:

```shell
docker logs PaperlessNGX-AI
```

- Verify API token is correct
- Ensure tags are configured in Paperless-AI web UI
OCR issues
- Check Tika and Gotenberg are running
- Verify language is set:

```yaml
PAPERLESS_OCR_LANGUAGE: eng
```
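For reference, these are the Paperless-NGX environment variables that wire OCR and office-document conversion together — a sketch; the `tika` and `gotenberg` hostnames must match the service names used in docker-compose.yml:

```yaml
environment:
  PAPERLESS_OCR_LANGUAGE: eng
  PAPERLESS_TIKA_ENABLED: 1
  PAPERLESS_TIKA_ENDPOINT: http://tika:9998            # hostname assumed
  PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
```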