# Paperless-NGX + AI
Document management system with AI-powered automatic tagging and categorization.
## Deployment
- **Host:** Calypso (Synology NAS)
- **Paperless-NGX URL:** https://paperlessngx.vishconcord.synology.me
- **Paperless-AI URL:** http://calypso.local:3000
- **Deployed via:** Portainer Stacks
## Stacks
### 1. Paperless-NGX (paperless-testing)
Main document management system with office document support.
**File:** `docker-compose.yml`
| Container | Port | Purpose |
|-----------|------|---------|
| PaperlessNGX | 8777 | Main web UI |
| PaperlessNGX-DB | - | PostgreSQL database |
| PaperlessNGX-REDIS | - | Redis cache |
| PaperlessNGX-GOTENBERG | - | Office doc conversion |
| PaperlessNGX-TIKA | - | Document parsing |
### 2. Paperless-AI (paperless-ai)
AI extension for automatic document classification.
**File:** `paperless-ai.yml`
| Container | Port | Purpose |
|-----------|------|---------|
| PaperlessNGX-AI | 3000 (host) | AI processing & web UI |
## Data Locations
| Data | Path |
|------|------|
| Documents | `/volume1/docker/paperlessngx/media` |
| Database | `/volume1/docker/paperlessngx/db` |
| Export/Backup | `/volume1/docker/paperlessngx/export` |
| Consume folder | `/volume1/docker/paperlessngx/consume` |
| Trash | `/volume1/docker/paperlessngx/trash` |
| AI config | `/volume1/docker/paperlessngxai` |
## Credentials
### Paperless-NGX
- URL: https://paperlessngx.vishconcord.synology.me
- Admin user: vish
- Admin password: "REDACTED_PASSWORD"
### PostgreSQL
- Database: paperless
- User: paperlessuser
- Password: "REDACTED_PASSWORD"
### Redis
- Password: "REDACTED_PASSWORD"
### API Token
- Token: `REDACTED_API_TOKEN`
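The API token can be sanity-checked against the Paperless-NGX REST API. A minimal sketch (the placeholder token value and `page_size` query are illustrative, not from the deployment):

```shell
# Replace with the real (redacted) token before running
PAPERLESS_TOKEN="paste-token-here"
# List a single document to confirm the token is accepted
curl -sf -H "Authorization: Token $PAPERLESS_TOKEN" \
  "https://paperlessngx.vishconcord.synology.me/api/documents/?page_size=1" \
  || echo "API not reachable or token rejected"
```

Paperless-AI uses this same token to talk to the Paperless-NGX API, so this check also rules out token problems when AI processing stalls.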
## AI Integration (Ollama)
Paperless-AI connects to Ollama on Atlantis for LLM inference.
**Ollama URL:** https://ollama.vishconcord.synology.me
**Model:** neural-chat:7b (recommended)
### Configuring AI
1. Access Paperless-AI web UI: http://calypso.local:3000
2. Complete initial setup wizard
3. Configure:
- AI Provider: Ollama
- Ollama URL: https://ollama.vishconcord.synology.me
- Model: neural-chat:7b (or llama3.2:latest)
4. Set up tags and document types to auto-assign
5. Restart container after initial setup to build RAG index
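Before pointing Paperless-AI at the model, it can help to confirm the model actually responds through the Ollama endpoint. A hedged sketch (the prompt text is illustrative; the URL and model name are the ones configured above):

```shell
# Ask the configured model a one-off classification question via Ollama's
# /api/generate endpoint; a JSON response confirms the URL and model name work.
payload='{"model":"neural-chat:7b","prompt":"In one word, what type of document is: Invoice #1234, total due $50?","stream":false}'
curl -s https://ollama.vishconcord.synology.me/api/generate -d "$payload" \
  || echo "Ollama not reachable from this host"
```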
### Available Ollama Models
| Model | Size | Best For |
|-------|------|----------|
| neural-chat:7b | 7B | General documents |
| llama3.2:3b | 3.2B | Fast processing |
| mistral:7b | 7.2B | High quality |
| phi3:mini | 3.8B | Balanced |
## Backup
### Manual Export
```bash
# SSH into Calypso or use Portainer exec
# -c: compare checksums (not just timestamps) to decide what to re-export
# -d: delete files from the export directory that no longer exist in Paperless
docker exec PaperlessNGX document_exporter ../export -c -d
```
### Backup Location
Exports are saved to: `/volume1/docker/paperlessngx/export/`
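The manual export above can be wrapped in a small script and scheduled (e.g. via DSM Task Scheduler on Calypso; the schedule and logging are assumptions, the container name and flags match the command above):

```shell
#!/bin/sh
# Sketch of a nightly export job; run it from a scheduler of your choice.
if docker exec PaperlessNGX document_exporter ../export -c -d; then
  echo "paperless export OK: $(date)"
else
  echo "paperless export FAILED: $(date)" >&2
fi
```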
### Restore
```bash
# Re-imports documents, metadata, and database contents from a previous export
docker exec PaperlessNGX document_importer ../export
```
## Troubleshooting
### Paperless-AI not connecting to Ollama
1. Verify Ollama is running on Atlantis
2. Check URL is correct: `https://ollama.vishconcord.synology.me`
3. Test connectivity: `curl https://ollama.vishconcord.synology.me/api/tags`
### Documents not being processed
1. Check Paperless-AI logs: `docker logs PaperlessNGX-AI`
2. Verify API token is correct
3. Ensure tags are configured in Paperless-AI web UI
### OCR issues
1. Check Tika and Gotenberg are running
2. Verify language is set: `PAPERLESS_OCR_LANGUAGE: eng`
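The two checks above can be run in one go; a sketch using the container names from this page (the exact names must match what Portainer created):

```shell
# 1. Confirm the Tika and Gotenberg helper containers are running
docker ps --format '{{.Names}}' \
  | grep -E 'PaperlessNGX-(TIKA|GOTENBERG)' \
  || echo "helper containers not running"
# 2. Confirm the OCR language is set inside the main container
docker exec PaperlessNGX printenv PAPERLESS_OCR_LANGUAGE \
  || echo "PAPERLESS_OCR_LANGUAGE not set"
```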
## Documentation
- [Paperless-ngx Docs](https://docs.paperless-ngx.com/)
- [Paperless-AI GitHub](https://github.com/clusterzx/paperless-ai)
- [Ollama Docs](https://ollama.com/)