Sanitized mirror from private repository - 2026-04-20 01:32:01 UTC
# Reactive Resume v5 - GitOps Deployment

This directory contains the GitOps deployment configuration for Reactive Resume v5 on the Calypso server with AI integration.

## 🚀 Quick Start

```bash
# Deploy the complete stack
./deploy.sh

# Check status
./deploy.sh status

# View logs
./deploy.sh logs
```

## 🌐 Access URLs

- **External**: https://rx.vish.gg
- **Internal**: http://192.168.0.250:9751
- **Download Service**: http://192.168.0.250:9753 (rxdl.vish.gg)
- **Ollama API**: http://192.168.0.250:11434
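
As a quick sanity check, the internal endpoints above can be probed from the LAN. This is a hypothetical helper, not part of `deploy.sh`, and assumes `curl` is installed on the machine running it:

```bash
# Probe each internal endpoint listed above; prints OK or DOWN per URL.
endpoints=(
  "http://192.168.0.250:9751"   # main app
  "http://192.168.0.250:9753"   # download service
  "http://192.168.0.250:11434"  # Ollama API
)
for url in "${endpoints[@]}"; do
  if curl -fsS --max-time 5 "$url" >/dev/null 2>&1; then
    echo "OK   $url"
  else
    echo "DOWN $url"
  fi
done
```

Run it from a host on the 192.168.0.x network; the external URL goes through NPM and is better tested from outside (see Troubleshooting).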
## 🏗️ Architecture

### Core Services

- **Main App**: Reactive Resume v5 with AI features
- **Database**: PostgreSQL 18
- **Storage**: SeaweedFS (S3-compatible)
- **PDF Generation**: Browserless Chrome
- **AI Engine**: Ollama with the llama3.2:3b model

### Infrastructure

- **Proxy**: Nginx Proxy Manager (ports 8880/8443)
- **Router**: Port forwarding 80→8880, 443→8443
## 🤖 AI Features

Reactive Resume v5 includes AI-powered features:

- Resume content suggestions
- Job description analysis
- Skills optimization
- Cover letter generation

Powered by Ollama running locally with the llama3.2:3b model.
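
The backend can be exercised directly over Ollama's HTTP API (`/api/generate` is Ollama's standard completion endpoint). The prompt below is illustrative:

```bash
# Illustrative request against the local Ollama API; the prompt is made up.
payload='{"model":"llama3.2:3b","prompt":"Suggest one resume bullet for a DevOps engineer.","stream":false}'
curl -s --max-time 30 -d "$payload" \
  http://192.168.0.250:11434/api/generate || echo "Ollama not reachable"
```

With `"stream": false`, Ollama returns a single JSON object whose `response` field holds the generated text.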
## 📋 Prerequisites

1. **Router Configuration**: Forward ports 80→8880 and 443→8443
2. **DNS**: rx.vish.gg and rxdl.vish.gg pointing to YOUR_WAN_IP
3. **SSL**: Cloudflare Origin certificates in NPM
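
The DNS prerequisite can be verified from any Linux host before deploying; `getent` resolves through the system resolver. A minimal sketch, not part of `deploy.sh`:

```bash
# Resolve each public hostname and print the address (or "unresolved").
# Compare the printed addresses against your WAN IP manually.
for host in rx.vish.gg rxdl.vish.gg; do
  resolved=$(getent hosts "$host" | awk '{print $1; exit}')
  echo "$host -> ${resolved:-unresolved}"
done
```

Note that with Cloudflare proxying enabled, the names resolve to Cloudflare edge IPs rather than the WAN IP itself.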
## 🛠️ Deployment Commands

```bash
# Full deployment
./deploy.sh deploy

# Set up individual components
./deploy.sh setup-npm     # Set up Nginx Proxy Manager
./deploy.sh setup-ollama  # Set up AI model

# Management
./deploy.sh restart       # Restart services
./deploy.sh stop          # Stop services
./deploy.sh update        # Update images and redeploy
./deploy.sh status        # Check service status
./deploy.sh logs          # View application logs
```
## 🔧 Configuration

### Environment Variables

- `APP_URL`: https://rx.vish.gg
- `AI_PROVIDER`: ollama
- `OLLAMA_URL`: http://ollama:11434
- `OLLAMA_MODEL`: llama3.2:3b

### Volumes

- `/volume1/docker/rxv5/db` - PostgreSQL data
- `/volume1/docker/rxv5/seaweedfs` - File storage
- `/volume1/docker/rxv5/ollama` - AI model data
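
For reference, the variables above map to `.env`-style exports like the following (values copied from this section; secrets such as database credentials are deliberately omitted):

```bash
# Environment for the main app container; sourced from the README values.
export APP_URL="https://rx.vish.gg"
export AI_PROVIDER="ollama"
export OLLAMA_URL="http://ollama:11434"   # service name resolves on the compose network
export OLLAMA_MODEL="llama3.2:3b"
```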
## 🔄 Migration from v4

This deployment maintains compatibility with v4:

- Same ports (9751, 9753)
- Same SMTP configuration
- Same database credentials
- Preserves existing NPM proxy rules
## 🚨 Troubleshooting

### External Access Issues

1. Check router port forwarding: 80→8880, 443→8443
2. Verify NPM proxy hosts are configured
3. Confirm DNS propagation: `nslookup rx.vish.gg`
### AI Features Not Working

1. Check the Ollama service: `docker logs Resume-OLLAMA-V5`
2. Pull the model manually: `docker exec Resume-OLLAMA-V5 ollama pull llama3.2:3b`
3. Verify the model is loaded: `docker exec Resume-OLLAMA-V5 ollama list`
### Service Health

```bash
# Check all services
./deploy.sh status

# Check a specific container
ssh Vish@192.168.0.250 -p 62000 "sudo docker logs Resume-ACCESS-V5"
```
## 📊 Monitoring

- **Application Health**: http://192.168.0.250:9751/health
- **Database**: PostgreSQL on port 5432 (internal)
- **Storage**: SeaweedFS S3 API on port 8333 (internal)
- **AI**: Ollama API on port 11434
## 🔐 Security

- All services run with `no-new-privileges:true`
- Database credentials are environment-specific
- SMTP uses app-specific passwords
- External access only through NPM with SSL
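
As an illustration of the first point, the hardening option is passed per container at run time. A sketch only: the container name follows this deployment's `Resume-*-V5` scheme, and `<image>` stands in for the actual image reference:

```bash
# Hypothetical docker run showing the no-new-privileges hardening flag.
docker_run=(docker run -d
  --name Resume-ACCESS-V5
  --security-opt no-new-privileges:true
  --restart unless-stopped)
echo "${docker_run[*]} <image>"
```

In a compose-managed stack the equivalent lives under each service's `security_opt` key, which is where this deployment sets it.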
## 📈 Status

**Status**: ✅ **ACTIVE DEPLOYMENT** (GitOps with AI integration)

- **Version**: v5.0.9
- **Deployed**: 2026-02-16
- **AI Model**: llama3.2:3b
- **External Access**: ✅ Configured