# Calypso GitOps Deployment Summary
## 🎯 Completed Deployments
### ✅ Reactive Resume v5 with AI Integration
- **Location**: `/home/homelab/organized/repos/homelab/Calypso/reactive_resume_v5/`
- **External URL**: https://rx.vish.gg
- **Internal URL**: http://192.168.0.250:9751
- **AI Features**: Ollama with llama3.2:3b model
- **Status**: ✅ ACTIVE
**Services**:
- Resume-ACCESS-V5: Main application (port 9751)
- Resume-DB-V5: PostgreSQL 18 database
- Resume-BROWSERLESS-V5: PDF generation (port 4000)
- Resume-SEAWEEDFS-V5: S3 storage (port 9753)
- Resume-OLLAMA-V5: AI engine (port 11434)
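
The five containers above can be polled for their current state over SSH. A minimal sketch (not the project's `deploy.sh`): the service names come from the list above, and the SSH user/port and docker binary path follow the Service Management examples later in this document.

```shell
# Hedged sketch: report each Reactive Resume container's state on the NAS.
SERVICES=(Resume-ACCESS-V5 Resume-DB-V5 Resume-BROWSERLESS-V5 Resume-SEAWEEDFS-V5 Resume-OLLAMA-V5)

check_resume_stack() {
  for svc in "${SERVICES[@]}"; do
    # 'docker inspect --format' prints the container name and its run state.
    ssh Vish@192.168.0.250 -p 62000 \
      "sudo /usr/local/bin/docker inspect --format '{{.Name}}: {{.State.Status}}' $svc"
  done
}
# Usage: check_resume_stack
```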
### ✅ Nginx Proxy Manager (Fixed)
- **Location**: `/home/homelab/organized/repos/homelab/Calypso/nginx_proxy_manager/`
- **Admin UI**: http://192.168.0.250:81
- **HTTP Proxy**: http://192.168.0.250:8880 (router forwards external port 80 here)
- **HTTPS Proxy**: https://192.168.0.250:8443 (router forwards external port 443 here)
- **Status**: ✅ ACTIVE
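
Before touching router forwarding, it can help to confirm the three NPM ports answer on the LAN. A minimal sketch using bash's built-in `/dev/tcp` redirection (assumes a bash client on the same network; the IP and ports are from the list above):

```shell
# Quick TCP reachability probe for a port on the NAS.
check_port() {
  if timeout 3 bash -c "</dev/tcp/192.168.0.250/$1" 2>/dev/null; then
    echo "port $1 open"
  else
    echo "port $1 closed"
  fi
}
# Usage: for p in 81 8880 8443; do check_port "$p"; done
```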
## 🚀 GitOps Commands
### Reactive Resume v5
```bash
cd /home/homelab/organized/repos/homelab/Calypso/reactive_resume_v5
# Deploy complete stack with AI
./deploy.sh deploy
# Management commands
./deploy.sh status # Check all services
./deploy.sh logs # View application logs
./deploy.sh restart # Restart services
./deploy.sh stop # Stop services
./deploy.sh update # Update images
./deploy.sh setup-ollama # Setup AI model
```
### Nginx Proxy Manager
```bash
cd /home/homelab/organized/repos/homelab/Calypso/nginx_proxy_manager
# Deploy NPM
./deploy.sh deploy
# Management commands
./deploy.sh status # Check service status
./deploy.sh logs # View NPM logs
./deploy.sh restart # Restart NPM
./deploy.sh cleanup # Clean up containers
```
## 🌐 Network Configuration
### Router Port Forwarding
- **Port 80** → **8880** (HTTP to NPM)
- **Port 443** → **8443** (HTTPS to NPM)
### DNS Configuration
- **rx.vish.gg** → YOUR_WAN_IP ✅
- **rxdl.vish.gg** → YOUR_WAN_IP ✅
### NPM Proxy Configuration
NPM should be configured with:
1. **rx.vish.gg** → http://192.168.0.250:9751
2. **rxdl.vish.gg** → http://192.168.0.250:9753
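
One way to verify these proxy hosts independently of public DNS is to send the `Host` header straight to NPM's HTTP proxy port. A hedged sketch (the IP and port 8880 are from the sections above; the expected status codes are assumptions about how NPM is configured):

```shell
# Probe NPM's HTTP port with an explicit Host header, so the routing can be
# tested even before public DNS points at this network.
probe_proxy() {
  curl -sS -o /dev/null -w '%{http_code}\n' \
    -H "Host: $1" "http://192.168.0.250:8880/"
}
# Usage:
#   probe_proxy rx.vish.gg     # expect 200, or 3xx if NPM redirects to HTTPS
#   probe_proxy rxdl.vish.gg
```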
## 🤖 AI Integration
### Ollama Configuration
- **Service**: Resume-OLLAMA-V5
- **Port**: 11434
- **Model**: llama3.2:3b (2GB)
- **API**: http://192.168.0.250:11434
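
The engine can be exercised directly through Ollama's REST API (`/api/tags` lists installed models; `/api/generate` runs a prompt). A minimal sketch; the URL and model name come from this section, everything else is illustrative:

```shell
OLLAMA_URL="http://192.168.0.250:11434"

# List the models Ollama has pulled.
list_models() { curl -s "$OLLAMA_URL/api/tags"; }

# Run a one-shot, non-streaming completion against llama3.2:3b.
ask_model() {
  curl -s "$OLLAMA_URL/api/generate" \
    -d "{\"model\": \"llama3.2:3b\", \"prompt\": \"$1\", \"stream\": false}"
}
# Usage: ask_model "Suggest three strong action verbs for a resume bullet."
```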
### AI Features in Reactive Resume
- Resume content suggestions
- Job description analysis
- Skills optimization
- Cover letter generation
## 📊 Service Status
### Current Status (2026-02-16)
```
✅ Resume-ACCESS-V5 - Up and healthy
✅ Resume-DB-V5 - Up and healthy
✅ Resume-BROWSERLESS-V5 - Up and healthy
✅ Resume-SEAWEEDFS-V5 - Up and healthy
✅ Resume-OLLAMA-V5 - Up with llama3.2:3b loaded
✅ nginx-proxy-manager - Up and healthy
```
### External Access Test
```bash
curl -I https://rx.vish.gg
# HTTP/2 200 ✅
```
## 🔧 Troubleshooting
### If External Access Fails
1. Check NPM proxy host configuration
2. Verify router port forwarding (80→8880, 443→8443)
3. Confirm DNS propagation: `nslookup rx.vish.gg`
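
The same three failure points can be checked with a small script that stops at the first problem (a sketch; hostnames and ports are from this document, the ordering and messages are assumptions):

```shell
# Walk the external-access checklist: DNS, local NPM port, then public HTTPS.
diagnose_external() {
  nslookup rx.vish.gg >/dev/null 2>&1 \
    || { echo "FAIL: DNS lookup for rx.vish.gg"; return 1; }
  curl -s -o /dev/null --connect-timeout 5 http://192.168.0.250:8880/ \
    || { echo "FAIL: NPM HTTP proxy (8880) unreachable on LAN"; return 1; }
  curl -sfI -o /dev/null https://rx.vish.gg \
    || { echo "FAIL: public HTTPS (check 80->8880 / 443->8443 forwarding)"; return 1; }
  echo "OK: all external-access checks passed"
}
# Usage: diagnose_external
```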
### If AI Features Don't Work
1. Check Ollama: `./deploy.sh logs` (look for Resume-OLLAMA-V5)
2. Verify model: `ssh Vish@192.168.0.250 -p 62000 "sudo /usr/local/bin/docker exec Resume-OLLAMA-V5 ollama list"`
### Service Management
```bash
# Check all services
ssh Vish@192.168.0.250 -p 62000 "sudo /usr/local/bin/docker ps"
# Restart specific service
ssh Vish@192.168.0.250 -p 62000 "sudo /usr/local/bin/docker restart Resume-ACCESS-V5"
```
## 🎉 Migration Complete
- ✅ **Reactive Resume v5** deployed with AI integration
- ✅ **NPM** fixed and deployed via GitOps
- ✅ **External access** working (https://rx.vish.gg)
- ✅ **AI features** ready with Ollama
- ✅ **Port compatibility** maintained from v4
- ✅ **GitOps workflow** established

Your Reactive Resume v5 is now fully operational with AI capabilities!