# Local Deployment Guide
## Prerequisites

- Docker or Podman
- Make
- 8GB RAM minimum
- (Optional) NVIDIA GPU with CUDA support
## Quick Start

- Clone the repository and initialize:

  ```bash
  git clone https://github.com/evereven-tech/horizons-omnichat.git
  cd horizons-omnichat
  make init
  ```

- Configure environment:

  ```bash
  cp local/.env.example local/.env
  # Edit local/.env to set your preferences
  ```

- Start services:

  ```bash
  make local-up
  ```

- Access the WebUI at http://localhost:3002/
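To confirm the stack came up healthy, you can hit the same health endpoint used in the Troubleshooting section below:

```bash
curl -f http://localhost:3002/health
```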
## Troubleshooting

### Common Issues

#### Container Startup Failures

- **Database Connection Issues**

  ```bash
  # Check database logs
  docker logs open-webui-db

  # Verify database is running
  docker exec open-webui-db pg_isready
  ```

- **Ollama Model Download Issues**

  ```bash
  # Check Ollama logs
  docker logs ollama

  # Manually trigger model download
  docker exec ollama ollama pull tinyllama
  ```

- **WebUI Connection Issues**

  ```bash
  # Check WebUI logs
  docker logs open-webui

  # Verify WebUI is responding
  curl http://localhost:3002/health
  ```
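If a single service keeps failing, recreating just that container is often enough (a sketch; run `docker compose` from the directory that holds the compose file, e.g. `local/`):

```bash
# Recreate one wedged service without touching the others
docker compose up -d --force-recreate open-webui
```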
### GPU Support

- **Verify NVIDIA Driver Installation**

  ```bash
  nvidia-smi
  ```

- **Check Docker GPU Access**

  ```bash
  # Should list GPU devices
  docker run --rm --gpus all nvidia/cuda:12.8.0-base-oraclelinux9 nvidia-smi
  ```

- **Enable GPU in Configuration**, then restart the stack as sketched after this list:

  ```bash
  # Edit .env file
  OLLAMA_USE_GPU=true
  ```
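After setting the flag, restart so Ollama is recreated with GPU access. A quick check, assuming the container is named `ollama` as in the compose file and the NVIDIA runtime exposes `nvidia-smi` inside it:

```bash
# Restart so the .env change takes effect
make local-down && make local-up

# The GPU should now be visible from inside the Ollama container
docker exec ollama nvidia-smi
```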
### Resource Issues

- **Memory Problems** (see the snapshot sketch after this list)

  ```bash
  # Check container memory usage
  docker stats

  # Increase container memory limits in docker-compose.yml if needed
  ```

- **Disk Space**

  ```bash
  # Check available space
  df -h

  # Clean up unused Docker resources
  docker system prune
  ```
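For a point-in-time view instead of the live `docker stats` stream, and to see where Docker's disk usage actually goes, the stock Docker CLI provides:

```bash
# One-shot memory snapshot per container
docker stats --no-stream --format "table {{.Name}}\t{{.MemUsage}}\t{{.MemPerc}}"

# Break down disk usage by images, containers, and volumes
docker system df
```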
### Network Issues

- **Port Conflicts**

  ```bash
  # Check what's using port 3002
  sudo lsof -i :3002

  # Change port in docker-compose.yml if needed
  ```

- **Container Communication** (see the sketch after this list)

  ```bash
  # Verify network creation
  docker network ls | grep local_chatbot-net

  # Check network connectivity
  docker network inspect local_chatbot-net
  ```
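To test container-to-container connectivity directly, you can call Ollama's API from the WebUI container (a sketch: the service name `ollama` and port 11434 are Ollama's defaults, and `curl` must be present inside the image):

```bash
# From open-webui, hit Ollama's version endpoint over the shared network
docker exec open-webui curl -s http://ollama:11434/api/version
```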
## Maintenance

### Backup Data

```bash
# Backup PostgreSQL database
docker exec open-webui-db pg_dump -U $POSTGRES_USER $POSTGRES_DB > backup.sql

# Backup Ollama models (requires root permissions)
sudo tar -czf ollama-models.tar.gz $(docker volume inspect -f '{{ .Mountpoint }}' local_ollama-data)
```
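To restore from these backups (a minimal sketch; assumes the database container is running and the same environment variables are set):

```bash
# Restore the PostgreSQL dump into the running container
docker exec -i open-webui-db psql -U $POSTGRES_USER -d $POSTGRES_DB < backup.sql
```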
### Update Components

```bash
# Pull latest images
docker compose pull

# Restart services
make local-down
make local-up
```
### Logs and Monitoring

```bash
# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f ollama
docker compose logs -f open-webui
```
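When chasing a recent failure, limiting output and adding timestamps keeps the logs readable (both are standard `docker compose logs` flags):

```bash
# Last 100 lines with timestamps for one service
docker compose logs --tail=100 --timestamps open-webui
```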
## Advanced Configuration

### Custom Model Configuration

Edit local/.env and add or remove any models from the [Ollama library](https://ollama.com/library):

```bash
INSTALLED_MODELS=llama2,mistral,tinyllama
```
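The list is presumably applied when the stack is recreated; you can also pull a model into the running container directly with the standard `ollama pull` subcommand:

```bash
# Recreate the stack with the new model list
make local-down && make local-up

# Or pull one model on the fly
docker exec ollama ollama pull mistral
```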
### Database Tuning

Open a psql session inside the database container to inspect and tune PostgreSQL settings:

```bash
docker exec -it open-webui-db psql -U $POSTGRES_USER -d $POSTGRES_DB
```
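Runtime parameters can also be adjusted non-interactively (a sketch; `work_mem` and the value are purely illustrative, not a recommendation):

```bash
# Inspect a setting, persist a new value, and reload the server configuration
docker exec open-webui-db psql -U $POSTGRES_USER -d $POSTGRES_DB \
  -c "SHOW work_mem;" \
  -c "ALTER SYSTEM SET work_mem = '16MB';" \
  -c "SELECT pg_reload_conf();"
```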
### Security Hardening

Even though this is a local environment, you can still:

- Change default passwords in `.env` (see the sketch after this list)
- Enable TLS for database connections
- Configure authentication for the WebUI
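For the password rotation, a quick way to generate strong values (`openssl` is ubiquitous; the variable names depend on what your `.env` defines):

```bash
# Generate a strong random secret for .env entries such as POSTGRES_PASSWORD
openssl rand -base64 32
```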
## Getting Help

- Check the [GitHub Issues](https://github.com/evereven-tech/horizons-omnichat/issues)
- Join our Community Discussion
- Review logs using the commands above
Horizons OmniChat by evereven