Horizons: The OmniChat

A flexible and powerful chatbot platform that brings enterprise-grade LLM capabilities to your infrastructure.

Local Deployment Guide

Prerequisites

  • Docker Engine with the Compose plugin (docker compose)
  • GNU Make
  • (Optional, for GPU support) NVIDIA drivers and the NVIDIA Container Toolkit

Quick Start

  1. Clone the repository and initialize:
    git clone https://github.com/evereven-tech/horizons-omnichat.git
    cd horizons-omnichat
    make init
    
  2. Configure environment:
    cp local/.env.example local/.env
    # Edit local/.env to set your preferences
    
  3. Start services:
    make local-up
    
  4. Access the WebUI at:
    • http://localhost:3002/
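
Once the services are up, a quick sanity check can be run from the host. The /health endpoint and container name below are the same ones used in the Troubleshooting section:

```shell
# Confirm the WebUI responds and its container is running
curl -fsS http://localhost:3002/health && echo "WebUI healthy"
docker ps --filter "name=open-webui" --format "{{.Names}}: {{.Status}}"
```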

Troubleshooting

Common Issues

Container Startup Failures

  1. Database Connection Issues
# Check database logs
docker logs open-webui-db

# Verify database is running
docker exec open-webui-db pg_isready
  2. Ollama Model Download Issues
# Check Ollama logs
docker logs ollama

# Manually trigger model download
docker exec ollama ollama pull tinyllama
  3. WebUI Connection Issues
# Check WebUI logs
docker logs open-webui

# Verify WebUI is responding
curl http://localhost:3002/health

GPU Support

  1. Verify NVIDIA Driver Installation
nvidia-smi
  2. Check Docker GPU Access
# Should list GPU devices
docker run --rm --gpus all nvidia/cuda:12.8.0-base-oraclelinux9 nvidia-smi
  3. Enable GPU in Configuration
# Edit .env file
OLLAMA_USE_GPU=true

Resource Issues

  1. Memory Problems
# Check container memory usage
docker stats

# Increase container memory limits in docker-compose.yml if needed
  2. Disk Space
# Check available space
df -h

# Clean up unused Docker resources
docker system prune

Network Issues

  1. Port Conflicts
# Check what's using port 3002
sudo lsof -i :3002

# Change port in docker-compose.yml if needed
  2. Container Communication
# Verify network creation
docker network ls | grep local_chatbot-net

# Check network connectivity
docker network inspect local_chatbot-net
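
A more direct connectivity test is to call Ollama's API from inside the WebUI container. The service name ollama and port 11434 are Ollama's defaults and assumed here; this also assumes curl is available inside the WebUI image:

```shell
# From the WebUI container, query Ollama's model list endpoint
docker exec open-webui curl -fsS http://ollama:11434/api/tags
```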

Maintenance

Backup Data

# Backup PostgreSQL database
docker exec open-webui-db pg_dump -U $POSTGRES_USER $POSTGRES_DB > backup.sql

# Backup Ollama models (requires root permissions)
tar -czf ollama-models.tar.gz $(docker volume inspect -f '{{ .Mountpoint }}' local_ollama-data)
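
To restore, feed the dump back through psql. This is a hedged sketch assuming the same container and environment variables as the backup commands; adjust the tar -C target to match how the archive was created:

```shell
# Restore the PostgreSQL dump (pre-existing rows may conflict)
docker exec -i open-webui-db psql -U "$POSTGRES_USER" -d "$POSTGRES_DB" < backup.sql

# Restore Ollama models into the named volume (root permissions)
tar -xzf ollama-models.tar.gz -C "$(docker volume inspect -f '{{ .Mountpoint }}' local_ollama-data)"
```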

Update Components

# Pull latest images
docker compose pull

# Restart services
make local-down
make local-up

Logs and Monitoring

# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f ollama
docker compose logs -f open-webui

Advanced Configuration

Custom Model Configuration

Edit local/.env and add or remove any models from the Ollama library (https://ollama.com/library):

INSTALLED_MODELS=llama2,mistral,tinyllama
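
After editing the list, the models can be pulled into the running Ollama container in one pass. A minimal sketch, assuming the container name ollama used elsewhere in this guide:

```shell
# Pull each comma-separated model from INSTALLED_MODELS
INSTALLED_MODELS="llama2,mistral,tinyllama"
for model in $(echo "$INSTALLED_MODELS" | tr ',' ' '); do
  docker exec ollama ollama pull "$model"
done
```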

Database Tuning

Connect to PostgreSQL to inspect or tune its configuration:

docker exec -it open-webui-db psql -U $POSTGRES_USER -d $POSTGRES_DB
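
From that psql session, or non-interactively with -c, runtime parameters can be adjusted and reloaded without a restart. The value below is illustrative, not a tuning recommendation:

```shell
# Raise per-operation working memory and reload the configuration
docker exec open-webui-db psql -U "$POSTGRES_USER" -d "$POSTGRES_DB" \
  -c "ALTER SYSTEM SET work_mem = '16MB';" \
  -c "SELECT pg_reload_conf();"
```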

Security Hardening

This is a local environment, but you can still harden it:

  1. Change default passwords in .env
  2. Enable TLS for database connections
  3. Configure authentication for WebUI
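
For step 1, a strong random password can be generated locally. This is one common approach, not something specific to this project:

```shell
# Generate a 24-character alphanumeric password for the .env file
tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 24; echo
```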

Getting Help

  1. Check the GitHub Issues
  2. Join our Community Discussion
  3. Review logs using the commands above

Horizons OmniChat by evereven