Better DocGPT N8N

A multi-service application that integrates n8n (automation platform), an MCP (Model Context Protocol) server, a Matrix bot, and supporting services for document processing and AI integration.

πŸ—οΈ Architecture Overview

This system provides a scalable automation platform with the following core components:

  • n8n: Main automation platform with scalable worker instances
  • MCP Server (deprecated): Python-based service providing document processing and AI tools
  • Matrix Bot: Handles Matrix room events and forwards to n8n webhooks
  • Vector Storage: Qdrant for embeddings, ChromaDB for MCP server
  • Databases: PostgreSQL (n8n), MongoDB (general data), Redis (queue management)

🚀 Quick Start

Prerequisites

  • Docker & Docker Compose
  • Make (for convenient commands)

Getting Started

  1. Configure environment: Create a .env file (see the Configuration section)

  2. Add credentials (if restoring): Place credential files in the ./credentials/ directory

    • Credentials are not tracked by git; request them from an admin
    • Required before running make import
  3. Start services:

make up                # Start all services
make up-ollama         # Start with Ollama (local LLM)
make up-build          # Start and rebuild images
  4. Import workflows and credentials:

make import
  5. Initial n8n setup: open the n8n interface at http://localhost:5678 in a browser and complete the first-run account setup.

Stop the System

make down

⚡ Worker Scaling

n8n workers handle workflow execution in parallel. Scale workers based on your workload:

# Scale workers to specific number
make scale WORKERS=8

# Check current service status
make ps

# Restart all services
make reload

# Restart with rebuild
make reload-build
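
Under the hood, scaling maps to Docker Compose's --scale flag. A rough equivalent of the make target, assuming the worker service is named n8n-worker (check docker-compose.yml for the actual service name):

# Approximate equivalent of `make scale WORKERS=8`
docker compose up -d --scale n8n-worker=8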

📋 Available Commands

Development & Deployment

make up              # Start all services
make up-ollama       # Start with Ollama profile
make up-build        # Start and rebuild images
make down            # Stop all services
make reload          # Restart all services
make reload-build    # Restart and rebuild images

Worker Scaling

make scale WORKERS=N # Scale to N workers
make ps              # Show service status

Backup & Import

make backup          # Export workflows and credentials
make import          # Import workflows and credentials
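
Both targets wrap the n8n CLI inside the running container; the exact flags and paths are defined in the makefile. A rough sketch of the kind of calls involved (the container paths here are illustrative):

# Export workflows and credentials from the n8n container (illustrative paths)
docker compose exec n8n n8n export:workflow --all --output=/backup/workflows.json
docker compose exec n8n n8n export:credentials --all --output=/backup/credentials.json

# Import them back
docker compose exec n8n n8n import:workflow --input=/backup/workflows.json
docker compose exec n8n n8n import:credentials --input=/backup/credentials.json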

🔧 Configuration

Environment Variables

Create a .env file with the following variables:

# Database Configuration
POSTGRES_USER=your_postgres_user
POSTGRES_PASSWORD=your_postgres_password  
POSTGRES_DB=your_postgres_db
POSTGRES_NON_ROOT_USER=n8n_user
POSTGRES_NON_ROOT_PASSWORD=n8n_password

# MongoDB
MONGO_USERNAME=your_mongo_user
MONGO_PASSWORD=your_mongo_password
MONGO_DATABASE=your_mongo_db

# N8N Configuration  
N8N_ENCRYPTION_KEY=your_encryption_key

# Matrix Bot (optional)
MATRIX_PASSWORD=your_matrix_password
N8N_WEBHOOK_BASE_URL=http://n8n:5678/webhook/

# AI Services (optional)
OPENAI_API_KEY=your_openai_key
GOOGLE_API_KEY=your_google_key
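
n8n uses N8N_ENCRYPTION_KEY to encrypt stored credentials, so it must stay the same across restarts and match any credentials you import. One way to generate a suitably random value:

# Generate a random 32-byte hex string for N8N_ENCRYPTION_KEY
openssl rand -hex 32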

🌐 Service Ports

  • 5678: n8n web interface
  • 8000: MCP server
  • 8787: Dask dashboard
  • 6333/6334: Qdrant vector database
  • 8001: ChromaDB
  • 6379: Redis
  • 27018: MongoDB
  • 11434: Ollama (when using ollama profile)
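
Once the stack is up, a quick way to confirm the main HTTP services are reachable (the /healthz paths below are the standard health endpoints for n8n and Qdrant; adjust if your versions differ):

curl -s http://localhost:5678/healthz     # n8n
curl -s http://localhost:6333/healthz     # Qdrant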

πŸ“ Project Structure

better-docgpt-n8n/
├── docker-compose.yml         # Main service orchestration
├── makefile                   # Convenient commands
├── .env                       # Environment configuration
├── mcp_server/                # MCP Protocol server
│   ├── main.py                # Server entry point
│   ├── config/                # Configuration modules
│   ├── project/               # Project tools
│   └── web/                   # Web scraping tools
├── matrix/                    # Matrix bot service
│   └── main.py                # Matrix bot implementation
├── snapshots/                 # Exported n8n workflows
├── .n8n/                      # n8n configuration & data
├── .data/                     # Shared data volume
├── init-n8n-db.sh             # PostgreSQL initialization
└── init-mongo-db.js           # MongoDB initialization

🧩 Service Details

N8N Automation Platform

  • Queue-based execution with Redis (see the configuration sketch after this list)
  • Scalable worker instances (configurable)
  • PostgreSQL backend with optimized settings
  • Manual executions can be offloaded to workers
  • Workflows and credentials can be exported to snapshots/ and credentials/ via make backup
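
Queue mode is driven by a handful of n8n environment variables set in docker-compose.yml. An illustrative excerpt (the authoritative values live in the compose file, not here):

# n8n queue-mode settings (illustrative)
EXECUTIONS_MODE=queue
QUEUE_BULL_REDIS_HOST=redis
QUEUE_BULL_REDIS_PORT=6379
OFFLOAD_MANUAL_EXECUTIONS_TO_WORKERS=true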

MCP Server

  • FastMCP-based server providing document processing tools
  • Uses Dask for distributed computing
  • Web scraping capabilities with Playwright
  • ChromaDB integration for vector storage

Matrix Bot

  • Processes messages mentioning the bot
  • Forwards raw Matrix events to n8n webhooks (see the example call after this list)
  • Maintains thread replies for better UX
  • Token persistence for session management
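
The forwarded payload is essentially the raw Matrix event POSTed to a URL built from N8N_WEBHOOK_BASE_URL. A hand-rolled equivalent with curl, assuming a webhook path of matrix-events (the real path is defined by the receiving n8n workflow; use localhost:5678 instead of n8n:5678 when calling from the host):

curl -X POST "http://n8n:5678/webhook/matrix-events" \
  -H "Content-Type: application/json" \
  -d '{"type": "m.room.message", "sender": "@user:example.org", "content": {"body": "@bot hello"}}'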

🔄 Development Workflow

Local Development

For Python services (mcp_server & matrix):

cd mcp_server/  # or matrix/
uv sync         # Install dependencies  
uv run ./main.py  # Run locally

Backup & Restore

# Create complete backup
make backup

# Restore from backup
make import

Monitoring

# View service logs
docker compose logs -f [service_name]

# Check service status
make ps

# Access services in browser:
# - n8n interface: http://localhost:5678
# - Dask dashboard: http://localhost:8787
# - Qdrant: http://localhost:6333/dashboard

πŸ› οΈ Troubleshooting

Common Issues

  1. Services not starting: Check .env file configuration
  2. Worker scaling issues: Ensure Redis is healthy
  3. Database connection errors: Verify database initialization scripts
  4. Port conflicts: Check whether the required ports are already in use (see the check below)
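
For port conflicts, check what is already listening on the port in question, for example n8n's 5678:

# Linux; on macOS use: lsof -i :5678
ss -tlnp | grep 5678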

Health Checks

make ps                        # Check all services
docker compose logs postgres   # Database logs
docker compose logs redis      # Queue logs
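
Beyond the logs, the backing services can be probed directly; these commands assume the compose service names used above (postgres, redis):

docker compose exec redis redis-cli ping     # expect PONG
docker compose exec postgres pg_isready      # expect "accepting connections"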

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
