A comprehensive multi-service application that integrates n8n (automation platform), MCP (Model Context Protocol) server, Matrix bot, and various supporting services for document processing and AI integration.
This system provides a scalable automation platform with the following core components:
- n8n: Main automation platform with scalable worker instances
- MCP Server (deprecated): Python-based service providing document processing and AI tools
- Matrix Bot: Handles Matrix room events and forwards to n8n webhooks
- Vector Storage: Qdrant for embeddings, ChromaDB for MCP server
- Databases: PostgreSQL (n8n), MongoDB (general data), Redis (queue management)
Prerequisites:
- Docker & Docker Compose
- Make (for convenient commands)
Quick start:
- Configure environment: Create a .env file (see the Configuration section below)
- Add credentials (if restoring): Place credential files in the ./credentials/ directory
  - Credentials are not tracked by git (request them from an admin)
  - Required before running make import
- Start services:
  make up          # Start all services
  make up-ollama   # Start with Ollama (local LLM)
  make up-build    # Start and rebuild images
- Import workflows and credentials:
  make import
- Initial n8n setup:
  - Access n8n at http://localhost:5678
  - Complete registration and one-time setup
  - Enable features: In Settings, enter the license key received via email to unlock features
  - Install community nodes: In Settings > Community Nodes, install:
- Stop services when finished:
  make down

n8n workers handle workflow execution in parallel. Scale workers based on your workload:
# Scale workers to specific number
make scale WORKERS=8
# Check current service status
make ps
# Restart all services
make reload
# Restart with rebuild
make reload-build

Command reference:
make up              # Start all services
make up-ollama       # Start with Ollama profile
make up-build        # Start and rebuild images
make down            # Stop all services
make reload          # Restart all services
make reload-build    # Restart and rebuild images
make scale WORKERS=N # Scale to N workers
make ps              # Show service status
make backup          # Export workflows and credentials
make import          # Import workflows and credentials

Create a .env file with the following variables:
# Database Configuration
POSTGRES_USER=your_postgres_user
POSTGRES_PASSWORD=your_postgres_password
POSTGRES_DB=your_postgres_db
POSTGRES_NON_ROOT_USER=n8n_user
POSTGRES_NON_ROOT_PASSWORD=n8n_password
# MongoDB
MONGO_USERNAME=your_mongo_user
MONGO_PASSWORD=your_mongo_password
MONGO_DATABASE=your_mongo_db
# N8N Configuration
N8N_ENCRYPTION_KEY=your_encryption_key
# Matrix Bot (optional)
MATRIX_PASSWORD=your_matrix_password
N8N_WEBHOOK_BASE_URL=http://n8n:5678/webhook/
# AI Services (optional)
OPENAI_API_KEY=your_openai_key
GOOGLE_API_KEY=your_google_key

Exposed ports:
- 5678: n8n web interface
- 8000: MCP server
- 8787: Dask dashboard
- 6333/6334: Qdrant vector database
- 8001: ChromaDB
- 6379: Redis
- 27018: MongoDB
- 11434: Ollama (when using ollama profile)
better-docgpt-n8n/
├── docker-compose.yml   # Main service orchestration
├── makefile             # Convenient commands
├── .env                 # Environment configuration
├── mcp_server/          # MCP Protocol server
│   ├── main.py          # Server entry point
│   ├── config/          # Configuration modules
│   ├── project/         # Project tools
│   └── web/             # Web scraping tools
├── matrix/              # Matrix bot service
│   └── main.py          # Matrix bot implementation
├── snapshots/           # Exported n8n workflows
├── .n8n/                # n8n configuration & data
├── .data/               # Shared data volume
├── init-n8n-db.sh       # PostgreSQL initialization
└── init-mongo-db.js     # MongoDB initialization
Architecture details:

n8n:
- Queue-based execution with Redis (see the queue-mode sketch after this list)
- Scalable worker instances (configurable)
- PostgreSQL backend with optimized settings
- Manual executions can be offloaded to workers
- Workflows and credentials can be exported to snapshots/ and credentials/ via make backup
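
To make the queue setup concrete, the sketch below shows the kind of environment variables n8n uses for Redis-backed queue mode. The variable names are standard n8n settings, but the values and any extra options used here live in this repo's docker-compose.yml, so treat the block as an illustration rather than the project's actual configuration:

# Illustrative n8n queue-mode settings (values are assumptions, not taken from this repo)
EXECUTIONS_MODE=queue                        # run workflow executions through the Redis queue
QUEUE_BULL_REDIS_HOST=redis                  # Redis service acting as the broker
QUEUE_BULL_REDIS_PORT=6379
DB_TYPE=postgresdb                           # PostgreSQL backend
DB_POSTGRESDB_HOST=postgres
OFFLOAD_MANUAL_EXECUTIONS_TO_WORKERS=true    # manual runs go to workers as well
N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}     # must be identical on main and worker instances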
MCP server:
- FastMCP-based server providing document processing tools (see the minimal sketch after this list)
- Uses Dask for distributed computing
- Web scraping capabilities with Playwright
- ChromaDB integration for vector storage
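
The server's real entry point is mcp_server/main.py. Purely to illustrate the FastMCP pattern of registering tools, here is a minimal sketch; the server name, the word_count tool, and the transport note are hypothetical and not taken from this repo:

# Minimal FastMCP sketch (illustrative only; tool and names are hypothetical)
from fastmcp import FastMCP

mcp = FastMCP("document-tools")

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

if __name__ == "__main__":
    # FastMCP defaults to stdio transport; the project's server listens on port 8000,
    # so it presumably runs an HTTP/SSE transport instead.
    mcp.run()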
Matrix bot:
- Processes messages mentioning the bot
- Forwards raw Matrix events to n8n webhooks (see the sketch after this list)
- Maintains thread replies for better UX
- Token persistence for session management
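
The bot's actual implementation is in matrix/main.py. The snippet below is only a rough sketch of the forwarding pattern: it assumes matrix-nio and httpx, the homeserver, user ID, and webhook path are placeholders, and the real bot additionally filters for mentions, replies in threads, and persists its access token:

# Illustrative Matrix -> n8n forwarding loop (not the repo's actual code)
import asyncio
import os

import httpx
from nio import AsyncClient, RoomMessageText

N8N_WEBHOOK_BASE_URL = os.environ.get("N8N_WEBHOOK_BASE_URL", "http://n8n:5678/webhook/")

async def main():
    # Placeholder homeserver and user ID
    client = AsyncClient("https://matrix.example.org", "@bot:example.org")
    await client.login(os.environ["MATRIX_PASSWORD"])

    async def on_message(room, event: RoomMessageText):
        # Forward the raw event payload to an n8n webhook (the "matrix" path is hypothetical)
        async with httpx.AsyncClient() as http:
            await http.post(N8N_WEBHOOK_BASE_URL + "matrix", json=event.source)

    client.add_event_callback(on_message, RoomMessageText)
    await client.sync_forever(timeout=30000)  # keep listening for new room events

if __name__ == "__main__":
    asyncio.run(main())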
For Python services (mcp_server & matrix):
cd mcp_server/ # or matrix/
uv sync # Install dependencies
uv run ./main.py # Run locally

# Create complete backup
make backup
# Restore from backup
make import

# View service logs
docker compose logs -f [service_name]
# Check service status
make ps
# Access services in browser:
# - n8n interface: http://localhost:5678
# - Dask dashboard: http://localhost:8787
# - Qdrant: http://localhost:6333/dashboard

Troubleshooting:
- Services not starting: Check the .env file configuration
- Worker scaling issues: Ensure Redis is healthy
- Database connection errors: Verify database initialization scripts
- Port conflicts: Check if ports are already in use
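
For the port-conflict case, a small helper script can report which of the exposed ports are already taken before you run make up. This script is not part of the repository; it is just a convenience sketch based on the port list above:

# check_ports.py - hypothetical helper, not included in this repo
import socket

PORTS = {
    5678: "n8n", 8000: "MCP server", 8787: "Dask dashboard",
    6333: "Qdrant (HTTP)", 6334: "Qdrant (gRPC)", 8001: "ChromaDB",
    6379: "Redis", 27018: "MongoDB", 11434: "Ollama",
}

for port, service in PORTS.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        # connect_ex returns 0 when something is already listening on the port
        if sock.connect_ex(("127.0.0.1", port)) == 0:
            print(f"Port {port} ({service}) is already in use")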
make ps # Check all services
docker compose logs postgres # Database logs
docker compose logs redis    # Queue logs

This project is licensed under the MIT License - see the LICENSE file for details.