Market Data Service is a high-performance financial data API that provides comprehensive symbol prices across multiple markets through both RESTful endpoints and real-time WebSocket connections.
Complete guide for running Market Data Service in Docker containers for both development and production environments.
No other dependencies required! Everything runs in containers.
# 1. Clone the repository
git clone <repository-url>
cd market-data-service
# 2. Copy environment template
cp .env.example .env
# 3. Edit .env file with your settings (optional for development)
# Default values work for local development
# 4. Start all services
docker-compose up -d
# 5. Check service status
docker-compose ps
# 6. View logs
docker-compose logs -f api
The API will be available at http://localhost (through Nginx) and http://localhost:3000 (directly from the API container, assuming the default port mapping).
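Once the containers report healthy, a quick smoke test against the health endpoint (the same one used in the Health Checks section below) confirms the stack is serving requests:

# Through Nginx
curl http://localhost/health

# Directly against the API container (assumes the default PORT=3000 is published)
curl http://localhost:3000/health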
# 1. Set up environment variables
cp .env.example .env
# Edit .env with production values
# 2. Set up SSL certificates (if using HTTPS)
# Place certificates in ./ssl/certs/ and ./ssl/private/
# Or update SSL_CERT_PATH and SSL_KEY_PATH in docker-compose.prod.yml
# 3. Start production services
docker-compose -f docker-compose.prod.yml up -d
# 4. Check status
docker-compose -f docker-compose.prod.yml ps
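As an illustration of step 2, the certificate locations above might be wired into docker-compose.prod.yml roughly as follows; the service name, container paths, and file names are assumptions, so match them to your actual compose file:

services:
  nginx:
    volumes:
      - ./ssl/certs:/etc/ssl/certs:ro
      - ./ssl/private:/etc/ssl/private:ro
    environment:
      # Hypothetical values for the variables referenced in step 2
      - SSL_CERT_PATH=/etc/ssl/certs/fullchain.pem
      - SSL_KEY_PATH=/etc/ssl/private/privkey.pem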
The development environment consists of three services:
API Service (api)
Database Service (db)
Nginx Service (nginx)
# Start all services in detached mode
docker-compose up -d
# Start with logs visible
docker-compose up
# Rebuild containers after code changes
docker-compose up -d --build
- Edit code in the src/ directory
- Follow API logs: docker-compose logs -f api
- Restart the API after changes: docker-compose restart api

Migrations run automatically on container startup. To run manually:
# Run migrations
docker-compose exec api npx sequelize-cli db:migrate
# Rollback last migration
docker-compose exec api npx sequelize-cli db:migrate:undo
# Check migration status
docker-compose exec api npx sequelize-cli db:migrate:status
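If you need to write a new migration yourself, sequelize-cli can scaffold it inside the container (the migration name below is only an example):

# Generate an empty migration skeleton in the project's migrations directory
docker-compose exec api npx sequelize-cli migration:generate --name example-add-index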
# Connect to PostgreSQL
docker-compose exec db psql -U postgres -d financial_data
# Run SQL commands
docker-compose exec db psql -U postgres -d financial_data -c "SELECT * FROM symbols LIMIT 10;"
# List tables
docker-compose exec db psql -U postgres -d financial_data -c "\dt"
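psql meta-commands can also be passed with -c; for example, to inspect the structure of the symbols table referenced above:

# Describe the symbols table (columns, types, indexes)
docker-compose exec db psql -U postgres -d financial_data -c "\d symbols"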
Production environment uses optimized settings:
Environment Variables
# Set strong passwords and secrets
DB_PASSWORD=<strong_password>
JWT_SECRET=<strong_random_secret>
CORS_ORIGIN=https://your-domain.com
DOMAIN_NAME=your-domain.com
SSL_EMAIL=your-email@example.com
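One way to generate strong random values for these secrets (assuming openssl is available on the host):

# Generate a random database password and JWT secret
openssl rand -base64 32
openssl rand -base64 48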
See CONFIGURATION.md for complete environment variable documentation.
See DEPLOYMENT.md for production deployment with SSL certificate setup.
# Build production images
docker-compose -f docker-compose.prod.yml build
# Start services
docker-compose -f docker-compose.prod.yml up -d
# Check health status
docker-compose -f docker-compose.prod.yml ps
Key environment variables (see .env.example for complete list):
| Variable | Description | Default |
|---|---|---|
| `DB_HOST` | Database host | `db` (Docker) or `localhost` |
| `DB_PORT` | Database port | `5432` |
| `DB_NAME` | Database name | `financial_data` |
| `DB_USER` | Database user | `postgres` |
| `DB_PASSWORD` | Database password | Required |
| `PORT` | API port | `3000` |
| `NODE_ENV` | Environment | `development` or `production` |
| `CORS_ORIGIN` | Allowed origins | `*` |
| `JWT_SECRET` | JWT secret key | Required |
| `LOG_LEVEL` | Logging level | `debug` (dev) or `info` (prod) |
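Putting the defaults together, a minimal development .env might look like this; the password and JWT secret are placeholders for local use only:

DB_HOST=db
DB_PORT=5432
DB_NAME=financial_data
DB_USER=postgres
# Placeholder credentials -- suitable only for local development
DB_PASSWORD=postgres
PORT=3000
NODE_ENV=development
CORS_ORIGIN=*
JWT_SECRET=dev-secret-change-me
LOG_LEVEL=debug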
Nginx configuration files:
- nginx/nginx.conf (HTTP only)
- nginx/nginx.prod.conf (HTTPS with SSL)

Key features of both configurations include reverse proxying to the API container and WebSocket (Socket.IO) upgrade support.
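For WebSocket traffic to survive the proxy, the HTTP upgrade headers have to be forwarded to the API; an illustrative location block is shown below (a sketch, not the actual file contents, and the upstream name is assumed):

# Illustrative Socket.IO proxy block -- adjust upstream name and paths to your config
location /socket.io/ {
    proxy_pass http://api:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}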
Database is automatically initialized with:
- Database: financial_data
- User: postgres (or the value of DB_USER)

# All services
docker-compose logs -f
# Specific service
docker-compose logs -f api
docker-compose logs -f db
docker-compose logs -f nginx
# Last 100 lines
docker-compose logs --tail=100 api
# Restart all services
docker-compose restart
# Restart specific service
docker-compose restart api
# Stop all services
docker-compose down
# Stop and remove volumes (WARNING: deletes data)
docker-compose down -v
# Execute command in API container
docker-compose exec api sh
# Execute command in database container
docker-compose exec db sh
# Run Node.js commands
docker-compose exec api node -v
docker-compose exec api npm list
# 1. Pull latest code
git pull
# 2. Rebuild containers
docker-compose build
# 3. Restart services (migrations run automatically)
docker-compose up -d
# Or for production
docker-compose -f docker-compose.prod.yml build
docker-compose -f docker-compose.prod.yml up -d
# Check container health
docker-compose ps
# Test API health endpoint
curl http://localhost/health
# Test database connection
docker-compose exec db pg_isready -U postgres
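If a service lacks a healthcheck in your compose file, one can be added so that docker-compose ps reports health; a sketch for the API service, assuming curl is available inside the image:

services:
  api:
    healthcheck:
      # Probe the same /health endpoint used above
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
      interval: 30s
      timeout: 5s
      retries: 3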
Problem: API can't connect to database
Solutions:
# Check database is running
docker-compose ps db
# Check database logs
docker-compose logs db
# Test database connection manually
docker-compose exec db psql -U postgres -d financial_data
# Verify environment variables
docker-compose exec api env | grep DB_
Problem: Migrations fail on startup
Solutions:
# Check migration logs
docker-compose logs api | grep -i migration
# Run migrations manually
docker-compose exec api npx sequelize-cli db:migrate
# Check migration status
docker-compose exec api npx sequelize-cli db:migrate:status
Problem: WebSocket connections fail through Nginx
Solutions:
- Check Nginx logs: docker-compose logs nginx
- Test the WebSocket directly against the API (bypassing Nginx): ws://localhost:3000/socket.io/

Problem: Port already in use
Solutions:
# Check what's using the port
lsof -i :80
lsof -i :5432
# Change ports in docker-compose.yml
# Update PORT and DB_PORT environment variables
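If the ports cannot be freed, remapping the published ports in docker-compose.yml avoids the conflict; the host ports below (8080 and 5433) are arbitrary example choices:

services:
  nginx:
    ports:
      - "8080:80"    # serve HTTP on host port 8080 instead of 80
  db:
    ports:
      - "5433:5432"  # expose Postgres on host port 5433 instead of 5432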
Problem: Container exits immediately
Solutions:
# Check exit code
docker-compose ps
# View detailed logs
docker-compose logs api
# Check container status
docker ps -a
# Try starting without detached mode
docker-compose up api
Problem: Permission denied errors
Solutions:
# Fix script permissions
chmod +x docker/*.sh
# Check file ownership
ls -la docker/
# Rebuild containers
docker-compose build --no-cache
# Create backup
docker-compose exec db pg_dump -U postgres financial_data > backup_$(date +%Y%m%d_%H%M%S).sql
# Or using docker directly
docker exec market-data-db pg_dump -U postgres financial_data > backup.sql
# Restore from backup
docker-compose exec -T db psql -U postgres financial_data < backup.sql
# Or using docker directly
docker exec -i market-data-db psql -U postgres financial_data < backup.sql
# Backup database volume
docker run --rm -v market_data_db_data:/data -v $(pwd):/backup alpine tar czf /backup/db_backup.tar.gz /data
# Restore database volume
docker run --rm -v market_data_db_data:/data -v $(pwd):/backup alpine tar xzf /backup/db_backup.tar.gz -C /
Create a cron job or scheduled task:
# Daily backup script
#!/bin/bash
BACKUP_DIR="./backups"
DATE=$(date +%Y%m%d_%H%M%S)
mkdir -p $BACKUP_DIR
docker-compose exec -T db pg_dump -U postgres financial_data | gzip > $BACKUP_DIR/backup_$DATE.sql.gz
# Keep only last 30 days
find $BACKUP_DIR -name "backup_*.sql.gz" -mtime +30 -delete
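Assuming the script above is saved as an executable file in the project (the path and file name below are only examples), a crontab entry to run it daily at 02:00 would be:

# Example crontab entry (edit with crontab -e); paths are hypothetical
0 2 * * * cd /path/to/market-data-service && ./scripts/daily_backup.sh >> ./backups/cron.log 2>&1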
For development, update MT5 EA settings:
- ApiBaseUrl: http://market-data.local

For production, update MT5 EA settings:
- ApiBaseUrl: https://market-price.insightbull.io (or your production domain)
- Use the wss:// protocol for WebSocket connections

PostgreSQL tuning:
# Increase shared buffers (edit postgresql.conf in container)
# Or use environment variables in docker-compose.yml

Nginx tuning:
- Adjust worker_processes in nginx.conf
- Set worker_connections based on expected load

Limit resources in docker-compose.prod.yml:
services:
api:
deploy:
resources:
limits:
cpus: '2'
memory: 2G
reservations:
cpus: '1'
memory: 1G
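For the database tuning noted above, one option is passing server flags through the db service's command instead of editing postgresql.conf inside the container; the values here are illustrative, not recommendations:

services:
  db:
    # Override the default arguments of the postgres image with tuned settings
    command: postgres -c shared_buffers=512MB -c max_connections=200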
Security:
- Use strong values for DB_PASSWORD and JWT_SECRET
- Restrict CORS_ORIGIN in production

For issues or questions:
- Check the logs: docker-compose logs