A polished deployment of the Task Manager capstone requires more than working code — it needs a documented API, a contribution workflow, a changelog, and the production environment configuration that ties everything together. This lesson covers OpenAPI documentation with Swagger UI (auto-generated from Zod schemas), the complete production environment configuration, the make-based developer workflow, and the architecture documentation that makes the project maintainable long after the initial build.
Production Readiness Checklist
| Category | Item | Status Check |
|---|---|---|
| Security | HTTPS only, HSTS header, JWT secrets ≥32 chars, no secrets in code | helmet() + app.set('trust proxy', 1) |
| Authentication | Access token in memory, refresh token in httpOnly cookie, token rotation | Auth store + secure cookie flags |
| Database | Connection pooling, indexes created, slow query logging enabled | maxPoolSize + schema.index() definitions |
| Caching | Redis connected, TTLs defined, cache invalidation tested | CacheService with getOrSet pattern |
| Rate limiting | Global + auth endpoint + per-user limits in Redis | Sliding window Lua script |
| Error handling | All errors return structured JSON, no stack traces in production | errorMiddleware with NODE_ENV check |
| Logging | Structured JSON logs, correlation IDs, no sensitive data logged | Winston + AsyncLocalStorage |
| Monitoring | Prometheus metrics endpoint, health probes, uptime check | prom-client + /health/ready |
| Documentation | OpenAPI spec, README with setup steps, .env.example | Swagger UI + docs/ |
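Several checklist rows boil down to a few lines of middleware. As a hedged sketch of the error-handling row — structured JSON, no stack traces in production — the shape below assumes this lesson's `success`/`code` response format; the exact field names are an assumption:

```javascript
// Minimal sketch: structured JSON errors, stack traces only outside production.
function errorMiddleware(err, req, res, next) {
  const status = err.statusCode || 500;
  const body = {
    success: false,
    message: err.message || 'Internal server error',
    code: err.code || 'INTERNAL_ERROR',
  };
  // Never leak stack traces to clients in production.
  if (process.env.NODE_ENV !== 'production') {
    body.stack = err.stack;
  }
  res.status(status).json(body);
}

module.exports = errorMiddleware;
```

Registered last in the Express middleware chain, this satisfies the "no stack traces in production" check with a single environment test.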
Using zod-to-openapi to convert Zod validation schemas into OpenAPI components ensures the documentation is always in sync with the actual validation. A separate documentation file that diverges from the code is worse than no documentation.
The seed script lives at apps/api/src/scripts/seed.js and runs with make seed. Never run the seed script against a production database.
The CORS_ORIGINS environment variable in production must list only your production frontend domains — never *. A wildcard CORS policy with credentials: true is rejected by browsers (the spec disallows the combination), and a wildcard with credentials disabled still lets any website call your API and read its responses. Explicitly list the origins: CORS_ORIGINS=https://app.taskmanager.io,https://www.taskmanager.io.
Complete Documentation and Dev Workflow
# ── openapi.yml — API documentation (excerpt) ────────────────────────────
openapi: 3.1.0
info:
  title: Task Manager API
  version: 1.0.0
  description: Complete MEAN Stack Task Manager REST API
servers:
  - url: http://localhost:3000/api/v1
    description: Development
  - url: https://api.taskmanager.io/api/v1
    description: Production
components:
  securitySchemes:
    bearerAuth:
      type: http
      scheme: bearer
      bearerFormat: JWT
  schemas:
    Task:
      type: object
      properties:
        id: { type: string, format: objectid }
        title: { type: string, maxLength: 500 }
        status: { type: string, enum: [todo, in-progress, in-review, done, cancelled] }
        priority: { type: string, enum: [none, low, medium, high, urgent] }
        # OpenAPI 3.1 drops `nullable: true`; nullability is a type array
        dueDate: { type: ["string", "null"], format: date-time }
        tags: { type: array, items: { type: string } }
        isOverdue: { type: boolean }
        createdAt: { type: string, format: date-time }
        updatedAt: { type: string, format: date-time }
      required: [id, title, status, priority, tags, isOverdue]
    CreateTaskRequest:
      type: object
      required: [title, workspaceId]
      properties:
        title: { type: string, minLength: 1, maxLength: 500 }
        description: { type: string, maxLength: 10000 }
        priority: { type: string, enum: [none, low, medium, high, urgent] }
        dueDate: { type: string, format: date-time }
        tags: { type: array, items: { type: string }, maxItems: 10 }
        workspaceId: { type: string, format: objectid }
    ValidationError:
      type: object
      properties:
        message: { type: string, example: "Validation failed" }
        code: { type: string, example: "VALIDATION_ERROR" }
        fields:
          type: object
          additionalProperties: { type: string }
          example: { title: "Title is required" }
security:
  - bearerAuth: []
paths:
  /tasks:
    post:
      summary: Create a task
      operationId: createTask
      tags: [Tasks]
      requestBody:
        required: true
        content:
          application/json:
            schema: { $ref: '#/components/schemas/CreateTaskRequest' }
      responses:
        '201':
          description: Task created
          content:
            application/json:
              schema:
                type: object
                properties:
                  success: { type: boolean, example: true }
                  data: { $ref: '#/components/schemas/Task' }
        '400':
          description: Validation error
          content:
            application/json:
              schema: { $ref: '#/components/schemas/ValidationError' }
        '401':
          description: Unauthorized
# Makefile — developer workflow
.PHONY: up down build clean logs logs-api test test-api seed shell-api shell-mongo docs typecheck lint help

up: ## Start all services in development mode
	docker compose up -d

build: ## Rebuild images and start
	docker compose up -d --build

down: ## Stop all services
	docker compose down

clean: ## Stop services and remove volumes (DELETES DATA)
	docker compose down -v

logs: ## Follow all service logs
	docker compose logs -f

logs-api: ## Follow API logs
	docker compose logs -f api

test: ## Run all tests
	docker compose exec api npm test
	npm run test --workspace=apps/client -- --watch=false --browsers=ChromeHeadless

test-api: ## Run API tests only (with coverage)
	docker compose exec api npm run test:coverage

seed: ## Seed database with realistic test data
	docker compose exec api node src/scripts/seed.js

shell-api: ## Open shell in API container
	docker compose exec api sh

shell-mongo: ## Open MongoDB shell
	docker compose exec mongodb mongosh taskmanager

docs: ## Serve OpenAPI docs locally (requires redoc-cli)
	npx redoc-cli serve docs/openapi.yml --port 8080

typecheck: ## TypeScript type check all packages
	npm run typecheck --workspaces --if-present

lint: ## Lint all packages
	npm run lint --workspaces --if-present

help: ## Show this help message
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | \
		awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-15s\033[0m %s\n", $$1, $$2}'
// apps/api/src/scripts/seed.js — realistic development data
require('dotenv').config();
const mongoose = require('mongoose');
const { faker } = require('@faker-js/faker');
const User = require('../modules/users/user.model');
const Workspace = require('../modules/workspaces/workspace.model');
const Task = require('../modules/tasks/task.model');

const STATUSES = ['todo', 'in-progress', 'in-review', 'done', 'cancelled'];
const PRIORITIES = ['none', 'low', 'medium', 'high', 'urgent'];
const TAG_POOL = ['bug', 'feature', 'docs', 'refactor', 'test', 'design', 'backend', 'frontend', 'urgent', 'blocked'];

async function seed() {
  // Safety net: this script wipes collections. Never run it against production.
  if (process.env.NODE_ENV === 'production') {
    throw new Error('Refusing to seed a production database');
  }
  await mongoose.connect(process.env.MONGO_URI);
  console.log('Connected. Clearing existing data...');
  await Promise.all([User.deleteMany({}), Workspace.deleteMany({}), Task.deleteMany({})]);

  // Create admin user (User.create runs the pre-save hook, so the password is hashed)
  const admin = await User.create({
    name: 'Admin User',
    email: 'admin@taskmanager.io',
    password: 'Password123!',
    role: 'admin',
    isVerified: true,
  });

  // Create 9 regular users. insertMany skips the pre-save hashing hook, so a
  // placeholder "hash" is stored; these accounts exist for display, not login.
  const users = await User.insertMany(
    Array.from({ length: 9 }, () => ({
      name: faker.person.fullName(),
      email: faker.internet.email().toLowerCase(),
      password: '$2b$12$fixedHashForSeedingOnly',
      isVerified: true,
    }))
  );
  const allUsers = [admin, ...users];

  // Create 3 workspaces with 5–7 members each
  const workspaces = await Workspace.insertMany(
    Array.from({ length: 3 }, (_, i) => ({
      name: faker.company.name(),
      slug: faker.lorem.slug(2) + '-' + i,
      members: allUsers.slice(0, 5 + i).map((u, j) => ({
        userId: u._id,
        role: j === 0 ? 'owner' : j === 1 ? 'admin' : 'member',
      })),
      createdBy: admin._id,
    }))
  );

  // Create 200 tasks per workspace, spread across statuses, priorities, and dates
  let totalTasks = 0;
  for (const ws of workspaces) {
    const wsMembers = ws.members.map((m) => m.userId);
    const tasks = Array.from({ length: 200 }, () => {
      const status = faker.helpers.arrayElement(STATUSES);
      const createdAt = faker.date.past({ years: 1 });
      return {
        title: faker.hacker.phrase().slice(0, 100),
        description: Math.random() > 0.4 ? faker.lorem.paragraph() : undefined,
        status,
        priority: faker.helpers.arrayElement(PRIORITIES),
        dueDate: Math.random() > 0.5 ? faker.date.future({ years: 0.5 }) : undefined,
        tags: faker.helpers.arrayElements(TAG_POOL, { min: 0, max: 4 }),
        assignees: faker.helpers.arrayElements(wsMembers, { min: 0, max: 3 }),
        workspace: ws._id,
        createdBy: faker.helpers.arrayElement(wsMembers),
        createdAt,
        updatedAt: faker.date.between({ from: createdAt, to: new Date() }),
        completedAt: status === 'done' ? faker.date.recent({ days: 30 }) : undefined,
      };
    });
    await Task.insertMany(tasks);
    totalTasks += tasks.length;
  }

  console.log(`Seed complete: ${allUsers.length} users, ${workspaces.length} workspaces, ${totalTasks} tasks`);
  console.log('Admin login: admin@taskmanager.io / Password123!');
  await mongoose.disconnect();
}

seed().catch((err) => { console.error(err); process.exit(1); });
How It Works
Step 1 — OpenAPI Documentation as Code Stays in Sync
Generating the OpenAPI spec from Zod schemas (via zod-to-openapi) or writing it alongside the validation schemas ensures the documentation reflects the actual API behaviour. When a Zod schema changes — a field becomes optional, a new status is added — the OpenAPI spec is regenerated and reflects the change. Manually maintained docs drift from the implementation within weeks; code-generated docs are always accurate.
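A minimal sketch of the generation step, assuming the @asteasolutions/zod-to-openapi package (the registry and generator names below are that library's API; check them against the version you install):

```javascript
// Sketch only — assumes zod and @asteasolutions/zod-to-openapi are installed.
const { z } = require('zod');
const {
  extendZodWithOpenApi,
  OpenAPIRegistry,
  OpenApiGeneratorV31,
} = require('@asteasolutions/zod-to-openapi');

extendZodWithOpenApi(z);
const registry = new OpenAPIRegistry();

// The same Zod schema the route validation uses becomes an OpenAPI component,
// so a change to validation changes the docs on the next regeneration.
registry.register(
  'CreateTaskRequest',
  z.object({
    title: z.string().min(1).max(500),
    workspaceId: z.string(),
    priority: z.enum(['none', 'low', 'medium', 'high', 'urgent']).optional(),
  })
);

const generator = new OpenApiGeneratorV31(registry.definitions);
const doc = generator.generateDocument({
  openapi: '3.1.0',
  info: { title: 'Task Manager API', version: '1.0.0' },
});

// The generated document can then be served, e.g. with swagger-ui-express:
// app.use('/docs', swaggerUi.serve, swaggerUi.setup(doc));
```

Running this in a build step (or the make docs target) and writing `doc` out as docs/openapi.yml keeps the committed spec mechanically derived from the validation code.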
Step 2 — Seed Script Creates Representative Test Data
Faker.js generates realistic but fake data — names, emails, sentences, dates. The seed script creates a representative cross-section: tasks in every status and priority combination, tasks with and without due dates, tasks with 0–4 tags, tasks with 0–3 assignees, tasks created across the past year. This variety surfaces UI bugs (empty state handling, long title truncation, overdue badge display) that would not appear with minimal or uniform test data.
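The seeded dueDate spread also exercises overdue display. As a hedged sketch of the kind of virtual the Task model might compute (the name comes from the isOverdue field in the OpenAPI schema; the exact rule here is an assumption):

```javascript
// Assumed rule: a task is overdue when it has a due date in the past
// and is not already finished or cancelled.
function isOverdue(task, now = new Date()) {
  if (!task.dueDate) return false;
  if (task.status === 'done' || task.status === 'cancelled') return false;
  return new Date(task.dueDate) < now;
}
```

Because roughly half of seeded tasks have no dueDate and statuses are uniformly spread, the UI's overdue badge gets exercised in every combination: missing date, past date on an open task, past date on a completed task.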
Step 3 — Makefile Documents the Development Workflow
The Makefile serves as both a task runner and living documentation. make help (generated by the grep/awk pattern) outputs every target with its description. New team members run make help and immediately know how to start the development environment, run tests, seed data, and access the database shell — without reading through README prose. The targets also encode the correct command-line arguments, preventing the “which flags do I need?” problem.
Step 4 — .env.example Is the Configuration Contract
Every environment variable the application uses is documented in .env.example with its purpose, expected format, and where to get the value for local development. This file is the onboarding document for configuration — a new developer copies it to .env, reads the comments, fills in the values, and runs make up. It also serves as a checklist for production deployment — every variable in .env.example must be set in the production environment.
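One way to make that contract executable is to validate required variables at boot. This is a hedged sketch: the variable names match this lesson's .env.example, but the helper itself is an assumption, not part of the capstone code:

```javascript
// Fail fast at startup if required configuration is missing or malformed.
// REQUIRED mirrors a subset of .env.example.
const REQUIRED = ['MONGO_URI', 'REDIS_URL', 'JWT_SECRET', 'REFRESH_SECRET', 'CORS_ORIGINS'];

function validateEnv(env = process.env) {
  const missing = REQUIRED.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  // Matches the checklist: JWT secrets must be at least 32 characters.
  for (const key of ['JWT_SECRET', 'REFRESH_SECRET']) {
    if (env[key].length < 32) {
      throw new Error(`${key} must be at least 32 characters`);
    }
  }
  return true;
}

module.exports = { validateEnv };
```

Calling validateEnv() before the server listens turns a misconfigured deployment into an immediate, descriptive crash instead of a runtime surprise on the first request.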
Step 5 — Architecture Documentation Prevents Knowledge Decay
A docs/ARCHITECTURE.md that explains the monorepo structure, data flow, authentication pattern, caching strategy, and real-time architecture allows future developers (including future you) to understand design decisions without reverse-engineering the code. Document the “why” not just the “what” — why the access token is in memory rather than localStorage, why Change Streams drive cache invalidation rather than write-through, why the ESR rule determines index field order.
Quick Reference — Environment Variables
# apps/api/.env.example
NODE_ENV=development
PORT=3000
# MongoDB — get connection string from Atlas or use Docker
MONGO_URI=mongodb://localhost:27017/taskmanager
# Redis
REDIS_URL=redis://localhost:6379
# JWT — generate with: openssl rand -hex 32
JWT_SECRET=change-me-in-production-must-be-32-chars-minimum
REFRESH_SECRET=change-me-too-must-also-be-32-chars-minimum
JWT_EXPIRES_IN=15m
REFRESH_EXPIRES_IN=7d
# Email (Mailtrap for dev, SES/SendGrid for production)
SMTP_HOST=smtp.mailtrap.io
SMTP_PORT=587
SMTP_USER=your-mailtrap-user
SMTP_PASS=your-mailtrap-pass
FROM_EMAIL=noreply@taskmanager.io
# File uploads
CLOUDINARY_CLOUD_NAME=your-cloud-name
CLOUDINARY_API_KEY=your-api-key
CLOUDINARY_API_SECRET=your-api-secret
# CORS — comma-separated origins
CORS_ORIGINS=http://localhost:4200
# App
APP_URL=http://localhost:3000
CLIENT_URL=http://localhost:4200