Advanced Node.js Interview Questions and Answers

📋 Table of Contents
  1. Questions & Answers
  2. 📝 Knowledge Check

🟢 Advanced Node.js Interview Questions

This lesson targets mid-to-senior Node.js roles. Topics include Streams, the event loop in depth, child processes, Worker Threads, the cluster module, debugging, performance, testing, the net module, and production patterns. These questions separate Node.js users from those who truly understand the runtime.

Questions & Answers

01 What are Streams in Node.js? What are the four types?

Streams Streams are objects that let you read or write data chunk by chunk (rather than loading everything into memory at once). They are EventEmitter instances and form the backbone of Node.js I/O: HTTP requests, file reads, network connections, and process I/O are all streams.

  • Readable - a source of data you read from: fs.createReadStream(), http.IncomingMessage, process.stdin
  • Writable - a destination you write data to: fs.createWriteStream(), http.ServerResponse, process.stdout
  • Duplex - both readable and writable simultaneously: TCP sockets (net.Socket), WebSocket connections
  • Transform - a Duplex stream that transforms data as it passes through: zlib.createGzip(), crypto.createCipheriv(), CSV parsers
const fs   = require('fs');
const zlib = require('zlib');

// pipe() - connect streams; handles backpressure automatically
fs.createReadStream('large-file.csv')         // Readable
  .pipe(zlib.createGzip())                    // Transform (compress)
  .pipe(fs.createWriteStream('large-file.csv.gz')); // Writable

// pipeline() - preferred over pipe(); forwards errors and cleans up all streams
const { pipeline } = require('stream/promises');
await pipeline(   // inside an async function (top-level await needs ESM)
  fs.createReadStream('input.csv'),
  zlib.createGzip(),
  fs.createWriteStream('output.csv.gz')
); // throws on error, closes all streams on completion or error

02 What is backpressure in Node.js streams and how do you handle it?

Streams Backpressure occurs when a Writable stream cannot process data as fast as a Readable produces it. Without handling it, data piles up in the Writable's internal buffer - memory grows without bound and can crash the process.

// โŒ WITHOUT backpressure handling โ€” memory blows up
const readable = fs.createReadStream('huge-file.txt');
const writable = fs.createWriteStream('output.txt');

readable.on('data', chunk => {
  writable.write(chunk); // writable.write() returns false when buffer is full
  // but we never check! readable continues emitting data
});

// ✅ Manual backpressure handling
readable.on('data', chunk => {
  const ok = writable.write(chunk);
  if (!ok) {
    readable.pause();                        // stop producing data
    writable.once('drain', () => readable.resume()); // resume when buffer drains
  }
});

// ✅ BEST: use pipe() or pipeline() - handles backpressure automatically
readable.pipe(writable);

// ✅ ALSO GOOD: async iteration - reading is paced for you; writes still check drain
async function copyFile(src, dest) {
  const reader = fs.createReadStream(src);
  const writer = fs.createWriteStream(dest);
  for await (const chunk of reader) {
    if (!writer.write(chunk)) {
      await new Promise(resolve => writer.once('drain', resolve));
    }
  }
}

03 How do you create a custom Transform stream?

Streams A custom Transform stream processes incoming data chunks and pushes transformed output.

const { Transform } = require('stream');

// Custom Transform: uppercase every chunk of text
class UpperCaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    const upper = chunk.toString().toUpperCase();
    this.push(upper);  // push transformed data downstream
    callback();        // signal this chunk is done (can pass error: callback(err))
  }

  _flush(callback) {
    // Called when the source stream ends - push any remaining buffered data
    this.push('\n--- END ---\n');
    callback();
  }
}

// Usage
fs.createReadStream('input.txt')
  .pipe(new UpperCaseTransform())
  .pipe(fs.createWriteStream('output.txt'));

// Simpler - pass a transform() option to the constructor (no subclass needed)
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase()); // pass result directly
  }
});

// Real-world example: JSON line parser
class JSONLineParser extends Transform {
  constructor() { super({ objectMode: true }); this._buffer = ''; }
  _transform(chunk, enc, cb) {
    this._buffer += chunk.toString();
    const lines = this._buffer.split('\n');
    this._buffer = lines.pop(); // keep incomplete last line
    lines.filter(l => l.trim()).forEach(l => this.push(JSON.parse(l)));
    cb();
  }
}

04 What are Child Processes in Node.js? What are the four methods?

Child Processes The child_process module lets you spawn subprocesses - shell commands, scripts, or other programs - from Node.js. There are four main methods:

const cp = require('child_process');

// 1. exec() - run a shell command, buffer output in memory (simple, small output)
cp.exec('git log --oneline -5', (err, stdout, stderr) => {
  if (err) return console.error(err);
  console.log(stdout);
});
// Promise version (use inside an async function in CommonJS):
const { promisify } = require('util');
const execAsync = promisify(cp.exec);
const { stdout } = await execAsync('ls -la');

// 2. execFile() - like exec() but runs a file directly (no shell, safer)
cp.execFile('./scripts/deploy.sh', ['--env', 'prod'], (err, stdout) => { /* ... */ });

// 3. spawn() - launch a process with streaming I/O (large output, long-running)
const child = cp.spawn('ffmpeg', ['-i', 'input.mp4', 'output.webm']);
child.stdout.on('data', chunk => process.stdout.write(chunk));
child.stderr.on('data', chunk => process.stderr.write(chunk));
child.on('close', code => console.log('exited with code', code));

// 4. fork() - spawn a new NODE.JS process with an IPC channel (message passing)
const worker = cp.fork('./worker.js');
worker.send({ task: 'processImage', file: 'photo.jpg' });
worker.on('message', result => console.log('Result:', result));
worker.on('exit', code => console.log('Worker exited', code));

05 What are Worker Threads? How do they differ from Child Processes?

Concurrency Worker Threads (introduced in Node.js 10, stable since 12) run JavaScript in parallel threads within the same process. Unlike child processes, workers can share memory via SharedArrayBuffer and Atomics.

// main.js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread - spawn a worker for a CPU-intensive task
  const worker = new Worker(__filename, {
    workerData: { numbers: [1, 2, 3, 4, 5], operation: 'sum' }
  });

  worker.on('message', result => console.log('Result:', result));
  worker.on('error',   err    => console.error('Error:', err));
  worker.on('exit',    code   => { if (code !== 0) console.error('Worker stopped'); });
} else {
  // Worker thread - runs in a separate thread
  const { numbers, operation } = workerData;
  const result = operation === 'sum'
    ? numbers.reduce((a, b) => a + b, 0)
    : Math.max(...numbers);
  parentPort.postMessage(result); // send result to main thread
}

Worker Threads vs Child Processes:

  • Worker Threads - lighter (same process, shared memory), faster startup, best for CPU-intensive JavaScript (image processing, encryption, JSON parsing of large files)
  • Child Processes - full process isolation, can run any language, each has its own memory/Node.js instance, best for running external programs, shell commands, or completely isolated workloads

06 What is the cluster module? How does it differ from Worker Threads?

Scaling The cluster module forks multiple Node.js processes (typically one per CPU core) that all share the same server port. The primary process distributes incoming connections to workers using round-robin scheduling (the default on Linux/macOS), while on Windows distribution is left to the OS.

const cluster = require('cluster');
const http    = require('http');
const os      = require('os');
const numCPUs = os.cpus().length;

if (cluster.isPrimary) {
  console.log(`Primary ${process.pid}: forking ${numCPUs} workers`);

  for (let i = 0; i < numCPUs; i++) cluster.fork();

  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died (${signal || code}). Restarting...`);
    cluster.fork();
  });
} else {
  // Each worker runs the full HTTP server independently
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end(`Handled by worker ${process.pid}\n`);
  }).listen(3000);

  console.log(`Worker ${process.pid} started`);
}

Cluster vs Worker Threads:

  • Cluster - for scaling HTTP servers across CPU cores; each worker is a full Node.js process; no shared memory; best for I/O-bound web servers
  • Worker Threads - for parallelising CPU-intensive tasks within a single process; shared memory possible; best for compute-heavy operations like image processing and data transformation

07 How do you debug a Node.js application?

Debugging

1. Chrome DevTools (most powerful):

node --inspect server.js         # start inspector on localhost:9229
node --inspect-brk server.js     # pause at first line (--brk = break)
# Open chrome://inspect in Chrome → click "inspect" under Remote Target

2. VS Code debugger (most convenient):

// .vscode/launch.json
{
  "version": "0.2.0",
  "configurations": [{
    "type": "node",
    "request": "launch",
    "name": "Debug App",
    "program": "${workspaceFolder}/src/server.js",
    "env": { "NODE_ENV": "development" },
    "restart": true, // restart on file changes with nodemon
    "runtimeExecutable": "nodemon"
  }]
}

3. Built-in debugger:

node inspect server.js   # CLI debugger
# Commands: cont (c), next (n), step (s), out (o), repl, watch('expr')

4. Programmatic breakpoints:

debugger; // pauses execution when inspector is attached

// Conditional logging
const DEBUG = process.env.DEBUG;
if (DEBUG) console.log('State:', JSON.stringify(state, null, 2));

5. debug npm package:

const debug = require('debug')('myapp:http');
debug('request received: %o', req.headers);
// Enable: DEBUG=myapp:* node server.js

08 What is the net module? How do you create a TCP server?

Networking The net module provides an asynchronous network API for creating TCP/IPC servers and clients. HTTP (and Express) is built on top of it.

const net = require('net');

// TCP Server
const server = net.createServer((socket) => {
  console.log('Client connected:', socket.remoteAddress, socket.remotePort);

  socket.setEncoding('utf8');

  socket.on('data', (data) => {
    console.log('Received:', data.trim());
    socket.write(`Echo: ${data}`); // write back to client
  });

  socket.on('end', () => console.log('Client disconnected'));
  socket.on('error', err => console.error('Socket error:', err.message));
});

server.listen(8080, '127.0.0.1', () => {
  console.log('TCP server on port 8080');
});

// TCP Client
const client = net.createConnection({ port: 8080, host: '127.0.0.1' }, () => {
  console.log('Connected to server');
  client.write('Hello, server!\n');
});

client.on('data', data => {
  console.log('Server response:', data.toString());
  client.end(); // close connection
});

// Unix domain socket (IPC between processes on the same machine)
server.listen('/tmp/my.sock');
net.createConnection('/tmp/my.sock', () => { /* connected */ });

09 What is the readline module? How do you build CLI tools with it?

I/O The readline module reads input line by line from a Readable stream - used for interactive CLI prompts, reading large files line by line, and REPL-style tools.

const readline = require('readline');

// ── Interactive CLI prompt ──
const rl = readline.createInterface({
  input:  process.stdin,
  output: process.stdout
});

rl.question('What is your name? ', (name) => {
  console.log(`Hello, ${name}!`);
  rl.question('How old are you? ', (age) => {
    console.log(`You are ${age} years old.`);
    rl.close();
  });
});

// ── Promise-based question via readline/promises (Node 17+) ──
const rlp = require('readline/promises');
const rl2 = rlp.createInterface({ input: process.stdin, output: process.stdout });
const name = await rl2.question('Enter your name: '); // inside an async function
rl2.close();

// ── Read a large file line by line (memory-efficient; for await needs an async context) ──
const fs = require('fs');
const fileRl = readline.createInterface({
  input: fs.createReadStream('./large-log.txt'),
  crlfDelay: Infinity  // handle \r\n and \n line endings
});

let lineCount = 0;
for await (const line of fileRl) {
  lineCount++;
  if (line.includes('ERROR')) console.log(`Line ${lineCount}: ${line}`);
}
console.log(`Total lines processed: ${lineCount}`);

10 What is the zlib module? How do you compress data in Node.js?

Compression The zlib module provides gzip, deflate, and Brotli compression and decompression - both as streams and as one-shot callback functions (promisify them for async/await use).

const zlib = require('zlib');
const { promisify } = require('util');

// One-shot compression (for small data)
const gzip       = promisify(zlib.gzip);
const gunzip     = promisify(zlib.gunzip);
const brotliComp = promisify(zlib.brotliCompress);
const brotliDecomp = promisify(zlib.brotliDecompress);

const original = Buffer.from('Hello, World! '.repeat(1000));
const compressed = await gzip(original); // inside an async function
console.log(`${original.length} bytes → ${compressed.length} bytes`);

const restored = await gunzip(compressed);
console.log(restored.toString() === original.toString()); // true

// Stream-based (for large files)
const { pipeline } = require('stream/promises');
await pipeline(
  fs.createReadStream('video.mp4'),
  zlib.createGzip({ level: 9 }), // 1 (fastest) to 9 (best compression)
  fs.createWriteStream('video.mp4.gz')
);

// HTTP response compression (raw Node - with Express, use the compression middleware)
const http = require('http');
http.createServer((req, res) => {
  const acceptEncoding = req.headers['accept-encoding'] || '';
  const data = Buffer.from(JSON.stringify({ large: 'payload' }));
  if (acceptEncoding.includes('br')) {
    res.writeHead(200, { 'Content-Encoding': 'br' });
    zlib.brotliCompress(data, (err, result) => { if (!err) res.end(result); });
  } else if (acceptEncoding.includes('gzip')) {
    res.writeHead(200, { 'Content-Encoding': 'gzip' });
    zlib.gzip(data, (err, result) => { if (!err) res.end(result); });
  } else {
    res.end(data);
  }
}).listen(3000);

11 What is the DNS module in Node.js?

Networking The dns module provides DNS resolution functions - looking up IP addresses, MX records, TXT records, and more.

const dns = require('dns');
const { Resolver, promises: dnsPromises } = require('dns');

// Simple hostname lookup (uses the OS resolver - hosts file + system DNS)
dns.lookup('example.com', (err, address, family) => {
  console.log(address); // e.g. "93.184.216.34"
  console.log(family);  // 4 (IPv4) or 6 (IPv6)
});
});

// DNS resolve - bypasses the OS and queries DNS servers directly
// Returns ALL addresses (dns.lookup returns only one)
const addresses = await dnsPromises.resolve4('example.com');   // IPv4 records
const ipv6      = await dnsPromises.resolve6('example.com');   // IPv6 records
const mx        = await dnsPromises.resolveMx('example.com');  // Mail servers
const txt       = await dnsPromises.resolveTxt('example.com'); // TXT records
const ns        = await dnsPromises.resolveNs('example.com');  // Name servers

// Reverse lookup - IP to hostname
const hostnames = await dnsPromises.reverse('8.8.8.8');
console.log(hostnames); // ['dns.google']

// Custom resolver (use a specific DNS server)
const resolver = new Resolver();
resolver.setServers(['8.8.8.8', '1.1.1.1']); // Google and Cloudflare DNS
const recs = await resolver.resolve4('example.com');

12 What is the Node.js test runner (node:test)?

Testing Node.js 18 introduced a built-in test runner (node:test, stable since Node 20) that pairs with the built-in node:assert module - no external test framework needed for basic testing.

// math.test.js
const test   = require('node:test');
const assert = require('node:assert/strict');

// Basic test
test('adds two numbers correctly', () => {
  assert.equal(add(2, 3), 5);
  assert.notEqual(add(2, 3), 6);
});

// Async test
test('fetches user successfully', async () => {
  const user = await fetchUser(1);
  assert.ok(user, 'user should exist');
  assert.match(user.email, /^.+@.+\..+$/);
});

// Test groups (describe blocks)
test('UserService', async (t) => {
  await t.test('create โ€” saves user to database', async () => {
    const user = await UserService.create({ name: 'Alice', email: 'a@b.com' });
    assert.ok(user._id);
  });

  await t.test('getById โ€” returns null for missing user', async () => {
    const user = await UserService.getById('nonexistent');
    assert.equal(user, null);
  });
});

// Mocking
test('uses mocked Date', (t) => {
  t.mock.timers.enable({ apis: ['Date'], now: new Date('2026-01-01') });
  assert.equal(new Date().getFullYear(), 2026);
});

// Run: node --test math.test.js
//  OR: node --test (auto-discovers *.test.js files)

13 How do you implement rate limiting without a framework in Node.js?

Patterns Understanding rate limiting from first principles shows Node.js competency. A simple sliding window counter using a Map:

// In-memory rate limiter (single process โ€” use Redis for multi-process)
class RateLimiter {
  constructor({ windowMs = 60000, max = 100 } = {}) {
    this.windowMs = windowMs;
    this.max      = max;
    this.store    = new Map(); // key → [timestamps...]
  }

  isAllowed(key) {
    const now  = Date.now();
    const hits = this.store.get(key) || [];

    // Remove timestamps outside the window
    const recent = hits.filter(t => now - t < this.windowMs);

    if (recent.length >= this.max) {
      this.store.set(key, recent);
      return false; // rate limit exceeded
    }

    recent.push(now);
    this.store.set(key, recent);
    return true;
  }

  reset(key) { this.store.delete(key); }

  // Cleanup โ€” prevent memory leaks
  startCleanup() {
    return setInterval(() => {
      const now = Date.now();
      for (const [key, hits] of this.store) {
        const active = hits.filter(t => now - t < this.windowMs);
        if (!active.length) this.store.delete(key);
        else this.store.set(key, active);
      }
    }, this.windowMs);
  }
}

const limiter = new RateLimiter({ windowMs: 60000, max: 10 });
const cleanup = limiter.startCleanup();

// Use in Node http server
const server = http.createServer((req, res) => {
  const ip = req.socket.remoteAddress;
  if (!limiter.isAllowed(ip)) {
    res.writeHead(429, { 'Retry-After': '60' });
    return res.end('Rate limit exceeded');
  }
  // handle request normally
});

14 What are AsyncLocalStorage and async context tracking?

Async Context AsyncLocalStorage (stable since Node.js 16) provides a way to store data that is accessible throughout the lifetime of an asynchronous call chain - like thread-local storage, but for async operations. Used for request context propagation (request IDs, user info, transaction IDs) without passing them through every function signature.

const { AsyncLocalStorage } = require('async_hooks');
const crypto = require('crypto'); // for randomUUID()

// Create storage for request context
const requestContext = new AsyncLocalStorage();

// Middleware - set context at the start of each request
function contextMiddleware(req, res, next) {
  const context = {
    requestId: crypto.randomUUID(),
    userId:    req.user?.id,
    startTime: Date.now()
  };
  requestContext.run(context, next); // all async ops within this request share context
}

// Any function, any level of nesting, can access the context
function logEvent(event, data) {
  const ctx = requestContext.getStore();
  logger.info({ requestId: ctx?.requestId, event, ...data });
}

// Service function - no need to pass requestId as a parameter
UserService.create = async function (data) {
  const user = await db.users.insertOne(data);
  logEvent('user.created', { userId: user._id }); // requestId is automatic
  return user;
};

// Use in an Express app
app.use(contextMiddleware);
app.post('/users', async (req, res, next) => {
  try {
    const user = await UserService.create(req.body);
    res.status(201).json(user);
  } catch (err) { next(err); }
});

15 How do you use the perf_hooks module for performance measurement?

Performance The perf_hooks module provides the Performance Measurement API - high-resolution timestamps, performance marks, and measures (the same API as the browser's performance object).

const { performance, PerformanceObserver } = require('perf_hooks');

// Basic high-resolution timing
performance.mark('db-start');
const users = await db.find({});
performance.mark('db-end');
performance.measure('db-query', 'db-start', 'db-end');

const [entry] = performance.getEntriesByName('db-query');
console.log(`DB query took ${entry.duration.toFixed(2)}ms`);

// PerformanceObserver - observe all measures automatically
const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.duration > 100) { // log slow operations
      logger.warn({ operation: entry.name, ms: entry.duration.toFixed(2) }, 'Slow operation');
    }
  }
});
obs.observe({ entryTypes: ['measure'] });

// Wrap any function to measure its execution time
function measured(name, fn) {
  return async function(...args) {
    performance.mark(`${name}-start`);
    try { return await fn(...args); }
    finally {
      performance.mark(`${name}-end`);
      performance.measure(name, `${name}-start`, `${name}-end`);
    }
  };
}

const timedFetch = measured('fetchUsers', fetchUsers);

// Node.js-specific: monitorEventLoopDelay
const { monitorEventLoopDelay } = require('perf_hooks');
const h = monitorEventLoopDelay({ resolution: 20 }); // 20ms sampling
h.enable();
setTimeout(() => {
  console.log('P99 event loop lag (ms):', h.percentile(99) / 1e6); // histogram values are in nanoseconds
  h.disable();
}, 5000);

16 What is the vm module in Node.js? When would you use it?

Advanced The vm module compiles and runs JavaScript code within a V8 Virtual Machine context - a separate environment with its own global scope. The running code cannot access the host module's variables unless they are explicitly passed in.

const vm = require('vm');

// Run code in a sandbox with custom globals
const context = { x: 10, y: 20, result: 0 };
vm.createContext(context); // contextify the object

vm.runInContext('result = x + y', context);
console.log(context.result); // 30

// Run a script object (reusable, pre-compiled)
const script = new vm.Script('Math.max(...numbers)');
const ctx = vm.createContext({ numbers: [3, 1, 4, 1, 5], Math });
console.log(script.runInContext(ctx)); // 5

// Sandbox with timeout (prevent infinite loops)
try {
  vm.runInNewContext('while(true){}', {}, { timeout: 100 }); // throws after 100ms
} catch (e) {
  console.log('Script timed out');
}

// Typical use cases:
// - Plugin systems where user provides configurable business rules
// - Template engines evaluating expressions
// - Test frameworks isolating module execution
// - Sandboxed code evaluation in a REPL or online code editor

// SECURITY WARNING: vm is NOT a security sandbox
// Malicious code can escape, e.g. via the constructor chain:
// vm.runInNewContext('this.constructor.constructor("return process")().exit()');
// Use the isolated-vm npm package for true sandboxing (separate V8 isolates)

17 What is the difference between process.nextTick() and setImmediate()?

Event Loop Both defer code execution, but they run at different points in the event loop:

// process.nextTick - runs BEFORE the event loop continues
// Fires after the current operation completes, before any I/O or timers
// The nextTick queue drains completely before moving to the next phase

// setImmediate - runs in the Check phase (after I/O callbacks)

setImmediate(()     => console.log('setImmediate'));
process.nextTick(() => console.log('nextTick 1'));
process.nextTick(() => console.log('nextTick 2'));
Promise.resolve().then(() => console.log('Promise microtask'));
console.log('sync');

// Output:
// sync
// nextTick 1        ← the nextTick queue drains completely
// nextTick 2        ← (both nextTick callbacks run before anything else)
// Promise microtask ← then the Promise microtask queue
// setImmediate      ← then the Check phase

// โš ๏ธ Danger: recursive process.nextTick can starve the event loop
function recursiveTick() {
  process.nextTick(recursiveTick); // NEVER allows I/O to run!
}

// Use process.nextTick for:
// - Emitting events that handlers may not have registered yet (constructor timing)
// - Ensuring a callback runs asynchronously even if the result is already known

// Use setImmediate for:
// - Breaking up long synchronous operations over multiple event loop turns
// - Scheduling work after I/O callbacks have a chance to run
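The "breaking up long synchronous operations" use case can be sketched as a chunked loop (the array contents and chunk size here are arbitrary):

```javascript
// Process a big array in slices, yielding between slices with setImmediate
// so pending I/O and timers can run between chunks.
function sumInChunks(items, chunkSize = 10000) {
  return new Promise(resolve => {
    let total = 0, i = 0;
    (function step() {
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) total += items[i];
      if (i < items.length) setImmediate(step); // yield to the event loop
      else resolve(total);
    })();
  });
}

sumInChunks(Array.from({ length: 50000 }, (_, n) => n + 1))
  .then(total => console.log(total)); // 1250025000
```

Had this used process.nextTick instead of setImmediate, the chunks would still run back-to-back before any I/O, defeating the purpose.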

18 What is the node:assert module and how do you write assertions?

Testing The built-in assert module provides assertion functions for testing. Use assert/strict (strict mode: === comparisons) rather than the legacy assert (==).

const assert = require('node:assert/strict');

// Equality
assert.equal(2 + 2, 4);
assert.notEqual(2 + 2, 5);
assert.strictEqual('5', '5');    // passes (===)
// assert.strictEqual(5, '5'); // fails โ€” different types in strict mode

// Deep equality (objects/arrays)
assert.deepStrictEqual({ a: 1, b: [2, 3] }, { a: 1, b: [2, 3] }); // passes
assert.notDeepStrictEqual([1, 2], [1, 3]);

// Boolean assertions
assert.ok(true);
assert.ok('non-empty string');
assert.ok(1);
// assert.ok(0);  // fails

// Error assertions
assert.throws(() => JSON.parse('{bad json}'), SyntaxError);
await assert.rejects(   // inside an async function
  async () => { throw new Error('fail'); },
  { message: 'fail' }
);

// Does NOT throw
assert.doesNotThrow(() => JSON.parse('{"ok": true}'));

// Match against regex
assert.match('Hello World', /Hello/);
assert.doesNotMatch('Hello World', /Goodbye/);

// Custom error message
assert.equal(result, expected, `Expected ${expected}, got ${result}`);

19 What is the REPL in Node.js and how do you use it programmatically?

Tools REPL (Read-Eval-Print Loop) is Node.js's interactive shell - type JavaScript and see immediate results. The repl module lets you embed a REPL in your application for runtime inspection, admin tools, and CLI interfaces.

# Start Node.js REPL
node

# Useful REPL commands
.help       # list available commands
.break      # clear multi-line input
.clear      # reset context
.editor     # multi-line editor mode (Ctrl+D to run)
.exit       # exit REPL
.load file  # load a .js file into current session
.save file  # save REPL history to file

# Built-in REPL tricks
> 2 ** 10                       // 1024
> [1,2,3].map(x => x * 2)     // [2,4,6]
> const u = await fetch('https://api.example.com/users').then(r=>r.json())
> _                             // last expression result

// Programmatic REPL (build custom admin shells)
const repl = require('repl');
const db   = require('./db');

const r = repl.start({ prompt: 'admin> ', useGlobal: false });

// Inject variables into REPL context
r.context.db      = db;
r.context.User    = require('./models/User');
r.context.Order   = require('./models/Order');

// Custom REPL command
r.defineCommand('clearCache', {
  help: 'Clear the application cache',
  action() {
    cache.flush();
    this.output.write('Cache cleared!\n');
    this.displayPrompt();
  }
});

// Start: node admin.js → access the DB directly in the REPL
// admin> await User.countDocuments()
// admin> .clearCache

20 What are Diagnostic Channels in Node.js?

Observability Diagnostic Channels (node:diagnostics_channel, added in Node 15) provide a publish/subscribe mechanism for internal library instrumentation. Libraries publish messages on named channels; APM tools subscribe without any changes to the library code.

const dc = require('node:diagnostics_channel');

// ── Library side - publish diagnostics ──
const dbChannel = dc.channel('my-lib:database');

async function query(sql, params) {
  const start = performance.now();
  try {
    const result = await pool.query(sql, params);
    if (dbChannel.hasSubscribers) {
      dbChannel.publish({
        sql, params,
        duration: performance.now() - start,
        rowCount: result.rowCount
      });
    }
    return result;
  } catch (err) {
    dc.channel('my-lib:database:error').publish({ sql, params, error: err });
    throw err;
  }
}

// ── APM/monitoring side - subscribe to channels ──
// (done in a separate monitoring module - zero coupling)
dc.subscribe('my-lib:database', ({ sql, duration, rowCount }) => {
  if (duration > 100) logger.warn({ sql, duration }, 'Slow query detected');
  metrics.histogram('db.query.duration', duration);
});

// Node.js core modules publish to built-in channels
dc.subscribe('http.client.request.start', ({ request }) => {
  console.log('Outgoing HTTP request:', request.method, request.path);
});

21 What is the module.exports vs exports difference in Node.js?

Modules Both module.exports and exports point to the same object initially. The critical difference: only module.exports is returned by require(). If you reassign exports to a new value, it breaks the reference and the module exports nothing meaningful.

// Initially: exports === module.exports === {}

// ✅ exports.property - works (modifies the shared object)
exports.add      = (a, b) => a + b;
exports.subtract = (a, b) => a - b;
// require('./math') returns { add: fn, subtract: fn }

// ✅ module.exports = value - works (replaces the exported value)
module.exports = class Database { /* ... */ };
// require('./db') returns the Database class

// ✅ module.exports = { ... } - works (replaces with a new object)
module.exports = { add, subtract };

// ❌ exports = { add, subtract } - DOES NOT WORK
// This reassigns the local 'exports' variable, breaking the reference to module.exports
// require('./math') returns {} (the original empty object)
exports = { add, subtract }; // ❌ local variable reassignment, ignored

// ❌ Mixed: reassigning module.exports then adding to exports
module.exports = class User {};
exports.utils = {}; // ❌ exports still points to the OLD object, not the class

Rule: Use exports.name = value for named exports. Use module.exports = value when exporting a single class, function, or object. Never mix both styles in the same file.

📝 Knowledge Check

Test your understanding of advanced Node.js patterns and built-in modules.

🧠 Quiz Question 1 of 5

What is backpressure in Node.js streams, and what is the most reliable way to handle it?





🧠 Quiz Question 2 of 5

What is the key difference between Worker Threads and the cluster module in Node.js?





🧠 Quiz Question 3 of 5

Which approach should you use to start a Node.js debugger and inspect it in Chrome DevTools?





🧠 Quiz Question 4 of 5

What does AsyncLocalStorage solve in Node.js asynchronous programming?





🧠 Quiz Question 5 of 5

Why does exports = { add, subtract } fail to export anything useful in a CommonJS module?