Expert Node.js Interview Questions and Answers

📋 Table of Contents
  1. Questions & Answers
  2. 📝 Knowledge Check

🟢 Expert Node.js Interview Questions

This lesson targets senior engineers and architects. Topics include V8 internals, libuv, memory management, garbage collection, the event loop in depth (microtasks, phases, starvation), N-API native addons, security, module resolution internals, Corepack, the Permission Model, and production architecture patterns. These questions reveal whether you understand Node.js or merely write JavaScript in it.

Questions & Answers

01 What is V8 and how does it execute JavaScript?

V8 Internals V8 is Google’s open-source JavaScript and WebAssembly engine, written in C++. Node.js embeds V8 to execute JavaScript code.

V8 compilation pipeline:

  • Parsing — JavaScript source is parsed into an Abstract Syntax Tree (AST). V8 uses lazy parsing: functions are only fully parsed when called (except exported functions and IIFEs).
  • Ignition (bytecode interpreter) — the AST is compiled to bytecode and executed by the Ignition interpreter. Fast to start, not the fastest to run.
  • TurboFan (optimising JIT compiler) — functions called frequently (hot functions) are identified and compiled by TurboFan to highly optimised native machine code. This is just-in-time (JIT) compilation.
  • Deoptimisation — if V8’s type assumptions are violated (e.g., an array that was always ints suddenly gets a string), TurboFan deoptimises back to Ignition. This is why monomorphic code (consistent types) is faster.
// Help V8 optimise — consistent types
function add(a, b) { return a + b; }
add(1, 2);   // V8 assumes Number + Number
add(3, 4);   // confirms — TurboFan optimises
add('x', 2); // type change — deoptimisation triggered

// Use --v8-options to list V8 flags
// node --v8-options | grep 'trace-opt'
// node --trace-opt server.js — shows which functions TurboFan optimises
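Monomorphism extends to object shapes as well. A small sketch (plain Node, no flags; the `Point`/`norm2` names are illustrative) showing two objects that share a hidden class because their properties are assigned in the same order:

```javascript
// Hidden classes: objects built with the same property order share a "shape",
// which lets TurboFan emit fast monomorphic property loads.
function Point(x, y) { this.x = x; this.y = y; }
const a = new Point(1, 2);
const b = new Point(3, 4);
// a and b share a hidden class; assigning x/y in different orders would split the shape
function norm2(p) { return p.x * p.x + p.y * p.y; }
console.log(norm2(a) + norm2(b)); // 30
```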

02 What is libuv and what role does it play in Node.js?

Internals libuv is a cross-platform C library that powers Node.js’s asynchronous I/O operations. V8 runs JavaScript but cannot interact with the OS; libuv provides the event loop, thread pool, and system call abstractions.

libuv components:

  • Event loop — the core loop that polls for I/O readiness using OS-specific mechanisms: epoll (Linux), kqueue (macOS/BSD), IOCP (Windows)
  • Thread pool — a pool of 4 threads (configurable via UV_THREADPOOL_SIZE) for operations the OS cannot do asynchronously: file I/O (including fs.stat), DNS resolution (dns.lookup), and some crypto operations (pbkdf2, scrypt)
  • Handles and requests — handles represent long-lived objects (TCP connections, timers); requests represent short-lived operations (a write to a socket)
  • Timer implementation — setTimeout/setInterval are implemented in libuv using a binary min-heap of timer callbacks
# Increase thread pool size for file-heavy workloads
UV_THREADPOOL_SIZE=16 node server.js

# Why this matters: if you have 4 threads and 10 concurrent fs.readFile()
# calls, 6 will queue and wait. Increasing the pool helps I/O-heavy workloads.

03 Explain Node.js memory management and garbage collection.

Memory Node.js memory is managed by V8’s garbage collector. Understanding it helps diagnose memory leaks and tune performance.

Heap structure:

  • New Space (Young Generation) — small (~1-8MB), frequently collected (Scavenge GC). Short-lived objects live here. Most objects die young.
  • Old Space (Old Generation) — objects that survived 2+ Scavenge cycles are promoted here. Collected by Major GC (Mark-Sweep-Compact). Larger, less frequent.
  • Large Object Space — objects exceeding the size threshold (typically 256KB) that would not fit in a normal page.
  • Code Space — JIT-compiled code from TurboFan.
# Default old-space limit depends on Node version and available memory
# (~1.5GB historically on 64-bit; up to ~4GB on modern versions)
node --max-old-space-size=4096 server.js  # set to 4GB

# Monitor GC activity
node --expose-gc --trace-gc server.js

// Force GC from JS (requires --expose-gc; development/testing only, NEVER in production)
if (global.gc) global.gc();

// Check heap usage
const v8 = require('v8');
const stats = v8.getHeapStatistics();
console.log({
  used:   Math.round(stats.used_heap_size / 1024 / 1024) + 'MB',
  total:  Math.round(stats.total_heap_size / 1024 / 1024) + 'MB',
  limit:  Math.round(stats.heap_size_limit / 1024 / 1024) + 'MB'
});

Common memory leaks: closures holding references to large objects, event listeners never removed (use emitter.off() / removeListener()), ever-growing globals, timers not cleared, Buffer slices keeping their parent allocation alive, and Maps/Sets that accumulate entries without eviction.
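For the growing-Map case, a minimal bounded-cache sketch (the class name and FIFO eviction policy are illustrative, not a standard API):

```javascript
// A common leak is an unbounded cache; capping its size with FIFO eviction fixes it.
class BoundedCache {
  constructor(maxSize = 1000) { this.max = maxSize; this.map = new Map(); }
  set(key, value) {
    if (this.map.size >= this.max) {
      const oldest = this.map.keys().next().value; // Map preserves insertion order
      this.map.delete(oldest);
    }
    this.map.set(key, value);
  }
  get(key) { return this.map.get(key); }
}

const cache = new BoundedCache(2);
cache.set('a', 1); cache.set('b', 2); cache.set('c', 3); // 'a' is evicted
console.log(cache.get('a'), cache.get('c')); // undefined 3
```

Production code would typically reach for an LRU library instead, but the principle is the same: every long-lived collection needs an eviction strategy.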

04 What is the event loop microtask queue starvation problem?

Event Loop Microtask starvation occurs when the microtask queue (Promises, queueMicrotask, process.nextTick) is continuously refilled, preventing the event loop from ever advancing to its I/O and timer phases. The server stops processing new requests.

// โŒ DANGER: recursive Promise chain โ€” starves event loop
async function recursiveAsync() {
  await Promise.resolve(); // queues to microtask queue
  recursiveAsync();        // immediately queues another
  // event loop NEVER proceeds to I/O phase โ€” all HTTP requests hang
}

// โŒ Same with process.nextTick
function starveTick() {
  process.nextTick(starveTick); // fills nextTick queue infinitely
}

// ✅ FIX: use setImmediate to yield to I/O between iterations
async function safeBatch(items) {
  for (let i = 0; i < items.length; i++) {
    await processItem(items[i]);
    // Yield to the event loop every 100 items
    // (indexOf inside the loop would make this O(n²) — use the index instead)
    if (i % 100 === 0) {
      await new Promise(resolve => setImmediate(resolve));
    }
  }
}

// ✅ FIX for CPU work: chunk the loop and yield with setImmediate
function processLargeArray(arr, handleItem, callback) {
  let i = 0;
  function doChunk() {
    let n = 1000; // handle 1000 items per tick
    while (n-- && i < arr.length) handleItem(arr[i++]); // avoid shadowing global `process`
    if (i < arr.length) setImmediate(doChunk); // yield, then continue
    else callback();
  }
  setImmediate(doChunk);
}

05 What is N-API and why is it important for native addons?

Native N-API (now called Node-API) is a stable C API for building native Node.js addons — modules written in C/C++ (or Rust via napi-rs) that are callable from JavaScript.

Why Node-API matters:

  • ABI stability — N-API is ABI-stable across Node.js versions. Before N-API, native addons broke on every Node.js version upgrade because they depended on V8’s internal API (which changes). N-API addons compiled for Node 14 work on Node 20.
  • Language bridging — call C++ libraries (OpenCV, TensorFlow, libsodium) or Rust crates from JavaScript
  • Performance-critical code — offload CPU-intensive work to native code
// Native addon written with node-addon-api (C++ wrapper for N-API)
// addon.cc
#include <napi.h>

Napi::Value Add(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();
  double arg0 = info[0].As<Napi::Number>().DoubleValue();
  double arg1 = info[1].As<Napi::Number>().DoubleValue();
  return Napi::Number::New(env, arg0 + arg1);
}

Napi::Object Init(Napi::Env env, Napi::Object exports) {
  exports.Set("add", Napi::Function::New(env, Add));
  return exports;
}
NODE_API_MODULE(addon, Init)

// Usage from JavaScript
const addon = require('./build/Release/addon');
console.log(addon.add(3, 4)); // 7

// Modern approach: napi-rs (Rust → Node.js) or ffi-napi (call .so/.dll directly)

06 What is the Node.js Permission Model?

Security The Permission Model (experimental in Node.js 20, maturing in Node 22+; in Node 23+ the flag is simply --permission) restricts what resources a Node.js process can access at runtime — file system paths, child processes, worker threads, and native addons. It is similar in spirit to Deno’s permission system, though unlike Deno it does not currently restrict network access.

# Run with specific permissions only
node --experimental-permission server.js
# By default ALL access is blocked

# Grant read access to specific directory
node --experimental-permission \
     --allow-fs-read=/app/public \
     server.js

# Grant write access
node --experimental-permission \
     --allow-fs-write=/app/logs \
     --allow-fs-read=/app \
     server.js

# Note: there is NO --allow-net flag — unlike Deno, the model does not
# restrict network access; only fs, child processes, workers, and addons are gated

# Allow child process spawning
node --experimental-permission \
     --allow-child-process \
     server.js

# Allow worker threads
node --experimental-permission \
     --allow-worker \
     server.js
// Check permissions programmatically
const { permission } = require('node:process');
console.log(permission.has('fs.read', '/app/data')); // true/false

// Attempt blocked operation → throws an error
try { fs.readFileSync('/etc/passwd'); }
catch (e) { console.error(e.code); } // ERR_ACCESS_DENIED

The Permission Model is valuable for running untrusted scripts, multi-tenant systems, and hardening production containers by applying the principle of least privilege.

07 What is Corepack and how does it manage package manager versions?

Tooling Corepack is a Node.js built-in tool (included since Node.js 16.9+) that manages the version of npm, Yarn, or pnpm used in a project — ensuring everyone on the team uses the exact same package manager version without global installs.

# Enable Corepack (disabled by default)
corepack enable

# package.json — specify package manager and version
{
  "packageManager": "pnpm@9.1.0"
}

# Now pnpm is automatically downloaded and used at the specified version
pnpm install     # uses pnpm 9.1.0 exactly (downloaded if not cached)

# Prepare a specific version (pre-download for offline use)
corepack prepare pnpm@9.1.0 --activate

# Switch between package managers per project
echo '{"packageManager": "yarn@4.2.2"}' > package.json
yarn install  # Corepack downloads Yarn 4.2.2

# Why this matters:
# Without Corepack, developers each install different npm/yarn/pnpm versions
# This causes lockfile format differences, inconsistent behaviour, and CI/CD issues
# Corepack pins the package manager version just like node engines pins Node

08 How does Node.js module resolution work? What is the resolution algorithm?

Modules When you call require('some-module'), Node.js follows a specific resolution algorithm to find the file:

// require('./utils') — relative path resolution
// 1. Try exact file: ./utils.js → ./utils.json → ./utils.node
// 2. Try directory ./utils/: check the "main" field in ./utils/package.json first
// 3. Fall back to ./utils/index.js → ./utils/index.json → ./utils/index.node

// require('express') — bare specifier (npm package)
// Starting from current directory, walk UP the tree:
// ./node_modules/express → ../node_modules/express → ../../node_modules/express
// Until reaching the filesystem root or finding the module

// Inside node_modules/express/:
// 1. Check package.json "main" field → e.g., "./index.js"
// 2. Fall back to index.js, index.json, index.node

// require('/absolute/path/to/file') — absolute path
// Tries .js, .json, .node extensions if no extension given

// package.json "exports" field (modern — overrides "main")
// Enables conditional exports: different files for CJS vs ESM, browser vs Node
{
  "exports": {
    ".":           { "import": "./dist/index.mjs", "require": "./dist/index.cjs" },
    "./utils":     "./dist/utils.js",
    "./package.json": "./package.json"
  }
}
// require('mylib')         → dist/index.cjs
// import 'mylib'           → dist/index.mjs
// import 'mylib/utils'     → dist/utils.js
// import 'mylib/internal'  → ❌ not exported — blocked

09 How do you detect and fix memory leaks in a production Node.js app?

Memory

Step 1 — Detect: watch memory usage over time

// Add memory monitoring to your app
setInterval(() => {
  const mem = process.memoryUsage();
  logger.info({
    rss:      Math.round(mem.rss      / 1024 / 1024) + 'MB', // total process memory
    heapUsed: Math.round(mem.heapUsed / 1024 / 1024) + 'MB', // JS heap in use
    heapTotal:Math.round(mem.heapTotal/ 1024 / 1024) + 'MB', // total JS heap
    external: Math.round(mem.external / 1024 / 1024) + 'MB'  // C++ objects (Buffers)
  }, 'Memory usage');
}, 30000); // log every 30s — a rising heapUsed trend indicates a leak

Step 2 — Isolate: heap snapshots in Chrome DevTools

node --inspect server.js
# Open chrome://inspect → Memory tab → Take heap snapshot
# Apply load with autocannon, take another snapshot
# "Comparison" view shows which object types grew
# Look for: growing arrays/Maps, uncleaned event listeners, retained closures

Step 3 — Common causes and fixes:

  • Event listeners not removed → emitter.off() in cleanup
  • Timer not cleared → clearInterval() / clearTimeout()
  • Growing global Map/Set → add TTL or max-size eviction
  • Closures capturing large objects → null the reference when done
  • Unclosed streams → always handle end/close/error events
  • Promise chains not completing → add .catch() and timeouts
// Tool: clinic.js — automated profiling suite
npm install -g clinic
clinic doctor -- node server.js   # detects I/O, event loop, memory issues
clinic heap -- node server.js     # heap profiler
clinic flame -- node server.js    # CPU flame graph

10 What are the OWASP Node.js security best practices?

Security

  • Prototype pollution prevention — never merge untrusted objects into existing objects. Use Object.create(null) for safe maps without __proto__. Validate with no-prototype-builtins ESLint rule.
  • Dependency security — run npm audit in CI/CD. Automate with Snyk or Dependabot. Prefer packages with a small, well-maintained dependency tree.
  • ReDoS (RegEx DoS) — avoid catastrophically backtracking regular expressions that can hang the event loop on malicious input. Use the safe-regex package to detect them.
  • Timing attacks — use crypto.timingSafeEqual() for comparing secrets/hashes. Standard string comparison (===) leaks timing information.
  • Path traversal — sanitise file path inputs. Use path.resolve() and verify the result starts with your intended directory.
  • SSRF (Server-Side Request Forgery) — validate URLs before making outbound requests. Block private IP ranges (169.254.x.x, 10.x.x.x).
  • Environment variables — never log process.env. Use secrets managers (AWS Secrets Manager, HashiCorp Vault) in production.
  • Least privilege — run Node.js as a non-root user. Use the Permission Model. Scope IAM roles to minimum required permissions.
// Path traversal prevention
function safeReadFile(userPath) {
  const safeBase = path.resolve('/app/uploads');
  const requested = path.resolve(safeBase, userPath);
  if (!requested.startsWith(safeBase)) {
    throw new Error('Path traversal detected');
  }
  return fs.readFileSync(requested);
}

11 What are async_hooks and how are they used for request tracing?

Async Context async_hooks provide lifecycle hooks for every async resource created in Node.js — init, before, after, destroy, promiseResolve. They are the internal mechanism behind AsyncLocalStorage and are used by APM tools for distributed tracing.

const async_hooks = require('async_hooks');
const fs = require('fs');

// Track async resource lifetimes (low-level, rarely used directly)
const hook = async_hooks.createHook({
  init(asyncId, type, triggerAsyncId, resource) {
    fs.writeSync(1, `init: asyncId=${asyncId} type=${type} trigger=${triggerAsyncId}\n`);
    // type: PROMISE, TCPWRAP, TIMERWRAP, HTTPPARSER, etc.
  },
  before(asyncId) {
    fs.writeSync(1, `before: asyncId=${asyncId}\n`);
  },
  after(asyncId) {
    fs.writeSync(1, `after: asyncId=${asyncId}\n`);
  },
  destroy(asyncId) {
    fs.writeSync(1, `destroy: asyncId=${asyncId}\n`);
  }
});
hook.enable();

// Note: console.log is NOT safe inside async_hooks callbacks (it's async!)
// Use fs.writeSync (synchronous) for debugging inside hooks

// WHY ASYNC_HOOKS MATTERS:
// AsyncLocalStorage (lesson 2) is built on async_hooks
// OpenTelemetry uses async_hooks to propagate trace context
// APM tools (Datadog, New Relic) use it to track all async operations

// Most developers use AsyncLocalStorage instead of raw async_hooks
const { AsyncLocalStorage } = require('async_hooks');
// (Prefer this โ€” safe, stable, simpler API)

12 What is the Node.js startup performance? How do you optimise it?

Performance Cold start time matters in serverless functions (AWS Lambda, Vercel, Cloud Run) where every invocation may start a fresh Node.js process.

Optimisation techniques:

  • Lazy require() — only load modules when first needed, not at startup
  • V8 compile cache — Node.js 22+ can cache compiled bytecode on disk (NODE_COMPILE_CACHE env var or module.enableCompileCache()), skipping recompilation on restart
  • Node.js Single Executable Application (SEA) — Node 20+ can bundle an app into a single self-contained executable, eliminating module resolution overhead
  • Reduce dependencies — each require() costs time (file I/O + parsing). Use bundlers like esbuild or ncc to create a single-file bundle
# Bundle the entire app to a single file (no node_modules at runtime)
npx @vercel/ncc build src/server.js -o dist/
# Result: dist/index.js — single file with all dependencies inlined

# Measure startup time
time node -e "require('./src/server')"

# Profile startup with --cpu-prof
node --cpu-prof --cpu-prof-interval=100 server.js
# Generates a .cpuprofile file — open in Chrome DevTools Performance tab

# V8 startup snapshot (experimental Node 20+)
# Serialise initialised heap state to a snapshot file
# Future processes restore from snapshot — skipping init work

13 What is the Node.js SEA (Single Executable Application)?

Packaging Node.js 20+ supports bundling an application into a standalone executable that includes Node.js itself — no Node.js installation required on the target machine. Similar to Deno’s deno compile.

// 1. Create SEA configuration
// sea-config.json
{
  "main": "dist/app.js",          // entry point (must be a single bundled file)
  "output": "sea-prep.blob",      // intermediate blob
  "disableExperimentalSEAWarning": true,
  "useSnapshot": false,
  "useCodeCache": true            // cache V8 bytecode for faster startup
}

// 2. Bundle your app first (SEA requires a single file)
npx esbuild src/server.js --bundle --platform=node --outfile=dist/app.js

// 3. Generate the blob
node --experimental-sea-config sea-config.json

// 4. Copy the Node.js binary and inject the blob
cp $(which node) my-app
# On macOS — remove existing signature:
codesign --remove-signature my-app
# Inject the blob:
npx postject my-app NODE_SEA_BLOB sea-prep.blob \
  --sentinel-fuse NODE_SEA_FUSE_fce680ab2cc467b6e072b8b5df1996b2 \
  --macho-segment-name NODE_SEA

// 5. Distribute and run — no Node.js needed on target machine
./my-app

SEA is ideal for CLI tools, desktop apps, and container images where you want a single binary with no runtime dependency.

14 What is the difference between EventEmitter’s sync and async event handling?

Events EventEmitter is synchronous by design — emit() calls all listeners synchronously in the order they were registered, returning only after all listeners complete. This surprises many developers who expect asynchronous behaviour.

const { EventEmitter } = require('events');
const emitter = new EventEmitter();

emitter.on('data', (val) => console.log('listener 1:', val));
emitter.on('data', (val) => console.log('listener 2:', val));

console.log('before emit');
emitter.emit('data', 'hello'); // SYNCHRONOUS — all listeners run before returning
console.log('after emit');

// Output:
// before emit
// listener 1: hello
// listener 2: hello
// after emit  ← NOT before the listeners!

// โŒ DANGER: async listener throws โ€” unhandled error
emitter.on('data', async (val) => {
  await doSomething();
  throw new Error('oops'); // this is an UNHANDLED rejection โ€” not caught by emit()
});

// ✅ Handle async errors in async listeners
emitter.on('data', async (val) => {
  try {
    await doSomething();
  } catch (err) {
    emitter.emit('error', err); // propagate to error listener
  }
});

// EventEmitterAsyncResource (Node 17+) — integrates async context tracking
// with EventEmitter so that async listeners preserve AsyncLocalStorage context

15 What are the Node.js LTS release schedule and version support policy?

Ecosystem

  • Major versions — released every 6 months (April and October). Even-numbered versions (18, 20, 22) become LTS (Long-Term Support); odd-numbered versions (17, 19, 21) are short-lived Current releases.
  • LTS lifecycle — Active LTS: ~12 months (bug fixes, security patches, selected backports). Maintenance LTS: ~18 more months (critical and security fixes only). Total: roughly 3 years of support from initial release.
  • Current — latest features, ~6 months until the next major. Use only for experimentation.
# Check current version
node --version

# Node.js 20 LTS (Codename: Iron)   — Maintenance LTS, end-of-life April 2026
# Node.js 22 LTS (Codename: Jod)    — Active LTS until October 2025, end-of-life April 2027
# Node.js 24                        — Current release; scheduled to become LTS in October 2025

# Use nvm (Node Version Manager) to manage multiple versions
nvm install 22          # install latest Node 22
nvm use 22              # switch to Node 22
nvm alias default 22    # set as default
nvm ls                  # list installed versions

# .nvmrc file — pin project to specific version
echo "22" > .nvmrc
nvm use  # reads .nvmrc automatically

For production: always use the current Active LTS version. Upgrade to the next LTS within 3-6 months of its release to stay ahead of end-of-life. Use CI to test against multiple Node versions (node-version: [20, 22] in GitHub Actions).

16 What is the Node.js policy feature for supply chain security?

Security The Node.js Policy feature (experimental) allows you to define an integrity check policy for loaded modules — ensuring every require()d file matches a pre-computed hash. This prevents supply chain attacks where a dependency is compromised after installation. (Note: the feature has since been deprecated in newer Node.js releases, but the underlying idea survives in lockfile integrity checks and npm provenance.)

// policy.json — define allowed modules and their integrity hashes
{
  "resources": {
    "./server.js": {
      "integrity": "sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC"
    },
    "node_modules/express/index.js": {
      "integrity": "sha384-EXPECTED_HASH_HERE"
    }
  }
}

# Run with the policy
node --experimental-policy=policy.json server.js
# If any file doesn't match its hash → ERR_MANIFEST_ASSERT_INTEGRITY error

# Generate hashes for existing files
node -e "
  const crypto = require('crypto');
  const content = require('fs').readFileSync('./server.js');
  const hash = crypto.createHash('sha384').update(content).digest('base64');
  console.log('sha384-' + hash);
"

While still experimental, the Policy feature is one defence against attacks like event-stream (2018) where a popular npm package was compromised after transfer to a malicious maintainer. In practice, combine with npm audit, Sigstore/npm provenance, and lockfile integrity checking.

17 How does Node.js handle uncaughtException vs unhandledRejection?

Error Handling

// uncaughtException — synchronous throw that reached the top of the call stack
process.on('uncaughtException', (err, origin) => {
  // err    = the Error object
  // origin = 'uncaughtException' or 'unhandledRejection'
  logger.fatal({ err, origin }, 'Uncaught exception — process will exit');
  // Clean up synchronously (close DB connections, flush logs)
  // Then EXIT — the process is now in an undefined state and must restart
  process.exit(1);
});

// unhandledRejection — Promise rejected without a .catch() or try/catch
process.on('unhandledRejection', (reason, promise) => {
  // In Node.js <15: printed a warning
  // In Node.js 15+: CRASHES the process by default (converted to uncaughtException)
  logger.error({ reason, promise }, 'Unhandled promise rejection');
  // Best practice: fix the root cause (add .catch()) rather than handling here
});

// rejectionHandled — fired when a previously-unhandled rejection gets a handler attached
process.on('rejectionHandled', (promise) => {
  logger.warn('Promise rejection handled late:', promise);
});

// Best practice: these are LAST RESORT handlers, not error handling strategies
// Fix unhandledRejection at source:
async function doWork() {
  try {
    await riskyOperation();
  } catch (err) {
    // handle here — don't let it propagate to unhandledRejection
    logger.error(err);
  }
}

18 What is the Node.js import.meta object in ES Modules?

ESM In ES Modules, the CommonJS globals __dirname, __filename, require, and module are NOT available. import.meta provides module-level metadata as the ESM equivalent.

// ESM module (file.mjs or type: "module" in package.json)
import { fileURLToPath } from 'url';
import { dirname }       from 'path';
import { createRequire } from 'module';

// __filename equivalent in ESM
const __filename = fileURLToPath(import.meta.url);
// import.meta.url = "file:///home/user/app/server.mjs"
// __filename       = "/home/user/app/server.mjs"

// __dirname equivalent in ESM
const __dirname = dirname(__filename);
// __dirname = "/home/user/app"

// require() equivalent in ESM (to load CJS modules)
const require = createRequire(import.meta.url);
const config  = require('./config.json'); // load JSON the CJS way

// Other import.meta properties (bundler-specific)
console.log(import.meta.url);  // file URL of this module
// import.meta.env  — Vite-specific (environment variables)
// import.meta.hot  — Vite HMR API
// import.meta.dirname  — available in Node.js 20.11+ / 21.2+ (no fileURLToPath needed!)
// import.meta.filename — available in Node.js 20.11+ / 21.2+

19 How do you architect a Node.js application for high availability and fault tolerance?

Architecture

  • Graceful shutdown — stop accepting new connections on SIGTERM, complete in-flight requests, close DB/Redis connections, then exit.
  • Health endpoints — /health/live (is process alive?) and /health/ready (can it accept traffic?). Kubernetes uses these for restart and traffic routing decisions.
  • Automatic restart — use PM2, Docker restart policies, or Kubernetes deployments to restart crashed processes automatically.
  • Circuit breakers — prevent cascading failures when downstream services are degraded (opossum library).
  • Timeouts everywhere — set timeouts on all outbound HTTP calls, DB queries, and cache operations. Never wait indefinitely.
  • Idempotent retries — retry transient failures (network errors, 503s) with exponential backoff and jitter. Only retry idempotent operations.
  • Bulkhead pattern — isolate critical paths. Use separate HTTP connection pools for critical and non-critical downstream calls so a slow dependency doesn’t exhaust your pool.
  • Structured logging + alerting — log errors with request ID, user ID, and stack trace. Alert on error rate spikes, not individual errors.
// Retry with exponential backoff and jitter
async function withRetry(fn, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try { return await fn(); }
    catch (err) {
      if (attempt === maxRetries || !isTransient(err)) throw err;
      const delay = Math.min(1000 * 2 ** attempt + Math.random() * 100, 10000);
      await new Promise(r => setTimeout(r, delay));
    }
  }
}

20 What is the Node.js AbortController and AbortSignal?

Async AbortController / AbortSignal (global since Node.js 15, stable since Node 16) provide a standard mechanism for cancelling async operations — HTTP requests, stream pipelines, delays, and custom operations.

// AbortController and AbortSignal are globals in Node 15+ (no import needed)

// Cancel a fetch request after timeout
const controller = new AbortController();
const { signal } = controller;

// Auto-abort after 5 seconds
const timeoutId = setTimeout(() => controller.abort(), 5000); // plain abort() → err.name === 'AbortError'

try {
  const response = await fetch('https://api.example.com/data', { signal });
  const data = await response.json();
  clearTimeout(timeoutId);
  return data;
} catch (err) {
  if (err.name === 'AbortError') console.log('Request timed out');
  else throw err;
}

// AbortSignal.timeout() — shorthand for auto-aborting (Node 17.3+)
const response = await fetch(url, { signal: AbortSignal.timeout(5000) });

// Cancel stream pipeline
const ac = new AbortController();
await pipeline(
  fs.createReadStream('input.txt'),
  transform,
  fs.createWriteStream('output.txt'),
  { signal: ac.signal }
);
// Cancel from another part of code: ac.abort();

// Custom cancellable operation
function delay(ms, signal) {
  return new Promise((resolve, reject) => {
    const id = setTimeout(resolve, ms);
    signal?.addEventListener('abort', () => {
      clearTimeout(id);
      reject(signal.reason || new Error('Aborted'));
    });
  });
}

21 What is the Node.js --watch flag and how does it work?

Tooling Node.js 18.11+ introduced a built-in --watch flag that restarts the process automatically when a watched file changes — eliminating the need for nodemon in many development scenarios.

# Watch mode — restart on file changes
node --watch server.js

# Watch a specific directory or file
node --watch-path=./src --watch-path=./config server.js

# Available since Node 18.11 — stable since Node 20
# Node 22+ also supports --watch together with the built-in test runner
node --watch --test tests/*.test.js

# Comparison:
# nodemon — feature-rich, configurable, mature, supports ext filtering
# --watch  — built-in, zero config, great for simple projects and CI

# nodemon still preferred for:
# - Watching specific file extensions: nodemon --ext ts,json
# - Running scripts before/after restart (nodemon events)
# - Non-JS files that trigger a restart
# - Delay before restarting: nodemon --delay 2

// Under the hood: --watch uses fs.watch() which uses OS-native
// inotify (Linux), FSEvents (macOS), or ReadDirectoryChangesW (Windows)
// for efficient file change detection without polling

📝 Knowledge Check

These questions mirror real senior-level Node.js architecture and internals interview scenarios.

🧠 Quiz Question 1 of 5

What is the role of libuv in Node.js?





🧠 Quiz Question 2 of 5

What is microtask queue starvation in Node.js and what causes it?





🧠 Quiz Question 3 of 5

What is V8 deoptimisation and when does it occur?





🧠 Quiz Question 4 of 5

What does EventEmitter.emit() guarantee about the execution order of registered listeners?





🧠 Quiz Question 5 of 5

Why is N-API (Node-API) preferred over the original NAN bindings for writing native Node.js addons?





Tip: Expert Node.js interviewers want to understand your mental model of the runtime — not just API knowledge. For the event loop, draw the phases and explain why setImmediate runs before setTimeout(fn, 0) inside I/O callbacks. For V8, explain JIT compilation before deoptimisation. For libuv, explain why the thread pool exists (OS async limitations). Framing your answer as: limitation → solution → how Node.js implements it shows true depth.