The most important thing to understand about Node.js, and the reason it is excellent for REST APIs, is that it is single-threaded and non-blocking. A single Node.js process handles all incoming requests on one thread, never sitting idle waiting for a database query or file read to finish. This is the event loop at work. Understanding the event loop is not just academic: it directly explains why your Express routes should never contain synchronous blocking operations, why you must use async/await for database calls, and why your server can handle hundreds of concurrent requests without spawning hundreds of threads.
Single-Threaded vs Multi-Threaded Servers
| | Traditional (Apache/PHP) | Node.js |
|---|---|---|
| Threading model | One thread per request | One thread for all requests |
| I/O behaviour | Thread blocks while waiting for DB/file | Registers callback, continues immediately |
| Memory per request | High (each thread needs its own stack) | Low (one shared event loop) |
| Best for | CPU-heavy work per request | I/O-heavy work: REST APIs, real-time, proxies |
| Weak at | Handling many concurrent idle connections | Long-running CPU computation (blocks the loop) |
When your route handler awaits Post.find() (a database query), Node.js sends the query to the database driver and immediately moves on to handle the next request. When the database responds, the event loop picks up the callback and resumes your handler. The thread was never blocked; it was doing other work the whole time.

You can see this yourself with console.log. Add a log before and after an await call, and another outside the async function. You will see that the code after the async function call runs before the awaited result comes back, proving the event loop did not block.

The flip side: long-running CPU work does block the loop, because there is only one thread. For heavy computation, use Worker Threads (require('worker_threads')) or offload to a separate service. This is one of the few cases where Node.js is not the best tool.

The Event Loop: Phase by Phase
Node.js Event Loop - Simplified Phase Order

```text
┌───────────────────────────┐
│          timers           │  ← setTimeout / setInterval callbacks
└───────────┬───────────────┘
            │
┌───────────▼───────────────┐
│     pending callbacks     │  ← I/O callbacks deferred from the previous loop
└───────────┬───────────────┘
            │
┌───────────▼───────────────┐
│        poll (I/O)         │  ← NEW I/O events: file reads, DB responses,
│                           │    HTTP responses - THIS is where most of your
│                           │    Express callbacks resume
└───────────┬───────────────┘
            │
┌───────────▼───────────────┐
│          check            │  ← setImmediate callbacks
└───────────┬───────────────┘
            │
┌───────────▼───────────────┐
│      close callbacks      │  ← socket.on('close', ...) etc.
└───────────┬───────────────┘
            │
       loop repeats

Microtasks (Promises, process.nextTick) run BETWEEN every phase.
```
Blocking vs Non-Blocking: Side by Side
```javascript
// -- BLOCKING - never do this in an Express route ------------------------------
const fs = require('fs');

app.get('/api/config', (req, res) => {
  // readFileSync blocks the ENTIRE event loop until the file is read.
  // All other incoming requests are frozen during this time.
  const data = fs.readFileSync('./config.json', 'utf8');
  res.json(JSON.parse(data));
});
```

```javascript
// -- NON-BLOCKING - the correct way --------------------------------------------
const fs = require('fs/promises');

app.get('/api/config', async (req, res) => {
  // readFile is async - the event loop continues handling other requests
  // while the OS reads the file in the background.
  const data = await fs.readFile('./config.json', 'utf8');
  res.json(JSON.parse(data));
});
```
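The damage a blocking call does can be measured directly. The sketch below (with a hypothetical `blockFor` busy-loop standing in for any `*Sync` call) schedules a zero-delay timer and then hogs the thread; the timer cannot fire until the synchronous work returns:

```javascript
// Measure how long a pending setTimeout(..., 0) is delayed by sync work.
function blockFor(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {}        // simulates readFileSync / hashSync work
}

function measureDelay(blockMs) {
  return new Promise(resolve => {
    const scheduled = Date.now();
    setTimeout(() => resolve(Date.now() - scheduled), 0);
    blockFor(blockMs);               // the timer cannot fire until this returns
  });
}

measureDelay(100).then(delay => console.log(delay + 'ms'));
// roughly 100ms, not ~0ms: the "instant" timer waited behind the sync loop
```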
The Call Stack, Callback Queue, and Event Loop Together
```javascript
console.log('1 - synchronous, runs first');

setTimeout(() => {
  console.log('3 - setTimeout callback: runs after current stack clears');
}, 0);

Promise.resolve().then(() => {
  console.log('2.5 - microtask: runs before setTimeout, after synchronous code');
});

console.log('2 - synchronous, runs second');

// Output order:
// 1 - synchronous, runs first
// 2 - synchronous, runs second
// 2.5 - microtask: runs before setTimeout, after synchronous code
// 3 - setTimeout callback: runs after current stack clears
```
How This Applies to Your Express API
```javascript
// This is what happens when TWO requests arrive at the same time.

// Request A: GET /api/posts
app.get('/api/posts', async (req, res) => {
  // 1. Mongoose sends the query to MongoDB.
  // 2. Node.js does NOT wait - the event loop continues.
  const posts = await Post.find(); // non-blocking I/O
  // 5. When MongoDB responds, the event loop resumes HERE.
  res.json({ data: posts });
});

// Request B arrives while Request A is awaiting Post.find()
app.get('/api/health', (req, res) => {
  // 3. The event loop picks up Request B immediately.
  // 4. This responds instantly - no waiting for Request A's DB query.
  res.json({ status: 'ok' });
});

// Result: both requests are handled concurrently on ONE thread.
// Request B gets its response while Request A is still waiting for MongoDB.
```
Common Mistakes
Mistake 1 โ Using synchronous file/crypto operations in route handlers
❌ Wrong - synchronous methods with the Sync suffix block the event loop:

```javascript
app.post('/api/auth/register', (req, res) => {
  const hash = bcrypt.hashSync(req.body.password, 12); // blocks for ~200ms
  // All other requests queue up behind this one.
  res.status(201).json({ ok: true });
});
```

✅ Correct - use the async version:

```javascript
app.post('/api/auth/register', async (req, res) => {
  const hash = await bcrypt.hash(req.body.password, 12); // non-blocking
  res.status(201).json({ ok: true });
});
```
Mistake 2 โ Large synchronous loops in route handlers
❌ Wrong - processing a large array synchronously blocks all other requests:

```javascript
app.get('/api/report', (req, res) => {
  const result = hugeArray.map(item => expensiveTransform(item)); // blocks!
  res.json(result);
});
```

✅ Correct - offload to a Worker Thread or a background job queue (Bull/BullMQ) for CPU-heavy processing.
Mistake 3 โ Assuming async functions run in parallel automatically
❌ Wrong - awaiting two independent queries sequentially when they could run in parallel:

```javascript
const posts = await Post.find(); // waits 50ms
const users = await User.find(); // then waits another 50ms = 100ms total
```

✅ Correct - use Promise.all() to run independent async operations concurrently:

```javascript
const [posts, users] = await Promise.all([Post.find(), User.find()]); // ~50ms total
```
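The timing claim is easy to verify with a stand-in query. Here `fakeQuery` is a hypothetical helper that resolves after a fixed delay, simulating a 50ms database round trip:

```javascript
// Sequential awaits pay for each delay in turn; Promise.all overlaps them.
const fakeQuery = ms => new Promise(r => setTimeout(() => r(ms), ms));

async function sequential() {
  const start = Date.now();
  await fakeQuery(50);
  await fakeQuery(50);
  return Date.now() - start;       // roughly 100ms
}

async function parallel() {
  const start = Date.now();
  await Promise.all([fakeQuery(50), fakeQuery(50)]);
  return Date.now() - start;       // roughly 50ms
}
```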
Quick Reference
| Concept | Key Point |
|---|---|
| Event loop | Runs callbacks when I/O operations complete; never blocks |
| Non-blocking I/O | Use async functions for all file, DB, and network operations |
| Microtasks | Promise callbacks run before the next event loop phase |
| Blocking danger | Any *Sync call or long loop blocks ALL requests |
| Concurrency | Use Promise.all() for independent parallel async calls |
| CPU work | Delegate to Worker Threads or a job queue |