1. What is Node.js and how does it work?
Node.js is an open-source, cross-platform JavaScript runtime environment that allows developers to execute JavaScript code outside of a web browser. It is built on Google's V8 engine, which compiles JavaScript to native machine code via just-in-time (JIT) compilation, enabling high performance.
Node.js works by using a single-threaded event loop to handle asynchronous I/O operations efficiently. It leverages non-blocking I/O calls, allowing it to manage multiple operations concurrently without creating new threads for each request. This makes it particularly suitable for I/O-bound tasks like web servers.
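For illustration, a minimal sketch of non-blocking execution:
const start = Date.now();
// The callback is deferred; the main thread keeps running meanwhile
setTimeout(() => {
  console.log(`Timer fired after ~${Date.now() - start} ms`);
}, 100);
console.log('This logs first; the event loop invokes the callback later');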
Note: Node.js includes a standard library for tasks such as file system operations, networking, and cryptography.
2. Why use Node.js for backend development?
Node.js is popular for backend development due to its high performance, scalability, and ability to handle real-time applications. It uses JavaScript on both frontend and backend, enabling full-stack development with a single language, which reduces context switching and improves developer productivity.
- Non-blocking I/O: Efficient for handling concurrent requests without threads.
- Large ecosystem: NPM provides millions of packages for rapid development.
- Real-time capabilities: Ideal for chat apps, gaming, or streaming services.
- Microservices: Lightweight and fast startup times suit containerized environments.
Additionally, it's used by companies like Netflix and LinkedIn for its speed and efficiency in handling large-scale applications.
3. Explain the event-driven architecture in Node.js.
Node.js follows an event-driven architecture where events trigger callbacks or functions. This model is non-blocking, meaning the program doesn't wait for operations to complete but instead registers callbacks to be executed when events occur, such as data received from a file or network.
The core is the EventEmitter class, which allows objects to emit named events and register listeners for those events. This architecture enables efficient handling of asynchronous operations without blocking the main thread.
Example of event-driven code:
const EventEmitter = require('events');
const myEmitter = new EventEmitter();
// Listener for 'event'
myEmitter.on('event', () => {
console.log('An event occurred!'); // Callback executed on event
});
// Emit the event
myEmitter.emit('event');
Note: This pattern is fundamental for modules like HTTP servers in Node.js.
4. What is the Event Loop in Node.js?
The Event Loop is a core component of Node.js that enables non-blocking I/O operations despite JavaScript being single-threaded. It continuously checks the call stack and task queues, executing callbacks when the stack is empty.
It consists of phases like timers, pending callbacks, idle/prepare, poll, check, and close callbacks. Each phase processes specific types of tasks, ensuring efficient asynchronous execution.
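A sketch of phase ordering: inside an I/O callback (poll phase), the check phase runs next, so setImmediate fires before a zero-delay timer:
const fs = require('fs');
fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0); // Timers phase, next iteration
  setImmediate(() => console.log('immediate')); // Check phase, logs first here
});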
Note: Understanding the Event Loop is crucial for optimizing Node.js applications to avoid blocking operations.
5. How does Node.js handle asynchronous operations?
Node.js handles asynchronous operations using callbacks, promises, or async/await, leveraging the Event Loop and libuv library for non-blocking I/O. When an async operation is initiated, it's offloaded to the system kernel or thread pool, and a callback is registered to be executed upon completion.
This allows the main thread to continue processing other tasks without waiting.
Example with file reading:
const fs = require('fs');
fs.readFile('file.txt', 'utf8', (err, data) => {
if (err) {
console.error('Error reading file:', err); // Handle error
return;
}
console.log('File content:', data); // Process data asynchronously
});
console.log('This logs before file reading completes'); // Non-blocking
6. Differentiate between synchronous and asynchronous code in Node.js.
Synchronous code executes sequentially, blocking the thread until the operation completes. It's straightforward but can lead to performance issues in I/O-heavy applications.
Asynchronous code doesn't block the thread; it initiates operations and continues execution, handling results via callbacks when ready. This is efficient for scalability.
Synchronous example:
const fs = require('fs');
const data = fs.readFileSync('file.txt', 'utf8'); // Blocks until done
console.log(data);
console.log('This logs after reading');
Asynchronous example:
const fs = require('fs');
fs.readFile('file.txt', 'utf8', (err, data) => {
console.log(data); // Logs when ready
});
console.log('This logs before reading');
Note: Prefer async for production to avoid blocking the Event Loop.
7. What is a callback function in Node.js?
A callback function is a function passed as an argument to another function, executed after an asynchronous operation completes or an event occurs. It's a fundamental pattern for handling async results in Node.js.
Callbacks often follow the error-first pattern, where the first argument is an error object (null if no error).
Example:
function fetchData(callback) {
setTimeout(() => {
const data = 'Fetched data';
callback(null, data); // Error-first callback
}, 1000);
}
fetchData((err, result) => {
if (err) return console.error(err);
console.log(result);
});
8. Explain callback hell and how to avoid it.
Callback hell refers to deeply nested callbacks in asynchronous code, making it hard to read, maintain, and debug. It creates a "pyramid of doom" structure.
Example of callback hell:
asyncOperation1((err1, res1) => {
if (err1) return handleError(err1);
asyncOperation2(res1, (err2, res2) => {
if (err2) return handleError(err2);
asyncOperation3(res2, (err3, res3) => {
// More nesting...
});
});
});
To avoid it:
- Use modularization: Break into named functions.
- Use Promises: Chain with .then() and .catch().
- Use async/await: Write async code synchronously.
Promise example:
asyncOperation1()
.then(res1 => asyncOperation2(res1))
.then(res2 => asyncOperation3(res2))
.catch(handleError);
Note: Modern Node.js favors Promises or async/await over raw callbacks.
9. What are Promises in Node.js?
Promises are objects representing the eventual completion or failure of an asynchronous operation and its resulting value. They provide a cleaner way to handle async code compared to callbacks, allowing chaining and better error handling.
A Promise can be in pending, fulfilled, or rejected states.
Example:
function fetchData() {
return new Promise((resolve, reject) => {
setTimeout(() => {
const success = true;
if (success) {
resolve('Data fetched'); // Fulfill
} else {
reject('Error fetching data'); // Reject
}
}, 1000);
});
}
fetchData()
.then(data => console.log(data))
.catch(error => console.error(error));
Note: Many Node.js APIs now return Promises.
10. How do async/await work in Node.js?
Async/await is syntactic sugar over Promises, allowing asynchronous code to be written in a synchronous style. The async keyword declares an asynchronous function, and await pauses execution until a Promise resolves.
It simplifies error handling with try/catch blocks.
Example:
async function fetchData() {
try {
const response = await fetch('https://api.example.com/data'); // Await Promise
const data = await response.json();
console.log(data);
} catch (error) {
console.error('Error:', error); // Handle rejection
}
}
fetchData();
Note: Async functions always return a Promise. async/await is available since Node.js v7.6; the global fetch API used above requires Node.js v18+.
11. What is the difference between process.nextTick() and setImmediate()?
process.nextTick() and setImmediate() are both used to schedule callbacks in Node.js, but they differ in timing and execution phase within the Event Loop.
- process.nextTick(): Schedules the callback to run immediately after the current operation completes, before the Event Loop continues to the next phase. It has higher priority and can lead to starvation if overused.
- setImmediate(): Schedules the callback to run after the current poll phase completes, in the check phase of the Event Loop. It's lower priority than nextTick.
Key difference: nextTick executes before I/O callbacks, while setImmediate executes after.
console.log('Start');
process.nextTick(() => {
console.log('nextTick callback'); // Runs before setImmediate
});
setImmediate(() => {
console.log('setImmediate callback'); // Runs after nextTick
});
console.log('End');
// Output: Start, End, nextTick callback, setImmediate callback
Note: Use nextTick for deferring execution without delaying the Event Loop, but avoid recursion to prevent blocking.
12. What is NPM and what is its purpose?
NPM (Node Package Manager) is the default package manager for Node.js, used to install, manage, and share JavaScript packages. Its purpose is to simplify dependency management, allowing developers to easily integrate third-party libraries into projects.
- Installs packages from the NPM registry.
- Manages versions and dependencies.
- Supports scripts for automation (e.g., build, test).
It comes bundled with Node.js and is essential for modern JavaScript development.
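Common commands (illustrative):
npm init -y // Create package.json with defaults
npm install express // Add a dependency
npm run test // Run the "test" script from package.json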
Note: NPM also includes tools like npx for executing packages without global installation.
13. How do you install a package globally vs locally in Node.js?
In Node.js, packages can be installed locally (project-specific) or globally (system-wide) using NPM.
- Local installation: Installs in the project's node_modules folder. Use for project dependencies.
- Global installation: Installs in a system directory, accessible from anywhere. Use for CLI tools.
// Local installation
npm install express // Adds to package.json dependencies
// Global installation
npm install -g nodemon // Installs globally, no entry in package.json
To run globally installed CLI tools, npm's global bin directory must be in the system's PATH.
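For example, npx executes a package without a permanent install:
npx nodemon server.js // Fetches nodemon if needed and runs it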
Note: Local is preferred to avoid version conflicts; use npx for temporary global-like execution.
14. What is package.json and what does it contain?
package.json is a manifest file in Node.js projects that defines project metadata, dependencies, and scripts. It's created with npm init and is crucial for reproducibility.
It contains:
- name and version: Project identifiers.
- dependencies: Production packages.
- devDependencies: Development tools.
- scripts: Commands like "start", "test".
- engines: Required Node.js version.
- Other fields like description, author, license.
{
"name": "my-app",
"version": "1.0.0",
"dependencies": {
"express": "^4.17.1"
},
"scripts": {
"start": "node server.js"
}
}
Note: Use semantic versioning for dependencies to manage updates safely.
15. Explain modules in Node.js.
Modules in Node.js are reusable code blocks encapsulated in files, promoting modularity and code organization. Node.js uses CommonJS by default, but supports ES modules.
Each file is a module with its own scope. Modules export functionality for import in other files.
Types:
- Built-in (e.g., fs, http).
- Third-party (from NPM).
- Custom (user-defined).
// math.js (custom module)
module.exports = {
add: (a, b) => a + b
};
// app.js
const math = require('./math');
console.log(math.add(2, 3)); // 5
Note: Modules cache after first load for performance.
16. What is the difference between require() and import?
require() is part of CommonJS (synchronous, runtime), while import is from ES modules (asynchronous, static analysis).
- require(): Loads modules dynamically; can be conditional.
- import: Static, enables tree-shaking; requires "type": "module" in package.json.
// CommonJS
const fs = require('fs'); // Synchronous
// ES Module
import fs from 'fs'; // Asynchronous, static
CommonJS example (dynamic):
if (condition) {
const mod = require('module'); // Allowed
}
ES Module limitation (static imports cannot be conditional):
if (condition) {
  import mod from 'module'; // Syntax error
}
// Use dynamic import() instead: const mod = await import('module');
Note: Node.js v12+ supports ES modules natively with .mjs or package.json flag.
17. What is module.exports?
module.exports is an object in CommonJS modules that defines what a module exposes when required. It can be reassigned or have properties added.
By default, it's an empty object. exports is an alias, but reassigning exports breaks the reference.
// Single export
module.exports = function greet() {
return 'Hello';
};
// Multiple exports
module.exports = {
greet: () => 'Hello',
farewell: () => 'Goodbye'
};
// Usage
const utils = require('./utils');
console.log(utils.greet()); // Hello
Note: Prefer module.exports over exports for clarity, especially when exporting a single value.
18. How do you create a basic HTTP server in Node.js?
To create a basic HTTP server, use the built-in http module. It listens for requests and sends responses.
const http = require('http');
const server = http.createServer((req, res) => {
res.statusCode = 200; // Set status
res.setHeader('Content-Type', 'text/plain'); // Set header
res.end('Hello, World!\n'); // Send response
});
server.listen(3000, '127.0.0.1', () => {
console.log('Server running at http://127.0.0.1:3000/');
});
Access via browser or curl. Handles basic routing via req.url.
Note: For production, use frameworks like Express for better routing and middleware.
19. What is Express.js and why is it used?
Express.js is a minimal, flexible web framework for Node.js that simplifies building web servers and APIs. It provides robust routing, middleware, and template support.
Why used:
- Handles HTTP requests/responses easily.
- Supports middleware for added functionality.
- Extensible with plugins.
- Fast and unopinionated.
const express = require('express');
const app = express();
app.get('/', (req, res) => {
res.send('Hello, Express!');
});
app.listen(3000, () => console.log('Server on port 3000'));
Note: Ideal for RESTful APIs and single-page apps.
20. How do you handle routes in Express.js?
In Express.js, routes map HTTP methods and paths to handler functions. Use app.METHOD(path, handler) for definition.
const express = require('express');
const app = express();
// GET route
app.get('/users', (req, res) => {
res.json([{ id: 1, name: 'John' }]);
});
// POST route
app.post('/users', (req, res) => {
res.send('User created');
});
// Route parameters
app.get('/users/:id', (req, res) => {
res.send(`User ID: ${req.params.id}`);
});
app.listen(3000);
Use express.Router() for modular routing.
Note: Routes match in order; place specific ones first.
21. What is middleware in Express.js?
Middleware functions in Express.js process requests before they reach route handlers. They can modify req/res, end the request-response cycle, or pass control onward with next().
Types: Application-level, router-level, error-handling.
const express = require('express');
const app = express();
// Application middleware
app.use((req, res, next) => {
console.log('Request received');
next(); // Proceed
});
// Route-specific
app.use('/api', (req, res, next) => {
if (req.query.apiKey) next();
else res.status(401).send('Unauthorized');
});
app.get('/api/data', (req, res) => {
res.send('Data');
});
Note: Error middleware takes four arguments: (err, req, res, next).
22. Explain the types of middleware in Node.js.
In Node.js (especially Express), middleware types include:
- Application-level: Applied to all routes via app.use().
- Router-level: Scoped to a router instance.
- Error-handling: Catches errors with (err, req, res, next).
- Built-in: Like express.json() for parsing.
- Third-party: Like morgan for logging.
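A short sketch of the built-in and third-party kinds (assuming an Express app and that morgan is installed):
const morgan = require('morgan');
app.use(express.json()); // Built-in: parses JSON request bodies
app.use(morgan('dev')); // Third-party: logs each request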
// Error-handling middleware
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(500).send('Error!');
});
Note: Middleware executes sequentially; order matters.
23. What are streams in Node.js?
Streams in Node.js are objects for handling continuous data flows efficiently, without loading everything into memory. Ideal for large files or real-time data.
They inherit from EventEmitter and support events like 'data', 'end'.
const fs = require('fs');
const readStream = fs.createReadStream('file.txt');
readStream.on('data', (chunk) => {
console.log(chunk.toString()); // Process chunks
});
readStream.on('end', () => {
console.log('Done');
});
Note: Streams prevent memory issues with big data.
24. Differentiate between readable, writable, and duplex streams.
Streams types:
- Readable: Source of data (e.g., fs.createReadStream). Emits 'data' events.
- Writable: Destination for data (e.g., fs.createWriteStream). Has write() method.
- Duplex: Both readable and writable (e.g., net.Socket).
- Transform: Duplex that modifies data (e.g., zlib.createGzip).
// Readable
const readable = fs.createReadStream('input.txt');
// Writable
const writable = fs.createWriteStream('output.txt');
readable.pipe(writable); // Pipe readable output into the writable
Note: Use pause/resume for flow control in readable streams.
25. What is piping in streams?
Piping connects a readable stream's output to a writable stream's input, automating data transfer. It handles backpressure and events.
const fs = require('fs');
const zlib = require('zlib');
fs.createReadStream('input.txt')
.pipe(zlib.createGzip()) // Transform stream
.pipe(fs.createWriteStream('input.txt.gz')); // Compress and write
Chains multiple streams efficiently.
Note: Piping simplifies code and optimizes performance for I/O operations.
26. How do you handle file uploads in Node.js?
File uploads in Node.js are handled using middleware like Multer in Express.js, which processes multipart/form-data. It parses files and fields, storing them on disk or in memory.
const express = require('express');
const multer = require('multer');
const app = express();
// Disk storage
const storage = multer.diskStorage({
destination: (req, file, cb) => cb(null, 'uploads/'),
filename: (req, file, cb) => cb(null, Date.now() + '-' + file.originalname)
});
const upload = multer({ storage });
// Single file upload
app.post('/upload', upload.single('file'), (req, res) => {
if (!req.file) return res.status(400).send('No file uploaded');
res.send(`File uploaded: ${req.file.filename}`);
});
app.listen(3000);
Use req.file for single, req.files for multiple. Validate with fileFilter.
Note: Always validate file types and sizes to prevent security issues.
27. What is the Buffer class in Node.js?
The Buffer class in Node.js handles raw binary data as fixed-size sequences of bytes. It's used for encoding/decoding, file I/O, and network data that JavaScript strings can't represent directly.
Buffers are instances of Uint8Array but with additional methods.
const buf = Buffer.from('Hello, Node.js!');
// Convert to string
console.log(buf.toString('utf8')); // Hello, Node.js!
// Slice (a view sharing memory with buf; subarray() is the modern equivalent)
const slice = buf.slice(0, 5);
console.log(slice.toString()); // Hello
// Hex encoding
console.log(buf.toString('hex')); // 48656c6c6f2c204e6f64652e6a7321
Note: Buffers are mutable; avoid sharing slices to prevent side effects.
28. Explain error handling in Node.js.
Error handling in Node.js uses error-first callbacks, Promises (.catch()), async/await (try-catch), and process events for anything unhandled (the older domain module is deprecated). Distinguish between operational errors (expected, recoverable) and programmer errors (code bugs).
- Callbacks: Check first arg for error.
- Promises: Use .catch() or try-catch with await.
- Global: process.on('uncaughtException') as last resort.
// Callback error handling
fs.readFile('file.txt', (err, data) => {
if (err) {
console.error('Error:', err.message);
return;
}
console.log(data);
});
Note: Log errors and exit gracefully on fatal ones to prevent crashes.
29. What is the try-catch block used for in async code?
In async code, try-catch wraps await expressions to handle Promise rejections synchronously, similar to synchronous code. It catches errors from awaited Promises.
async function fetchData() {
try {
const response = await fetch('https://api.example.com/data');
if (!response.ok) throw new Error('Failed to fetch');
const data = await response.json();
return data;
} catch (error) {
console.error('Error fetching data:', error.message);
throw error; // Re-throw if needed
}
}
fetchData().catch(err => console.log('Global catch:', err));
Note: Try-catch only catches errors in the current async function; propagate with throw.
30. How do you handle uncaught exceptions in Node.js?
Uncaught exceptions are handled with process.on('uncaughtException'), whose listener runs before the process would otherwise crash, giving you a chance to log and clean up. Don't rely on it to keep the app running; fix the underlying code. Also handle unhandled Promise rejections via the 'unhandledRejection' event.
process.on('uncaughtException', (err) => {
console.error('Uncaught Exception:', err.message);
// Cleanup: close DB, etc.
process.exit(1); // Exit with failure code
});
process.on('unhandledRejection', (reason, promise) => {
console.error('Unhandled Rejection at:', promise, 'reason:', reason);
});
throw new Error('This will be caught'); // Example
Note: In production, log to external service and restart via PM2 or similar.
31. What is clustering in Node.js?
Clustering in Node.js uses the cluster module to create multiple worker processes sharing the same port, leveraging multi-core CPUs for better performance and scalability.
The master process forks workers; if a worker dies, the master can respawn it.
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
console.log(`Master ${process.pid} is running`);
for (let i = 0; i < numCPUs; i++) {
cluster.fork(); // Create workers
}
cluster.on('exit', (worker) => {
console.log(`Worker ${worker.process.pid} died`);
cluster.fork(); // Respawn
});
} else {
http.createServer((req, res) => {
res.writeHead(200);
res.end('Hello from worker ' + process.pid);
}).listen(8000);
console.log(`Worker ${process.pid} started`);
}
Note: Workers don't share memory; use IPC for communication. In Node.js 16+, cluster.isPrimary is the preferred alias for isMaster.
32. How do you implement clustering for scalability?
Implement clustering by forking workers equal to CPU cores, load-balancing requests across them. Use sticky sessions for session affinity if needed. Monitor and restart dead workers.
For scalability, combine with process managers like PM2.
const cluster = require('cluster');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on('online', (worker) => {
console.log(`Worker ${worker.process.pid} is online`);
});
} else {
// Worker logic: e.g., Express app
require('./app'); // Start server
}
Note: Test under load; each worker is still single-threaded, so clustering mainly raises throughput for I/O-bound servers. Pair it with worker threads for heavy CPU-bound work.
33. What are worker threads in Node.js?
Worker threads in Node.js (since v10.5) allow running JavaScript in parallel threads, sharing memory via SharedArrayBuffer or message passing. Useful for CPU-intensive tasks without blocking the Event Loop.
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');
if (isMainThread) {
const worker = new Worker(__filename, { workerData: { num: 10 } });
worker.on('message', (msg) => console.log('Worker says:', msg));
worker.on('error', (err) => console.error(err));
worker.on('exit', (code) => {
if (code !== 0) console.error(`Worker stopped with exit code ${code}`);
});
} else {
// Worker code
const result = workerData.num * 2;
parentPort.postMessage(result);
}
Note: Overhead exists; use for heavy computations like image processing.
34. Differentiate between clusters and worker threads.
Clusters use separate processes (full isolation, higher overhead), while worker threads use threads (shared memory possible, lower overhead but potential race conditions).
| Aspect | Clusters | Worker Threads |
|---|---|---|
| Isolation | Process-level (copies memory) | Thread-level (shared possible) |
| Communication | IPC messages | Messages or SharedArrayBuffer |
| Use Case | Scaling across cores (I/O) | Parallel CPU tasks |
| Overhead | High (forking) | Low |
Note: Clusters for horizontal scaling; threads for concurrency within process.
35. What is the role of libuv in Node.js?
libuv is a cross-platform C library providing asynchronous I/O via event loops, used by Node.js for non-blocking operations like file system, DNS, and timers. It abstracts OS differences.
It manages the thread pool for blocking tasks (e.g., file I/O) and handles signals.
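A sketch of the thread pool at work: crypto.pbkdf2 is a pool-backed operation, so with the default pool size of 4 (tunable via UV_THREADPOOL_SIZE) four concurrent calls finish at roughly the same time:
const crypto = require('crypto');
const start = Date.now();
for (let i = 0; i < 4; i++) {
  crypto.pbkdf2('secret', 'salt', 100000, 64, 'sha512', () => {
    // Each call ran on its own libuv pool thread
    console.log(`pbkdf2 #${i} done after ${Date.now() - start} ms`);
  });
}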
Note: libuv enables Node's single-threaded model with multi-threaded backend for I/O.
36. Explain the V8 engine and its relation to Node.js.
V8 is Google's open-source JavaScript engine that compiles JS to native machine code using JIT compilation, powering Chrome and Node.js.
In Node.js, V8 executes JS code, while Node provides bindings to libuv and other C libraries for I/O and system access.
V8 features: Ignition (interpreter), TurboFan (optimizer), garbage collection.
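You can check which V8 build your Node.js binary embeds:
console.log(process.versions.v8); // V8 version string
console.log(process.versions.node); // Matching Node.js version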
Note: Node.js embeds V8; updates to V8 improve Node's performance.
37. What is REPL in Node.js?
REPL (Read-Eval-Print Loop) is an interactive shell for executing JS code line-by-line, useful for testing and debugging. Start it by running node with no arguments in a terminal.
$ node
> const x = 5;
undefined
> x * 2
10
> .help // Shows commands
> .exit // Quit
Supports multi-line input, history, and custom REPLs via repl module.
Note: Use for quick prototypes; .break for multi-line escape.
38. How do you debug a Node.js application?
Debug Node.js apps using built-in inspector (Chrome DevTools), VS Code debugger, or CLI with --inspect. Set breakpoints and step through code.
// Start with inspector
node --inspect server.js
// Then: chrome://inspect in Chrome
Use console.log for simple logging, or libraries like debug for conditional.
In VS Code: Launch config with "type": "node", "request": "launch".
Note: --inspect-brk pauses on first line; use for startup issues.
39. What tools can ensure consistent code style in Node.js?
Tools for code style: ESLint (linter), Prettier (formatter), StandardJS (all-in-one). Integrate with editors and CI/CD.
- ESLint: Rules for JS/ES6+.
- Prettier: Auto-formats code.
- husky + lint-staged: Pre-commit hooks.
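A sketch of pre-commit wiring in package.json (assuming husky and lint-staged are installed; exact husky setup varies by version):
// package.json excerpt: staged files get linted and formatted before commit
"lint-staged": {
  "*.js": ["eslint --fix", "prettier --write"]
}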
Note: Enforce via npm scripts like "lint": "eslint ."
40. What is ESLint and how is it used?
ESLint is a pluggable, configurable linter for JavaScript that identifies problematic patterns and enforces style rules.
Usage: Install globally/locally, init config, run on files.
// Install
npm install eslint --save-dev
npx eslint --init // Setup
// .eslintrc.js
module.exports = {
env: { node: true, es2021: true },
extends: ['eslint:recommended'],
rules: { 'no-console': 'warn' }
};
// Run
npx eslint your-file.js
Integrate with VS Code extension for real-time feedback.
Note: Customize rules; use --fix for auto-fixes.
41. Explain RESTful APIs in Node.js.
RESTful APIs in Node.js follow Representational State Transfer principles, using HTTP methods (GET, POST, PUT, DELETE) to perform CRUD operations on resources identified by URIs. They are stateless, use JSON for data exchange, and leverage HTTP status codes for responses.
In Node.js, Express.js is commonly used to build RESTful APIs due to its routing capabilities.
Example structure:
const express = require('express');
const app = express();
app.use(express.json());
// GET: Retrieve resource
app.get('/users/:id', (req, res) => {
res.json({ id: req.params.id, name: 'John' }); // 200 OK
});
// POST: Create resource
app.post('/users', (req, res) => {
const user = req.body;
res.status(201).json(user); // 201 Created
});
// PUT: Update resource
app.put('/users/:id', (req, res) => {
res.json({ updated: true }); // 200 OK
});
// DELETE: Remove resource
app.delete('/users/:id', (req, res) => {
res.status(204).end(); // 204 No Content
});
app.listen(3000);
Note: Ensure APIs are versioned (e.g., /api/v1/users) for maintainability.
42. How do you connect Node.js to MongoDB?
Connect Node.js to MongoDB using the official MongoDB Node.js driver. Install via npm install mongodb, then use MongoClient to establish a connection.
const { MongoClient } = require('mongodb');
const uri = 'mongodb://localhost:27017/mydb'; // Connection string
async function connectDB() {
const client = new MongoClient(uri);
try {
await client.connect(); // Establish connection
console.log('Connected to MongoDB');
const db = client.db('mydb'); // Select database
// Perform operations
return db;
} catch (err) {
console.error('Connection error:', err);
} finally {
await client.close(); // Close connection (real apps reuse one client rather than closing per request)
}
}
connectDB();
Use async/await for handling connections. For production, handle reconnections and errors gracefully.
Note: Use environment variables for URI to secure credentials.
43. What is Mongoose and its benefits?
Mongoose is an ODM (Object Data Modeling) library for MongoDB and Node.js, providing a schema-based solution to model data, validate inputs, and handle relationships.
Benefits:
- Schema definition for data structure enforcement.
- Built-in validation and type casting.
- Query building with chaining.
- Middleware (hooks) for pre/post operations.
- Population for referencing documents.
const mongoose = require('mongoose');
mongoose.connect('mongodb://localhost:27017/mydb');
const userSchema = new mongoose.Schema({
name: { type: String, required: true },
age: Number
});
const User = mongoose.model('User', userSchema);
// Usage
const user = new User({ name: 'John', age: 30 });
user.save().then(() => console.log('Saved'));
Note: Mongoose simplifies MongoDB interactions but adds overhead; use for complex apps.
44. How do you handle authentication in Node.js?
Authentication in Node.js can be handled using strategies like JWT, sessions, or OAuth. Common libraries include Passport.js for multiple strategies, bcrypt for password hashing, and jsonwebtoken for tokens.
Example with JWT:
const express = require('express');
const jwt = require('jsonwebtoken');
const bcrypt = require('bcrypt');
const app = express();
app.use(express.json());
const users = []; // Mock DB
app.post('/register', async (req, res) => {
const hashed = await bcrypt.hash(req.body.password, 10);
users.push({ username: req.body.username, password: hashed });
res.status(201).send('Registered');
});
app.post('/login', async (req, res) => {
const user = users.find(u => u.username === req.body.username);
if (user && await bcrypt.compare(req.body.password, user.password)) {
const token = jwt.sign({ username: user.username }, 'secret');
res.json({ token });
} else {
res.status(401).send('Invalid credentials');
}
});
// Protected route
app.get('/protected', (req, res) => {
const token = req.headers.authorization?.split(' ')[1];
try {
const decoded = jwt.verify(token, 'secret');
res.send(`Hello, ${decoded.username}`);
} catch {
res.status(401).send('Unauthorized');
}
});
app.listen(3000);
Note: Use HTTPS in production and store secrets securely.
45. What is JWT and how is it implemented?
JWT (JSON Web Token) is a compact, URL-safe token for securely transmitting information as a JSON object, consisting of header, payload, and signature.
Implementation: Sign tokens on server, verify on subsequent requests.
const jwt = require('jsonwebtoken');
const secret = 'your-secret-key';
// Generate JWT
const token = jwt.sign({ userId: 123, role: 'admin' }, secret, { expiresIn: '1h' });
// Verify JWT
try {
const decoded = jwt.verify(token, secret);
console.log(decoded); // { userId: 123, role: 'admin', iat: ..., exp: ... }
} catch (err) {
console.error('Invalid token');
}
Use in headers: Authorization: Bearer <token>.
Note: Never store sensitive data in payload; use secure secrets.
46. Explain sessions vs tokens for authentication.
Sessions: Server stores user data in memory/database, client gets session ID (cookie). Stateful, requires server storage.
Tokens (JWT): Client stores self-contained token, server verifies signature. Stateless, scalable, but harder to revoke.
| Aspect | Sessions | Tokens |
|---|---|---|
| Storage | Server-side | Client-side |
| State | Stateful | Stateless |
| Scalability | Needs sticky sessions | Easy across servers |
| Revocation | Easy (delete session) | Harder (blacklist) |
Example session with express-session:
const session = require('express-session');
app.use(session({ secret: 'secret', resave: false, saveUninitialized: true }));
app.post('/login', (req, res) => {
req.session.user = { id: 1 };
res.send('Logged in');
});
Note: Tokens suit APIs; sessions for web apps.
47. What is CORS and how do you enable it in Express?
CORS (Cross-Origin Resource Sharing) is a mechanism that lets a server tell browsers which other origins may access its resources; without it, the browser's same-origin policy blocks cross-origin requests. Enable it in Express using the cors middleware.
const express = require('express');
const cors = require('cors');
const app = express();
// Enable for all origins (use either this or the specific config below, not both)
app.use(cors());
// Specific origins
app.use(cors({
origin: 'https://example.com', // Allow this origin
methods: ['GET', 'POST'], // Allowed methods
credentials: true // Allow cookies
}));
app.get('/', (req, res) => {
res.send('CORS enabled');
});
app.listen(3000);
Note: Configure carefully to avoid security risks.
48. How do you secure Node.js applications?
Secure Node.js apps by: using HTTPS, validating inputs, hashing passwords, implementing auth/authorization, rate limiting, helmet for headers, and keeping dependencies updated.
- Use helmet: Sets security headers.
- HTTPS: With fs and https module.
- Input validation: Libraries like joi.
const helmet = require('helmet');
app.use(helmet());
// HTTPS example
const https = require('https');
const fs = require('fs');
https.createServer({
key: fs.readFileSync('key.pem'),
cert: fs.readFileSync('cert.pem')
}, app).listen(443);
Note: Run behind reverse proxy like Nginx for production HTTPS.
49. What are common security vulnerabilities in Node.js?
Common vulnerabilities: Injection (SQL/NoSQL), XSS, CSRF, broken auth, insecure dependencies, unhandled errors exposing info, DoS via unvalidated inputs.
- Injection: Sanitize inputs.
- XSS: Escape outputs.
- Dependencies: Use npm audit.
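For the dependency check, npm ships a built-in auditor:
npm audit // Report known vulnerabilities in installed packages
npm audit fix // Apply compatible fixes automatically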
Note: Follow OWASP guidelines; use Snyk for scanning.
50. Explain SQL injection prevention in Node.js.
SQL injection occurs when unsanitized inputs are concatenated into queries. Prevent using prepared statements or parameterized queries with libraries like mysql2 or pg.
const mysql = require('mysql2');
const connection = mysql.createConnection({ /* config */ });
// Vulnerable
connection.query(`SELECT * FROM users WHERE id = ${req.query.id}`); // Bad
// Safe: Parameterized
connection.query('SELECT * FROM users WHERE id = ?', [req.query.id], (err, results) => {
// Handle
});
Use ORMs like Sequelize for automatic escaping.
Note: Never trust user input; always parameterize.
51. What is rate limiting in APIs?
Rate limiting restricts request count from a client in a time window to prevent abuse, DoS, or overload.
Implement with express-rate-limit.
const rateLimit = require('express-rate-limit');
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // Limit each IP to 100 requests
message: 'Too many requests'
});
app.use(limiter); // Global
app.use('/api/', limiter); // Route-specific
Note: Use with Redis for distributed limiting.
52. How do you implement caching in Node.js?
Implement caching using in-memory (Node cache) or external stores like Redis. Cache responses to reduce database hits.
const NodeCache = require('node-cache');
const cache = new NodeCache({ stdTTL: 600 }); // 10 min TTL
app.get('/data', (req, res) => {
const key = 'mydata';
const cached = cache.get(key);
if (cached) {
return res.json(cached);
}
// Fetch from DB
const data = { /* fetched */ };
cache.set(key, data);
res.json(data);
});
Note: Invalidate cache on updates; use ETags for HTTP caching.
53. What is Redis and its use with Node.js?
Redis is an in-memory key-value store used for caching, sessions, queues. With Node.js, use redis client for fast data access.
const redis = require('redis');
(async () => {
  const client = redis.createClient();
  client.on('error', (err) => console.log('Redis Client Error', err));
  await client.connect(); // Top-level await needs ESM, so wrap in an async IIFE
  await client.set('key', 'value');
  const value = await client.get('key');
  console.log(value); // 'value'
  await client.quit();
})();
Uses: Caching API responses, pub/sub, rate limiting.
Note: Persistent with AOF/RDB; scale with clusters.
54. Explain WebSockets in Node.js.
WebSockets provide full-duplex communication over a single TCP connection for real-time apps. In Node.js, use ws library.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', (ws) => {
ws.on('message', (message) => {
console.log('Received:', message.toString());
ws.send('Echo: ' + message); // Send back
});
ws.send('Connected');
});
Client: new WebSocket('ws://localhost:8080')
Note: Handles upgrades from HTTP; use for chat, games.
55. What is Socket.io and how does it work?
Socket.io is a library for real-time, bidirectional communication, abstracting WebSockets with fallbacks (polling).
It works by establishing a connection, emitting events, and listening.
const express = require('express');
const { createServer } = require('http');
const { Server } = require('socket.io');
const app = express();
const httpServer = createServer(app);
const io = new Server(httpServer);
io.on('connection', (socket) => {
console.log('User connected');
socket.on('chat message', (msg) => {
io.emit('chat message', msg); // Broadcast
});
socket.on('disconnect', () => console.log('User disconnected'));
});
httpServer.listen(3000);
Client: <script src="/socket.io/socket.io.js"></script>
Note: Supports rooms, namespaces for scalability.
56. How do you handle real-time data in Node.js?
Handle real-time data using WebSockets/Socket.io for push updates, or Server-Sent Events (SSE) for unidirectional.
SSE example:
app.get('/events', (req, res) => {
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
const interval = setInterval(() => {
res.write(`data: ${new Date().toString()}\n\n`);
}, 1000);
req.on('close', () => clearInterval(interval));
});
Note: Use for notifications, live feeds; scale with Redis pub/sub.
57. What is the difference between GET and POST requests?
GET: Retrieves data, idempotent and safe, parameters in the URL, cacheable, limited length.
POST: Submits data, non-idempotent, payload in the request body, not cached; better for sensitive data since it isn't exposed in the URL (still use HTTPS for confidentiality).
| GET | POST |
|---|---|
| URL params | Body |
| Bookmarkable | Not |
| Read | Create/Update |
Example in Express:
app.get('/search', (req, res) => res.send(req.query.q)); // GET ?q=term
app.post('/submit', (req, res) => res.send(req.body.data)); // POST body
Note: Use GET for safe operations, POST for changes.
58. Explain HTTP status codes relevant to Node.js.
Common codes:
- 200 OK: Success.
- 201 Created: Resource created.
- 204 No Content: Success, no body.
- 400 Bad Request: Invalid input.
- 401 Unauthorized: Auth required.
- 403 Forbidden: Access denied.
- 404 Not Found: Resource missing.
- 500 Internal Server Error: Unexpected error.
- 503 Service Unavailable: Temporary overload.
res.status(404).send('Not Found');
Note: Use appropriate codes for clear API responses.
59. What is the fs module and its methods?
The fs (File System) module provides APIs for file I/O. Methods: readFile, writeFile, appendFile, unlink, mkdir, etc. Supports sync/async/promises.
- readFile: Reads file async.
- writeFile: Writes file async.
- existsSync: Checks existence sync.
const fs = require('fs');
// Async read
fs.readFile('file.txt', 'utf8', (err, data) => {
  if (err) return console.error(err); // Bail out so data isn't used on error
  console.log(data);
});
Note: Prefer async to avoid blocking.
60. How do you read and write files asynchronously?
Use fs.promises for async file operations with await, or callbacks.
const fs = require('fs').promises;
async function handleFiles() {
try {
const data = await fs.readFile('input.txt', 'utf8'); // Read async
console.log(data);
await fs.writeFile('output.txt', 'New content'); // Write async
console.log('Written');
} catch (err) {
console.error(err);
}
}
handleFiles();
Note: Handle errors; use streams for large files.
61. What is the path module in Node.js?
The path module in Node.js provides utilities for working with file and directory paths. It handles platform-specific path delimiters (e.g., '/' on Unix, '\' on Windows) and offers methods for joining, resolving, normalizing, and parsing paths.
It's a built-in module, so no installation is required.
const path = require('path');
// Join path segments
const fullPath = path.join('/users', 'john', 'docs', 'file.txt');
console.log(fullPath); // /users/john/docs/file.txt (Unix)
// Resolve absolute path
const absolute = path.resolve('relative/file.txt');
console.log(absolute); // Current working dir + /relative/file.txt
// Get file extension
console.log(path.extname('file.txt')); // .txt
// Parse path
const parsed = path.parse('/home/user/dir/file.txt');
console.log(parsed); // { root: '/', dir: '/home/user/dir', base: 'file.txt', ext: '.txt', name: 'file' }
Note: Always use path module to avoid hardcoding delimiters for cross-platform compatibility.
62. Explain environment variables in Node.js.
Environment variables in Node.js are key-value pairs available to the process, used for configuration like API keys, ports, or database URLs. They are accessed via process.env and can be set via OS, .env files, or programmatically.
They promote security by keeping secrets out of code and enable environment-specific configs (dev, prod).
// Access env var
const port = process.env.PORT || 3000; // Default to 3000 if not set
console.log(`Server on port ${port}`);
// Set env var (temporary, for current process)
process.env.API_KEY = 'secret';
// List all
console.log(process.env); // Object with all vars
Note: Avoid committing sensitive env vars; use .gitignore for .env files.
63. How do you use dotenv for configuration?
dotenv is a module that loads environment variables from a .env file into process.env, useful for development. Install via npm install dotenv, then require it early in the app.
// Install: npm install dotenv
// .env file
// PORT=3000
// DB_URL=mongodb://localhost:27017/mydb
// app.js
require('dotenv').config(); // Load .env
const port = process.env.PORT;
const dbUrl = process.env.DB_URL;
console.log(`Port: ${port}, DB: ${dbUrl}`);
// For ES modules
import { config } from 'dotenv';
config();
Note: Do not use dotenv in production; use system env vars instead.
64. What is PM2 and why is it used?
PM2 is a process manager for Node.js applications, providing features like clustering, load balancing, auto-restarts, monitoring, and log management. It's used to keep apps running reliably in production.
Why used: Handles crashes, scales across cores, zero-downtime deploys, and integrates with ecosystems.
// Install globally: npm install -g pm2
// Start app
// pm2 start app.js --name my-app
// Cluster mode
// pm2 start app.js -i max // Forks for each CPU core
// Monitor
// pm2 monit
// Logs
// pm2 logs my-app
Note: Use ecosystem.config.js for advanced configs like env vars per mode.
65. How do you deploy a Node.js app to production?
Deploying a Node.js app involves: preparing code (minify, bundle), setting up server (e.g., AWS, Heroku), using process manager like PM2, configuring env vars, enabling HTTPS, and monitoring.
- Build/test locally.
- Push to repo (Git).
- Deploy to platform (e.g., Heroku: git push heroku main).
- Set env vars on platform.
- Use reverse proxy (Nginx) for static files/HTTPS.
- Monitor with tools like New Relic.
// package.json scripts
"scripts": {
"start": "node server.js",
"build": "npm install" // Or webpack if needed
}
// Heroku Procfile
web: npm start
Note: Use CI/CD for automated deploys; secure with firewalls.
66. Explain Docker for Node.js applications.
Docker containerizes Node.js apps, ensuring consistency across environments by packaging code, deps, and runtime. Use Dockerfile to define image, then run containers.
Benefits: Portability, isolation, easy scaling with orchestrators like Kubernetes.
# Dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
# Build and run
docker build -t myapp .
docker run -p 3000:3000 -d myapp
Note: Use multi-stage builds for smaller images; .dockerignore for excluding files.
67. What is CI/CD in the context of Node.js?
CI/CD (Continuous Integration/Continuous Deployment) automates testing, building, and deploying Node.js apps. CI runs tests on code changes; CD deploys successful builds.
Tools: GitHub Actions, Jenkins, CircleCI. For Node.js: Lint, test with Jest, build, deploy to Heroku/AWS.
# GitHub Actions workflow (.github/workflows/ci.yml)
name: Node.js CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js
uses: actions/setup-node@v2
with: { node-version: '18' }
- run: npm ci
- run: npm test
Note: Include security scans and artifact uploads in pipelines.
68. How do you test Node.js code?
Test Node.js code using frameworks like Jest, Mocha, or Tap. Write unit/integration tests, mock deps, and run via npm test. Cover assertions, async handling.
// sum.js
function sum(a, b) { return a + b; }
module.exports = sum;
// sum.test.js (Jest)
const sum = require('./sum');
test('adds 1 + 2 to equal 3', () => {
expect(sum(1, 2)).toBe(3);
});
// package.json
"scripts": { "test": "jest" }
Note: Aim for high coverage; use supertest for API testing.
69. What is Jest and how is it used for testing?
Jest is a testing framework for JavaScript/Node.js, offering zero-config setup, mocking, snapshots, and parallel execution. Install via npm install --save-dev jest.
// Config in package.json
"jest": {
"testEnvironment": "node"
}
// Example test
describe('Math functions', () => {
it('multiplies 2 * 3 to equal 6', () => {
expect(2 * 3).toBe(6);
});
test('async test', async () => {
const data = await Promise.resolve('data');
expect(data).toBe('data');
});
});
// Run: npm test
Note: Jest supports Babel for ES6+; use --coverage for reports.
70. Explain unit testing vs integration testing.
Unit testing: Tests individual components (functions, classes) in isolation, mocking externalities. Fast, focused on logic.
Integration testing: Tests interactions between components/modules, e.g., API with DB. Slower, catches integration issues.
// Unit (Jest)
test('unit: sum', () => {
expect(sum(1, 2)).toBe(3); // Isolated
});
// Integration (supertest + Express)
const request = require('supertest');
const app = require('./app');
test('integration: GET /', async () => {
const res = await request(app).get('/');
expect(res.status).toBe(200);
expect(res.text).toContain('Hello'); // With real server
});
Note: Pyramid: More units, fewer integrations; use both for robust apps.
71. What is the EventEmitter class?
EventEmitter is a core class in Node.js for implementing event-driven programming. It allows objects to emit events and register listeners, forming the basis for modules like http.
const EventEmitter = require('events');
class MyEmitter extends EventEmitter {}
const emitter = new MyEmitter();
emitter.on('event', (arg) => {
console.log('Event fired with:', arg); // Listener
});
emitter.emit('event', 'data'); // Emit
Note: Use once() for single-fire; removeListener() to unsubscribe.
72. How do you create custom events in Node.js?
Create custom events by extending EventEmitter and defining event names. Emit with data, listen with callbacks.
const EventEmitter = require('events');
class Logger extends EventEmitter {
log(message) {
console.log(message);
this.emit('logged', { time: Date.now(), msg: message }); // Custom event
}
}
const logger = new Logger();
logger.on('logged', (data) => {
console.log(`Logged at ${data.time}: ${data.msg}`);
});
logger.log('Hello'); // Triggers event
Note: Handle errors with 'error' event to avoid crashes.
73. What is the child_process module?
The child_process module enables spawning child processes to run shell commands or other Node.js scripts, useful for parallel tasks or external tools.
Methods: exec (buffered), spawn (streaming), fork (Node-specific).
const { exec } = require('child_process');
exec('ls -la', (err, stdout, stderr) => {
  if (err) return console.error(err); // Bail out on error
  console.log(stdout); // Command output
});
Note: Handle exit events for cleanup.
74. Differentiate between fork(), spawn(), and exec().
fork(): Creates Node.js child process, establishes IPC channel. For heavy JS tasks.
spawn(): Launches process with streaming stdin/stdout/stderr. For any command, non-buffered.
exec(): Runs command in shell, buffers output. For simple commands with full output.
const { fork, spawn, exec } = require('child_process');
// fork
const child = fork('script.js'); // Node script
child.send('message'); // IPC
// spawn
const ls = spawn('ls', ['-la']);
ls.stdout.on('data', (data) => console.log(data.toString())); // Stream
// exec
exec('ls -la', (err, stdout) => console.log(stdout)); // Buffered
Note: spawn/fork for large data; exec for convenience.
75. What is the util module?
The util module provides utility functions like promisify (convert callback to Promise), types checks, and debugging tools.
const util = require('util');
const fs = require('fs');
const readFilePromise = util.promisify(fs.readFile);
async function read() {
const data = await readFilePromise('file.txt', 'utf8');
console.log(data);
}
read();
// Type check
console.log(util.types.isPromise(Promise.resolve())); // true
Note: Useful for modernizing legacy callback APIs.
76. Explain promises chaining.
Promise chaining sequences asynchronous operations using .then(), where each returns a Promise. .catch() handles errors at chain end.
fetch('https://api.example.com/data')
.then(response => {
if (!response.ok) throw new Error('Failed');
return response.json(); // Chain
})
.then(data => {
console.log(data);
return processData(data); // Next chain
})
.then(result => console.log(result))
.catch(err => console.error(err)); // Central error
Note: Return values in .then() to pass down; flat chains avoid nesting.
77. What is the difference between Observable and Promise?
Promise: Handles single async value (resolves once). Eager execution.
Observable (RxJS): Handles multiple values over time, like streams. Lazy, cancellable, operators for transformations.
// Promise
const promise = new Promise(resolve => setTimeout(() => resolve('done'), 1000));
promise.then(console.log); // 'done'
// Observable (RxJS)
const { Observable } = require('rxjs');
const observable = new Observable(observer => {
observer.next(1);
observer.next(2);
observer.complete();
});
observable.subscribe(console.log); // 1, 2
Note: Observables for events/streams; Promises for single ops.
78. How do you handle memory leaks in Node.js?
Handle leaks by: avoiding global vars, closing connections, using streams, profiling with --inspect, fixing event listener accumulation.
- Monitor with process.memoryUsage().
- Use heap snapshots in Chrome DevTools.
- Avoid large caches without eviction.
// Potential leak: unbounded array growth
const leaks = [];
setInterval(() => {
  leaks.push(new Array(10000)); // Bad: grows without bound
  if (leaks.length > 100) leaks.shift(); // Fix: cap the size
}, 1000);
Note: Leaks cause OOM; test under load.
79. What tools detect memory leaks?
Tools: Node Inspector (--inspect), heapdump + Chrome DevTools, Clinic.js, Memwatch-next, v8-profiler.
// heapdump
const heapdump = require('heapdump');
heapdump.writeSnapshot('/path/to/snapshot.heapsnapshot'); // Analyze in Chrome
// Clinic.js: clinic heap-profiler -- node app.js
Note: Compare snapshots before/after suspected leaks.
80. Explain garbage collection in Node.js.
Garbage collection (GC) in Node.js (V8) automatically frees unused memory using mark-sweep-compact algorithm. Minor GC (Scavenger) for young gen, Major for old.
Triggers: Allocation failure. Tune with flags like --max-old-space-size.
// Force GC (dev only)
if (global.gc) global.gc(); // With --expose-gc
// Monitor
console.log(process.memoryUsage().heapUsed); // Bytes used
Note: Avoid blocking GC; weak refs for caches.
81. What is the process object?
The process object in Node.js is a global object that provides information and control over the current Node.js process. It allows access to environment variables, command-line arguments, process ID, and methods to handle events like exit or uncaught exceptions.
It is an instance of EventEmitter, enabling event listening for process-related events.
// Access process info
console.log(process.pid); // Process ID
console.log(process.cwd()); // Current working directory
console.log(process.version); // Node.js version
// Exit process
process.exit(0); // Exit with success code
Note: Avoid using process.exit() in production as it doesn't allow cleanup; use it for scripts or error handling.
82. How do you get command-line arguments?
Command-line arguments in Node.js are accessed via process.argv, an array where the first two elements are the Node executable path and script path, followed by user-provided arguments.
Use slicing or libraries like yargs for parsing complex args.
// script.js
const args = process.argv.slice(2); // Skip first two
console.log('Arguments:', args);
// Run: node script.js --name=John age=30
// Output: Arguments: [ '--name=John', 'age=30' ]
// Parsing with yargs
const yargs = require('yargs');
const argv = yargs.argv;
console.log(argv.name); // John
Note: For flags, use process.argv directly or minimist/yargs for better handling.
83. What is the os module?
The os module provides operating system-related utilities, such as CPU info, memory usage, network interfaces, and platform details. It's built-in and helps in writing cross-platform code.
const os = require('os');
console.log(os.platform()); // e.g., 'linux'
console.log(os.arch()); // e.g., 'x64'
console.log(os.cpus().length); // Number of CPU cores
console.log(os.totalmem() / (1024 * 1024 * 1024)); // Total memory in GB
console.log(os.freemem() / (1024 * 1024 * 1024)); // Free memory in GB
console.log(os.networkInterfaces()); // Network info
Note: Useful for scaling apps, like forking workers based on CPU count.
84. Explain performance optimization techniques.
Performance optimization in Node.js involves profiling, efficient code, caching, clustering, async patterns, and database tuning. Key techniques:
- Use async operations to avoid blocking the Event Loop.
- Implement clustering for multi-core utilization.
- Cache frequently accessed data (e.g., Redis).
- Optimize database queries and use indexing.
- Minify/compress assets; use streams for large files.
- Profile with tools like Clinic.js or --inspect.
- Avoid memory leaks by managing scopes and listeners.
// Example: Streaming large file
const fs = require('fs');
const http = require('http');
http.createServer((req, res) => {
fs.createReadStream('largefile.txt').pipe(res); // Efficient
}).listen(3000);
Note: Benchmark with tools like Apache Bench; monitor in production.
85. How do you profile a Node.js application?
Profile Node.js apps using built-in tools like --prof (V8 profiler), --inspect (Chrome DevTools), or third-party like Clinic.js. Collect metrics on CPU, memory, and bottlenecks.
// Run with profiler
// node --prof app.js
// Then process: node --prof-process isolate-0x...-v8.log > profile.txt
// With inspect
// node --inspect app.js
// Open chrome://inspect, take heap snapshots or CPU profiles
Use Clinic.js for visual flamegraphs:
// clinic flame -- node app.js
Note: Profile under realistic load; focus on hot paths.
86. What is the clinic.js tool?
Clinic.js is a suite of tools for diagnosing Node.js performance issues, providing visual profiles like flamegraphs (CPU), bubbleprof (async), and doctor (general health).
Install via npm install -g clinic; run with your app for insights.
// Install: npm install -g clinic
// Usage examples:
// clinic doctor -- node app.js // General diagnostics
// clinic flame -- node app.js // CPU flamegraph
// clinic bubbleprof -- node app.js // Async bubble chart
Note: Generates HTML reports; great for identifying bottlenecks visually.
87. Explain microservices with Node.js.
Microservices in Node.js involve building apps as independent services communicating via APIs (HTTP, gRPC). Node's lightweight nature suits it for scalable, modular architectures.
Benefits: Independent deployment, tech diversity. Use Express for APIs, Docker for containerization, Kubernetes for orchestration.
// Service A: User service
const express = require('express');
const app = express();
app.get('/users', (req, res) => res.json([{ id: 1 }])); // Mock
app.listen(3001);
// Service B: Calls Service A
const axios = require('axios');
axios.get('http://localhost:3001/users').then(res => console.log(res.data));
Note: Handle service discovery with Consul; monitor with Prometheus.
88. What is GraphQL and its integration with Node.js?
GraphQL is a query language for APIs allowing clients to request specific data, reducing over/under-fetching. In Node.js, integrate with Apollo Server or express-graphql.
Define schema, resolvers; clients query via POST.
const { ApolloServer, gql } = require('apollo-server');
const typeDefs = gql`
type Query {
hello: String
}
`;
const resolvers = {
Query: { hello: () => 'Hello world!' }
};
const server = new ApolloServer({ typeDefs, resolvers });
server.listen().then(({ url }) => console.log(`Server at ${url}`));
Note: Supports subscriptions for real-time; use with databases via data sources.
89. How do you handle database migrations?
Handle database migrations in Node.js using tools like Knex.js, Sequelize, or Prisma. Define up/down scripts for schema changes, run via CLI.
// With Knex: npm install knex
// migration.js
exports.up = function(knex) {
return knex.schema.createTable('users', table => {
table.increments('id');
table.string('name');
});
};
exports.down = function(knex) {
return knex.schema.dropTable('users');
};
// Run: knex migrate:latest
Note: Version control migrations; test in staging.
90. What is Sequelize ORM?
Sequelize is an ORM for Node.js supporting SQL databases like PostgreSQL, MySQL. It provides model definitions, associations, validations, and query methods.
const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize('database', 'user', 'pass', {
host: 'localhost',
dialect: 'postgres'
});
const User = sequelize.define('User', {
name: { type: DataTypes.STRING, allowNull: false }
});
(async () => {
await sequelize.sync(); // Create table
await User.create({ name: 'John' });
const users = await User.findAll();
console.log(users);
})();
Note: Supports migrations, hooks; good for relational data.
91. Explain TypeScript with Node.js.
TypeScript is a superset of JavaScript adding static types, compiled to JS. In Node.js, it improves code quality, catches errors early, and enhances IDE support.
Set up with tsconfig.json, compile with tsc, run with node or ts-node.
// greeter.ts
function greet(name: string): string {
return `Hello, ${name}`;
}
console.log(greet('World')); // Type-checked
// Compile: tsc greeter.ts → greeter.js
// Run: node greeter.js
Note: Use @types/node for Node typings.
92. What are the benefits of using TypeScript?
Benefits of TypeScript: Static typing reduces runtime errors, better refactoring, improved documentation via types, enhanced IDE autocompletion, and scalability for large teams.
- Catches errors at compile time (e.g., wrong types).
- Supports interfaces, generics, enums.
- Backward compatible with JS.
- Better maintainability in complex apps.
Example benefit: Type safety prevents bugs.
// TS error: Argument of type 'number' is not assignable to 'string'
greet(123); // Compile error
Note: Overhead in learning curve; optional types allow gradual adoption.
93. How do you set up a Node.js project with TypeScript?
Set up: Initialize npm, install TypeScript and types, create tsconfig.json, write TS code, compile, and run.
// Steps:
npm init -y
npm install typescript @types/node --save-dev
npx tsc --init // Creates tsconfig.json
// tsconfig.json excerpt
{
"compilerOptions": {
"target": "es6",
"module": "commonjs",
"outDir": "./dist",
"strict": true
}
}
// package.json scripts
"scripts": {
"build": "tsc",
"start": "node dist/server.js"
}
// Run: npm run build && npm start
Note: Use ts-node for dev without compiling.
94. What is the crypto module?
The crypto module provides cryptographic functionality like hashing, encryption, signing. It's built-in, using OpenSSL under the hood.
const crypto = require('crypto');
// Hash
const hash = crypto.createHash('sha256');
hash.update('password');
console.log(hash.digest('hex')); // Hex digest
// Random bytes
crypto.randomBytes(16, (err, buf) => {
console.log(buf.toString('hex')); // Secure random
});
Note: Use for security-sensitive ops; prefer scrypt/pbkdf2 for passwords.
95. How do you hash passwords in Node.js?
Hash passwords using bcrypt or crypto with pbkdf2. Bcrypt is recommended for its salt and work factor.
const bcrypt = require('bcrypt');
const saltRounds = 10;
async function hashPassword(password) {
const hash = await bcrypt.hash(password, saltRounds);
return hash;
}
async function checkPassword(password, hash) {
return await bcrypt.compare(password, hash);
}
// Usage (await must run inside an async function or an ES module)
(async () => {
const hashed = await hashPassword('secret');
console.log(await checkPassword('secret', hashed)); // true
})();
Note: Never store plain passwords; use high work factor in production.
96. Explain SSL/TLS in Node.js.
SSL/TLS secures network communications with encryption. In Node.js, use https module with key/cert for servers, or request with https for clients.
Generate self-signed or use Let's Encrypt for certs.
const https = require('https');
const fs = require('fs');
const options = {
key: fs.readFileSync('key.pem'),
cert: fs.readFileSync('cert.pem')
};
https.createServer(options, (req, res) => {
res.end('Secure!');
}).listen(443);
Note: Enforce HTTPS; handle certificate management.
97. What is the https module?
The https module is built-in for creating secure HTTP servers/clients using SSL/TLS. It extends http with security options.
const https = require('https');
// Server (as above)
// Client request
https.get('https://example.com', (res) => {
res.on('data', (d) => console.log(d.toString()));
});
Note: Use agent for connection pooling; verify certs in production.
98. How do you implement logging in Node.js?
Implement logging using console for basics, or libraries like Winston/Pino for structured, leveled logs with transports (file, console, cloud).
const winston = require('winston');
const logger = winston.createLogger({
level: 'info',
format: winston.format.json(),
transports: [
new winston.transports.Console(),
new winston.transports.File({ filename: 'app.log' })
]
});
logger.info('Info message');
logger.error('Error message');
Note: Rotate logs; use correlation IDs for distributed systems.
99. What is Winston for logging?
Winston is a versatile logging library for Node.js, supporting multiple transports, formats, levels, and metadata. It's extensible and production-ready.
Features: querying logs, profiling, and exception handling.
// Building on the logger from the previous example
// Custom format
const { format } = winston;
const myFormat = format.printf(({ level, message }) => `${level}: ${message}`);
logger.format = format.combine(format.timestamp(), myFormat);
Note: Integrates with Express via middleware.
100. Explain error monitoring with tools like Sentry.
Error monitoring with Sentry involves capturing exceptions, breadcrumbs, and context for debugging. Integrate via SDK to report to dashboard.
const Sentry = require('@sentry/node');
Sentry.init({ dsn: 'your-dsn' });
// Capture error
try {
throw new Error('Test error');
} catch (e) {
Sentry.captureException(e);
}
// Express integration
app.use(Sentry.Handlers.requestHandler());
app.use(Sentry.Handlers.errorHandler());
Note: Add user context; monitor performance too.
101. What is serverless architecture with Node.js?
Serverless architecture in Node.js refers to building and running applications without managing servers. Developers focus on code, while cloud providers handle infrastructure, scaling, and maintenance. Node.js is popular for serverless due to its lightweight runtime and fast startup times.
Key features: Event-driven, pay-per-use, auto-scaling. Common platforms: AWS Lambda, Google Cloud Functions, Azure Functions.
Example use: API endpoints, data processing, webhooks.
Note: "Serverless" means no server management, but servers still exist—managed by providers.
102. How does AWS Lambda work with Node.js?
AWS Lambda executes Node.js code in response to events, handling scaling and execution. Upload code as a ZIP or container, configure triggers (e.g., HTTP, S3), and Lambda runs it in a managed environment.
Workflow: Event → Lambda function → Execution (cold start if idle) → Response.
// index.js (Handler)
exports.handler = async (event, context) => {
console.log('Event:', event); // Log input
return {
statusCode: 200,
body: JSON.stringify({ message: 'Hello from Lambda!' })
};
};
// Deployment: Zip and upload to Lambda, set runtime to Node.js 20.x
Supports layers for deps, environment vars, and timeouts.
Note: Minimize cold starts with Provisioned Concurrency; monitor with CloudWatch.
103. Explain API gateways.
API Gateways act as entry points for APIs, managing requests, routing, authentication, rate limiting, and caching. In serverless, like AWS API Gateway, they trigger Lambdas and handle responses.
Features: Request transformation, CORS, authorization (e.g., JWT), logging.
Example: Routes /users → Lambda function.
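For intuition, here's a minimal hand-rolled sketch of gateway behavior in Express (routing plus a naive in-memory rate limit); the upstream URL and limit are illustrative assumptions, and global fetch assumes Node v18+:
const express = require('express');
const app = express();
const hits = new Map(); // Naive per-IP counter (single instance, no expiry)
app.use((req, res, next) => {
const count = (hits.get(req.ip) || 0) + 1;
hits.set(req.ip, count);
if (count > 100) return res.status(429).send('Too many requests'); // Rate limit
next();
});
app.get('/users', async (req, res) => {
const upstream = await fetch('http://localhost:3001/users'); // Hypothetical user service
res.status(upstream.status).json(await upstream.json());
});
app.listen(8080, () => console.log('Gateway on 8080'));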
Note: Improves security and scalability; alternatives include Kong, Apigee.
104. What is the dns module?
The dns module in Node.js provides functions for DNS operations like lookups, resolutions, and reverse lookups. It's built-in and uses system resolvers or custom servers.
const dns = require('dns');
dns.lookup('example.com', (err, address, family) => {
console.log('Address:', address); // e.g., 93.184.216.34
console.log('Family:', family); // IPv4 or IPv6
});
Note: Asynchronous by default; use promises with dns.promises.
105. How do you perform DNS lookups?
Perform DNS lookups using dns.lookup() for hostname to IP, or dns.resolve() for specific records (A, MX, etc.). Handle errors for invalid hosts.
const dns = require('dns').promises;
async function lookupHost(host) {
try {
const { address, family } = await dns.lookup(host);
console.log(`IP: ${address}, Family: ${family}`);
const records = await dns.resolve(host, 'MX'); // Mail exchange records
console.log('MX Records:', records);
} catch (err) {
console.error('DNS Error:', err.message);
}
}
lookupHost('example.com');
Note: lookup() uses OS cache; resolve() queries DNS servers directly.
106. What is the net module for TCP servers?
The net module creates TCP or IPC servers and clients. For servers, use net.createServer() to listen for connections, handle data, and manage sockets.
const net = require('net');
const server = net.createServer((socket) => {
console.log('Client connected');
socket.write('Hello from server!\n'); // Send data
socket.on('data', (data) => {
console.log('Received:', data.toString());
});
socket.on('end', () => console.log('Client disconnected'));
});
server.listen(8080, () => console.log('TCP server on 8080'));
Note: Low-level; use for custom protocols, or prefer http for web.
107. Explain UDP with dgram module.
UDP (User Datagram Protocol) is connectionless, lightweight for fast, unreliable data transfer. The dgram module handles UDP sockets for sending/receiving datagrams.
const dgram = require('dgram');
// Server
const server = dgram.createSocket('udp4');
server.on('message', (msg, rinfo) => {
console.log(`Received: ${msg} from ${rinfo.address}:${rinfo.port}`);
});
server.bind(41234, () => console.log('UDP server bound'));
// Client
const client = dgram.createSocket('udp4');
client.send('Hello UDP!', 41234, 'localhost', (err) => {
if (err) console.error(err);
client.close();
});
Note: No guarantees on delivery/order; use for logging, gaming.
108. What is the querystring module?
The querystring module parses and stringifies URL query strings, handling encoding/decoding of key-value pairs.
const querystring = require('querystring');
const query = 'name=John&age=30';
const parsed = querystring.parse(query);
console.log(parsed); // { name: 'John', age: '30' }
const obj = { city: 'New York', zip: 10001 };
const stringified = querystring.stringify(obj);
console.log(stringified); // city=New%20York&zip=10001
Note: Legacy API superseded by URLSearchParams; keep only for existing code.
109. How do you parse URL queries?
Parse URL queries using URLSearchParams or querystring.parse(). Extract from req.url in HTTP servers.
const http = require('http');
const url = require('url');
http.createServer((req, res) => {
const parsedUrl = url.parse(req.url, true); // true for query object
const query = parsedUrl.query;
console.log('Query params:', query); // e.g., { search: 'node' }
res.end('Parsed');
}).listen(3000);
// Modern: URLSearchParams
const params = new URLSearchParams('foo=bar&baz=qux');
console.log(params.get('foo')); // bar
Note: Handle arrays with multiple same keys.
110. Explain the url module.
The url module parses and formats URLs, providing components like protocol, host, path, query.
const url = require('url');
const myUrl = 'https://user:pass@example.com:8080/path?query=1#hash';
const parsed = url.parse(myUrl, true);
console.log(parsed.hostname); // example.com
console.log(parsed.query); // { query: '1' }
const formatted = url.format({
protocol: 'https',
hostname: 'example.com',
pathname: '/path'
});
console.log(formatted); // https://example.com/path
Note: Use WHATWG URL API for modern parsing: new URL(myUrl).
111. What is the zlib module for compression?
The zlib module provides compression/decompression using Gzip, Deflate, Brotli. Useful for reducing data size in transfers.
const zlib = require('zlib');
const input = 'Compress me!';
// Gzip compress
zlib.gzip(input, (err, buffer) => {
if (err) console.error(err);
console.log('Compressed:', buffer.toString('base64'));
// Decompress
zlib.gunzip(buffer, (err, output) => {
console.log('Decompressed:', output.toString()); // Compress me!
});
});
Note: Asynchronous; use promises for cleaner code.
112. How do you compress responses?
Compress HTTP responses using compression middleware in Express or zlib with http. Check Accept-Encoding header.
const express = require('express');
const compression = require('compression');
const app = express();
app.use(compression()); // Auto-compress responses
app.get('/', (req, res) => {
res.send('Compressed response');
});
app.listen(3000);
// Manual with zlib
const http = require('http');
const zlib = require('zlib');
http.createServer((req, res) => {
if (req.headers['accept-encoding']?.includes('gzip')) {
res.writeHead(200, { 'Content-Encoding': 'gzip' });
const output = zlib.createGzip();
output.pipe(res);
output.end('Data to compress');
} else {
res.end('Uncompressed');
}
}).listen(8080);
Note: Compress text-based content; skip for images.
113. Explain assertions in Node.js.
Assertions test expected conditions, throwing errors if false. Used for debugging, validation in tests/scripts.
Node's assert module provides functions like assert.equal(), assert.strictEqual().
const assert = require('assert');
assert.equal(1 + 1, 2, 'Math error'); // Passes
// assert.equal(1 + 1, 3); // Throws AssertionError: Math error
Note: Not for production error handling; use in tests (e.g., Mocha).
114. What is the assert module?
The assert module is built-in for writing tests/assertions. It includes methods for equality, deep equality, throws, etc.
const assert = require('assert').strict; // Strict mode
assert.deepStrictEqual({ a: 1 }, { a: 1 }); // Passes
assert.throws(() => { throw new Error('Boom'); }, Error); // Passes
Note: Strict mode avoids type coercion; export for unit tests.
115. How do you validate inputs?
Validate inputs using libraries like Joi, Validator.js, or custom checks. Prevent injection, ensure types/formats.
const Joi = require('joi');
const schema = Joi.object({
username: Joi.string().alphanum().min(3).max(30).required(),
email: Joi.string().email().required(),
age: Joi.number().integer().min(18)
});
const { error, value } = schema.validate({ username: 'john', email: 'john@example.com', age: 25 });
if (error) {
console.error('Validation error:', error.details[0].message);
} else {
console.log('Valid:', value);
}
Custom example:
function validateEmail(email) {
const re = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
return re.test(email);
}
Note: Validate on server; sanitize to prevent XSS.
116. What is Joi for validation?
Joi is a popular schema description language and data validator for JavaScript, commonly used in Node.js for validating request bodies, query parameters, and other inputs in APIs. It allows defining schemas with rules for types, formats, and constraints, providing detailed error messages on validation failures.
Benefits include chainable API, support for complex validations, and integration with frameworks like Hapi or Express.
// Install: npm install joi
const Joi = require('joi');
const schema = Joi.object({
username: Joi.string().alphanum().min(3).max(30).required(),
email: Joi.string().email({ minDomainSegments: 2 }).required(),
age: Joi.number().integer().min(18).optional()
});
const data = { username: 'john_doe', email: 'john@example.com', age: 25 };
const { error, value } = schema.validate(data);
if (error) {
console.error('Validation failed:', error.details[0].message); // e.g., "username" must only contain alpha-numeric characters
} else {
console.log('Validated data:', value);
}
Note: Use abortEarly: false in options to get all errors at once.
117. Explain data serialization.
Data serialization is the process of converting complex data structures (like objects, arrays) into a format that can be easily stored or transmitted, such as strings or binary data. In Node.js, common formats include JSON, BSON, or custom binary formats. Deserialization reverses this.
It's crucial for APIs, databases, caching, and inter-process communication to ensure data integrity and compatibility.
// JSON serialization
const data = { name: 'John', age: 30, hobbies: ['reading', 'coding'] };
const serialized = JSON.stringify(data);
console.log(serialized); // '{"name":"John","age":30,"hobbies":["reading","coding"]}'
// Deserialization
const deserialized = JSON.parse(serialized);
console.log(deserialized.name); // John
// Custom with Buffer (binary)
const buffer = Buffer.from('Hello');
console.log(buffer.toString('hex')); // 48656c6c6f (serialized)
Note: JSON doesn't support cycles or functions; use libraries like msgpack for binary efficiency.
118. What is the vm module?
The vm (Virtual Machine) module in Node.js allows compiling and running code within V8 virtual machine contexts, providing sandboxed execution environments. It's useful for evaluating untrusted code or creating isolated scopes.
It supports contexts, scripts, and timeouts for safety.
const vm = require('vm');
const code = 'a + b';
const context = { a: 1, b: 2 };
vm.createContext(context); // Sandbox
const result = vm.runInContext(code, context);
console.log(result); // 3
Note: Not fully secure; avoid for highly untrusted code without additional isolation.
119. How do you run code in a sandbox?
Run code in a sandbox using the vm module to create isolated contexts, limiting access to globals. For stricter isolation, use worker threads or external processes.
const vm = require('vm');
const untrustedCode = 'globalVar = "hacked"; 42;'; // Attempt to modify global; the last expression is the result (a top-level return would be a SyntaxError)
const sandbox = { console }; // Limited globals
vm.createContext(sandbox);
try {
const result = vm.runInContext(untrustedCode, sandbox, { timeout: 1000 });
console.log('Result:', result); // 42
} catch (err) {
console.error('Execution error:', err.message);
}
console.log(typeof globalVar); // undefined (not modified)
Note: Freeze sandbox objects with Object.freeze() for extra safety.
120. What is the repl module for programmatic REPL?
The repl (Read-Eval-Print Loop) module allows creating custom REPL servers programmatically, useful for embedding interactive shells in applications or customizing Node's REPL behavior.
It handles input, evaluation, and output with options for prompts, writers, and completers.
const repl = require('repl');
const myRepl = repl.start({
prompt: 'my-app> ',
eval: (cmd, context, filename, callback) => {
// Custom eval logic
callback(null, eval(cmd));
}
});
// Add custom command
myRepl.defineCommand('sayhello', {
help: 'Say hello',
action(name) {
this.clearBufferedCommand();
console.log(`Hello, ${name}!`);
this.displayPrompt();
}
});
Note: Use for CLI tools or debugging interfaces.
121. Explain domain module deprecation.
The domain module was introduced for handling multiple asynchronous operations as a group, capturing errors across them. It has been deprecated since Node.js v1.4.2 (2015) due to design flaws, potential memory leaks, and inconsistent behavior.
It still ships with Node but is pending removal; avoid it in new code. Alternatives: async_hooks/AsyncLocalStorage, or structured error handling with try-catch and Promises.
Reason: Domains didn't integrate well with Promises and could lead to unhandled errors.
// Old usage (not recommended)
const domain = require('domain');
const d = domain.create();
d.on('error', (err) => console.error('Caught:', err));
d.run(() => {
setTimeout(() => { throw new Error('Boom'); }, 100);
});
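A modern replacement sketch: with promise-based timers and async/await, the same error becomes catchable with an ordinary try-catch:
const { setTimeout: sleep } = require('timers/promises');
(async () => {
try {
await sleep(100);
throw new Error('Boom');
} catch (err) {
console.error('Caught:', err.message); // Caught: Boom
}
})();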
Note: Migrate to async/await with try-catch for better error management.
122. What is the readline module?
The readline module provides an interface for reading data from readable streams (like process.stdin) line-by-line, useful for CLI applications, REPLs, or processing files.
It emits events for lines, close, and supports history, prompts.
const readline = require('readline');
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: true
});
rl.setPrompt('Enter text> ');
rl.prompt();
rl.on('line', (input) => {
console.log(`Received: ${input}`);
if (input === 'exit') rl.close();
});
rl.on('close', () => {
console.log('Goodbye!');
process.exit(0);
});
Note: Handles backspace, history with up/down arrows in terminal mode.
123. How do you read user input?
Read user input using process.stdin for raw data or readline for line-based processing. For prompts, use readline.question() or inquirer for advanced CLIs.
const readline = require('readline');
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout
});
rl.question('What is your name? ', (name) => {
console.log(`Hello, ${name}!`);
rl.close();
});
// Raw stdin
process.stdin.setEncoding('utf8');
process.stdin.on('data', (chunk) => {
console.log(`Input: ${chunk.trim()}`);
});
Note: Pause/resume stdin as needed; handle SIGINT for clean exit.
124. Explain the string_decoder module.
The string_decoder module decodes Buffer objects into strings, handling multi-byte characters split across chunks, preventing invalid UTF-8 sequences.
Useful in streams where data arrives in parts.
const { StringDecoder } = require('string_decoder');
const decoder = new StringDecoder('utf8');
const euro = Buffer.from([0xE2, 0x82]); // First 2 of the 3 bytes of €
console.log(decoder.write(euro)); // '' (incomplete)
const euro2 = Buffer.from([0xAC]);
console.log(decoder.write(euro2)); // € (complete)
console.log(decoder.end()); // Flush remaining
Note: Preferred over buffer.toString() for chunked data to avoid corruption.
125. What is TLS for secure connections?
TLS (Transport Layer Security) is a protocol for encrypting and authenticating network connections, successor to SSL. In Node.js, it's used via tls module for secure sockets, ensuring data privacy and integrity.
Handles handshakes, certificates, and ciphers.
const tls = require('tls');
const fs = require('fs');
const options = {
key: fs.readFileSync('server-key.pem'),
cert: fs.readFileSync('server-cert.pem'),
rejectUnauthorized: false // Relaxes client-cert checks; testing only
};
const server = tls.createServer(options, (socket) => {
console.log('Secure connection established');
socket.write('Hello secure world');
socket.end();
});
server.listen(8000, () => console.log('TLS server listening'));
Note: Always validate certificates in production.
126. How do you create self-signed certificates?
Create self-signed certificates using OpenSSL command-line tool, then use in Node.js for development/testing. Not for production due to trust issues.
// Command-line (requires OpenSSL)
openssl req -newkey rsa:2048 -nodes -keyout key.pem -x509 -days 365 -out cert.pem -subj "/CN=localhost"
// In Node.js: Use as in tls example above
// Programmatic (with node-forge): npm install node-forge
const forge = require('node-forge');
const fs = require('fs');
const keys = forge.pki.rsa.generateKeyPair(2048);
const cert = forge.pki.createCertificate();
cert.publicKey = keys.publicKey;
cert.serialNumber = '01';
cert.validity.notBefore = new Date();
cert.validity.notAfter = new Date();
cert.validity.notAfter.setFullYear(cert.validity.notBefore.getFullYear() + 1);
const attrs = [{ name: 'commonName', value: 'localhost' }];
cert.setSubject(attrs);
cert.setIssuer(attrs);
cert.sign(keys.privateKey);
const pemCert = forge.pki.certificateToPem(cert);
const pemKey = forge.pki.privateKeyToPem(keys.privateKey);
fs.writeFileSync('cert.pem', pemCert);
fs.writeFileSync('key.pem', pemKey);
Note: Add to trust store for browsers; use Let's Encrypt for free trusted certs.
127. Explain performance bottlenecks in Node.js.
Performance bottlenecks in Node.js include blocking the Event Loop with sync operations, memory leaks, inefficient queries, unoptimized code, and high concurrency without clustering.
- Sync I/O: Blocks thread.
- CPU-intensive tasks: Use workers.
- Memory: Large objects, uncleared intervals.
- Network/DB: Slow responses amplify.
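A minimal sketch of the most common culprit, synchronous I/O blocking the Event Loop (the file name is hypothetical):
const fs = require('fs');
// Blocking: nothing else runs while the file is read
const data = fs.readFileSync('big.json', 'utf8'); // Blocks until fully read
// Non-blocking: the Event Loop stays free for other work
fs.readFile('big.json', 'utf8', (err, contents) => {
if (err) return console.error(err);
console.log('Loaded', contents.length, 'chars');
});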
Note: Profile regularly; use async everywhere possible.
128. What is the heap snapshot?
A heap snapshot is a capture of the V8 heap memory at a point in time, used to analyze memory usage, find leaks, and inspect objects/references.
Generate with v8.writeHeapSnapshot(), the heapdump module, or Chrome DevTools via --inspect.
// With heapdump module
const heapdump = require('heapdump');
heapdump.writeSnapshot('myapp.heapsnapshot'); // Analyze in Chrome
// Via process
process.memoryUsage(); // Basic info, not snapshot
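// Built-in alternative (Node v11.13+), no extra dependency:
// require('v8').writeHeapSnapshot(); // Writes a .heapsnapshot file and returns its name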
Note: Compare snapshots to identify growing objects.
129. How do you monitor CPU usage?
Monitor CPU usage using os.loadavg() for system load, process.cpuUsage() for process-specific, or tools like PM2, New Relic, or Clinic.js.
const os = require('os');
setInterval(() => {
const usage = process.cpuUsage();
console.log('CPU Usage:', usage); // { user, system } in microseconds, cumulative since process start
const load = os.loadavg();
console.log('Load Average:', load); // [1min, 5min, 15min]
}, 1000);
Note: A load average above the number of CPU cores indicates a bottleneck.
130. Explain load balancing in Node.js.
Load balancing distributes incoming requests across multiple Node.js instances or servers to improve scalability and reliability. In Node.js, use cluster module for internal balancing or external like Nginx/HAProxy.
// With cluster
const cluster = require('cluster');
const os = require('os');
if (cluster.isMaster) {
for (let i = 0; i < os.cpus().length; i++) {
cluster.fork();
}
} else {
// Worker: http server
require('http').createServer((req, res) => res.end('Hello')).listen(8000);
}
Node's cluster uses round-robin; external balancers offer sticky sessions.
Note: Share state via Redis; monitor health checks.
131. What is Nginx as a reverse proxy?
Nginx is a high-performance web server that can act as a reverse proxy, sitting between clients and backend servers (like Node.js apps). It forwards client requests to the appropriate server and returns responses, providing load balancing, SSL termination, caching, and security features.
As a reverse proxy, it hides backend details, handles static files efficiently, and can compress responses or add headers.
# Example Nginx config (nginx.conf)
server {
listen 80;
server_name example.com;
location / {
proxy_pass http://localhost:3000; # Forward to Node.js app
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
Note: Reload Nginx with nginx -s reload after config changes.
132. How do you handle graceful shutdown?
Graceful shutdown in Node.js involves closing connections, finishing tasks, and exiting cleanly on signals like SIGTERM. Listen for signals, close servers, and handle pending operations to avoid data loss or abrupt terminations.
const http = require('http');
let shuttingDown = false;
const server = http.createServer((req, res) => {
if (shuttingDown) {
res.statusCode = 503;
res.setHeader('Connection', 'close');
return res.end('Server shutting down'); // Reject new requests during shutdown
}
res.end('Hello');
});
server.listen(3000, () => console.log('Server running'));
process.on('SIGTERM', () => {
console.log('SIGTERM received. Shutting down gracefully');
shuttingDown = true;
server.close(() => {
console.log('Server closed');
// Close DB connections, etc.
process.exit(0);
});
// Force exit after timeout; unref() so the timer alone doesn't keep the process alive
setTimeout(() => {
console.error('Force shutdown');
process.exit(1);
}, 10000).unref(); // 10s timeout
});
Note: In production, use PM2 or Kubernetes for orchestrated shutdowns.
133. Explain signal handling.
Signal handling in Node.js uses process.on() to listen for OS signals like SIGINT (Ctrl+C), SIGTERM (termination), SIGHUP (reload). Handlers perform cleanup or restarts.
Common signals: SIGTERM for graceful stop, SIGINT for interrupt, SIGUSR2 for debugging.
process.on('SIGINT', () => {
console.log('SIGINT received. Exiting...');
// Cleanup
process.exit(0);
});
process.on('SIGTERM', () => {
console.log('SIGTERM received. Graceful shutdown');
// Close resources
process.exit(0);
});
process.on('SIGHUP', () => {
console.log('SIGHUP received. Reloading config');
// Reload logic
});
Note: Some signals like SIGKILL can't be caught; use for container orchestration.
134. What is the cluster module's isMaster property?
The cluster module's isMaster (deprecated, now isPrimary) property is a boolean indicating if the current process is the primary (master) process that forks workers. In workers, it's false.
Use to branch logic: primary forks, workers run app code.
const cluster = require('cluster');
if (cluster.isMaster) { // Or cluster.isPrimary in newer versions
console.log('Primary process:', process.pid);
cluster.fork(); // Fork worker
} else {
console.log('Worker process:', process.pid);
// App code here
}
Note: Deprecated in v16+; use cluster.isPrimary instead.
135. How do you share ports in clusters?
In clusters, workers call listen() on the same port; under the hood the primary owns the listening socket and shares it (or the connections) with workers via IPC, so there are no port conflicts.
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
} else {
http.createServer((req, res) => {
res.writeHead(200);
res.end(`Hello from worker ${process.pid}`);
}).listen(8000); // Shared port
}
Note: By default the primary distributes connections round-robin (except on Windows); set cluster.schedulingPolicy = cluster.SCHED_NONE to let the OS decide.
136. Explain sticky sessions.
Sticky sessions (session affinity) route client requests to the same backend server based on session ID or IP, ensuring session data consistency in load-balanced environments.
In Node.js clusters, not natively supported; use external proxies like Nginx with ip_hash or cookies.
# Nginx config for sticky sessions
upstream node_app {
ip_hash; # Hash client IP
server localhost:3000;
server localhost:3001;
}
server {
location / {
proxy_pass http://node_app;
}
}
Note: Prefer stateless apps with shared stores like Redis for sessions.
137. What is the http2 module?
The http2 module enables HTTP/2 protocol support in Node.js, allowing multiplexed streams, header compression, and server push for better performance over HTTP/1.1.
It provides compatible APIs with http but with HTTP/2 features.
const http2 = require('http2');
const fs = require('fs');
const server = http2.createSecureServer({
key: fs.readFileSync('key.pem'),
cert: fs.readFileSync('cert.pem')
});
server.on('stream', (stream, headers) => {
stream.respond({
'content-type': 'text/plain',
':status': 200
});
stream.end('Hello HTTP/2');
});
server.listen(8443);
Note: Requires HTTPS; use allowHTTP1: true for fallback.
138. How does HTTP/2 improve performance?
HTTP/2 improves performance via multiplexing (multiple requests over one connection), header compression (HPACK), server push (preemptive resource sending), binary protocol (efficient parsing), and prioritization.
Reduces latency, eliminates head-of-line blocking in HTTP/1.1.
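As one concrete illustration, a server-push sketch using the http2 module (assumes the secure server from the previous question; the pushed path and body are made up, and browsers have largely dropped push, so multiplexing and header compression carry most of the real-world benefit):
// Inside an http2 server's 'stream' handler
server.on('stream', (stream, headers) => {
if (headers[':path'] === '/') {
stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
if (err) return console.error(err);
pushStream.respond({ ':status': 200, 'content-type': 'text/css' });
pushStream.end('body { color: black; }'); // Sent before the client asks
});
stream.respond({ ':status': 200, 'content-type': 'text/html' });
stream.end('<link rel="stylesheet" href="/style.css">');
}
});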
Note: Benefits most in high-latency networks; measure with tools like Lighthouse.
139. Explain multiplexing in HTTP/2.
Multiplexing in HTTP/2 allows multiple requests/responses over a single TCP connection simultaneously via streams, each with independent frames. Avoids HTTP/1.1's queuing.
Streams have IDs, priorities, and can be interleaved.
// Client example
const http2 = require('http2');
const client = http2.connect('https://example.com');
const req1 = client.request({ ':path': '/resource1' });
req1.on('response', (headers) => console.log(headers));
req1.end();
const req2 = client.request({ ':path': '/resource2' }); // Multiplexed
req2.on('response', (headers) => console.log(headers));
req2.end();
Note: Improves page load times by parallelizing asset fetches.
140. What is the inspector module?
The inspector module provides an API for interacting with the V8 inspector, enabling debugging, profiling, and inspection via Chrome DevTools protocol.
Used for programmatic control of debugging sessions.
const inspector = require('inspector');
inspector.open(9229, 'localhost', true); // Open session
// Use chrome://inspect to connect
inspector.close(); // Close when done
Note: Typically used with --inspect flag; module for advanced scenarios.
141. How do you connect to Chrome DevTools?
Connect to Chrome DevTools by running Node with --inspect or --inspect-brk, then open chrome://inspect in Chrome, select the target, and use debugger.
// Run: node --inspect=9229 app.js
// Or --inspect-brk to break on first line
// In code (advanced)
const inspector = require('inspector');
const session = new inspector.Session();
session.connect();
session.post('Runtime.evaluate', { expression: 'console.log("From DevTools")' });
Note: Query http://localhost:9229/json for available targets; connect programmatically via the returned webSocketDebuggerUrl.
142. Explain source maps in Node.js.
Source maps map minified/transpiled code back to original source, aiding debugging by showing original lines/files in stack traces and debuggers.
In Node.js, enable with --enable-source-maps; useful for TypeScript/Babel.
// Run: node --enable-source-maps dist/app.js
// Generate map with tsc
// tsconfig.json: "sourceMap": true
Note: Maps add overhead; use in dev, strip in prod.
143. What is the trace_events module?
The trace_events module allows tracing built-in events like GC, HTTP, promises for performance analysis. Part of diagnostics.
const trace_events = require('trace_events');
const tracing = trace_events.createTracing({
categories: ['v8.gc', 'node.http']
});
tracing.enable();
setTimeout(() => {
tracing.disable();
}, 5000);
Note: Output to console or files; analyze with tools like Chrome Tracing.
144. How do you trace async operations?
Trace async operations using async_hooks or trace_events for hooks like async init, before, after. Logs async context for debugging spans.
const async_hooks = require('async_hooks');
const fs = require('fs');
const hook = async_hooks.createHook({
init(asyncId, type, triggerAsyncId) {
fs.writeSync(1, `Init ${type} ${asyncId} from ${triggerAsyncId}\n`);
},
destroy(asyncId) {
fs.writeSync(1, `Destroy ${asyncId}\n`);
}
});
hook.enable();
setTimeout(() => {}, 100); // Traced
Note: High overhead; use sparingly in production.
145. Explain custom inspectors.
Custom inspectors extend Node's inspector API to add custom debugging panels or data in Chrome DevTools via custom protocol messages or domains.
Involves registering custom domains and handling methods/events.
const inspector = require('inspector');
const session = new inspector.Session();
session.connect();
session.post('Profiler.enable', () => {
// Custom logic
});
// Fully custom DevTools domains need embedder-level integration; the public inspector API only exposes existing protocol domains
Note: Advanced; used in tools like VS Code extensions.
146. What is the worker_threads module API?
The worker_threads module in Node.js provides an API for creating and managing threads to run JavaScript in parallel, enabling multi-threading for CPU-intensive tasks. It includes classes like Worker, MessageChannel, and functions for sharing memory.
Key components:
- Worker: Creates a new thread with a script.
- isMainThread: Boolean to check if current thread is main.
- parentPort: Communication port in workers.
- workerData: Data passed to worker on creation.
- MessagePort: For inter-thread messaging.
const { Worker, isMainThread, workerData } = require('worker_threads');
if (isMainThread) {
const worker = new Worker(__filename, { workerData: 'data' });
} else {
console.log('Worker received:', workerData);
}
Note: Available since Node.js v10.5; use for tasks like crypto, not I/O.
147. How do you communicate between threads?
Communicate between threads using postMessage() on MessagePort objects. The main thread uses worker.postMessage(), workers use parentPort.postMessage(). Data is structured-cloned (like postMessage in browsers).
Listen with 'message' event.
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');
if (isMainThread) {
const worker = new Worker(__filename, { workerData: { num: 5 } });
worker.on('message', (msg) => console.log('From worker:', msg)); // Receive
worker.postMessage('Hello from main'); // Send
} else {
parentPort.on('message', (msg) => console.log('From main:', msg)); // Receive
const result = workerData.num * 2;
parentPort.postMessage(result); // Send
}
Note: Avoid sending large data; use SharedArrayBuffer for shared memory.
148. Explain MessageChannel.
MessageChannel creates a pair of MessagePort objects for bidirectional communication between threads or processes. Useful for direct worker-to-worker comms without main thread relay.
Ports are port1 and port2; transfer one via postMessage with transfer list.
const { Worker, isMainThread, parentPort, MessageChannel } = require('worker_threads');
if (isMainThread) {
const worker1 = new Worker(__filename);
const worker2 = new Worker(__filename);
const { port1, port2 } = new MessageChannel();
worker1.postMessage({ port: port1 }, [port1]); // Transfer port1 to worker1
worker2.postMessage({ port: port2 }, [port2]); // Transfer port2 to worker2
} else {
parentPort.on('message', (msg) => {
const port = msg.port;
port.on('message', (data) => console.log('Received on port:', data));
port.postMessage('Hello from other worker');
});
}
Note: Ports are transferable; close when done to free resources.
149. What is Atomics for thread synchronization?
Atomics provides atomic operations on shared memory (SharedArrayBuffer) for thread synchronization, ensuring operations like add, store are indivisible to prevent race conditions.
Includes methods like load, store, add, sub, wait, notify.
const { Worker, isMainThread, workerData } = require('worker_threads');
if (isMainThread) {
const sab = new SharedArrayBuffer(4);
const int32 = new Int32Array(sab);
const worker = new Worker(__filename, { workerData: sab });
Atomics.store(int32, 0, 123); // Store value
console.log(Atomics.load(int32, 0)); // 123
} else {
const int32 = new Int32Array(workerData);
Atomics.add(int32, 0, 1); // Atomic increment
}
Note: Use with SharedArrayBuffer; wait/notify for signaling.
150. How do you avoid race conditions?
Avoid race conditions by using Atomics for atomic ops, mutex-like locks with Atomics.wait/notify, or message passing instead of shared state. Design threads to minimize shared mutable data.
// Using Atomics for lock
const sab = new SharedArrayBuffer(4);
const lock = new Int32Array(sab);
function acquireLock() {
while (Atomics.compareExchange(lock, 0, 0, 1) !== 0) {} // Spinlock
}
function releaseLock() {
Atomics.store(lock, 0, 0);
}
// In thread
acquireLock();
// Critical section
releaseLock();
Alternative: Use worker pools with task queues.
Note: Prefer immutable data; test with multiple threads.
151. Explain shared memory.
Shared memory in Node.js uses SharedArrayBuffer, allowing multiple threads to access the same memory buffer concurrently. Views like Int32Array provide typed access.
Enables fast data sharing without cloning, but requires synchronization to avoid races.
const { Worker, isMainThread, workerData } = require('worker_threads');
if (isMainThread) {
const sab = new SharedArrayBuffer(1024); // 1KB shared
const worker = new Worker(__filename, { workerData: sab });
const view = new Uint8Array(sab);
view[0] = 42; // Write
} else {
const view = new Uint8Array(workerData);
console.log(view[0]); // 42 (read shared)
}
Note: Enabled by default since v10.5; size fixed on creation.
152. What is the perf_hooks module?
The perf_hooks module provides performance measurement APIs, including PerformanceObserver, performance.now(), and hooks for monitoring Node.js internals like GC, HTTP.
Based on browser Performance API.
const { performance, PerformanceObserver } = require('perf_hooks');
const obs = new PerformanceObserver((items) => {
items.getEntries().forEach((entry) => console.log(entry));
performance.clearMarks();
});
obs.observe({ entryTypes: ['measure'] });
performance.mark('start');
setTimeout(() => {
performance.mark('end');
performance.measure('MyMeasure', 'start', 'end');
}, 1000);
Note: Use for benchmarking; doesn't affect runtime.
153. How do you measure execution time?
Measure execution time using performance.now() for high-resolution timestamps, or console.time()/timeEnd() for simple logging.
const { performance } = require('perf_hooks');
// High-res
const start = performance.now();
for (let i = 0; i < 1e6; i++) {} // Work
const end = performance.now();
console.log(`Execution time: ${end - start} ms`);
// Console method
console.time('loop');
for (let i = 0; i < 1e6; i++) {}
console.timeEnd('loop'); // loop: 1.23ms
Note: performance.now() is monotonic; avoid Date.now() when precision matters.
154. Explain histogram for performance.
Histograms in perf_hooks record the distribution of values, like latencies, exposing statistics such as mean, min, max, stddev, and percentiles. Create one with createHistogram().
const { createHistogram } = require('perf_hooks');
const hist = createHistogram(); // Optionally pass { lowest, highest, figures }
hist.record(100); // Record a value (e.g., a latency in ms)
hist.record(200);
console.log(`Mean: ${hist.mean}, P95: ${hist.percentile(95)}`);
Relatedly, monitorEventLoopDelay() returns a histogram that samples event-loop delay.
Note: Since v15; useful for metrics in production.
155. What is the diagnostics_channel?
diagnostics_channel is a module for publishing and subscribing to diagnostic events in Node.js, like HTTP requests, without overhead when unsubscribed.
Channels are named, publish data objects.
const diagnostics_channel = require('diagnostics_channel');
const channel = diagnostics_channel.channel('my-channel');
channel.subscribe((message, name) => {
console.log(`[${name}]`, message);
});
if (channel.hasSubscribers) {
channel.publish({ some: 'data' });
}
Note: Low-overhead alternative to async_hooks.
156. How do you subscribe to events?
Subscribe to events in diagnostics_channel using channel.subscribe(callback), where callback receives message and channel name.
const dc = require('diagnostics_channel');
// Built-in channel (recent Node versions): http.server.request.start
const reqChannel = dc.channel('http.server.request.start');
reqChannel.subscribe(({ request }) => {
console.log('Request:', request.url);
});
// Custom
const myChannel = dc.channel('custom');
myChannel.subscribe((msg) => console.log(msg));
myChannel.publish('Event occurred');
Note: Unsubscribe with channel.unsubscribe(callback).
157. Explain async_hooks for tracking async resources.
async_hooks tracks asynchronous resources' lifecycle (promises, timeouts) with hooks like init, before, after, destroy. Provides execution context via AsyncLocalStorage.
const async_hooks = require('async_hooks');
const hook = async_hooks.createHook({
init(asyncId, type, triggerAsyncId, resource) {
console.log(`Init ${type} ${asyncId}`);
},
destroy(asyncId) {
console.log(`Destroy ${asyncId}`);
}
});
hook.enable();
setTimeout(() => {}, 10); // Triggers hooks
Note: High overhead; use for debugging.
158. What is the executionAsyncId?
executionAsyncId() returns the current async execution context ID, identifying the active async resource. Useful for logging or tracing chains.
const async_hooks = require('async_hooks');
console.log(async_hooks.executionAsyncId()); // 1 (main)
setImmediate(() => {
console.log(async_hooks.executionAsyncId()); // New ID
});
Note: Paired with triggerAsyncId() for parent-child.
159. How do you debug async contexts?
Debug async contexts using async_hooks to log lifecycle, or AsyncLocalStorage for context propagation. Tools like Clinic.js visualize.
const async_hooks = require('async_hooks');
const { AsyncLocalStorage } = async_hooks;
const als = new AsyncLocalStorage();
als.run({ traceId: '123' }, () => {
setTimeout(() => {
console.log(als.getStore().traceId); // 123 (propagated)
}, 100);
});
Note: Enable with --async-stack-traces for better stacks.
160. Explain promise hooks.
Promise hooks in async_hooks track the promise lifecycle: init (creation), before (just before a continuation runs), after (once it has run), and promiseResolve (when the promise resolves).
const async_hooks = require('async_hooks');
const hook = async_hooks.createHook({
init(asyncId, type, trigger) {
if (type === 'PROMISE') console.log(`Promise init ${asyncId}`);
},
promiseResolve(asyncId) {
console.log(`Promise resolved ${asyncId}`);
}
});
hook.enable();
Promise.resolve(42); // Triggers
Note: Specific to promises; part of async_hooks.
161. What is the undici library for HTTP?
Undici is a modern, high-performance HTTP/1.1 client library for Node.js, designed to be fast, secure, and compliant with standards. It powers Node.js's built-in fetch API and provides features like connection pooling, pipelining, and async iterators for streaming responses.
It's used for making HTTP requests efficiently, especially in server-side environments.
// Install: npm install undici
const { request } = require('undici');
async function fetchData() {
const response = await request('https://example.com');
const body = await response.body.json(); // Streamable
console.log(body);
}
fetchData();
Note: Undici is the engine behind Node's global fetch (experimental from v17.5, on by default since v18).
162. How does it differ from built-in http?
Undici differs from the built-in http module by being promise-based, supporting modern features like async/await, streaming, and better performance with connection reuse. The http module is callback-based, lower-level, and lacks built-in promise support.
- http: Event-driven, manual handling of chunks, no built-in JSON parsing.
- undici: Higher-level, fetch-like API, automatic redirects, better error handling.
// Built-in http
const http = require('http');
http.get('http://example.com', (res) => {
let data = '';
res.on('data', (chunk) => data += chunk);
res.on('end', () => console.log(data));
});
// Undici
const { request } = require('undici');
request('http://example.com').then((res) => res.body.text()).then(console.log);
Note: Undici is faster for multiple requests due to pooling.
163. Explain fetch API in Node.js.
The fetch API in Node.js is a web-standard API for making HTTP requests, similar to browsers. It returns Promises, supports streaming, and handles JSON natively. Available globally since Node.js v18 and marked stable in v21.
async function getData() {
const response = await fetch('https://api.example.com/data');
if (!response.ok) throw new Error('Failed');
const data = await response.json();
console.log(data);
}
getData();
Note: Uses undici under the hood; enable with --no-experimental-fetch if needed.
164. What is the experimental fetch?
Experimental fetch refers to the initial implementation of the fetch API in Node.js (v17.5+), behind the --experimental-fetch flag. It became enabled by default in v18 and was marked stable in v21.
During the experimental phase, it tested compatibility with web standards.
// Old: node --experimental-fetch app.js
// Now: Just node app.js
fetch('https://example.com').then(res => res.text()).then(console.log);
Note: As of 2025, it's fully stable; check node --help for flags.
165. How do you handle large payloads?
Handle large payloads by streaming: read fetch's response.body (or undici's body stream) in chunks to avoid loading everything into memory, and process with TransformStreams.
async function handleLarge() {
const response = await fetch('https://example.com/large');
const reader = response.body.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
// Process chunk (Uint8Array)
console.log('Chunk:', value.length);
}
}
handleLarge();
Note: Use AbortController for timeouts on large downloads.
166. Explain AbortController.
AbortController allows canceling fetch requests or other async operations via signals. Create a controller, pass signal to fetch, call abort() to cancel.
const controller = new AbortController();
const signal = controller.signal;
fetch('https://example.com', { signal })
.then(res => res.text())
.then(console.log)
.catch(err => {
if (err.name === 'AbortError') console.log('Aborted');
});
// Cancel after 5s
setTimeout(() => controller.abort(), 5000);
Note: Works with DOM-like APIs in Node; useful for timeouts.
167. What is the web streams API?
The web streams API provides standard interfaces for streaming data: ReadableStream, WritableStream, TransformStream. Enables efficient handling of large data in chunks.
const readable = new ReadableStream({
start(controller) {
controller.enqueue('chunk1');
controller.enqueue('chunk2');
controller.close();
}
});
const reader = readable.getReader();
reader.read().then(({ value }) => console.log(value)); // chunk1
Note: Integrated with fetch for response.body.
168. How do you implement backpressure?
Implement backpressure in streams by pausing reading when the consumer can't keep up, using highWaterMark and signals in Readable/WritableStreams.
const { ReadableStream, WritableStream } = require('stream/web');
const readable = new ReadableStream(/* ... */);
const writable = new WritableStream({
write(chunk) {
// Process; return Promise if slow
return new Promise(resolve => setTimeout(resolve, 100));
}
});
readable.pipeTo(writable); // Auto-handles backpressure
Note: pipeTo() handles backpressure automatically; for manual control, use a reader/writer pair.
169. Explain transform streams.
Transform streams modify or transform data as it passes through. In the web streams API, a TransformStream pairs a writable side with a readable side, and its transform() callback processes each chunk.
const { TransformStream } = require('stream/web');
const upperCase = new TransformStream({
transform(chunk, controller) {
controller.enqueue(chunk.toString().toUpperCase());
}
});
const readable = new ReadableStream(/* data */);
readable.pipeThrough(upperCase).pipeTo(/* writable */);
Note: Useful for compression, encryption on-the-fly.
170. What is the test runner in Node.js?
The test runner is a built-in module (node:test) for running tests, supporting TAP output, async tests, and suites. Run with node --test.
const test = require('node:test');
const assert = require('node:assert');
test('Math test', async (t) => {
await t.test('Addition', () => {
assert.strictEqual(1 + 1, 2);
});
});
Note: Since v18; integrates with coverage.
171. How do you write TAP tests?
Write TAP (Test Anything Protocol) tests using node:test, which outputs TAP. Define tests with test(), use assert for checks.
// Run: node --test test.js
const test = require('node:test');
const assert = require('node:assert/strict');
test('Suite', (t) => {
t.test('Subtest', () => {
assert.equal(2 * 2, 4, 'Math works');
});
});
Note: TAP is human-readable; use reporters for formats.
172. Explain coverage reports.
Coverage reports show tested code percentage. In Node.js, use --experimental-test-coverage with test runner, outputting V8 coverage.
// Run: node --test --experimental-test-coverage
// In code: Tests as above
// Output: Lines, branches, functions covered
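If you prefer an external tool, c8 collects the same V8 coverage with configurable reporters (a sketch; exact flags may vary by c8 version):
// npx c8 node --test // Text summary in the terminal
// npx c8 --reporter=html node --test // HTML report in ./coverage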
Note: View with c8 or istanbul; aim for 80%+.
173. What is the policy module for security?
Node's experimental policy mechanism enforces security policies like integrity checks for modules, preventing tampering. It's configured via a JSON manifest rather than a requireable module.
// policy.json: {"resources": {"./mod.js": {"integrity": "sha256-..."}}}
// Run: node --experimental-policy=policy.json app.js
// There is no runtime policy API; enforcement happens at startup from the manifest.
Note: Pair with --policy-integrity to verify the manifest itself.
174. How do you set resource limits?
Set resource limits using process.resourceUsage() to monitor, OS tools like ulimit, or V8 flags like --max-old-space-size. In code, worker threads accept per-thread limits (see the sketch below).
console.log(process.resourceUsage()); // { userCPUSeconds, ... }
// External: ulimit -v 100000 # Memory limit
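Worker threads accept per-thread V8 heap limits through the resourceLimits option; a minimal sketch (the script path is hypothetical):
const { Worker } = require('worker_threads');
new Worker('./task.js', {
resourceLimits: { maxOldGenerationSizeMb: 64, stackSizeMb: 4 } // Worker exits with ERR_WORKER_OUT_OF_MEMORY if exceeded
});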
Note: For containers, use Docker limits.
175. Explain URL parsing with WHATWG.
WHATWG URL API is the standard parser in Node.js (since v10), using new URL() for parsing, searching params.
const url = new URL('https://user:pass@example.com/path?query=1#hash');
console.log(url.hostname); // example.com
console.log(url.searchParams.get('query')); // 1
url.searchParams.set('new', 'value');
console.log(url.href); // Updated URL
Note: Replaces legacy url.parse(); stricter parsing.
176. What is the webcrypto API?
The Web Crypto API is a standard interface for performing cryptographic operations in JavaScript, implemented in Node.js via crypto.webcrypto. It provides primitives for key generation, encryption/decryption, signing/verification, hashing, and random number generation, aligning with browser APIs for consistency.
It uses the SubtleCrypto interface for low-level operations, ensuring secure cryptography without exposing sensitive data.
const crypto = require('node:crypto').webcrypto;
console.log(crypto.subtle); // SubtleCrypto instance
Note: Available since Node.js v15; prefer over legacy crypto for web-compatible code.
177. How do you generate keys?
Generate keys using crypto.subtle.generateKey(), specifying the algorithm, extractability, and usages. Supports RSA, EC, AES, HMAC.
const crypto = require('node:crypto').webcrypto;
async function generateRSAKey() {
const keyPair = await crypto.subtle.generateKey(
{
name: 'RSA-OAEP',
modulusLength: 2048,
publicExponent: new Uint8Array([1, 0, 1]),
hash: 'SHA-256'
},
true, // Extractable
['encrypt', 'decrypt'] // Usages
);
console.log(keyPair.privateKey); // CryptoKey
return keyPair;
}
generateRSAKey();
For symmetric:
async function generateAESKey() {
return await crypto.subtle.generateKey(
{ name: 'AES-GCM', length: 256 },
true,
['encrypt', 'decrypt']
);
}
Note: Keys are CryptoKey objects; export with exportKey().
178. Explain subtle crypto operations.
SubtleCrypto provides low-level crypto ops like encrypt, decrypt, sign, verify, digest, deriveKey. Operations are async, Promise-based, and algorithm-specific.
Example: Hashing with digest.
const crypto = require('node:crypto').webcrypto;
async function hashData(data) {
const encoder = new TextEncoder();
const buffer = encoder.encode(data);
const hash = await crypto.subtle.digest('SHA-256', buffer);
return Array.from(new Uint8Array(hash)).map(b => b.toString(16).padStart(2, '0')).join('');
}
hashData('Hello').then(console.log); // 185f8db32271fe25f561a6fc938b2e264306ec304eda518007d1764826381969
Encryption example:
async function encryptData(key, data) {
const iv = crypto.getRandomValues(new Uint8Array(12));
const encoded = new TextEncoder().encode(data);
const ciphertext = await crypto.subtle.encrypt(
{ name: 'AES-GCM', iv },
key,
encoded
);
return { iv, ciphertext };
}
Note: SubtleCrypto keeps raw key material inside CryptoKey objects, limiting accidental exposure.
179. What is the temporal API proposal?
The Temporal API is a TC39 proposal for modern date/time handling in JavaScript, addressing Date object's flaws like mutability and timezone issues. It provides immutable objects like Temporal.ZonedDateTime, Temporal.Duration.
As of November 2025, it's at Stage 3.
Note: Not yet native; use polyfills like @js-temporal/polyfill.
180. How does it handle dates?
Temporal handles dates with classes like Temporal.PlainDate (year-month-day), Temporal.PlainDateTime (with time), Temporal.ZonedDateTime (with timezone). Supports arithmetic, comparisons, formatting.
// With polyfill
const { Temporal } = require('@js-temporal/polyfill');
const date = Temporal.PlainDate.from('2025-11-14');
console.log(date.toString()); // 2025-11-14
const future = date.add({ days: 7 });
console.log(future.toString()); // 2025-11-21
const zoned = Temporal.ZonedDateTime.from('2025-11-14T10:00[America/New_York]');
console.log(zoned.toString()); // 2025-11-14T10:00:00-05:00[America/New_York]
Note: Immutable; chain methods for ops.
181. Explain internationalization.
Internationalization (i18n) in Node.js adapts apps for locales using Intl API for formatting numbers, dates, strings, collation. Supports pluralization, currency.
const num = new Intl.NumberFormat('de-DE', { style: 'currency', currency: 'EUR' }).format(123456.789);
console.log(num); // 123.456,79 €
const date = new Intl.DateTimeFormat('ja-JP', { dateStyle: 'full' }).format(new Date());
console.log(date); // 2025年11月14日金曜日
Note: Requires full-icu for all locales; default small-icu limited.
182. What is the icu module?
ICU (International Components for Unicode) is the library Node.js uses for i18n features via Intl. Node builds with small-icu (basic locales) or full-icu (all).
Enable full-icu with --with-intl=full-icu or npm full-icu.
// Check locales
console.log(Intl.DateTimeFormat.supportedLocalesOf(['en', 'ja', 'ar'])); // With full-icu: ['en', 'ja', 'ar']
Note: Small-icu supports en-US, others partial.
183. How do you localize apps?
Localize apps using Intl for formatting, i18n libraries like i18next for messages. Detect locale from headers/env, load translations.
// With i18next: npm install i18next
const i18n = require('i18next');
i18n.init({
lng: 'en',
resources: {
en: { translation: { hello: 'Hello' } },
fr: { translation: { hello: 'Bonjour' } }
}
});
console.log(i18n.t('hello')); // Hello
i18n.changeLanguage('fr');
console.log(i18n.t('hello')); // Bonjour
Use Intl for dates/numbers within translations.
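Header-based detection might look like this (a minimal sketch with a hypothetical pickLocale helper; production apps typically use middleware such as i18next-http-middleware):
function pickLocale(acceptLanguage = '', supported = ['en', 'fr']) {
// Naive: takes language tags in header order (ignoring q-weights) and matches the first supported one
const requested = acceptLanguage.split(',').map(part => part.split(';')[0].trim().slice(0, 2));
return requested.find(lang => supported.includes(lang)) || 'en';
}
i18n.changeLanguage(pickLocale('fr-CA,fr;q=0.9,en;q=0.8')); // Switches to 'fr'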
Note: Also handle RTL layouts and plurals (e.g., via ICU message format).
184. Explain decorators in Node.js.
Decorators are a TC39 proposal for attaching metadata/annotations to classes, methods, and properties. In Node.js, use them via transpilers like Babel or TypeScript, as native support is still pending.
// With Babel/TypeScript (legacy experimentalDecorators signature; the Stage 3 proposal uses (value, context) instead)
function logged(target, key, descriptor) {
const original = descriptor.value;
descriptor.value = function(...args) {
console.log(`Calling ${key}`);
return original.apply(this, args);
};
}
class Example {
@logged
method() {
return 'done';
}
}
const ex = new Example();
ex.method(); // Logs: Calling method
Note: Enhances frameworks like NestJS.
185. What is the stage of the proposal?
As of November 2025, the decorators proposal is at Stage 3 in TC39.
Note: Stage 3 means candidate: the design is complete and implementations are in progress.
186. How do you use private fields?
Use private fields with a # prefix in classes; they are inaccessible from outside. Supported natively since Node.js v12.
class Counter {
#count = 0; // Private field
increment() {
this.#count++;
}
get value() {
return this.#count;
}
}
const c = new Counter();
c.increment();
console.log(c.value); // 1
// console.log(c.#count); // SyntaxError
Note: Enforces encapsulation.
187. Explain top-level await.
Top-level await allows await at module top-level, making modules async. Supported in Node.js since v14.8 for ESM.
// mod.js (ESM; global fetch requires Node.js 18+)
const data = await fetch('https://example.com').then(res => res.text());
export { data };
// app.js
import { data } from './mod.js';
console.log(data); // Awaits mod loading
Note: Blocks dependent modules until resolved.
188. What is the impact on modules?
Top-level await makes module evaluation async, potentially delaying imports. Circular dependencies with await can deadlock; execution order matters more.
Pros: cleaner async initialization. Cons: can slow startup if overused.
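A minimal sketch of the delay (hypothetical files slow.mjs and main.mjs):
// slow.mjs
export const value = await new Promise(resolve => setTimeout(() => resolve(42), 1000));
// main.mjs
import { value } from './slow.mjs'; // main.mjs does not run until slow.mjs finishes (~1s)
console.log(value); // 42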
Note: Use judiciously; fallback to promises if needed.
189. Explain import assertions.
Import assertions (since renamed to import attributes) specify a module's type, such as JSON, so the loader parses it correctly. The assert syntax is deprecated; newer Node.js versions use the with syntax.
// Old: assert
import data from './data.json' assert { type: 'json' };
// New: with
import data from './data.json' with { type: 'json' };
console.log(data); // Parsed JSON
Note: Fails if type mismatches.
190. How do you validate JSON modules?
Validate JSON modules by importing them with the 'json' type attribute, which automatically parses and throws errors on invalid syntax. For schema validation, use libraries like Ajv or Joi after import.
// data.json must be valid JSON
import data from './data.json' with { type: 'json' }; // Throws if invalid
// Schema validation with Ajv
const Ajv = require('ajv');
const ajv = new Ajv();
const schema = { type: 'object', required: ['name'], properties: { name: { type: 'string' } } };
const validate = ajv.compile(schema);
try {
if (!validate(data)) {
throw new Error('Schema validation failed: ' + JSON.stringify(validate.errors));
}
console.log('Valid data:', data);
} catch (err) {
console.error(err.message);
}
Note: The import step handles syntax validation; add runtime checks for semantics or use TypeScript for type safety.
191. What is the Shadow DOM?
Shadow DOM is a web standard for encapsulating DOM and CSS within web components, creating a scoped subtree hidden from the main document's JavaScript and styles. It enables isolated, reusable components without style conflicts.
In Node.js (server-side), Shadow DOM isn't directly applicable as there's no browser DOM, but libraries like jsdom can simulate it for testing or SSR.
// Browser example (conceptual for Node with jsdom)
const jsdom = require('jsdom');
const { JSDOM } = jsdom;
const dom = new JSDOM('<div id="host"></div>');
const host = dom.window.document.getElementById('host');
const shadow = host.attachShadow({ mode: 'open' }); // Create shadow root
shadow.innerHTML = '<p>Encapsulated content</p><style>p { color: red; }</style>';
console.log(host.shadowRoot.innerHTML); // Access in Node simulation
Note: Use for web components; in Node, mainly for testing browser code.
192. Explain module federation.
Module federation is a Webpack feature allowing multiple JavaScript applications to share code and resources dynamically at runtime, enabling microfrontends where apps load modules from remote bundles. It supports sharing dependencies to reduce bundle sizes and enables distributed development.
- Host: App consuming remote modules.
- Remote: App exposing modules.
// webpack.config.js for remote
const { ModuleFederationPlugin } = require('webpack').container;
module.exports = {
plugins: [
new ModuleFederationPlugin({
name: 'remoteApp',
filename: 'remoteEntry.js',
exposes: { './Module': './src/Module.js' }, // Expose
shared: { react: { singleton: true } } // Share deps
})
]
};
// For host
new ModuleFederationPlugin({
name: 'hostApp',
remotes: { remoteApp: 'remoteApp@http://localhost:3001/remoteEntry.js' } // Consume
})
Usage in host:
import('remoteApp/Module').then(Module => {
// Use Module
});
Note: Improves scalability for large teams; works with other bundlers via plugins.
193. How do you share code dynamically?
Share code dynamically using module federation in Webpack or dynamic imports with ESM. Load modules from remote URLs at runtime, enabling shared libraries across apps.
// Dynamic import from a URL (in Node.js this needs the experimental --experimental-network-imports flag; browsers and Deno support URL imports natively)
async function loadModule() {
const { func } = await import('https://example.com/module.js'); // Load remotely
func();
}
loadModule();
// With federation (as above)
For Node.js server-side, use require() conditionally or ESM dynamic import.
async function dynamicRequire(path) {
const mod = await import(path); // Dynamic
return mod.default;
}
Note: Ensure CORS for remote; use for microservices or plugins.
194. What is the sharp library for images?
The sharp library is a high-performance Node.js module for image processing. Built on libvips, it resizes, crops, converts formats (JPEG, PNG, WebP, AVIF), and optimizes images quickly, making it efficient for server-side manipulation.
// Install: npm install sharp
const sharp = require('sharp');
sharp('input.jpg')
.resize(300, 200) // Resize
.toFormat('webp') // Convert
.toFile('output.webp', (err, info) => {
if (err) console.error(err);
console.log(info); // { format: 'webp', width: 300, height: 200, ... }
});
Note: Supports streaming; faster than ImageMagick.
195. How do you resize images?
Resize images using sharp's resize() method, specifying width, height, fit strategy, and quality. Supports async/await.
const sharp = require('sharp');
async function resizeImage(input, output, width, height) {
try {
await sharp(input)
.resize({
width,
height,
fit: 'cover', // Fit mode: cover, contain, fill, etc.
position: 'center' // Crop position
})
.jpeg({ quality: 80 }) // Compression
.toFile(output);
console.log('Resized successfully');
} catch (err) {
console.error('Error:', err);
}
}
resizeImage('input.png', 'resized.jpg', 400, 300);
For buffers:
const buffer = await sharp(inputBuffer).resize(200).toBuffer();
Note: Omit height (or pass null) to preserve aspect ratio; handle errors for invalid images.
196. Explain WebAssembly integration.
WebAssembly (WASM) integration in Node.js allows running compiled WASM modules via the WebAssembly global object. Compile from C/Rust/Go, instantiate, and call exports. Enhances performance for compute-heavy tasks.
// add.wat: (module (func $add (param i32 i32) (result i32) (i32.add (local.get 0) (local.get 1))) (export "add" (func $add)))
const fs = require('fs');
const wasmBuffer = fs.readFileSync('add.wasm');
async function runWasm() {
const { instance } = await WebAssembly.instantiate(wasmBuffer);
console.log(instance.exports.add(5, 3)); // 8
}
runWasm();
Note: Use Emscripten for C to WASM; WASI for system access.
197. What is the wasi module?
The WASI (WebAssembly System Interface) module in Node.js provides POSIX-like APIs that let WASM modules access OS resources such as files and environment variables securely. It is experimental and enables running WASM CLI tools.
const { WASI } = require('node:wasi');
const fs = require('node:fs');
const wasi = new WASI({ version: 'preview1' }); // wasip1
async function runWasi() {
const wasm = await WebAssembly.compile(fs.readFileSync('tool.wasm'));
// Wire the WASI syscall imports into the module, then call its _start entry point
const instance = await WebAssembly.instantiate(wasm, wasi.getImportObject());
wasi.start(instance); // Run with args/env
}
runWasi();
Note: Node's WASI implementation is experimental and targets preview1 (wasip1); the ecosystem is moving toward preview2 (wasip2).
198. How does it enable secure WASM?
WASI enables secure WASM by using capability-based security, granting explicit permissions for resources instead of full access. Sandboxed execution prevents unauthorized ops.
const { WASI } = require('wasi');
// Preopen dirs for security
const wasi = new WASI({
preopens: { '/sandbox': '/real/path' }, // Limit FS access
args: process.argv.slice(2),
env: { ...process.env }
});
// Instantiate and start as above
Note: Prevents escapes; combine with WASM sandbox.
199. Explain edge computing with Node.js.
Edge computing with Node.js runs code closer to users at network edges (CDNs, edge servers) for low latency, using platforms like Cloudflare Workers, Vercel Edge, Fastly Compute. Node's async model suits real-time apps.
// Cloudflare Worker example (service-worker syntax; Workers run on V8 isolates, not full Node.js)
addEventListener('fetch', event => {
event.respondWith(handleRequest(event.request));
});
async function handleRequest(request) {
return new Response('Hello from edge!', { status: 200 });
}
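Newer Workers prefer ES-module syntax over the service-worker style above (sketch):
export default {
async fetch(request) {
return new Response('Hello from edge!', { status: 200 });
}
};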
Benefits: Faster responses, reduced bandwidth.
Note: Limited APIs at edge; use for APIs, SSR.
200. What is Deno and how does it compare to Node.js?
Deno is a secure JavaScript/TypeScript runtime built on V8 with built-in tooling (formatter, linter, test runner) and secure-by-default execution that requires explicit permissions. In 2025, Deno 2 emphasizes performance, Node.js compatibility, and native TypeScript support.
Comparison:
| Aspect | Deno | Node.js |
|---|---|---|
| Ecosystem | Smaller, growing | Mature, vast NPM |
| Security | Permissions-based by default | Full access by default (experimental --permission model) |
| Performance | Often faster in benchmarks | Optimized, stable |
| TS Support | Native | Experimental type stripping; otherwise transpilers |
| Modules | URL and npm: imports | require/import |
Deno example:
// Deno: deno run --allow-net app.ts
const response = await fetch('https://example.com');
console.log(await response.text());
Node:
// Node.js 18+ ships a global fetch; node-fetch is only needed on older versions
fetch('https://example.com').then(res => res.text()).then(console.log);
Note: Deno runs many NPM packages via npm: specifiers; choose Deno for security and built-in tooling, Node.js for its mature ecosystem.