Week 11: Node.js File System & HTTP
Unit IV - Server-Side Development
Working with files and building HTTP servers from scratch!
What You'll Learn
- File System (fs) module
- Reading and writing files
- Path module for file paths
- HTTP module - building servers
- Request handling and routing
- Serving static files
Why Built-in Modules?
Before using Express.js, it's important to understand what happens under the hood. The fs and http modules are the foundation of Node.js server development!
The Path Module
Utility module for working with file and directory paths - works cross-platform!
Common Methods
const path = require('path');
// Join path segments
path.join('/users', 'alice', 'docs')
// → '/users/alice/docs'
// Resolve to absolute path
path.resolve('src', 'index.js')
// → '/full/path/to/src/index.js'
// Get file extension
path.extname('app.js') // '.js'
path.extname('style.css') // '.css'
path.extname('README') // ''
// Get filename
path.basename('/path/to/file.js') // 'file.js'
path.basename('/path/to/file.js', '.js') // 'file'
// Get directory name
path.dirname('/path/to/file.js')
// → '/path/to'
Path Parsing
// Parse a path into components
path.parse('/home/user/docs/file.txt')
// {
// root: '/',
// dir: '/home/user/docs',
// base: 'file.txt',
// name: 'file',
// ext: '.txt'
// }
// Build path from components
path.format({
dir: '/home/user',
base: 'file.txt'
})
// → '/home/user/file.txt'
// Normalize a path
path.normalize('/foo/bar//baz/../qux')
// → '/foo/bar/qux'
// Platform-specific separator
path.sep // '/' on Unix, '\\' on Windows
Always use path.join() instead of string concatenation for file paths!
File System - Reading Files
The fs module lets you read, write, and manage files.
Synchronous (Blocking)
const fs = require('fs');
// Read entire file (blocks execution)
try {
const data = fs.readFileSync(
'hello.txt',
'utf-8'
);
console.log(data);
} catch (err) {
console.error('Error:', err.message);
}
// Read as Buffer (binary)
const buffer = fs.readFileSync('image.png');
console.log(buffer.length); // bytes
Avoid Sync methods in servers! They block the entire process. Use async alternatives.
Asynchronous (Non-blocking)
const fs = require('fs');
// Callback-based
fs.readFile('hello.txt', 'utf-8', (err, data) => {
if (err) {
console.error('Error:', err.message);
return;
}
console.log(data);
});
// Promise-based (Modern!)
const fsPromises = require('fs/promises');
async function readMyFile() {
try {
const data = await fsPromises.readFile(
'hello.txt',
'utf-8'
);
console.log(data);
} catch (err) {
console.error('Error:', err.message);
}
}
readMyFile();
File System - Writing Files
Write File
const fs = require('fs/promises');
// Write entire file (creates or overwrites)
async function writeExample() {
await fs.writeFile(
'output.txt',
'Hello, Node.js!'
);
console.log('File written!');
}
// Write JSON
async function writeJSON() {
const data = {
users: [
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
]
};
await fs.writeFile(
'data.json',
JSON.stringify(data, null, 2)
);
}
// Write with options (inside an async function)
await fs.writeFile('file.txt', 'some text', {
encoding: 'utf-8',
flag: 'w' // 'w' = overwrite, 'a' = append
});
Append to File
const fs = require('fs/promises');
// Append to file (creates if doesn't exist)
async function appendLog(message) {
const timestamp = new Date().toISOString();
const line = `[${timestamp}] ${message}\n`;
await fs.appendFile('app.log', line);
}
appendLog('Server started');
appendLog('User logged in');
Copy & Rename
// Copy file
await fs.copyFile('source.txt', 'dest.txt');
// Rename / Move file
await fs.rename('old-name.txt', 'new-name.txt');
// Move to different directory
await fs.rename(
'downloads/file.txt',
'archive/file.txt'
);
File System - Directories
Directory Operations
const fs = require('fs/promises');
// Create directory
await fs.mkdir('new-folder');
// Create nested directories
await fs.mkdir('path/to/deep/folder', {
recursive: true
});
// Read directory contents
const files = await fs.readdir('.');
console.log(files);
// ['app.js', 'package.json', 'node_modules']
// Read with file types
const entries = await fs.readdir('.', {
withFileTypes: true
});
entries.forEach(entry => {
if (entry.isDirectory()) {
console.log(`DIR: ${entry.name}`);
} else {
console.log(`FILE: ${entry.name}`);
}
});
// Remove an empty directory
await fs.rmdir('empty-folder');
// Remove a directory and its contents
await fs.rm('folder-with-files', {
recursive: true
});
File Information
const fs = require('fs/promises');
// Get file stats
const stats = await fs.stat('app.js');
console.log('Size:', stats.size, 'bytes');
console.log('Created:', stats.birthtime);
console.log('Modified:', stats.mtime);
console.log('Is file:', stats.isFile());
console.log('Is dir:', stats.isDirectory());
// Check if file/dir exists
async function exists(path) {
try {
await fs.access(path);
return true;
} catch {
return false;
}
}
if (await exists('config.json')) {
console.log('Config exists!');
}
// Watch for file changes
const watcher = fs.watch('.');
for await (const event of watcher) {
console.log(event.eventType, event.filename);
}
Streams
Process data piece by piece instead of loading everything into memory.
Why Streams?
// BAD: Loads entire file into memory
const data = fs.readFileSync('large-file.csv');
// 500MB file = 500MB in memory!
// GOOD: Process in chunks
const stream = fs.createReadStream('large-file.csv');
stream.on('data', (chunk) => {
// Process chunk (64KB at a time)
console.log('Chunk size:', chunk.length);
});
stream.on('end', () => {
console.log('Done reading!');
});
Stream Types
- Readable: Source of data (fs.createReadStream)
- Writable: Destination (fs.createWriteStream)
- Transform: Modify data in transit
- Duplex: Both read and write
Piping Streams
const fs = require('fs');
// Copy file using streams
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');
readStream.pipe(writeStream);
writeStream.on('finish', () => {
console.log('Copy complete!');
});
// Stream with HTTP
const http = require('http');
const server = http.createServer((req, res) => {
// Stream a large file to client
const stream = fs.createReadStream('big-video.mp4');
stream.pipe(res); // Efficient!
});
// Compress while streaming
const zlib = require('zlib');
const gzip = zlib.createGzip();
fs.createReadStream('input.txt')
.pipe(gzip)
.pipe(fs.createWriteStream('input.txt.gz'));
The HTTP Module
Create HTTP servers without any framework - the foundation of Express.js!
Basic HTTP Server
const http = require('http');
const server = http.createServer((req, res) => {
// req = IncomingMessage (what client sent)
// res = ServerResponse (what we send back)
// Set response headers
res.writeHead(200, {
'Content-Type': 'text/html'
});
// Write response body
res.write('<h1>Hello World!</h1>');
// End the response
res.end();
});
// Start listening
server.listen(3000, () => {
console.log('Server on http://localhost:3000');
});
Request Properties
const server = http.createServer((req, res) => {
// Request information
console.log('Method:', req.method);
// 'GET', 'POST', 'PUT', 'DELETE'
console.log('URL:', req.url);
// '/about', '/api/users?page=2'
console.log('Headers:', req.headers);
// { host: 'localhost:3000', ... }
// Response shortcuts
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello World');
});
Express.js is built on top of Node's http module! Understanding this helps you understand Express.
URL Parsing
Parse request URLs to extract paths and query parameters.
URL Module
const { URL } = require('url');
// Parse a URL
const myUrl = new URL(
'http://localhost:3000/search?q=node&page=2'
);
console.log(myUrl.pathname); // '/search'
console.log(myUrl.search); // '?q=node&page=2'
// Get query parameters
console.log(myUrl.searchParams.get('q'));
// 'node'
console.log(myUrl.searchParams.get('page'));
// '2'
// Iterate all parameters
for (const [key, value] of myUrl.searchParams) {
console.log(`${key}: ${value}`);
}
// q: node
// page: 2
Using in HTTP Server
const http = require('http');
const server = http.createServer((req, res) => {
// Parse the request URL
const baseURL = `http://${req.headers.host}`;
const url = new URL(req.url, baseURL);
console.log('Path:', url.pathname);
console.log('Query:', Object.fromEntries(
url.searchParams
));
// Route based on path
if (url.pathname === '/') {
res.writeHead(200, {
'Content-Type': 'text/html'
});
res.end('<h1>Home</h1>');
} else if (url.pathname === '/about') {
res.writeHead(200, {
'Content-Type': 'text/html'
});
res.end('<h1>About</h1>');
} else {
res.writeHead(404);
res.end('Not Found');
}
});
server.listen(3000);
Routing with HTTP Module
Handle different URLs and HTTP methods without a framework.
const http = require('http');
const { URL } = require('url');
// In-memory data
let todos = [
{ id: 1, text: 'Learn Node.js', done: false },
{ id: 2, text: 'Build a server', done: false }
];
let nextId = 3;
const server = http.createServer((req, res) => {
const url = new URL(req.url, `http://${req.headers.host}`);
const path = url.pathname;
// Set JSON content type for API
res.setHeader('Content-Type', 'application/json');
// GET /api/todos
if (req.method === 'GET' && path === '/api/todos') {
res.writeHead(200);
res.end(JSON.stringify(todos));
}
// GET /api/todos/:id
else if (req.method === 'GET' && path.match(/^\/api\/todos\/\d+$/)) {
const id = parseInt(path.split('/')[3]);
const todo = todos.find(t => t.id === id);
if (todo) {
res.writeHead(200);
res.end(JSON.stringify(todo));
} else {
res.writeHead(404);
res.end(JSON.stringify({ error: 'Not found' }));
}
}
// POST /api/todos
else if (req.method === 'POST' && path === '/api/todos') {
let body = '';
req.on('data', chunk => body += chunk);
req.on('end', () => {
const { text } = JSON.parse(body);
const todo = { id: nextId++, text, done: false };
todos.push(todo);
res.writeHead(201);
res.end(JSON.stringify(todo));
});
}
// 404
else {
res.writeHead(404);
res.end(JSON.stringify({ error: 'Route not found' }));
}
});
server.listen(3000, () => console.log('Server on http://localhost:3000'));
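The same routing pattern extends to DELETE. Here is the logic sketched as a standalone helper (the deleteTodo name is ours; in the server above you would branch on req.method === 'DELETE' with the same path regex):

```javascript
// Hypothetical helper: remove a todo by id, return the HTTP status to send
function deleteTodo(todos, id) {
  const index = todos.findIndex(t => t.id === id);
  if (index === -1) return 404; // nothing to delete
  todos.splice(index, 1);       // mutate the in-memory array
  return 204;                   // No Content
}

const todos = [{ id: 1, text: 'Learn Node.js', done: false }];
console.log(deleteTodo(todos, 1)); // 204
console.log(deleteTodo(todos, 1)); // 404 (already removed)
```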
Serving Static Files
Build a file server that serves HTML, CSS, JS, and images.
const http = require('http');
const fs = require('fs/promises');
const path = require('path');
// MIME type mapping
const MIME_TYPES = {
'.html': 'text/html',
'.css': 'text/css',
'.js': 'text/javascript',
'.json': 'application/json',
'.png': 'image/png',
'.jpg': 'image/jpeg',
'.gif': 'image/gif',
'.svg': 'image/svg+xml',
'.ico': 'image/x-icon',
'.txt': 'text/plain'
};
const PUBLIC_DIR = path.join(__dirname, 'public');
const server = http.createServer(async (req, res) => {
try {
// Map URL to file path
let filePath = path.join(PUBLIC_DIR, req.url);
// Default to index.html for directory requests
if (req.url === '/') {
filePath = path.join(PUBLIC_DIR, 'index.html');
}
// Get file extension and MIME type
const ext = path.extname(filePath);
const contentType = MIME_TYPES[ext] || 'application/octet-stream';
// Read and serve the file
const content = await fs.readFile(filePath);
res.writeHead(200, { 'Content-Type': contentType });
res.end(content);
} catch (err) {
if (err.code === 'ENOENT') {
res.writeHead(404, { 'Content-Type': 'text/html' });
res.end('<h1>404 - File Not Found</h1>');
} else {
res.writeHead(500);
res.end('Internal Server Error');
}
}
});
server.listen(3000, () => console.log('File server on http://localhost:3000'));
This is essentially what express.static() does under the hood!
Handling POST Data
Request body arrives in chunks that must be collected.
Reading Request Body
const http = require('http');
const server = http.createServer((req, res) => {
if (req.method === 'POST') {
let body = '';
// Collect data chunks
req.on('data', (chunk) => {
body += chunk.toString();
});
// All data received
req.on('end', () => {
const data = JSON.parse(body);
console.log('Received:', data);
res.writeHead(200, {
'Content-Type': 'application/json'
});
res.end(JSON.stringify({
received: data,
message: 'Data processed!'
}));
});
// Handle errors
req.on('error', (err) => {
res.writeHead(400);
res.end('Bad request');
});
}
});
Helper Function
// Reusable body parser
function parseBody(req) {
return new Promise((resolve, reject) => {
let body = '';
req.on('data', chunk => body += chunk);
req.on('end', () => {
try {
resolve(JSON.parse(body));
} catch (e) {
reject(new Error('Invalid JSON'));
}
});
req.on('error', reject);
});
}
// Usage
const server = http.createServer(
async (req, res) => {
if (req.method === 'POST') {
try {
const data = await parseBody(req);
res.writeHead(201, {
'Content-Type': 'application/json'
});
res.end(JSON.stringify(data));
} catch (err) {
res.writeHead(400);
res.end('Invalid request body');
}
}
}
);
CampusKart Milestone: Server & Uploads
Deliverable: Server that serves frontend files, handles form submissions, and accepts image uploads
What to Build
- Serve all frontend files from /public
- Route handling: /, /products, /register
- Accept product listing POST requests
- Save products to JSON file (temporary)
- Handle product image uploads to /uploads
- Log all requests to access.log
- Return JSON responses for API endpoints
Connection Moment
"Your CampusKart frontend from Weeks 3-9 is now served by YOUR server. You built both sides."
Push to GitHub when done.
Video Resources
Frontend Masters
Node.js Docs - fs
Node.js Docs - http
Node.js Docs - path
Practice Exercises
Exercise 1: File Logger
- Write logs to a file
- Append timestamps
- Rotate log files
Exercise 2: Static File Server
- Serve HTML/CSS/JS files
- Handle MIME types
- 404 error pages
Exercise 3: JSON Data API
- Read data from JSON file
- CRUD operations via HTTP
- Persist changes to file
Exercise 4: File Explorer
- List directory contents
- Navigate folders via URLs
- Download files
Week 11 Summary
Key Concepts
- path module for cross-platform paths
- fs module for file operations
- Sync vs Async (prefer async!)
- Streams for large files
- http module for servers
- URL parsing and routing
- Serving static files
- Handling POST body data
fs/promises Quick Reference
const fs = require('fs/promises');
await fs.readFile(path, 'utf-8')
await fs.writeFile(path, data)
await fs.appendFile(path, data)
await fs.copyFile(src, dest)
await fs.rename(oldPath, newPath)
await fs.rm(path)
await fs.mkdir(path)
await fs.readdir(path)
await fs.stat(path)
Next Week
Week 12: Databases and MongoDB!