ROHIT SINGH
Guest
When working with large files (videos, images, logs, CSVs) or real-time data, loading everything into memory is inefficient. That's where streams in Node.js come in.
Instead of reading/writing the whole file at once, streams break the data into small chunks. This makes Node.js apps faster, memory-efficient, and scalable.
What Are Streams in Node.js?
Streams are built-in objects in Node.js that allow reading/writing data piece by piece.
There are four main types of streams:
Readable: a stream you can read from (e.g., fs.createReadStream)
Writable: a stream you can write to (e.g., fs.createWriteStream)
Duplex: both readable and writable (e.g., TCP sockets)
Transform: a stream that can modify data as it passes through (e.g., compressing files); a minimal custom Transform sketch follows this list.
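To make the Transform type concrete, here is a minimal sketch (the uppercasing logic and file names are illustrative, not from the original article) of a custom Transform stream that rewrites data as it flows from a Readable stream to a Writable one:
Code:
const { Transform } = require("stream");
const fs = require("fs");
// Illustrative example: uppercase each chunk of input.txt into output-upper.txt
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer; convert it, uppercase it, and forward it
    callback(null, chunk.toString().toUpperCase());
  }
});
// Readable -> Transform -> Writable, one chunk at a time
fs.createReadStream("input.txt")
  .pipe(upperCase)
  .pipe(fs.createWriteStream("output-upper.txt"));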
Why Use Streams?
Handle large files without crashing the server
Improve performance & scalability
Enable real-time processing (chat apps, video streaming, logging)
Lower memory usage
Example 1: Reading a File Using Streams
Instead of reading an entire file with fs.readFile(), use streams:
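Code:
const fs = require("fs");
// Create a readable stream that emits the file in 64 KB chunks
const readableStream = fs.createReadStream("bigfile.txt", {
  encoding: "utf-8",
  highWaterMark: 64 * 1024 // 64 KB chunks
});
// Listen to data events as each chunk arrives
readableStream.on("data", (chunk) => {
  console.log("Received chunk:", chunk.length);
});
readableStream.on("end", () => {
  console.log("File reading completed!");
});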
Here, the file is read in 64KB chunks, not all at once.
Example 2: File Copy Using Pipe
Node.js makes it super simple to pipe streams:
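Code:
const fs = require("fs");
const readable = fs.createReadStream("input.txt");
const writable = fs.createWriteStream("output.txt");
// Pipe data from the read stream into the write stream
readable.pipe(writable);
// "finish" fires once all data has been flushed to output.txt
writable.on("finish", () => {
  console.log("File copied successfully using streams!");
});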
Efficiently copies large files without loading them fully into memory.
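One caveat: .pipe() does not forward errors, so a failed read or write can go unnoticed. For more robust copies you may prefer stream.pipeline(), which wires up error handling and cleanup for you. A minimal sketch, using placeholder file names:
Code:
const fs = require("fs");
const { pipeline } = require("stream");
// Placeholder file names; pipeline() connects the streams and reports the first error, if any
pipeline(
  fs.createReadStream("input.txt"),
  fs.createWriteStream("output.txt"),
  (err) => {
    if (err) {
      console.error("Copy failed:", err);
    } else {
      console.log("File copied successfully using pipeline!");
    }
  }
);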
Example 3: Streaming an HTTP Response
Streams are great for serving large files in HTTP servers:
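Code:
const http = require("http");
const fs = require("fs");
http.createServer((req, res) => {
  const stream = fs.createReadStream("video.mp4");
  res.writeHead(200, { "Content-Type": "video/mp4" });
  // Send the file to the client chunk by chunk
  stream.pipe(res);
}).listen(3000, () => {
  console.log("Server running on http://localhost:3000");
});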
Instead of loading the full video into memory, the server sends chunks to the client as they're read.
Example 4: Transform Streams (Compression)
You can also transform data on the fly:
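Code:
const fs = require("fs");
const zlib = require("zlib");
const readable = fs.createReadStream("input.txt");
const compressed = fs.createWriteStream("input.txt.gz");
// Compress the file with gzip as it streams through
readable.pipe(zlib.createGzip()).pipe(compressed);
// "finish" fires once the compressed data has been fully written
compressed.on("finish", () => {
  console.log("File compressed successfully!");
});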
Useful for logs, backups, and optimizing storage.
Real-World Use Cases of Node.js Streams
Video/Audio streaming (Netflix, YouTube-like apps)
Large file uploads/downloads
Log processing in real-time
Chat applications with WebSockets
Data pipelines (ETL jobs)
Conclusion
Streams are a core strength of Node.js that let you handle large data efficiently without crashing servers. Whether you're copying files, serving videos, or compressing data, streams make your app scalable, memory-friendly, and performant.
If you're building high-performance apps in 2025, mastering Node.js streams is a must!
Read more from Rohit Singh on Medium (medium.com): a full-stack developer with 6+ years in Angular, Node.js & AWS, sharing tips, best practices, and real-world lessons from building scalable apps.