πŸš€ Streaming in Node.js: A Complete Guide with Examples

Rohit Singh

When working with large files (videos, images, logs, CSVs) or real-time data, loading everything into memory is inefficient. That’s where streams in Node.js come in.

Instead of reading/writing the whole file at once, streams break the data into small chunks. This makes Node.js apps faster, memory-efficient, and scalable.

πŸ”Ή What Are Streams in Node.js?

Streams are built-in objects in Node.js that allow reading/writing data piece by piece.

There are four main types of streams:

Readable – stream you can read from (e.g., fs.createReadStream)

Writable – stream you can write to (e.g., fs.createWriteStream)

Duplex – both readable and writable (e.g., TCP sockets)

Transform – streams that modify data as it passes through (e.g., compressing files with zlib); a minimal custom Transform is sketched right after this list
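
πŸ‘‰ To make Transform concrete, here's a minimal sketch of a custom Transform stream (the upperCase name and the upper-casing logic are just illustrative; the Transform class itself comes from Node's built-in stream module):

Code:
const { Transform } = require("stream");

// A tiny custom Transform stream that upper-cases whatever flows through it
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer by default; convert, transform, and pass it along
    callback(null, chunk.toString().toUpperCase());
  }
});

// Example usage: pipe stdin through the transform to stdout
process.stdin.pipe(upperCase).pipe(process.stdout);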

πŸ”Ή Why Use Streams?

βœ… Handle large files without crashing the server
βœ… Improve performance & scalability
βœ… Enable real-time processing (chat apps, video streaming, logging)
βœ… Lower memory usage

πŸ”Ή Example 1: Reading a File Using Streams

Instead of reading an entire file with fs.readFile(), use streams:


Code:
const fs = require("fs");

// Create a readable stream
const readableStream = fs.createReadStream("bigfile.txt", {
  encoding: "utf-8",
  highWaterMark: 64 * 1024 // 64 KB chunks
});

// Listen to data events
readableStream.on("data", (chunk) => {
  console.log("Received chunk:", chunk.length);
});

readableStream.on("end", () => {
  console.log("File reading completed!");
});

πŸ‘‰ Here, the file is read in 64 KB chunks, not all at once.
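
If you prefer async/await over event callbacks, readable streams are also async iterable in modern Node versions, so the same file can be consumed with a for await...of loop. A sketch under the same bigfile.txt assumption:

Code:
const fs = require("fs");

async function readInChunks() {
  const readableStream = fs.createReadStream("bigfile.txt", {
    encoding: "utf-8",
    highWaterMark: 64 * 1024 // 64 KB chunks, same as above
  });

  // Each iteration yields one chunk; the loop waits for the next chunk automatically
  for await (const chunk of readableStream) {
    console.log("Received chunk:", chunk.length);
  }

  console.log("File reading completed!");
}

readInChunks().catch(console.error);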

πŸ”Ή Example 2: File Copy Using Pipe

Node.js makes it super simple to pipe streams:


Code:
const fs = require("fs");

const readable = fs.createReadStream("input.txt");
const writable = fs.createWriteStream("output.txt");

// Pipe data from read β†’ write
readable.pipe(writable);

// Piping is asynchronous, so log only once the writable side has finished
writable.on("finish", () => {
  console.log("File copied successfully using streams!");
});

βœ… Efficiently copies large files without loading them fully into memory.
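
πŸ‘‰ One caveat: .pipe() on its own does not forward errors between the streams. For production copies, Node also ships stream.pipeline (shown here via stream/promises), which wires up error handling and cleanup for you. A sketch assuming the same input.txt / output.txt names:

Code:
const fs = require("fs");
const { pipeline } = require("stream/promises");

async function copyFile() {
  try {
    // pipeline pipes read β†’ write and rejects if either stream errors out
    await pipeline(
      fs.createReadStream("input.txt"),
      fs.createWriteStream("output.txt")
    );
    console.log("File copied successfully using streams!");
  } catch (err) {
    console.error("Copy failed:", err);
  }
}

copyFile();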

πŸ”Ή Example 3: Streaming an HTTP Response

Streams are great for serving large files in HTTP servers:


Code:
const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  const stream = fs.createReadStream("video.mp4");
  res.writeHead(200, { "Content-Type": "video/mp4" });
  stream.pipe(res);
}).listen(3000, () => {
  console.log("Server running on http://localhost:3000");
});

πŸ‘‰ Instead of loading the full video into memory, the server sends chunks to the client as they’re read.
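
πŸ‘‰ In practice you'll also want to handle a missing or unreadable file instead of letting the request hang. A slightly hardened sketch (still assuming a local video.mp4; real video players also send Range requests, which are left out here):

Code:
const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  const stream = fs.createReadStream("video.mp4");

  // Only send the 200 once the file has actually been opened
  stream.on("open", () => {
    res.writeHead(200, { "Content-Type": "video/mp4" });
    stream.pipe(res);
  });

  // If the file is missing or unreadable, answer with a 404 instead of crashing
  stream.on("error", () => {
    if (!res.headersSent) {
      res.writeHead(404, { "Content-Type": "text/plain" });
    }
    res.end("Video not found");
  });
}).listen(3000, () => {
  console.log("Server running on http://localhost:3000");
});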

πŸ”Ή Example 4: Transform Streams (Compression)

You can also transform data on the fly:


Code:
const fs = require("fs");
const zlib = require("zlib");

const readable = fs.createReadStream("input.txt");
const compressed = fs.createWriteStream("input.txt.gz");

// Compress file using gzip
readable.pipe(zlib.createGzip()).pipe(compressed);

// Compression is asynchronous, so log only once the .gz file has been fully written
compressed.on("finish", () => {
  console.log("File compressed successfully!");
});

βœ… Useful for logs, backups, and optimizing storage.
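
πŸ‘‰ The reverse direction works the same way: zlib.createGunzip() is itself a Transform stream, so decompression is just another chain. A sketch assuming the input.txt.gz produced above (the decompressed.txt name is made up for the example), this time using the callback form of stream.pipeline for error handling:

Code:
const fs = require("fs");
const zlib = require("zlib");
const { pipeline } = require("stream");

// Decompress input.txt.gz back into a plain text file
pipeline(
  fs.createReadStream("input.txt.gz"),
  zlib.createGunzip(),
  fs.createWriteStream("decompressed.txt"),
  (err) => {
    if (err) {
      console.error("Decompression failed:", err);
    } else {
      console.log("File decompressed successfully!");
    }
  }
);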

πŸ”Ή Real-World Use Cases of Node.js Streams

Video/Audio streaming (Netflix, YouTube-like apps)

Large file uploads/downloads

Log processing in real-time

Chat applications with WebSockets

Data pipelines (ETL jobs)

🎯 Conclusion

Streams are a core strength of Node.js that let you handle large data efficiently without crashing servers. Whether you’re copying files, serving videos, or compressing data, streams make your app scalable, memory-friendly, and performant.

If you’re building high-performance apps in 2025, mastering Node.js streams is a must! πŸš€


πŸš€ Rohit Singh πŸš€ – Medium


Read writing from πŸš€ Rohit Singh πŸš€ on Medium. Full-stack developer with 6+ years in Angular, Node.js & AWS. Sharing tips, best practices & real-world lessons from building scalable apps.

