Streams in Node.js are a powerful way to handle reading and writing data. They are especially useful for working with large datasets or when you want to process data before it is fully loaded, such as when reading from a file or handling HTTP requests. Streams let you work with data piece by piece (in chunks), which is more efficient in memory usage and processing time than loading the entire dataset at once.
The stream workflow is simple: read from the input and write to the output sequentially.
Types of Streams in Node.js
There are four types of streams:
1. Readable Streams: Used for reading data. Examples include fs.createReadStream(), HTTP responses on the client, and HTTP requests on the server.
2. Writable Streams: Used for writing data. Examples include fs.createWriteStream(), HTTP requests on the client, and HTTP responses on the server.
3. Duplex Streams: Both readable and writable. An example is a network socket (a minimal sketch follows this list).
4. Transform Streams: Duplex streams that can modify or transform the data as it passes through. An example is zlib.createGzip() for compression.
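To make the duplex idea concrete, here is a minimal sketch using the stream.Duplex class (the data here is made up for illustration). The writable and readable sides are independent, just like the two directions of a network socket:
const { Duplex } = require('stream');
// A toy duplex stream: the writable side logs incoming chunks,
// while the readable side produces its own data independently.
const duplex = new Duplex({
  write(chunk, encoding, callback) {
    console.log('Writable side received:', chunk.toString());
    callback();
  },
  read(size) {
    this.push('data from the readable side\n');
    this.push(null); // no more data to read
  }
});
duplex.on('data', (chunk) => console.log('Readable side produced:', chunk.toString()));
duplex.write('data for the writable side\n');
duplex.end();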
Why Use Streams?
Memory Efficiency: Streams handle data in chunks, reducing the memory footprint by not loading the entire data into memory at once.
Time Efficiency: Streams allow you to start processing data as soon as you receive it, which can be faster than waiting for all data to arrive.
Handling Large Files: When working with large files, streams allow you to read or write the data in chunks without consuming too much memory.
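For contrast, here is the non-stream approach, which buffers the entire file in memory before your code sees any of it (assuming a file named example.txt exists):
const fs = require('fs');
// Non-stream approach: the whole file is loaded into memory at once.
// Fine for small files, but wasteful or impossible for very large ones.
fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error:', err);
    return;
  }
  console.log('File length:', data.length);
});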
Some Common Stream Events
1) data: fired when a chunk of data is available to read.
2) end: fired when there is no more data to read.
3) error: fired when an error occurs while reading or writing data.
4) finish: fired when all the data has been flushed to the underlying system (writable streams only).
Basic Example of a Readable Stream
Here’s an example of reading a file using a readable stream:
const fs = require('fs');
// Create a readable stream from a file
const readableStream = fs.createReadStream('example.txt', {
  encoding: 'utf8',
  highWaterMark: 16 * 1024 // 16KB chunk size
});
// Listen for data events, which provide chunks of data as they are read
readableStream.on('data', (chunk) => {
  console.log('New chunk received:');
  console.log(chunk);
});
// Handle the end of the stream
readableStream.on('end', () => {
  console.log('No more data to read.');
});
// Handle errors
readableStream.on('error', (err) => {
  console.error('Error:', err);
});
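Attaching a 'data' listener puts the stream into flowing mode. If you need to slow consumption down, you can pause and resume the stream. The sketch below is a variant of the example above that artificially delays each chunk by 100ms (the delay is just for illustration):
readableStream.on('data', (chunk) => {
  // Stop the flow while we "process" the chunk...
  readableStream.pause();
  console.log('Chunk received, pausing for 100ms');
  setTimeout(() => {
    readableStream.resume(); // ...then resume the flow
  }, 100);
});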
Basic Example of a Writable Stream
Now, let’s see how to write data using a writable stream:
const fs = require('fs');
// Create a writable stream to a file
const writableStream = fs.createWriteStream('output.txt');
// Write data to the stream
writableStream.write('Hello, ');
writableStream.write('World!\n');
// End the stream
writableStream.end();
// Handle stream finish event
writableStream.on('finish', () => {
  console.log('All data written to file.');
});
// Handle errors
writableStream.on('error', (err) => {
  console.error('Error:', err);
});
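One detail the example above glosses over is backpressure: write() returns false when the stream's internal buffer is full, and the 'drain' event tells you when it is safe to continue writing. Here is a sketch of that pattern (the file name and line count are arbitrary):
const fs = require('fs');
const writableStream = fs.createWriteStream('output-large.txt');
// Write many lines while respecting backpressure: stop when write()
// returns false and resume on the 'drain' event.
function writeLines(i) {
  let ok = true;
  while (i < 1000000 && ok) {
    ok = writableStream.write(`line ${i}\n`);
    i++;
  }
  if (i < 1000000) {
    writableStream.once('drain', () => writeLines(i));
  } else {
    writableStream.end();
  }
}
writeLines(0);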
Piping Streams
One of the most common use cases for streams is piping, where you connect a readable stream to a writable stream. This is particularly useful for tasks like reading from a file and immediately writing to another file or responding to an HTTP request.
Here’s an example of piping a readable stream to a writable stream:
const fs = require('fs');
// Create a readable stream and a writable stream
const readableStream = fs.createReadStream('example.txt');
const writableStream = fs.createWriteStream('output.txt');
// Pipe the readable stream into the writable stream
readableStream.pipe(writableStream);
// Handle the finish event
writableStream.on('finish', () => {
  console.log('File successfully copied.');
});
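Piping works just as well with HTTP: an http.ServerResponse is a writable stream, so you can pipe a file straight into it. Here is a minimal sketch (the port and file name are arbitrary):
const fs = require('fs');
const http = require('http');
const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('example.txt');
  // If the file can't be read, answer with a 500 instead of hanging
  fileStream.on('error', () => {
    res.statusCode = 500;
    res.end('Error reading file');
  });
  res.setHeader('Content-Type', 'text/plain');
  // Stream the file to the client chunk by chunk
  fileStream.pipe(res);
});
server.listen(3000, () => {
  console.log('Server listening on http://localhost:3000');
});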
Transform Streams
Transform streams are special types of streams that modify the data as it passes through. For example, you can compress a file using a transform stream.
Here’s an example of compressing a file using the zlib module:
const fs = require('fs');
const zlib = require('zlib');
// Create a readable stream, a writable stream, and a transform stream
const readableStream = fs.createReadStream('example.txt');
const gzipStream = zlib.createGzip();
const writableStream = fs.createWriteStream('example.txt.gz');
// Pipe the readable stream through the transform stream and then to the writable stream
readableStream.pipe(gzipStream).pipe(writableStream);
// Handle the finish event
writableStream.on('finish', () => {
  console.log('File successfully compressed.');
});
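You are not limited to built-in transforms like zlib; you can implement your own by providing a transform function. This hypothetical example upper-cases whatever passes through it:
const { Transform } = require('stream');
// A custom transform stream that upper-cases each chunk
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Pass the modified chunk downstream (first argument is an error, if any)
    callback(null, chunk.toString().toUpperCase());
  }
});
// Pipe stdin through the transform to stdout
process.stdin.pipe(upperCase).pipe(process.stdout);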
Error Handling in Streams
Error handling is crucial when working with streams. You can listen for error events on any stream to catch and handle errors:
readableStream.on('error', (err) => {
  console.error('Read error:', err);
});
writableStream.on('error', (err) => {
  console.error('Write error:', err);
});
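Note that pipe() does not forward errors between streams, so each stream in a chain needs its own handler, as above. If you are on Node.js 10 or later, the stream module's pipeline() function wires up error handling and cleanup for the whole chain in one place:
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');
// pipeline() pipes the streams together, forwards errors from any of
// them to the callback, and destroys all streams on failure.
pipeline(
  fs.createReadStream('example.txt'),
  zlib.createGzip(),
  fs.createWriteStream('example.txt.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);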