Node Streams
Introduction
Streams in Node.js are objects that facilitate the handling of streaming data. They allow you to work with large data sets by processing the data in chunks rather than loading the entire set into memory. This approach is memory efficient and particularly useful for handling files, network communications, or any data that flows over time. Streams are an essential part of Node.js, as they provide an efficient way to handle I/O operations.
Types of Streams -
i) Readable Streams: These streams allow you to read data from a source. Examples include fs.createReadStream() for reading files and http.IncomingMessage for reading HTTP requests.
ii) Writable Streams: These streams allow you to write data to a destination. Examples include fs.createWriteStream() for writing to files and http.ServerResponse for writing HTTP responses.
iii) Duplex Streams: These streams are both readable and writable. Examples include network sockets (net.Socket) and zlib compression streams.
iv) Transform Streams: These are a special type of duplex stream where the output is computed based on the input. Examples include file compression and encryption/decryption operations.
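To make the duplex and transform categories concrete, here is a minimal sketch that gzips a file by sending a readable file stream through a zlib transform stream into a writable file stream, chained together with pipe() (covered under Key Methods below). The file names are placeholders.

const fs = require('fs');
const zlib = require('zlib');

// Readable source -> transform (gzip) -> writable destination.
// 'example.txt' and 'example.txt.gz' are placeholder file names.
const source = fs.createReadStream('example.txt');
const gzip = zlib.createGzip(); // a transform stream
const destination = fs.createWriteStream('example.txt.gz');

source.pipe(gzip).pipe(destination);

destination.on('finish', () => {
  console.log('File compressed.');
});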
Key Methods -
readableStream.pipe(writableStream): Pipes data from a readable stream to a writable stream.
readableStream.read(size): Reads data from the stream.
writableStream.write(chunk): Writes data to the stream.
writableStream.end(chunk): Signals the end of data writing.
Events -
data: Emitted when a chunk of data is available to be read.
end: Emitted when there is no more data to be read.
finish: Emitted when all data has been flushed to the writable stream.
error: Emitted when an error occurs.
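As a rough sketch tying the methods and events above together, the snippet below copies a file by forwarding each data chunk with write() and ending the writable side when the readable side ends; in practice pipe() does this (plus backpressure handling) in a single call. The file names are placeholders.

const fs = require('fs');

// 'input.txt' and 'copy.txt' are placeholder file names.
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('copy.txt');

// Forward each chunk by hand to show the methods and events working together.
readable.on('data', (chunk) => {
  writable.write(chunk);
});

readable.on('end', () => {
  writable.end();
});

writable.on('finish', () => {
  console.log('All data has been written.');
});

readable.on('error', (err) => console.error('Read error:', err));
writable.on('error', (err) => console.error('Write error:', err));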
How Streams Work
i) Readable Streams
Readable streams emit data in chunks, which can be read in two modes: flowing and paused.
Flowing Mode: Data is read automatically and provided to the application through events like data.
Paused Mode: You need to explicitly call methods like read() to get chunks of data.

const fs = require('fs');

const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

readableStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  console.log(chunk);
});

readableStream.on('end', () => {
  console.log('No more data to read.');
});
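The example above runs in flowing mode. As a rough sketch of paused mode, the snippet below waits for the readable event and pulls chunks explicitly with read(); the file name is a placeholder.

const fs = require('fs');

// 'example.txt' is a placeholder file name.
const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

// In paused mode you pull data yourself when the stream signals it is readable.
readableStream.on('readable', () => {
  let chunk;
  while ((chunk = readableStream.read()) !== null) {
    console.log(`Read ${chunk.length} characters.`);
  }
});

readableStream.on('end', () => {
  console.log('No more data to read.');
});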
ii) Writable Streams
Writable streams allow you to write data in chunks and provide methods like write() and end() to control the flow of data.

const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

writableStream.write('This is some data to write to the file.\n');
writableStream.write('This is more data.\n');
writableStream.end('This is the end of the data.');

writableStream.on('finish', () => {
  console.log('All data has been written.');
});
iii) Duplex Streams
Duplex streams are both readable and writable, allowing for two-way communication.

const { Duplex } = require('stream');

const duplexStream = new Duplex({
  read(size) {
    this.push('This is some data from the readable side.');
    this.push(null); // Signaling the end of the stream
  },
  write(chunk, encoding, callback) {
    console.log(`Writing: ${chunk.toString()}`);
    callback();
  }
});

duplexStream.on('data', (chunk) => {
  console.log(`Received: ${chunk.toString()}`);
});

duplexStream.write('This is some data for the writable side.');
duplexStream.end();
iv) Transform Streams
Transform streams process the input data to produce the output data, allowing you to modify the data as it is written and read.
const { Transform } = require('stream');

const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

process.stdin.pipe(transformStream).pipe(process.stdout);
Advantages of Streams
Streams offer several advantages in Node.js, particularly when dealing with large amounts of data or data that arrives over time.
i) Memory Efficiency
Streams allow you to process data in chunks, which means you don't need to load the entire data set into memory at once. This is particularly beneficial when dealing with large files or large amounts of data.
ii) Time Efficiency
Streams process data as it arrives, allowing your application to start working on data as soon as the first chunk is received, rather than waiting for the entire data set.
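As a rough illustration of both advantages, the sketch below serves a large file over HTTP by piping it into the response instead of reading the whole file into memory first; the file name is a placeholder.

const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
  // 'large-file.txt' is a placeholder file name. Piping sends the first
  // chunks as soon as they are read and keeps only a small buffer in
  // memory, instead of loading the whole file with fs.readFile() first.
  const fileStream = fs.createReadStream('large-file.txt');

  fileStream.on('error', (err) => {
    console.error('Error:', err);
    res.end('Error reading file');
  });

  res.writeHead(200, { 'Content-Type': 'text/plain' });
  fileStream.pipe(res);
});

server.listen(3000, () => {
  console.log('Server listening on port 3000');
});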
Readable Streams
Readable streams appear throughout Node.js, for example when reading from files, HTTP requests, network sockets, standard input, and HTTP responses. By using readable streams, you can handle large data sources efficiently without loading everything into memory at once.
Readable streams in Node.js are used to read data from a source in a controlled manner. Here are a few common examples of readable streams:
i) Reading from a File

const fs = require('fs');

// Create a readable stream from a file
const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

readableStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  console.log(chunk);
});

readableStream.on('end', () => {
  console.log('No more data to read.');
});

readableStream.on('error', (err) => {
  console.error('Error:', err);
});
ii) Reading from an HTTP Request
const express = require("express");
const app = express();
const port = 3000;

app.get("/", (req, res) => {
  let body = "";

  req.on("data", (chunk) => {
    body += chunk.toString();
  });

  req.on("end", () => {
    console.log("Received data:", body);
    res.end("Data received");
  });

  req.on("error", (err) => {
    console.error("Error:", err);
    res.statusCode = 500;
    res.end("Server error");
  });
});

app.listen(port, () => {
  console.log("Server listening on port 3000");
});
Writable Streams
Writable streams appear throughout Node.js, for example when writing to files, HTTP responses, network sockets, standard output, HTTP requests, and transform streams. By using writable streams, you can handle data writing efficiently and manage backpressure in your applications (a backpressure sketch follows the examples below).
Writable streams in Node.js allow you to write data to a destination. Here are a few common examples of writable streams:
i) Writing to a File

const fs = require('fs');

// Create a writable stream to a file
const writableStream = fs.createWriteStream('output.txt');

// Write some data to the stream
writableStream.write('This is the first line of the file.\n');
writableStream.write('This is the second line of the file.\n');

// Signal the end of writing
writableStream.end('This is the end of the file.\n');

// Handle the finish event
writableStream.on('finish', () => {
  console.log('All data has been written to the file.');
});

// Handle the error event
writableStream.on('error', (err) => {
  console.error('Error:', err);
});
ii) Writing to an HTTP Response
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });

  // Write some data to the response
  res.write('Hello, World!\n');
  res.write('This is an HTTP response.\n');

  // Signal the end of writing
  res.end('Goodbye!\n');
});

server.listen(3000, () => {
  console.log('Server listening on port 3000');
});
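Since the section intro mentions backpressure, here is a minimal sketch of how it is typically handled by hand: write() returns false when the stream's internal buffer is full, and the drain event signals that it is safe to write again. The file name and the generated lines are placeholders.

const fs = require('fs');

// 'output.txt' is a placeholder file name.
const writableStream = fs.createWriteStream('output.txt');

const totalLines = 1000000;
let i = 0;

function writeChunks() {
  let ok = true;
  while (i < totalLines && ok) {
    // write() returns false once the internal buffer is full.
    ok = writableStream.write(`line ${i}\n`);
    i++;
  }
  if (i < totalLines) {
    // Wait for the buffer to empty before writing more.
    writableStream.once('drain', writeChunks);
  } else {
    writableStream.end();
  }
}

writeChunks();

writableStream.on('finish', () => {
  console.log('All data has been written.');
});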