What are Streams?
Collections of data that might not be available all at once and don’t have to fit in memory.
The fs module can be used to read from and write to files using a stream interface.
Here we write to big.file through a writable stream: a loop writes 1 million lines with the .write method, and we call the .end method when we're done. Running this script generates a large file on disk (tens to hundreds of MB, depending on the length of the line you write).
const fs = require('fs');
const file = fs.createWriteStream('./big.file');

// Write 1 million lines through the writable stream
for (let i = 0; i <= 1e6; i++) {
  file.write('Lorem ipsum dolor sit amet, consectetur adipisicing elit.\n');
}

file.end();
Run this command to create the big file (to test with a smaller file first, lower the loop bound from 1e6 to something like 1e2): node create-big-file
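Note that file.write returns false once the stream's internal buffer is full; the loop above ignores that backpressure signal. Below is a minimal sketch of a version that waits for the 'drain' event before continuing (writeLines is a hypothetical helper name, not part of any API):

const fs = require('fs');
const file = fs.createWriteStream('./big.file');

// Hypothetical helper: writes n lines, pausing whenever write() signals backpressure
function writeLines(writer, n, done) {
  let i = n;
  function write() {
    let ok = true;
    while (i > 0 && ok) {
      i--;
      // write() returns false when the internal buffer is full
      ok = writer.write('Lorem ipsum dolor sit amet, consectetur adipisicing elit.\n');
    }
    if (i > 0) {
      // Wait for the buffer to drain before writing the rest
      writer.once('drain', write);
    } else {
      done();
    }
  }
  write();
}

writeLines(file, 1e6, () => file.end());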
We'll serve this file from a simple HTTP server using the readFile method, writing the response inside its callback.
const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
  // Buffer the entire file into memory, then send it
  fs.readFile('./big.file', (err, data) => {
    if (err) throw err;
    res.end(data);
  });
});

server.listen(3000, () => {
  console.log('Server running on port 3000.');
});
When we run server.js and request the file, the process's memory usage climbs sharply. That's because our code buffers the whole big file in memory before writing it out, which is very inefficient.
Request the file with curl:
curl http://localhost:3000/
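To observe the cost yourself, here is a minimal sketch (assuming the same big.file) that logs the process's resident set size after the file has been buffered; exact numbers will vary by machine:

const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
  fs.readFile('./big.file', (err, data) => {
    if (err) throw err;
    // Resident set size after buffering the whole file into memory
    console.log(`RSS: ${Math.round(process.memoryUsage().rss / 1e6)} MB`);
    res.end(data);
  });
});

server.listen(3000);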
We can avoid this excessive memory use with streams. The HTTP response object is also a writable stream, so if we open the big file as a readable stream, we can simply pipe one into the other and avoid filling up memory. The fs module gives us a readable stream for any file via its createReadStream method; we then pipe that readable stream into the response's writable stream.
const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
  // Stream the file to the response instead of buffering it all in memory
  const src = fs.createReadStream('./big.file');
  src.pipe(res);
});

server.listen(3000, () => {
  console.log('Server running on port 3000.');
});
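One caveat: pipe does not forward errors from the source to the destination, and it won't clean up the source stream if the response is destroyed mid-transfer. Since Node 10, stream.pipeline handles both; here is a sketch of the same server using it:

const fs = require('fs');
const { pipeline } = require('stream');
const server = require('http').createServer();

server.on('request', (req, res) => {
  // pipeline destroys both streams on failure and reports errors in one place
  pipeline(fs.createReadStream('./big.file'), res, (err) => {
    if (err) console.error('Pipeline failed:', err);
  });
});

server.listen(3000);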
What is the setEncoding function in Node.js?
The setEncoding method sets the character encoding for data read from a readable stream, so 'data' events emit strings in that encoding instead of raw Buffer objects. Some of the most common character encodings supported by Node.js are listed below:
base64
ucs2
ascii
latin1
utf8
const file_stream = require('fs');

// Creating two readable streams
const file1 = file_stream.createReadStream('file1.txt');
const file2 = file_stream.createReadStream('file2.txt');

// Setting the encoding of file1 to hex
file1.setEncoding('hex');

// Printing each data element of file1 as a hex string
file1.on('data', (data_element) => {
  console.log(data_element);
});

// Setting the encoding of file2 to utf8
file2.setEncoding('utf8');

// Printing each data element of file2 as a utf8 string
file2.on('data', (data_element) => {
  console.log(data_element);
});
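For contrast, without setEncoding the 'data' events emit raw Buffer objects; passing an encoding option to createReadStream has the same effect as calling setEncoding. A small sketch reusing the same file1.txt:

const fs = require('fs');

// No encoding set: chunks arrive as Buffers
const raw = fs.createReadStream('file1.txt');
raw.on('data', (chunk) => {
  console.log(Buffer.isBuffer(chunk)); // true
});

// Encoding passed at creation: chunks arrive as strings
const decoded = fs.createReadStream('file1.txt', { encoding: 'utf8' });
decoded.on('data', (chunk) => {
  console.log(typeof chunk); // 'string'
});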
Understanding Streams in Node.js
Async iterators
It's highly recommended to use async iterators when working with streams. You can use an async iterator when reading from a readable stream:
import * as fs from 'fs';

async function logChunks(readable) {
  for await (const chunk of readable) {
    console.log(chunk);
  }
}

const readable = fs.createReadStream(
  'tmp/test.txt', {encoding: 'utf8'});
logChunks(readable);
// Output:
// 'This is a test!\n'
It’s also possible to collect the contents of a readable stream in a string:
import {Readable} from 'stream';
import * as assert from 'assert';

async function readableToString2(readable) {
  let result = '';
  for await (const chunk of readable) {
    result += chunk;
  }
  return result;
}

const readable = Readable.from('Good morning!', {encoding: 'utf8'});
assert.equal(await readableToString2(readable), 'Good morning!');
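The same pattern works for any readable stream, including an incoming HTTP request, which ties into the post-data links below. A sketch (assuming a UTF-8 text body) that collects a raw POST body with no third-party libraries:

import * as http from 'http';

const server = http.createServer(async (req, res) => {
  // req is a readable stream, so it is also async-iterable
  let body = '';
  for await (const chunk of req) {
    body += chunk; // Buffer chunks are stringified as UTF-8 by the concatenation
  }
  res.end(`Received ${body.length} characters`);
});

server.listen(3000);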
Further reading:
Streams in Node.js - TutorialsPoint
Parsing POST data 3 different ways in Node.js without third-party libraries
Understanding HTML Form Encoding: URL Encoded and Multipart Forms
File upload Form Data - ClickUp API - Attachment
Node.js - get raw request body using Express
uploading-and-downloading-files-streaming-in-node-js (source of the code above)
pipe-a-uploaded-file-to-a-remote-server-with-node-ideally-with-same-filename
Node.js Streams: Everything you need to know - freeCodeCamp

