Node.js File Operations

Different ways to process a file:

1. Upload the file to disk
2. Store the uploaded file's data in a database
3. Pipe the file from the incoming request to an outgoing request

What is the difference between file reading and streaming?

The first thing to note is that file reading is a fully buffered method, while streaming is a partially buffered method.

Now what does it mean?

Fully buffered function calls like readFileSync() and readFile() expose the data as one big blob. That is, reading is performed and then the full set of data is returned, either in synchronous or asynchronous fashion. With these fully buffered methods, we have to wait until all of the data is read, and internally Node will need to allocate enough memory to store all of the data in memory. This can be problematic – imagine an application that reads a 1 GB file from disk. With only fully buffered access we would need to use 1 GB of memory to store the whole content of the file for reading, since both readFile and readFileSync return the full contents of the file in a single value.

Partially buffered access methods are different. They do not treat data input as a discrete event, but rather as a series of events which occur as the data is being read or written. They allow us to access data as it is being read from disk, network, or other I/O.

Streams return smaller parts of the data (using a Buffer), and trigger a callback when new data is available for processing.

Streams are EventEmitters. If our 1 GB file would, for example, need to be processed in some way once, we could use a stream and process the data as soon as it is read. This is useful, since we do not need to hold all of the data in memory in some buffer: after processing, we no longer need to keep the data in memory for this kind of application.

The Node stream interface consists of two parts: Readable streams and Writable streams. Some streams are both readable and writable.

So what’s the difference here? And what is the difference between fs.createReadStream() and fs.readFile()? Both are asynchronous, right?

Well aside from the fact that fs.createReadStream() directly returns a stream object, and fs.readFile() expects a callback function in the second argument, there is another huge difference.

Yes they are both asynchronous, but that doesn’t change the fact that fs.readFile() doesn’t give you any data until the entire file has been buffered into memory. This is much less memory-efficient and slower when relaying data back through server responses. With fs.createReadStream(), you can pipe() the stream object directly to a server’s response object, which means your client can immediately start receiving data even if the file is 500MB.

Not only this, you also improve memory efficiency by dealing with the file one chunk at a time rather than all at once. This means that your memory only has to buffer the file contents a few kilobytes at a time rather than all at once.

Here are two snippets demonstrating the difference:

const fs = require('fs');
const http = require('http');

// using readFile()
http.createServer(function (req, res) {
    // let's pretend this is a huge 500MB zip file
    fs.readFile('some/file/path.zip', function (err, data) {
        if (err) throw err;
        // the entire file must be buffered in memory before any of it is sent,
        // which can be very slow; there is no streaming here
        res.write(data);
        res.end();
    });
});

// using createReadStream()
http.createServer(function (req, res) {
    // processes the large file in chunks,
    // sending them to the client as soon as they're ready
    const stream = fs.createReadStream('some/file/path.zip');
    stream.on('error', function () {
        res.statusCode = 500;
        res.end();
    });
    stream.pipe(res);
    // this is more memory-efficient and responsive
});

Google Search Key : difference between filesystem and stream in nodejs

Understanding HTML Form Encoding: URL Encoded and Multipart Forms
The encoding type of a form is determined by the enctype attribute. It can take three values:

application/x-www-form-urlencoded – Represents a URL encoded form. This is the default value if the enctype attribute is not set.

multipart/form-data – Represents a multipart form. This type of form is used when the user wants to upload files.

text/plain – A form type introduced in HTML5 that, as the name suggests, simply sends the data without any encoding.

One remaining question: how can we get a file’s contents without parsing the request body into a string? Parsing it does work, but it takes far too long to complete for large files.

How to disable Express BodyParser for file uploads (Node.js)

get raw request body using Express

Release 1.15.2 of the body-parser module introduces raw mode, which returns the body as a Buffer. By default, it also automatically handles deflate and gzip decompression. Example usage:

var bodyParser = require('body-parser');
app.use(bodyParser.raw(options));

app.get(path, function(req, res) {
  // req.body is a Buffer object
});

Node.js – get raw request body using Express

By default, the options object has the following default options:

var options = {
  inflate: true,
  limit: '100kb',
  type: 'application/octet-stream'
};

If you want your raw parser to parse MIME types other than application/octet-stream, you will need to change it here. It also supports wildcard matching such as */* or application/*.

Parsing post data 3 different ways in Node.js without third-party libraries

Using async/await with a forEach loop
Sure, the code does work, but it probably doesn’t do what you expect it to do. It just fires off multiple asynchronous calls, and the printFiles function returns immediately after that.
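A small sketch of the pitfall (the items and the fake async work stand in for real file reads):

```javascript
// forEach does not await its async callback, so the line after the
// loop runs before any of the items have been processed.
const order = [];

async function processItem(item) {
  // stand-in for real async work such as fs.readFile
  await Promise.resolve();
  order.push(item);
}

async function printAll() {
  [1, 2, 3].forEach(async (item) => {
    await processItem(item);
  });
  order.push('done'); // runs immediately, before any item is pushed
}

printAll();
```

Once all pending microtasks settle, `order` is `['done', 1, 2, 3]`: the loop body did run, but only after the surrounding function had already moved on.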

Reading in sequence
If you want to read the files in sequence, you cannot use forEach. Use a modern for … of loop instead, in which await works as expected:

async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}

Reading in parallel


If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls does return a promise, but you’re throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you’ll get with Promise.all:

async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }));
}

what-is-difference-between-file-reading-and-streaming

nodejs_file_system

node-js-streams-filestream-pipes-read-writestream guru99

node-js-buffers – Tutorialkart

node-fs Tutorialkart

streaming-an-uploaded-file-to-an-http-request

uploading-and-downloading-files-streaming-in-nodej using database

multer-single-file-upload-with-express-and-node-js
