Buffers and Streams in Node.js

Ckmobile
3 min read · Jan 4, 2021

Streams let us start using data before a file has been fully read.

Normally, there are two ways to read a big file. One is to wait until all of it has been read, but with a large file this can take a long time and uses a lot of memory.

The other is to pass the data bit by bit through a stream. Small chunks of data are packaged up into a buffer and sent down the stream each time the buffer fills.


Inside the folder, we have a large text file.

createReadStream()

We are going to create a stream to read this big file.

The first argument is the path of the file to read from; here it is largetext.txt.

const fs = require('fs');

const readStream = fs.createReadStream('largetext.txt');

readStream.on('data', chunk => {
  console.log('##### new chunk #####');
  console.log(chunk);
});

readStream.on('data', ...) means we are listening for the data event on this read stream; every time we get a chunk of data, the callback function fires.

Result of readStream
readStream.on('data', chunk => {
  console.log('##### new chunk #####');
  console.log(chunk.toString());
});

We can make the output readable by calling the toString() method on each chunk.

After adding toString() method

We can also pass the encoding as utf8 in the second argument. Then, even without calling toString(), each chunk arrives as a readable string.
