Streams are a flow of data from one part of your code to another. Node.js provides a stream module for creating streams, but using it directly can be cumbersome. Instead, we will demonstrate the power of streams through a familiar application: the fs module, which we already know, uses streams internally. This article shows you why and how to use them.
Previously we looked at file I/O in Node.js. File I/O is a very time-consuming task (for a computer): it requires locating the data on disk, reading its properties, reading the raw bits, interpreting them, processing them, and so on. The file I/O methods we have seen so far work fine when the file is relatively small, say up to 100 MB. If the file is large, the I/O time grows accordingly and can stall the application if care is not taken.
Our current model works as shown above. The file we need to process is first brought into RAM in its entirety. This takes a finite amount of time (say 20 s in the example above). Then the data is processed by some function of ours, which also takes time (say 4 s at 50 MB/s). So after a user clicks a button, they would need to wait a total of 24 seconds before seeing anything. This is not desirable.
Say the file contains an article. Instead of using the method above, we can read it line by line or paragraph by paragraph. Loading the whole file still takes the same total time, but the user sees the first paragraph within a short time (0.12 s), and while they are reading it, the remaining paragraphs arrive one by one. The user perceives the system as fast because their wait time is reduced.
Hope this gives you an idea of how streams work.
In Node.js, streams are event emitters that emit a number of different events, but only a few are frequently used, namely:

- 'data' — emitted by a readable stream when a chunk of data is available to be consumed
- 'end' — emitted by a readable stream when there is no more data to read
- 'error' — emitted when something goes wrong while reading or writing
- 'finish' — emitted by a writable stream when all data has been flushed
Reading using Streams
Running this code prints the complete contents of myTextFile.txt once the stream has finished reading.
We first create a readable stream from myTextFile.txt using the fs module. Whenever a chunk of data is ready to be processed, the stream emits a 'data' event. On every 'data' event we append the current chunk to our string. You can put your processing logic inside this handler so that the data is processed chunk by chunk as it arrives. After the stream has finished reading the file, it emits an 'end' event, at which point we print the accumulated data to the console.
Similar to the readable stream, there is a writable stream. This is very helpful when data arrives over the network and you need to write it to a file: instead of waiting for the transmission to complete before writing, you can write each piece of data to the file as soon as it arrives.
An example of a writable stream:
But in practice you will rarely use streams to write files like this. Instead you will use a combination of a readable and a writable stream to read from a large file and write to another file.
Readable streams also provide a pipe method to simplify exactly this pattern. The example above can be written as:
pipe allows you to transfer chunks directly from one stream to another, and it also handles backpressure for you, pausing the read when the destination cannot keep up.