NodeJS - Express Framework

 
Express Framework
      Previously we saw how to build a basic web server using the http module of NodeJS, but that approach has some drawbacks regarding code maintainability. Hence a library/module called ExpressJS was developed for NodeJS, which can handle the routes and handler functions of your server. 

As we said, ExpressJS is a NodeJS module but not a core one. This means we need to install express for our application.

Install ExpressJS by running "npm install express --save" in the terminal in your project directory. That is all we need to do. 

Now we will create the same server we previously built using ExpressJS.
Change your server.js file to:

server.js


"use strict";
const express = require('express');
var app = express();
let PORT = 3000;

app.get('/articles', function(req, res){
    res.end("Code to display articles here");
});

app.get('/stats', function(req, res){
  res.end("Stats will be displayed here once we complete our app");
});

app.use(function(req, res){
  res.end("More logic to be implemented");
});

app.listen( PORT, function(){
    console.log("Server started listening on port: "+PORT);
});

Here we require the express module and create a new instance of express which will be our server. With this app instance, we specify a handler for each URL. 

All handlers should be assigned to routes before app.listen() is called. The order also matters, as the routes are matched from top to bottom. The first handler whose route matches is executed, independent of the handlers that follow.

Restart your server, open your web browser and visit the previous URLs. You should get the same responses as before. 

As with the http module, the handler function receives req, res and an optional next argument which you can use for various purposes. The req object stores information about the request, such as the URL, params, body, etc., while the res object stores information about the response. 

First, a little more about the request object
Two commonly used support modules for ExpressJS are cookie-parser and body-parser. These were a part of ExpressJS 3.0 but were moved into independent modules in version 4.0. Cookie-parser is used to parse the cookies stored on the client and place them in the req.cookies object, while body-parser is used to parse the body of incoming requests (for example, JSON or form data sent in a POST request) and place it in the req.body object.

Here we show the functioning of body-parser. Cookie-parser will be explained when we create a full web app using Angular.
Install these modules by doing
$ npm install body-parser --save
$ npm install cookie-parser --save

After installing, we need to tell the express app to use these modules. Modify server.js to wire these modules into the main app:
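The updated listing does not appear here, but it could look something like the following minimal sketch. It assumes a POST /user route that simply echoes the parsed request body back with res.json(), since that is what we will test with Postman below.

"use strict";
const express = require('express');
const bodyParser = require('body-parser');
const cookieParser = require('cookie-parser');

var app = express();
let PORT = 3000;

// Parse JSON and URL-encoded request bodies into req.body
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
// Parse the Cookie header into req.cookies
app.use(cookieParser());

app.get('/articles', function(req, res){
    res.end("Code to display articles here");
});

// Echo the parsed body back to the client as JSON
app.post('/user', function(req, res){
    res.json(req.body);
});

app.listen(PORT, function(){
    console.log("Server started listening on port: " + PORT);
});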

Till now we have used res.end()/res.send() to send data to the client. If we need to send a JSON object as the response, we can use res.json(), which sends it as JSON (with the correct Content-Type) instead of plain text.

Now, for testing a POST request, we will use an app called Postman. It is available for all platforms and is widely used for this purpose.

Start the app and send a POST request to http://localhost:3000/user with the data of your choice.


An example is shown:

Send this POST request. You should get a response like

The body-parser module parses the POST request body into a JSON object stored in the req.body object. You can then process the fields individually as per the app logic. In the above example we simply display the content of the request body as the response to the request.

This is just the surface of the ExpressJS framework. There is a lot more functionality which we will see in due course of this tutorial. 

TASK
Let's do another simple task: store the data from each POST request and send it back when requested through the GET "/user" endpoint.

Try this yourself. See the solution below when ready.

Solution
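One possible solution is sketched below. It assumes the POSTed bodies are simply kept in an in-memory array and returned as JSON from the GET endpoint:

"use strict";
const express = require('express');
const bodyParser = require('body-parser');

var app = express();
let PORT = 3000;
var users = [];   // in-memory store for the POSTed bodies

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Store every POSTed body
app.post('/user', function(req, res){
    users.push(req.body);
    res.json(req.body);
});

// Return everything stored so far
app.get('/user', function(req, res){
    res.json(users);
});

app.listen(PORT, function(){
    console.log("Server started listening on port: " + PORT);
});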

This is the sample response to the GET /user endpoint after 2 POST requests.

Similarly, you can use the PUT, HEAD, DELETE, GET or OPTIONS request types. Each of these has a different use, as you saw in the article about RESTful APIs.

Web Server in NodeJS

 
    In the previous articles we saw various features of NodeJS. Now it is time to stitch everything together and start building our web server. Before we start, we should know what role the URL plays in a server. 


For this you should know the different parts of a URL:
The domain is a human-readable address of the web page. It is translated into the actual IP address of the server by DNS servers. The route (path) part of the URL decides what to execute on the server. The query and params are extra data which we send via the URL.
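As a quick illustration, here is a small sketch that uses NodeJS' built-in url module to split an example URL (the URL itself is made up) into these parts:

"use strict";
const url = require('url');

// A made-up URL, just to illustrate the parts
var parts = url.parse('http://www.example.com/articles/42?sort=date', true);

console.log(parts.hostname);  // 'www.example.com' -> the domain
console.log(parts.pathname);  // '/articles/42'    -> the route
console.log(parts.query);     // { sort: 'date' }  -> the query, parsed into an object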

Simple Web Server
     To create a basic web server, we will use the http module. This is one of the core NodeJS modules and is used for building web servers. It lets you inspect the routes and params of a request and execute functions accordingly. 

Create a server.js file with the following code:

server.js

"use strict";
const http = require('http');
let PORT = 3000;

let server = http.createServer( function(req, res) {
    res.end("You have visited " + req.url);
});

server.listen( PORT, function(){
    console.log("Server started listening on port: " + PORT);
});


Now run this code, you should see something like this:


Note that the script does not exit and keeps running. To see the server in action, open your favorite web browser and visit http://localhost:3000 or http://127.0.0.1:3000 depending on how your system is configured. You should see something like this:

Now let us understand the code.
     First we require the http module to use it in our application. We then define a PORT on which the server should listen. A port is a communication endpoint in any operating system; various processes interact with each other using ports. When developing your web servers, try to use port numbers greater than 1024, as lower port numbers are reserved by the operating system.

Then we create an HTTP server using http.createServer(), which accepts a callback function that acts as the handler for our server. All the HTTP requests sent to our server will be handled by this callback function. The logic inside this handler function is executed on every request. 

The server.listen() function takes two arguments -- the PORT number and a callback function. The callback function is executed when the process has started and is listening on the specified PORT. It is executed only once in the process lifetime, i.e. it does not run again until the server crashes or is stopped and restarted.

Try changing the last part of the URL:

The req.url property contains the route part of the URL and not the complete URL, as the domain name depends on the system where the server is hosted. 

To add some logic to our server, modify the handler function as:


let server = http.createServer( function(req, res) {
    if(req.url === '/articles'){
        res.end("Code to display articles here");
    }else if(req.url === '/stats'){
        res.end("Stats will be displayed here once we complete our app");
    }else{
        res.end("More logic to be implemented");
    }
});

Restart the server by terminating the node process and starting it again, then visit different routes in your browser:

This is how a Web Server is built on NodeJS. 

This is all well and good, but a large web application will have many routes with different methods and different logic. If we use the http module for such large applications, the code becomes very complicated. In the next article we will see how to use ExpressJS to simplify the server code.


Streams in NodeJS

 
     Streams are flows of data from one part of code to another. NodeJS provides the stream module to create streams, but it is cumbersome to use and understand directly. So we will use an application of streams to demonstrate their power. The fs module, which we already know, uses streams internally. This article shows you why and how to use them.

   Previously we learned about File IO in NodeJS. File IO is a very time-consuming task (for a computer). It requires finding the location on the disk, reading properties, reading the raw bits, determining what they represent, processing them, and so on. The File IO methods we have seen up till now work fine when the file is relatively small, say up to 100 MB. If the file is large, the IO time becomes large, which can stall the system if care is not taken.


Our current model works as shown above. The file we need to process is first brought into RAM as a whole. This takes some finite amount of time (say 20 s in the example above). Then this data is processed by some function of ours, which also takes some time (say 4 s at a 50 MB/s processing speed). So after a user clicks a button, he would need to wait a total of 24 seconds before he can see anything. This is not desirable.

Say the file contains some article. Instead of using the above method, we can read it line by line or paragraph by paragraph. Although loading the whole file still takes the same amount of time, the user can see the first paragraph within a short time (0.12 s), and while he is reading that paragraph the other paragraphs arrive one by one. Hence the user perceives that the system is fast, as his wait time is reduced.

Hope this gives you an idea of how streams work.

In NodeJS these streams are event emitters which emit different events, but only a few are frequently used, namely: 
  • data
  • end
We will use the same myTextFile.txt in our code. You can make that file large so that you can see the effect. 

Reading using Streams
     Change the server.js code to read the file using a stream.


"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');

var string = "";
readable.on('data', function(chunk){
     string += chunk;
});
readable.on('end', function(){
    console.log(string);
    console.log("Finished reading file");
});


Running this code will give you an output like:

       We first create a readable stream from myTextFile.txt using the fs module. Whenever a chunk of data is ready to be processed, the stream emits a 'data' event. On every data event we add the current chunk to our string. You can add your processing logic inside this event so that the data is processed in chunks as it arrives. After the stream has finished reading the file, it emits an 'end' event. So after the end event is emitted, we display the data on the console. 

Similar to the readable stream, there is a writable stream. This is very helpful when some data is transferred over the network and you need to write it into a file: instead of waiting for the data to be transmitted completely before writing, you can write it to the file as soon as it arrives.

Example of a writable stream

"use strict";
var fs = require('fs');
var writeable = fs.createWriteStream('./myTextFile.txt');

writeable.write("This is some new text");
writeable.end();

But in practice you will rarely use streams to write files like this. Instead you will use a combination of a readable and a writable stream to read from a large file and write to another file.

"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');
var writeable = fs.createWriteStream('./myTextFileCopy.txt');

readable.on('data', function(chunk){
     writeable.write(chunk);
});

readable.on('end', function(){
    console.log("Finished copying");
});

Streams also provide a pipe() function to simplify the above pattern. The example above can be written as: 

"use strict";
var fs = require('fs');
var readable = fs.createReadStream('./myTextFile.txt');
var writeable = fs.createWriteStream('./myTextFileCopy.txt');
readable.pipe(writeable);


Pipe allows you to directly transfer the chunks from one stream to another.
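The same idea is handy in a web server: instead of reading a file completely into memory before responding, you can pipe it straight into the response (the res object of the http module is itself a writable stream). A minimal sketch, reusing myTextFile.txt:

"use strict";
var fs = require('fs');
var http = require('http');

var server = http.createServer(function(req, res){
    // Stream the file to the client chunk by chunk instead of
    // loading it completely into memory first
    var readable = fs.createReadStream('./myTextFile.txt');
    readable.pipe(res);   // res is a writable stream too
});

server.listen(3000, function(){
    console.log("Server started listening on port: 3000");
});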

FileIO in NodeJS

 
      In any system, file input/output is a crucial functionality which governs many things. In NodeJS, file IO is required for serving HTML pages, when you want your client to download some file, or even when you create tools with NodeJS other than web servers. Following are some of the file IO methods frequently used by developers.

The examples shown consider the following file structure:

The server.js file will hold the code to execute and myTextFile.txt will contain some message which we will read from or write to. For file IO, NodeJS provides the fs module, which is included in the NodeJS core modules. You do not need to run npm install for the fs module.

myTextFile.txt
Hello..!! I am inside myTextFile.txt                                                                             

Reading a file
      The fs module provides two functions to read files. One is synchronous, i.e. it runs on the main thread (a blocking operation), and the other is asynchronous. Here is the basic syntax for using them:

Synchronous Reading

"use strict";
const fs = require('fs');
var readFileMainThread = function(){
   var content = fs.readFileSync('./myTextFile.txt');
   console.log(content.toString());
};
readFileMainThread();

Asynchronous Reading

"use strict";
const fs = require('fs');
var readFileAsync = function(){
   fs.readFile('./myTextFile.txt', function(err, content){
       console.log(content.toString());
   });
};
readFileAsync();

In both cases, the content returned is in the form of a buffer. To display the actual content you need to convert it into a string. Buffers hold the exact bits in which the data is stored on the computer -- in the form of 1s and 0s. When you print a buffer to the console, it shows hexadecimal numbers whose binary representation is how the data is stored on the disk. 
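To see the difference yourself, you can print the raw buffer and its string form side by side, for example:

"use strict";
const fs = require('fs');

var content = fs.readFileSync('./myTextFile.txt');

console.log(content);             // e.g. <Buffer 48 65 6c 6c 6f ...>
console.log(content.toString());  // the actual text of the file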

Writing a file
    Writing is also similar to reading. The fs module provides a synchronous method as well as an asynchronous method to write something to a file. 

Synchronous Writing

"use strict";
const fs = require('fs');
var writeFileMainThread = function(){
   var content = "This is my new content to write to a file";
   fs.writeFileSync('./myTextFile.txt', content);
};
writeFileMainThread();

Asynchronous Writing

"use strict";
const fs = require('fs');
var writeFileAsync = function(){
   var content = "This content will also be written to a file. But Asynchronously";
   fs.writeFile('./myTextFile.txt', content, function(err){
       if(err){
           // Some error has occurred while writing
       }else{
           // Successful
       }
   });
};
writeFileAsync();

In both cases, the first argument is the path of the file to which the content is to be written. If the file does not exist, it is created automatically. The second argument is the content to be written. Here we pass a string instead of a buffer (which is what is returned when reading). 

Checking Existence of Files and Folders
    Often you are saving some dynamic data into a file. In such cases, if you try to read a non-existent file, your application will crash. To prevent this, it is good practice to first check whether the file exists. Although the fs module provides both synchronous and asynchronous methods to achieve this, the synchronous function is more commonly used. This is how you use it:

"use strict";
const fs = require('fs');
var readFileAfterChecking = function(){
    if(fs.existsSync('./myTextFile.txt')){
       var content = fs.readFileSync('./myTextFile.txt').toString();
       console.log(content);
    }else{
       console.log("File does not exist");
    }
};
readFileAfterChecking();

The fs.existsSync() function returns a boolean value: true if the file exists, else false. It can be used to check the existence of both files and folders; just pass the folder path as the first argument.

Creating a directory
    Sometimes there is a need to create a directory from your program, either in your server or in a command line tool. The fs module provides functions to create a directory on your disk. Again, both synchronous and asynchronous methods are available.

"use strict";
const fs = require('fs');
var createDirectorySync = function(){
    if(!fs.existsSync('./myDirectory')){
       fs.mkdirSync('./myDirectory');
    }else{
       console.log("Directory already exists");
    }
};
createDirectorySync();

The fs.mkdirSync() function will throw an error if the folder already exists. Hence it is advisable to call it only if the directory does not exist.

Please note that to create a folder, its parent folder must exist, i.e. if you are trying to create the /folder1/child1 folder, you need to ensure that folder1 exists. The fs.mkdir function will not create the whole path if it does not exist.
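If you do need a nested path, one option is to create it one level at a time, as in the rough sketch below (the createPathSync() helper is just for illustration; newer NodeJS versions also accept a { recursive: true } option for fs.mkdirSync()):

"use strict";
const fs = require('fs');
const path = require('path');

// Create 'folder1/child1' one level at a time
var createPathSync = function(fullPath){
    var parts = fullPath.split(path.sep);
    var current = '';
    parts.forEach(function(part){
        current = current === '' ? part : path.join(current, part);
        if(!fs.existsSync(current)){
            fs.mkdirSync(current);
        }
    });
};

createPathSync(path.join('folder1', 'child1'));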

Reading Contents of a Directory
The fs module provides two functions to read the contents of a directory. One of them is synchronous and is frequently used with JavaScript's forEach() function. 

Synchronous Method

"use strict";
const fs = require('fs');
var readDirectorySync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdirSync('./myDirectory').forEach(function(content){
           console.log(content);
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectorySync();

Asynchronous Method

"use strict";
const fs = require('fs');
var readDirectoryAsync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdir('./myDirectory', function(err, contents){
           contents.forEach(function(content){
              console.log(content);
           });
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectoryAsync();

The fs.readdirSync() function returns an array containing the names of the files and folders present in the directory. It returns just the names; you have to determine whether a particular name refers to a folder or a file. Once we get the array of contents, we iterate over it using JavaScript's forEach() function.

Checking if file or folder
In the previous function we had a problem: fs does not tell us directly which entry is a directory and which is a file. To find this out you need to use another function, fs.statSync().


"use strict";
const fs = require('fs');
const path = require('path');
var readDirectorySync = function(){
    if(fs.existsSync('./myDirectory')){
       fs.readdirSync('./myDirectory').forEach(function(content){
           var stat = fs.statSync(path.join('./myDirectory', content));
           if(stat.isDirectory()){
               console.log(content + " is a directory");
           }else{
               console.log(content + " is a file");
           }
       });
    }else{
       console.log("Directory does not exist");
    }
};
readDirectorySync();

The fs.statSync() function returns an object with an isDirectory() method, which returns true if the passed path is a folder and false otherwise. Note that fs.statSync() requires the full path of the entry you are testing, otherwise it won't work.

Deleting a file
The fs module provides a function fs.unlink to delete a file (and only a file) from disk. Again it has two variants, synchronous and asynchronous.


"use strict";
const fs = require('fs');
var deleteFileAfterChecking = function(){
    if(fs.existsSync('./myTextFile.txt')){
       fs.unlinkSync('./myTextFile.txt');
       // File deleted
    }else{
       console.log("File does not exist");
    }
};
deleteFileAfterChecking();

If you try to delete a nonexistent file then the script will crash. So be careful.

Deleting a folder
    The fs module has a function fs.rmdir which accepts the path of the directory to be removed as its argument. This function has two variants, synchronous and asynchronous. Note: this function can only delete empty directories. If the directory is not empty, you first need to remove its contents using fs.unlink for files and fs.rmdir on child directories; a recursive sketch follows the basic example below. 


"use strict";
const fs = require('fs');
var deleteFolderAfterChecking = function(){
    if(fs.existsSync('./myFolder')){
       fs.rmdirSync('./myFolder');
       // Folder deleted
    }else{
       console.log("Folder does not exist");
    }
};
deleteFolderAfterChecking();
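Deleting a non-empty folder therefore means removing its contents first. Below is a rough recursive sketch (the deleteFolderRecursiveSync() helper is only for illustration, so use it with care) that combines fs.readdirSync(), fs.statSync(), fs.unlinkSync() and fs.rmdirSync():

"use strict";
const fs = require('fs');
const path = require('path');

var deleteFolderRecursiveSync = function(folderPath){
    if(fs.existsSync(folderPath)){
        fs.readdirSync(folderPath).forEach(function(content){
            var fullPath = path.join(folderPath, content);
            if(fs.statSync(fullPath).isDirectory()){
                deleteFolderRecursiveSync(fullPath);  // delete child folder first
            }else{
                fs.unlinkSync(fullPath);              // delete file
            }
        });
        fs.rmdirSync(folderPath);                     // folder is now empty
    }
};

deleteFolderRecursiveSync('./myFolder');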


That is a lot about file IO in NodeJS. 

Global Events in NodeJS

 
       We have seen what events and event emitters are in NodeJS, and how to create and use our own events. This article exposes you to some of the predefined events in NodeJS which play a crucial role in your application's life cycle. 

Following are some of the global events in NodeJS.

Open the server.js file and try the following code to see the working of each event. The following events are emitted on the process global object. To register a listener for an event, we use:

process.on(<event name>, listenerFunction)
exit
This event is emitted whenever a NodeJS process is terminating, either because the event loop senses that no more work is pending or because the process.exit() function was called explicitly.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;

  if(i==2){
    process.exit();
  }
  
}, 1000);

process.on('exit', function(code){
  console.log("Exiting: "+code);
});


SIGINT
        SIGINT stands for SIGNAL INTERRUPT. This event is triggered whenever the user interrupts a Node process by pressing Ctrl+C (or Cmd+C on a Mac). The listener you define here overrides the default behaviour, i.e. stopping the process. That is why we add a process.exit() inside the listener, or else the process would never stop on an interrupt.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;
}, 1000);

process.on('SIGINT', function(){
  console.log("Exiting process");
  process.exit(0);
});


It will keep printing integers until we interrupt it by pressing Ctrl+C.


On pressing Ctrl+C, our listener is executed, which first prints the message defined in it. After this the process exits because of process.exit().

warning
       This event is emitted whenever the current NodeJS process emits a warning. You can emit a warning using the process.emitWarning() function. This event is used by many database modules whenever some configuration mismatch occurs. You should use it instead of console.log() when developing your own modules.


"use strict";

var i = 0;
setInterval(function(){
  console.log(i);
  i++;
  if(i==3){
    process.emitWarning("i has reached 3");
  }
}, 1000);

process.on('warning', function(){
  console.log("warning received");
});


As you can see, when i becomes 3, a warning is emitted which prints (node:22536) Warning: i has reached 3. After this our listener is called, which prints warning received. The (node:22536) prefix makes the message stand out from your other logs; the number 22536 is the process id which the OS assigns to the NodeJS process, and it will be different every time you run this.

uncaughtException
      This event is emitted whenever a function throws an error that is not caught using a try-catch block. This is a very powerful event, as you can prevent your app from crashing even if an error has occurred. It allows you to perform some custom action with the error, like logging it or sending it to the developer team. If you catch an error yourself, this event is not emitted.


"use strict";

var validString = `{"website": "compiletimeerror.com"}`;
var invalidString = `{"website" =  "compiletimeerror.com"}`;

var validJSON = JSON.parse(validString);
var invalidJSON = JSON.parse(invalidString);

process.on('uncaughtException', function(err){
  console.log(err);
});


As you can see, since invalidString is not valid JSON, NodeJS throws an error when you try to parse it. Since we are not catching this error, it shows up in uncaughtException. Normally an uncaught error crashes the process; since we do not want our server to crash on an error, we can put the code that restarts or keeps the server alive inside this uncaughtException event handler.
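To see that the handler really keeps the process alive, try something like the following sketch, where the error is thrown deliberately from a timer (in a real server this is where you could log the error and decide whether to restart):

"use strict";

// Keep some work running so we can observe that the process survives
var i = 0;
setInterval(function(){
  console.log(i);
  i++;
}, 1000);

// Throw an error outside any try-catch after 3 seconds
setTimeout(function(){
  JSON.parse('{"website" = "compiletimeerror.com"}');  // invalid JSON
}, 3000);

process.on('uncaughtException', function(err){
  console.log("Caught by uncaughtException handler: " + err.message);
  // A real server could log this and restart/recreate its resources here
});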

Now see the same thing with try catch.


"use strict";

var validString = `{"website": "compiletimeerror.com"}`;
var invalidString = `{"website" =  "compiletimeerror.com"}`;

try{
  var validJSON = JSON.parse(validString);
  var invalidJSON = JSON.parse(invalidString);
}catch(err){
  console.log("Some error");
}

process.on('uncaughtException', function(err){
  console.log(err);
});

In this example we surround the parsing calls with a try-catch block. Whenever some error occurs, the try-catch block catches it and prevents the server from crashing. In such cases, since the exception is handled, the uncaughtException event is not triggered.


As you can see, the catch block executes our code and prints "Some error" instead of the error message.

These are the frequently used process events in NodeJS. There are others too, but they are less frequently used. You can find more information about them in the official NodeJS documentation.


Event Loop and Event Emitters in NodeJS

 
      Previously we have seen the thread structure of NodeJS. We saw that there is a single thread which handles client connections and passes heavy tasks on to child threads. In this article we will see how to interact with those threads.

Event Loop
Recall how the concept of threads was used to explain the multi-threaded nature of NodeJS:

“There is a bakery in which customers come and give order to you. You pass on the baking of cakes to different ovens according to their availability. Once a cake is ready you serve it back to the customer who requested it”

In this example, you handle the creation of threads and other process management tasks. In NodeJS terminology, you are the Event Loop. The event loop is a thread which keeps running in the background and checks whether any other threads still have pending work. 

In frameworks like Ruby on Rails or Django, where the server is multi-threaded per client, there is no main thread of this kind. The whole server is split into many threads, but there is nothing like the Event Loop. Imagine you go to a bakery and place an order; the shopkeeper, with his many ovens, instead of baking the cake as a whole, divides it into parts, bakes them separately and then joins them together to form the complete cake. Although baking a single cake may become faster, he handles only one customer at a time, which leaves the other customers waiting for their turn. 

This is the complete opposite of the non-blocking I/O model of NodeJS, and it is the reason why NodeJS can handle many more concurrent connections, as is evident from the following statistics.

Source: hashrocket.com
The graph shows the maximum number of concurrent connections that a server built with different technologies can handle. 

Events
      Again let us go back to the NodeJS bakery. We have the clients, the event loop and the threads. But how will you know when a cake is ready? The answer is simple: the oven beeps an alarm. This alarm, the sound the oven makes when the cake is baked, is called an event in NodeJS.

Events are signals which tell the main thread that a thread has finished executing and its result is ready. Events are used extensively in NodeJS due to its non-blocking architecture; all the network interfaces and database modules emit some sort of event to interact with the main thread.

Now let us create some events.

Create a server.js file with the following contents:

"use strict";

const events = require('events');
const emitter = new events.EventEmitter();
let sayHelloListener = function(){
    console.log("Hello World");
};
emitter.addListener('greet', sayHelloListener);
emitter.emit('greet');

As usual, we need to import the events module to use events. For emitting events we use the EventEmitter class of the events module. We then define a listener. A listener is a piece of code which is executed whenever an event occurs. That means that whenever the event loop hears the 'greet' event, sayHelloListener() will be executed. Finally we emit the 'greet' event. You should see something like this:

NOTE: Once the event is emitted and executed, the event loop still keeps listening for these events until you remove the listener.
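If you no longer need a listener, you can detach it with removeListener(). A small sketch:

"use strict";

const events = require('events');
const emitter = new events.EventEmitter();

let sayHelloListener = function(){
    console.log("Hello World");
};

emitter.addListener('greet', sayHelloListener);
emitter.emit('greet');                              // prints "Hello World"

emitter.removeListener('greet', sayHelloListener);  // detach the listener
emitter.emit('greet');                              // prints nothing now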

This example shows the usage of an event in the main thread. Now let us see an event coming from another thread. For this we will use the setTimeout() function to execute a function in another thread.


"use strict";

const events = require('events');
const emitter = new events.EventEmitter();

let sayHelloListener = function(){
    console.log("Hello World");
};

emitter.addListener('greet', sayHelloListener);

setTimeout(function(){
  emitter.emit('greet');
}, 5000);

console.log("End of script");

Here we emit the 'greet' event in another thread after a delay of 5 seconds.
Initially the output will be:


Observe that the program has not exited as the event loop detected that some thread is still executing. Hence the event loop keeps waiting for the thread to finish instead of exiting.

After 5 seconds, the output becomes


This is the concept of events. There are various applications of events which you will see in the next articles.


Threads In NodeJS

 
        A thread of execution is the smallest sequence of programmed instructions that can be managed independently by a scheduler, which is typically a part of the operating system. Understand it the following way:

Imagine you have to bake some cakes. In the first case, you have a single oven for baking. When an order comes for a cake, you put the batter in the oven and set the timer. When the timer reaches zero you take out the cake and serve it. Now imagine that another order comes while the first cake is still cooking, well obviously you will have to wait till the first one is finished baking before you bake the second one. 

This example is similar to a single-threaded architecture. A single thread can execute only a single task at any instant of time. If another process needs to execute, it has to wait till the thread becomes free.

Now suppose you have 10 ovens. In this case you can bake up to 10 cakes at a time, and it is probable that not all 10 of them are in use at the same time. This is how a multi-threaded architecture works.

That is about threads. But what is NodeJS? Single-threaded or multi-threaded?
Again, go back to the bakery. When a customer comes, does he put the cake in the oven himself? No, right? A customer gives his/her order to you, and you then look for a free oven and start baking. This is the model followed by NodeJS. 

NodeJS has a main process which is responsible for communicating with the clients, taking their instructions, handing work to any free threads from its internal thread pool, fetching the results from those threads, and sending the results back to the clients. 

Now let a customer enter your bakery and give you some task which would take you around 5 minutes to complete. During these 5 minutes, since you are busy with that task, you won't be able to handle other customers or assign new work. This would cause a delay for those customers, who may leave out of frustration.  

Similarly, if you keep NodeJS' main thread busy, it will cause a delay in responding to the clients. This is bad design, as any server should try to minimise response delay. Hence it is good practice to execute long, time-consuming tasks in child threads.

See the working of threads in the following example.

Hello World Application 

In this example, we will display the contents of a file on the console. For this we will be using the 'fs' module of NodeJS, which enables us to perform file I/O.

Create a new project with the files server.js and myTextFile.txt and the following contents:

server.js


"use strict";
const fs = require('fs');
const path = require('path');

var sayHello = function(){
  var str = fs.readFileSync(path.join(__dirname, 'myTextFile.txt')).toString();
  console.log(str);

  console.log("Finished reading file");
};

sayHello();



myTextFile.txt
Hello..!! I am inside myTextFile.txt                                                                                                         

Here the ‘path’ module is used for specifying the address of the text file independent of the operating system. 

When you execute this program, you should see something like this

As expected. No surprises here.
Now let us read the file in another thread.

Modify the sayHello() function above as 

"use strict";
const fs = require('fs');
const path = require('path');

var sayHello = function(){
  fs.readFile(path.join(__dirname, 'myTextFile.txt'), function(err, buffer){
     var str = buffer.toString();
     console.log(str);
  });

  console.log("Finished reading file");
};

sayHello();

Now run and see the output.

Unexpected? You might be wondering why the second console.log() is executed first. Well, that is where threads come into play. 

The code you write always runs on the main thread. So when you call the sayHello() function, you are calling it in the main thread. Inside sayHello(), the first statement calls fs.readFile(), BUT the actual file reading is performed in a separate thread; the logic for this is defined in the core NodeJS modules. You pass a callback function as an argument to readFile(), which is executed after the reading of the file is complete. 

fs.readFile() executes in another thread as described above, BUT the outer console.log() executes in the main thread. That is why it runs while the other thread is still processing. 

In the next article we will show how to create another thread manually.


Callbacks concept in NodeJS

 
     One of the most confusing concepts in programming is heavily used in NodeJS: callbacks. Despite being a tricky concept, callbacks are used in other programming languages as well. This article exposes you to the concept of callbacks. 

Let us understand with the help of another simple application.

Hello World Broadcast
     Till now all the code we saw ran instantly. But if a situation arises where you need to execute some function after a certain delay, then we would have to pause the execution of our program. Let us say we want to display a message every second. Then what you would need to do is pause the program for exactly 1 second before printing the message again. 

One way to do this is to run a for loop with a large counter. If your computer executes one instruction per microsecond, then you would need to run a loop with 1,000,000 iterations. But this would be specific to your machine and won't work as expected on different machines. 

The above functionality can be achieved using JavaScript's setTimeout() or setInterval() methods. Here is how you would do this:

server.js

"use strict";
const message = "Hello World";

setInterval( function(){
  console.log(message);
}, 1000);


Hey, hey! What is this "use strict"; on top? And how can you pass a function as an argument to another function? 

The "use strict"; at the top of the file ensures that common JavaScript mistakes are turned into proper errors. For example, if your code doesn't include this line, then you can accidentally use undeclared variables in your code. 
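For example, try the following short script with and without the first line:

"use strict";

// Without "use strict" the next line silently creates a global variable.
// With it, NodeJS throws: ReferenceError: message is not defined
message = "Hello World";
console.log(message);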

For those coming from C, C++, Python or similar languages, passing a function as an argument might seem like a mistake. But this is a core part of functional programming. JavaScript allows you to pass an entire function as an argument. A function which is passed as an argument to another function is called a callback. This allows you to call the callback function from within the other function without having to import any modules or libraries. 

On running this you should see something like this:


The same thing works in your browser's developer console. 
Open the developer console with Ctrl+Shift+I; you can execute JavaScript code in this console.

Try the following code in the console.


var a = function(callback){
           console.log(12345);
           callback();
       };

a(function(){
  console.log("Hello World");
});

 Here you can see that when you call a(...), it first prints 12345, and after printing this the a() function calls the callback. It is the callback function which prints Hello World to the console.


You can even pass a previously declared function as a callback.

Example:

server.js


"use strict";
var a  = function(callback){
  console.log("Inside function a");
  callback();
};

var b = function(){
  console.log("Inside function b");
};

a(b);


Here we declare two functions a and b which print "Inside function a" and "Inside function b" respectively. Then we pass the already declared function b as an argument to a(). Observe that when passing a function as a callback, we only use the function name. 

i.e. use a(b); and not a( b() );


Of course you can pass multiple callback functions.

"use strict";
var a  = function(callback, callback1){
  console.log("Inside function a");
  callback1();
  callback();
};

var b = function(){
  console.log("Inside function b");
};

var c = function(){
  console.log("Inside function c");
};

a(b, c);



As you can see, what matters is the order in which the callbacks are invoked inside a(), not the order in which the functions are passed as arguments. Even though we pass the function b as the first argument, callback1() is called before callback() inside a(), which is why c() runs before b().

This concept of callbacks is very heavily used in NodeJS. Its use will be demonstrated in the article about Threads in NodeJS.

