Handling multiple requests without the previous one blocking the next

I am having trouble handling multiple requests from the client. Each route should, in principle, be handled asynchronously, but I can't get a request handled without the previous one blocking it. In the following code I am using promises with an Express.js router:

  let api = Router();

  api.get('/a', (req, res, next) => {
    console.log('A: get all A');
    getAllA()
      .then(() => console.log('A: finish retrieving A'));
    console.log('A: do something else');
  });

  api.get('/b', (req, res, next) => {
    console.log('B: get all B');
    getAllB()
      .then(() => console.log('B: finish retrieving B'));
    console.log('B: do something else');
  });

So if I request /a and then /b almost instantly after, the output currently is:

// Request /a
'A: get all A'
'A: do something else'
// Request /b
'A: finish retrieving A'
'B: get all B'
'B: do something else'
'B: finish retrieving B'

While I expect it to be something like this:

// Request /a
'A: get all A'
'A: do something else'
// Request /b
'B: get all B'
'B: do something else'
'A: finish retrieving A'
'B: finish retrieving B'

Any idea whether this is expected or there are other ways to handle this?



Solution 1:[1]

A node.js server runs only one Javascript thread of execution at a time. It does not actually process multiple requests at the exact same time. So, if one request hogs the CPU for a while, no other requests will be processed until that first request finishes and gives up the CPU.

But, if your requests do async operations (async file I/O, async networking, set timers to do something in the future, use some third party library that implements some other async operation, etc...), then whenever a request starts an async operation and returns control back to the event queue, then another request can start processing until it finishes or also hits an async operation. Whether it does or doesn't will depend entirely on the timing of things. In this way, multiple requests can be "in-flight" at the same time, even though only one is ever actually running Javascript at a time.

So, in your code, it really depends upon what getAllA() and getAllB() are doing. If they are synchronous operations, then they will block the event loop (and therefore every other request) while they run. If they are asynchronous operations, but they contend for the same resources (such as a database, file or server), then they may still cause some serialization because of the mutual dependency upon the availability of an external resource.
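To see the difference concretely, here is a small, self-contained sketch (not the original code: the getAllA()/getAllB() below are stand-ins that simulate a genuinely asynchronous call, like a database query, with timers). Because the work yields to the event loop, the two simulated "requests" interleave exactly as the question's expected output shows:

```javascript
const order = [];

// Stand-ins for the real getAllA()/getAllB(): each returns a promise
// that resolves after a short delay, like an async database query would.
const getAllA = () => new Promise(resolve => setTimeout(resolve, 100));
const getAllB = () => new Promise(resolve => setTimeout(resolve, 120));

// Simulated request /a
order.push('A: get all A');
getAllA().then(() => order.push('A: finish retrieving A'));
order.push('A: do something else');

// Simulated request /b, arriving before A's async work has completed
order.push('B: get all B');
getAllB().then(() => order.push('B: finish retrieving B'));
order.push('B: do something else');

// Print the observed ordering once both promises have settled
setTimeout(() => console.log(order.join('\n')), 300);
```

If getAllA() were instead a synchronous function that crunched for 100 ms, nothing else could run until it returned, and "A: finish retrieving A" would appear before request /b even started.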

One way to get true parallel request processing in node.js is with clustering. With node.js clustering, you start up multiple node.js processes on the same computer (usually about as many processes as you have CPU cores) and then each node.js process can independently process requests in parallel. If you have shared resources that all the processes in the cluster must access, then you will still have to find a way to share those across processes (a database, a custom central repository of shared info, redis, etc.).

For example, you could see multiple requests in flight at the same time if you did something like this:

let api = Router();

api.get('/a', (req, res, next) => {
    console.log('start of a');
    setTimeout(function() {
        console.log('done with a');
        res.json({status: "ok", msg: "done with a"});
    }, 500);
});

api.get('/b', (req, res, next) => {
    console.log('start of b');
    setTimeout(function() {
        console.log('done with b');
        res.json({status: "ok", msg: "done with b"});
    }, 500);
});

If you then fire off requests for /a and for /b right after one another, you should get this in the server console:

start of a
start of b
done with a
done with b

Both requests were "in-flight" at the same time, though not actually executing Javascript at the same time. Note: I used setTimeout() just to simulate an async operation with a known and controllable response time. That can be any series of asynchronous operations.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
