Making Promises safer in Node.js
By Matteo Collina

Promises (available from Node.js v4.0.0 onwards) can be a powerful choice for a project, but before buying into them there are some pitfalls to be aware of.

With EventEmitter, and anything built on top of it such as streams, developers are used to Node.js exiting if an 'error' event is emitted without any listener to handle it. In that case, EventEmitter throws the error, and the process emits a global uncaughtException event. This event is emitted by process, in general, whenever an exception propagates to the top without being handled.

A simple example:

const fs = require('fs')

process.on('uncaughtException', (err) => {
  console.log(`Error: ${err.message}`)
})

const file = fs.createReadStream('non-existent-file.md')
file.pipe(process.stdout)

Running this code will output Error: ENOENT: no such file or directory, open 'non-existent-file.md'. This is because fs.createReadStream returns a stream.ReadStream instance, which is in turn an events.EventEmitter; when the stream emits an 'error' event with no listener attached, uncaughtException is emitted.

Internally to Node, when there’s a fatal exception V8 calls an onMessage callback function in the C++ layer. This function is provided in src/node.cc and calls another C++ function (FatalException), which in turn calls a JavaScript function (process._fatalException) that is created in lib/internal/bootstrap_node.js, which finally calls process.emit('uncaughtException') when the error goes unhandled.

Thrown errors are also emitted because of integration between EventEmitter and V8:

const fs = require('fs')

process.on('uncaughtException', (err) => {
  console.log(`Error: ${err.message}`)
})

const file = fs.createReadStream('non-existent-file.md')
file.pipe(process.stdout)

const err = new Error('Random error')
throw err

Currently, Node.js gives unhandled promise rejections a little more leeway. If a rejection happens and there’s no .catch(fn) handler attached, the runtime prints a warning to the console without crashing:

(node:10044) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

In order to use promises successfully, a rejection handler (.catch(handleFn)) should always be used, and should always be attached synchronously.
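For example, the following sketch attaches its .catch() handler in the same tick the promise is created, so the rejection can never go unhandled (mightFail is a hypothetical helper invented for this example, not a Node.js API):

```javascript
// mightFail is a hypothetical helper that returns a promise which
// either resolves or rejects, depending on its argument.
function mightFail(shouldFail) {
  return shouldFail
    ? Promise.reject(new Error('operation failed'))
    : Promise.resolve('operation succeeded')
}

mightFail(true)
  .then((result) => {
    console.log(result)
  })
  .catch((err) => {
    // Attached synchronously, so the rejection is always handled
    console.log(`Handled: ${err.message}`)
  })
// Prints: Handled: operation failed
```

Deferring the .catch() call, for example inside a setTimeout, leaves a window in which the runtime may flag the rejection as unhandled before the handler is registered.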

The Importance of Handling Rejections

Oftentimes any extra logic for handling errors in an application is seen as unnecessary, but properly dealing with exceptions is crucial for any application’s security, efficiency, and performance. Many asynchronous operations leave file descriptors lying around or use a significant amount of RAM, and it’s important to clean up when they fail in order to avoid memory leaks and other denial-of-service situations.

Take this example server:

const http = require('http')

const server = http.createServer(handler)
server.listen(3000)

const queue = []

function handler(req, res) {
  const url = req.url

  generateBuffer(url)
    .then((reply) => {
      // Use a buffer and clean it up
      queue.pop()
    })
    // No rejection handler

  res.end('OK')
}

function generateBuffer(url) {
  // Create a buffer 1 GB in size
  queue.push(Buffer.alloc(1073741824, 1))

  if (url === '/') {
    return Promise.resolve()
  }

  const err = new Error(`Rule for ${url} not found`)
  return Promise.reject(err)
}

If this server receives a request for /, it will respond and clean up the buffer as expected. For all other requests, two things happen:

  • There will be a warning in the console: (node:32236) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 2): Error: Rule for /blog not found
  • The requests don’t clean up after themselves, so more and more RAM is used. Unless the path is /, generateBuffer() returns Promise.reject(...) and the request handler has no .catch(...), so cleanup is never performed.

The behaviour can be reproduced by testing the server with ApacheBench (ab):

ab -n 400 http://localhost:3000/crash
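To watch the leak while ab runs, one could log the process’s memory usage periodically. This is a small addition for illustration, not part of the original server:

```javascript
// Log the resident set size once per second; unref() lets the
// process exit normally even while the timer is active.
setInterval(() => {
  const { rss } = process.memoryUsage()
  console.log(`RSS: ${Math.round(rss / 1048576)} MB`)
}, 1000).unref()
```

With the unhandled-rejection bug in place, the RSS figure climbs by roughly a gigabyte per failed request until the process runs out of memory.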

In order to fix these kinds of errors, add a .catch() call to the promise and handle the rejection:

function handler(req, res) {
  const url = req.url

  generateBuffer(url)
    .then((reply) => {
      // Use a buffer and clean it up
      queue.pop()
      res.end('OK')
    })
    .catch((error) => {
      // Clean up the buffer and handle the error
      queue.pop()
      res.statusCode = 500
      res.end(error.message)
    })
}
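On Node.js 8 and later, the same logic can also be written with async/await, where an ordinary try/catch block plays the role of .catch(). This is a sketch of an equivalent handler, reusing the queue and generateBuffer from the example above:

```javascript
// Equivalent handler using async/await. A rejected promise from
// generateBuffer() is thrown at the await, landing in the catch block.
async function handler(req, res) {
  const url = req.url

  try {
    await generateBuffer(url)
    // Use the buffer and clean it up
    queue.pop()
    res.end('OK')
  } catch (error) {
    // Clean up the buffer and handle the error
    queue.pop()
    res.statusCode = 500
    res.end(error.message)
  }
}
```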

Future-Proofing using make-promises-safe

Node.js versions 6 and 8 will continue running after an unhandled promise rejection. In future versions, unhandled rejections will cause the Node.js process to terminate, as per DEP0018.

The best way to ensure that an application is future-proofed is to emulate Node.js’s future behaviour today. Matteo Collina’s make-promises-safe module binds a listener to the process’s unhandledRejection event, causing any unhandled promise rejection to terminate the Node.js process. This is crucial because even when a developer knows to always use .catch(), it is easy to forget to add it every time a promise is used.
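Conceptually, the module boils down to something like the following sketch (the real module also prints the error’s stack trace and offers further options; see its README for details):

```javascript
// A rough sketch of the idea behind make-promises-safe: treat every
// unhandled promise rejection as fatal, emulating the planned
// future behaviour of Node.js.
process.on('unhandledRejection', (err) => {
  console.error(err)
  process.exit(1)
})
```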

Installing make-promises-safe

To install the module, use npm install make-promises-safe --save, which will also make it a dependency of the project.

Usage

Using make-promises-safe is as simple as requiring it in the application’s entry script.

require('make-promises-safe')

The application’s code, along with any external modules, will be bound to this new behaviour. To test how the application fares with this behaviour, consider running the unit tests; any new failures most likely point to promises without a .catch() handler.

Conclusion

Node.js is moving towards treating unhandled promise rejections the same way it treats uncaughtException errors. Soon, Node’s behaviour will be to terminate with a stack trace whenever an unhandled promise rejection occurs. With the make-promises-safe module, developers can adopt that behaviour today. This promotes best practice by requiring every promise rejection to be handled with .catch().
