Collect fast Pino logs in an Azure Event Hub
By Dr. Jan-Jan van der Vyver


In our previous article, entitled “Rapid application development on Microsoft Azure”, we demonstrated how easy it is to create a Web App on Microsoft Azure. While we did show how to enable logging locally for a Web App, that is not really practical or useful for a real app.

For real-world applications, we need a logger that gets out of the way. Logging is not the true value of our application. Rather, we need a logger that is fast, to free up the maximum computing cycles for our business logic. And we need to collect the output of that logger (and others).

Pino is fast, very fast


Pino is an obvious choice to increase our application’s performance. Or at least to minimise the impact of logging on our app’s performance.

Pino was created by Matteo Collina of nearForm. Pino logs only in JSON via a binary stream (a file or stdout). Pino outperforms the other existing loggers by up to 6x.

If you are used to Bunyan, Bole, or Winston, in 99% of cases you should just be able to drop in Pino.

Pino plays nicely with Hapi (& Express)

Pino plays nicely with a lot of packages. Keeping in line with our article series we will be using Hapi. Hooking up Pino with Hapi is pretty straightforward using the hapi-pino npm module:

server.register(require('hapi-pino'), (err) => {
  if (err) {
    console.error(err)
    process.exit(1)
  }

  // Start the server
  server.start((err) => {
    if (err) {
      console.error(err)
      process.exit(1)
    }
  })
})
This creates logs on all our routes of the format:

{"pid":11518,"hostname":"local-server","level":30,"time":1499413842512,"msg":"server started","created":1499413842353,"started":1499413842502,"host":"localhost","port":1337,"protocol":"http","id":"local-server:11518:j4tkad9t","uri":"http://localhost:1337","address":"","v":1}

There are two additional ways of logging available to pass down extra information:

// the logger is available in server.app
server.app.logger.info('Pino is registered')

// and through Hapi standard logging system
server.log(['subsystem'], 'standard logging way for accessing it')

To see more examples of pino output:

  1. clone our pino branch
  2. run npm install
  3. run npm run watch
  4. in another tab run curl localhost:1337

Caveat Emptor: Pino && Hapi && TypeScript

Pino also provides server.logger().info('some info') as a form of logging with Hapi. However, if you have a TypeScript project, you are restricted to the type definitions supplied by @types/hapi, and they do not allow this form of logging yet.

Log aggregation with Event Hubs

Now we have a logger working, but a feed on a local machine is not that useful. We need to be able to get this information into a dashboard.

Here we need to take a step back and consider our architecture. We are deploying an app in the cloud, i.e., on a VM or container. Presumably there are multiple instances of your app, and there are instances of other components, e.g., databases. There will also be other services if you are doing a microservices deploy. You might also be collecting client-side or IoT data. Connecting all of these to a dashboard in parallel might overwhelm it. It makes more sense to connect to a log aggregation service. Event Hubs is the obvious candidate, since this is part of our series on Azure.

Microsoft describes Event Hubs as:

Azure Event Hubs is a highly scalable data streaming platform and event ingestion service capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. With the ability to provide publish-subscribe capabilities with low latency and at massive scale, Event Hubs serves as the “on ramp” for Big Data.

To collect our pino logs, we will provision a Web App and an Event Hub, then pipe the logs from the Web App to the Event Hub:


Event Hubs can be used even if part of your architecture is on another cloud or on your own infrastructure. We will make use of this during development below to test the connection from our local machine to an Event Hub.

That said, it is still necessary for us to connect a dashboard by subscribing it to our Event Hub. We will do so in the next article.

Note that when using Event Hub there is a delay of ±10 minutes when viewing data on its dashboard. Keep this in mind as you build your own application.

Provision an Event Hub

In our previous article we demonstrated how to provision a Web App. Assuming you have an Azure account and a deployment user, this simple bash script will get you started:

#!/bin/bash

# Adjust these values to your own deployment; APP_NAME must be unique
GROUP_NAME=nearGroup
LOCATION=westeurope
PLAN_NAME=pinoPlan
SKU=B1
APP_NAME=pino007

az group create -n $GROUP_NAME --location $LOCATION

az appservice plan create -n $PLAN_NAME -g $GROUP_NAME --sku $SKU

az webapp create -n $APP_NAME -g $GROUP_NAME -p $PLAN_NAME

az webapp deployment source config-local-git -n $APP_NAME -g $GROUP_NAME --query url --output tsv

Please remember to replace APP_NAME with a unique name of your choice. Note that while we could deploy a Web App for free, Event Hubs requires at least the Basic 1 (B1) plan.

In case you have not done it before, set your Azure git repository as a remote, using the URL returned by the config-local-git command above:

git remote add azure https://<username>@<app-name>.scm.azurewebsites.net/<app-name>.git

Provisioning Event Hubs is possible using the Azure CLI, but that is somewhat tricky, since we need to use templates. To simplify things for this example, we will use the web interface.
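That said, more recent Azure CLI releases do ship an eventhubs command group; if yours has it, the portal steps below can be approximated from the shell (names as used in this article; flags may vary by CLI version):

```shell
# create the namespace (must be globally unique)
az eventhubs namespace create -n pinoEventHubNS -g nearGroup

# create the Event Hub inside it
az eventhubs eventhub create -n pinoEventHub -g nearGroup --namespace-name pinoEventHubNS

# create a send-only shared access policy
az eventhubs eventhub authorization-rule create -n sendPinoEvent -g nearGroup \
  --namespace-name pinoEventHubNS --eventhub-name pinoEventHub --rights Send
```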

In the Azure Portal, create a (unique) namespace. We used pinoEventHubNS, which will be reachable at pinoeventhubns.servicebus.windows.net (hence why it needs to be unique):



Then create an Event Hub called pinoEventHub:


Here you can see the Event Hub pinoEventHub linked under the namespace pinoEventHubNS:


Then create a Shared Access Policy called sendPinoEvent:



Congratulations, you have provisioned a simple Web App infrastructure, including a log ingestor. More can be done to fine-tune Event Hubs provisioning, but for now this suffices.

Send your first event

To be able to send your first event to an Event Hub, all we need to do is create a Shared Access Signature (SAS).

First, you will need the data from the Shared Access Policy under CONNECTION STRING–PRIMARY KEY, e.g.,


Using Microsoft’s example as a basis, we created a simple function:

const crypto = require('crypto')

function createSignature (uri, ttl, sapk) {
  const signature = encodeURIComponent(uri) + '\n' + ttl
  const hash = crypto.createHmac('sha256', sapk)
    .update(signature, 'utf8')
    .digest('base64')
  return encodeURIComponent(hash)
}

Where ttl is the expiry date of the SAS in Unix time, and uri is simply the URL of your Event Hub, e.g. https://pinoeventhubns.servicebus.windows.net/pinoeventhub.

Armed with your SAS, you can now send an event:

curl -sl -w "%{http_code}" \
  -H 'Authorization: SharedAccessSignature sr=<your-encoded-uri>&sig=<your-SAS>&se=<your-ttl>&skn=sendPinoEvent' \
  -H 'Content-Type: application/atom+xml;type=entry;charset=utf-8' \
  --data '{ "event": "hello world" }' \
  https://pinoeventhubns.servicebus.windows.net/pinoeventhub/messages

Which should return 201, denoting a successful POST.

Piping stdout to an Event Hub

We can now successfully post to an Event Hub; all that remains is to send all the Pino logs through such a posting mechanism. Since Pino logs to stdout, all we need is something like node yourapp.js | eventhub.

What we have is a stream of output from pino. First, we can break those into discrete events using split2. Then, POST these lines in batches (for better performance) to an Event Hub with a writable stream in object mode using https.

Something similar to:

const { Writable } = require('stream')
const https = require('https')
const split2 = require('split2')
const pump = require('pump')

// `options` holds the host, path, POST method, and SAS Authorization header
const writable = new Writable({
  objectMode: true,
  writev: function (lines, done) {
    // batch buffered lines into one POST for better throughput
    const events = lines
      .map(line => `{"UserProperties":${line.chunk}}`)
      .join(',')
    const req = https.request(options, function () { done() })
    req.end(`[${events}]`)
  },
  write: function (line, enc, done) {
    const req = https.request(options, function () { done() })
    req.end(`{"UserProperties":${line}}`)
  }
})

pump(process.stdin, split2(), writable)

We have captured this in an npm module called pino-eventhub. You have already installed this as part of your download, and we can test this on your local machine using:

node build/index.js | ./node_modules/.bin/pino-eventhub -s pinoeventhubns -e pinoeventhub -n sendPinoEvent -a <your-SAS> -x <unix-time-expiry>

To see it working:

  1. run curl localhost:1337
  2. open the Azure Portal
  3. click on pinoEventHubNS (or your equivalent Event Hub namespace)
  4. wait ±10 minutes, and you will see a blip under incoming messages


The alternative: AMQP

Alternatively, AMQP could be used to send the logs to an Event Hub. We stuck to https for simplicity.

Deploying to Web App

We have our basic Web App setup done already. However, to keep in line with Twelve-Factor App development, we want to simplify our npm run start command to node build/index.js | pino-eventhub. We have set up our npm module to take environment variables, so this can be done using the Azure CLI:

az webapp config appsettings set -n pino007 -g nearGroup --settings \
  PINO_EVENT_HUB_NAMESPACE=pinoeventhubns \
  PINO_EVENT_HUB=pinoeventhub
  # plus the policy name, SAS, and expiry settings pino-eventhub expects

Now all we need to do is:

git push azure pino:master
az webapp browse -n pino007 -g nearGroup

Again, you can go to your Event Hubs dashboard on the Azure Portal to see your events arrive.

Alternative: Application Insights

Instead of using Pino with Event Hubs, we could’ve opted for a solution like Application Insights (AI).

According to Microsoft:

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your live Web App. It will automatically detect performance anomalies. It includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It’s designed to help you continuously improve performance and usability.

All the functionality AI ties into looks great. AI, however, couples tightly to your application through monkey-patching. At the moment AI is instrumented for Bunyan, Console, MongoDB, MongoDB-Core, MySQL and Redis. For this application, however, we are looking for the lightest and fastest possible logging solution. Hence, Event Hubs made more sense for us.


Building on our previous article, we wanted to be able to collect the log files from our application. We showed that it is easy to provision an Event Hub on the Azure portal, using Pino as our logger. After creating credentials (using the tool we provided), your Web App can stream to an Event Hub from a single-line command.

In the next article in the series, we will select from the logs collected in an Event Hub, and represent them visually.

