Bangle.js is a hackable, open source smartwatch driven by JavaScript and TensorFlow. If you're new to it, or missed its recent launch at NodeConf EU by NearForm Research and Espruino, you can read the announcement that we published here.
In this first of a series of follow-up posts, we want to give you more of the backstory and the latest details. Of course, as it's all Open Source, you'll find plenty of links to the code too.
Open Health Platform
Bangle.js is an exemplar of what’s possible through European and international collaboration around Open Source. We said in the announcement post that Bangle.js/NodeWatch at NodeConf EU is mostly about fun and experimentation but is backed by very serious intent. NearForm believes deeply in sustainable Open Source, Open APIs and your right to your data. This project has the potential to bootstrap a community-driven open health platform where anyone can build or use any compatible device and everyone owns their own data. I’ll be writing more about this in the coming weeks.
The introductory session on Bangle.js was at NodeConf EU on Nov 11th 2019 and included James Snell using gesture control, based on TensorFlow and Bluetooth HID, to drive his slides (more on this later). Gordon Williams made the important point that six weeks earlier, we had no software running on this particular watch. Gordon also launched his Kickstarter so anyone can get their hands on this device.
Here’s the recording of their talk which had everyone in the room very excited indeed:
And here are the three badges from 2017, 2018 and 2019 together showing the evolution of the idea.
Espruino based conference badges of 2017, 2018 and 2019 at NodeConf EU 2019. Kilkenny, Ireland. November 2019.
NodeWatch and Bangle.js
NodeWatch is the specific implementation of Bangle.js for NodeConf EU 2019, co-developed by Espruino and NearForm Research.
Back to 2018
As soon as NodeConf EU was finished in 2018, we started planning 2019. And we immediately hit the problem: “how the heck do we top that?”. Various ideas fell by the wayside quickly (Belt Buckle Badge ;-) ) and we settled on a smartwatch when we realised all of the new possibilities such a device offered.
I had seen the Espruino community reverse-engineering smartwatches and health bands for several years, so we knew it was very doable. Similarly, back in 2017, I messed with a 3D printed case from a community member, along with a Puck.js, an OLED screen and a NATO strap. The results were not pretty:
But the thing I kept coming back to was the Pebble. I still have mine. It still works. And I was one of many people who were heartbroken when they folded.
I've been wondering if Bangle.js could be its spiritual successor, with functionality created by the community, for the community.
One critical aspect of Bangle.js is that we weren't reliant on a single piece of hardware. As long as we could get Espruino onto a device, it could be a Bangle.js. This led to many months of purchasing various Nordic nRF-based smartwatches and trying them out.
Some were fabulous but impossible to update over the air. Others were easy to hack but badly built and with few features. Finally, we landed on the chosen one: easily hackable, packed with hardware features, solidly built and relatively easy to get our hands on. Here are a few I worked on, including my patent-pending programming jig ;-)
Once the device was selected and Espruino ported, we needed to do a few more things – build a lot of software for it and give it a name. I won’t go into some of the names that were created (scream) but when I landed on Bangle.js, we knew instantly we had a winner.
As I mentioned in the first post, I was ridiculously excited after Google's TensorFlow Dev Summit in March. Not only did Clinic.js get a shout-out during the TensorFlow.js session, but Google announced TensorFlow Lite for Microcontrollers.
The idea of being able to run ML models on $2 Bluepill boards from Aliexpress was astounding to me. I ran over to my hardware desk, grabbed a Bluepill and CP2102 cable, connected it to my Mac and started working through the example. After a bit of toolchain/cross-compiler head-scratching (I used to cross-compile GCC for 68000 variants on Sparcstations in the 90s), it all worked and I had TensorFlow running on a tiny device in rural West Cork in Ireland.
We then brought in the NearForm Research Data Science team to help optimise the port and work on example apps, as simple sine wave examples were not going to be enough for NodeConf EU. Andreas Madsen has been deeply involved in our Clinic.js project: he previously integrated TensorFlow.js into Clinic.js Doctor and created the Node Cephes project, which leverages WASM for special maths functions.
We needed a really meaty example and James Snell came up with the idea of hand/arm gesture recognition. If we could reliably recognise a set of gestures, we could easily connect the watch via Bluetooth HID to a laptop and control Powerpoint, or in fact any app.
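To make that idea concrete, here is a rough sketch of the glue between the model and HID, entirely of my own invention (the gesture names, ordering, threshold and keymap are illustrative, not the team's actual code): the model emits a score per gesture, and anything above a confidence threshold maps to a key press sent to the laptop.

```javascript
// Hypothetical glue between model output and slide control.
// Gesture names, ordering, threshold and keymap are illustrative only.
const GESTURES = ["swipeLeft", "swipeRight", "tap", "none"];

// Pick the highest-scoring gesture, or null if nothing is confident enough.
function classify(scores, threshold) {
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  return scores[best] >= threshold ? GESTURES[best] : null;
}

// Map a recognised gesture to the HID key we would send to the laptop.
const KEYMAP = { swipeLeft: "LEFT_ARROW", swipeRight: "RIGHT_ARROW" };

function gestureToKey(scores) {
  const g = classify(scores, 0.8);
  return g ? KEYMAP[g] || null : null;
}
```

The threshold matters in practice: a presenter's arm is moving constantly, so you only want to act on a gesture the model is genuinely sure about.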
We hit some roadblocks and reached out to the TensorFlow Micro team via long-time friend of NodeConf EU, Myles Borins. As the Microcontroller version of TensorFlow Lite is still marked experimental, we naturally hit a few issues with the docs, the move to TensorFlow 2.0, and various bugs and limitations. Between deep dives into the source code by the team (Open Source FTW) and guidance from the TensorFlow team (thanks to Pete, Ian and Tiezhen!), we quickly got a model working well and detecting gestures. We have also submitted some patches and improvements back to TensorFlow.
Bias and ML
Avoiding bias is always important and the same is obviously true when building Machine Learning models. While you might think that something as simple as clapping is universal, it too is affected by culture. So the team factored that in from the start.
We now have that trained model baked into every Bangle.js at NodeConf EU.
Andreas and James will publish a deep-dive post on the ML aspects of NodeWatch soon. In the meantime, you can read how it is implemented in Espruino and look at the reference docs for Bangle, tensorflow and TFMicroInterpreter.
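Based on those Espruino reference docs, the on-watch side might look roughly like this. Treat it as a sketch rather than the shipped NodeWatch code: the gesture names and key mapping are my own, and I've guarded the hardware calls so the snippet also loads off-device.

```javascript
// Sketch of the on-watch side, based on the Espruino Bangle.js reference
// docs; gesture names and mappings are illustrative, not the actual code.
var kb; // ble_hid_keyboard module, only available on the watch

// Decide which arrow key (if any) a recognised gesture should send.
function keyFor(gesture) {
  if (gesture === "swipeleft") return "LEFT";
  if (gesture === "swiperight") return "RIGHT";
  return null; // ignore anything else
}

if (typeof Bangle !== "undefined") {
  kb = require("ble_hid_keyboard");
  // Advertise as an HID keyboard so a laptop can pair with the watch.
  NRF.setServices(undefined, { hid: kb.report });
  // "aiGesture" fires when the built-in TensorFlow model recognises a gesture.
  Bangle.on("aiGesture", function (gesture) {
    var key = keyFor(gesture);
    if (key) kb.tap(kb.KEY[key], 0);
  });
}
```

The appeal of this split is that the model runs entirely on the watch; the laptop just sees an ordinary Bluetooth keyboard.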
Everything is Open Source and you'll find lots of info over on nodewatch.dev.
Workshop and Google Colab
Gordon and I ran a workshop at NodeConf EU which was split between basic usage of Bangle.js and an intro to Machine Learning on it.
You can follow it step-by-step here and the repo with the gesture models and full Jupyter Notebook for Colab is here.
App “Store” / App Loader
The app loader uses Web Bluetooth to transfer apps to the watch. Some of the initial apps included:
- HID Control for keyboard or music
- Gesture control using HID and TensorFlow Lite
- Heart Rate Monitor
- Plus Codes / Open Location Code
- Custom QR Code
- Text Watch
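For the curious, a loader along these lines can be sketched with the standard Web Bluetooth API and the Nordic UART service that Espruino exposes. The UUIDs below are the standard Nordic UART ones; the function names, chunk size and flow are my own illustration, not the actual loader code.

```javascript
// Sketch of a Web Bluetooth app upload over the Nordic UART service that
// Espruino exposes. UUIDs are the standard NUS ones; everything else is
// illustrative, not the actual app loader implementation.
const UART_SERVICE = "6e400001-b5a3-f393-e0a9-e50e24dcca9e";
const UART_TX = "6e400002-b5a3-f393-e0a9-e50e24dcca9e";

// BLE writes are small, so split the app source into fixed-size chunks.
function toChunks(text, size) {
  const chunks = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}

// Browser-only: prompt for a Bangle.js and stream the app source to it.
async function uploadApp(appSource) {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ namePrefix: "Bangle" }],
    optionalServices: [UART_SERVICE],
  });
  const gatt = await device.gatt.connect();
  const service = await gatt.getPrimaryService(UART_SERVICE);
  const tx = await service.getCharacteristic(UART_TX);
  const encoder = new TextEncoder();
  for (const chunk of toChunks(appSource, 20)) {
    await tx.writeValue(encoder.encode(chunk));
  }
}
```

Because it's all plain Web Bluetooth, the "store" is just a web page: no native installer, no vendor app, just a browser and a watch.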
Again, as everything is Open Source you can submit your own Apps via Pull Request. In the next post, we’ll highlight some of the apps that people have created for it already!
MQTT BLE Bridge
We created an experimental MQTT-BLE bridge that let attendees send and receive MQTT messages over Bluetooth from their Bangle.js, relayed to a Mosquitto server on AWS. It also worked very nicely with Node-RED.
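To give a feel for the shape of such a bridge, here is a hypothetical wire format, not the actual bridge protocol: the watch sends one JSON line per message over its Bluetooth UART, and the bridge process reassembles lines from the incoming byte stream and republishes them to the broker.

```javascript
// Hypothetical wire format for an MQTT-over-BLE bridge: one JSON line per
// message over Bluetooth UART. Field names ("t", "p") are my own invention,
// not the actual NodeWatch bridge protocol.
function encodeMessage(topic, payload) {
  return JSON.stringify({ t: topic, p: payload }) + "\n";
}

// Bridge side: accumulate UART data (which may arrive in arbitrary
// fragments) and invoke onMessage once per complete line.
function makeDecoder(onMessage) {
  let buffer = "";
  return function feed(data) {
    buffer += data;
    let nl;
    while ((nl = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, nl);
      buffer = buffer.slice(nl + 1);
      const msg = JSON.parse(line);
      onMessage(msg.t, msg.p);
    }
  };
}
```

The buffering matters because BLE delivers data in small packets, so a single MQTT message typically arrives split across several notifications.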
In the next post, we’ll highlight some of the delighted responses to Bangle.js and showcase some of the apps and watch faces already created for it.
The NearForm Research and Espruino teams have now created three highly successful conference badges together. From a NodeConf EU perspective, we’re already planning the 2020 device (oh boy!). From a Bangle.js perspective, I think we’re on the cusp of something huge.
Conor O’Neill is Chief Product Officer at NearForm and is responsible for all productization activities and works closely with NearForm’s Open Source and R&D team to evolve the web platform. Some of the projects he has responsibility for in NearForm are Clinic.js and the NodeConf EU Digital badge.
Feel free to connect with him on LinkedIn.