Making our LED Wall emotionally aware with IBM’s Watson

It’s Friday afternoon and things are starting to wind down as the weekend approaches, so we decided to add a cool new integration to our MASSIVE LED wall. Jumping on the AI and NLP bandwagon, we’ve made the wall emotionally aware with IBM’s Watson API.

One of our rio API functions takes text (from Twitter, web/mobile forms etc.) and displays it on the wall.

Cool as that is, we decided to go one step further: we now analyse the text and determine whether it reflects anger, disgust, fear, joy or sadness.

How does this magic work?

At a high level, here’s what actually happens:

  1. A request comes through to display text on the wall (from a Twitter stream or an API request).

  2. Our text input processor takes the message and posts it to Watson’s API.

  3. Watson responds with a range of emotions and a score for each; we take the highest-scoring emotion and configure the message to send based on it (see the first sketch after this list).

  4. We draw the text to an HTML canvas in a series of animated frames and then send the color data to our Raspberry Pi firmware to control the LEDs (see the second sketch below).
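
To make steps 2 and 3 concrete, here’s a minimal sketch in TypeScript. The endpoint URL, the credentials and the exact response shape are assumptions (substitute whatever your own Watson service instance uses); the logic is simply to post the message text and pick the emotion with the highest score.

```typescript
import fetch from "node-fetch";

// Assumed response shape: one score per emotion, higher means stronger.
interface EmotionScores {
  anger: number;
  disgust: number;
  fear: number;
  joy: number;
  sadness: number;
}

// Hypothetical placeholders: point these at your own Watson instance.
const WATSON_URL = process.env.WATSON_URL!;
const WATSON_API_KEY = process.env.WATSON_API_KEY!;

// Step 2: post the message text to Watson.
// Step 3: take the emotion with the highest score.
async function dominantEmotion(text: string): Promise<keyof EmotionScores> {
  const res = await fetch(WATSON_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization:
        "Basic " + Buffer.from(`apikey:${WATSON_API_KEY}`).toString("base64"),
    },
    body: JSON.stringify({ text }),
  });
  const scores = (await res.json()) as EmotionScores;
  const entries = Object.entries(scores) as [keyof EmotionScores, number][];
  return entries.reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}
```

From there, the winning emotion is just a key into the wall’s per-emotion display settings.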

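Step 4 can be sketched the same way. EMOTION_COLORS below is a hypothetical palette (pick whatever suits your wall), and the canvas is assumed to be sized to the wall’s pixel resolution; each frame is drawn with the standard canvas 2D API and read back as raw RGBA bytes, which is the color data that goes to the Pi.

```typescript
// Hypothetical palette: one text color per emotion.
const EMOTION_COLORS: Record<string, string> = {
  anger: "#ff2d00",
  disgust: "#7bb341",
  fear: "#8e44ad",
  joy: "#ffd700",
  sadness: "#2d7dff",
};

// Draw one animation frame of the message and return its raw RGBA bytes.
// `ctx` is assumed to come from a canvas sized to the wall's resolution.
function renderFrame(
  ctx: CanvasRenderingContext2D,
  message: string,
  emotion: string,
  offsetX: number, // increases each frame to scroll the text across the wall
): Uint8ClampedArray {
  const { width, height } = ctx.canvas;
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, width, height); // clear the previous frame
  ctx.fillStyle = EMOTION_COLORS[emotion] ?? "white";
  ctx.font = `${height}px monospace`;
  ctx.fillText(message, width - offsetX, height - 1);
  // getImageData returns [r, g, b, a, ...] for every pixel; this flat
  // byte array is what gets streamed to the Raspberry Pi firmware.
  return ctx.getImageData(0, 0, width, height).data;
}
```

Rendering on a canvas keeps all the text layout and animation logic in ordinary web code; the Pi firmware only ever sees a flat byte array per frame.
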
The code to make this all happen has been added to GitHub, so you can try it out for yourself. The Watson integration specifically can be seen here: https://github.com/SolidStateGroup/rio/pull/4/files.