AI and Emotion Science

AI sessions were some of the most popular at SXSW this year. A common theme was the difficulty of teaching machines to think like – and interact with – humans, particularly given the messy, conceptual and imprecise way that we communicate with each other.

By now, most people have interacted with a bot or artificial intelligence of some kind. What may seem like a simple transaction with a phone prompt or customer service chatbot is actually a complex array of functions that exists on a continuum, ranging from basic prompt-and-response to complex, contextual language interaction.
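The two ends of that continuum can be sketched in code. The following is a purely illustrative toy (not any real product's logic): a lookup-based prompt-and-response bot, next to a variant that carries context between turns so a vague follow-up question can be resolved.

```python
# Illustrative sketch of the chatbot continuum; all replies and
# keywords are invented examples, not a real system's behavior.

# Basic prompt-and-response: a fixed keyword lookup, no memory.
CANNED = {
    "hours": "We are open 9am-5pm.",
    "returns": "Returns are accepted within 30 days.",
}

def basic_bot(prompt: str) -> str:
    for keyword, reply in CANNED.items():
        if keyword in prompt.lower():
            return reply
    return "Sorry, I didn't understand that."

# Contextual interaction: the bot remembers the last topic, so a
# follow-up like "what about weekends?" makes sense.
class ContextualBot:
    def __init__(self) -> None:
        self.last_topic = None

    def respond(self, prompt: str) -> str:
        text = prompt.lower()
        if "hours" in text:
            self.last_topic = "hours"
            return "We are open 9am-5pm."
        if "weekend" in text and self.last_topic == "hours":
            return "On weekends we are open 10am-2pm."
        return "Sorry, I didn't understand that."
```

The basic bot fails on "What about weekends?" because it has no idea what "what" refers to; the contextual bot resolves it from the previous turn. Real conversational AI sits far beyond both, but the gap between the two classes is the one the article describes.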

Even the most basic AI requires the ability to understand and react to input, but the far end of the spectrum also requires the ability to learn and adapt. It is more natural for humans to converse in the nuanced generalities we have been using for thousands of years than it is for us to adapt to the precise language of machines. We must therefore teach machines to understand the nuanced language of humans.

While massive amounts of data input can sometimes create the illusion of a computer-based conversationalist, computers have not yet achieved anything like parity with humans' facility for language. A toddler can see a cat, hear the word cat, and instantly understand what a cat is. A computer, on the other hand, may need to see a million pictures of a cat to recognize one, yet still will not have an emotional understanding of what 'cat' actually means.

This gap between recognition of an object and recognition of its meaning is a key bridge for AI to cross, especially in a business context, where language alone can fail to help us fully understand consumers and, particularly online, is inadequate to help consumers express themselves. However, this is a front on which many AI scientists are working. Sessions such as 'The Future of Emotional Machines' explored how we are beginning to chip away at this barrier.

In one breakout session, we discussed consumer investment strategies and how we could use AI to better predict an investor's risk tolerance. The big problem is that new investors frequently express a medium or high tolerance for risk, yet run for the hills at the slightest market jitters. We theorized that using emotion science (as we do with MindSight at Isobar) could help tease out the emotional state of an investor, which could then be tracked against actual investor behavior – ultimately predicting that behavior, and perhaps suggesting new ways to ask the right question to get a better answer.
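The idea above can be sketched as a simple model. Everything here – the function name, the weights, the anxiety score – is a hypothetical illustration of blending a self-reported tolerance with an emotion-science signal; it is not Isobar's MindSight methodology.

```python
# Hypothetical sketch: predict panic selling by weighting a measured
# emotional signal more heavily than the investor's self-report.
# All weights and thresholds are invented for illustration.

def predicted_to_panic(stated_tolerance: str, anxiety_score: float) -> bool:
    """Return True if the investor is predicted to sell on a market dip.

    stated_tolerance: 'low', 'medium', or 'high' (self-reported).
    anxiety_score: 0.0-1.0, a hypothetical emotion-science measure of
    the investor's anxiety when shown volatile-market stimuli.
    """
    # Self-reports are weak predictors, so they get the smaller weight.
    stated_weight = {"low": 0.8, "medium": 0.5, "high": 0.2}[stated_tolerance]
    risk_of_panic = 0.3 * stated_weight + 0.7 * anxiety_score
    return risk_of_panic > 0.5

# A "high tolerance" self-report paired with high measured anxiety
# still predicts panic selling:
print(predicted_to_panic("high", 0.9))   # True
print(predicted_to_panic("high", 0.2))   # False
```

The design choice mirrors the session's observation: the stated tolerance gets only 30% of the vote, because it is precisely the signal that fails when markets jitter. Tracking the prediction against real selling behavior would let the weights be learned rather than guessed.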

It feels like a long way off (and it probably is – SXSW positions its view on new technology as 'stuff which is still around two years away'), but AI's potential ability to detect emotional traits, and to learn and respond accordingly, may play a role in helping us make the right decisions sooner than we think.
