
OpenMind

Overview

OpenMind is a proposed interface, created in collaboration with Oblong Industries, that helps children with autism spectrum disorder (ASD) read emotions projected through facial expressions.

Powered by Microsoft Azure's emotion-recognition API, OpenMind displays each user's strongest detected emotion on a screen behind them during conversation, offering an alternative to the "emotion labeling" practice often used with those with ASD.

Role: Project lead, responsible for conceiving and designing the interface. Oblong Industries engineers provided technical assistance in integrating my tool with their model.

Tools: Microsoft Azure (Face Service), Python

Timeline: 6 weeks from ideation to first prototype

Challenge

Oblong Industries is a design-driven software and hardware company whose mission is to put a new user interface on every computer in the world.

For this project, they challenged students to design and prototype a novel interface for their multi-screen, interactive brain model. The only requirements were...

1) A unique interface

2) An applicable use case

 

Fig. 1: A returned dataset from Microsoft Azure's emotion recognition API

Ideation

This project was unique in that I experimented with a bottom-up approach to design. My concept began with Microsoft Azure's emotion recognition API, a tool that I had wanted to tinker with.

The API analyzes the user's expression and returns a confidence score for each emotion in a fixed set, including anger, contempt, happiness, and others (see Fig. 1).
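
To illustrate, here is a minimal sketch of the kind of request OpenMind makes, using the azure-cognitiveservices-vision-face Python SDK. The endpoint, key, and image path are placeholders, and Microsoft has since retired emotion recognition from the Face service, so this reflects the API as it existed at the time:

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder credentials: substitute your own Face resource's values.
    client = FaceClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    # Submit one captured frame, requesting only the emotion attribute.
    with open("frame.jpg", "rb") as image:
        faces = client.face.detect_with_stream(
            image, return_face_attributes=["emotion"]
        )

    # Each detected face carries a confidence score per emotion (see Fig. 1),
    # e.g. {'anger': 0.0, 'contempt': 0.01, ..., 'happiness': 0.94, ...}
    for face in faces:
        print(face.face_attributes.emotion.as_dict())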

 

Research

I immediately thought of my personal experience growing up with my sister, who has ASD. Like many children with ASD, she struggled to read emotions projected through nuanced facial expressions. Families like ours were often advised to practice “emotion labeling” at home, explicitly stating our emotions throughout a conversation (e.g. “I’m happy,” “that makes me feel sad”).

To further develop my concept, I interviewed three children with ASD and their families about their experiences with emotion labeling and found that:

  • It makes it difficult for both parties to have “normal” conversations. The explicit nature of emotion labeling often derails the conversation and keeps it from progressing naturally.

  • It takes agency away from those with ASD. Because the frequency and nature of emotion labeling are determined by the conversation partner, the child’s needs are not always met. All three subjects reported that their conversational partners often under-labeled or over-labeled emotions when engaging in this practice.

  • It restricts learning opportunities to interactions with neurotypical figures, as it relies on someone to monitor the conversation. Two families expressed concern that this practice would result in their kids feeling isolated from others in the ASD community.

 
Fig. 2: An early rendering of OpenMind

Solution

OpenMind pairs Microsoft Azure’s emotion recognition API with Oblong’s large-screen digital brain model to signal changes in emotion.

Designed for two people in conversation, the screens display two 3D models of the brain, one behind each user. Each user faces a webcam linked to Azure's emotion-recognition API, which analyzes the user's expression once per second. When a new dataset is returned, the screen behind each user displays the strongest detected emotion and highlights the area of the brain associated with it.
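
A rough sketch of that loop, assuming OpenCV for webcam capture: client is the FaceClient from the earlier sketch, and update_display() is a hypothetical stand-in for the call that drives Oblong's brain-model display.

    import io
    import time

    import cv2  # OpenCV, assumed here for webcam capture

    camera = cv2.VideoCapture(0)  # one webcam per user in the full setup

    while True:
        ok, frame = camera.read()
        if ok:
            # Encode the frame as JPEG and submit it to the Face service.
            _, buffer = cv2.imencode(".jpg", frame)
            faces = client.face.detect_with_stream(
                io.BytesIO(buffer.tobytes()),
                return_face_attributes=["emotion"],
            )
            if faces:
                # Pick the emotion with the highest confidence score.
                scores = faces[0].face_attributes.emotion.as_dict()
                strongest = max(scores, key=scores.get)
                update_display(strongest)  # hypothetical: highlights the matching brain region
        time.sleep(1)  # the prototype sampled roughly once per second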

This allows someone to easily detect changes in their partner’s emotions without any conversational disruption. The subtle nature of this interface gives agency back to those with ASD, as they can reference their partner’s display according to their own needs. The dual-user setup also allows two people with ASD to have a shared learning experience without the presence of a neurotypical figure.


 

Outcome

I created a working prototype that was presented to Oblong executives and fellow students. Unfortunately, due to the nature of this assignment, I was unable to arrange a playtest with my interview subjects or to iterate on my prototype. If I were to continue this project, a playtest with children with ASD would be the next priority.

Fig. 3: Still from prototype presentation