Design Tools for Affective Robots

Collaborators: Paul Bucci, David Marino, Laura Cang, Karon MacLean

As robots enter our daily lives, they need to communicate believably, that is, emotionally and on human terms. Physical agents that gesture, touch, and breathe help humans and robots work together, and even enable applications like robot-assisted therapy. However, making a believable robot takes a great deal of time and expertise: animators must carefully avoid the uncanny valley.

We developed a hardware design system for DIY robots called CuddleBits. Designers and makers can rapidly create and remix CuddleBits to try new forms while experiencing their believability.

We also introduced Voodle (vocal doodling), a design tool that maps vocal features to robot motion. Voodle lets performers act out robot behaviours vocally, allowing them to imbue CuddleBits with a spark of life.
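As a rough illustration of how such a voice-to-motion mapping can work, the sketch below extracts pitch and loudness from a short audio frame and combines them into a normalized position for a 1-DOF motor. The feature choices, constants, and mapping are assumptions for illustration, not Voodle's actual implementation.

```python
# Illustrative sketch of a Voodle-style voice-to-motion mapping.
# All feature choices and constants here are assumptions, not the tool's code.
import numpy as np

SAMPLE_RATE = 16000  # Hz; assumed microphone sample rate


def rms_energy(frame: np.ndarray) -> float:
    """Loudness proxy: root-mean-square amplitude of one audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))


def pitch_autocorr(frame: np.ndarray, fmin=80.0, fmax=400.0) -> float:
    """Crude pitch estimate (Hz): autocorrelation peak within a vocal range."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(SAMPLE_RATE / fmax), int(SAMPLE_RATE / fmin)
    lag = lo + int(np.argmax(corr[lo:hi]))
    return SAMPLE_RATE / lag


def frame_to_motor_position(frame: np.ndarray) -> float:
    """Map one audio frame to a normalized 1-DOF motor position in [0, 1].

    Assumed mapping: pitch drives position, loudness scales its range, so
    louder, higher-pitched vocalizations produce larger, livelier motion.
    """
    pitch = np.clip((pitch_autocorr(frame) - 80.0) / (400.0 - 80.0), 0.0, 1.0)
    loudness = np.clip(rms_energy(frame) / 0.1, 0.0, 1.0)  # 0.1: assumed full-scale RMS
    return float(pitch * loudness)


# Example: a synthetic 200 Hz "hum" stands in for one live microphone frame.
t = np.arange(0, 0.03, 1 / SAMPLE_RATE)
print(frame_to_motor_position(0.05 * np.sin(2 * np.pi * 200 * t)))
```

In a live system, frames would stream continuously from a microphone and the output would be smoothed before driving the motor; the sketch only shows the feature-to-motion pipeline.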

CuddleBit Video


CuddleBit Abstract

Social robots that physically display emotion invite natural communication with their human interlocutors, enabling applications like robot-assisted therapy where a complex robot's breathing influences human emotional and physiological state. Using DIY fabrication and assembly, we explore how simple 1-DOF robots can express affect with economy and user customizability, leveraging open-source designs.

We developed low-cost techniques for coupled iteration of a simple robot's body and behaviour, and evaluated its potential to display emotion. Through three user studies, we (1) validated these CuddleBits' ability to express emotions (N=20); (2) sourced a corpus of 72 robot emotion behaviours from participants (N=10); and (3) analyzed it to link underlying parameters to emotional perception (N=14).

We found that CuddleBits can express arousal (activation), and to a lesser degree valence (pleasantness). We also show how a sketch-refine paradigm, combined with DIY fabrication and novel input methods, enables parametric design of physical emotion display, and discuss how mastering this parsimonious case can give insight into layering simple behaviours in more complex robots.
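To make the idea of parametric behaviour design concrete, the sketch below generates a 1-DOF breathing waveform from arousal and valence parameters. The specific mappings (arousal to breathing rate and depth, valence to waveform smoothness) are illustrative assumptions, not the parameterization published in the paper.

```python
# Sketch of parametric 1-DOF behaviour design under assumed mappings:
# arousal -> breathing rate and depth, valence -> waveform smoothness.
import numpy as np


def breathing_waveform(arousal: float, valence: float,
                       duration: float = 5.0, fps: int = 50) -> np.ndarray:
    """Return normalized actuator positions in [0, 1] over time.

    arousal and valence are each in [-1, 1]. Higher arousal yields faster,
    deeper breaths; lower valence squares off the wave for more abrupt,
    agitated motion (both mappings are assumptions for illustration).
    """
    t = np.arange(0, duration, 1 / fps)
    freq = 0.2 + 0.6 * (arousal + 1) / 2       # breaths per second
    depth = 0.4 + 0.5 * (arousal + 1) / 2      # stroke depth
    base = np.sin(2 * np.pi * freq * t)
    sharpness = 1.0 + 3.0 * (1 - valence) / 2  # >1 makes transitions abrupt
    shaped = np.sign(base) * np.abs(base) ** (1 / sharpness)
    return 0.5 + 0.5 * depth * shaped


# Example: a calm, content breath pattern versus an agitated one.
calm = breathing_waveform(arousal=-0.5, valence=0.8)
agitated = breathing_waveform(arousal=0.9, valence=-0.7)
```

A designer can then explore this small parameter space directly, which is the kind of rapid iteration the sketch-refine paradigm aims to support.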