Signal - Intuitive Gestural Interfaces

2016

Role: Designer, Technologist, Researcher

Thesis Project | Carnegie Mellon University

In the search for more intuitive ways of controlling sound synthesis, the hands can serve as a direct source of control.

 

Signal is a free-hand, gesture-based instrument for sound manipulation and performance that uses two networked Leap Motion controllers. It allows direct control over sound parameters with the hands in 3D space.

By training neural networks, a form of machine learning, the system establishes non-linear relationships between the gestures and the desired sound outputs.
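The architecture of the trained networks isn't detailed here, but the core idea can be sketched as a small feedforward network that maps hand features to synthesis parameters. This is a pure-Python illustration with made-up weights and layer sizes, not the project's actual model:

```python
import math

def forward(x, w1, b1, w2, b2):
    """One hidden tanh layer: maps hand features (x) to synth parameters.

    w1/b1 are the hidden layer's weights and biases, w2/b2 the output
    layer's. The tanh non-linearity is what lets the mapping be
    non-linear rather than a simple scaling of each input.
    """
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Toy example: two hand features in, one parameter out.
params = forward([0.5, -0.2],
                 [[1, 0], [0, 1]], [0, 0],   # hidden layer
                 [[1, 1]], [0])              # output layer
```

In Signal this role is played by Wekinator, which learns such mappings from recorded gesture/output examples rather than hand-set weights.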

 

Finding the Metaphor

Delegating technical details to the algorithm allows the designer's focus to shift toward finding appropriate gestural metaphors for specific synthesis processes.


Intuitive Gesture Design

Taking cues from traditional musical interfaces, Signal divides synthesis functions between the hands, giving each hand a distinct role: precise gestures for the dominant hand and expressive gestures for the supporting hand.

 

The Left Hand

The left hand controls several parameters simultaneously, opening up expressive possibilities. Held in an elevated position, it can perform free-hand gestures that combine various movements. Several gestures are available.

 

Selection Gesture

Using the right hand, the distance between the thumb and index finger determines the size of the subsample window. The position of the thumb also determines the position of the window within the larger sound file. I chose this gesture because the distance between the index and thumb is commonly used to represent size, and within the audio community it is common to scrub through recordings on a horizontal timeline using a window similar to this one.
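As a sketch of this mapping (function and parameter names are hypothetical, not the project's actual code), the pinch distance and thumb position might translate into a subsample window like this:

```python
def subsample_window(thumb_x, index_x, file_len, max_window):
    """Map a right-hand pinch to a playback window within a sound file.

    thumb_x, index_x: normalized (0..1) horizontal finger positions.
    file_len: length of the sound file in samples.
    max_window: largest allowed window size in samples.
    Returns (start, size) in samples. Ranges are illustrative assumptions.
    """
    # Thumb-index distance sets the window size...
    size = min(abs(index_x - thumb_x) * max_window, max_window)
    # ...and the thumb's position places the window on the timeline,
    # clamped so it never runs past the end of the file.
    start = max(0.0, min(thumb_x * file_len, file_len - size))
    return start, size

# A half-open pinch centered mid-file:
window = subsample_window(0.5, 0.75, 44100, 10000)
```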

  • Expressive

    Once in the active space, the left hand can open and close, separating the fingertips from the thumb. This controls the Grain Birth Rate: the interval between the onsets of successive grains. The wider the hand opens, the longer the interval between grains.

  • Rotation

    Rotating the forearm controls the Grain Size, that is, the time it takes a grain to play back from beginning to end. Outward rotation decreases the grain size, while inward rotation increases it.

  • Depth

    Extending the arm forward decreases the Pitch: the farther the hand reaches into the space, the lower the pitch.
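The three left-hand gestures above can be sketched as a single mapping function. The input ranges, units, and output ranges here are illustrative assumptions, not the trained mapping Signal actually uses:

```python
def grain_params(openness, roll, depth):
    """Map left-hand features to granular-synthesis parameters.

    openness: 0 (closed) .. 1 (fully open)  -> grain birth interval
    roll:    -1 (outward) .. 1 (inward)     -> grain size
    depth:    0 (near) .. 1 (fully extended) -> pitch (lower when extended)
    """
    # Wider hand -> longer interval between grain onsets.
    birth_interval_ms = 10 + openness * 190
    # Inward rotation -> larger grains; outward -> smaller.
    grain_size_ms = 50 + (roll + 1) / 2 * 150
    # Extending the arm forward lowers the pitch.
    pitch_ratio = 1.0 - 0.5 * depth
    return birth_interval_ms, grain_size_ms, pitch_ratio
```

In the real system these relationships are learned by the neural networks rather than written as fixed linear ramps; the sketch only shows the direction of each mapping.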

 

System

Skeletal hand data is captured in real time with two Leap Motion controllers. openFrameworks converts the hand data into OSC messages, which can be sent to any application that receives OSC. Here they are sent to Wekinator, a machine-learning middle layer, which processes the hand data and forwards its results to Max/MSP for sonification.
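To illustrate the OSC leg of this pipeline, an OSC message can be encoded by hand with only the standard library. The `/wek/inputs` address and port 6448 are Wekinator's documented defaults for incoming features; the feature list itself is a placeholder:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying 32-bit big-endian floats."""
    msg = osc_pad(address.encode())                       # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode())    # type tag string
    for f in floats:
        msg += struct.pack(">f", f)                       # arguments
    return msg

def send_hand_features(features, host="127.0.0.1", port=6448):
    """Send one frame of hand features to Wekinator over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/wek/inputs", *features), (host, port))
    sock.close()
```

In practice openFrameworks' own OSC addon handles this encoding; the sketch just shows what travels over the wire between the stages of the system.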

 

Framework

Although technologies change, this work exemplifies how a technology-agnostic framework for interaction can be established. By focusing on relevant metaphors rather than the affordances of a particular technology, we can achieve rich and intuitive interactions.