Designing "The Bionic Harpist"

Longitudinal participatory research on interface design for professional performance.

A woman sits in the middle of a large but messy research studio, playing a concert harp equipped with two electronic controllers fitted on either side of the harp strings. They rise up at an angle from the soundboard, and each has a layout of knobs, buttons, and sliders. The units are made of red 3D printed frames and black acrylic panels. A large amount of research equipment is strewn around the room: computers, black cases, a violin bow, and other miscellaneous objects. Rehearsing with the Bionic Harp. CIRMMT, Montréal, Canada.

The Bionic Harpist is an ongoing project centered on the research and design of augmented control interfaces for a professional concert harpist. The project has produced three iterations of hardware and software that have been, and continue to be, used in performance.


Alexandra Tibbitts performs with the Bionic Harpist controllers in Montreal, QC. 2021.


This project began in 2016 as a design-research collaboration with harpist Alexandra Tibbitts. The goals were twofold:

  1. To design bespoke interfaces for a concert harp to allow the performer to play live solo electroacoustic music entirely on the harp.
  2. To conduct a practice-based longitudinal study of participatory design with — and for — professional performers.

The work to date has included:
  1. A motion capture study and analysis of movement and gesture in harp performance.
  2. Design and implementation of a wireless motion gesture system for augmenting instrumental performance.
  3. Design and fabrication of hardware interfaces that physically attach to the harp. Two versions have been produced, both used regularly in professional performances.
  4. Evaluation of participatory co-design methods towards successful adoption and long-term use.

I: Researching harp gesture

The project began with a motion capture study of harp performance. Eight harpists performed excerpts of harp music in a variety of expressive styles while being recorded in a motion capture studio. Analysis of the performances revealed both instrumental (sound producing) and ancillary (non-sound producing) gestures. Using these findings as a guide, we experimented with gestures for processing and modulating sound (sampling, looping, and controlling audio effects).
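As a rough illustration of the kind of comparison this analysis enables, per-axis amplitude and variability can help separate consistent sound-producing movement from freer expressive motion. The sketch below uses invented placeholder traces, not the study's actual recordings:

```python
from statistics import pstdev

def axis_profile(trace):
    """Amplitude (range) and variability (std dev) of one position axis."""
    return {"amplitude": max(trace) - min(trace), "stdev": pstdev(trace)}

# Hypothetical left-hand position traces (metres) for two harpists playing
# the same excerpt: x = side-to-side movement, z = vertical movement.
harpist_a = {"x": [0.00, 0.04, 0.08, 0.04, 0.00], "z": [1.00, 1.01, 1.00, 1.01, 1.00]}
harpist_b = {"x": [0.00, 0.02, 0.04, 0.02, 0.00], "z": [1.00, 1.10, 0.95, 1.12, 1.00]}

for name, traces in [("A", harpist_a), ("B", harpist_b)]:
    x, z = axis_profile(traces["x"]), axis_profile(traces["z"])
    print(f"harpist {name}: x amplitude {x['amplitude']:.2f} m, "
          f"z amplitude {z['amplitude']:.2f} m")
```

With data like this, similar x-axis profiles across performers but widely varying z-axis profiles would mirror the pattern described in the figure caption below.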

An animated gif showing two synchronized videos side by side. On the right, a woman sits in a music studio playing a harp. On the left, the same movement is shown in a 3D reconstruction, with the performer and instrument drawn as points and connecting lines.
Synchronized motion capture reconstruction of harp performance
Comparing harpists' left-hand movement while playing the same excerpt. The X-axis plots on the left (the harpists' side-to-side movement) show consistent movements with different amplitudes. The Z-axis plots on the right (vertical movement) show more expressive variation between performers.

Working with another collaborator, we developed wearable hardware motion controllers and a performance interface that would map the harpist’s movements to standard music software over a wireless network.
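A mapping layer like this is often built on Open Sound Control (OSC) over UDP. As a minimal sketch (not the project's actual implementation; the address, port, and gesture parameter below are hypothetical), a sensor reading can be packed into an OSC message and sent to the music software:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    # Address pattern, type tag string (",f" = one float), big-endian float32.
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_gesture(sock: socket.socket, host: str, port: int, accel_x: float):
    """Send one wrist-accelerometer reading to the music software."""
    sock.sendto(osc_message("/harp/gesture/x", accel_x), (host, port))

# Usage (hypothetical receiver listening on the local network):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_gesture(sock, "192.168.1.20", 9000, 0.42)
```

Standard music software that speaks OSC (or a bridge translating OSC to MIDI) can then map the incoming values to sampling, looping, and effect parameters.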

The system was put to use in a new performance for solo harp and electronics.

A woman seated on stage playing a concert harp. A computer sits on a table next to her and a music stand is in front of her. Visible on the back of the performer's hand is a small device, which is the wireless gesture acquisition device.
Tibbitts performs with the gesture controllers

This early work was presented at the MOCO (Movement and Computing) and ICLI (Live Interfaces) conferences.1, 2

II: Co-designing hardware controllers

After using the controllers for a period of time, we identified a number of limitations to the system and developed a set of design specifications for a new harp interface:

  1. Physically augment the harp (vs. open-handed gesture)
  2. Integrate simply into the harpist's normal performance workflow
  3. Attach with non-permanent, non-damaging hardware
  4. Offer an ergonomic, non-invasive design that affords natural expressive performance

Tibbitts and I co-designed the interfaces with a participatory approach that included ideation and sketching, prototyping at multiple levels of fidelity (non-functional to functional), CAD design and fabrication, testing, customization, and finally, performance.

Prototyping workflows

Two images side by side. On the left, a paper prototype in the shape of two panels matching the soundboard of a harp, with cut-out buttons and other controls attached. On the right is a 3D CAD model of the same layout in a semi-realistic image.
Non-functional paper prototypes into CAD models
Two images side by side. On the left is a notebook with a sketched GUI containing buttons, knobs and sliders. Next to it is an iPad with the same GUI displayed on the screen. The righthand image is the same GUI represented as a semi-realistic 3D CAD image.
From sketches and CAD to functional digital interfaces
There are three images. On the left, a woman sits at a harp, holding two black cardboard rectangles to the harp soundboard on either side of the strings. This represents the approximate location of the controllers. The second image shows a close-up of the woman's hand holding one of the pieces of cardboard at a slight angle, making it easier for the performer to reach. The third panel shows a semi-realistic 3D CAD visualization of the finished controller housing, which follows the same angled shape and rises up from the harp soundboard.
Testing ergonomics with cardboard prototypes


A large and cluttered desk is shown, with stacks of connecting wires, electrical components, 3D printed pieces, and tools to assemble the electronics into controllers.
Workstation for controller assembly

Finished hardware and live performance

📽️ First tests of the completed controllers.

Tibbitts has been using the controllers in her professional performance and touring setup since their creation, including high-profile appearances at the MUTEK festival in Canada, Mexico, and (remotely) Japan.

Alex Tibbitts performing as the Bionic Harpist at MUTEK festival, Montreal, QC.

III: Reflection and iteration

After three years of heavy use, Tibbitts and I began work on an updated version of the controllers for her continuously developing live show. I interviewed Tibbitts to get an understanding of her experiences and needs, as well as to get her perspective on the evolution of the project and continued work.

The interviews provided valuable reflection on this type of design research and gave a clear roadmap for a new controller design and build, which was completed in 2022. The new controllers preserve the same basic footprint and hardware architecture; however, nearly every component and feature has been rethought and redesigned to provide a more functional and robust controller interface that can withstand heavy professional use.

Two hardware controllers sit on a desk. They are made out of black 3D printed frames and black acrylic panels, and feature layouts of buttons, knobs, and sliders.
Version 2 of the Bionic Harp controllers.

Research and creative outcomes

On the creative end, Tibbitts is nearing completion of her first studio album “The Bionic Harpist: Impressions”.

A photo of a woman in a recording studio sitting next to a harp, listening back on headphones. The studio is cozy and full of wood, and there are pianos, amplifiers, and other instruments lining the walls.
Tibbitts at work in the studio.

On the research end, I am preparing a submission for the 2024 New Interfaces for Musical Expression (NIME) Conference that provides an update on our work and offers insights on interface design for professional users. This corresponds with a growing body of research I have carried out around design for professionals and other “extreme” users who place great demands on the technologies they use.3, 4

  1. Sullivan, J., Tibbitts, A., Gatinet, B., & Wanderley, M. M. (2018). “Gestural Control of Augmented Instrumental Performance: A Case Study of the Concert Harp.” Proceedings of the International Conference on Movement and Computing, Genoa, Italy. ↩︎

  2. Tibbitts, A., Sullivan, J., Bogason, Ó., & Gatinet, B. (2018). “A Method for Gestural Control of Augmented Harp Performance (performance).” Proceedings of the International Conference on Live Interfaces. Porto, Portugal. ↩︎

  3. Sullivan, J., Guastavino, C., & Wanderley, M. M. (2021). “Surveying digital musical instrument use in active practice.” Journal of New Music Research, 50(5), 469–486. ↩︎

  4. Sullivan, J., Wanderley, M. M., & Guastavino, C. (2022). “From Fiction to Function: Imagining New Instruments Through Design Workshops.” Computer Music Journal, 46(3). ↩︎

John Sullivan
Postdoctoral researcher

Postdoctoral researcher exploring research through design in the areas of music, movement, dance, and human-computer interaction.