The Adaptive Use Instruments Project
Recently I bumped into composer and performer Pauline Oliveros (PO) in San Diego. We got to talking about one of her current projects, the Adaptive Use Musical Instruments for the Physically Challenged. This project introduces software designed for use in therapy sessions, giving children with limited motor skills the opportunity to participate in music and offering them an outlet for musical expression. I arranged a follow-up interview by email so that we could learn more about what the project involves. Joining us is Zevin Polzin (ZP), the project's technical lead.
How did the Adaptive Use project get off the ground?
PO – I have a longtime friend, Leaf Miller, who is a drummer and certified occupational therapist. For years we talked about how to use technology to help children with severely limited volitional movement play in her drum class at REHAB Programs North Road School. I finally secured a small grant from the Malcolm S. Morse Foundation to begin.
In January 2007, I assembled a team together with Don Millard, director of the Academy of Electronic Media at RPI (where I teach), RPI student researchers, and the staff of Deep Listening Institute, Ltd. (a non-profit arts organization I founded in 1985).
I asked Leaf for three students with the least volitional movement to begin the project. These students are confined to wheelchairs, cannot speak, and have some controllable head movement. Our team did a site visit to assess the situation and the technology in use at the school. It was clear that we could do something not only to help the students but also to help the therapists.
How did you go about creating the software interfaces for the kids in the first place?
PO – Zane Van Duzen, an undergraduate at the time, came to the next session with a camera tracker programmed in Max/MSP/Jitter on his MacBook Pro. Most of this patch is based on the cv.jit library developed by International Academy of Media Arts & Sciences and Jean-Marc Pelletier.
Van Duzen’s prototype showed the student’s image on the computer screen, with a marker placed on their nose that moved within an object zone. By crossing a line with very little lateral head movement, the student could trigger a digital switch, programmed in this case with a snare drum sound (any sound can be assigned). The object zone is, of course, adjustable.
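The original patch is built in Max/MSP/Jitter, but the core "digital switch" logic is simple enough to sketch in a few lines of Python. This is purely illustrative, not the project's actual code: a trigger that fires once on the frame the tracked marker crosses a threshold line, rather than retriggering while the marker stays past it.

```python
# Illustrative sketch (not the actual Max/MSP patch): an edge-triggered
# "digital switch" that fires when the tracked marker crosses a line.

class LineTrigger:
    """Fires a sound only on the frame the marker crosses the threshold."""

    def __init__(self, threshold_x, sound="snare"):
        self.threshold_x = threshold_x  # adjustable, like the object zone
        self.sound = sound
        self.was_past = False

    def update(self, marker_x):
        is_past = marker_x > self.threshold_x
        fired = is_past and not self.was_past  # rising edge only
        self.was_past = is_past
        return self.sound if fired else None

trigger = LineTrigger(threshold_x=0.5)
events = [trigger.update(x) for x in [0.3, 0.45, 0.55, 0.6, 0.4, 0.7]]
# Only the two crossings fire; holding the head past the line does not retrigger.
```

The edge-triggered design matters for this use case: a student who cannot quickly return to a resting position should not accidentally produce a drum roll by holding one pose.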
Additionally, the patch had a virtual keyboard programmed with a blues scale. The keyboard could also be played with lateral head movement. The Max patch was an immediate success and allowed the three children to improvise melodies. They made music for the very first time in their lives. Needless to say, to experience that moment with each child was very moving. Everyone present was changed by those moments.
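One plausible way to realize a head-played keyboard like the one described, sketched here in Python rather than Max: normalize the tracker's horizontal position to the range 0–1 and map it onto the degrees of a blues scale. The root note and octave range are assumptions for illustration, not details from the project.

```python
# Hypothetical sketch of the virtual-keyboard mapping: lateral head
# position (normalized to 0..1 by the tracker) selects a blues-scale note.

BLUES_SCALE = [0, 3, 5, 6, 7, 10]  # minor blues scale, semitones above the root

def position_to_midi(x, root=60, octaves=2):
    """Map normalized horizontal position x in [0, 1] to a MIDI note number."""
    degrees = len(BLUES_SCALE) * octaves
    index = min(int(x * degrees), degrees - 1)  # clamp x == 1.0 to the top key
    octave, degree = divmod(index, len(BLUES_SCALE))
    return root + 12 * octave + BLUES_SCALE[degree]

print(position_to_midi(0.0))  # leftmost position -> the root (MIDI 60, middle C)
```

Because the scale is quantized, every reachable head position lands on a "good" note, which is what makes immediate melodic improvisation possible.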
Questions and comments from the therapists helped us to advance our work together. Which colors to use for the interface, the positioning of the camera, the attack time of samples in relation to movement, resolution, the length of sessions: these are all important details. There is new information in every session.
Why motion tracking?
ZP – We’ve really been focused on motion tracking for the time being, because it means that there are no moving parts or things to attach to the kids. Their movement ranges can vary day-to-day depending on a number of factors, so the software can be programmed to accommodate that.
Most existing tools presuppose a full range of movement, or more complicated gestures than these children are capable of. The real meat of this project is amplifying very small, limited movements into real musical expression. The only gestures these kids have made previously to communicate are all binary — yes/no answers, or using physical switches to turn music on or off. Opening up a whole gradient of movement is new to them and is something we’re in the middle of exploring.
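Since each student's movement range varies day to day, one way to "amplify" a very small range into a full expressive gradient is to calibrate at the start of each session and rescale. The following Python sketch is an assumption about how such a mapping could work, not the project's implementation; the function names are invented for illustration.

```python
# A sketch of amplifying a small movement range: calibrate each session
# to the student's observed range, then map it to a full 0..1 gradient.

def calibrate(samples):
    """Return the observed min/max from a short calibration recording."""
    return min(samples), max(samples)

def to_gradient(value, lo, hi):
    """Rescale a raw tracker coordinate into a full 0..1 expressive range."""
    if hi == lo:
        return 0.0
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

# A few centimeters of head motion become the entire control range:
lo, hi = calibrate([0.48, 0.50, 0.53, 0.51])
print(round(to_gradient(0.505, lo, hi), 3))  # mid-range movement -> 0.5
```

The key idea is that the software adapts to the student rather than the reverse: the same tiny motion that reads as almost nothing in absolute terms becomes a full sweep of a musical parameter.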
Tell us about the kinds of things which have happened in sessions.
PO – Early on, we upgraded the patch so that four different drum sounds could be triggered with slight side-to-side and up-and-down head movements. With this increase in possibilities we had a breakthrough. Leaf played a drum pattern for A, a 16-year-old, and then pointed to her to play. A answered with a pattern of her own using the four sounds. She improvised! Everyone in the room cheered. Without any prompting at all, A understood on her own that she could move quickly to any of the four sounds by staying in the center of the quadrant pictured on the computer screen.
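The four-sound quadrant lends itself to a very small sketch. The Python below is illustrative only, and the particular drum sounds assigned to each quadrant are assumptions; what it shows is why the center of the screen is the strategic resting position A discovered.

```python
# Illustrative four-sound quadrant: the marker's position relative to the
# screen center selects a drum sound. Sound assignments are hypothetical.

SOUNDS = {
    (False, False): "kick",    # left  / lower
    (False, True):  "hi-hat",  # left  / upper
    (True,  False): "snare",   # right / lower
    (True,  True):  "cymbal",  # right / upper
}

def quadrant_sound(x, y, center=(0.5, 0.5)):
    """Pick a sound by quadrant; from the center, every sound is one small move away."""
    return SOUNDS[(x >= center[0], y >= center[1])]

print(quadrant_sound(0.55, 0.45))  # slight right-and-down from center -> "snare"
```

Parking the marker at the center minimizes the distance to all four zones, which is exactly the strategy A worked out on her own.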
In the next session Leaf brought in some students from the drum class. They would play and listen for A. She played! We really felt like we were getting somewhere.
In the most recent session with A, we hooked up larger speakers so that her sounds had as much power and volume as the live drums. In this session A initiated patterns for the others to answer. This improvisation generated a great deal of excitement among all the students and therapists.
G, an 11-year-old, played the virtual keyboard for an hour the first time he tried it and did not want to leave the session. In a later session his mother came in the door and was stunned to hear her son playing music. G’s mother was overcome and could not speak. When she came back another time she was able to participate. She told me that they are a musical family; G’s sisters play flute and clarinet. She was excited and remarked that G was not in his high tone (stiffness) during instruction. We noticed that the students were more and more relaxed in subsequent sessions. G’s mother was very helpful during the rest of the session.
B, a 9-year-old, was often distracted by seeing himself on the computer screen. We learned to remove the picture and leave just the marker as a bright orange spot. Though B’s response time was slower than that of the other two, he could also trigger drum sounds and play the virtual keyboard.
How are these musical activities contributing to the kids’ overall therapy?
PO – As I suspected, opening a creative, empowering portal for these children has also opened the potential for others with more mobility at the school to improvise and create their own music.
We are still in the formative stages of the work. It is definitely improvisational and very creative for all concerned. The important thing is that the three students we selected to work with have all been successful in making music of their own. It is quite easy to see how affected they are by their musical results. There are holistic and therapeutic side effects as well. We want to increase their possibilities for choice, with improvisation as an empowerment for them.
Honing their physical movements with musical feedback gives the students more precise control and a greater ability to learn and interact with their environments. My RPI colleague Curtis Bahn is very adept at translating physical motions into musical phrases using sensors and Max programming. Curtis is one of our technical advisors and is joining the Adaptive Use project. His interest in computer interfaces for the physically challenged is a great asset to this project and its future evolution.
How is the project developing and how do you see its future? Do you have any plans to open source it or make it available for others to use?
ZP – We’re interested in moving away from grids and using relative movement. This will enable us to track more subtle movements and create more sophisticated sounds.
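The shift from grids to relative movement can be sketched simply: instead of asking "which zone is the marker in?", the software asks "how did the marker just move?" The Python below is an illustration of that idea under my own assumptions, not the project's design.

```python
# Sketch of relative-movement tracking: respond to frame-to-frame deltas
# of the tracked marker, so subtle motion anywhere on screen can shape
# the sound, with no fixed zones. Purely illustrative.

class RelativeTracker:
    def __init__(self):
        self.prev = None

    def update(self, pos):
        """Return (dx, dy) since the last frame, or (0, 0) on the first frame."""
        if self.prev is None:
            self.prev = pos
            return (0.0, 0.0)
        dx = pos[0] - self.prev[0]
        dy = pos[1] - self.prev[1]
        self.prev = pos
        return (dx, dy)

tracker = RelativeTracker()
for p in [(0.5, 0.5), (0.52, 0.5), (0.52, 0.47)]:
    dx, dy = tracker.update(p)  # small rightward, then small upward, motion
```

Because deltas are position-independent, a student whose head drifts over the course of a session keeps the same expressive control without recentering.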
We’re working to make the patch more modular and “open source” so other people can participate in the development. However, the underlying patch will never be entirely “open source” — not for commercial reasons but because a large part of the project is developing a methodology with the therapists which greatly influences the design.
The sound-making algorithms themselves are modular and open source, and we hope to provide guidelines on how to program “modules” for this project in the future.
PO – Nan Jia, an engineering student from RPI has created a robot drummer that plays a miniature drum set. We connected it to the camera tracker so the children could play it. The robot drummer is an important direction for this project as my ideal plan is to make it possible for these students to play real instruments with robotic interfaces.
In the future we want to have an Adaptive Use wiki and to invite the participation of others, both to develop the high potential of this project and to meet the needs of the millions of people who would benefit greatly, emotionally and spiritually, from making music.
If you want to know more about the Adaptive Use Instrument or Pauline Oliveros’ Deep Listening Institute, please visit http://www.deeplistening.org/adaptiveuse
Cycling ’74 would like to thank Pauline and Zevin and all the staff on the Adaptive Use project for making this interview possible.