The Voice-Controlled Interface for Digital Musical Instruments (VCI4DMI) is a system driven solely by the voice for the real-time control of electronic musical instruments, including sound generators (synthesizers) and sound processors (effects). The VCI4DMI implements a generic and adaptive method for mapping the voice onto the real-valued parameters of digital musical instruments. It is based on several real-time signal processing and offline machine learning algorithms that produce and use mappings between heterogeneous spaces, with the aim of maximizing the expressivity of the interface while minimizing user intervention in the system setup.
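The general idea of mapping a voice feature into an instrument's parameter range can be illustrated with a minimal sketch. This is not the VCI4DMI's actual algorithm: the feature (RMS energy), the calibration strategy (min-max over example frames), and the names `rms` and `FeatureToParamMap` are all simplifying assumptions standing in for the system's learned mappings between heterogeneous spaces.

```python
import math

def rms(frame):
    """Root-mean-square energy of one audio frame (a list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

class FeatureToParamMap:
    """Calibrate a voice feature's range from example frames, then map
    the feature in real time into a target parameter range."""

    def __init__(self, param_lo, param_hi):
        self.param_lo, self.param_hi = param_lo, param_hi
        self.f_lo = self.f_hi = None

    def calibrate(self, frames):
        # Offline step: learn the feature range from user-provided examples.
        feats = [rms(f) for f in frames]
        self.f_lo, self.f_hi = min(feats), max(feats)

    def map(self, frame):
        # Real-time step: normalize the feature and rescale it to the
        # instrument parameter range, clipping outside the calibrated span.
        t = (rms(frame) - self.f_lo) / (self.f_hi - self.f_lo)
        t = min(max(t, 0.0), 1.0)
        return self.param_lo + t * (self.param_hi - self.param_lo)

# Calibrate on a quiet and a loud frame, then map vocal loudness
# to a hypothetical filter-cutoff parameter in the range 100–2000.
quiet, loud = [0.01] * 64, [0.5] * 64
fmap = FeatureToParamMap(100.0, 2000.0)
fmap.calibrate([quiet, loud])
cutoff = fmap.map(loud)
```

The split between an offline calibration pass and a lightweight real-time mapping step mirrors, at a toy scale, the division between off-line learning and real-time signal processing described above.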
Sonoplastic is an audiovisual performance based on gesture analysis to produce and control sounds and images. Musicians' gestures in performance have historically been dependent on the ergonomics and functionality of musical instruments: most of the gesture is devoted to the body parts responsible for activating the exact pitch at the exact spot, with the exact pressure, at the exact timing, and only part of it to the desired expression and meaning. Now that technology opens up new scenarios, it is time for a paradigm shift: the elimination of the dichotomy between the figure of the interpreter (merely intended as body movements mainly aimed at controlling instruments that move air particles) and a corporeal experience and mental representation of movement that generates and elaborates creative processes in a sense-giving activity.