As controllers for sound synthesis, the keyboard and the mouse are limited. This motivated me to develop two hand-held controllers. This Master's project consists of two parts: the first is the practical development of a digital musical instrument for real-time processing, which I have named the eBoy Instrument; the second is this thesis, in which I document the instrument and present a theoretical evaluation of it.
An iterative methodology provided the framework for the design process, through which two final control concepts were developed. The first concept is easy to use, while the second is harder to master. Analysis of sensor data and sound output showed that the second concept provides a tighter coupling between action and sound. In qualitative and quantitative evaluations, 11 participants with a high degree of musical training selected the second concept as the better solution for controlling sound synthesis.
A theoretical evaluation, supported by observations of the 11 participants, indicates that the eBoy Instrument functions well as a controller for sound synthesis.