It ain't what you say, it's the way that you say it

Coordinator – Rob Clark

 Outline - This exhibit explores manipulating the pitch and timing of synthetic speech as it is being synthesised. Body motion, specifically hand movement, is tracked by a Kinect sensor and converted into instructions on how to modify the pitch (vertical movement of the right hand) and speed (vertical movement of the left hand) of the speech as it is generated. While this is fun to experiment with and enjoyable to watch, the exhibit is intended to highlight how difficult it is to give the user real-time control over speech in a way that makes a natural-sounding result easy to produce.
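The hand-to-parameter mapping described above can be sketched roughly as follows. This is an illustrative sketch, not the exhibit's actual code: it assumes hand heights arrive already normalised to [0, 1] from the Kinect skeleton data, and that pitch and speed are varied geometrically between half and double their neutral values (both hypothetical choices).

```python
def hand_to_scale(height, lo=0.5, hi=2.0):
    """Map a normalised hand height in [0, 1] to a multiplicative factor.

    Geometric interpolation is used so the mid position (0.5) maps to a
    factor of exactly 1.0, i.e. no modification of the synthesised speech.
    """
    height = min(max(height, 0.0), 1.0)  # clamp out-of-range tracking values
    return lo * (hi / lo) ** height


def control_frame(right_hand_y, left_hand_y):
    """Per-frame control values: right hand drives pitch, left hand speed."""
    return {
        "pitch_factor": hand_to_scale(right_hand_y),
        "speed_factor": hand_to_scale(left_hand_y),
    }
```

In a real pipeline these factors would be fed each frame to the synthesiser's pitch-shifting and time-stretching stages; the geometric mapping keeps equal hand movements perceptually comparable in both directions.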

 Outcome - The exhibit is the result of collaborative work between The Centre for Speech Technology Research at the University of Edinburgh and The Institute of New Media Art Technology, University of Mons, Belgium. The broader goal of the research is to develop interfaces for controlling speech synthesis so that the expressiveness of synthetic speech used in conversation can be varied in real time, as it is being produced, in reaction to changes in the direction of the conversation and in the environment where it is taking place.