The MEGA project is centered on the modelling and communication of expressive and emotional content in non-verbal interaction through multi-sensory interfaces in shared, interactive Mixed Reality environments.

In particular, the project focuses on music performance and full-body movement as first-class conveyors of expressive and emotional content.

Main research issues are:

- Analysis of expressive gestures
How to recognize the expressive content conveyed through full-body movement and musical gestures?

- Synthesis of expressive gestures
How to communicate expressive content through computer-generated expressive gestures, such as music performances, the movement of virtual as well as real (robotic) characters, and the expressive use of visual media?

- Mapping strategies
How to use data from the analysis stage for real-time generation and processing of audio and visual content?

- Cross-modal integration
How to combine data from different channels in order to analyze expressive gestures?
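To make the mapping-strategies question concrete, a minimal sketch of one possible mapping is shown below. The feature names (`quantity_of_motion`, `contraction_index`), parameter ranges, and linear mapping are hypothetical illustrations, not the project's actual algorithms:

```python
def map_motion_to_audio(quantity_of_motion: float, contraction_index: float) -> dict:
    """Map two hypothetical movement features (each in [0, 1]) to audio
    synthesis parameters: more energetic, expansive movement yields a
    louder, brighter, faster output."""
    loudness_db = -30.0 + 24.0 * quantity_of_motion          # -30 dB .. -6 dB
    brightness_hz = 500.0 + 4000.0 * (1.0 - contraction_index)  # filter cutoff
    tempo_bpm = 60.0 + 80.0 * quantity_of_motion             # 60 .. 140 BPM
    return {"loudness_db": loudness_db,
            "brightness_hz": brightness_hz,
            "tempo_bpm": tempo_bpm}

# Energetic, expansive gesture -> loud, bright, fast audio parameters
params = map_motion_to_audio(quantity_of_motion=0.9, contraction_index=0.2)
```

In a real-time setting such a function would run once per analysis frame, with the resulting parameters streamed to the audio and visual rendering modules.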

A main output of the project is the MEGA System Environment, an environment for multimedia and performing-arts applications in which different software modules for real-time expressive gesture analysis and synthesis are interconnected.
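The idea of interconnecting analysis and synthesis modules can be sketched as a simple processing chain. The module names and feature computations below are hypothetical placeholders, not the actual MEGA System Environment API:

```python
from typing import Callable, Dict, List

Frame = Dict[str, float]  # one frame of feature or parameter values

def run_pipeline(modules: List[Callable[[Frame], Frame]], frame: Frame) -> Frame:
    """Pass a data frame through a chain of interconnected modules:
    each module reads the values produced so far and adds its own."""
    for module in modules:
        frame = module(frame)
    return frame

def analyze(frame: Frame) -> Frame:
    # Hypothetical analysis module: derive an 'energy' feature from velocity
    frame["energy"] = min(1.0, frame["velocity"] / 2.0)
    return frame

def synthesize(frame: Frame) -> Frame:
    # Hypothetical synthesis module: turn 'energy' into an audio gain
    frame["gain"] = 0.2 + 0.8 * frame["energy"]
    return frame

out = run_pipeline([analyze, synthesize], {"velocity": 1.0})
```

The design point is that modules share only a common data format, so analysis and synthesis components can be recombined freely for different performances.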

The research results have been used in a number of artistic performances and multimedia events.
