MULTIMODAL INTERACTION FOR EDUCATION
ICMI 2017 INTERNATIONAL WORKSHOP - Glasgow, Scotland. November 13-17th, 2017
EXTENDED SUBMISSION DEADLINE: JULY 30, 2017
Technology is increasingly used in the classroom. Although there exists a vast literature on the use of multisensory technology for teaching, in schools the visual channel is the one most frequently exploited, while the other sensory channels play only a marginal role. Moreover, current technologies for education are still not sufficiently grounded in evidence from psychophysics and developmental psychology.
A multimodal pedagogical approach that selects the modality best suited to teaching a specific concept, and that fully exploits the potential of multimodal technologies, can therefore be highly beneficial for education.
The 1st International Workshop on Multimodal Interaction for Education aims at investigating how multimodal interactive systems, robustly grounded on psychophysical, psychological, and pedagogical bases, can be designed, developed, and exploited for enhancing teaching and learning processes in different learning environments, with a special focus on children in the classroom. The workshop will bring together researchers and practitioners from different disciplines, including pedagogy, psychology, psychophysics, and computer science – with a particular focus on human-computer interaction, affective computing, and social signal processing – to discuss such challenges under a multidisciplinary perspective.
We invite contributions in the form of research papers or demos (with accompanying poster).
The workshop is partially supported by the EU-H2020-ICT Project weDRAW.
The weDRAW project aims to introduce a new teaching paradigm, based on multisensory technologies, that exploits the most effective sensory channel in children. The workshop will build on the knowledge acquired in the first year of weDRAW, and will be open to new perspectives and experiences.
© 2017 WeDRAW Project
WeDRAW has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 732391