Engaging with Visual Art Through a Multisensory Experience

Project developed with Anna-Lena Theus, Rami Ibrahim and Sandra Gabriele, for the course HCIN5300 – Emerging Interaction Techniques, Carleton University


The goal of this study was to provide a fuller, richer experience for people viewing visual art (paintings) by appealing to senses beyond sight. We designed and evaluated a multisensory experience in which a viewer received a translation of a painting through a headset playing music and a belt programmed with vibration patterns and temperature changes. Participants viewed two paintings, one as a visual-only experience and the other as a multisensory experience. We then asked them to select words describing emotional states and adjectives associated with the paintings. Responses indicated that the multisensory experience was not significantly different from the visual-only experience. However, questionnaire data revealed that participants preferred the multisensory approach, suggesting its potential to enrich the overall experience of engaging with visual art.

Physical Prototyping

After I had the idea of using vibration, temperature, and sound to enhance the experience of viewing art pieces such as paintings, my teammates and I developed a prototype called SensArt.
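To illustrate how such a prototype can pair a painting's mood with belt output, the mapping can be sketched as a simple lookup from mood to a vibration pattern and target temperature. This is a minimal sketch only; the `BeltCue` structure, the mood labels, and all cue values below are assumptions for illustration, not the actual SensArt design or firmware.

```python
from dataclasses import dataclass, field

@dataclass
class BeltCue:
    """One multisensory cue: a vibration pattern plus a target temperature."""
    vibration_pattern: list = field(default_factory=list)  # alternating on/off durations in ms
    temperature_c: float = 26.0                            # target temperature for the thermal element

# Hypothetical mapping from a painting's mood to a belt cue.
MOOD_CUES = {
    "calm":   BeltCue(vibration_pattern=[500, 500], temperature_c=30.0),
    "tense":  BeltCue(vibration_pattern=[100, 50, 100, 50], temperature_c=22.0),
    "joyful": BeltCue(vibration_pattern=[150, 150, 300], temperature_c=28.0),
}

def cue_for(mood: str) -> BeltCue:
    """Look up the cue for a mood, falling back to a neutral default cue."""
    return MOOD_CUES.get(mood, BeltCue())
```

In a real device these cues would drive vibration motors and a heating/cooling element; keeping the mapping as data makes it easy to tune cues per painting.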


Usability Tests

We used a mixed design with two conditions (a visual-only experience and a multisensory experience) and two different paintings. Before the experiment, participants filled out a brief pretest questionnaire. After viewing each painting, participants answered questions about the art and their experience. Participants selected from two sets of words: the first contained the target emotions expressed by the paintings, and the second, adjectives describing the paintings. Each set included distractor words. At the end, they completed a post-test questionnaire and had an opportunity to comment verbally and ask questions.
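The pairing of conditions and paintings above can be counterbalanced so that each participant sees each painting once, in a different condition, with viewing order rotated across participants. A minimal sketch of one such scheme (the function name and the specific rotation are assumptions, not the study's actual assignment procedure):

```python
def assignment(participant_id: int):
    """Return this participant's two trials as (condition, painting) pairs.

    Rotates through four counterbalanced orders, varying both which
    condition is paired with which painting and which painting comes first.
    """
    orders = [
        [("visual-only", "painting A"), ("multisensory", "painting B")],
        [("multisensory", "painting A"), ("visual-only", "painting B")],
        [("visual-only", "painting B"), ("multisensory", "painting A")],
        [("multisensory", "painting B"), ("visual-only", "painting A")],
    ]
    return orders[participant_id % 4]
```

With this rotation, every four participants cover all four condition-painting orderings, helping separate condition effects from painting and order effects.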



Full details are available in the paper published in ISS '17, Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces.