From October 2018 to May 2019, INT was in residence at the Musée de l'Elysée to develop a sound narration device driven by eye movement. Unlike a traditional audioguide, the system aims to provide information about the work and its context based on what the visitor is looking at.
This residency is part of a research project conducted by the three museums of PLATEFORME 10 under the initiative of Engagement Migros. The project seeks to create tangible links between the visitor and digitised content. Controllers such as eye trackers offer new possibilities for interaction and open the way to new forms of interactive content.
Beyond the purely technical aspect, the residency aims to explore a form of non-linear narrative. The gaze influences the information, and the information influences the gaze. Who controls whom? Can this principle be applied to present an artist's work? How can a coherent scenario be created? How do visitors react to the device?
The first prototype presents REEF IDLIB, May 3, 2014, a work by Matthias Bruggmann. This image shows antique dealers cleaning antique pieces. From February to May 2019, the experience was renewed with a photograph by Paolo Woods.
From 16 October to 4 December, we anonymously recorded visitors' gaze paths across the photograph. The results of our experiment can be seen as both printed and interactive data visualisations (see images below).
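As a rough illustration of how recorded gaze paths can become a data visualisation, the sketch below aggregates anonymised gaze samples into a coarse heatmap grid over the image. This is a minimal sketch, not the project's actual pipeline; the sample format `(x, y, t)` and the grid resolution are assumptions.

```python
# Minimal sketch (assumed, not the project's actual pipeline): count
# anonymised gaze samples per grid cell to build a coarse attention heatmap.
from collections import Counter

def gaze_heatmap(samples, width, height, cols=8, rows=6):
    """Bucket gaze samples (x_px, y_px, t_s) into a cols x rows grid
    over an image of width x height pixels; returns cell hit counts."""
    counts = Counter()
    for x, y, _t in samples:
        col = min(int(x / width * cols), cols - 1)
        row = min(int(y / height * rows), rows - 1)
        counts[(row, col)] += 1
    return counts

# Hypothetical gaze samples: (x in px, y in px, timestamp in seconds)
samples = [(100, 80, 0.0), (110, 85, 0.1), (620, 400, 0.2)]
heat = gaze_heatmap(samples, width=800, height=600)
```

Cells with high counts mark the regions of the photograph that drew the most attention, which can then be rendered as a printed or interactive heatmap.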
How long do people stay to observe the artwork and listen to its hidden narrative? What key elements in the image intrigued and attracted visitors? How could we create interesting and meaningful data visualisations from the recorded data?
Thanks to Brian Bendahan for the sound recording and editing, Claire Nicolas for her voice, and Matthias Bruggmann & Paolo Woods for their involvement in the project.