This project has two main objectives. First, it explores the potential for gesture-based interaction with a dynamic architectural space through the use of a Leap Motion device. Second, and more importantly, it explores the relationship between materials, form, and interactive systems of control in order to generate an empathetic relationship between users and their environment.
Mobile devices already use touch- and gesture-based languages – swiping, tapping, and dragging – as natural, intuitive mechanisms of control.
The installation consists of wood, stretchable fabric, and PVC pipes, actuated by DC motors driven by an Arduino microcontroller connected to a Leap Motion device. The Leap Motion recognizes specific hand gestures, each of which drives the motors to produce a different type of movement in the surface.
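The gesture-to-motor mapping might be sketched as follows. This is a hypothetical illustration, not the installation's actual code: the gesture names and motor commands are assumptions, standing in for whatever the Leap Motion reports and whatever serial protocol the Arduino expects.

```python
# Hypothetical sketch: translating recognized Leap Motion gestures into
# simple DC motor commands. Gesture names, motor IDs, and speeds are
# illustrative placeholders, not the installation's real protocol.

def gesture_to_motor_command(gesture: str) -> dict:
    """Map a recognized gesture to a motor command that could be
    serialized and sent over serial to an Arduino driving the motors."""
    commands = {
        "swipe_left":  {"motor": 1, "direction": -1, "speed": 128},
        "swipe_right": {"motor": 1, "direction": +1, "speed": 128},
        "circle":      {"motor": 2, "direction": +1, "speed": 64},
        "key_tap":     {"motor": 0, "direction": 0,  "speed": 0},  # stop
    }
    # Unrecognized gestures fall back to a safe "all stop" command.
    return commands.get(gesture, {"motor": 0, "direction": 0, "speed": 0})

print(gesture_to_motor_command("swipe_left"))
```

In a dispatch table like this, each gesture selects one motor and a direction/speed pair, so each hand gesture corresponds to one distinct movement of the surface.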
An Asus Xtion PRO depth camera captures topographic data from the surface in real time. This data is then processed to generate a series of topographic contour lines, which are projected back onto the surface.
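Contour-line generation from depth data can be illustrated with a small sketch. This is an assumed, simplified approach (not the installation's actual pipeline, which presumably works on full-resolution camera frames): for a given elevation level, it marks the grid cells where the depth surface crosses that level.

```python
# Illustrative sketch: finding where a depth grid crosses a contour
# level, as a contour-line pipeline might do before rendering lines
# for projection. Simplified stand-in for real depth-camera processing.

def contour_mask(depth, level):
    """Return a boolean grid marking cells where the surface crosses
    `level` between a cell and its right or lower neighbor."""
    rows, cols = len(depth), len(depth[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    lo = min(depth[r][c], depth[nr][nc])
                    hi = max(depth[r][c], depth[nr][nc])
                    if lo <= level < hi:  # the level falls between them
                        mask[r][c] = True
    return mask

# Toy 3x3 "depth frame" sloping from 0.0 to 1.0.
depth = [
    [0.0, 0.2, 0.4],
    [0.2, 0.5, 0.7],
    [0.4, 0.7, 1.0],
]
print(contour_mask(depth, 0.45))
```

Running this for several evenly spaced levels yields the nested bands of a topographic contour map, which can then be rasterized and projected.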
In this way, audience members generate physical movements of the wall surface with their hand gestures, while the resulting surface data is processed and projected back onto the surface. Interestingly, the projection and the physical movement are locked into a feedback loop.
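The feedback loop described above can be sketched in a few lines. This is a minimal, hypothetical simulation (all values and function names are invented for illustration): each iteration applies a gesture-driven actuation to the surface, then derives the next projection parameter from the newly sensed geometry.

```python
# Minimal sketch of the projection/movement feedback loop: a gesture
# deforms the surface, the depth reading follows the deformation, and
# the projected contour level is recomputed from the sensed value.
# All quantities are illustrative placeholders.

def step(surface_height, gesture_delta):
    """One loop iteration: actuate, sense, then update the projection."""
    surface_height += gesture_delta      # motors deform the surface
    sensed = surface_height              # depth camera reads it back
    projection_level = round(sensed, 2)  # projection updates to match
    return surface_height, projection_level

height = 0.0
for delta in (0.1, 0.05, -0.02):         # a short sequence of gestures
    height, level = step(height, delta)
print(height, level)
```

The point of the loop is that the projection is never authored directly: it always re-derives from what the surface physically did in response to the previous gesture.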
In the future it may even be possible to design a direct interface that allows users to interact with their environments without any intermediary mechanism. Such interfaces would allow easier control of our physical environment, making the relationship more intimate.
- Behnaz Farahi
- Ramtin Khah
- Mitch Thomson, Sam Adelan, Chen Chen, Sky Roim