What if our clothing could sense the movement and emotions of those around us? How might technology expand our sensory experience and influence our social interactions? And in what ways could our clothing become a form of non-verbal communication, expressed through changes in color and texture?
The hummingbird is a remarkable creature. The male Anna's hummingbird, for example, has feathers around his throat that at one moment appear completely green. With a twist of his head, however, he can turn them an iridescent pink. He does this by exploiting the capacity of the feather's microscopic structure to refract light like a prism, so that the feathers take on different shimmering hues when viewed from different angles. This is how the Anna's hummingbird attracts mates during his spectacular displays of aerial courtship.
Iridescence is an interactive collar inspired by the gorget of the Anna's hummingbird. It is equipped with a facial-tracking camera and an array of 200 rotating quills. The custom-made quills flip their colors and form patterns in response to the movement and facial expressions of onlookers.
This project addresses a number of challenging technical issues. First, the design and fabrication of the color-changing materials was informed by the logic of lenticular behavior. Not dissimilar to how light is refracted by the feathers of a hummingbird, Iridescence uses lenticular lenses laminated onto an array of flat colored surfaces to produce color-changing effects. Second, the rotation of the quills is controlled by a series of custom-made electromagnetic actuators, carefully designed to withstand the natural wear and tear of a 15-month exhibition. If any actuator fails, it can easily be removed and replaced with a substitute. Another major engineering task has been to design the PCB driver boards that control and orchestrate the actuators' behavior.
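The driver-board logic described above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual firmware: it assumes each quill is a binary element (green side vs. pink side) flipped by pulsing its electromagnetic coil, and that the driver only pulses quills whose state actually changes, which limits mechanical wear. The `QuillBank` class and its method names are invented for this sketch.

```python
# Hypothetical sketch of a quill driver bank (not the project's real firmware).
# Each quill is binary: False = green side showing, True = pink side showing.
# Flipping a quill means pulsing its electromagnetic coil toward the new state.

from dataclasses import dataclass, field

@dataclass
class QuillBank:
    """Tracks desired vs. current state for one bank of binary quills."""
    size: int
    state: list = field(default_factory=list)

    def __post_init__(self):
        # All quills start on their green side.
        self.state = [False] * self.size

    def commands_for(self, target: list) -> list:
        """Return (index, new_state) pulses needed to reach `target`.
        Only quills whose state changes are pulsed, reducing coil wear."""
        pulses = [(i, t) for i, (s, t) in enumerate(zip(self.state, target))
                  if s != t]
        self.state = list(target)
        return pulses

bank = QuillBank(size=8)
print(bank.commands_for([True, False, True, False, True, False, False, False]))
# first call flips quills 0, 2, and 4 to their pink side
```

Issuing the same target twice produces no pulses the second time, which is the property that lets the actuators survive a long-running exhibition.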
The goal is to explore how wearables can become not only a vehicle for self-expression, but also an extension of our sensory experience of the world. Such a device could gather visual information, such as people's facial expressions, on behalf of those who have difficulty receiving or decoding it, for example people with visual impairment or autism. Iridescence can also express and mimic this information non-verbally through its dynamic behavior. To make this work, we borrow from the latest advances in AI facial-expression tracking technology and embed them in bio-inspired material systems.
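To make the tracking-to-pattern idea concrete, here is a minimal sketch of how a tracker's output might drive the quill array. Everything here is an assumption for illustration: the tracker itself (an off-the-shelf AI model) is treated as a black box that reports a horizontal face position and a smile intensity, both in [0, 1], and the `ripple_pattern` mapping is invented, not the project's actual behavior.

```python
# Hypothetical sketch: map tracked-face output to a flip pattern across the
# quill array, so the pink region "grows" toward the onlooker. The AI face
# tracker is assumed; only its output (face_x, smile in [0, 1]) is used here.

NUM_QUILLS = 200  # matches the collar's quill count

def ripple_pattern(face_x: float, smile: float) -> list:
    """Flip quills within a window centered on the onlooker's position;
    a stronger smile widens the window."""
    center = int(face_x * (NUM_QUILLS - 1))
    radius = int(5 + smile * 45)  # window half-width: 5..50 quills
    return [abs(i - center) <= radius for i in range(NUM_QUILLS)]

pattern = ripple_pattern(face_x=0.5, smile=0.2)
print(sum(pattern))  # number of quills flipped to pink
```

A pattern like this would then be handed to the actuator drivers each frame, with the window tracking the onlooker as they move.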
Overall, this project is an attempt to explore the possibilities afforded by AI facial tracking technology and the dynamic behavior of a smart fashion item. The intention is to address psychosocial issues involving emotions and sensations, and to see how these technologies might inform social interaction.
The making of Iridescence was highly iterative and experimental. It took many months to go from the initial inspiration to the final product. Here is the Making of Iridescence.
- Behnaz Farahi
- Paolo Salvagione
- Behnaz Farahi, Paolo Salvagione
- Julian Ceipek
- Hao Wen
- Merced Jackson
- Hao Wen, Valeri Neri, Kayla Paredes, Ryan Chavez
- Kristina Varaksina
- Behnaz Farahi
- Tobias Winde Harbo & Sophia Harbo
- Julia Covert
- Takashi Uchida
- Sara Tagaloa, William Soria
- Leon Gerardo, Fio Karpenko, Ryan Chavez
- Brian Cantrell
- Evelyn Glennie
- USC School of Cinematic Arts
- Will Rollins