Beesley uses tech to stretch architectural boundaries

2017-01-19

To look at a Philip Beesley installation is to ask yourself some questions.

Is the installation art? Is it architecture? Is it a pseudo-living entity?

The answer, oddly, is: All of the above.

Beesley, a professor in the School of Architecture at the University of Waterloo and director of the Living Architecture Systems group, presented a talk at the School of Informatics and Computing in December on the concept that buildings can move from “classical ideas of a static world of closed boundaries and instead use more organic architectural structures that mimic the natural world but also include elements of computing, artificial intelligence, and artificial-life chemistry.” He used some of his artwork to showcase the possibilities that might be available to architects of the future.

Using lightweight, mostly translucent materials coupled with sensors, lighting, and actuators, Beesley’s installations are both ethereal and seemingly sentient, reacting to input from observers by moving or vibrating while also collecting data. Frond clusters, which resemble the leaves of a fern, are fitted with shape-memory alloy mechanisms that respond to input, triggering responses that ripple throughout the piece. Chemical reactions that rely on outside stimuli, such as light and temperature changes, take place within the installation and change the look of the pieces over time.

The result is a unique piece of art/architecture that is both pleasing to the eye and illustrative of aspects of data science, machine learning, and artificial intelligence—areas at the heart of SoIC research.

“There is a surging movement in the conception of interconnected material components where intelligence is increasingly embedded within physical artifacts,” says Beesley, who was visiting as a guest of Distinguished Professor of Information Science and Intelligent Systems Engineering Katy Börner. “That can work in passive ways. There is an expectation that a door in a shopping center will open before us. Lights will turn on and off efficiently. That’s increasingly ordinary. However, very interesting qualities come about when we start to visualize multiple scales. When those kinds of visualizations of the responsive capabilities of dynamically coupled components emerge, then we get very interesting possibilities.”

Börner says Beesley’s work also helps advance curiosity-driven machine learning.

“Philip’s sentient architectures foreshadow future intelligent environments and the emerging Internet of Things,” Börner says. “There is an urgent need to understand how embedded technologies affect the experience of individuals who inhabit these spaces and how these technologies can be most appropriately used to improve occupant experience, comfort, and well-being. Faculty and students at SoIC are well-positioned to design desirable futures that further increase our collective human-machine intelligence.”

A collaboration between Beesley and Börner is bringing Beesley’s unique structures to Bloomington. Andreas Bueckle, a Ph.D. student in information science, has been working with a testbed of structures, sensors, and actuators from Beesley’s team, and he is excited about the possibilities of Beesley’s sculptures.

“It’s a sculpture, but at the same time, it’s kind of an organism,” Bueckle says. “It has a metabolism, and there is an internal digestion of data and information. It has eyes and ears. The actual sculptures have microphones attached to them, and they have motion sensors. Over time, you can develop the sculptures into more sentient machines. They’re still machines, but they approach something from a sentient viewpoint because they are open to something from the environment. They are able to react to it.”

The testbed is known as “Cyclops” because the initial iteration featured just one motion sensor. It has since expanded to three sensors, and Cyclops could be key in the development of an app to help people interact with Beesley’s sculptures.

“We are trying to develop an augmented reality app to allow visitors to understand the signal flows within the sculpture and give them insight into how the sculpture works,” Bueckle says.

The app will also help build a better understanding of sensor-actuator pairs and how they interact, while allowing researchers to devise better scaffolding techniques. Most importantly, both Beesley’s work and the Cyclops testbed could better illustrate and teach data visualization.

“We always talk about Big Data, but most data is stored in tables or some kind of simple text document,” Bueckle says. “Through data visualization, you allow people to gain insight into that data and see how different inputs interact with one another.”

Visit Beesley’s website for more information about his sculptures.

Media Contact

Ken Bikoff
Communications Specialist
Phone: (812) 856-6908
kbikoff@indiana.edu