Elowan is an attempt to demonstrate what the augmentation of nature could mean. Elowan’s robotic base forms a new symbiotic association with a plant: the agency of movement rests with the plant, based on its own bio-electrochemical signals, the language interfaced here with the artificial world.
These signals in turn trigger physiological variations such as elongation growth, respiration, and moisture absorption. In this experimental setup, electrodes are inserted into the regions of interest (stem and ground, leaf and ground). The weak signals are then amplified and sent to the robot, triggering movement in the corresponding direction.
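The electrode-to-movement pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual implementation: the gain, threshold, electrode pairing, and function names are all hypothetical, and real readings would come from an analog-to-digital converter rather than hard-coded values.

```python
# Hypothetical sketch of the signal-to-motion loop: amplify weak
# electrode readings and map the stronger one to a movement command.
# GAIN and THRESHOLD_V are illustrative values, not from the source.

GAIN = 1000.0        # weak bio-electrochemical signals need amplification
THRESHOLD_V = 0.5    # amplified-signal level required to trigger motion

def amplify(raw_signal_v: float, gain: float = GAIN) -> float:
    """Scale a weak electrode reading (in volts) by a fixed gain."""
    return raw_signal_v * gain

def direction_from_signals(left_raw_v: float, right_raw_v: float) -> str:
    """Map a pair of electrode readings (e.g. leaf-ground vs. stem-ground)
    to a movement command for the robotic base."""
    left, right = amplify(left_raw_v), amplify(right_raw_v)
    if max(left, right) < THRESHOLD_V:
        return "hold"                      # no clear signal: stay put
    return "left" if left > right else "right"

# Example: a stronger signal on the right electrode pair
print(direction_from_signals(0.0002, 0.0011))  # -> right
```

In a real setup the raw signals would also need filtering to reject noise before any thresholding, but the basic loop — measure, amplify, compare, move — is the same.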
Such symbiotic interplay with the artificial could be extended further with exogenous extensions that provide nutrition, growth frameworks, and new defense mechanisms.
The difference between this plant-robot hybrid and others that we’ve seen in the past is that the plant is actually in control: The robotic base moves where the plant wants it to, to the extent that (a) plants want things, (b) the plant is able to communicate that, and (c) we’re able to correctly interpret it. So it’s not just that the robot part is like, “Oh, there’s some light over there, plants like light, let’s go over to the light,” because that’d be completely independent of the plant itself. Instead, the system measures signals from the plant itself and takes direction from those. Whether it’s the right direction or not isn’t necessarily clear, but at least the plant is in the loop somewhere, rather than being just a passenger.
While the intent here is to give the plant some agency of its own, the practical result is still a robot with a plant on it that chases light. That’s a pretty safe thing for the robot to do, I suppose, but are plants more nuanced than that, and if so, is it something that robots could eventually detect and respond to? My dying houseplants really, really hope so.
[ MIT Media Lab ]