Phytomechatronics lays the groundwork for:
Plant-based technologies: Technologies made from renewable resources and non-toxic materials that support biological life from cradle to grave.
Deep biomimicry: Technologies that learn from and/or deeply mimic plant systems; do not create unusable waste; respond to life cycles; and are no more predator than prey.
Rather than relying on movement and energy propelled by motors, Phytomechatronic technologies actuate in response to their environments. The study is based on the following beliefs:
- In this time of climate crisis, we need machines to care for and consider the soft plant, animal, and human bodies of the earth.
- If an intelligent machine, an AI, realized that its very life relied on fossil fuel, and that fossil fuel was the stuff of ancient trees, plants, plankton, and algae, it is logical to conclude that it would be concerned with the sustainability of the Earth’s flora and oceans.
- The concept of plant blindness holds that people in Western cultures overlook plants as living organisms. Some philosophers contest the zoocentrism of the botanical tradition. Phytomechatronics engages these lines of questioning.
What follows are three Phytomechatronics experiments.
Experiment #1: Healing Leaves Project, 2023 – present
The Healing Leaves project began as a family day activity at Burlington City Arts in Burlington, Vermont. While a selection of leaves from the Damaged Leaf Dataset was on view in one of its galleries, a 2023 Earth Day event invited families to heal the damaged leaves. We provided participants with printed images of damaged leaves and colored pencils; they could “heal” or “repair” the leaves by filling in the sections that caterpillars had eaten.
My research team picked up the project as part of our AI research at the University of Vermont, where we have access to high-performance computing resources. We customized a generative adversarial network (GAN) and trained it on two datasets simultaneously. One dataset is of Gen I leaves, specifically red oak leaves eaten by spongy moth caterpillars; the second, the Whole Leaf Dataset (WLD), is a collection of whole red oak leaves. The model is trained to speculate on what a damaged leaf looked like before it was eaten; it can also speculate on what a whole leaf would look like if caterpillars ate it. The model and the datasets continue to be adjusted to improve the output.
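The description above does not specify the architecture, so the sketch below is only one plausible reading: an unpaired image-to-image GAN in the spirit of CycleGAN, in which one generator “heals” damaged leaves, a second “eats” whole ones, and a discriminator trained on each dataset judges the results. The layer sizes, losses, and names are illustrative assumptions, not the project’s actual code.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, down=True):
    """Stride-2 (de)convolution block shared by both networks."""
    Conv = nn.Conv2d if down else nn.ConvTranspose2d
    return nn.Sequential(Conv(in_ch, out_ch, 4, 2, 1),
                         nn.InstanceNorm2d(out_ch), nn.ReLU(True))

class Generator(nn.Module):
    """Maps a 3x128x128 leaf image to a leaf image in the other domain."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(3, 64), conv_block(64, 128), conv_block(128, 256),
            conv_block(256, 128, down=False), conv_block(128, 64, down=False),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh())

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN-style critic: one real/fake score per image patch."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, 64), conv_block(64, 128),
                                 nn.Conv2d(128, 1, 4, 1, 1))

    def forward(self, x):
        return self.net(x)

# G_heal: damaged -> whole ("what did this leaf look like before it was eaten?")
# G_eat:  whole -> damaged ("what would this leaf look like if caterpillars ate it?")
G_heal, G_eat = Generator(), Generator()
D_whole, D_damaged = Discriminator(), Discriminator()

mse, l1 = nn.MSELoss(), nn.L1Loss()
opt_G = torch.optim.Adam(list(G_heal.parameters()) + list(G_eat.parameters()), lr=2e-4)
opt_D = torch.optim.Adam(list(D_whole.parameters()) + list(D_damaged.parameters()), lr=2e-4)

def adv_loss(pred, is_real):
    """Least-squares adversarial loss against an all-real or all-fake target."""
    target = torch.ones_like(pred) if is_real else torch.zeros_like(pred)
    return mse(pred, target)

def training_step(damaged, whole):
    """One unpaired update: adversarial losses plus cycle consistency."""
    fake_whole, fake_damaged = G_heal(damaged), G_eat(whole)
    # Generators: fool both discriminators and reconstruct the original leaf.
    loss_G = (adv_loss(D_whole(fake_whole), True)
              + adv_loss(D_damaged(fake_damaged), True)
              + 10 * l1(G_eat(fake_whole), damaged)    # damaged -> whole -> damaged
              + 10 * l1(G_heal(fake_damaged), whole))  # whole -> damaged -> whole
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    # Discriminators: tell real leaves apart from generated ones.
    loss_D = (adv_loss(D_whole(whole), True)
              + adv_loss(D_whole(fake_whole.detach()), False)
              + adv_loss(D_damaged(damaged), True)
              + adv_loss(D_damaged(fake_damaged.detach()), False))
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()
    return loss_G.item(), loss_D.item()
```

In a setup like this, the cycle-consistency terms are what let the two collections remain unpaired: a healed leaf must still be recognizable as the specific damaged leaf it came from once it is “eaten” again.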
In addition to the model’s speculative images, we have customized it to reveal its inner workings: how it analyzes and “thinks” about the leaves and draws its conclusions. At each layer of its neural network, it outputs an image that reveals its process. What is revealed is machine thinking, caring, learning, and growing, and the project becomes an opportunity to contrast these with the thinking, caring, learning, and growing of both humans and botanicals.
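One common way to surface this layer-by-layer process is with forward hooks, which capture the feature maps each layer produces as a leaf passes through the network and save them as images. The sketch below builds on the hypothetical generator above; the function and file names are assumptions, not the team’s visualization code.

```python
from pathlib import Path

import torch
from torchvision.utils import save_image

def visualize_layers(generator, leaf_image, out_dir="layer_views"):
    """Pass one leaf through the generator and save each layer's feature maps as a grid."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    captured = {}

    def make_hook(name):
        def hook(module, inputs, output):
            captured[name] = output.detach()
        return hook

    # Attach a forward hook to every block of the (hypothetical) generator above.
    handles = [layer.register_forward_hook(make_hook(f"layer_{i:02d}"))
               for i, layer in enumerate(generator.net)]
    with torch.no_grad():
        generator(leaf_image.unsqueeze(0))   # add a batch dimension
    for handle in handles:
        handle.remove()

    for name, feats in captured.items():
        # Treat each channel as a grayscale image, normalized for viewing.
        channels = feats[0].unsqueeze(1)     # (channels, 1, H, W)
        save_image(channels, f"{out_dir}/{name}.png", normalize=True, nrow=8)
```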
Experiment #2: Life Lines, 2023 – present
A confluence of organic leaf patterning and machine tooling marks, these works explore the formal qualities of the Damaged Leaf Dataset (DLD) alongside those of marks made by machines. The margins of each leaf were designed with assistance from generative AI trained on Gen I and Gen II of the DLD. The lines of each work are determined by both plant and machine efficiencies: the vascular structure of the leaf and the grid efficiencies of machine navigation across x and y coordinates.
The resulting works express a palpable tension between machine and plant: the interdependent life lines of the natural and machine worlds.
Experiment #3: RoboLeaf, 2022
RoboLeaf was developed at Haystack Labs, a research program and residency founded by Haystack Mountain School of Crafts and the MIT Center for Bits and Atoms. The initiative was designed to foster synergy among artists, designers, material scientists, machine builders, and coders. Together, we converged at Haystack’s Atlantic Coast campus to explore the intersection of emergent digital fabrication technologies and conventional studio craft.
Neil Gershenfeld, founder of the Fab Lab network, facilitated the week-long program. In meetings that followed, he expressed an aspiration for technology to serve as “locally made meaningful machines.” This intention values machines and mechanisms that are meaningful to and in service of local communities.
RoboLeaf envisions a tree that can protect itself from invasive insects like the Lymantria dispar caterpillar. It imagines an artificial intelligence that allows humans and trees to communicate and share intelligence, and machine prosthetics that could enhance plant life. This prototype is a leaf mechanized with a unique folding pattern: when sensors on its robotic plant flesh detect predatory insects, the folds contract and the leaf protects itself from being eaten.
While I was working on RoboLeaf at Haystack Labs, Jack Forman, an MIT PhD student in material science, introduced me to his work with FibeRobo, a thermally actuated liquid crystal elastomer (LCE) fiber that can be embedded or structured into textiles, enabling silent, responsive interactions with shape-changing, fiber-based interfaces. We used FibeRobo to actuate the folds of RoboLeaf. The technologist Alan Grover and fiber artist Annet Couwenberg, both from the Maryland Institute College of Art (MICA), were also influential to the project: Grover designed the circuitry that controlled the contractions and expansions of FibeRobo, and Couwenberg’s presence in the weaving studio where we set up shop provided essential conceptual connections between the mechanisms of RoboLeaf and the history and context of fiber arts.
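As a rough illustration of the sense-and-fold behavior, the loop below is written in CircuitPython against a hypothetical sensor and heater circuit. The actual electronics were Grover’s design; the sensor type, pin names, threshold, and timings here are placeholders, not the built prototype.

```python
import time

import analogio
import board
import digitalio

insect_sensor = analogio.AnalogIn(board.A0)       # stand-in for whatever sensor detects insects
fiber_driver = digitalio.DigitalInOut(board.D5)   # gate of a transistor heating the FibeRobo fiber
fiber_driver.direction = digitalio.Direction.OUTPUT

DETECTION_THRESHOLD = 30000   # raw 0-65535 reading treated as "insect present"
CONTRACT_SECONDS = 8          # how long to heat the LCE fiber so the folds close
RELAX_SECONDS = 20            # cool-down time before the leaf can reopen

while True:
    if insect_sensor.value > DETECTION_THRESHOLD:
        fiber_driver.value = True     # heat the fiber: folds contract, leaf closes
        time.sleep(CONTRACT_SECONDS)
        fiber_driver.value = False    # stop heating: fiber relaxes, leaf reopens
        time.sleep(RELAX_SECONDS)
    time.sleep(0.1)                   # poll the sensor ten times a second
```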
https://www.haystack-mtn.org/haystack-labs-2022
Introduction to Training Datasets and Core Material
The Damaged Leaf Dataset
The Damaged Leaf Dataset (DLD) is a collection of over 15,000 leaves gathered in Colchester, Vermont, during the Lymantria dispar (spongy moth) outbreaks of 2021 and 2022. The dataset is the foundational material of Phytomechatronics. Each collected leaf of the DLD was cleaned, pressed, and photographed.
DLD Gen I:
The first generation includes the leaves damaged by the Lymantria dispar caterpillar in the 2021 and 2022 outbreaks.
DLD Gen II:
Incredibly, a healthy tree that the caterpillar has defoliated in the spring will produce a second set of leaves in the same season. A collection of this second flush of leaves makes up the second generation of the DLD.
DLD Gen III:
Artworks that transform and heal the Gen I leaves make up the third generation of the DLD.
The Whole Leaf Dataset
The Whole Leaf Dataset (WLD) is a collection of whole red oak leaves collected in 2023 and 2024 after the Lymantria dispar outbreaks of 2021 and 2022 subsided. This dataset serves as a counterpoint to DLD Gen I; its leaves grew from trees that survived the outbreaks.
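For readers wondering how these photographed leaves could feed the model in Experiment #1, the sketch below shows one conventional way to organize and load them with torchvision. The folder layout, image size, and batch size are assumptions, not the project’s actual pipeline.

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

to_tensor = transforms.Compose([
    transforms.Resize((128, 128)),                # match the sketch generator's input size
    transforms.ToTensor(),
    transforms.Normalize([0.5] * 3, [0.5] * 3),   # scale pixels to [-1, 1] for the Tanh output
])

# ImageFolder expects one subfolder per class, e.g. leaves/dld_gen1/red_oak/*.jpg
damaged_leaves = datasets.ImageFolder("leaves/dld_gen1", transform=to_tensor)
whole_leaves = datasets.ImageFolder("leaves/wld", transform=to_tensor)

damaged_loader = DataLoader(damaged_leaves, batch_size=16, shuffle=True)
whole_loader = DataLoader(whole_leaves, batch_size=16, shuffle=True)

# One unpaired batch is drawn from each domain per step, e.g.:
# for (damaged, _), (whole, _) in zip(damaged_loader, whole_loader):
#     training_step(damaged, whole)   # from the Experiment #1 sketch
```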