Project Catalog
Relations and Trajectories
This appendix expands and deepens the project documentation that Chapter 05 (Tools) draws on to develop its argument. Where the chapter discusses selected projects as specific instances in a larger argument about the tools of adaptive practice, the appendix provides the extended catalog, the comprehensive documentation, and the archival detail those projects require. A reader who wants to understand how a particular installation, prototype, or instrument was built, what it tested, who collaborated on it, and how it connects to the larger trajectory of the work will find that material here.
Practice-based research in design disciplines positions creative work not as illustration of theory but as a primary mode of inquiry: a method through which knowledge is produced, tested, and refined. The accumulated body of a practitioner’s projects constitutes an evolving research program, each work building upon, challenging, or extending insights generated by its predecessors. The career trajectory becomes legible as an epistemological instrument, a sustained investigation conducted through the medium of design itself.
The appendix presents twenty years of projects organized chronologically across four phases of development. The phased structure is not biographical convenience but reflects genuine shifts in the questions being asked, the technologies being deployed, and the scales of engagement being pursued. Each phase represents a distinct mode of inquiry into the relationship between landscape, computation, and ecological process while maintaining continuity with the larger project of understanding how designed systems might learn from and adapt to environmental dynamics.
The projects collected here range across registers of realization.
Practices that engaged directly with institutions, sites, materials, and implementations.
Manifests that rendered theoretical propositions operational through scholarship, prototypes, and working demonstrations.
Speculations that tested ideas through projective scenarios and disciplined imagination.
Across these registers, the work has consistently interrogated a central question: how might landscape architecture move from designing static forms to choreographing responsive processes?
The adaptive epistemology proposed in this dissertation did not emerge from abstract reasoning but from the friction of making. Theory, in this account, is distilled from practice rather than applied to it, and this appendix documents the practice from which Chapter 05 and the dissertation’s larger argument were distilled.
Phase I: The Digital Threshold (2005–2012)
The mid-2000s presented landscape architecture with a representational crisis. Digital tools imported from architecture, CAD systems optimized for fixed geometries, struggled to capture the indeterminacy fundamental to landscape materials. The vector line implied precision where the landscape demanded flux.
The projects of this phase interrogated the mechanics of representation itself, asking how digital media might be reconfigured to engage landscape’s atmospheric and temporal qualities. Conducted at Louisiana State University, these investigations developed along two parallel tracks. The first concerned the hybrid image: compositing techniques that layered raster and vector, photograph and drawing, to produce representations capable of expressing ambiguity. The second concerned the live image: motion graphics and film, as well as installations and prototypes that represented moments in time and coupled sensing to visualization, producing representations that responded to present conditions rather than recording past states.
This phase established foundational principles that would carry through subsequent work: that representation is never neutral but actively shapes how designers understand and engage sites; that digital tools must be adapted to landscape’s specific material logics rather than adopted wholesale from adjacent disciplines; and that the gap between data collection and communication, between sensing and showing, might be collapsed through computational mediation.
Thresholds Installation (2006)
Project Type: Speculative Installation
Location: Louisiana State University, College of Art and Design Atrium
Collaborators: Wes Michaels
Medium: High-contrast mural, camera with blob detection software (Processing), real-time display
Description
Thresholds was an interactive installation that translated spatial occupation into topographic representation. A high-contrast mural painted on the atrium wall served as a visual datum, a stable ground condition against which change could be measured. A camera trained on the mural performed continuous blob detection, an image analysis technique that identifies regions of contrast difference. Custom software written in Processing converted these contrast gradients into isolines (contour lines), displayed in real-time on an adjacent monitor. As lighting conditions shifted throughout the day and as pedestrians moved through the space, they disrupted the contrast field, generating new isoline configurations—effectively producing “automated landscapes” in continuous flux.
The project established a feedback loop between environmental condition, sensing apparatus, and representational output. The mural functioned as a calibrated field; the camera as a sensing instrument; the algorithm as an analytical engine; and the monitor as an interface rendering analysis legible. Occupants of the atrium became unwitting participants in landscape generation—their bodies registering as topographic events, their movement producing ephemeral landforms.
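The sensing loop described above can be sketched in miniature. The following Python fragment is an illustrative reconstruction, not the original Processing source: it takes a 2D luminance field (standing in for the camera’s view of the mural), detects where a given iso-level is crossed between neighbouring cells, and returns those cells as a crude isoline. Function names and the cell-marking approach are assumptions for illustration.

```python
# Minimal sketch of the Thresholds pipeline: a luminance field is
# scanned for places where an iso-level is crossed between
# horizontal or vertical neighbours; those cells form the contour.

def isoline_cells(field, level):
    """Return the set of (row, col) cells where `level` is crossed."""
    rows, cols = len(field), len(field[0])
    cells = set()
    for r in range(rows):
        for c in range(cols):
            here = field[r][c] >= level
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols and (field[rr][cc] >= level) != here:
                    # The level is crossed between these two cells.
                    cells.add((r, c))
                    cells.add((rr, cc))
    return cells

def contour_map(field, levels):
    """Map each iso-level to its contour cells, like stacked contours."""
    return {level: isoline_cells(field, level) for level in levels}
```

A bright region in a dark field (a body occluding the mural) yields a ring of contour cells around it; as the region moves frame to frame, the isolines are continuously redrawn, which is the behavior the installation exploited.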
Epistemological Contribution
Thresholds operated as an early prototype for concepts that would mature across subsequent projects: the notion that representation can be dynamic rather than static, responding to conditions rather than fixing them; that sensing and visualization can form a continuous circuit, collapsing the temporal gap between data collection and communication; and that the landscape medium extends to phenomena beyond terrain, so that light, contrast, and human occupation become the “materials” from which topography is generated.
The choice of isolines as the representational mode was deliberate. Contour lines are perhaps the most fundamental notation in landscape architecture, the convention through which three-dimensional terrain is translated into two-dimensional plan. By generating isolines algorithmically and in real-time, Thresholds defamiliarized this taken-for-granted convention, revealing the contour not as a neutral transcription of existing landform but as an active interpretation of gradient and threshold. The installation asked: What if contours responded to present conditions rather than recording past surveys? What if the landscape drawing was always being redrawn?
This project also introduced the concept of contextual awareness, the mural serving as a datum that gave meaning to deviations captured by the sensing system. Without this stable reference, the isolines would have been noise; with it, they became information. This principle—that sensing requires calibrated context—would prove essential in later projects dealing with sediment dynamics, ecological monitoring, and adaptive management systems.
Surface Tension (2006–2007)
Project Type: Speculative / Hybrid Model
Location: Louisiana State University
Medium: Cast landscape surfaces with embedded electronics, LEDs, sensors — hybrid analog/digital model
Description
Surface Tension explored the boundary between physical landscape models and digital interactivity. Working at the intersection of material craft and embedded computation, the project produced cast landscape surfaces embedded with electronics, LEDs, and sensors that responded to physical interaction. The model was simultaneously a representation of terrain and an interactive instrument: touching the surface triggered responses that made the model behave as a primitive responsive landscape at miniature scale. This early experiment in embedding computation within physical landscape models directly prefigures the geomorphology tables at REAL and UVA, establishing the principle that physical models could be not merely representational but operational: instruments that produce knowledge through interaction rather than observation alone.
Cross-Chapter Resonances: Ch 05 (tools, the model as instrument, not representation), Ch 06 (models, physical computing embedded in landscape models), Ch 09 (interactions, the model as responsive interface)
Ambient Space (ACADIA 2007)
Project Type: Manifest / Peer-Reviewed Paper + Installation
Location: ACADIA 2007 Conference, Halifax, Nova Scotia
Collaborators: Wes Michaels
Publication: Cantrell, B. (2007). “Ambient Space.” In Expanding Bodies: Art, Cities, Environment, Proceedings of the 27th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA). Halifax, NS.
Medium: Paper, projection installation, Processing sketches, sensor input
Description
“Ambient Space” proposed that landscape architecture’s engagement with digital media should move beyond representation toward the construction of ambient computational environments, spaces in which digital processes are not overlaid on physical conditions but integrated with them at the level of atmosphere and perception. The project developed Processing-based software sketches that translated real-time sensor input into spatial projections, creating environments that responded to occupants’ presence and movement. The ACADIA paper theorized this as a shift from “digital drawing” (using computers to represent landscapes) to “digital landscape” (using computation as a medium of spatial experience itself). This distinction between computation-as-representation and computation-as-medium would become foundational: every subsequent project in this practice operationalizes some version of the claim that computation is not a tool for depicting landscapes but a material condition of the landscapes themselves.
Cross-Chapter Resonances: Ch 05 (tools, computation as medium, not representation), Ch 07 (technogeographies, sensing as constitutive of spatial experience)
Landscapes of Fantasy: Representational Pedagogy (2008)
Project Type: Pedagogical Research / Curriculum Development
Location: Louisiana State University, Robert Reich School of Landscape Architecture
Presentation: Council of Educators in Landscape Architecture (CELA) Conference, 2008
Medium: Seminar curriculum (LA 2101 Representation Seminar)
Description
Landscapes of Fantasy was a pedagogical framework that introduced literary narrative as the structuring device for advanced digital representation instruction. Rather than teaching software skills through the replication of known visual forms (existing landscapes, precedent images, conventional renderings), the seminar asked students to represent environments that existed only in text. Science fiction novels, particularly Kim Stanley Robinson’s Red Mars, provided descriptive passages that students translated into visual media through a sequence of exercises addressing device, surface, environment, motion, and process.
The methodology recognized that landscape representation differs fundamentally from architectural representation in its concern with field rather than object. Commercial modeling and rendering software, developed primarily for architecture and industrial design, privileges the construction of discrete objects with hyper-real material surfaces. Landscape, by contrast, requires the representation of atmospheric conditions, temporal sequences, and experiential immersion—qualities more readily found in literary description than in the conventions of technical drawing.
The seminar structured learning through four project typologies:
Diorama/Scene: Students composed virtual environments from image-mapped planes, referencing Daguerre’s theatrical dioramas where landscape paintings transformed through shifting light and layered screens. The single-camera, fifteen-second animation emphasized composition from a defined vantage point.
Object/Device: Exploration of surface generation methods (polygon modeling, NURBS) through the design of devices that interface with landscape systems. The focus on “functioning, dynamic systems” anticipated later work on responsive infrastructure.
Environment/Context/Surface: Terrain modeling as both surface and volume, with iterative modifications driven by literary description. Context was constructed through billboard imagery updated according to fictional landscape changes, forcing students to reconcile local surface conditions with external constraints.
Dynamics/Interaction: Thirty-second animations of environmental processes—water moving down a stem, dust storms forming, morning dew evaporating—requiring critical examination of how time is framed through compression or expansion.
Epistemological Contribution
The framework’s central insight was that imagination precedes representation. By displacing students from known landscapes into fictional territories, the pedagogy broke habitual reliance on photographic reference and conventional visual tropes. Students could not Google an image of a Martian canyon; they had to construct it from descriptive language, forcing engagement with the compositional logic of landscape experience rather than the replication of existing images.
The use of narrative as structuring device also introduced temporal coherence to representation pedagogy. Literary narrative unfolds through time; its landscape descriptions are embedded within sequences of events, character movements, and atmospheric shifts. By grounding representation exercises in narrative passages, students learned to think of images not as isolated frames but as moments extracted from continuous environmental processes.
This work extended the concerns of Thresholds (2006) from installation to pedagogy. Where Thresholds demonstrated that representation could be dynamically generated through sensing, Landscapes of Fantasy asked how designers might be trained to think representationally about dynamic systems. The shift from object to field and from “idealization of the object” to “expression of experience” became a teachable methodology.
Cyclone Larry / Ecological Abstraction Layers (ACADIA 2008)
Project Type: Manifest / Peer-Reviewed Paper
Location: ACADIA 2008 Conference, Minneapolis
Publication: Cantrell, B. (2008). Proceedings of the 28th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA).
Medium: Paper, digital visualization, ecological data mapping
Description
Presented at ACADIA 2008, “Cyclone Larry” developed a methodology for visualizing ecological transformation through what the paper terms “abstraction layers” — computational overlays that translate complex ecological data sets (wind patterns, vegetation damage gradients, soil moisture regimes, habitat connectivity) into spatial compositions legible to designers. Using the devastation of Cyclone Larry in Queensland, Australia as a case study, the project demonstrated that ecological processes operating at landscape scale could be rendered as designable conditions rather than merely scientific datasets. The methodology of layered abstraction, decomposing complex ecological phenomena into separable but interrelated spatial fields, became a recurring technique in subsequent work, particularly in the CSS visualization program and the UVA geomorphology lab’s data-driven modeling approach.
Cross-Chapter Resonances: Ch 05 (tools, abstraction as design methodology), Ch 06 (models, data visualization as modeling technique), Ch 07 (technogeographies, ecological data as spatial intelligence)
Abstraction Language (ACADIA 2009)
Project Type: Manifest / Peer-Reviewed Paper
Location: ACADIA 2009 Conference, Chicago
Publication: Cantrell, B. (2009). “Abstraction Language.” In Proceedings of the 29th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA). Chicago, IL.
Medium: Paper, computational visualization methodology
Description
Building on the ecological abstraction methodology introduced in the Cyclone Larry paper, “Abstraction Language” proposed a systematic vocabulary for translating between ecological data and spatial design operations. The paper argued that landscape architecture lacked a formal language for moving between quantitative ecological information and qualitative design decisions, a gap that computational tools could address not by automating the translation but by making its operations visible and manipulable. The “abstraction language” was conceived as a set of operations (filtering, layering, thresholding, and interpolating) that designers could apply to ecological datasets to produce spatial compositions that preserved the informational content while enabling formal and aesthetic judgment. This work established the theoretical foundation for the CSS/TiKI Lab’s visualization program and anticipated the broader argument of Ch 03: that tools are not neutral instruments but manifests that embody epistemological commitments.
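The four named operations can be read as composable transformations on co-registered rasters. The sketch below is a hedged illustration in Python, not the paper’s formalism: the operation names come from the text, while the implementations (small 2D grids standing in for ecological datasets) are assumptions for demonstration.

```python
# Illustrative versions of the four "abstraction language" operations,
# applied to 2D grids standing in for co-registered ecological rasters.

def filter_grid(grid, predicate):
    """Filtering: keep values satisfying a predicate, zero elsewhere."""
    return [[v if predicate(v) else 0 for v in row] for row in grid]

def layer(a, b, weight=0.5):
    """Layering: weighted composite of two co-registered grids."""
    return [[weight * x + (1 - weight) * y for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def threshold(grid, level):
    """Thresholding: reduce a continuous gradient to a binary field."""
    return [[1 if v >= level else 0 for v in row] for row in grid]

def interpolate(a, b, t):
    """Interpolating: blend between two states, e.g. two survey dates."""
    return layer(b, a, weight=t)
```

Because each operation returns a grid of the same shape, they chain freely: one might threshold a soil-moisture raster, layer it against habitat connectivity, and interpolate between storm and recovery states, with every step of the translation visible and adjustable rather than hidden inside an automated pipeline.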
Cross-Chapter Resonances: Ch 05 (tools, tools as manifests, design operations as epistemological commitments), Ch 06 (models, computational visualization methods), Ch 08 (landscape as medium, the gap between quantitative data and qualitative design)
Over/Under: Lausanne Jardins Competition (2009)
Project Type: Speculative Design / Competition Entry
Location: Lausanne, Switzerland (Lausanne Jardins 2009)
Collaborators: Allen Sayegh (INVIVIA), Edith K. Ackermann, Marcella Del Signore
Medium: Urban installation proposal integrating specimen horticulture, hydroponic infrastructure, and digital projection
Description
Over/Under proposed a vertical relationship between a surface-level garden and the underground subway station beneath it. A single specimen citrus tree—a species requiring intensive cultivation to survive the regional climate—would occupy the plaza above, supported by hydroponic infrastructure that made visible the technological apparatus sustaining its life. Below, in the subway station, the specimen would be digitally replicated and transformed, its image projected across the station walls as a rich, shifting planting that responded to the living plant’s seasonal changes.
The project explored the garden as a “mediating interface” between environmental phenomena and sensory perception. The specimen plant above functioned as a singular, intensely cultivated object, an homage to the orangery tradition of 17th-century aristocratic estates, where exotic species were sustained through artificial life-support systems. The digital replication below inverted this singularity into multiplicity, transforming one tree into a forest of projections that commuters would experience in transit.
The interdisciplinary team, comprising a landscape architect, an architect, an interaction designer, and a developmental psychologist, brought distinct expertise to the problem of urban environmental perception. Ackermann’s background in cognitive development informed the project’s interest in how “ambient cues” such as grass blowing or water vapor in air construct perception of landscape and knowledge of temporal events. Sayegh’s work in interactive environments provided the technical framework for linking physical specimen to digital projection.
The hydroponic system was conceived not merely as horticultural infrastructure but as a visible index of the labor required to sustain nature within urban contexts. The project made explicit what urban greenery typically conceals: that plants in cities are artifacts of intensive technological mediation, their survival dependent on irrigation, soil amendment, microclimate management, and ongoing maintenance.
Epistemological Contribution
Over/Under introduced the concept of environmental displacement as a design strategy: the idea that the experience of a landscape condition might be detached from physical presence in that landscape and reconstructed elsewhere through technological mediation. The subway commuter, moving through an underground space “devoid of nature,” would nonetheless encounter a rich botanical environment derived from the living specimen above. This displacement was not conceived as simulation or imitation but as transformation: the digital forest below was explicitly artificial, a replication that exceeded its source.
The project also articulated the specimen/field dialectic that would recur in later work. The singular, intensely cultivated plant, removed from its native habitat and sustained by technological prosthesis, stood in productive tension with the distributed, multiple, digitally generated planting below. This relationship between the one and the many, the physical and the projected, the maintained and the autonomous, anticipated the “synthetic ecology” frameworks of Phase II.
Over/Under is also the embryonic precedent for the cultivant concept developed in Chapter 11: the singular citrus tree, maintained by visible infrastructure, staged in miniature the ongoing relationship between designed intention and biological agency that the cultivant later names at organism, garden, and territorial scale.
The collaboration itself modeled an interdisciplinary practice that would become increasingly central to the work: the recognition that responsive landscape systems require expertise spanning horticulture, computation, interaction design, and cognitive science. No single discipline possesses the knowledge necessary to design environments that mediate between biological process and human perception through technological interface.
Digital Drawing for Landscape Architecture: Contemporary Techniques and Tools for Digital Representation in Site Design (2010)
Project Type: Publication
Authors: Bradley Cantrell, Wes Michaels
Publisher: Wiley
Recognition: 2012 ASLA Award of Excellence in Communication
Description
Digital Drawing for Landscape Architecture addressed a fundamental crisis in the profession’s visual culture. By 2010, digital tools had become ubiquitous in practice, yet the available instructional literature remained either software-centric (button-pushing tutorials for Photoshop or AutoCAD) or borrowed wholesale from architectural visualization traditions that prioritized the clarity of objects over the atmospheric complexity of landscapes. The book provided landscape architecture with a methodology for digital representation calibrated to its specific disciplinary concerns.
The core theoretical contribution was the concept of the digital composite: a recognition that landscape representation requires layering heterogeneous information (texture, shadow, vegetation, atmospheric effect, temporal change) that cannot be generated by any single software engine. The composite method synthesized the precision of vector drawing with the ambiguity of raster imagery, producing representations capable of holding the indeterminacy fundamental to landscape materials.
The book articulated what came to be known as the hybrid workflow:
Analog Base: The process often began with hand-drawn linework, not as nostalgic concession but as recognition that the human hand captures gestural information (the looseness of a tree canopy, the roughness of a path) that mouse-driven tools struggle to replicate.
Digital Synthesis: Analog bases were scanned and manipulated in image-editing software, layered with photographic textures, rendered passes from 3D models, and digital painting techniques.
Atmospheric Construction: Unlike architectural renderings prioritizing object clarity, the techniques focused on atmosphere with detailed workflows for rendering fog, light beams, rain, seasonal change, and the temporal qualities that define landscape experience.
Epistemological Contribution
Digital Drawing argued implicitly that representation is never neutral. The way a landscape architect draws a site determines how they understand it. Vector lines in CAD imply hard edges (walls, curbs, building footprints), yet landscape materials (vegetation, water, soil gradients, fog) are defined by indeterminacy and flux. By introducing compositing techniques that could hold ambiguity and atmospheric depth, the book equipped the discipline to represent, and therefore to think about, higher levels of complexity.
The ASLA jury recognized this contribution, noting that the book appealed “specifically to landscape architects who have been trained in traditional representation methods… but who would like to work effectively in digital media.” One juror remarked: “It’s the first drawings book I’ve seen that is contemporary and clear.” The recognition confirmed the central thesis of a textbook that had effectively standardized a new visual language for the profession.
The book also established a pedagogical model that would carry through subsequent publications: theory-grounded instruction that situated technical skills within larger questions about how representation shapes design thinking. The techniques were not ends in themselves but means toward richer engagement with landscape’s atmospheric and temporal qualities.
Coastal Sustainability Studio / TiKI Lab (2010–2013)
Project Type: Practice / Trans-disciplinary Research Laboratory
Location: Louisiana State University, Robert Reich School of Landscape Architecture
Collaborators: Jeff Carney, Lynne Carter, Traci Birch, John Supan; numerous research staff and graduate students
Funding: HUD, NOAA, Louisiana Governor’s Office of Homeland Security and Emergency Preparedness
Medium: GIS mapping, 3D animation, hydrological modeling, coastal resilience planning, community engagement, policy documents, video production
Description
The Coastal Sustainability Studio (CSS) and its technical research arm, the TiKI Lab (Technologies, Information, Knowledge, Interaction), were trans-disciplinary research centers at LSU dedicated to coastal resilience, community adaptation, and territorial-scale visualization of Louisiana’s land loss crisis. CSS brought together landscape architects, urban planners, environmental scientists, sociologists, and community advocates to develop resilience strategies for communities facing coastal erosion, subsidence, and storm surge vulnerability. The TiKI Lab focused specifically on developing visualization and communication technologies, producing animated sequences, interactive GIS platforms, and spatial narratives that translated complex scientific data into formats accessible to policymakers and affected communities. Over three years, the studios produced community resilience plans, policy documents, animated visualizations, and spatial analyses that directly informed Louisiana’s coastal protection and restoration planning process. The CSS/TiKI Lab demonstrated that trans-disciplinary studio culture, the organizing model that landscape architecture brings to complex territorial problems, could operate as a primary mode of knowledge production, not merely as a setting for applying disciplinary expertise to predetermined problems.
Cross-Chapter Resonances: Ch 04 (ecology of practice, the studio as knowledge-producing community), Ch 05 (tools, visualization as manifest, the TiKI Lab as technical research instrument), Ch 06 (models, territorial-scale visualization and modeling), Ch 07 (technogeographies, who determines how coastal data is visualized and for whom)
Mississippi River Sediment Diversion Visualizations, Myrtle Grove (2011–2014)
Project Type: Trans-disciplinary Research / Visualization / Animation
Award: ASLA 2012 Professional Awards submission (Communications Category, PA-634)
Location: Myrtle Grove, Plaquemines Parish, Louisiana — 25 miles south of New Orleans
Client: National Wildlife Federation (NWF)
Collaborators: CSS/TiKI Lab research team
Medium: After Effects animation, 3D modeling, GIS mapping, photography, video production; scientific data from Army Corps of Engineers hydrological models
Description
Commissioned by the National Wildlife Federation, this trans-disciplinary academic research studio effort explained the problems of coastal wetland loss and the potential solutions for rebuilding Louisiana’s coast through large-scale sediment diversions from the Mississippi River. The video focused on the proposed Myrtle Grove sediment diversion, a contained channel that would allow sediment-rich freshwater to pulse from the Mississippi River into degrading coastal marsh during peak flood seasons, rebuilding significant portions of wetland over a projected 50-year timeline. The production integrated scientific data and analysis, Army Corps engineering documents, historical records, GIS mapping, 3D modeling and animation, photography, and videography into a cohesive narrative accessible to the general public. The animation visualized both the historical process and the proposed engineered restoration: controlled pulsed sediment diversion employing the river’s natural forces to rebuild what infrastructure had destroyed.
Cross-Chapter Resonances: Ch 05 (visualization as manifest, the animation as a tool for public discourse), Ch 06 (models as instruments, hydrological models translated into spatial narrative), Ch 08 (landscape as medium, the Mississippi Delta as the paradigmatic case), Ch 07 (technogeographies, who controls how coastal restoration data is visualized)
Phase II: The Operative Landscape (2012–2016)
If Phase I asked how to represent dynamic systems, Phase II asked how to operate within them. The Mississippi River Delta, a landscape where the boundary between “natural” process and engineered infrastructure had long since collapsed, became the laboratory for this inquiry. In this landscape, sediment deposition builds land while subsidence and sea-level rise consume it. Infrastructural levee systems constrain flows that once nourished the surrounding wetlands. Oil infrastructure threads through ecosystems it simultaneously exploits and endangers while sustaining a region economically. The Delta demanded more than better pictures; it demanded instruments and tools capable of sensing, simulating, and ultimately intervening in active geomorphic processes.
The projects of this phase moved from screen to territory, from the image to the apparatus. Physical-digital hybrid models coupled material simulation with real-time sensing, creating feedback loops between flow, deposition, and computational analysis. Speculative frameworks imagined infrastructures that would evolve autonomously, guided by ecological data rather than engineering specifications. The concept of “responsive systems” matured into a precise taxonomy of how technology might mediate between human intention and environmental process.
This phase established the cybernetic framework that would inform subsequent theoretical development, framing landscapes as systems characterized by feedback, self-regulation, and adaptive behavior. It also began to define design as the calibration of parameters rather than the specification of formal outcomes, and technology as a prosthesis extending the landscape’s own capacity to sense and respond to changing conditions.
Modeling the Environment: Techniques and Tools for the 3D Illustration of Dynamic Landscapes (2012)
Project Type: Publication
Authors: Bradley Cantrell, Natalie Yates
Publisher: Wiley
Description
If Digital Drawing addressed the static image, Modeling the Environment introduced the critical dimension of time. Landscape architecture is distinguished from architecture by the fact that its medium is alive. A building is finished when construction ends, but a landscape is just beginning. The book addressed the technical challenge of representing growth, decay, erosion, succession, and the other temporal processes that define landscape as dynamic system.
The publication introduced landscape students to procedural and parametric modeling approaches:
Procedural Modeling: Rather than modeling static terrain, the book demonstrated how to model terrain that could erode with surfaces defined by rules governing their transformation over time. Rather than placing static tree blocks, it explored simulating forest growth across decades, with planting plans that evolved according to ecological logic.
Data Integration: The book argued for integrating GIS (Geographic Information Systems) data directly into design models, transforming the digital model from a “picture” of the site into a “database” of the site. By linking models to real-world geospatial data, designers could work with actual topographic, hydrological, and vegetative information rather than abstracted representations.
Temporal Representation: The text developed techniques for representing seasonal change, growth patterns, material weathering, and other time-based phenomena. Landscape representation, the book argued, must capture not merely how a site appears at a moment but how it transforms across multiple temporal scales.
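The rule-based transformation that the procedural modeling discussion describes can be illustrated with a toy example. The sketch below implements a minimal “thermal” erosion rule on a one-dimensional height profile, where material moves downslope whenever the local slope exceeds a threshold; all names and parameters are illustrative and assume nothing about the book’s actual software or workflows.

```python
# Toy instance of rule-based terrain transformation: a 1-D "thermal"
# erosion pass. Where the height difference between neighboring cells
# exceeds a talus threshold, a fraction of the excess moves downhill.
# Repeated passes let terrain "erode over time" per the rule, which is
# the procedural idea in miniature (parameters are hypothetical).

def erode_step(heights, talus=1.0, rate=0.5):
    """One erosion pass over a list of cell heights. Material is
    conserved: whatever leaves a higher cell lands in its neighbor."""
    h = list(heights)
    for i in range(len(h) - 1):
        diff = h[i] - h[i + 1]
        if abs(diff) > talus:
            move = rate * (abs(diff) - talus)
            if diff > 0:
                h[i] -= move
                h[i + 1] += move
            else:
                h[i] += move
                h[i + 1] -= move
    return h

# Applying erode_step repeatedly relaxes steep slopes toward the
# talus angle, a rule governing transformation rather than a form.
```

The design choice worth noting is that the rule specifies a behavior, not a shape; the resulting terrain emerges from iteration, which is precisely the shift from modeling static geometry to modeling process.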
The focus on environmental dynamics positioned the book as a bridge between representation and simulation. Modeling was not merely a means of producing images but a way of understanding how systems behave, a conceptual shift that would mature in subsequent research on responsive and adaptive systems.
Epistemological Contribution
Modeling the Environment advanced the argument that landscape representation must engage temporal depth. Static images, however atmospherically rich, cannot capture the fundamental characteristic that distinguishes landscape from architecture: landscape is always in process, always becoming, never finished. By introducing techniques for modeling dynamic systems, the book provided tools for thinking about landscape as trajectory rather than state.
The integration of GIS data also signaled a methodological shift from representation to analysis. When the digital model contains actual geospatial information, it becomes an instrument for investigating site dynamics rather than merely depicting site appearance. This data-driven approach foreshadowed the sensing-based methodologies that would characterize Phase II research.
The book contributed to an emerging discourse on design simulation, using modeling tools not to visualize predetermined designs but to explore how designs might perform under varying conditions. If a planting plan can be simulated across twenty years of growth, designers can evaluate not merely how it looks at installation but how it matures, how it responds to disturbance, how it provides ecological function across its lifespan.
Sedimachine (2012)
Project Type: Manifest / Prototype
Location: Louisiana State University
Collaborators: Frank Melendez (Assistant Professor of Architecture)
Research Assistant: Prentiss Darden
Medium: Plexiglass surface, perforated copper tube water delivery system, Arduino-controlled robotic spillway, Microsoft Kinect depth sensing
Description
The Sedimachine was a prototype apparatus developed to investigate the relationship between controlled water delivery, sediment deposition, and surface morphology. The project explored how small modifications to flow conditions might choreograph sediment patterning—a question that would later scale to territorial concerns about land-building in coastal environments.
Phase One: Passive Flow System
The initial configuration consisted of a plexiglass surface positioned at an incline, with water delivered via a copper tube perforated at intervals along its length. Sand introduced into the flow would pattern across the transparent surface based on which perforations allowed water through, creating depositional geometries responsive to the specific configuration of openings. The plexiglass substrate was chosen deliberately: its smooth, transparent surface provided minimal disruption to flow dynamics and reduced variables when documenting sediment behavior. Researchers could observe patterning from both above and below the surface, capturing how particles organized under varying flow conditions.
The system also enabled testing of flow-disrupting objects. Fishing weights, washers, and other small elements could be placed on the surface to investigate how obstacles manipulate sediment deposition. These were early prototypes of the “flow enhancers, resistors, and redirectors” that would become a formal design vocabulary in later research.
Phase Two: Robotic Spillway
The second phase introduced active control through a robotic spillway with twelve operable gates. Each gate was actuated by a servo motor controlled through an Arduino microcontroller, enabling programmable sequences of opening and closing that choreographed water delivery across the surface. This configuration transformed the apparatus from a passive observation system into an active instrument capable of testing how temporal patterns of flow control might produce specific depositional outcomes.
The robotic spillway represented an early implementation of responsive infrastructure at model scale: gates could be programmed to react to conditions or to cycle through predetermined sequences, producing sediment patterns through choreographed water management rather than static channel geometry.
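The kind of choreographed gate sequencing the spillway performed can be sketched abstractly. The example below, a hypothetical illustration rather than the project’s actual Arduino firmware, generates a scripted schedule for twelve gates in which a band of open gates sweeps across the spillway, one position per timestep.

```python
# Hypothetical sketch of choreographed gate sequencing for a
# twelve-gate spillway. Each timestep yields the set of gates a servo
# controller would hold open; the "sweeping band" pattern and all
# parameter names are illustrative, not from the project record.

NUM_GATES = 12

def wave_sequence(steps, width=3):
    """Sweep a band of `width` open gates across the spillway,
    advancing one gate position per timestep and wrapping around."""
    schedule = []
    for t in range(steps):
        start = t % NUM_GATES
        open_gates = {(start + i) % NUM_GATES for i in range(width)}
        schedule.append(sorted(open_gates))
    return schedule

# A controller would iterate over the schedule, actuating each listed
# gate's servo open and closing the rest, then waiting a fixed interval.
plan = wave_sequence(steps=24)
```

The point of the sketch is that flow delivery becomes a scripted, repeatable variable: the same sequence can be replayed, modified, or made conditional on sensed deposition.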
Sensing Experiments:
The project included initial tests using a Microsoft Kinect depth sensor to document surface morphology. These experiments were largely unsuccessful: the depositional layer proved too thin for the Kinect’s resolution to capture meaningful topographic variation. This limitation would inform subsequent research, driving the development of multi-modal sensing approaches and the adoption of physical modeling systems (like the EmRiver geomorphology table) capable of producing thicker, more legible stratigraphic deposits.
Epistemological Contribution
The Sedimachine established several foundational principles that would carry through subsequent research:
Choreography over Form: The project demonstrated that sediment patterning could be approached as a design problem—not by sculpting deposits directly but by controlling the flow conditions that produce them. The perforated copper tube and robotic spillway were instruments for choreographing process rather than specifying outcome. This distinction between designing forms and designing the operations that generate forms became central to later theoretical frameworks.
The Plexiglass: The choice of a smooth, transparent substrate reflected a methodological commitment to isolating variables. By eliminating surface friction and enabling observation from multiple vantage points, the plexiglass surface allowed researchers to study flow-sediment interaction with minimal confounding factors. This controlled simplicity enabled clear documentation of cause-and-effect relationships, even as it abstracted from the complexity of actual riverine or coastal environments.
Object as Flow Modifier: The testing of weights and washers as flow-disrupting elements introduced the concept of designed objects whose primary function is to modulate dynamic processes rather than to occupy space as static forms. These small-scale tests prefigured the responsive prototypes (sediment gates, sieves, disruptors) developed at REAL and UVA, establishing a design approach focused on intervention within flows rather than construction of fixed structures.
Sensing Limitations as Research Drivers: The unsuccessful Kinect experiments revealed the technical challenges of capturing thin depositional layers through depth sensing. Rather than abandoning digital documentation, this limitation motivated the search for modeling systems producing more substantial morphological change and the development of complementary sensing approaches (ultrasonic range-finding, image analysis) that could capture phenomena at different scales and resolutions. Failure became productive, directing subsequent technical development.
Programmable Hydrology: The Arduino-controlled spillway introduced programmability into the experimental apparatus. Flow was no longer a constant to be observed but a variable to be designed—sequences of gate operations could be scripted, repeated, and modified based on observed outcomes. This capacity for programmable hydrology enabled systematic experimentation and laid groundwork for the repeatable hydrographs that would become standard practice in later geomorphology table research.
Fort Proctor: A Conditional Preservation (2013)
Project Type: Practice / Design Research + Peer-Reviewed Paper
Location: Fort Proctor, Plaquemines Parish, Louisiana (Lake Borgne)
Publication: Emery McClure, U. and Cantrell, B. (2013). “Fort Proctor: A Conditional Preservation.” Proceedings of the Architectural Research Centers Consortium (ARCC).
Medium: Design research, field documentation, speculative preservation proposal, mapping, photography
Description
Fort Proctor, a Civil War-era fortification originally constructed on the southern shore of Lake Borgne in Plaquemines Parish, Louisiana, now stands entirely surrounded by water; the land that once connected it to the mainland has eroded and subsided. The project used Fort Proctor as a case study for what the paper terms “conditional preservation”: a preservation ethic that accepts the dynamic, entropic character of landscape rather than attempting to freeze a historical moment. Traditional historic preservation assumes a stable ground condition: the building endures because the land endures. Fort Proctor’s condition reveals this assumption as untenable in deltaic landscapes, where the ground itself is in continuous transformation. The design research proposed preservation strategies calibrated to the fort’s ongoing submergence: not restoration to an original condition, but designed engagement with the process of loss itself.
Cross-Chapter Resonances: Ch 08 (landscape as medium, the ground as dynamic, not stable), Ch 09 (interactions, adaptive management of heritage in dynamic landscapes), Ch 07 (technogeographies, sensing what is being lost and at what rate)
DredgeFest (2013–2014)
Project Type: Public Program / Workshop / Field Investigation / Symposium
Organizer: Dredge Research Collaborative (DRC) — Rob Holmes, Brett Milligan, Sean Burkholder, Brian Davis, Justine Holzman
Role: Leader of the “Adaptive Devices” workshop (DredgeFest Louisiana, January 13–16, 2014)
Locations: Loyola University, New Orleans (symposium) + LSU Robert Reich School of Landscape Architecture, Baton Rouge (workshops)
Workshop Collaborators: Justine Holzman, David Merlin (Tulane)
Medium: Small-scale fluvial model, Arduino, Firefly/Grasshopper plugin, Rhino/Grasshopper scripts, servo motors, shape memory alloys, field investigation
Published: Milligan, B. and Holmes, R. (2015). “DredgeFest: Social Experiments in Sedimentary Landscapes.” ACSA 103rd Annual Meeting Proceedings, The Expanding Periphery and the Migrating Center, 220–229.
Description
DredgeFest was a series of public programs organized by the Dredge Research Collaborative that treated dredging, the mechanical excavation and relocation of sediment from waterways, as cultural, political, and designable infrastructure. For DredgeFest Louisiana (January 2014) at LSU, three workshops were hosted at the Robert Reich School of Landscape Architecture, each examining innovative methodologies of sediment manipulation at multiple scales. The “Adaptive Devices” workshop, led by Cantrell with Justine Holzman and David Merlin, explored how sensing and real-time feedback could alter the flow of sediment at localized sites. Participants used a small-scale fluvial model and physical computing tools (Arduino microcontrollers with the Firefly plugin for Grasshopper, servo motors, and shape memory alloys) to prototype responsive sedimentary infrastructures. The workshop staged a direct encounter between computational design tools and hydrological processes, prototyping at tabletop scale the kind of responsive infrastructure the dissertation theorizes at territorial scale.
Cross-Chapter Resonances: Ch 04 (ecology of practice, the DRC as formative community), Ch 05 (the workshop as proto-lab), Ch 08 (sediment as medium of territorial formation), Ch 09 (Arduino/Firefly/Grasshopper as responsive interface), Ch 10 (shape memory alloys as material responsivity)
Responsive Environments and Artifacts Lab (REAL) (2014–2017)
Project Type: Research Infrastructure / Manifest
Location: Harvard Graduate School of Design
Collaborators: Allen Sayegh (REAL co-director), Justine Holzman
Student Researchers: Phia Sennett, Jeremy Hartley
Medium: Physical-digital hybrid modeling system with real-time sensing and robotic actuation
Description
The Responsive Environments and Artifacts Lab (REAL), co-directed with Allen Sayegh at the Harvard Graduate School of Design, emerged from our shared interest in responsive environments: Sayegh focused on small-scale devices and architectural spaces, while my interest lay in regional-scale research into infrastructural adaptation. My work at the lab investigated how responsive technologies could choreograph territorial-scale landscape processes, focusing on sediment transport and the potential to harness hydrological energy for land-building. REAL’s approach synthesized the computational capabilities of numerical modeling with the tangible, material engagement of physical hydraulic modeling—a tradition with deep roots in engineering practice.
The history of physical hydraulic modeling provides essential context. As Hubert Chanson has documented, reduced-scale models have been used since antiquity to study flow phenomena, but the practice reached institutional maturity in the early twentieth century through massive installations like the U.S. Army Corps of Engineers’ Mississippi Basin Model at Vicksburg. These models enabled engineers to observe complex fluid dynamics that resisted mathematical formalization (turbulence, sediment transport, channel migration) through direct material engagement. Yet traditional hydraulic models remained analog instruments, observed by human eyes and interpreted through engineering judgment. REAL asked what might become possible when physical models were instrumented with digital sensing and coupled to computational analysis.
The lab’s primary research instrument was a geomorphology modeling table, originally designed by Steve Gough at EmRiver, extensively augmented with sensing systems and software tools developed at REAL. The resulting apparatus operated as a composite model, a physical-digital hybrid where sensing technologies capture material behavior and feed data into computational systems for analysis, visualization, and feedback control. Chris Paola and colleagues have argued for the “unreasonable effectiveness” of such experimental stratigraphic work, noting that physical models can reproduce “complex nonlinear physical phenomena which are not yet fully understood, and therefore not available in the form of numerical models.” REAL extended this insight from scientific investigation to design methodology.
Physical Parameters: The modeling environment operates with synthetic sediment particles of varying sizes and densities. These particles self-organize based on water velocities, with larger/lighter particles (yellow) simulating gravel and smaller/denser particles (red) simulating fine silt. The self-sorting behavior produces stratigraphic patterns analogous to (though not scaled replicas of) natural fluvial deposits. As water moves across the surface, through channels, and as groundwater below, erosion and deposition organize the sediment to create landscapes of river beds, islands, deltas, beaches, dunes, and ripples.
Programmable pumps control both stream flow and groundwater flow through repeatable hydrographs, enabling systematic experimentation. Media feeders introduce measured sediment mixes, and dye injectors visualize flow patterns. The adjustable bed establishes lateral slope, longitudinal slope, and base water level. These variables set initial conditions and can be modulated over time, allowing researchers to test how systems respond to changing inputs.
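The repeatable hydrograph the pumps replay can be sketched as a simple scripted discharge curve. The triangular flood-pulse shape, units, and parameter names below are assumptions for illustration, not the lab’s actual pump protocol; what matters is that the same function of time yields the same flow on every run.

```python
# Minimal sketch of a repeatable hydrograph: a periodic, scripted
# discharge curve that programmable pumps could replay identically
# across experimental runs. Shape and parameters are hypothetical.

def triangular_hydrograph(t, base=0.2, peak=1.0, rise=30, fall=60, period=120):
    """Discharge (illustrative units) at time t seconds. Each cycle:
    linear rise from baseflow to peak over `rise` seconds, linear
    recession over `fall` seconds, then baseflow until the cycle
    restarts every `period` seconds."""
    t = t % period
    if t < rise:
        return base + (peak - base) * t / rise
    elif t < rise + fall:
        return peak - (peak - base) * (t - rise) / fall
    return base

# Because discharge is a pure function of time, two runs driven by
# this curve receive identical forcing, which is what makes
# morphological outcomes comparable across experiments.
```
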
Sensing Systems: REAL developed software tools integrating multiple sensing modalities:
Microsoft Kinect mounted overhead for continuous point cloud generation of surface topography, capturing data at approximately 60-degree viewing angle with point resolution of roughly 10mm at the surface
Ultrasonic range-finding sensors on motorized rails for precise spot elevations and section cuts, offering higher accuracy than depth cameras and controllable through predefined sampling patterns
Image analysis algorithms tracking sediment behavior, material sorting, and morphological change
Flow visualization through dye tracking to reveal velocity gradients and turbulence patterns
The overlay of sensing modalities produced data at multiple resolutions and temporalities, from continuous low-resolution point clouds to precise spot measurements, from instantaneous flow visualization to long-duration morphological tracking.
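One step in such a pipeline, reducing overhead depth frames to morphological change, can be sketched abstractly: elevation rises wherever the sensor-to-surface distance shrinks between frames. The grids, noise threshold, and function below are hypothetical illustrations; the lab’s actual processing software is not reproduced here.

```python
# Illustrative reduction of two overhead depth frames (mm from sensor
# to surface) into a per-cell elevation-change grid. Positive values
# indicate deposition, negative erosion. The noise floor reflects the
# thin-deposit resolution problem the sensing discussion notes;
# all values and names are assumptions for illustration.

def elevation_change(depth_before, depth_after, noise_floor=2.0):
    """Per-cell change in surface height (mm) between two depth
    grids of identical shape. Changes smaller than `noise_floor`
    are zeroed as sensor noise."""
    rows = []
    for row_b, row_a in zip(depth_before, depth_after):
        out = []
        for d_b, d_a in zip(row_b, row_a):
            dz = d_b - d_a  # surface moved toward sensor => deposition
            out.append(dz if abs(dz) >= noise_floor else 0.0)
        rows.append(out)
    return rows
```

Summing the positive and negative cells of such a grid over time is one simple way to track net deposition and erosion across an experiment.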
Responsive Prototypes: The feedback between sensing and physical model enabled development of actuated design elements: robotic sediment gates, repositionable sieves, and flow disruptors. These prototypes formed a design vocabulary of “flow enhancers, resistors, and redirectors”—operations that intervene locally but aggregate to produce territorial change. The research explored three modes of engagement: direct interaction through physical manipulation; responsivity through sensing-driven reaction to detected conditions; and autonomy through systems that learn or evolve behaviors through feedback.
Cyborg Coasts Seminar (2015): The geomorphology table and associated software debuted pedagogically in “Cyborg Coasts,” a seminar exploring how designed interventions might modulate sediment movement in coastal environments. Student research tested responsive infrastructural prototypes within the modeling environment. Ricardo Jnani Gonzalez’s “Attuning Sediment Transfer” investigated how actuated elements could choreograph deposition patterns. Andrew Boyd and Tyler Mohr’s “FIN” developed a formal language for land-formation devices, documenting their effects through iterative image analysis.
Epistemological Contribution
REAL established composite modeling as a design methodology for landscape architecture. The practice of coupling physical and numerical models in feedback relationship was still nascent in the hydraulic engineering community but showed exceptional promise for design applications. Physical models excel at reproducing emergent behaviors (sediment sorting, channel braiding, delta lobe switching) that resist mathematical formalization. Digital models excel at pattern recognition, analysis across scales, and control logic. The synthesis enabled new forms of design inquiry unavailable to either approach alone.
This methodology drew on scientific precedents while departing from scientific aims. Kleinhans and colleagues have demonstrated how experimental models can reveal morphodynamic principles even when not scaled to specific prototypes. Bernard Patten and Eugene Odum’s work on “The Cybernetic Nature of Ecosystems” provided theoretical grounding for understanding landscapes as self-regulating systems characterized by feedback loops. Anastassia Makarieva’s research on biotic regulation suggested how biological and physical processes co-produce environmental conditions. REAL synthesized these perspectives into an experimental practice where designed interventions could be tested against material dynamics in real-time.
The lab developed the critical concept of the model-as-site. Traditional hydraulic modeling seeks similitude through mathematical relationships (Froude scaling, Reynolds numbers) that establish proportional correspondence between model and prototype. REAL deliberately departed from this convention. The geomorphology table was conceived as its own environment, with its own elements of novelty and surprise, rather than a scaled replica of any particular landscape. The presumed scale was illustrative and diagrammatic rather than establishing linear or proportionate relationships with real-world conditions.
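The Froude similitude that conventional hydraulic modeling relies on can be stated compactly. For a model at geometric scale $\lambda = L_m/L_p$ (model length over prototype length), requiring equal Froude numbers fixes the velocity, time, and discharge scales:

```latex
\mathrm{Fr} = \frac{V}{\sqrt{g L}}, \qquad
\mathrm{Fr}_m = \mathrm{Fr}_p
\;\Longrightarrow\;
\frac{V_m}{V_p} = \sqrt{\lambda}, \quad
\frac{T_m}{T_p} = \sqrt{\lambda}, \quad
\frac{Q_m}{Q_p} = \lambda^{5/2}.
```

It is precisely this chain of proportional correspondences that REAL set aside: the table’s behaviors were read as principles of interaction rather than as scaled predictions of any prototype landscape.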
This reframing had methodological implications. Rather than using models to predict outcomes in specific sites, designers could use them to discover principles of interaction between flow, sediment, and intervention. The model became a generative space, a dialogue between the experimental environment and the broader systems it serves to inform, where design intentions could be tested against material dynamics without claiming predictive authority over particular landscapes.
Leif Estrada’s MLA thesis, “Towards Sentience,” explored the deconstruction and reconstitution of the Los Angeles River channel. The project proposed individually controlled insertions that redirect sediment from the demolished concrete channel, working with cyclical hydrology to re-pattern the river bed. The new channel emerges through interactions that guide outcomes within controlled ranges, with aggregated effects responsive to current hydrology while projecting future goals.
Andrew Boyd and Tyler Mohr’s “FIN” continued development of actuated land-formation devices, creating a systematic taxonomy of flow enhancers, resistors, and redirectors. The project documented effects through iterative image analysis, producing topographic plans that speak to movement and probabilities of change, representations expressing landscape as a gradient between stasis and flux rather than as fixed form.
Antoine Picon’s historical scholarship on French hydraulic engineering, particularly his studies of how technical knowledge develops through iterative engagement with territorial systems, informed REAL’s understanding of the relationship between modeling and intervention. The lab’s work also drew on precedents in landscape practice, particularly the Owens Lake remediation project documented by Alexander Robinson, where adaptive management of dust emissions required ongoing negotiation between design intention and environmental response.
Four primary applications emerged from REAL’s investigations:
Modeling of dynamic systems: Creating physical environments exhibiting fluvial behaviors (meandering, delta building, channel braiding, tidal fluctuation)
Observation of complex dynamic systems: Using sensing technologies to capture and analyze emergent phenomena
Prototyping of devices that engage with dynamic systems: Testing actuated elements within the model environment
Testing responsive systems and methods of feedback: Developing control logics linking sensing to actuation
This framework positioned the geomorphology table not as a representational device but as an experimental territory, a space where the unpredictability of material behavior could inform design thinking about landscapes characterized by flux and indeterminacy.
Branding Islands Making Nations: Venice Architecture Biennale (2016)
Project Type: Speculative Exhibition / Film Installation
Location: 15th International Architecture Exhibition, Venice Architecture Biennale
Collaborators: Vertical Geopolitics Lab (VGL)
Medium: Film, large-format prints
Description
Branding Islands Making Nations was an exhibition and public program developed in collaboration with the Vertical Geopolitics Lab for the 2016 Venice Architecture Biennale. The project examined the relationship between physical land creation and the representational language deployed to legitimize it, exposing how branding, marketing, and visual communication extend the reach and power of nation-states through constructed territory.
The contribution included a film exploring how the production of new land intersects with the production of meaning. Where the geomorphology table research investigated the material dynamics of sediment deposition and land-building, this project interrogated the parallel process by which territorial claims are constructed through representation. The large-format prints documented case studies of how branding and marketing operate as instruments of geopolitical power, transforming artificial landmasses into naturalized extensions of sovereign territory.
The conceptual framework positioned representation as constitutive rather than solely descriptive, arguing that the politics of representation determine the success of interventions in the built environment. Land that appears on maps, that receives names and flags, that circulates through media imagery, acquires a political reality that may precede or even substitute for material presence. The project asked how spatial practitioners (landscape architects, architects, designers, consultants) participate in this representational economy, whether consciously or not.
Competition Format: The public program took the form of a satirical case study competition, inviting marketing experts, communication designers, and advertising professionals to speculate on the role of branding in place-making. Departing from ethical critique toward satire, the competition employed a format foreign to academic architectural discourse: the sales pitch. Entrants were tasked with developing branding packages for artificial landmasses instrumental in legitimizing territorial claims, specifically within the contested geography of the South China Sea.
The competition framework deliberately suspended moral evaluation, analyzing entries instead for their “rich interrogation of the role representation plays in its ability to distort, mislead, disguise, or reverse meaning.” By creating incongruity between reality and what is represented, successful entrants demonstrated how the negative might appear positive in legitimizing territorial claims. The goal was not to endorse such practices but to expose them, making visible the mechanisms by which representation manufactures political reality.
Critical Intent: The project operated through strategic exposure rather than direct opposition. As the exhibition statement noted: “it is based on the firm belief in the necessity to expose the glitches, loopholes, and gray areas in systems as first step toward conflict resolution.” By staging the representational strategies of geopolitical land-making as design problems, the exhibition created discourse on the status spatial practitioners hold within societal decision-making structures, caught between accountability and profitability, between ethical responsibility and technical capability.
Epistemological Contribution
Branding Islands Making Nations extended the land-making research into explicitly political territory, revealing that the choreography of sediment is inseparable from the choreography of meaning. Several key insights emerged:
Representation as Territory: The project demonstrated that constructed land acquires political existence through representational practices as much as through material deposition. An artificial island becomes sovereign territory not merely when sand is dredged and piled but when it appears on official maps, when it receives a name, when satellite imagery circulates through news media, when it is branded with national identity. This insight complicated the technical optimism of the geomorphology research, suggesting that land-building is always already a political act embedded in systems of representation and power.
The Designer’s Complicity: By inviting spatial practitioners to occupy the position of nation-state consultants, developing branding packages for contested territorial claims, the project forced confrontation with the political economy of design expertise. Landscape architects, architects, and planners possess technical knowledge that can serve multiple masters; the capacity to choreograph land-making processes is also the capacity to enable territorial expansion, displacement, and conflict. The exhibition asked not whether designers should engage with such commissions but how they might do so with awareness of their position within larger power structures.
Satire as Method: The competition format employed satire as a critical methodology. By treating geopolitical branding as a legitimate design problem, complete with briefs, juries, and evaluation criteria, the project defamiliarized practices that typically operate invisibly. The absurdity of awarding prizes for territorial propaganda exposed the absurdity already present in the actual practices of nation-state image-making. Satire enabled critique without moralizing, creating space for analysis of how representation functions rather than simply condemning its misuse.
From Material to Meaning: The project marked an important expansion in the research trajectory. Earlier work with the Sedimachine and with the geomorphology table focused on the material dynamics of land formation: how water, sediment, and intervention interact to produce terrain. Branding Islands Making Nations asked what happens after (or alongside) material production: how newly made land is inscribed with meaning, claimed by sovereigns, contested by rivals, and circulated through global media.
Responsive Landscapes: Strategies for Responsive Technologies in Landscape Architecture (2016)
Project Type: Publication
Authors: Bradley Cantrell, Justine Holzman
Publisher: Routledge
Description
Responsive Landscapes provided the discipline with a precise vocabulary for discussing “smart” systems in landscape contexts. While architecture and urban design had embraced discourses of responsive environments and smart cities, landscape architecture lacked a theoretical framework specific to its concerns with ecological process, territorial scale, and temporal depth. The book addressed this gap, offering both conceptual taxonomy and practical strategies for integrating sensing, processing, and actuation into landscape systems.
The book’s central contribution was a taxonomy of responsive modes—six distinct categories describing how technology might mediate between environmental process and human experience:
Elucidate — Technologies revealing invisible environmental processes or data. Making the invisible visible (air quality, water currents, soil moisture) to raise awareness.
Compress — Systems condensing vast spatial or temporal scales into human-perceivable formats. Allowing humans to understand geological time or territorial scale through data visualization.
Displace — Telepresence technologies enabling experience of a site from remote locations. Disconnecting experience from physical presence; essential for hazardous or inaccessible sites.
Connect — Networked systems linking disparate agents or sites (IoT). Creating planetary-scale landscape management through distributed sensing.
Ambient — Ubiquitous background technologies monitoring without intrusion. The landscape as sentient skin; technology receding into environmental background.
Modify — Active systems physically altering the environment in real-time. Transition from designing form to designing behavior; actuated infrastructure.
This taxonomy moved conversation beyond vague notions of “smart” landscapes to specific design strategies. Each mode implied different relationships between technology, environment, and human experience; designers could select and combine modes according to project-specific goals.
The book drew on research conducted at REAL and through projects like the Sedimachine and geomorphology table, grounding theoretical frameworks in experimental practice. Case studies ranged from small-scale installations to territorial-scale speculations, demonstrating that responsive approaches operated across scales.
Epistemological Contribution
Responsive Landscapes argued that technology in landscape is not merely about efficiency but about phenomenology—changing how humans perceive and interact with environmental systems. This distinction separated the book from smart city discourses focused on optimization (smart traffic, smart grids, smart waste management) and positioned responsive technology as a medium for deepening environmental experience and understanding.
The taxonomy also established that responsiveness is not a single condition but a spectrum of engagements. The gradient from Elucidate (revealing) through Connect (networking) to Modify (actuating) traced increasing levels of technological intervention in environmental process. Designers could calibrate their approach along this spectrum, selecting modes appropriate to context, intent, and ethical consideration.
Jason Kelly Johnson and Nataly Gattegno, in the book’s foreword, described the work as defining “an emerging world of robotic ecologies, where matter at all scales is programmable, parametric, networked, and laden with artificial intelligence.” Katie Kingery-Page, reviewing the book in the Journal of Architectural Education, noted that it “presents compelling examples of the messy potential of responsive technology interacting with large-scale dynamic landscape systems.” These responses confirmed the book’s success in articulating a new domain of practice.
The publication also introduced the concept of landscape as cybernetic system—environments characterized by feedback loops between sensing, processing, and actuation. This framing drew on mid-century cybernetic theory (Wiener, Bateson) while updating it for contemporary technological and ecological contexts. Landscape was reconceived not as passive recipient of design intervention but as active participant in ongoing cycles of perception and response.
Phase III: Codification (2016–2020)
The move from Louisiana State University to Harvard’s Graduate School of Design and subsequently to the University of Virginia marked an institutional scaling of the computational agenda developed in previous phases. The question shifted from whether landscape architecture should engage computation to how it might develop computational methods specific to its disciplinary concerns. Architecture’s parametric turn had produced formally complex buildings; landscape’s computational turn would need to produce something different: a suite of tools for managing ecological complexity, indeterminacy, and temporal depth.
The projects and pedagogical initiatives of this phase focused on codification in multiple senses:
the literal writing of code (scripts, algorithms, custom software tools)
the establishment of codes (protocols, taxonomies, disciplinary standards for computational practice)
the encoding of landscape logics into executable form.
Students were trained not merely to use software but to make it, to open the “black box” of commercial tools and write algorithms calibrated to specific ecological questions. The landscape architect was reframed as “tool-maker” rather than tool-user.
This phase consolidated the earlier experimental work into transmissible methods, embedding computational approaches within curricula at three major institutions and establishing frameworks that could be adopted and extended by subsequent generations of practitioners and researchers.
UVA Geomorphology Lab and Adaptive Design Protocols (2017–present)
Project Type: Research Infrastructure / Pedagogical Platform
Location: University of Virginia School of Architecture
Collaborators: Brian Davis, Xun Liu
Medium: Responsive geomorphology model with multi-modal sensing, robotic prototyping, and machine learning integration
Description
Upon relocating to the University of Virginia, the geomorphology modeling research evolved from a focused research instrument into a multi-purpose platform supporting PhD student investigations, speculative advanced studios, and foundational instruction in landscape architecture. A team of academics, including Brian Davis and Xun Liu, collaborated on developing platforms and interfaces for adaptive design protocols that could engage territorial-scale landscape systems.
The UVA installation utilized an EmRiver geomorphology table (approximately 4m × 1.5m working area, 0.3m depth) augmented with expanded sensing and actuation capabilities. The methods developed experimentally at REAL were codified, transforming experimental practice into teachable methodology that could be engaged by students and research collaborators across multiple levels of expertise.
Physical Environment: The model is similar to, but larger than, the one used at REAL, creating fluvial landscapes through the interaction of water flow and synthetic sediment. Particulate matter moves under the force of water through processes that sort and organize particles based on velocity, weight, and size. As water moves across the surface, through channels, and as groundwater below, erosion and deposition organize the sediment to create landscapes of river beds, islands, deltas, beaches, dunes, ripples, and other landforms.
The synthetic sediment comprises particles of varying sizes and densities that sort according to flow velocity: larger, lighter particles concentrate where velocities are higher, while smaller, denser particles deposit where flow slows. This sorting creates visible stratigraphic patterns, with fine-grained material often marking “ancient” deposits overlain by subsequent coarse-grained layers. The self-organizing behavior produces recognizable geomorphological features without requiring precise scaling to any particular prototype landscape.
Control Parameters:
Bed slope (lateral and longitudinal) via adjustable supports
Water level through downstream weir height
Stream flow via variable-speed pump with programmable hydrographs
Groundwater flow via secondary pump system
Sediment input through calibrated media feeders
Flow visualization through dye injection system
The programmable hydrographs enable repeatable experimental conditions: the same flow regime can be run multiple times with measured variations in other parameters, supporting systematic investigation of cause-and-effect relationships while acknowledging the inherent variability of complex systems.
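The programmable hydrograph is, in effect, a time-series function driving the pump. The following is a minimal sketch of a repeatable rise-and-recession flow regime; the flow values, durations, and function shape are hypothetical illustrations, not the lab’s actual control software.

```python
def hydrograph(t, base=0.2, peak=1.0, rise=60, fall=180, period=600):
    """Return pump flow (L/min, hypothetical) at time t (seconds).

    Flow ramps linearly from base to peak over `rise` seconds,
    recedes linearly back to base over `fall` seconds, then holds
    at baseflow until the period repeats.
    """
    t = t % period  # the same regime repeats each period, enabling repeat runs
    if t < rise:                       # rising limb
        return base + (peak - base) * (t / rise)
    if t < rise + fall:                # recession limb
        return peak - (peak - base) * ((t - rise) / fall)
    return base                        # baseflow between events

# The identical forcing sequence can be replayed across experiments:
run_a = [hydrograph(t) for t in range(0, 600, 10)]
run_b = [hydrograph(t) for t in range(0, 600, 10)]
assert run_a == run_b  # repeatable input; variability comes from the sediment
```

The point of the sketch is the repeatability: because the forcing is a pure function of time, differences between experimental runs can be attributed to the model’s own dynamics rather than to the input.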
Sensing Systems: The multi-modal sensing infrastructure captures model behavior at multiple resolutions and temporalities:
Microsoft Kinect and Kinect 2 sensors mounted overhead capture continuous point clouds of surface topography, with a viewing angle of approximately 60 degrees and a point resolution of roughly 10mm at the surface
Ultrasonic banner sensors on stepper-motor-driven rails (controlled via Arduino) measure precise spot elevations and generate section cuts, offering higher accuracy than depth cameras and controllable through predefined gridded sampling patterns
Image analysis tracks sediment sorting, channel migration, and morphological change through systematic photographic documentation
Dye visualization reveals flow paths, velocity gradients, and turbulence patterns
The overlay of sensing modalities produces a continuously updated digital representation of the physical model—not a prediction of external landscapes but a record of the model’s own dynamic behavior available for analysis and feedback control.
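The predefined gridded sampling patterns driving the ultrasonic rail can be pictured as a serpentine sequence of stepper-motor targets. The sketch below is an illustrative assumption, not the lab’s firmware: the table dimensions, grid spacing, and steps-per-millimeter calibration are hypothetical values.

```python
def gridded_scan(width_mm=4000, depth_mm=1500, spacing_mm=100, steps_per_mm=5):
    """Yield (x_steps, y_steps) stepper targets for a serpentine grid scan.

    Rows alternate direction so the carriage never retraces a full row,
    which is the usual way to minimize travel time on a gantry rail.
    """
    xs = list(range(0, width_mm + 1, spacing_mm))
    ys = list(range(0, depth_mm + 1, spacing_mm))
    for row, y in enumerate(ys):
        ordered = xs if row % 2 == 0 else list(reversed(xs))  # boustrophedon
        for x in ordered:
            yield (x * steps_per_mm, y * steps_per_mm)

# A small grid for illustration; each target would be sent to the Arduino
# as a stepper move, after which the ultrasonic sensor fires and the spot
# elevation is logged against the (x, y) position.
points = list(gridded_scan(width_mm=400, depth_mm=200, spacing_mm=100))
```

Because the pattern is generated rather than hand-listed, the same script can produce dense grids for detailed surveys or sparse grids for rapid monitoring by changing `spacing_mm`.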
Responsive Prototypes: The lab serves as a testing ground for actuated design elements operating across a gradient of engagement modes. Direct interaction involves physical manipulation of the model environment—placing obstacles, adjusting slopes, introducing sediment. Responsivity involves sensing-driven devices that react to detected conditions. Autonomy involves systems that learn or evolve behaviors through feedback: devices whose response patterns develop through machine learning rather than pre-programmed rules.
Epistemological Contribution
The UVA phase consolidated experimental methods into pedagogy, enabling subsequent generations of students and researchers to engage the geomorphology table as a design tool. Beyond technical operation, this step required articulating theoretical frameworks and situating the work within broader disciplinary and scientific contexts.
Composite Modeling as Design Method: The UVA lab operates within an emerging practice of composite modeling that creates simultaneous interaction between numerical and physical models.
From Responsivity to Autonomy: The UVA phase advanced integration of machine learning into the responsive framework. A useful analogy distinguishes pre-programmed responsivity from learned adaptation: early thermostats operated through fixed thresholds (when temperature drops below X, activate heating), while contemporary systems such as the Nest develop patterns of automation through machine learning, optimizing across multiple variables (user schedule, energy pricing, thermal dynamics of the space). The device develops behaviors unique to the learning algorithm it uses and the context in which it functions.
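The thermostat analogy can be made concrete in a few lines. This toy sketch contrasts a fixed threshold with a threshold that adapts to observed occupant overrides; the class names and the learning rule are illustrative assumptions, not any vendor’s algorithm.

```python
class FixedThermostat:
    """Pre-programmed responsivity: one fixed rule that never changes."""
    def __init__(self, setpoint=20.0):
        self.setpoint = setpoint

    def act(self, temperature):
        return "heat" if temperature < self.setpoint else "off"


class LearningThermostat:
    """Learned adaptation (toy): nudges its setpoint toward observed
    occupant overrides, so its behavior diverges from any fixed program."""
    def __init__(self, setpoint=20.0, rate=0.5):
        self.setpoint, self.rate = setpoint, rate

    def observe_override(self, preferred):
        # Move the learned setpoint a fraction of the way toward the
        # occupant's chosen temperature.
        self.setpoint += self.rate * (preferred - self.setpoint)

    def act(self, temperature):
        return "heat" if temperature < self.setpoint else "off"


fixed = FixedThermostat()
learner = LearningThermostat()
for _ in range(5):
    learner.observe_override(18.0)   # occupant repeatedly prefers 18 °C

# Same input, different behavior: the learner has adapted its threshold.
print(fixed.act(19.0), learner.act(19.0))  # → heat off
```

The divergence is the point of the analogy: two devices given identical sensor readings act differently because one of them carries a history of interaction with its context.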
The geomorphology lab became a testing ground for this transition applied to landscape infrastructure. How might sediment gates develop response patterns through learned optimization rather than pre-programmed rules? How might infrastructure that “continually learns from itself by processing the results of its actions before choosing the next intervention” differ from infrastructure operating through fixed protocols? The gradient from interaction through responsivity to autonomy maps a trajectory of increasing machine agency in landscape systems—a trajectory the lab enables investigating at experimental scale before territorial deployment.
Notation for Indeterminacy: The iterative documentation produced through image analysis and sensing generates new forms of landscape notation. Traditional topographic plans represent terrain as it exists at the moment of survey, creating a snapshot extracted from continuous process. The research produces topographic representations that speak to movement and probabilities of change, expressing flux as condition rather than capturing stasis as fact. Areas of stability and areas of active change appear as gradients rather than boundaries; the notation system provides graphic language for landscapes understood as dynamic systems rather than fixed forms.
Collapsing Design Phases: The research collapses representation, analysis, documentation, and construction into continuous feedback. Sensing systems that document the model’s behavior simultaneously provide data driving responsive interventions. The distinction between observing landscape and acting upon it dissolves into unified cycles of perception and response. This collapse prefigures the “learn-and-adjust” approach that characterizes adaptive management—design as ongoing negotiation with dynamic systems rather than one-time specification of fixed outcomes.
The work advances a form of design research holding great promise for engaging large-scale environmental change. Engaging highly dynamic systems and utilizing their energy to compose new terrains provides mechanisms for massive change with minimal input. The feedback between sensing systems, machine intelligence, and landscape modification creates construction methodologies that are tunable, challenging current methods of complex simulation and static infrastructure. The goal is to move beyond known representations and design methods, developing tools necessary to address global conditions characterized by flux and indeterminacy.
Codify: Parametric and Computational Design in Landscape Architecture (2018)
Project Type: Publication
Editors: Bradley Cantrell, Adam Mekies
Publisher: Routledge
Description
Codify addressed the question of how landscape architecture might develop computational methods specific to its disciplinary concerns. Architecture’s parametric turn had produced a decade of formally complex buildings; our position was that landscape’s computational engagement would need to produce something different: tools for managing ecological complexity, indeterminacy, and temporal depth rather than generating novel geometries.
The book argued that ecology, technology, and urbanization were collapsing into a single sector where landscape architecture was uniquely positioned to operate. Code, understood broadly as algorithm, protocol, and executable instruction, was proposed as the essential language for managing this convergence. The publication explored multiple dimensions of codification:
Generative Design: Algorithms that could generate planting plans adapting to microclimates, grading designs optimizing water retention, or infrastructure layouts responding to ecological gradients. Rather than manually placing thousands of elements, designers could write rules: “Place a Populus deltoides wherever soil moisture exceeds 40% and slope is less than 5 degrees.”
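The quoted rule translates almost directly into code. A minimal sketch, using a hypothetical site grid (the cell data and threshold encoding are illustrative assumptions):

```python
# Hypothetical site grid: each cell carries soil moisture (0-1 fraction)
# and slope (degrees), as a site analysis might produce.
cells = [
    {"id": 0, "moisture": 0.55, "slope": 3.0},
    {"id": 1, "moisture": 0.35, "slope": 2.0},
    {"id": 2, "moisture": 0.62, "slope": 8.0},
    {"id": 3, "moisture": 0.41, "slope": 4.5},
]

def place_populus(cell):
    """The rule from the text: plant Populus deltoides wherever soil
    moisture exceeds 40% and slope is less than 5 degrees."""
    return cell["moisture"] > 0.40 and cell["slope"] < 5.0

plan = [cell["id"] for cell in cells if place_populus(cell)]
print(plan)  # → [0, 3]
```

Run against thousands of cells rather than four, the same rule produces a planting plan that tracks the site’s microclimatic gradients without any element being placed by hand, which is precisely the shift from drawing to rule-writing the book describes.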
Managing Indeterminacy: Unlike architectural parametricism, which often sought precise formal control, landscape parametricism engaged indeterminacy as a design condition. Code provided rules (a genotype) generating landscapes (phenotypes) that could adapt to changing conditions. The designer became curator of algorithms, establishing parameters within which systems could evolve.
Tool-Making: The book championed a shift from using software to making software. Landscape architects needed to become “tool-makers,” opening the black box of commercial applications to write algorithms calibrated to specific ecological questions. This required literacy in scripting languages (Python, JavaScript) and visual programming environments (Grasshopper) that enabled customization of computational workflows.
The edited volume brought together practitioners, researchers, and educators demonstrating computational approaches across contexts, from parametric planting design to algorithmic hydrology to machine learning applications. The diversity of contributions illustrated that computational landscape architecture was not a singular method but an expanding field of approaches.
Epistemological Contribution
Codify articulated a fundamental shift: from using computation for landscape architecture to thinking computationally about landscape architecture. This distinction separated the book from software tutorials or digital technique manuals: it proposed that computation was not merely a tool for efficiency but a mode of thought capable of engaging landscape’s specific complexities.
Richard Hindle, commenting on the book, stated: “Codify convincingly argues that Landscape Architecture is uniquely positioned to define this sector of technology, and in the process redefine itself.” This observation captured the book’s disciplinary ambition of not merely adopting computational methods from adjacent fields but developing approaches native to landscape’s concerns with ecology, territory, and temporal process.
The book also advanced the concept of the landscape architect as algorithm curator. If ecological systems are too complex to fully specify, and if environmental conditions are too variable to predict, then design must operate through establishing rules and parameters within which outcomes emerge. The designer’s role shifts from determining form to calibrating process, setting the conditions within which landscapes self-organize, adapt, and evolve.
The emphasis on tool-making had pedagogical implications developed through my concurrent work at Harvard GSD and UVA. Curricula evolved to include scripting, electronics prototyping (Arduino), and geospatial programming—skills enabling students to create rather than merely consume computational tools. This “white box” pedagogy, as opposed to “black box” software operation, empowered designers to customize their instruments to disciplinary needs.
Algorithmic Cultivation (2019)
Project Type: Installation / Manifest
Location: University of Virginia School of Architecture
Collaborators: Lucia Phinney (UVA), Robin Dripps (UVA), Emma Mendel (UVA / UPenn)
Medium: Custom acrylic planters, gantry robot with shearing mechanism, living plants, remote data feeds, sensing systems
Description
Algorithmic Cultivation was an installation exploring the interrelationship of human, machine, and plant through a robotic pruning system that shaped living vegetation according to data streams external to the growing environment. A gantry robot moved over plants housed in custom-designed acrylic planters, executing shearing operations derived from remote datasets—environmental data, species migration patterns, or other information streams having no direct relationship to the plants themselves. The project asked whether such mediated interactions might constitute a form of communication between radically different forms of life, and what such communication might reveal about human relationships to other species in an age of automation and ecological crisis.
The installation was conceived to operate over an extended duration, with the plants’ growth responses to pruning actions forming a feedback loop where data drives pruning, pruning shapes growth, growth is sensed and measured, and the cycle continues. The project was only partially realized; technical difficulties limited operation to a short period rather than the planned year-long duration. Yet even in its truncated form, the installation posed fundamental questions about agency, language, and the ethics of human intervention in nonhuman life.
Technical System: The apparatus consisted of three primary components: a growing environment (custom acrylic planters with controlled substrate and moisture), a robotic actuation system (gantry-mounted shearing mechanism), and a data-processing layer (software translating remote datasets into pruning instructions). The shearing operations followed hedge-pruning conventions: XY planar cuts across the top surface, XZ cuts along sides (angled to allow light penetration), and YZ cuts between individual plants creating visual apertures through the mass. Unlike conventional hedge maintenance occurring once or twice annually, the robotic system could execute frequent cuts in varying planes, producing growth patterns impossible through manual cultivation.
Data as Catalyst: The project deliberately employed data feeds disconnected from the immediate growing environment. Rather than optimizing plant health or maximizing yield, the data operated as “an unfiltered random catalyst for provoking the feedback loop of stimulus and response.” This estrangement was intentional: by severing the instrumental relationship between data and outcome, the project created space for emergent patterns and unexpected plant responses. The robot acted not as an optimizing agent but as an interlocutor, posing questions through pruning to which the plant responded through growth.
Contextual Framing: The project statement situated the work within three converging crises: pandemic emergence at urban-wildlife interfaces (the statement was written during the early COVID-19 period), species migration and extinction under climate change, and the transformation of agriculture through automation. Each crisis involves the fraught relationship between humans and other species, and each involves technological mediation that both enables and obscures that relationship. Industrial greenhouses represent one trajectory, a controlled environment where plants are optimized for yield through automated systems opaque to most users. Algorithmic Cultivation proposed an alternative in which automation is in service of communication rather than production, seeking empathy rather than instrumental relationships with plant life.
Artistic Precedents: The project drew explicitly on Jean Tinguely’s kinetic sculptures, machines whose “intentional oddities” produced surprising, sometimes disastrous, always unexpectedly beautiful actions, and on Natalie Jeremijenko’s urban infrastructures for nonhumans, which seek to help humans understand the sensory experience of other species. Tinguely’s machines questioned assumptions about mechanical predictability; Jeremijenko’s projects questioned assumptions about the boundary between human and nonhuman. Algorithmic Cultivation extended both trajectories, using robotic automation not for efficiency but for the production of “a nascent language” between species.
Epistemological Contribution
Despite its partial realization, Algorithmic Cultivation advanced several conceptual frameworks central to the research trajectory.
Data as Medium, Not Optimization: Precision agriculture employs data to maximize yields: sensing soil moisture to optimize irrigation, monitoring plant health to target fertilization. Algorithmic Cultivation proposed an alternative relationship between data and cultivation, where information flows shape plant growth without predetermined outcomes. The data becomes a medium for interaction rather than a tool for control, introducing contingency and emergence into the human-plant relationship. This reframing anticipates the “learn-and-adjust” paradigm developed in theoretical chapters, where design operates through ongoing negotiation rather than upfront specification.
This reframing is the empirical ground on which the cultivant concept became visible: the ongoing relationship between designed intention and biological agency in which maintenance is the primary design act. The concept is developed theoretically in Chapter 11.
Interspecies Communication: The project’s most speculative proposition concerned language formation. Drawing on theories that language emerges through “almost random sounds” and responsive feedback rather than instrumental need, the installation proposed that pruning and growth might constitute a proto-linguistic exchange. The robot’s cuts posed questions; the plant’s responses, measured through leaf size, branching patterns, and color changes, provided answers. This framework extended the cybernetic concepts from earlier work (sensing-processing-actuation loops) into explicitly communicative territory, suggesting that feedback systems might enable forms of understanding across radical difference.
The Productive Failure: The project’s technical difficulties and abbreviated operation constitute their own form of knowledge. The aspiration to maintain a year-long feedback loop between data, pruning, and growth proved beyond the available technical and institutional capacity. This failure illuminates the gap between speculative design and realized infrastructure, a gap that characterizes much of the “wildness creator” thinking from “Designing Autonomy.” The difficulties of maintaining robotic systems in humid growing environments, of keeping sensing equipment calibrated over extended periods, of sustaining institutional support for experimental installations, all reveal constraints that territorial-scale autonomous systems would face at far greater magnitude.
Third Intelligence Prototype: Algorithmic Cultivation can be read as a small-scale prototype for the “Third Intelligence” framework that would be articulated in subsequent theoretical work. The installation staged an encounter between human intention (setting initial parameters, selecting data feeds), machine cognition (translating data into movement, executing cuts), and biological agency (the plant’s growth responses, which follow their own logic regardless of human or machine intention). The plant is not a passive recipient of robotic intervention but an active participant whose responses cannot be fully predicted or controlled. The installation made visible the choreography of multiple intelligences that territorial-scale adaptive systems would require.
Agricultural Automation Critique: By exhibiting the apparatus within an architecture school rather than deploying it as agricultural infrastructure, the project invited critical reflection on automation’s transformation of cultivation. Algorithmic Cultivation proposed that automation might enable new forms of attentiveness, new modes of response, and new possibilities for reciprocal relationship.
Phase IV: Cultivated Wildness (2017–2025)
The emergence of machine learning and artificial intelligence as practical design tools prompted a fundamental reconsideration of agency in designed landscapes. Previous phases had developed responsive systems: environments that reacted to conditions according to pre-specified rules. But AI systems learn, adapt, and generate behaviors not explicitly programmed. They introduce a mode of autonomy that challenges traditional assumptions about the designer’s role.
Some of the projects of this phase engage this challenge directly, asking how landscape architecture might collaborate with, rather than deploy, artificial intelligence. The theoretical framework of “Cultivated Wildness” proposes that in the Anthropocene, preservation through non-intervention is no longer viable and maintaining ecological function requires active management, potentially including autonomous technological agents. The concept of “technodiversity” extends biodiversity’s logic to the technological sphere, arguing that resilient landscape systems require heterogeneous assemblages of devices for sensing, processing, and actuating rather than monocultural technological solutions.
The “Third Intelligence” framework positions the contemporary landscape architect as choreographer of interactions between human intention, biological agency, and machine cognition, none fully subordinate to the others, each contributing capacities the others lack. This phase represents a reconfiguration of design agency within a more distributed field of actors.
Designing Autonomy: Opportunities for New Wildness in the Anthropocene (2017)
Project Type: Publication / Theoretical Framework
Authors: Bradley Cantrell, Laura J. Martin, Erle C. Ellis
Publisher: Trends in Ecology & Evolution 32 (3): 156–166
DOI: 10.1016/j.tree.2016.12.004
Description
“Designing Autonomy” articulated a provocative proposition: that maintaining wild places in the Anthropocene increasingly requires intensive human intervention, and that this paradox might be resolved through fully automated systems capable of creating and sustaining wildness without ongoing direct human involvement. Published in Trends in Ecology & Evolution, a leading journal in ecological science, the paper brought landscape architecture’s engagement with responsive technologies into direct dialogue with conservation biology, rewilding discourse, and artificial intelligence research.
The paper emerged from an interdisciplinary collaboration. Laura J. Martin, an environmental historian at Harvard’s Center for the Environment, brought expertise in the social and cultural dimensions of conservation. Erle C. Ellis, a geographer at the University of Maryland, contributed his foundational work on anthropogenic biomes and the ecology of human-transformed landscapes. The collaboration positioned the research within broader scientific and humanistic debates about nature, technology, and the Anthropocene.
The Central Paradox: The paper purposefully opened with a contradiction: reducing human influences on species and ecosystems generally requires increasing levels of human management. Protected wilderness areas demand continuous intervention (control of invasive species, management of endangered populations, pollution remediation), and these interventions themselves further alter ecological patterns and processes. Traditional conservation, premised on non-interference, has become practically impossible; even “wilderness” is produced and maintained through institutional effort.
Conceptual Framework: The paper introduced a two-axis diagram mapping ecosystems by the relative degree to which their compositions and functions are shaped by human and nonhuman actors. The vertical axis depicted increasing “wildness” (defined as nonhuman biological influence) from sterile environments to late successional wilderness. The horizontal axis depicted increasing human influence from controlled burning to dense urbanization. This framework revealed that human and nonhuman influences need not be inversely related; both can increase simultaneously through processes the authors termed “intensive rewilding.”
Levels of Automation: Drawing on Parasuraman et al.’s taxonomy of automation in human-machine systems, the paper distinguished four levels of automated environmental management:
Information acquisition (sensing, monitoring)
Information analysis (pattern recognition, modeling)
Decision and action selection (determining interventions)
Action implementation (executing interventions)
Current conservation technologies automate the first two functions; the paper explored what might happen if all four were automated through deep reinforcement learning (DRL) systems capable of learning their own strategies through environmental interaction rather than following pre-programmed rules.
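One way to see how the four levels compose is a toy management loop in which the decision stage can be either a fixed rule or a crude learned policy. Everything below is an illustrative assumption: the sediment-gate scenario, the state names, and the simple value-table update are a sketch of the idea, not a DRL implementation of the paper’s wildness creator.

```python
import random

def sense(env):
    """1. Information acquisition: read the (simulated) sensor."""
    return env["sediment_load"]

def analyze(load):
    """2. Information analysis: classify the reading."""
    return "high" if load > 0.5 else "low"

def decide_fixed(state):
    """3a. Decision via a pre-programmed rule."""
    return "open_gate" if state == "high" else "hold"

ACTIONS = ["open_gate", "hold"]
q = {(s, a): 0.0 for s in ("high", "low") for a in ACTIONS}

def decide_learned(state, eps=0.1):
    """3b. Decision via a learned policy (toy epsilon-greedy value table)."""
    if random.random() < eps:
        return random.choice(ACTIONS)          # occasional exploration
    return max(ACTIONS, key=lambda a: q[(state, a)])

def actuate(action, env):
    """4. Action implementation: opening the gate flushes sediment."""
    if action == "open_gate":
        env["sediment_load"] *= 0.5
    return env

random.seed(0)  # deterministic run for the sketch
env = {"sediment_load": 0.8}
for step in range(50):
    state = analyze(sense(env))
    action = decide_learned(state)
    env = actuate(action, env)
    reward = 1.0 if env["sediment_load"] <= 0.5 else -1.0
    q[(state, action)] += 0.1 * (reward - q[(state, action)])  # learn from outcome
    env["sediment_load"] = min(1.0, env["sediment_load"] + 0.1)  # new sediment arrives
```

Swapping `decide_fixed` for `decide_learned` in the loop marks exactly the threshold the paper probes: the first three levels can run either way, but only the learned variant develops a strategy from environmental interaction rather than executing one specified in advance.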
The Wildness Creator: The paper’s most speculative element was the conceptual design for a “wildness creator,” an autonomous landscape infrastructure system that creates and sustains wildness by enhancing nonhuman influences while countering human influences. Key operating principles included:
Operations invisible and inscrutable to human observers
Algorithms learned from context rather than programmed by humans
Continuous monitoring and removal of anthropogenic influences
Promotion of nonhuman species autonomy without direct human intervention
The wildness creator represented a thought experiment pushing the logic of automated conservation to its limit, an infrastructure whose operations would eventually become “unrecognizable and incomprehensible to human beings,” producing ecological patterns divergent from anything previously created by humans.
Case Studies: The paper surveyed eight existing projects employing semi-autonomous strategies for ecosystem management, positioning the wildness creator as an extrapolation of current trajectories.
Oostvaardersplassen (Netherlands): rewilding through surrogate megafauna
COTSbot: autonomous underwater vehicle eliminating invasive starfish
Responsive landform process (REAL): prototype sediment management system
Drone re-seeders: aerial delivery of germinated seeds
Virtual fences: non-physical barriers guiding animal movement
Autonomous agricultural robots
Toxic cleanup swarm robots
Climate engineering
Epistemological Contribution
“Designing Autonomy” made several significant contributions to thinking about landscape, technology, and conservation.
Redefining Wildness: The paper challenged traditional definitions of wilderness premised on the absence of human interference. In the Anthropocene, such pristine spaces no longer exist; all landscapes are affected by anthropogenic climate change, pollution, and species redistribution. The paper proposed that “wild” might be reconceived as a state of relative freedom from human interventions, a condition that could, paradoxically, be produced and maintained through technological mediation. Wildness becomes something designed rather than preserved.
The Machine as Gardener: Inverting Leo Marx’s famous formulation of “the machine in the garden,” the paper proposed “the machine as gardener”: technological systems that tend wild places in lieu of human stewards. This reframing positioned autonomous systems not as intrusions into nature but as potential allies of nonhuman species, extending the logic of ecological restoration (which already involves intensive human intervention) toward its automated conclusion.
Distanced Authorship: The paper articulated the concept of “distanced authorship”: design that favors process, curation, and choreography over direct formal specification. To design spaces free from human influence requires stepping back from conventional design control, establishing systems that operate beyond human direction. This concept connected to broader discourses in landscape architecture around indeterminacy, emergence, and the limits of design intention.
Deep Reinforcement Learning as Conservation Method: By proposing DRL as a framework for conservation decision-making, the paper introduced machine learning to ecological management discourse. Unlike rule-based automation, DRL systems learn strategies through trial and error, potentially discovering approaches that human conservationists would never conceive. The AlphaGo example, in which a deep learning system developed novel strategies to defeat world-champion Go players, suggested that artificial intelligence might similarly develop novel conservation strategies.
Ethical Complexity: The paper did not shy from the ethical difficulties of automated environmental management. It acknowledged risks of reduced situational awareness and complacency when humans cede decision-making control. It noted that fully automated wildness creation might prove “technologically, financially, or politically” impossible. And it raised the unsettling question of what happens to human-nature relationships when ecological curation occurs beyond human control or understanding.
The paper concluded with a challenge: “Wildness creation is the ultimate design challenge of the Anthropocene. Can we ‘paint ourselves out of the picture’ and devote our creativity and resources toward the interests and futures of species other than our own?”
Failure (2019–2020)
Project Type: Drawing / Exhibition / Film
Exhibitions: Drawing Codes, Pratt Institute (2019); Storm Signals, Chicago Architecture Biennial (2020)
Collaborators: Emma Mendel
Medium: Mixed-media drawing (code-generated base, ink, chemical transfers, erasure); video essay
Publication: Drawing Codes: Experimental Protocols of Architectural Representation (Applied Research + Design, 2021)
Description
Failure was a drawing and companion film that explored how landscape representation might engage indeterminacy, error, and the limits of prediction. Invited for the Drawing Codes exhibition at Pratt Institute, the project began with a provocation: if landscape is produced by multiple agents in constant flux, with indeterminate beginnings and endings, what does a drawing of such conditions look like? The challenge was not merely representational but epistemological: how to depict systems that never repeat themselves, processes whose outcomes cannot be predicted even when their dynamics are understood.
The Drawing: The drawing accumulated over months through an iterative process that deliberately embraced failure as method. A code-generated urban form provided the base layer, an algorithmic structure establishing initial conditions. Subsequent layers were added through irregular schedules, varied media, and collaborative exchange: linework, ink spills, chemical transfers, traced edges, and erasures. Each layer responded to traces of previous layers while establishing conditions for subsequent interventions. No layer was treated as sacred; each collaborator built upon, erased, or contorted the other’s work, creating tensions that unpacked and reversed the drawing’s history.
The process resisted conventional design workflows that move from concept through development to resolution. Instead, the drawing operated as palimpsest, each mark carrying traces of what came before while remaining open to subsequent transformation. Errors inherent in being human were not eliminated but incorporated: intentions executed uniquely each time, consistency aspired to but never achieved, and questions extracted from the canvas without answers provided.
The exhibited work presented select layers from this accumulation, chosen to express the underlying algorithmic structure while revealing the history of intervention and transformation. The layers could be stacked in any order, misaligned, contorted, inverted, or erased to create an anti-drawing whose meaning emerged through process rather than residing in any definitive state.
Storm Signals Film: For the Storm Signals exhibition at the 2020 Chicago Architecture Biennial, the conceptual framework extended into video format. The film catalogued environmental failures across two centuries, including infrastructural collapses, ecological disasters, and predictive errors, presenting failure not as aberration but as a constituent element of environmental history. The chronological survey demonstrated that failure accompanies all human engagement with dynamic systems; the question is not whether failure will occur but how systems respond to and recover from it.
Epistemological Contribution
Failure articulated a theoretical position on prediction, complexity, and design methodology that would inform subsequent research.
Failure as Landscape Condition: The project statement proposed that “failure is, at the heart of landscape, environments inherently complex, multi-valent relationships in constant flux.” This framing repositioned failure from design deficiency to system characteristic. Landscapes fail not because designers err but because complex systems exceed predictive capacity. “Complexity beguiles prediction, making it impossible to predict whether or not one unprecedented event is more likely to occur over another.” Accepting this condition requires abandoning the pursuit of comprehensive prediction in favor of approaches that enhance recovery and adaptation.
Anti-Fragility: Drawing on Nassim Taleb’s concept of anti-fragility, the project asked how designers might develop “systems that more readily recover from disturbance, systems that lack fragility and have the ability to respond to unknown outcomes.” This orientation shifts design focus from preventing failure (impossible in complex systems) to enabling recovery (achievable through redundancy, flexibility, and adaptive capacity). The paradigm shift moves from predict-and-control to learn-and-adjust—a framework that would become central to subsequent theoretical chapters.
Code as Script: The Drawing Codes exhibition theme invited reflection on the relationship between coded instruction and material execution. The project used “Code as Script” to explore “the unexpected between the intent of the code and how it’s executed.” The code-generated urban base established initial conditions but did not determine outcomes; subsequent interventions followed intuited rules rather than explicit protocols. The drawing demonstrated that algorithmic processes, even when precisely specified, produce variation through material engagement, environmental contingency, and human interpretation.
Process over Product: The drawing resisted the convention of the definitive architectural image. “No layer is a drawing, each cannot exist as an entity or contain meaning.” Meaning emerged through accumulation and relation rather than residing in any single state. This processual orientation aligned with landscape’s temporal nature in that sites are never finished, never static, and always in process of becoming. The drawing methodology enacted this condition representationally, producing an artifact that documented transformation rather than depicting fixed form.
Collaborative Tension: The two-person authorship introduced productive conflict into the drawing process. This dynamic modeled multi-agent landscape production at a small scale: different actors pursue different intentions, and their interactions produce outcomes none fully controls. The drawing became a site of negotiation between collaborators, its history recording agreements and conflicts, accumulations and erasures.
Cyborg Ecology: Choreographing Landscape Resistance (2019/2020)
Project Type: Manifest / Peer-Reviewed Essay
Location: Published in Ambiguous Territory: Architecture, Landscape, and the Postnatural (ed. Bridges, Wrinch, and Clark), Actar Publishers
Collaborators: Solo-authored essay
Medium: Theoretical essay, conceptual framework, diagrammatic analysis
Description
“Cyborg Ecology: Choreographing Landscape Resistance” proposed that the boundary between designed landscape and natural ecology had become theoretically and materially untenable, that contemporary territorial conditions are better understood as cyborg ecologies in which biological processes, computational sensing, and infrastructural actuation operate as a single system rather than as separable domains. The essay developed the concept of “choreographing resistance,” designing not the form of a landscape but the conditions under which biological and computational agents interact, resist, and co-produce spatial outcomes. Drawing on Haraway’s cyborg theory, the essay extended the cyborg concept from individual bodies to territorial systems, arguing that the delta, the constructed wetland, and the sensored watershed are all cyborg ecologies, assemblages in which the boundary between organism, machine, and infrastructure is not blurred but constitutively absent.
Cross-Chapter Resonances: Ch 09 (coupled ecologies as territorial condition), Ch 10 (choreographing resistance as generational design), Ch 11 (multi-species co-creation within coupled systems), Ch 07 (technogeographies as the sensing layer of coupled ecologies)
Indeterminate Futures: Venice Architecture Biennale (2021)
Project Type: Digital Exhibition / NFT Archive
Location: 17th International Architecture Exhibition, Venice Architecture Biennale (digital component)
Collaborators: Xun Liu
Medium: Video and image documentation, notational overlays, NFT minting via Tezos blockchain, hic et nunc gallery platform
Description
Indeterminate Futures translated years of geomorphology table documentation into a distributed digital archive, minting incremental snapshots of sediment flow experiments as non-fungible tokens (NFTs) for the duration of the 2021 Venice Architecture Biennale. The project addressed the challenge of representing landscape systems characterized by continuous transformation. How does one document processes that never repeat, archive conditions that exist only momentarily, or exhibit research whose subject is change itself?
The vast database of imagery accumulated through geomorphology table experiments, the videos and photographs capturing sediment sorting, channel migration, delta formation, and responsive prototype testing were the raw material. Each 15–30 second increment was extracted, overlaid with notational systems documenting experimental parameters, and minted as a unique digital object. The growing catalog offered Biennale participants an expanding archive of morphological states, each distinct in its manifestation yet connected through the continuous processes generating them.
NFT as Documentation Medium: The choice of NFT format was both practical and conceptual. Practically, blockchain registration provided verifiable uniqueness for each snapshot, a technical solution to the problem of documenting singular moments within continuous processes. Conceptually, the format resonated with the research’s emphasis on indeterminacy. Each minted token represented a unique configuration that would never reoccur, even if the same experimental parameters were repeated. The archive grew throughout the Biennale as new experiments were conducted, new snapshots extracted, and new tokens minted; the catalog itself exhibited the temporal dynamics the research investigated.
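The minting pipeline itself is not reproduced here. As a conceptual sketch of the uniqueness property that blockchain registration provides, a plain content hash shows how a snapshot frame plus its canonically serialized experimental metadata yields a single verifiable fingerprint; all parameter names and values below are hypothetical, not the project's actual metadata schema.

```python
import hashlib
import json

def snapshot_fingerprint(frame_bytes, metadata):
    """Derive a stable, verifiable identifier for one experiment snapshot.

    Hashing the frame together with canonically serialized metadata means any
    change to either produces a different digest, giving each snapshot a
    unique, checkable identity (here shown with a plain SHA-256 content hash
    as an analogy for what on-chain registration certifies)."""
    canonical_meta = json.dumps(metadata, sort_keys=True).encode("utf-8")
    return hashlib.sha256(frame_bytes + canonical_meta).hexdigest()

# Hypothetical snapshot: raw frame bytes plus experimental parameters.
frame = b"\x00\x01\x02\x03"  # placeholder for video-frame data
meta = {"flow_rate_lpm": 2.5, "sediment_g": 120, "slope_pct": 1.5, "t_start_s": 840}

digest = snapshot_fingerprint(frame, meta)
print(digest)

# Changing a single parameter yields an entirely different fingerprint.
meta2 = dict(meta, flow_rate_lpm=2.6)
assert snapshot_fingerprint(frame, meta2) != digest
```

The design point is determinism: two researchers holding the same frame and metadata can independently recompute and confirm the identifier, which is what makes "a unique configuration that will never reoccur" an auditable claim rather than an assertion.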
Energy and Ethics: The project employed Tezos, a proof-of-stake blockchain consuming approximately 200 million times less energy than proof-of-work systems like Ethereum or Bitcoin. This choice reflected awareness of the environmental critique directed at NFT culture during this period and the irony of using energy-intensive blockchain systems to document ecological research; the irony persisted, but with far less harm. The Tezos selection demonstrated that distributed ledger technologies could serve research purposes without catastrophic carbon footprints.
Participatory Archive: Visitors could view the growing NFT catalog and collect versions for themselves. This participatory dimension transformed documentation from passive record to active engagement: collectors became stakeholders in the archive, their selections reflecting individual responses to morphological configurations. The distributed ownership model also ensured the archive’s persistence beyond any single institutional repository; collected tokens existed across the blockchain, resilient to the closure of any particular gallery or platform. (Indeed, hic et nunc would eventually cease operations, though tokens minted there remain accessible through other Tezos interfaces.)
Epistemological Contribution
Indeterminate Futures addressed fundamental questions about documenting and disseminating research about dynamic systems.
Representing the Unrepresentable: Traditional landscape documentation privileges stable conditions, yet the geomorphology table research investigated systems defined by flux. How does one represent processes whose essence is transformation? The NFT format offered one answer: instead of seeking a definitive image, the project produced an expanding archive of singular moments, each unique yet connected through underlying process. Representation became accumulative rather than definitive, serial rather than singular.
Notation as Interpretation: The overlay of notational systems onto video and image documentation made visible the interpretive framework through which researchers understood morphological change. Notation identified experimental parameters (flow rates, sediment inputs, slope conditions), marked features of interest (channel positions, depositional fronts, sorting boundaries), and tracked change across time. These graphic layers revealed that documentation is never a neutral recording but always an interpretive construction: notation shapes what viewers see and how they understand it.
Distributed Archive: The blockchain registration created a distributed archive resistant to centralized control or institutional failure. Unlike documentation held in a single repository, vulnerable to neglect, loss, or restricted access, NFT-based archives exist across distributed networks. This persistence matters for long-duration research: experiments conducted today may prove relevant to questions not yet formulated. The documentation must survive the institutional and technological changes that will occur over decades-long timescales.
Value and Attention: The collectibility of NFTs introduced questions of value into the practice of documentation. Which moments attract attention? Which configurations do collectors find compelling? The participatory dimension created a feedback loop between research production and public response, revealing patterns of interest that might inform future experimental directions. This dynamic differs fundamentally from traditional publication, where documentation flows one-directionally from researcher to audience.
Iteration and Uniqueness: The project articulated a productive tension between iteration and uniqueness. Each experiment followed protocols, each employed similar parameters, and each investigated recognizable phenomena. Yet each produced unique configurations: no two snapshots were identical, and no morphological state recurred. This tension characterizes landscape systems more broadly: ecological processes follow recognizable patterns yet produce singular conditions. The NFT format captured this duality, registering both the iterative nature of experimental method and the uniqueness of each outcome.
Prototyping the Bay: Landscape as Medium (2025)
Project Type: Pedagogical Research / Design Studio
Location: University of Virginia, Department of Landscape Architecture
Course: LAR 7020 Foundation Studio IV
Co-Instructor: Leena Cho (Associate Professor)
Recognition: Tulane Climate Curriculum Prize, 2025
Description
Prototyping the Bay is a fourth-semester foundation studio that positions the Chesapeake Bay as both subject and instrument, a territorial landscape engaged as adaptive infrastructure responding to environmental, programmatic, and sociocultural flux. The studio synthesizes the theoretical frameworks developed through two decades of research, translating concepts of responsive technologies, adaptive management, and cultivated wildness into pedagogical method. Students confront the challenge of designing for landscapes that never look the same, developing propositions for futures that are, at best, unpredictable.
The studio’s central provocation distinguishes between territory and landscape. Territory is “an abstracted space composed of objects and processes for the purposes of state administration—state borders, census tracts, tax parcels.” Landscape, by contrast, is “the living medium, and always evolving motley array of materials and spaces and relationships and histories that are composed and choreographed to produce life catalyzing forms.” This distinction frames landscape architecture’s unique capacity to engage territorial systems through living processes rather than administrative abstraction.
Site Context: The Chesapeake Bay presents a landscape of contradictions: historically important, ecologically productive, socially diverse, industrially prosperous. Yet simultaneously contaminated, impoverished, slowly submerging, sinking, and eroding due to climate change, exploitation, and single-purpose civil engineering. The studio focuses on the coastal landscapes and islands of Pocomoke Sound, where sea-level rise, land subsidence, and shifting ecological communities create conditions of radical uncertainty. Within this marginalized environment, the studio identifies “a latent wildness: remote areas that are difficult to access, novel ecological systems, hyper productive logistics, sites of extraction, cultural enclaves, and powerful infrastructures.”
Conceptual Framework: The studio draws explicitly on the “Wild Disequilibria” framework developed through prior research: “Earth’s new wilds will look very different from the wilderness of the past. Classical wilderness is characterized by purity: it is unsettled, uncultivated, and untouched. But given the massive reshaping of ecological patterns and processes across the Earth, wilderness has become less useful, conceptually. Wildness, on the other hand, is undomesticated rather than untouched.” This reframing shifts design priorities “from maintaining a precious and pure environment to creating plural conditions of autonomy and distributed control that promote both human and non-human form.”
Pedagogical Structure: The studio operates through three modules:
Module I: Unpacking the Bay combines research, mapping, modeling, and manifesto creation. Students develop familiarity with the Chesapeake through multiple representations, lenses, and scales while establishing core design values that guide subsequent work. The module produces both analytical understanding and ethical positioning.
Module II: Systems, Models, Surrogates tests research and stated values through systems diagrams and site prototypes. Students speculate on emergent effects of interventions on site dynamics and develop strategies for implementation. Workshops convey methods including geomorphology table techniques, generative AI tools, and machine learning applications. The module emphasizes “quick iterative testing” and feedback-driven development.
Module III: Landscape as Experiment develops comprehensive design strategies treating the Chesapeake as dynamic landscape shaped by multiple agents—“human/non-human, geologic, historic, biologic, political, hydrological.” Teams evolve methods from Module II to generate strategic approaches and future visions for experimental coastal landscapes. The module confronts the tension between experimentation and conservation, requiring active decisions about what must be acknowledged and conserved versus what can be purposefully reformed as research grounds.
Key Concepts: The studio introduces several theoretical frameworks.
Landscape as Experiment: Sites function simultaneously as models and cultural artifacts. Design proposals ask questions and provide answers; landscapes produce knowledge through their own transformation. This framing extends the geomorphology table methodology to the territorial scale: the bay itself becomes a prototype.
Time as Agency: Time operates not merely as linear process but as “an agency that affords the possibility of redemption and reconciliation because it offers the possibility of experience.” Students engage landscape’s endemic processes “that occur without our intervention and that can be choreographed, catalyzed, or ossified through our actions.”
Mediated Experience: The site is experienced not only through human senses but “alongside an array of other modes of mediated experience, representation, and monitoring.” This approach extends the responsive technologies research into studio pedagogy, positioning sensing and data as design materials.
Choreographing Matter: Following John May’s concept of the “managerial surface,” the studio asks students to work “beyond the natural and artificial, to an ontological plane where such distinctions no longer make sense and can no longer interfere with the choreographing of matter.”
Epistemological Contribution
Prototyping the Bay translates research insights into transmissible pedagogy, demonstrating how the theoretical frameworks developed through two decades of practice can inform design education:
From Prediction to Adaptation: The studio explicitly rejects the goal of “restoring” or “saving” the Chesapeake Bay. Instead, it asks how designers might engage rapidly changing environments without assuming predictive capacity. The emphasis on prototyping (testing, learning, adjusting) models the adaptive epistemology framework at the core of the dissertation’s theoretical contribution.
Landscape as Knowledge Production: By framing landscapes as experiments, the studio positions design as a mode of inquiry rather than merely a mode of implementation. Sites become “testing grounds” that “produce experiments that begin to identify physiological, relational, and biochemical thresholds that drive ecological responses to climatic shifts.” This framework extends practice-based research methodology from individual projects to territorial-scale propositions.
Multi-Species Design: The James Bridle epigraph (“Where we start to move forward is when we learn to ask questions which are less concerned with ‘Are you like us?’, and more interested in ‘What is it like to be you?’”) frames the studio’s commitment to designing for human and nonhuman constituents. Students must consider how “a broader range of species, phenomena, and cultures” might “be catalyzed within the place.”
Technology as Design Material: The studio engages “the technologies that have created the bay in its current form, the hydrological infrastructure, the monitoring systems, the agricultural machinery, and the logistics networks” while imagining how “contemporary technologies (machine learning, artificial intelligence, simulation, and robotics, but also natural infrastructure, nature-based features, and other technocratic approaches to landscape-making) can play a future role.” This dual attention to technological history and technological futures positions students to design with and through responsive systems.
Values-Based Design: The first course objective asks students to “develop coherent design values that speak to your convictions regarding the cultural, technical, and philosophical basis of your design research.” This emphasis on ethical positioning distinguishes the studio from purely technical approaches, insisting that engagement with dynamic systems requires explicit articulation of commitments and responsibilities.
NEOM: Landscapes of The Line (2022–2025)
Project Type: Territorial Design Consultation
Location: NEOM, Kingdom of Saudi Arabia
Client: NEOM / Sherwood Design Engineers
Collaborators: Adam Mekies (Sherwood Design Engineers)
Role: Design proposals, narrative development, technical consultation, visual consultation
Description
Landscapes of The Line was a multi-year consultation developing ecological frameworks for the territorial landscapes surrounding NEOM’s mega-city project, The Line, a 170-kilometer linear urban development traversing desert terrain from mountains to coast. The work applied theoretical frameworks developed through two decades of research to real-world territorial propositions, testing concepts of adaptive infrastructure, managed hydrology, and cultivated wildness at unprecedented scale.
The project addressed a fundamental challenge: how to transform arid desert territory into functioning ecological infrastructure while serving the urban systems of a city unlike any previously built. The Line’s unique geometry, a single narrow band cutting north to south, created both constraints and opportunities for landscape design. Unlike conventional urban development that sprawls across ecological boundaries, The Line presents discrete moments where hydrological corridors, species migration routes, and environmental phenomena can be designed rather than merely accommodated.
The Manifold Concept: The central design proposition positioned The Line as a landscape manifold, an infrastructural form that concentrates and redistributes water resources moving from north to south. As water moves through The Line, it is channeled to catalyze new habitats, vegetated regions, and carve geologies. The manifold concept transforms The Line from obstacle to instrument, using the urban structure itself to reshape territorial hydrology.
The project developed distinct landscape strategies for areas north and south of The Line:
North of The Line: Within the edge of proposed carbon-sequestration forests, water is routed intentionally to erode and carve the ground, creating ravines that shelter life from harsh climate. Areas are selectively ossified (hardened) while others erode more easily, producing differentiated terrain through controlled hydrological action. The protected spaces become “not only the beginning seeds of life but also an experimental laboratory for horticulture, silviculture, and long-term ecological research.”
The Wadis: The traditional wadi systems (ephemeral desert watercourses) surrounding The Line are reconceived as holding areas, with water being pushed back into them, irrigating the land and recharging the aquifer. The wadis transform from flash flood channels to containment and conveyance systems that slow down water to sustain life and safely usher it into the cistern and caverns at the base of The Line.
South of The Line: A highly managed water system forms the output of the manifold, directing water to holding areas and creating landscapes that are nourishing and cleansing themselves. Detention areas of salt brine create eccentric, colored mirrors, reflecting light and highlighted as designed and geometric. The coastal interface celebrates the brackish zone—a fluctuating isohaline regulated through water inputs from The Line, with thriving mangroves stitching together the coastline.
Adaptive Canopy Systems: In areas directly surrounding the marina, shade structures form through an expanding web of overhead structures, printed through an inventive process that moves along self-made armatures. The project proposed canopy fabrication from mineralized sequestered carbon captured from industrial waste. The structures adapt slowly, building themselves and responding to the life that forms below, creating openings that allow light to filter through.
Hydrological Modeling: The technical work included extensive hydrological modeling using GeoHECRAS software to determine flooding impacts across multiple storm frequencies and durations. Watershed modeling generated inflow hydrographs using SCS methodologies, establishing baseline conditions for design interventions. The modeling revealed that conventional approaches, channelizing flood waters through infrastructure, would require widths exceeding 200 meters and velocities requiring extensive hardened concrete infrastructure. This finding drove the development of more nuanced approaches to sustainable management of wild waters.
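The GeoHECRAS models themselves are not reproduced in this appendix, but the SCS curve-number runoff relation that underlies such inflow-hydrograph generation can be sketched directly. The curve numbers and storm depth below are illustrative values chosen for the example, not NEOM parameters.

```python
def scs_runoff_depth(p_in, cn):
    """SCS curve-number method: direct runoff depth (inches) produced by a
    rainfall depth p_in (inches) on a catchment with curve number cn
    (higher cn = less pervious surface, more runoff)."""
    s = 1000.0 / cn - 10.0  # potential maximum retention after runoff begins
    ia = 0.2 * s            # initial abstraction (standard SCS assumption)
    if p_in <= ia:
        return 0.0          # all rainfall absorbed before runoff starts
    return (p_in - ia) ** 2 / (p_in - ia + s)

# Illustrative comparison: sparse desert pavement (high CN, ~90) versus a
# vegetated, infiltrating treatment area (lower CN, ~70), both hypothetical.
for cn in (90, 70):
    q = scs_runoff_depth(3.0, cn)  # 3-inch design storm (hypothetical)
    print(f"CN {cn}: runoff depth {q:.2f} in")
```

Run over a catchment's area and distributed in time with a unit hydrograph, depths like these become the inflow hydrographs the modeling describes; the sketch shows why hardened, high-CN surfaces drive the large flood volumes that made conventional channelization infeasible.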
Ecological Corridors: The project addressed species movement across The Line’s barrier. The Line’s singular geometry creates opportunity to design specific crossing moments by providing terrestrial movement under (light on the land, tunneling, or lifted) or flight through or above (apertures, guides, or nets). The landscape strategy includes expanding gradients of verdant conditions upstream and downstream that make ecological corridors more resilient and robust.
Monitoring Infrastructure: The Line creates a threshold moment, a space that can be monitored to understand the types and quantities of species in contact with the development. The project envisioned an internet of ecologies, an interconnected network of biodiversity hotspots, ecological corridors, and migration routes relaying real-time data to one another through an array of local sensors and remote monitoring systems. Cities become sensory instruments, constantly sensing, monitoring, and evolving while understanding their role in larger networks of migration, effects of climate, and expressions of culture.
Epistemological Contribution
The NEOM consultation tested theoretical frameworks at territorial scale, revealing both possibilities and constraints of adaptive landscape infrastructure:
From Form to Process: The project statement explicitly articulated the conceptual shift: The scale and ambition of The Line requires a rethinking of how humans construct and maintain the landscapes that comprise its context, a shift from form to process. Rather than designing fixed landscape forms, the work developed strategies for initiating and guiding landscape processes—erosion, deposition, vegetation establishment, ecological succession—that would unfold over generations.
Managed Complexity: The project challenged conventional hydrological management that minimizes indeterminacy through simplified infrastructure that accounts for a minimal series of variables, to keep water moving at constant velocities in engineered channels. The alternative proposed a hybrid hydrological management and infrastructural approach that engages the complex interactions that support ecosystem health. This framing extended the geomorphology table research to territorial application—using controlled interventions to choreograph rather than eliminate hydrological complexity.
Picturesque and Performative: The project articulated a synthesis between aesthetic and infrastructural landscape traditions: Humans admire our manipulations of the land, agricultural fields, conservation areas with scenic drives, and the marks of infrastructure across the landscape, each producing a moment of the sublime. The disconnect between each is the lack of intention between the designed/engineered landscape and the aesthetic outcome. The Landscapes of The Line propose a synthetic merger between the Picturesque and the Performative that is actively and intentionally evolved over time.
Generational Timeframes: The project operated within explicitly generational timeframes: ecologically, the landscape will develop over generations, and the infrastructure and services that will create this framework are embedded within that vision. This temporal horizon exceeds conventional design practice, requiring frameworks that establish initial conditions while remaining adaptive to emergent developments.
The Line as Experiment: Echoing the landscape as experiment framework from pedagogical work, the project positioned the northern landscapes as an experimental laboratory for horticulture, silviculture, and long-term ecological research, a testing ground for environmental and ecological technologies.
Computational Practice Archive: Codebase (2009–2017)
The projects cataloged above were realized through a body of custom computational work — firmware, software, visual dataflow definitions, and compiled applications — that constitutes primary evidence of the practice-based research methodology the dissertation describes. This section catalogs the archived codebase, spanning approximately 2009 through 2017 across two primary contexts: the Harvard GSD computational lab (approximately 2013–2017) and an earlier academic research and teaching archive reaching back to 2009.
The code is not academic computing in any conventional sense — it is not modeling software, statistics, or GIS. It is a heterogeneous toolkit assembled to make the landscape legible in real time: firmware for physical sensors embedded in rivers and flumes, data pipelines that route sensor readings into design environments like Rhino and Grasshopper, visualization systems that convert depth maps and blob-detected contours into surface geometries, and teaching exercises that ask design students to engage with computation not as a postproduction tool but as a material condition of landscape practice. The code enacts what the dissertation describes as adaptive epistemology: design propositions are tested through instruments that return data, that data is fed back into the design environment, and the proposition is modified in response. The loop is not metaphorical — it is implemented in serial buffers, interrupt service routines, and socket connections.
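The shape of that sense-decide-actuate loop can be sketched in miniature. The sketch below is not code recovered from the archive: it assumes a simple "key:value" serial message format and a hypothetical gate-actuation command, and stands in for the serial buffers and socket connections the real pipeline used.

```python
import json

def parse_reading(line):
    """Parse one sensor line of the (assumed) form 'depth_mm:123'."""
    key, _, value = line.strip().partition(":")
    return key, float(value)

class AdaptiveLoop:
    """Skeleton of a sense-decide-actuate loop: readings stream in, a running
    estimate updates, and an actuation command is emitted when the estimate
    crosses a threshold. Message formats here are illustrative assumptions,
    not the archive's actual protocol."""

    def __init__(self, threshold_mm=50.0, alpha=0.2):
        self.threshold = threshold_mm
        self.alpha = alpha      # smoothing factor for the running estimate
        self.estimate = None

    def ingest(self, line):
        _, value = parse_reading(line)
        # Exponentially weighted moving average smooths sensor noise.
        self.estimate = value if self.estimate is None else (
            self.alpha * value + (1 - self.alpha) * self.estimate)
        if self.estimate > self.threshold:
            # Command that would be written back over serial to an actuator.
            return json.dumps({"cmd": "open_gate",
                               "estimate_mm": round(self.estimate, 1)})
        return None

loop = AdaptiveLoop()
for line in ["depth_mm:40", "depth_mm:48", "depth_mm:80", "depth_mm:90"]:
    cmd = loop.ingest(line)
    if cmd:
        print(cmd)
```

Everything described in the inventory below elaborates some segment of this loop: firmware produces the readings, pipelines carry them, visualization renders the estimate, and actuation closes the circuit.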
File Inventory
The archive holds 499 files of custom computational work across four primary languages and toolchains. Figure A1 summarizes the distribution.
[Figure A1. File Inventory. Custom computational work in the archive, 2009 to 2017.]
Temporal Arc
The archive reveals a coherent trajectory across roughly seven years of practice.
Phase 1 (2009–2011) — Sensing and seeing. The earliest code is Processing sketches for visual sensing: ISO Track’s blob detection, the Sensing research TCP server, the boundary/buildings/clouds Processing examples in the Sensing archive. The dominant concern is visibility — how do you make environmental processes visible in computational terms? The tools are general-purpose creative coding adapted for landscape-specific problems.
Phase 2 (2011–2013) — Physical models and embedded control. The Emriver flume instrumentation and the early dye injector code introduce embedded Arduino firmware. The concern shifts from visualization to actuation: how do you not just see but intervene? The data pipeline now runs in both directions — sensing in, actuation out. The physical flume model becomes an experimental apparatus that can be precisely controlled and measured.
Phase 3 (2013–2015) — Lab infrastructure. The GSD lab code base consolidates into a coherent instrument stack: Firefly Firmata as the standard Arduino-Grasshopper bridge, AccelStepper for motion control, banner sensors for surface profiling, the GUI sketch as the unified dashboard. MongoDB appears as data infrastructure. The tools are no longer one-off sketches; they are a platform.
Phase 4 (2015–2016) — Real-time synthesis. FluX, the sediment interface, the realInterface.exe, and the Cyborg Coasts workshop code represent a mature synthesis: Kinect depth cameras, fluid dynamics simulation, GPU acceleration, MongoDB time-series storage, and full 3D export to Rhino running together as a single environment. The physical and digital are genuinely coupled — changes in the sediment surface propagate immediately into the computational model, and the computational model can drive actuators that modify the physical surface. The adaptive feedback loop is complete.
Phase 5 (2016) — Codified pedagogy. The GSD 2241 exercise series represents the maturation of this technical practice into a teaching methodology. The exercises encode the lab’s tools — Grasshopper process analysis, Kinect sensing, parametric surface generation — as structured curriculum. Practice becomes pedagogy.
Codebase Entries
Firefly Firmata (2015) — Authors: Andrew Payne and Jason Kelly Johnson. Arduino firmware that makes a microcontroller fully transparent to Grasshopper via a serial protocol encoding sensor readings as comma-separated strings and actuator commands as offset integers (10000–19999 for digital, 20000–29999 for analog PWM, 30000–39999 for servo). Supports Uno, Leonardo, Mega, and Due board variants. ~250 lines. Arduino C/C++. Ch05 Tools, Ch09 Interactions.
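The offset-integer bands the entry describes can be sketched as a small decoder. The band boundaries (10000–19999 digital, 20000–29999 analog PWM, 30000–39999 servo) come from the catalog entry above; how pin and value are packed inside each band is not documented here, so the payload extraction below is an assumption for illustration only.

```java
// Sketch of the Firefly Firmata offset-integer command scheme:
// the integer's band selects the actuator type. The pin/value packing
// within a band is a hypothetical simplification, not Firefly's actual scheme.
public class FireflyCommand {
    enum Kind { DIGITAL, ANALOG_PWM, SERVO, UNKNOWN }

    // Classify a command integer by the band it falls in.
    static Kind classify(int cmd) {
        if (cmd >= 10000 && cmd <= 19999) return Kind.DIGITAL;
        if (cmd >= 20000 && cmd <= 29999) return Kind.ANALOG_PWM;
        if (cmd >= 30000 && cmd <= 39999) return Kind.SERVO;
        return Kind.UNKNOWN;
    }

    // Offset within the band (the payload the firmware would unpack).
    static int payload(int cmd) { return cmd % 10000; }

    public static void main(String[] args) {
        int cmd = 20137;  // hypothetical analog PWM command
        System.out.println(classify(cmd) + " payload=" + payload(cmd));
    }
}
```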
Firefly QuadStepper (2013) — Author: Andrew O. Payne. Firmware for four-axis stepper motor control with acceleration ramping (AccelStepper library) and bidirectional completion feedback to Grasshopper via bitmask encoding, enabling parametric actuation synchronized to computational geometry. ~220 lines. Arduino C/C++. Ch05 Tools, Ch10 Generational Robots.
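The bitmask completion feedback the entry mentions can be sketched as follows. The bit assignment (bit 0 for axis 0, and so on) is an assumption for illustration; the catalog only states that four-axis completion is reported to Grasshopper as a bitmask.

```java
// Sketch of four-axis completion feedback via a bitmask: each stepper
// sets one bit when its move finishes, and the host tests the bits.
// The bit-to-axis assignment here is a hypothetical convention.
public class StepperMask {
    // Set the bit for a finished axis (0-3).
    static int markDone(int mask, int axis) { return mask | (1 << axis); }

    // Has a given axis reported completion?
    static boolean isDone(int mask, int axis) { return (mask & (1 << axis)) != 0; }

    // All four axes done when the low four bits are set.
    static boolean allDone(int mask) { return mask == 0b1111; }

    public static void main(String[] args) {
        int mask = 0;
        mask = markDone(mask, 0);
        mask = markDone(mask, 2);
        // prints "axis 2 done: true, all done: false"
        System.out.println("axis 2 done: " + isDone(mask, 2)
                + ", all done: " + allDone(mask));
    }
}
```

Packing four completion flags into one integer lets a single serial read tell Grasshopper which motions have finished, which is what synchronizes actuation to the parametric model.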
Firefly Wii Nunchuck (2012–2015) — Author: Tod E. Kurt / lab adaptation. Adapts a Nintendo Wii Nunchuck into a seven-channel gestural input device (triaxial accelerometer, joystick, two buttons) compatible with the Firefly/Grasshopper ecosystem, transmitting at 115,200 baud every 25ms. ~40 lines. Arduino C/C++. Ch09 Interactions.
Emriver Dye Injector Controller (2013, 2015) — Author: James Nation. Embedded controller for the Emriver tabletop river flume, managing precision two-color dye injection (green, blue, alternating) with configurable timing, pulse duration, Gray code rotary encoder, LCD menu interface, and acoustic feedback. ~700 lines. Arduino C/C++. Ch06 Models, Ch05 Tools.
Groundwater Motor Controller (2013–2015) — Author: James Nation. Feedback-controlled pump speed regulator with interrupt-driven paddle-wheel flow measurement (1441 pulses/liter K-factor), 100-sample moving average with outlier rejection, and real-time ml/s display. ~250 lines. Arduino C/C++. Ch06 Models, Ch05 Tools.
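The pulse-to-flow arithmetic the entry describes can be sketched directly from the stated K-factor: 1441 pulses per liter means a one-second pulse count divides by 1441 and scales to milliliters. The outlier-rejection tolerance below is an illustrative assumption; the archive's version uses a 100-sample moving average.

```java
// Sketch of the paddle-wheel flow conversion: K-factor of 1441
// pulses/liter (from the catalog), pulses counted per second, reported
// in ml/s. The outlier tolerance in robustMean is an assumed parameter.
public class FlowMeter {
    static final double K_FACTOR = 1441.0;   // pulses per liter

    // ml/s from a one-second pulse count.
    static double mlPerSecond(int pulsesPerSecond) {
        return pulsesPerSecond / K_FACTOR * 1000.0;
    }

    // Mean of samples within `tol` of the median: a simple form of
    // the moving-average-with-outlier-rejection idea.
    static double robustMean(double[] samples, double tol) {
        double[] sorted = samples.clone();
        java.util.Arrays.sort(sorted);
        double median = sorted[sorted.length / 2];
        double sum = 0;
        int n = 0;
        for (double s : samples)
            if (Math.abs(s - median) <= tol) { sum += s; n++; }
        return n > 0 ? sum / n : median;
    }

    public static void main(String[] args) {
        // 1441 pulses in one second is exactly one liter per second.
        System.out.printf("%.1f ml/s%n", mlPerSecond(1441));  // prints 1000.0 ml/s
    }
}
```

In the firmware this count would accumulate inside an interrupt service routine triggered by the paddle wheel, with the conversion and averaging run in the main loop.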
Harvard Media Feeder Controller (2015) — Authors: Steve Gough and James Nation. Stepper-driven granular media delivery system with proximity-sensor calibration, hydrograph playback (simulating dynamic flood sediment loading), grams-per-second feed rate control, and cumulative totalization. ~700+ lines. Arduino C/C++. Ch06 Models, Ch10 Generational Robots.
Magnetic Card Reader Interface (2015) — Author: James Nation. Magnetic stripe reader controller coupled to the media feeder hydrograph system, enabling card-coded experimental scenario selection as a social interaction layer for the physical landscape model. ~900+ lines. Arduino C/C++. Ch09 Interactions, Ch11 Co-Creation.
Banner Sensor Profiler (2015) — Three-axis ultrasonic distance sensor array with stepper motor control, providing continuous surface profiling of the sediment table for Grasshopper intake. ~120 lines. Arduino C/C++. Ch10 Generational Robots, Ch06 Models.
realInterface (2015–2016) — Compiled C# .NET application integrating Kinect depth sensing, MongoDB time-series storage, OpenTK 3D rendering, and RhinoCommon geometry export into a real-time instrument for the responsive sediment table. Includes a temporal capture-and-replay workflow with record, stop, play, forward, and backward controls. C# .NET, MongoDB, Kinect SDK, RhinoCommon. Ch06 Models, Ch09 Interactions.
FluX Hydrodynamic Visualization (2016) — Author: Xun Liu (GSD MLA ’17). GPU-accelerated 2D Navier-Stokes fluid simulation coupled to real-time Kinect depth input and OpenCV plant detection (color-tracked pink markers), producing a hybrid physical/digital landscape model in which the evolving sediment surface drives a live flow simulation. Multi-file architecture (~400+ lines). Processing (Java), DwPixelFlow, KinectPV2, OpenCV. Ch06 Models, Ch08 Landscape as Medium.
Contour Map Generator (2015) — Based on Cedric Kiefer / onformative.com, lab adaptation. Converts height-map images (including Kinect depth captures) to interactive 3D topographic models using multi-threshold blob detection at 35 elevation levels. ~75 lines. Processing (Java), blobDetection library. Ch06 Models.
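The multi-threshold quantization the entry describes can be sketched as a banding function: each 8-bit height value maps to one of 35 elevation bands, and each band becomes one contour layer for blob detection. The level count comes from the catalog entry; the linear mapping from pixel value to band is an assumption about how such a sketch typically works.

```java
// Sketch of multi-threshold contour banding: quantize an 8-bit
// height value into one of 35 elevation bands, each of which would be
// passed to blob detection as a separate contour layer.
public class ContourBands {
    static final int LEVELS = 35;   // elevation levels, per the catalog

    // Band index (0..34) for an 8-bit height value (0..255).
    static int band(int height) {
        return Math.min(LEVELS - 1, height * LEVELS / 256);
    }

    public static void main(String[] args) {
        // prints "0 17 34"
        System.out.println(band(0) + " " + band(128) + " " + band(255));
    }
}
```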
Sediment Table GUI / Instrument Dashboard (2015) — Processing application (1920×1080) integrating dual Arduino serial ports, three-axis stepper/sensor visualization with temporal fading strip charts, Kinect depth, OpenCV, and video — the primary operator interface for the responsive sediment table laboratory. ~650+ lines. Processing (Java), KinectPV2, ControlP5. Ch05 Tools, Ch09 Interactions.
Kinect Depth Capture Test (2015–2016) — Based on Thomas Sanchez Lengeling’s KinectPV2 library. Minimal Kinect v2 depth capture with threshold filtering (1090–1500mm) to isolate the active sediment surface from table structure and room. Processing (Java), KinectPV2. Ch07 Technogeographies, Ch06 Models.
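The depth-window filter the entry describes reduces to a per-pixel range test: keep depths between 1090 and 1500 mm (the active sediment surface) and zero everything else. The thresholds come from the catalog entry; treating zero as the masked value is an assumption consistent with common Kinect depth conventions.

```java
// Sketch of the depth-window filter: pass only depths in the
// 1090-1500 mm band (the sediment surface), masking table structure
// and room beyond. Zero-as-mask is an assumed convention.
public class DepthFilter {
    static final int NEAR_MM = 1090, FAR_MM = 1500;

    // Return a copy of the frame with out-of-band depths zeroed.
    static int[] filter(int[] depthMm) {
        int[] out = new int[depthMm.length];
        for (int i = 0; i < depthMm.length; i++) {
            int d = depthMm[i];
            out[i] = (d >= NEAR_MM && d <= FAR_MM) ? d : 0;  // 0 = masked
        }
        return out;
    }

    public static void main(String[] args) {
        int[] frame = {900, 1100, 1450, 1600};
        // prints "0 1100 1450 0 "
        for (int d : filter(frame)) System.out.print(d + " ");
        System.out.println();
    }
}
```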
GSD 2241 Exercise Series (2016) — Author: Bradley Cantrell. Ten-session computational landscape design curriculum progressing from parametric landform generation through environmental process analysis (runoff, infiltration, slope) to real-time Kinect-based terrain sensing. Grasshopper for Rhino. Ch05 Tools, Ch04 Ecology of Practice.
rt_sense_viz Banner-to-Section Archive (2015) — Collaborator: Leif Estrada. Iterative Grasshopper development archive (60+ versioned files) for a banner sensor to stepper motor to section surface workflow, including recorded banner_data.txt from live experiments. Grasshopper (.gh). Ch05 Tools, Ch10 Generational Robots.
Sedinnect001 / Unity Kinect Environment (2013–2015) — Unity 3D project with ZigFu OpenNI Kinect integration for bodily gestural interaction with terrain visualizations, including multi-user skeleton tracking and gesture detection. C# / Unity 3D, ZigFu OpenNI. Ch09 Interactions, Ch11 Co-Creation.
ISO Track / Time Blobs (2010–2011) — Authors: Aaron Steed, John Holder, and Hedley Roberts; adapted by Bradley Cantrell. Real-time multi-threshold blob detection system that renders successive video frames as extruded 3D contour surfaces in time, making environmental change visible as spatial form. ~1270+ lines. Processing (Java), BlobDetection library. Ch07 Technogeographies, Ch08 Landscape as Medium.
Sensing Research TCP Server (2009–2011) — Early distributed field sensing infrastructure: a Processing TCP server that receives incoming sensor node connections, synchronizes remote clocks by returning Unix epoch timestamps, and logs sensor streams. Processing (Java), Net library. Ch07 Technogeographies.
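The clock-sync exchange the entry describes can be sketched in a few lines: a node connects, and the server replies with the Unix epoch timestamp so remote clocks can be aligned against it. The port choice and newline framing below are assumptions; only the connect-and-return-epoch pattern comes from the catalog. (The archive's version is a Processing sketch using its Net library; Java sockets are the equivalent underneath.)

```java
import java.io.*;
import java.net.*;

// Sketch of the timestamp exchange: the server accepts a sensor-node
// connection and replies with seconds since the Unix epoch. Port 0
// (any free port) and newline framing are illustrative choices.
public class TimeSyncServer {
    // The payload a node uses to align its clock.
    static long epochSeconds() { return System.currentTimeMillis() / 1000L; }

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0);
        Thread t = new Thread(() -> {
            try (Socket node = server.accept();
                 PrintWriter out = new PrintWriter(node.getOutputStream(), true)) {
                out.println(epochSeconds());   // newline-terminated reply
            } catch (IOException ignored) {}
        });
        t.start();

        // A sensor node's side of the exchange: connect, read the timestamp.
        try (Socket node = new Socket("localhost", server.getLocalPort());
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(node.getInputStream()))) {
            System.out.println("server epoch: " + in.readLine());
        }
        t.join();
        server.close();
    }
}
```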
AAR Installation LED Data Replay (2014) — Two-part Arduino/Processing system that replays logged sensor data (CSV file) as a nine-channel LED light sequence, converting environmental measurement into a publicly readable spatial light event. Arduino C/C++, Processing (Java). Ch07 Technogeographies, Ch08 Landscape as Medium.
Terzidis Voronoi Pattern (2012) — Algorithm by Kostas Terzidis; adapted by Bradley Cantrell. Interactive Processing implementation of Terzidis’s Voronoi algorithm from Algorithms for Visual Design, placing computational design theory in dialogue with landscape spatial tessellation through working code. Processing (Java). Ch03 Refractions, Ch04 Ecology of Practice.
Cyborg Coasts Kinect Workshop Code (Spring 2016) — Author: Bradley Cantrell. Teaching code for GSD 6346 demonstrating Kinect depth acquisition and depth-to-3D point cloud conversion within a course framing coastal environments as hybrid cyborg systems. Processing (Java), KinectPV2. Ch05 Tools, Ch09 Interactions.
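The depth-to-point-cloud conversion the entry demonstrates is a pinhole back-projection: each depth pixel (u, v, z) maps to a 3D point via the camera's focal lengths and principal point. The intrinsic values below are illustrative stand-ins, not the Kinect v2's calibrated parameters.

```java
// Sketch of depth-to-3D back-projection through a pinhole camera model.
// FX/FY/CX/CY are assumed intrinsics for illustration, not the Kinect
// v2's actual calibration.
public class DepthToCloud {
    static final double FX = 365.0, FY = 365.0;   // assumed focal lengths (px)
    static final double CX = 256.0, CY = 212.0;   // assumed principal point (px)

    // World-space point in meters for pixel (u, v) at depth z millimeters.
    static double[] backProject(int u, int v, int depthMm) {
        double z = depthMm / 1000.0;
        return new double[]{ (u - CX) * z / FX, (v - CY) * z / FY, z };
    }

    public static void main(String[] args) {
        double[] p = backProject(256, 212, 1200);   // center pixel, 1.2 m away
        // prints "0.000 0.000 1.200"
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]);
    }
}
```

Applied per pixel across a depth frame, this produces the point cloud that the workshop code then renders or hands off to the design environment.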
GSD 2241 Landscape Analysis Suite (2016) — Author: Bradley Cantrell. Six Grasshopper definitions encoding runoff simulation, height distribution analysis, slope calculation, infiltration modeling, point cloud reduction, and mesh-to-surface conversion as visual dataflow diagrams for student use. Grasshopper for Rhino. Ch05 Tools, Ch08 Landscape as Medium.
Dissertation Chapter Map
Figure A2 maps the codebase against the dissertation’s chapters, distinguishing primary relevance from secondary relevance for each project. The pattern that emerges confirms the appendix’s placement. Chapter 05 Tools receives the densest clustering of primary and secondary relevance across the codebase, with Chapter 06 Models, Chapter 07 Technogeographies, and Chapter 09 Interactions also heavily engaged. The codebase is the material from which the dissertation’s central technical arguments were distilled.
[Figure A2. Dissertation Chapter Map. Code projects and the chapters they inform.]