DesignIntelligence Media

All for Intelligence

Michael LeFevre

Managing Editor, DesignIntelligence

March 20, 2024


DI’s managing editor reflects on technology’s march, what is invisible and what is real.

I’m all for intelligence. Most of the time we need all we can get, in many forms: emotional, collective, even military varieties to keep us safe. But I must admit I’m still a bit uneasy about the artificial kind. You see, I’ve been an architect for 55 years now. I started working in a small office in 1968, at the age of 14, alongside two talented mentors. Back then we worked without personal computers or cellphones. We designed and drew by hand, lovingly, with lead drafting pencils on vellum.

Tivadar Balogh, AIA rendering, circa 1968

In those days, we gathered our intelligence about projects, architecture and the world from experience, books and the periodicals of the day. When the monthly issues of Architectural Record, Architecture and Architectural Forum arrived, they fueled our heated discussions of emerging trends in the profession. Debates ensued over the merits of John Johansen’s Mummers Theater in Oklahoma City, Robert Venturi’s polemic in “Complexity and Contradiction in Architecture” and the relative virtues of “ducks” and “sheds.” Practicing in the Midwest, with limited resources for travel and constrained by narrow world views, we were content with our dogma: Modernism was king, and drawing skillfully and beautifully, by hand, was essential! Our network of connected intelligence was small, local and scarcely technological — a small spider’s web of personal contacts and experience-based know-how.

The Coming Change

In the early 1980s, when technology evolved to provide us with new means of production, we embraced the change. Making the shift from graphite to plastic lead on Mylar was easy enough. The subsequent emergence of pin bar overlay drafting seemed a fine advancement. It offered us the ability to separate layers of information and be more intelligent about data reuse. Drawing the floor background only once, on its own layer, helped us leverage that information for engineering backgrounds without having to recreate it. Smart. But, as has been well chronicled, this precursor to computer drafting simply enhanced our former process rather than transforming it. Our means of production, now primitively separating and reusing discrete data, began to tug at our shirtsleeves and drag us slowly into the age of automation and intelligence.

When computer-aided design and drafting (CADD) entered the scene, a bigger shift shook our shoulders and rattled our brains. Personal computers and local area networks appeared, even in small design firms. The opportunity to leverage machine and computing power to automate and standardize our formerly artisanal practices was touted as “liberating.” “Three-to-one productivity” was the cry from software providers such as Autodesk, Bentley and others. “Free yourself from the enslavement of construction documents. More time for the high-value, truly creative tasks you love,” they promised. Perhaps they were right, but what they neglected to mention was that all the time CADD freed up came with an equal or greater amount of time needed to learn the software, keep up with the hardware, grow the infrastructure, and pay for these digital tools, the training and the new staff members to operate them. “A black hole for money,” many called it. Were we more intelligent as a result of the CADD era? Yes! We had more data (provided we had entered it correctly, and in common formats that could flow between team members — an asymptotic pursuit). But we were now required to devote huge amounts of time, money and energy to structuring and maintaining that data. An unintended consequence was that most senior practitioners could hardly see their projects in progress anymore: instead of lying full size and in full view on the drafting tables, the “drawings” were now stored inside small CRT screens, viewable only by zooming and scrolling relentlessly, frustratingly. Beyond the purported efficiencies, what intelligence, connection and intuition had we lost in the translation?

The BIM Boom

At the turn of the century, the dawn of the building information modeling (BIM) revolution, we were promised another generational, transformative wave of change. No longer would we simply automate drawing of the “dumb” lines of our past, they said. Now we would work in true, three-dimensional space, placing “perfectly accurate” 3D digital objects in a single integrated model! Laden with data, these objects would now be “intelligent,” the software makers claimed, with “attributes,” and again they were correct, except for one thing. Despite their promises of “all data in one place” and “a single source of truth,” we soon learned that, without standards for data creation, flow, storage and retrieval, we were not much better off than we had been. We saw that models fully loaded with data were too cumbersome to open, use or share. Regressing, we began to break them into bits once again. Our digital storage cabinets and transmission pipes simply weren’t big enough.

We struggled to know which data to include. What is “real” and necessary to you, the manufacturer, is better simplified, reduced and abstracted by me, the designer, to suit my purposes. The contractors’ model needs and uses were a far cry from the designers’. Sure, I celebrated the joy of no longer needing an electric eraser to grind holes in my drawing sheet when something had to change. I marveled at Revit’s speed and intelligence in modeling a stairway in seconds, far faster than the hours it used to take me by hand. Love those algorithms. But I also lamented the loss of connection with the medium. It seems all this “intelligence” came with hidden, unanticipated costs. Somehow the processes of design and drawing seemed less real.

From 2000 to 2019, in a self-created position at a national construction management firm, my role was to liaise between designers and builders. When the digital revolution burst on the scene, my colleagues at Holder Construction, well capitalized and risk savvy, were ready to adopt the new technology carefully. During those two decades, I became a renowned BIM evangelist, developing use cases and building our industry-leading team. In a short ten years, we recruited and trained a staff of more than 50 modelers deployed across the country on project sites. With me identifying needs, providing vision, building business cases and scrounging for funding, our team became expert at building our own models of our architectural partners’ designs, performing systems coordination and collision detection, and creating 4D time-based logistics and sequencing models, even developing a copyrighted, in-house, iOS-based facility management program. All the hands-on software use was done by a crackerjack team fresh out of school, all young enough to be my children. It was a heady time. Without question, we were more “digitally intelligent” and facile than many of our peer contractors and architectural partners. But despite the exponential growth of BIM — in our firm and across the industry — the industry has still realized only a fraction of this new tool set’s potential.

Devon Energy Headquarters BIM, circa 2006, image courtesy Holder Construction

Reflection

In hindsight, I have no regrets about the digital journeys we embarked upon in these firms. I liken it to what it must have felt like in the early 1900s, in the heyday of modernism, the Bauhaus and the industrial age. Transoceanic excitement and broad sharing of ideas were rampant — for the modernists then and for us leading, bleeding-edge BIM believers later. Collaboration, sharing and a new attitude were the orders of the day. We were smarter, faster and without question a bit more automated and industrialized (read: impersonal, bureaucratic) than we had been. But through it all, I was always glad someone else was doing the keyboard crashing and software coding — they just weren’t my thing. My intelligence, what little I may have had, was of a different ilk.

And Now, AI?

The current tsunami overtaking the built environment industry (and all of civilization) is artificial intelligence (AI). Some would call it alternative intelligence or machine learning. The power inherent in AI is nothing short of frightening. And yet, lest you cast me off after reading this memoirish rant thus far, know this: I welcome AI’s arrival. I’m the first to delight in asking Alexa to select a movie to stream, or in buying an airline ticket on Delta and selecting my seat, all in 30 seconds. Those smart, efficient, satisfying experiences are minimal examples of AI at work. I enjoy using intelligence and being efficient. I don’t want to return to the days when we didn’t have the information we needed for our projects and were left to guess or approximate. (Let’s pool our ignorance!) In the decades since I began practice, it seems we now have too much information. Some of it is even intentionally shaded or shaped to deceive or persuade us, whether by evildoers, politicos or commercial and governmental entities with self-serving motives. This rampant propaganda threatens our ability to be intelligent because we can scarcely tell what is real. Our challenge now is filtering, reducing, evaluating and rendering information so it is manageable and useful. With AI’s help to generate rough drafts and do generic research, we can direct our energies to those reductive efforts, to curating and judging.

In these very pages I have interviewed the likes of industry prophet Phil Bernstein, who offered an understanding of his new book, “Machine Learning: Architecture in the Age of Artificial Intelligence.” We at DI have shared an optimistic podcast with prolific author and educator Randy Deutsch, who paints a picture of AI’s liberating potential. And I edited a scholarly two-part essay by Eric Cesal, “In The Future, Everyone’s An Architect,” in which he, shockingly, used free software to produce a stunning AI-generated video featuring a typical architect, owner and contractor. The clichés they exchanged were frighteningly familiar and accurate. It’s true, it seems: We are creatures of habit, and our patterns are predictable. As Cesal showed, even machines, drawing from available data, can design houses and say the exact, clichéd things that we experienced, highly trained design professionals say. Liberating? Perhaps. Scary? Indeed. In need of human oversight and curation? Absolutely!

The Invisible

Students of culture know that an organization’s manifest aspects — things like language, artifacts, behavior, beliefs and values — are what visibly define and comprise it. But, surprisingly, even more important to a culture are its hidden principles: the things embedded in it that are neither talked about nor seen.

In most of the world, such tenets include the expectations that we wear clothing in public, that we seek to do the right thing and that (except for a few politicians) we be kind to others. Our capitalistic society carries other, now hidden beliefs, such as the assumption of continued economic growth and ever-available resources. Chief among them now is the belief that technology will always improve — and grow — to enhance human existence.

Man 2.0

In his bestselling book “Homo Deus,” Yuval Noah Harari speaks of humanity’s next evolution into a greater mind, a higher level of consciousness through machines, computers and other forms. He calls this aspirational state “Man 2.0.” Perhaps this higher consciousness will become invisible, automatic and a part of daily life. Perhaps it already has. The internet — call it our collective intelligence — already knows my buying tendencies on Amazon. Those are clearly being tracked. Related internet feeds are being sent to me through RSS on Flipboard based on my interests and what I follow. To us amateurs, these are forms of machine learning, big data and artificial intelligence.

“What is essential is invisible to the eye.” 

— Antoine de Saint-Exupéry, “The Little Prince”

When artificial intelligence does become more invisible, integrated and inescapable in daily life, I’ll have no problem accepting it. I’ll embrace it as I have all other technology to date, provided it’s regulated and safe and the ill-intended are somehow kept at bay. Acknowledging and accepting that that day, along with AI’s powerful generative capabilities, is already upon us remains slightly beyond my comfort zone and my ability to make sense of it. In the meantime, I’ll watch and wait.

Ability and Responsibility

Charles Darwin taught us that those best able to adapt are best equipped to survive. I’m proud of my ability to adapt to innovation and technology over the years. An ability to anticipate and react to the future, an evolving set of careers and a high level of tech savviness have benefited my career and life to a degree many younger, less adaptive colleagues have not enjoyed. Starting with humanity’s discovery of fire, tools, language and the power of collaboration to improve our lot, we have continued to evolve selectively, increasing our intelligence.

But it’s the natural order of things for our rate of adaptation to slow in our final decades, and I will accommodate that pace. Do we senior members of the profession have a responsibility to confront, assess and embrace AI’s advance? Of course, but that doesn’t necessarily mean we must lead the charge. Rather, our value lies in offering perspective and humanity — aspects AI cannot provide.

The primitive and the industrialized, author photos

I am far from a technology denier. I welcome technology, and in short order I’ll likely be a daily user of many more AI-based tools and services. I’m no Luddite; my record of technology adoption and use supports that claim. But as Clint Eastwood’s Dirty Harry famously said, “A man’s got to know his limitations,” and I know mine. And while I welcome AI’s ability to overcome them, I know others are much better equipped to serve as first-wave enablers in the co-creation role, helping the machines find their way to serve us.

At this point in my trajectory, I’m happy to enjoy a few of the analog experiences I had to forgo while I was engaged in these last few digital revolutions — you know, the things that aren’t artificial, the things that are clearly real. Things like talking to people, writing, going for a walk, petting the dog and traveling our planet with my wife while I still can.

Yes, I don’t want to be the guy who bleeds and leads the way in figuring this one out. I’m quite happy to let others manage that charge. When AI has been made safe, easy and harmless for the mainstream, I’ll seize the opportunity. You see, I’m all for intelligence; just don’t ask me to show the way this time, because I’m not that intelligent.

Or maybe — in knowing that it’s not for me — I am.

From Atlanta, Georgia, USA, Earth, in the year 2024, I remain …

Michael LeFevre, FAIA emeritus, managing editor of DesignIntelligence; principal, DI Advisory; senior fellow in the Design Futures Council; and author of the Amazon-bestselling “Managing Design” (Wiley, 2019), a person of primarily human — and often limited — intelligence.