In the late 1960s, Nicholas Negroponte, a professor of computer science at MIT, was exploring a provocative question: what can be learned about the relationship between computers and people by studying how architects and planners might design digitally? The resulting book, The Architecture Machine, drew some fascinating conclusions, many of which hold true today and can guide current efforts to integrate digital tools into the building industry.
In The Architecture Machine, Negroponte posits that computers can have three potential relationships to the design process:
1. Accommodation: the machine attempts to replicate, as best possible, existing practices and customs of design;
2. Adaptation: design processes adjust to accommodate the limitations and constraints of the machine; and
3. Evolution: the design process and the machine integrate, each contributing their respective strengths, and each evolve to optimize their collective talents.
Working on IBM mainframes and early cathode ray displays, Negroponte developed a system called URBAN5 that attempted not just to replicate the production of design deliverables (accommodation) nor to constrain the design process to fit within the computational and evaluative limits of the machine and its program (adaptation), but to integrate and collaborate with the designer to expand their joint capabilities. Thirty years later, design technology providers struggle in various ways with these same issues, and in many ways Negroponte’s approach predicts the history of digital design up to today.
Thus Negroponte was far ahead of his time. Digital design in the fragmented building design, construction, and facilities management realms (hereafter referred to collectively as “the building industry”) expanded in fits and starts through the 1980s, only hitting its stride in the 1990s when computers, visual displays and plotters became sufficiently inexpensive in the aggregate to reach critical mass of adoption.
By the mid-nineties, almost all firms had invested heavily in computing infrastructure, software and training, and tremendous productivity gains paralleled an expansion in the construction economy that is only now subsiding. Few firms would attempt to provide services today without computers, and clients have come to expect digital design deliverables as routine.
Yet, in my view, our industry is only now completing its “accommodation” phase of digital development and beginning to move through “adaptation” toward “evolution,” for some very understandable reasons. Constrained by fierce fee competition, a slowly moving standard of care, and (in the 1980s) the liability crisis, designers demanded and got digital tools that replicated analog processes with increasing precision. Process innovation was not high on the list of software development priorities nor customer wish lists.
Computer systems evolved as “digital pencils,” created to carefully and efficiently replicate the graphical, drafted deliverables that form the common language of designers to constructors—the working drawing. Drafting productivity increased greatly, and less time was spent making and printing tracings. Much of our work at Autodesk revolved around carefully linking functionality to generally accepted drafting standards of the building industry.
Computers thus directly accommodated our need to create and deliver drawing-based deliverables, but the information input into those digital drawings had no more inherent intelligence than its graphite or plastic-lead predecessors. And, save the occasional delivery of digital drawings for preparation of shop drawings (a practice still mostly shunned by architects and engineers), one might argue that the digital revolution’s greatest contribution to our colleagues in the construction portion of the industry is our ability to create crisp plotted output that can be easily reduced in size and carried into the field for use by the crew. The inherent intelligence lent by the author of that design information remains in his or her head, and is flattened to ink by the plotting process.
The proliferation of building design computing in offices and schools took us directly into the “adaptive” phase of our development, however, as less emphasis was put on the teaching of drawing, and work product within offices spent its useful life “trapped” within the computers of the design team, seeing the light of day only during the periodic production of check plots. The daily mentoring of the analog, paper-based studio, where work was displayed on large sheets taped to the drafting table, disappeared accordingly. And, while few bemoaned the loss of architectural lettering skills in young architects, dependence on digital tools steadily increased while the older, non-digital generation grew largely distanced from the day-to-day production of the work, interacting instead by sitting before darkened screens full of yellow and green lines and delivering hopeful sketches to their designers.
Other industries reached the “evolutionary” stage of digital design long before ours. Aerospace, automobile design, and other industrial design markets, taking advantage of economies of scale and highly constrained processes, have created simulation, real-time visualization, expert systems, and analysis and development environments without which the Boeing 777 or the Chrysler PT Cruiser would not have been possible. These systems integrate directly and profoundly into process as true extensions of their human design counterparts, with apparent understanding of the digital data created by their authors.
So, in our fragmented industry, where design and construction teams flash into and out of existence on each project, and every final product is a unique, never-to-be-built again result, how can we expect to have tools focused on supporting and integrating not just our required deliverables, but our processes as well? Some promising first steps are emerging as design technology providers, like ours, realign their focus on understanding the inherent quality of digital design data as it supports design process.
The key barrier to be overcome before evolution can be widespread is design information transaction standards. When computers are used to create the common language of graphics, the barriers to cross-functional operations are relatively low. For example, most of Autodesk’s competitors now tout “full .DWG compatibility,” allowing their solutions to read and write our format, the accepted standard. But the individual processes of the industry—conceptualization, drawing production, cost planning and estimation, fabrication, to name a few—each require specific tools and, in many cases, even more specific data formats, to be successful. It is unlikely that one tool, or one data standard, will accommodate all such processes well.
Two important initiatives being studied and developed vigorously by design technologists offer some hope that we will begin to see the advantages of integrated processes and eventually achieve our “evolutionary” phase. First, model-based design: information authored into datasets that have inherent intelligence, where a wall “knows” its size, characteristics, and relationship to adjacent doors and windows. This offers the first real possibility that design data can become more than graphics, and that drawings are only one potential deliverable created by the design team. In model-based design, the richness of the building representation, rather than any single drawing extracted from it, becomes the design team’s primary work product.
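As a minimal sketch of what it means for a wall to “know” its own properties, consider the following data model. The class names, attributes, and units here are invented for illustration; they are not drawn from any particular product or schema.

```python
from dataclasses import dataclass, field

@dataclass
class Opening:
    """A door or window hosted by a wall (hypothetical model)."""
    kind: str          # e.g. "door" or "window"
    width_mm: float
    height_mm: float

@dataclass
class Wall:
    """A wall that carries its own geometry and hosted openings."""
    length_mm: float
    height_mm: float
    thickness_mm: float
    openings: list = field(default_factory=list)

    def gross_area_m2(self) -> float:
        return (self.length_mm * self.height_mm) / 1_000_000

    def net_area_m2(self) -> float:
        # Net area subtracts the hosted openings -- a relationship a
        # flat drawing cannot supply without manual re-measurement.
        cut = sum(o.width_mm * o.height_mm for o in self.openings) / 1_000_000
        return self.gross_area_m2() - cut

wall = Wall(length_mm=6000, height_mm=2700, thickness_mm=200,
            openings=[Opening("door", 900, 2100)])
print(round(wall.net_area_m2(), 2))  # 14.31
```

Because the wall object carries its relationships, a downstream tool can query net area, material quantities, or adjacency directly, rather than re-deriving them from plotted lines.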
A second, equally interesting initiative is Extensible Markup Language, or “XML,” a successor to the Web’s original HTML (Hypertext Markup Language). Recognizing that various specialized processes (such as conceptual design, cost estimating, construction documentation, facilities management, or life-safety evaluation) will require similarly specialized applications and data structures suited to those processes, XML creates a medium of exchange through which these disparate applications can exchange selected, relevant information.
For example, a cost estimating application needs specific, quantitative information from the design data set, but not colors or texture mapping from design rendering. Under pre-designed XML protocols (many of which are being developed under the auspices of the International Alliance for Interoperability’s Industry Foundation Classes), the design tool and the cost estimating tool agree to exchange, in XML format, only the relevant connecting data. XML-based relationships spare every potential application the burden of reading every piece of design information created in a digital model irrespective of its relevance to that application’s intended solution, making interaction both lighter-weight and simpler.
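A rough illustration of that kind of selective exchange follows. The element and attribute names here are invented for the example, not taken from the actual IFC or related XML schemas; the point is simply that the estimating tool receives only quantities and costs, with all rendering data left out.

```python
import xml.etree.ElementTree as ET

# A hypothetical exchange fragment containing only what the estimator
# needs: cost codes, units, quantities, and unit costs.
exchange = """
<costExchange>
  <item code="C-3000" description="Concrete slab" unit="m3"
        quantity="42.5" unitCost="180.00"/>
  <item code="M-1200" description="Masonry wall" unit="m2"
        quantity="310.0" unitCost="55.00"/>
</costExchange>
"""

root = ET.fromstring(exchange)
total = sum(float(i.get("quantity")) * float(i.get("unitCost"))
            for i in root.iter("item"))
print(f"{total:.2f}")  # 24700.00
```

The estimating tool never sees geometry, materials, or textures; the agreed protocol defines exactly which slice of the model crosses the boundary between applications.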
At Autodesk, we have the opportunity to talk to many architects, engineers, owners and contractors, many of whom look to us to determine the digital future of design. During these discussions, we try to analyze carefully emerging industry trends and how digital processes affect them. We believe it is our role to study building industry processes carefully and respectfully, and to develop supporting tools accordingly. We increasingly hear from our customers that the advantages of mechanized drafting (Negroponte’s accommodation) have been fully realized, and that further improvements can only be incremental. They explain how key aspects of design culture have shifted as work has transferred substantially onto computers (adaptation). And they ask: when will machines deliver substantial added value from the digital design data they create?
I believe that the implications of these questions, and the answers that we choose to provide in developing the resulting tools, will bring our industry fully into the evolutionary phase of Negroponte’s paradigm, as computers assist designers to realize the full potential of process-driven digital design data.