English · 01:17:05
Feb 11, 2026 7:58 AM

The third golden age of software engineering – thanks to AI, with Grady Booch

SUMMARY

Gergely Orosz interviews pioneering software engineer Grady Booch about the three golden ages of software engineering since the 1940s, arguing that AI raises the level of abstraction in the current, vibrant era without eliminating the need for human judgment.

STATEMENTS

  • Software engineering as a practice emerged in the late 1940s; Margaret Hamilton later coined the term during the 1960s Apollo program to distinguish software work from hardware engineering.
  • Early software was bespoke and machine-specific, but economic pressures led to decoupling software from hardware, enabling reusable investments and sparking the industry's growth.
  • The first golden age of software engineering, from the late 1940s to late 1970s, focused on algorithmic abstractions for mathematical and business applications amid rising complexity.
  • During the first golden age, innovations like flowcharts and entity-relationship models emerged, alongside a division of labor involving analysts, programmers, keypunch operators, and machine operators.
  • Economic factors dominated the first golden age, as machine costs far exceeded human labor, driving automation of business processes like accounting and payroll for precision and speed.
  • Fringe developments in defense, such as real-time distributed systems for SAGE, pushed boundaries beyond central business applications, influencing future computing like the internet.
  • The software crisis of the late 1970s arose from insatiable demand for quality software at scale, which the industry couldn't meet, leading to expensive, slow, and unreliable production.
  • The U.S. government addressed the crisis by funding the Ada language project to consolidate the more than 14,000 programming languages then in military use into one standardized language.
  • Research in abstract data types, information hiding, and literate programming in the late 1970s laid groundwork for higher abstractions beyond algorithmic ones.
  • The second golden age, starting in the early 1980s, shifted to object-oriented abstractions, combining data and processes to manage growing complexity in distributed systems.
  • Micro-miniaturization and personal computers fueled the second golden age, empowering hobbyists and tying into counterculture movements for decentralized innovation.
  • Object-oriented design enabled scalable systems, as seen in early software like MacWrite and MacPaint, with principles persisting in modern tools like Photoshop.
  • Open-source practices originated in the first golden age with user groups sharing algorithms, evolving in the second to include libraries for reusable components.
  • Platforms and service-oriented architectures emerged in the second golden age, precursors to modern APIs and SaaS models like AWS and Salesforce.
  • Y2K remediation in the late 1990s and the dot-com crash of the early 2000s highlighted heroic, invisible engineering efforts that prevented widespread failures.
  • AI's history parallels software engineering with its own golden ages: symbolic methods in the 1940s-1950s and rule-based systems in the 1980s, both ending in winters due to scalability issues.
  • The third golden age of software engineering began around 2000, emphasizing abstractions at the system level with libraries and packages for complex, distributed ecosystems.
  • AI tools like Cursor and ChatGPT accelerate the third golden age by automating pattern-based coding, freeing engineers for higher-level systems thinking.
  • Current challenges in the third golden age include managing vast software volumes, ensuring supply chain security, and addressing ethical implications of pervasive systems.
  • Software engineering fundamentally balances technical, economic, ethical, and human forces, roles AI cannot fully automate.
  • Existential fears among developers mirror past crises during abstraction shifts, like from assembly to high-level languages, but fundamentals endure.
  • AI excels at generating code for familiar patterns but struggles with novel, enduring systems requiring human judgment.
  • New skills in the AI era demand systems theory, complexity management, and interdisciplinary knowledge from biology and neuroscience.
  • Successful engineers in past golden ages adapted by embracing higher abstractions, focusing on imagination unconstrained by tedium.
  • Periods of rapid change in software history produce both progress and hype, but human responsibility remains central.
  • Defense and commerce have historically driven computing innovations, from transistors to the internet, often via government funding.
  • Ethical questions, like whether to build pervasive tracking software, increasingly define modern software engineering.
  • Thriving in change involves retreating to foundational theories like those from Herbert Simon and the Santa Fe Institute.

IDEAS

  • Software engineering's youth, emerging just 70-80 years ago, positions it as a blink in cosmic time, yet it has already undergone profound transformations.
  • The term "software" itself was coined in the 1950s, highlighting how recent and fluid the distinction from hardware truly is.
  • Early programmers faced physics constraints like the speed of light, mirroring hardware engineers' challenges in a malleable medium.
  • Decoupling software from hardware in the 1960s via IBM's architecture preserved investments, unleashing exponential growth.
  • Flowcharts in the first golden age were not just diagrams but cognitive aids for tackling "hello world"-level complexities that felt monumental.
  • Defense projects like SAGE consumed 20-30% of U.S. developers, birthing real-time systems that quietly shaped civilian tech.
  • The software crisis wasn't about surveillance or crashes but sheer production bottlenecks, a "clear and present danger" of unmet demand.
  • Plato's ancient dichotomy of viewing the world through processes versus things prefigures algorithmic versus object-oriented paradigms.
  • Functional programming, pioneered post-FORTRAN, excels at hard tasks but falters on simple ones, explaining its niche status.
  • Hobbyists in the 1970s-1980s, fueled by disposable income and cheap transistors, democratized computing like never before at scale.
  • Counterculture influences, from hippie ethos to Stewart Brand's WELL, intertwined with personal computing's rise, blending social and technical innovation.
  • Object-oriented abstractions reduced complexity by encapsulating data and behavior, enabling feats impossible in procedural eras.
  • Y2K's "non-event" exemplifies invisible engineering: massive effort averted disaster, making success unnoticeable.
  • AI winters stemmed not from flawed ideas but from computational limits, much like early neural networks needing five vacuum tubes per neuron.
  • The third golden age's abstractions shift from code lines to ecosystem orchestration, with AI as an accelerator for library adoption.
  • Non-professionals, like accountants using ChatGPT for custom scripts, echo early PC hobbyists, expanding software's reach.
  • Dario Amodei's predictions overlook software engineering's holistic nature, focusing narrowly on code generation.
  • English as a "programming language" via AI bridges natural language to code, shortening the ideation-to-implementation gap.
  • Embodied cognition in systems, like Mars rovers, demands integration of physical and software worlds, beyond disembodied AI tools.
  • Biological inspirations, such as cockroach neural architectures, reveal decentralized intelligence models applicable to multi-agent systems.
  • Supply chain vulnerabilities, like SolarWinds, amplify human factors in security, turning ethics into operational imperatives.
  • Platforms like Salesforce create economic moats, where high complexity justifies SaaS fees over in-house builds.
  • Past abstraction leaps obsoleted rote skills but amplified demand for systems thinkers who balance multifaceted forces.
  • Imagination, once bottlenecked by coding tedium, now flourishes as AI handles patterns, enabling novel real-world applications.
  • Golden ages thrive on fringes: defense in the first, hobbyists in the second, AI-assisted creators in the third.

INSIGHTS

  • Historical patterns show abstraction shifts eliminate drudgery but elevate the need for holistic systems judgment, ensuring engineers' relevance.
  • Crises in software engineering arise from unchecked growth, but they catalyze golden ages by forcing adaptive abstractions.
  • Economic and warfare drivers have woven modern computing's fabric, underscoring that innovation often stems from necessity over idealism.
  • Ethical forces now dominate as software permeates civilization, transforming engineering from technical to moral balancing.
  • AI automates patterns but amplifies the scarcity of human skills in managing complexity, ethics, and novelty.
  • Democratizing tools like personal computers or LLMs expand the creator base, injecting fresh ideas from non-experts.
  • Invisible successes, like Y2K fixes, reveal engineering's true value in prevention, not spectacle.
  • Interdisciplinary foundations from systems theory and biology provide timeless frameworks for architecting resilient, intelligent systems.
  • Existential fears recur with each paradigm shift, but adaptation through fundamentals turns threats into opportunities for soaring innovation.
  • Platforms and open-source evolve reuse from algorithms to ecosystems, reducing redundancy while heightening integration challenges.
  • Disembodied AI excels in virtual patterns but falters in real-world embodiment, preserving demand for grounded engineering.
  • Periods of hype, like AI winters or dot-com booms, blend progress with overreach, requiring tempered expectations.
  • Imagination unconstrained by low-level constraints unleashes economic and social value previously unattainable.

QUOTES

  • "The entire history of software engineering is one of rising levels of abstraction."
  • "Software engineering is a field that tries to build reasonably optimal solutions that balance the static and dynamic forces around them."
  • "We are an astonishingly young industry. If you were to take Carl Sagan's cosmic calendar and put software in it, we would be in the last few nanoseconds."
  • "The software crisis was: software was clearly useful, there were economic incentives to use it, and yet the industry could not generate quality software at scale fast enough."
  • "Object-oriented programming and design differs in the sense that we approach the world at a different layer of abstraction."
  • "The best technology evaporates and disappears and becomes part of the air that we breathe."
  • "We are in the third golden age of software engineering but it actually started around the turn of the millennium."
  • "Software engineers are the engineers who balance these forces. So we use code as one of our mechanisms, but it's not the only thing that drives us."
  • "There are more things in computing, Dario, than are dreamt of in your philosophy."
  • "English is a good enough programming language much like COBOL was in that if I give it those phrases in a domain that is well enough structured, it allows me to have good enough solutions."
  • "The main thing that constrains us in software is our imagination. Well actually that's where we begin. We're actually not constrained by imagination."
  • "When there's an opportunity where you're on the cusp of something wonderful, you should look at the abyss and say, 'No, I'm going to leap and I'm going to soar.'"
  • "This is an exciting time to be in the industry. It's frightening at the same time, but that's as it should be."
  • "Functional programming makes it easy to do hard things, but it makes it astonishingly impossible to do easy things."
  • "Much of modern computing is really woven upon the loom of sorrow."

HABITS

  • Regularly retreat to foundational texts like Herbert Simon's "The Sciences of the Artificial" for grounding in complex problems.
  • Study interdisciplinary fields such as neuroscience and biology to inform software architectures, as in analyzing brain structures for embodied cognition.
  • Use AI tools like Cursor or Claude to accelerate learning unfamiliar libraries by generating simple prototypes for study.
  • Maintain historical awareness by reflecting on past abstraction shifts to contextualize current changes.
  • Focus daily on systems thinking, balancing technical, economic, and ethical forces rather than isolated coding.
  • Experiment with hobbyist projects using AI to prototype throwaway ideas, fostering creativity without perfectionism.
  • Engage with open-source communities to share and reuse patterns, building collaborative habits from early industry precedents.
  • Read works from the Santa Fe Institute on complexity to develop models for managing scale in software ecosystems.
  • Interview or discuss with pioneers, as Grady did with John Backus, to gain insights into paradigm evolutions.
  • Prioritize ethical deliberation in design, questioning "can we" versus "should we" for pervasive systems.
  • Build personal tools for non-professional needs, like accountants scripting with ChatGPT, to explore practical applications.
  • Adapt to real-world constraints, as in systems that must respond to events like missile launches, to stay resilient in dynamic environments.
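The prototype-to-learn habit above needs nothing elaborate. A throwaway sketch in Python, using the standard library's sqlite3 purely as a stand-in for whatever unfamiliar library is being explored (the table and data are invented for illustration):

```python
import sqlite3

# Disposable prototype: poke at an unfamiliar API with an in-memory database
# that vanishes when the script ends, so there is no pressure toward perfection.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE eras (name TEXT, start_year INTEGER)")
con.executemany(
    "INSERT INTO eras VALUES (?, ?)",
    [("first golden age", 1947), ("second", 1980), ("third", 2000)],
)

# See what the API actually returns: a list of 1-tuples, ordered by year.
rows = con.execute("SELECT name FROM eras ORDER BY start_year").fetchall()
print(rows)
```

The point is the habit, not the library: ten minutes of disposable code (AI-generated or hand-written) teaches more about an API's real behavior than an hour of documentation skimming.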

FACTS

  • The SAGE system, built in the 1950s-1960s, consumed 20-30% of all U.S. software developers at the time, despite there being only tens of thousands total.
  • By the late 1970s, the U.S. military used over 14,000 different programming languages across its systems.
  • The term "digital" was coined in the late 1940s, and "software" in the 1950s, marking computing's linguistic infancy.
  • Early neural networks required five vacuum tubes to simulate one neuron, leading to their dismissal as impractical in the 1950s.
  • IBM did not charge separately for software until its late-1960s unbundling decision decoupled software from hardware sales.
  • The ARPANET email directory in 1987 was a 100-page booklet listing every email address worldwide.
  • Y2K remediation involved global efforts costing billions, averting potential economic chaos from date overflows.
  • The Whirlwind computer in the 1950s was an experimental real-time system that influenced SAGE and early user interfaces.
  • Fairchild Semiconductor's first major customer was the U.S. Air Force for Minuteman missiles, kickstarting Silicon Valley.
  • The WELL, launched in the 1980s, was one of the first online communities, predating modern social platforms by decades.

REFERENCES

  • Unified Modeling Language (UML), co-created by Grady Booch.
  • Margaret Hamilton's work on the Apollo program.
  • NATO Conference on Software Engineering (1968).
  • Grace Hopper's contributions to compilers and early programming.
  • FORTRAN, developed for formula translation.
  • Whirlwind computer project.
  • SAGE (Semi-Automatic Ground Environment) system.
  • Viterbi algorithm, essential for cellular phones.
  • Fast Fourier Transform implementations.
  • Entity-relationship models by Peter Chen.
  • Flowcharts as thinking aids.
  • Ada programming language project.
  • Abstract data types from Barbara Liskov.
  • Information hiding from David Parnas.
  • Literate programming from Donald Knuth.
  • Simula, the first object-oriented language.
  • C++ by Bjarne Stroustrup.
  • UNIX from Bell Labs.
  • JOVIAL programming language.
  • ALGOL, influenced by formal methods.
  • Object Pascal for MacWrite and MacPaint.
  • Photoshop, inheriting early object-oriented designs.
  • SHARE, early user group for software sharing.
  • SOAP and service-oriented architectures.
  • Netscape's HTML image extensions.
  • HTTP protocols for information exchange.
  • Bezos's API mandates at Amazon.
  • AWS as a modern platform example.
  • Salesforce as a SaaS platform.
  • Y2K remediation efforts.
  • SNARC, the first neural network machine, built from vacuum tubes.
  • Lisp Machines and Thinking Machines hardware.
  • Hearsay AI system using blackboard architecture.
  • Society of Mind by Marvin Minsky.
  • Subsumption architectures by Rodney Brooks.
  • The Sciences of the Artificial by Herbert Simon.
  • Santa Fe Institute works on complexity.
  • Maintenance by Stewart Brand.
  • What the Dormouse Said book on PC history.
  • Computing the Human Experience (Grady Booch's website).
  • Victorian Engineering Connections site using D3 library.
  • D3 JavaScript library for visualizations.

HOW TO APPLY

  • Study historical abstraction shifts, like from assembly to high-level languages, to recognize patterns in current AI changes.
  • Focus on building enduring software by prioritizing systems balance over disposable prototypes.
  • Use AI agents to generate initial code for familiar patterns, then refine with human judgment for novelty.
  • Retreat to systems theory resources, such as Herbert Simon's works, when facing complexity overload.
  • Experiment with hobbyist AI tools to prototype ideas quickly, mirroring early PC creativity.
  • Assess ethical implications early: for any tracking feature, debate "should we" before "can we."
  • Integrate interdisciplinary insights, like biological architectures, into multi-agent system designs.
  • Manage software supply chains by verifying dependencies and scanning for injections like SolarWinds.
  • Shift career focus from low-level coding to ecosystem orchestration, using platforms like AWS.
  • Collaborate via open-source to reuse libraries, reducing reinvention and accelerating development.
  • Train on embodied cognition for real-world systems, ensuring AI integrates with physical constraints.
  • Monitor economic forces: evaluate if building in-house justifies costs versus SaaS adoption.
  • Leap into uncertainty by channeling imagination into previously unaffordable projects, embracing the third golden age's vibrancy.

ONE-SENTENCE TAKEAWAY

Embrace AI as the next abstraction level in software engineering's third golden age, amplifying human ingenuity over replacing it.

RECOMMENDATIONS

  • Read "The Sciences of the Artificial" by Herbert Simon to build foundational systems thinking for complex architectures.
  • Explore Santa Fe Institute publications on complexity to model scalable software ecosystems effectively.
  • Experiment with AI tools like Cursor for prototyping, but always verify outputs against core engineering principles.
  • Study historical AI winters to temper hype around current capabilities, focusing on proven scalability limits.
  • Prioritize ethical audits in design phases, especially for pervasive tracking or surveillance features.
  • Shift training toward multi-agent systems, drawing from Minsky's "Society of Mind" for collaborative intelligence.
  • Encourage non-professionals to use AI for custom scripts, broadening innovation like early PC hobbyists did.
  • Invest in embodied cognition research for hardware-software integration, preparing for robotics and IoT advances.
  • Reskill in supply chain security to counter vulnerabilities, using tools like dependency scanners routinely.
  • Leverage platforms like Salesforce or AWS to offload complexity, concentrating on unique value creation.
  • Reflect on past crises, such as the software crisis of the 1970s, to navigate AI-induced changes calmly.
  • Foster imagination by allocating time for unconstrained ideation, now feasible without coding bottlenecks.
  • Build interdisciplinary habits, incorporating neuroscience insights for more resilient, adaptive systems.
  • Advocate for open-source reuse at higher abstractions, evolving from algorithms to full ecosystem patterns.
  • Prepare for job evolution by mastering force-balancing skills, ensuring demand in human-centric roles.
  • Document and share AI-assisted prototypes publicly to contribute to community knowledge growth.

MEMO

In the shadowed glow of early computing machines, software engineering was born not as a profession but as a rebellion. Margaret Hamilton, amid the Apollo program's frenzy in the late 1960s, first uttered the term to carve out space for her code-weaving craft among hardware titans. As Grady Booch recounts in this podcast with Gergely Orosz, the field has since navigated three golden ages, each a response to the era's unyielding constraints—technical, economic, and human. Booch, the UML co-creator and IBM Chief Scientist, paints software as an elastic medium, forever rising through abstractions, from machine-tied assembly in the 1940s to today's AI-fueled symphonies.

The inaugural golden age, spanning the late 1940s to 1970s, decoupled software from hardware's iron grip, thanks to visionaries like Grace Hopper. IBM's 1960s architectures preserved code investments across evolving machines, igniting an industry focused on algorithmic wizardry for business automation and numerical crunching. Yet fringes buzzed with defense-driven daring: the SAGE network, a Cold War behemoth, slurped 20-30% of America's scant developers to forge real-time radar webs, birthing interfaces that echo in our screens today. Flowcharts and entity models tamed laughably simple complexities by modern lights, but they were Herculean then, optimizing rare mainframes amid economic imperatives.

Cracks appeared by the late 1970s, heralding the software crisis—a clamor for quality code outpacing production, with 14,000 military languages sowing Babel. The U.S. government's Ada push sought unity, while academic whispers of objects and data hiding simmered. Enter the second golden age in the 1980s: object-oriented paradigms fused data and process, conquering distributed sprawl as personal computers democratized creation. Hobbyists, buoyed by transistor floods from Minuteman missiles, tinkered in Silicon Valley garages, their counterculture zeal—think Stewart Brand's WELL—infusing tech with communal spirit. Platforms germinated, open-source roots deepened from SHARE's shared algorithms, and SOAP protocols previewed API empires like AWS.

The millennium's turn brought the third golden age, not sparked by AI but amplified by it, shifting abstractions to library-laden ecosystems. Dot-com crashes and Y2K's silent triumph—billions spent averting apocalypse—underscored invisible engineering's quiet heroism. AI's own saga, from 1950s symbolic dreams crushed by vacuum-tube neurons to 1980s rule winters, mirrors this rhythm. Now, tools like Cursor automate patterns, freeing minds for systems orchestration amid security snarls and ethical quandaries: Can we track every step? Should we?

Booch dismisses doomsayers like Anthropic's Dario Amodei, whose 12-month automation prophecy he brands "utter" nonsense. Coding, mere mechanism, yields to judgment balancing forces AI ignores—human, economic, moral. English becomes a viable programming tongue for well-trodden paths, shrinking idea-to-artifact gaps, yet novel frontiers demand embodied cognition, drawing from cockroach brains and Mars rovers. Developers' dread echoes assembly coders' laments at FORTRAN's dawn; history rhymes, rewarding adapters.

To thrive, Booch urges foundations: Simon's artificial sciences, Santa Fe complexity tomes, Minsky's agent societies. Skills obsolete in tedium—pipeline drudgery, simple apps—pave way for rarity: complexity wranglers. Non-experts scripting via ChatGPT herald a renaissance, echoing PC artists, injecting wild ideas. Imagination, once chained, now soars unconstrained.

This era's vibrancy lies in peril's cusp: leap or cower. Software, woven on war's loom yet blooming in civilian fabric, beckons the bold. As Booch's HAL replica looms—a Damoclean reminder—engineers must wield AI not as replacement, but as wing to unprecedented heights, ensuring humanity's code endures.
