English · 01:09:45 Sep 16, 2025 3:09 PM
What If AI Makes Us Less Productive? (The Study Sam Altman Doesn’t Want You to See)
SUMMARY
Cal Newport, in Deep Questions podcast episode 370, analyzes a METR study showing AI tools slowing experienced programmers' productivity during deep work, explores AI's environmental impact, and debunks claims that Wi-Fi absence dumbs down West Virginia students.
STATEMENTS
- A recent METR study from July 2025 unexpectedly found that AI tools made experienced open-source developers 20% slower on real coding tasks.
- The study recruited 16 developers from large repositories to work on actual issues, randomly assigning AI use or non-use.
- Developers using AI primarily chose Cursor Pro with Claude 3.5 or 3.7 Sonnet models.
- Economic and machine learning experts predicted a 40% productivity boost from AI in programming.
- Developers self-reported 20-30% productivity gains while using AI, but measurements showed the opposite.
- Programming tasks in the study required deep work: cognitively demanding focus without distraction.
- Deep work involves intense, undistracted attention to complex problems, central to valuable knowledge work.
- AI integration created "cybernetic collaboration," an interactive back-and-forth with the tool for code generation and review.
- Developers using AI spent less time coding and more reviewing outputs, prompting, waiting, and idling.
- Collaborative deep work succeeds by amplifying focus intensity and duration, like the "whiteboard effect" with human partners.
- Cybernetic collaboration reduces focus intensity, providing breaks and offloading effort, making work feel easier but slower.
- AI's small errors require extensive human review, often taking more time than original coding.
- Deep work productivity hinges on focus intensity times duration; anything diluting it harms output.
- AI may automate shallow tasks effectively but hinders deep work through fragmented collaboration.
- Programmers found AI-assisted work more pleasant due to breaks, but it lowered overall efficiency.
- The study counters assumptions that AI inherently boosts programming productivity.
- Active recall should revisit unused material within two weeks, before the information is forgotten.
- Using learned material cements memory better than isolated recall.
- AI queries like ChatGPT consume far more energy than Google searches due to massive model sharding across GPUs.
- Environmental critiques of AI often stem from anti-tech biases rather than substantive sustainability concerns.
- Future AI will likely use smaller, specialized models running on local devices, reducing environmental impact.
- Large frontier models are like F1 cars: showcases, not practical consumer tech.
- Leaving academia would not make Cal Newport antsy, as writing and podcasting already dominate his work.
- Cal's academic role integrates tech ethics, writing, and public outreach seamlessly.
- Lifestyle-centric planning requires optimizing all life aspects, not single radical changes.
- Evidence-based career planning involves assessing real job needs and building targeted skills.
- Reviving old career capital through ultralearning can lead to higher pay and flexibility.
- Wi-Fi absence in Green Bank, West Virginia, due to a radio telescope, limits school tech use.
- Pocahontas County schools show no clear evidence that Wi-Fi lack caused performance drops compared to similar counties.
- Math and reading scores in Pocahontas declined mid-2010s, but similar Wi-Fi-enabled counties fared worse.
IDEAS
- AI's productivity promise falters in deep work because interactive collaboration fragments human focus, turning intense effort into a slower, intermittent loop.
- Conventional wisdom overlooks how AI's "helpfulness" creates hidden costs: waiting periods and error-checking that erode the sustained cognition needed for breakthroughs.
- Deep work isn't solitary by necessity; human collaboration sharpens focus through social accountability, unlike AI's diluting breaks.
- The "whiteboard effect" reveals collaboration's true power: it enforces deeper immersion, not task division, explaining why AI offloading backfires.
- Programmers' self-perceived productivity gains from AI mask a seductive ease that trades quality and speed for comfort.
- Environmental AI debates distract from core issues, serving as a safe critique zone for tech skeptics avoiding technical nuances.
- Massive AI models are economically unsustainable, predicting a shift to efficient, task-specific systems that mimic human cognition without excess power.
- Radical life changes, like quitting for a nature job, often amplify one desire while neglecting holistic lifestyle balance, leading to unintended regrets.
- Career revival through deliberate skill-building demonstrates how past capital can fund intentional freedoms, like reduced hours for personal passions.
- School tech mandates assume connectivity equals progress, but data muddies this, suggesting overreliance on digital tools may not drive learning gains.
- Active recall's timing hinges on usage: frequent application embeds knowledge naturally, while rote review demands weekly touchpoints to combat decay.
- ChatGPT as a project notebook risks embedding cybernetic habits, blurring helpful tracking with focus-undermining interactions.
- AI's environmental footprint correlates with cost; unprofitable energy hogs won't dominate, easing long-term planetary concerns.
- Academia and public intellectualism overlap when ethics and tech intersect, making transitions seamless for hybrid thinkers.
- Small-sample studies like METR's signal broader truths: AI's current form may amplify fool's gold in knowledge work.
- Lifestyle planning thrives on evidence, not fantasy—real job markets guide skill acquisition for sustainable joy.
- West Virginia's Green Bank natural experiment challenges edtech evangelism, as Wi-Fi-free zones don't inevitably lag peers.
- Focus intensity remains the irreplaceable engine of value creation, rendering AI a potential saboteur in cognitive domains.
- Poker AI like Pluribus shows hybrid intelligence—small nets plus logic—outperforms bloated models, hinting at leaner futures.
- Teacher complaints about tech limits reflect adaptation biases, not proven causation in student outcomes.
INSIGHTS
- AI's interactive use in deep work creates a paradox where perceived ease masks reduced cognitive gear, slowing innovation by fragmenting the mind's peak performance.
- True collaboration amplifies human focus through mutual pressure, while AI's version offloads effort at the cost of depth, revealing machines as poor substitutes for intellectual synergy.
- Environmental fears around AI are often proxies for cultural unease, but market forces will prune inefficient giants, favoring compact intelligence that aligns with sustainable tech evolution.
- Radical pursuits like nature immersion succeed only when balanced across life's pillars; isolated changes invite imbalance, underscoring holistic design for enduring fulfillment.
- Data-driven career pivots, blending past strengths with targeted upskilling, unlock leverage for autonomy, proving evidence trumps impulse in professional reinvention.
- Edtech's promise of personalization falters without rigorous controls; intuitive leaps from correlation to causation, as in Wi-Fi debates, highlight the need for skeptical scrutiny.
- Memory retention flourishes through application over rote repetition, teaching that knowledge integration, not isolation, forges lasting neural pathways.
- Project tools like AI chats tempt escape from cognitive strain, but sustaining intensity—via self-narratives—preserves the raw power of unaided focus.
- Frontier AI models dazzle as prototypes, yet practical futures lie in modular, device-native systems, mirroring how poker bots evolved from supercomputers to laptops.
- Academic and creative pursuits converge in ethics-focused inquiry, allowing seamless identity shifts without the void of specialization silos.
- Productivity illusions from AI arise from pleasure's deception; deep work's discomfort is its virtue, demanding we guard against tools that soften essential rigor.
- County-level data exposes edtech myths: Wi-Fi equity doesn't guarantee equity in outcomes, urging reevaluation of digital dependency in education.
- Skill revival via ultralearning transforms stagnation into opportunity, illustrating how deliberate effort recaptures lost capital for a tailored life rhythm.
- Critiques of AI often retreat to familiar terrains like ecology, evading the discomfort of engaging its algorithmic core, a tactic that delays substantive discourse.
- Focus as a formula—intensity multiplied by duration—exposes cybernetic collaboration's flaw: breaks dilute output, affirming the brain's supremacy in value generation.
QUOTES
- "The observed result is on average they were about 20% slower than the people not using AI."
- "When allowed to use AI, developers spend a smaller proportion of their time actively coding and reading, searching for information. Instead, they spend time reviewing AI outputs, prompting AI systems, and waiting for AI generations."
- "The reason why collaboration helps with deep work is that you use the presence of other people to increase the intensity and duration of your focus."
- "Cybernetic collaboration means much less intensity of focus, much less duration of focus. It takes less energy. It feels nicer, but that's why they're slower."
- "Even the most advanced systems make small mistakes or slightly misunderstand directions, requiring a human to carefully review their work and make changes where needed."
- "Deep work rewards intensity of focus. And if you add anything into your workflow that's going to reduce this intensity, you'll probably get less productive."
- "If querying one of these models is bad for the environment, that means the amount of computation it's using can't possibly be profitable."
- "I think the future of AI is going to be it has to be systems that have much smaller models, machine native models if possible. Meaning like this thing is running on my iPad, not in the cloud somewhere on a bunch of GPUs."
- "Your daily subjective mood is not the result of a single decision or change but of all of the relevant aspects of your life."
- "There's a nice subtle leap from the schools without Wi-Fi are worse to the schools without Wi-Fi are worse because they don't have Wi-Fi."
- "The focused mind is what produces value."
- "Hard is sometimes what you need."
- "Be wary of things that get in the way of that [the brain focusing hard]."
- "I just don't think this industry can survive in a mode that is super bad for the environment. It just costs too much money."
HABITS
- End writing or project sessions with brief narrative notes outlining next steps to enable quick restarts.
- Use active recall weekly for unused memorized information to prevent forgetting within two weeks.
- Integrate learned concepts into real applications or teaching to naturally reinforce memory.
- Conduct evidence-based planning by reviewing actual job listings to identify and target specific skill gaps.
- Employ ultralearning techniques, like focused deep work sessions, to rebuild outdated career skills efficiently.
- Block time for collaborative deep work with human partners to leverage the whiteboard effect for intensified focus.
- Maintain digital minimalism by avoiding ongoing AI chats that foster fragmented, low-intensity interactions.
- Schedule intentional nature time, like Fridays off, after building career capital for reduced hours.
- Coach multiple youth sports teams to balance intellectual work with community involvement.
- Rotate through high-quality bedding sets for optimal rest, prioritizing comfort in daily routines.
- Check in daily with an online fitness coach for accountability in nutrition and exercise consistency.
- Submit voice memos for podcast questions to practice concise articulation of personal challenges.
FACTS
- METR, a nonprofit evaluating AI, produced the July 2025 study on developer productivity using real open-source tasks.
- Claude 3.5 and 3.7 Sonnet were the leading AI models during the study's timeframe.
- Pocahontas County, West Virginia, spans large geography but has only a handful of small schools, including Green Bank Elementary-Middle with about 200 students.
- The Green Bank Telescope, the world's largest steerable radio telescope, prohibits Wi-Fi and cell signals to avoid interference.
- Chromebooks and online curricula proliferated in U.S. schools during the 2010s, following the iPad's 2010 introduction.
- Pluribus, Noam Brown's AI, beat professional poker players using a laptop-based hybrid of small neural nets and symbolic logic, not massive supercomputers.
- West Virginia's math scores in Pocahontas County improved from 2009 to 2017 before declining, mirroring but not exceeding state trends.
- Similar socioeconomic counties in West Virginia saw steeper math score drops (2019-2022) than Pocahontas despite full Wi-Fi access.
- Cal Newport's Deep Work book, published in 2016, approaches its 10-year anniversary in January 2026.
- Georgetown University's computer science, ethics, and society major is the first in the U.S. to integrate the fields.
REFERENCES
- METR July 2025 report: "Measuring the Impact of Early 2025 AI on Experienced Open-Source Developer Productivity."
- Cal Newport's book "Deep Work" (2016).
- Cal Newport's book "Slow Productivity."
- Cal Newport's book "So Good They Can't Ignore You."
- Scott Young's book "Ultralearning."
- Atlantic article by Rohit Krishnan: "Just How Bad Would an AI Bubble Be?"
- Washington Post op-ed on Green Bank, West Virginia schools (cited via Marginal Revolution).
- Tyler Cowen's Marginal Revolution blog post on the Green Bank Wi-Fi issue.
- Education Recovery Scorecard data on West Virginia county test scores (2009-2024).
- Noam Brown's Pluribus AI for Texas Hold'em poker.
- Cal Newport's New Yorker writing on technology impacts.
- Deep Questions podcast episode 367: "What if AI Doesn't Get Much Better Than This?"
- Cozy Earth Bamboo Joggers and Everyday Pants.
- BetterHelp online therapy platform.
- Shopify point-of-sale system for small businesses.
- MyBodyTutor online coaching program.
HOW TO APPLY
- Recruit experienced professionals for real tasks and randomly assign AI use to measure true productivity impacts.
- Define deep work sessions with strict no-distraction rules to maximize focus intensity on cognitive demands.
- Pair with human collaborators at a whiteboard to harness social pressure for longer, deeper concentration.
- Prompt AI sparingly for shallow tasks like information lookup, avoiding interactive code generation loops.
- Log session endings with concise narratives: note progress, gaps, and next actions for seamless restarts.
- Schedule active recall within two weeks for unused knowledge, escalating to weekly if retention fades.
- Audit job markets for skill requirements, then dedicate focused blocks to acquire them via targeted practice.
- Prototype lifestyle visions by mapping all daily elements—work, family, leisure—and adjust for balance.
- Experiment with small AI models locally on devices to minimize environmental and cost overheads.
- Compare county-level education data over time to validate tech interventions against peers.
- Integrate ethics into tech curricula by blending computational and societal analysis in coursework.
- Build hybrid AI systems: combine neural networks with symbolic logic for efficient, task-specific intelligence.
- Revive dormant skills by immersing in modern tools and real projects, tracking progress weekly.
- Foster accountability in fitness by daily app check-ins with a coach for habit consistency.
- Critique tech claims by pulling time-series data and similar-case comparisons to test causation.
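The measurement recipe in the first bullet (recruit professionals for real tasks, randomize AI use, compare completion times) can be sketched in a few lines of Python. The helper names and numbers below are illustrative assumptions, not the METR team's actual tooling.

```python
import random
import statistics

def assign_conditions(task_ids, seed=0):
    """Independently randomize each real task to an 'ai' or 'no_ai'
    condition, mirroring the study's per-issue assignment."""
    rng = random.Random(seed)
    return {t: rng.choice(["ai", "no_ai"]) for t in task_ids}

def slowdown_ratio(times_by_condition):
    """Mean AI-condition completion time over mean no-AI time.
    A value above 1.0 means AI use was slower; the study observed
    roughly 1.2 (about 20% slower)."""
    ai = statistics.mean(times_by_condition["ai"])
    no_ai = statistics.mean(times_by_condition["no_ai"])
    return ai / no_ai

assignments = assign_conditions(range(8))
```

The key design point, per the episode, is randomizing at the task level rather than trusting developers' self-reports, which pointed in the opposite direction from the measured result.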
ONE-SENTENCE TAKEAWAY
AI's cybernetic collaboration often hinders deep work productivity by diluting focus intensity.
RECOMMENDATIONS
- Prioritize solo or human-collaborative deep work over AI interactions to sustain peak cognitive output.
- Use AI exclusively for automating routine, shallow tasks to free time for high-value focus.
- Design project notes as self-contained narratives in primary documents, avoiding AI chat dependencies.
- Revisit active recall materials biweekly, prioritizing application in real scenarios for retention.
- Skeptically assess AI's environmental claims, favoring efficient local models over cloud giants.
- Plan career shifts with evidence from job listings, rebuilding skills through deliberate ultralearning.
- Construct holistic lifestyles by evaluating all facets—work, family, recreation—before radical moves.
- Scrutinize edtech benefits with comparative data, questioning intuitive links like Wi-Fi to performance.
- Embrace hybrid AI architectures, blending small nets and logic for practical, low-impact intelligence.
- Integrate writing and public outreach into professional roles for seamless intellectual fulfillment.
- Guard against productivity illusions by measuring outputs, not feelings, in AI-assisted workflows.
- Cultivate focus through discomfort, resisting tools that offer breaks at the expense of intensity.
- Balance nature pursuits with stable careers by leveraging capital for flexible, intentional time off.
- Engage therapy via platforms like BetterHelp to strengthen mind relationships amid tech pressures.
- Adopt time-blocking planners to structure deep work amid daily distractions.
MEMO
In a surprising twist to the AI hype, a July 2025 METR study revealed that generative tools like Claude actually slowed experienced open-source developers by 20% on real coding tasks. Cal Newport, unpacking this in his Deep Questions podcast, highlights the experiment's rigor: 16 veteran programmers tackled real bug fixes and features, with AI use randomly permitted or barred for each task. Expectations ran high—economists and AI experts forecasted 40% gains, and developers felt 20-30% faster—but reality showed AI users lingering in loops of prompting, waiting, and reviewing flawed outputs. This "cybernetic collaboration," as Newport dubs it, fragments the intense focus deep work demands, turning solitary cognitive marathons into intermittent chats that feel easier yet yield slower results.
Deep work, Newport's cornerstone concept from his 2016 book, thrives on undistracted immersion in complex problems—the engine of knowledge economy value. Programming epitomizes this: crafting code from scratch requires sustained intensity, not the back-and-forth with a machine prone to subtle errors. Human collaboration, by contrast, sharpens focus via the "whiteboard effect"—social stakes keep minds locked in longer and deeper, as Newport experienced in his theoretical computer science days at MIT and Georgetown. AI's allure lies in offloading strain, offering breaks while models generate tokens, but this dilutes the very intensity that accelerates breakthroughs. The study notes programmers idled more and coded less, enjoying the process yet producing at a crawl, underscoring a paradox: tools meant to empower often sabotage by making hard thinking too palatable.
Newport warns this trap extends beyond code; knowledge workers everywhere risk similar productivity pitfalls. AI shines for shallow automation—quick searches or form-filling—but falters in core creation, where human brains remain unmatched. He predicts future wins from specialized, device-local models, not trillion-parameter behemoths sharded across GPUs. On the environment, Newport dismisses alarmism as a comfortable critique for tech skeptics, arguing that massive models' inefficiency dooms them economically: no industry can sustain electricity and cooling costs that dwarf its revenue. Examples like Noam Brown's Pluribus poker AI, which beat pros on a laptop via lean hybrids of nets and logic, signal a pragmatic path forward, free from F1-style extravagance.
Listener queries illuminate applications. One ponders ChatGPT as a project notebook—handy for tracking, Newport concedes, but beware embedding cybernetic habits that erode focus; stick to inline notes for swift restarts. Active recall timing? Revisit unused facts within two weeks to combat decay, and apply knowledge actively for permanence. AI's carbon footprint dwarfs a Google search's, yet market forces will cull the wasteful, favoring iPad-native intelligence. If he quit academia, Newport sees no void—his writing, podcasting, and ethics teaching already blend seamlessly, enriched by student energy.
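The recall-timing rule above can be written down directly. A hypothetical sketch (the helper and its intervals are assumptions drawn from the episode's rule of thumb, not any real app's API):

```python
from datetime import date, timedelta

def next_review(last_review, used_in_practice):
    """Schedule the next active-recall touchpoint. Material applied in
    real work reinforces itself, so its review can wait until the
    two-week forgetting horizon; unused material gets a weekly
    review to stay ahead of decay."""
    days = 14 if used_in_practice else 7
    return last_review + timedelta(days=days)
```

Usage follows the episode's advice: knowledge you are actively using needs little scheduled review, while isolated facts need deliberate touchpoints before the two-week mark.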
A case study from listener Vin exemplifies lifestyle-centric planning. Ditching web development for a Norwegian farm, and later nature guiding, thrilled him at first but crumbled under family strain and long commutes. Drawing on Newport's books and Scott Young's Ultralearning, Vin grounded his pivot in evidence: after scanning job listings, he upskilled in modern coding over a year, doubling his pay and angling for four-day weeks to savor nature without the chaos. This beats impulsive leaps, optimizing work, home, and passions holistically for sustained joy.
Skepticism peaks in Newport's data dive on Green Bank, West Virginia—a Wi-Fi-free enclave shielding the world's largest radio telescope. A Washington Post op-ed blamed connectivity voids for lagging student scores, citing Chromebook limits and teacher frustrations. Yet county-level trends from the Education Recovery Scorecard tell murkier tales: Pocahontas County's math and reading dips mid-2010s aligned with edtech's rise but weren't steeper than Wi-Fi-rich peers. Similar demographic counties fared worse post-2019, suggesting the school's woes stem from deeper issues, not absent internet. Newport urges caution against intuitive tech panaceas, echoing debates on phones in schools where harms outweigh gains for young minds.
Ultimately, Newport champions the brain's raw power: focus, unadulterated, forges value AI can't yet touch. As grand AI promises dim, practical wisdom prevails—wield tools judiciously, plan lives wholly, and question data-driven narratives. In an era of seductive distractions, staying deep means embracing the grind that truly advances us.