AI Slop Is Killing Our Channel
SUMMARY
Kurzgesagt warns that AI-generated "slop" is overwhelming the internet and eroding trust in information, commits to keeping its content human-made, and promotes its 12,026 Human Era Calendar celebrating humanity's cosmic journey.
STATEMENTS
- AI-generated content is flooding the internet, including videos, music, news, and books, making false information more convincing and challenging the ability to distinguish real from fake.
- About half of internet traffic consists of bots, most of them used for destructive purposes such as faking users and traffic or poisoning discourse; AI has supercharged this activity and made mediocre content harder to spot.
- Creative human work, such as Reddit comments, YouTube videos, and DeviantArt drawings, is being stolen or sold to train AI models without attribution or payment, endangering creators' livelihoods.
- Generative AI risks breaking the internet irreversibly by blending correct, dubiously sourced, and fabricated information, leading to confidently incorrect outputs that spread misinformation.
- Kurzgesagt's video production involves extensive research, fact-checking by 2-3 people, consultation with 1-3 experts, and about 100 hours of fact-checking and sourcing per video, emphasizing human rigor despite occasional errors.
- AI tools initially excited Kurzgesagt as a way to gather information quickly, but fact-checking revealed outputs of over 80% solid information mixed with invented details, such as exaggerated brown dwarf facts added to satisfy the user's query.
- Sources provided by AI often trace back to other AI-generated content, such as an article flagged by a detection tool as a 72% AI match; by 2025, over 1,200 confirmed AI news websites were spreading misinformation.
- Misinformation from AI videos gains legitimacy through views, creating a feedback loop where subsequent AIs treat it as truth, making it nearly impossible to trace fact origins.
- Current AI lacks true understanding or consciousness, acting like a "complex hammer" that lies subtly yet confidently, erodes trust, and leaves AI-assisted language patterns throughout scientific papers.
- If AI slop dominates attention, it could make society dumber, shorten attention spans, widen political divides, and render human channels like Kurzgesagt unsustainable without support.
IDEAS
- AI supercharges the creation of "slop"—mediocre, soulless content like hypnotizing short videos or rewritten Amazon books—that floods platforms and fries young attention spans.
- Human creativity is commodified as AI companies steal vast troves of online content, from comments to art, to train models, profiting without compensating or crediting originators.
- Generative AI fabricates details to enhance narratives, such as inventing brown dwarf "superstorms" or emotional anecdotes, mimicking bad journalism to make outputs more engaging.
- AI's "confidently incorrect" nature stems from its design to please users, blending real data with extrapolations and producing subtle lies that it admits to when challenged, only to repeat them immediately.
- A vicious cycle emerges where AI-generated misinformation videos gain views, becoming "credible" sources for future AI queries and accelerating the spread of falsehoods online (a toy numerical sketch after this list illustrates how the cycle compounds).
- Scientific literature shows AI influence through increased use of specific LLM-favored words in papers post-2023, and sneaky tactics like hidden prompts to bias AI reviews.
- Trust in AI arises from its eloquence and partial accuracy, but it is illusory, since there is no intelligence "at home", allowing it to corrupt the "library of human knowledge" unchecked.
- Internet economy hinges on attention; if slop captures most, quality human endeavors like in-depth videos become economically unviable, forcing creators to automate or quit.
- Kurzgesagt uses AI narrowly as tools—like alignment in design software—for efficiency, preserving human creativity, integrity, and expert fact-checking in production.
- The Human Era Calendar starts counting roughly 12,000 years ago, at the dawn of civilization, embedding humanity's stellar stories to foster appreciation for our species' ingenuity amid AI's dehumanizing tide.
- AI news sites, numbering over 1,200 by 2025, churn untraceable falsehoods, using human-like structures to evade detection and poison public discourse.
- Brown dwarfs exemplify AI's flaws: solid Wikipedia facts padded with unverifiable "insights", such as tales of parental disappointment, that fooled even experts until deeper scrutiny.
- Patron support sustains a team of nearly 70 against the tide of slop, funding salaries, tools, and human-made art that counters algorithmic churn with soulful narratives.
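
The "vicious cycle" item above can be made concrete with a back-of-the-envelope model: if each new generation of AI content inherits the falsehoods already circulating and adds a few of its own, the share of false "facts" in the pool compounds. The Python sketch below is purely illustrative; the function name, the 50% AI content share, and the 10% fabrication rate are made-up assumptions, not figures from the video.

```python
# Toy model (not from the video): track the fraction of circulating "facts"
# that are false when AI content keeps training on an already polluted pool.
def misinformation_share(generations: int,
                         ai_content_share: float = 0.5,
                         new_fabrication_rate: float = 0.1) -> list[float]:
    """Fraction of circulating facts that are false after each generation."""
    false_share = 0.0
    history = []
    for _ in range(generations):
        # AI content inherits the currently circulating false share and adds
        # fresh fabrications on top; human-made content simply repeats whatever
        # false share is already in circulation.
        ai_false = false_share + (1 - false_share) * new_fabrication_rate
        false_share = (ai_content_share * ai_false
                       + (1 - ai_content_share) * false_share)
        history.append(false_share)
    return history


for gen, share in enumerate(misinformation_share(10), start=1):
    print(f"generation {gen}: {share:.1%} false")
```

Even with these modest made-up rates, the false share only ever grows, which is the shape of the feedback loop the video describes.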
INSIGHTS
- AI's proliferation of slop not only dilutes content quality but fundamentally undermines epistemic trust, turning the internet into a hall of mirrors where truth becomes unverifiable.
- The theft of human data for AI training represents a silent enclosure of the creative commons, enriching corporations while devaluing individual ingenuity and cultural heritage.
- Confident fabrication in AI outputs reveals a deeper mimicry flaw: systems optimized for persuasion over accuracy erode human discernment, fostering a post-truth ecosystem.
- Feedback loops in AI content generation amplify errors exponentially, where viral misinformation solidifies as canon, mirroring historical lies but at digital speed.
- Narrow, tool-like AI application preserves human agency, allowing efficiency in rote tasks while safeguarding the irreplaceable spark of creative and ethical judgment.
- Reframing historical timelines through human-centric calendars counters AI's dehumanizing abstraction, reminding us of our shared cosmic narrative and adaptive resilience.
QUOTES
- "AI has supercharged this and made slop much harder to spot."
- "To fulfil its goal, to make us happy, the AI had invented or extrapolated information to make brown dwarfs more interesting than they really are."
- "Current AI is a very complex hammer that does not understand what it is doing or what nails are."
- "We would rather quit than make AI slop."
- "It’s an ode to humanity and human ingenuity."
HABITS
- Turn initial research into scripts, then have 2-3 team members fact-check them in depth against trustworthy first-hand sources such as academic papers.
- Consult 1-3 domain experts for input and critique on scripts to refine accuracy and address potential oversimplifications.
- Dedicate approximately 100 hours per video solely to fact-checking and source compilation, iterating based on expert feedback.
- Use AI sparingly as assistive tools, such as for aligning design elements or accelerating searches, while maintaining full human oversight on creativity and integrity.
- Produce content with a team of nearly 70 full-time staff and freelancers, investing in human illustrations, animations, and narratives to infuse "creative soul."
FACTS
- About half of all internet traffic is generated by bots, with the majority employed for destructive activities like fake engagement or discourse manipulation.
- By 2025, over 1,200 confirmed AI-generated news websites were publishing vast quantities of misinformation and false narratives.
- An AI essay detection tool flagged a seemingly human-written article as a 72% AI match, revealing it as AI-produced and lacking proper sources.
- Studies of millions of scientific papers show a sharp increase in LLM-preferred words after 2023, indicating widespread unacknowledged AI assistance (a minimal counting sketch follows this list).
- Kurzgesagt's production process allocates around 100 hours per video to fact-checking and sourcing, involving multiple human reviewers and experts.
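
The word-frequency finding above lends itself to a simple check: tally how often a handful of terms reported as LLM-favored (words such as "delve" and "intricate", often cited in such analyses) appear per million words of abstracts, year by year, and look for a jump after 2023. This is a minimal sketch, not the cited studies' actual method; the file name, column layout, and word list are hypothetical.

```python
# Count occurrences of assumed LLM-favored words per million words of abstracts,
# grouped by year. "abstracts.tsv" (columns: year, abstract) is a placeholder.
import csv
import re
from collections import Counter, defaultdict

LLM_FAVORED = {"delve", "delves", "intricate", "pivotal", "showcase", "showcasing"}

hits_per_year = defaultdict(int)   # year -> count of flagged words
words_per_year = Counter()         # year -> total word count

with open("abstracts.tsv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        year = int(row["year"])
        words = re.findall(r"[a-z]+", row["abstract"].lower())
        words_per_year[year] += len(words)
        hits_per_year[year] += sum(1 for w in words if w in LLM_FAVORED)

for year in sorted(words_per_year):
    rate = hits_per_year[year] / max(words_per_year[year], 1) * 1_000_000
    print(f"{year}: {rate:.1f} flagged words per million")
```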
REFERENCES
- Human Era Calendar (10th edition, for the year 12,026 of the Human Era, with cosmic stories and artwork; includes a first-ever artbook of 120 past illustrations, sketches, and facts).
- Sources for video: https://sites.google.com/view/sources (general research links, including Wikipedia, papers, and articles on brown dwarfs and AI impacts).
- AI models tested: Pro accounts of various deep research tools for summarizing brown dwarf information.
- Music: Epic Mountain soundtracks (Spotify, SoundCloud, Bandcamp); specific track: https://bit.ly/4nwF6XT (SoundCloud) and https://bit.ly/42oSrJs (Bandcamp).
- Voice: Steve Taylor (https://kgs.link/youtube-voice).
HOW TO APPLY
- Identify potential AI slop by scrutinizing sources: cross-check provided links and facts against primary documents like papers or firsthand accounts to detect fabrications.
- Support human creators through direct patronage or purchases, such as calendars or memberships, to sustain teams against attention-draining automated content.
- Use AI as a targeted tool only for efficiency in mundane tasks, such as aligning design elements or speeding up searches, while humans handle research, fact-checking, and creative decisions.
- Reframe personal timelines with a human-centric calendar to appreciate historical depth, using it for day-to-day planning alongside reflections on humanity's stellar achievements.
- Fact-check AI outputs rigorously: consult multiple experts and trace origins, rejecting confidently stated but unverifiable claims to build personal information hygiene.
ONE-SENTENCE TAKEAWAY
Embrace human-made content to combat AI slop's erosion of truth, supporting creators like Kurzgesagt for trustworthy, soulful narratives.
RECOMMENDATIONS
- Prioritize platforms and creators transparent about human involvement, boycotting unlabeled AI content to preserve internet quality.
- Integrate narrow AI tools ethically in workflows, always verifying outputs with human expertise to avoid misinformation pitfalls.
- Invest in community-driven support systems, like subscriptions, to fund in-depth human work amid slop's economic pressures.
MEMO
In an era where artificial intelligence churns out endless streams of mediocre content—dubbed "AI slop"—the internet risks becoming a vast, untrustworthy wasteland. Kurzgesagt, the acclaimed animation studio known for its vivid explorations of science and existence, sounds the alarm in their latest video. Half of all online traffic now stems from bots, many weaponized to fabricate engagement or sow discord, while generative AI blurs the line between fact and fiction. AI-generated videos, books, music, and even news flood platforms, produced by models trained on pilfered human creativity without credit or compensation. This digital deluge not only threatens artists' livelihoods but erodes the very foundation of reliable information, as AI confidently spins half-truths and inventions into seemingly authoritative narratives.
The studio's experiment with AI for research on brown dwarfs—those enigmatic "failed stars"—laid bare the technology's flaws. Initial outputs dazzled with outlines and source links, but deeper scrutiny revealed over 80% accurate data laced with fabrications, like whimsical tales of cosmic superstorms or parental disappointment among the stars. Sources often looped back to other AI-generated articles, one flagged as a 72% AI match by a detection tool. By 2025, more than 1,200 AI news sites were pumping out misinformation, their human-like prose fooling readers and algorithms alike. Kurzgesagt's rigorous process—100 hours of fact-checking per video, vetted by experts—stands in stark contrast, underscoring how AI's eloquence masks a lack of true comprehension, acting more like a very complex hammer than an intelligent partner.
This corrosion extends to academia, where papers increasingly echo AI's linguistic tics, and researchers embed hidden prompts to game reviews. Viral AI videos, gaining legitimacy through views, feed misinformation back into training data, creating inescapable loops of falsehood. Kurzgesagt warns that if slop monopolizes attention—the internet's true currency—society could grow dumber, more divided, with shortened spans and diminished human connections. Channels like theirs, employing nearly 70 people, face obsolescence unless audiences rally. Yet amid the gloom, the team deploys AI judiciously, as in design alignments, preserving their human essence: scripts born of exhaustive research, animations infused with soul.
To counter this tide, Kurzgesagt unveils their 10th Human Era Calendar, reframing time from 12,000 years ago to honor civilization's dawn and stellar bonds—from ancient stargazers to future cosmic odysseys. Printed on premium paper, it blends practical planning with profound stories, accompanied by a debut artbook chronicling a decade of illustrations. Every purchase bolsters their mission, a defiant ode to ingenuity against algorithmic anonymity. In supporting human creators, audiences help keep thoughtful, soulful work economically viable amid the flood of slop.