• Can 'Big Math' Solve for the Future? (with Terence Tao and Dawn Nakagawa)
    Jan 20 2026
    As AI floods the world with answers that merely sound right, math tethers them to the need to be actually right. New machine learning tools and collaboration platforms are pushing theoretical mathematics toward something bigger: large, open projects where progress is shared early, rabbit holes are avoided, and more people can contribute. In this episode, Terence Tao, a Fields Medal-winning mathematician at UCLA, lays out his case for “big math.” He explains what AI can do well — and where it still fails. The question isn’t whether machines can produce answers. It’s whether we can build systems, human and technical, that keep those answers tethered to truth.

    Resources Mentioned in this Episode:
    The Primes Contain Arbitrarily Long Arithmetic Progressions — Ben Green & Terence Tao (Paper, 2004)
    A Mathematician’s Apology — G. H. Hardy (Book, 1940)
    Observation of a New Particle in the Search for the Standard Model Higgs Boson With the ATLAS Detector at the LHC — The ATLAS Collaboration (Paper, 2012)
    Observation of a New Boson at a Mass of 125 GeV With the CMS Experiment at the LHC — The CMS Collaboration (Paper, 2012)

    Where to find Terence Tao:
    Mastodon: mathstodon.xyz/@tao
    Blog: terrytao.wordpress.com
    Home Page: www.math.ucla.edu/~tao/
    Bluesky: https://bsky.app/profile/teorth.bsky.social

    Show ideas and feedback? Email: futurology@berggruen.org
    Learn more about the Berggruen Institute: https://www.berggruen.org

    Follow Futurology!
    Instagram: /futurologypod
    Twitter/X: /futurologypod
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, & Jason Hoch
    Producers: Grant Slater, Alex Gardels, & Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineers: Aaron Bastinelli & Kyle Scott Wilson
    Futurology is a production of Studio B and Wavland for the Berggruen Institute in Los Angeles, California.
    1 hr 5 min
  • Why Consciousness Matters in the Age of AI (with David Chalmers and Nils Gilman)
    Jan 13 2026
    It’s extremely difficult to doubt that you’re conscious, but still nearly impossible to explain why. As AI starts to speak in a voice that feels familiar, this ancient philosophical puzzle is becoming practical. If a system can persuade us it has an inner life, what does that do to the way we decide who – or what – matters? In this episode, philosopher David Chalmers makes the case that consciousness needs to move beyond the realm of mystery. Over the past three decades, serious work on the subject has gone from fringe curiosity to an active research frontier, but the central enigma remains. As the virtual infiltrates ‘IRL,’ the line between human and machine blurs. Or maybe it never mattered at all.

    Find more episodes of Futurology:
    Subscribe to the Berggruen Institute on YouTube: https://www.youtube.com/@Berggrueninst
    All Futurology Episodes:
    Apple Podcasts: https://podcasts.apple.com/us/podcast/futurology/id1821718921
    Spotify: https://open.spotify.com/show/2I38HvHP6KlXrD5ysfygxk?si=XB2qyyGjT2ONMTd5XUKJAg&nd=1&dlsi=ac8cda6751834298

    Mentioned in this Episode:
    The Conscious Mind — David Chalmers (Book, 1996)
    Reality+: Virtual Worlds and the Problems of Philosophy — David Chalmers (Book, 2022)
    The Emperor’s New Mind — Roger Penrose (Book, 1989)
    Neurophilosophy — Patricia Churchland (Book, 1986)
    Is the Hard Problem of Consciousness Universal? — David Chalmers (Article, 2020)
    The Singularity: A Philosophical Analysis — David Chalmers (Article, 2010)
    “What Is It Like to Be a Bat?” — Thomas Nagel (Article, 1974)
    The Puzzle of Conscious Experience — David Chalmers (Article, 1995)
    Could a Large Language Model Be Conscious? — David Chalmers (Paper, 2023)
    The Meta-Problem of Consciousness — David Chalmers (Paper, 2018)
    Does Thought Require Sensory Grounding? From Pure Thinkers to Large Language Models — David J. Chalmers (Talk, 2023)

    Find David Chalmers Here:
    Website: https://consc.net/
    On X: https://x.com/davidchalmers42?lang=en

    Show ideas and feedback? Email: futurology@berggruen.org
    Learn more about the Berggruen Institute: https://www.berggruen.org

    Follow Futurology!
    Instagram: /futurologypod
    Twitter/X: /futurologypod
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, & Jason Hoch
    Producers: Grant Slater, Alex Gardels, & Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineers: Aaron Bastinelli & Kyle Scott Wilson
    Futurology is a production of Studio B and Wavland for the Berggruen Institute in Los Angeles, California.
    1 hr 30 min
  • Breaking Out of a Black-and-White World (with Brook Ziporyn and Bing Song)
    Jan 6 2026
    We live in a culture that flattens the world into yes or no. Hot takes and hard binaries promise simplicity. But complexity is leaking through the cracks. Opposites depend on each other. If you try to tease out the uncertain from the certain, you destroy the reality of the thing itself. In this episode, Taoist scholar Brook Ziporyn makes the case that Taoism and Buddhism aren’t puzzles to solve but tools for living. Reckoning with Eastern paradoxes can help us navigate the desire to end desire. Modern science has unlocked humanity's potential to see the emptiness of both the far-away universe and the vast space within the building blocks of matter. Buddhism and Taoism give us the capacity to reckon with the fact that there is "no there there."

    Resources:
    Daodejing (Tao Te Ching) — Laozi; translated and edited by Brook Ziporyn (Book, 2023)
    Zhuangzi: The Complete Writings — translated by Brook Ziporyn (Book, 2020)
    Emptiness and Omnipresence: An Essential Introduction to Tiantai Buddhism — Brook Ziporyn (Book, 2016)
    Gödel, Escher, Bach: An Eternal Golden Braid — Douglas R. Hofstadter (Book, 1979)

    Find Professor Brook Ziporyn here:
    https://divinity.uchicago.edu/directory/brook-ziporyn
    https://voices.uchicago.edu/ziporyn/

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /futurologypod
    Twitter/X: /futurologypod
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod
    YouTube: /berggrueninst

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, and Jason Hoch
    Producers: Grant Slater, Alex Gardels, and Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineers: Aaron Bastinelli & Kyle Scott Wilson
    Futurology is a production of Studio B and Wavland for the Berggruen Institute in Los Angeles, California.
    57 min
  • The Future of Sovereignty Is Closer Than You Think (with Graham Brewer and Grant Slater)
    Dec 23 2025
    The current world order seeks to make sovereignty simple. One map. One flag. One final authority. But in Indian Country, the borders break down. Tribal nations govern alongside the United States, and sovereignty overlaps in real, everyday ways. This isn’t a historical footnote. It’s the future, hiding in plain sight. In this episode, Graham Brewer – the AP’s National Correspondent covering Native lands and peoples – traces what sovereignty looks like when power overlaps and treaty promises from the 19th century adapt to the 21st. That negotiation is now playing out in the cloud, as languages are revived and culture moves onto servers. By its nature, the training of AI frontier models plunders Native wisdom, but fully opting out risks another century of invisibility.

    Resources:
    The Dawn of Everything: A New History of Humanity — David Graeber & David Wengrow (Book, 2021)
    The Cherokee Nation and the Trail of Tears — Theda Perdue & Michael D. Green (Book, 2007)
    Prison Writings: My Life Is My Sun Dance — Leonard Peltier (Book, 1999)
    Native American Graves Protection and Repatriation Act (NAGPRA) — U.S. Congress (U.S. law, 1990)
    United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP) — United Nations General Assembly (UN declaration, 2007)
    Music Modernization Act — U.S. Congress (U.S. law, 2018)
    McGirt v. Oklahoma — Supreme Court of the United States (Supreme Court case, 2020)
    Oklahoma v. Castro-Huerta — Supreme Court of the United States (Supreme Court case, 2022)
    Treaty of New Echota — Cherokee Nation and United States Government (Treaty, 1835)

    Find Graham Brewer here:
    https://apnews.com/author/graham-lee-brewer#
    https://x.com/grahambrewer

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /berggrueninst
    Twitter/X: /berggrueninst
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, and Jason Hoch
    Producers: Grant Slater, Alex Gardels, and Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineer: Aaron Bastinelli
    Futurology is a production of Studio B and Wavland.
    1 hr 37 min
  • Conjuring Art from Machine Hallucinations (with Refik Anadol and Claire Webb)
    Dec 16 2025
    For artist Refik Anadol, data is not just information. It is pigment. He feeds weather records, river flows, forests, and archives into custom AI models and treats the outputs as brushstrokes. The point is to let AI learn from our memories and then push beyond them, catching the moments when the machine’s vision glitches out and creates something truly novel. In this episode, Anadol talks with Claire Webb, the head of the Berggruen Institute’s Future Humans program, about how this collaboration has changed his sense of nature, authorship, and the edges of reality. They explore how training a model on the textures of rainforests, rivers, and archives can produce a visual language that feels both familiar and strange, and why the future of art may depend less on controlling a system than on listening to where it leads.

    Resources:
    Blade Runner — (Film, 1982)
    The Poetics of Augmented Space — Lev Manovich (Essay, 2006)
    Your Brain Hallucinates Your Conscious Reality — Anil Seth (TED Talk, April 2017)
    Large Nature Model — Refik Anadol Studio (AI Model, ongoing)

    Find Refik Anadol here:
    https://refikanadol.com/
    https://www.linkedin.com/company/refik-anadol-studio/
    https://www.instagram.com/refikanadol/?hl=en
    https://dataland.art/?utm_source
    https://refikanadolstudio.com/

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /berggrueninst
    Twitter/X: /berggrueninst
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, and Jason Hoch
    Producers: Grant Slater, Alex Gardels, and Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineer: Aaron Bastinelli
    Futurology is a production of Studio B and Wavland and distributed by Realm.
    56 min
  • The Dangers of Seeing Ourselves in Artificial Intelligence (with Anil Seth and Nils Gilman)
    Dec 9 2025
    Humans are built for pattern recognition. It is the engine behind perception, emotion, and the fragile sense of self that feels so solid from the inside. For Anil Seth, this pattern-making power explains why consciousness is not a light inside but a process the brain assembles from guesses about the world. And it matters that each of us perceives that world differently. In this episode of Futurology, Seth talks with Nils Gilman about what these differences reveal about the nature of consciousness and why they matter for the debate over artificial minds. LLMs are pattern-recognition machines of a different kind, uncanny enough to gain our sympathy, but Seth argues there is no there there. Caring for conscious AI could quickly become more than a harmless curiosity. It may turn into a zero-sum game that diminishes how we treat one another long before the machines ‘wake up’ in any meaningful way, if that is even possible at all.

    Find Anil Seth here:
    Website: https://www.anilseth.com/
    Instagram: @profanilseth https://www.instagram.com/profanilseth/
    Bluesky: @anilseth.bsky.social https://bsky.app/profile/anilseth.bsky.social
    Twitter/X: https://x.com/anilkseth

    Resources:
    An Essay Concerning Human Understanding — John Locke (Book, 1689)
    How to Change Your Mind About Psychedelics — Michael Pollan (Book, 2018)
    Ex Machina — Alex Garland (Film, 2014)
    The Perception Census — Anil Seth et al. (Online Study, ongoing)
    The Dress — Viral Internet Illusion (Internet Phenomenon, 2015)
    Müller-Lyer Illusion — Franz Carl Müller-Lyer (Visual Illusion, 1889)

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /berggrueninst
    Twitter/X: /berggrueninst
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, and Jason Hoch
    Producers: Grant Slater, Alex Gardels, and Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineer: Aaron Bastinelli
    Futurology is a production of Studio B and Wavland and distributed by Realm.
    1 hr 6 min
  • What Whales Can Teach Us About Talking to Aliens (with David Gruber and Claire Webb)
    Nov 25 2025
    We’ve spent decades beaming radio waves into space, listening for an answer. But it might be enough to start here on Earth, or more accurately, under the seas. Sperm whales live in complex clans and communicate in rapid-fire clicks. Even if we could decode their messages, is it safe to assume they want to talk to us? What, exactly, would we have to say to them? The Cetacean Translation Initiative – CETI for whales, not SETI for E.T. – is considering the implications of AI translation tools for the ocean’s depths. In this episode of Futurology, CETI founder David Gruber joins Claire Webb – the director of the Berggruen Institute's Future Humans program – to explore what it means to approach another intelligence with humility rather than conquest. In the end, creating a direct linguistic connection with another species may be yet another white whale that humanity should abandon as folly. For Gruber, the point isn’t fluency. It’s learning to speak more softly on a planet filled with minds we’ve barely begun to meet.

    Resources:
    Aglow in the Dark: The Revolutionary Science of Biofluorescence — David Gruber & Vincent Pieribone (Book, 2005)
    The Art of Translation — Vladimir Nabokov (Essay, 1941)
    Songs of the Humpback Whale — Roger Payne & Scott McVay (Scientific Article, 1970)
    Songs of the Humpback Whale — Roger Payne & Frank Watlington (Audio Recording, 1970)

    Follow David Gruber:
    @davidfgruber
    https://www.davidgruber.com/

    Follow Project CETI:
    Instagram: @ProjectCETI
    LinkedIn: Project CETI
    Twitter/X: @ProjectCETI
    YouTube: Project CETI
    TikTok: @ProjectCETI

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /berggrueninst
    Twitter/X: /berggrueninst
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Executive Producers: Nicolas Berggruen, Nathan Gardels, Nils Gilman, Dawn Nakagawa, and Jason Hoch
    Producers: Grant Slater, Alex Gardels, and Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Theme Music: Marcus Bagala
    Audio Engineer: Aaron Bastinelli
    Futurology is a production of Studio B and Wavland and distributed by Realm.
    1 hr 17 min
  • The Big Lie Behind AI (with Jaron Lanier and Grant Slater)
    Nov 18 2025
    Artificial intelligence isn’t alive. But our belief that it is may be the most dangerous illusion of all. Tech leaders talk about AI as if it thinks for itself. But that fantasy hides a more nuanced story about people, power, and profit. In this episode of Futurology, musician and technologist Jaron Lanier joins Futurology producer Grant Slater to explain why treating AI as a creature, rather than a tool, lets corporations own the work of millions and silence the humans behind the code. Lanier argues that every algorithm is built from borrowed human creativity — the songs, stories, and patterns we’ve already made. The way forward, he says, is to restore data dignity: valuing people for the music and meaning they create, instead of worshipping the machines that remix it.

    Resources:
    Who Owns the Future? — Jaron Lanier (Book, 2013)
    Dawn of the New Everything — Jaron Lanier (Book, 2017)
    Vers la flamme — Alexander Scriabin (Solo Piano Piece, 1914)
    A Blueprint for a Better Digital Society — Jaron Lanier and E. Glen Weyl (Article, 2018)
    Computing Machinery and Intelligence — Alan Turing (Article, 1950)
    Instruments of Change — Jaron Lanier (Album, 1994)
    Fantasia — Walt Disney Productions (Film, 1940)
    Snow Crash — Neal Stephenson (Novel, 1992)

    Want to share suggestions or feedback? Email futurology@berggruen.org
    Keep up to date with the Berggruen Institute at: https://www.berggruen.org

    Instagram: /berggrueninst
    Twitter/X: /berggrueninst
    Facebook: /berggrueninst
    LinkedIn: /berggrueninst
    Bluesky: /futurologypod

    Credits
    Producers: Grant Slater, Alex Gardels, Nathalia Ramos
    Associate Producer: Elissa Mardiney
    Mixing & Mastering: Aaron Bastinelli
    Theme Music: Marcus Bagala
    Special Thanks: Heather Mason, Olivia de Rienzo, Carly Migliori, Nick Goddard
    1 hr 40 min