
#152 – Joe Carlsmith on navigating serious philosophical confusion

From 80,000 Hours Podcast

Length: 207 minutes
Released: May 19, 2023
Format: Podcast episode

Description

What is the nature of the universe? How do we make decisions correctly? What differentiates right actions from wrong ones? Such fundamental questions have been the subject of philosophical and theological debates for millennia. But, as we all know, and surveys of expert opinion make clear, we are very far from agreement. So... with these most basic questions unresolved, what's a species to do?

In today's episode, philosopher Joe Carlsmith — Senior Research Analyst at Open Philanthropy — makes the case that many current debates in philosophy ought to leave us confused and humbled. These are themes he discusses in his PhD thesis, A stranger priority? Topics at the outer reaches of effective altruism.

Links to learn more, summary, and full transcript.

To help transmit the disorientation he thinks is appropriate, Joe presents three disconcerting theories — originating from him and his peers — that challenge humanity's self-assured understanding of the world.

The first idea is that we might be living in a computer simulation, because, in the classic formulation, if most civilisations go on to run many computer simulations of their past history, then most beings who perceive themselves as living in such a history must themselves be in computer simulations. Joe prefers a somewhat different way of making the point, but, having looked into it, he hasn't identified any particular rebuttal to this 'simulation argument.' If true, it could revolutionise our comprehension of the universe and the way we ought to live...

The other two ideas are cut for length — click here to read the full post.

These are just three particular instances of a much broader set of ideas that some have dubbed the "train to crazy town." Basically, if you commit to always taking philosophy and arguments seriously, and try to act on them, it can lead to what seem like some pretty crazy and impractical places. So what should we do with this buffet of plausible-sounding but bewildering arguments?

Joe and Rob discuss to what extent this should prompt us to pay less attention to philosophy, and how we as individuals can cope psychologically with feeling out of our depth just trying to make the most basic sense of the world.

In today's challenging conversation, Joe and Rob discuss all of the above, as well as:
What Joe doesn't like about the drowning child thought experiment
An alternative thought experiment about helping a stranger that might better highlight our intrinsic desire to help others
What Joe doesn't like about the expression “the train to crazy town”
Whether Elon Musk should place a higher probability on living in a simulation than most other people
Whether the deterministic twin prisoner’s dilemma, if fully appreciated, gives us an extra reason to keep promises
To what extent learning to doubt our own judgement about difficult questions -- so-called “epistemic learned helplessness” -- is a good thing
How strong the case is that advanced AI will engage in generalised power-seeking behaviour
Get this episode by subscribing to our podcast on the world's most pressing problems and how to solve them: type '80,000 Hours' into your podcasting app. Or read the transcript below.

Producer: Keiran Harris
Audio mastering: Milo McGuire and Ben Cordell
Transcriptions: Katy Moore

Titles in the series (100)

Unusually in-depth conversations about the world's most pressing problems and what you can do to solve them. Subscribe by searching for '80,000 Hours' wherever you get podcasts. Produced by Keiran Harris. Hosted by Rob Wiblin, Head of Research at 80,000 Hours.