S-risks: why they are the worst existential risks, and how to prevent them | Max Daniel

Effective altruists focussed on shaping the far future face a choice between different types of interventions. Of these, efforts to reduce the risk of human extinction have received the most attention so far. In this talk, Max Daniel will make the case that we may want to complement such work with interventions aimed at preventing very undesirable futures ("s-risks"), and that, among the sources of existential risk identified so far, this gives us a reason to focus on AI risk.
From Wikipedia: “Suffering risks, known as s-risks for short, are future events with the potential capacity to produce an astronomical amount of suffering.”
Discuss this talk on the Effective Altruism Forum: forum.effectivealtruism.org/p...

Comments: 16

  • @jacyanthis · 7 years ago

    Very important topic! Glad to see more effective altruists focusing on preventing these very bad outcomes.

  • @josiegreene6140 · 1 month ago

    I can't believe there aren't more views on this talk. Amazing speaker, and it's very heartening to see the Centre for Effective Altruism in practice.

  • @BrianTomasik · 6 years ago

    Here's the talk in written form, with pictures of the slides: foundational-research.org/s-risks-talk-eag-boston-2017/

  • @MissInformati0n · 6 years ago

    Talk begins at 1:12.

  • @miroslavblagojevic2402 · 1 year ago

    The challenging question is: are we already subjected to factory farming?

  • @whalehorse · 10 months ago

    He articulated his point well. The problem is that the people in the vehicle have different priorities. How do we get a collective priority to override individual priorities?

  • @diegooliveira7713 · 1 year ago

    Thanks for bringing awareness to this important topic!

  • @hagrone6305 · 1 year ago

    Very nice job! Thank you for spreading these ideas.

  • @alexandermoskowitz8000 · 1 year ago

    From Wikipedia: “Suffering risks, known as s-risks for short, are future events with the potential capacity to produce an astronomical amount of suffering.” I was expecting an explanation in the video description.

  • @lukao.3969 · 4 years ago

    What's the reasoning behind categorizing s-risks as a subclass of x-risks? Arguably, reducing x-risks with terraforming or virtual-world technology would be disastrous from the s-risk perspective.

  • @daryoshi161 · 4 years ago

    9:47-9:56 X-risk is a confusing term for me because, depending on usage, it can mean either 1) existential risk, or 2) extinction risk, or 3) both, regardless of what the people who coined these terms originally meant. I think you're right that reducing extinction risk via e.g. Elon Musk's approach (Mars colonisation) increases s-risks (via researching and implementing terraforming, creating ecosystems that entail suffering individuals).

  • @JezebelIsHongry · 1 year ago

    I think it’s a good marketing play. We’ve dealt with x-risk every minute since we developed nuclear weapons, yet you don’t think much about the few times one human being’s decision stopped nuclear holocaust (most of those people were Russians). If the possibility is so horrible, it’s probably best for the best minds to think very deeply about how to solve the s-risk problem. A nuclear weapon can’t keep you alive for thousands of subjective years under more pain than you can imagine. That type of risk might light a fire under people’s asses in a way that x-risk sans s-risk doesn’t.

  • @Nulono · 5 years ago

    31:00 "Cough areas"?

  • @owensmith632 · 1 year ago

    I believe he means to say "cause areas"!

  • @JezebelIsHongry · 1 year ago

    I just hope the future AI Daddy doesn’t read Surface Detail by Iain M. Banks. Fuuuuuuuuck.

  • @JezebelIsHongry · 1 year ago

    2023 enters the chat. Oh shit.