CCCR 2022 Lightning Talk: Nora Ammann

Nora Ammann: Learning from Existing Complex Systems about Existential Risks and Alignment
As part of the CSER 2022 Conference, speakers were invited to give seven-minute 'Lightning Talks' offering a taster introduction to a particular dimension relevant to the study of global catastrophic risks.
CSER’s biennial conference is the leading regular gathering for scholars and policymakers working to understand and mitigate the greatest risks facing humanity. The 2022 Conference focused on three themes: future risks, and how we can study them; real catastrophes, and what we can learn from them; and effective global responses that manage the risks, and how we can achieve them.
The Centre for the Study of Existential Risk (CSER) is an interdisciplinary research centre within the University of Cambridge dedicated to the study and mitigation of risks that could lead to human extinction or civilisational collapse. For more information, please visit our website:
www.cser.ac.uk
/ csercambridge

Comments: 1

  • @rontrost4789 · 8 months ago

    What about BEHAVIORISM? B.F. Skinner is turning over in his grave. To him, AI is essentially an infinite source of reinforcement schedules that will maintain behavior indefinitely unless you stop using it. Imagine a gambling hall that maintains behavior as long as you keep pulling the lever: an addiction that people (and most vertebrates) cannot quit voluntarily, or can quit only with great difficulty. Won't an AI model never stop finding reinforcers? And it has certainly been well shown that the human player knows his addiction gives him no advantage. If you keep playing to control AI, hasn't it already won the contest?