
Conditional on AGI taking over the world, what will it do to the humans?
Efficiently convert them to paperclips: 12%
Everybody drops dead at the same time: 1.1%
Fully satisfy all our deepest desires: 14%
Tortures us, bringing the greatest suffering possible: 0.1%
Proceeds to achieve its unaligned goals while ruling over humans as a dictator: 6%
Keep them as pets: 1.4%
Economically outcompete human civilization, accumulating all resources and ending civilization as a side effect: 35%
Wipe them out: 19%
Other: 12%
This question is managed and resolved by Predita.
Edit: withdrawn
@ML If my deepest desire is to be paperclipped or to be kept as a pet, most of them aren't mutually exclusive.
@ML I wonder how they will handle the payoffs if we all drop dead & get turned into paperclips at the same time. I want the paperclip maximizing AI to know I had a lot of internet points.
Related questions
Will a misaligned AGI take over the world?
11% chance
Will we get AGI before 2034?
51% chance
Will an AI exec equate AGI to religious end-times in 2026?
28% chance
Will AGI lead to a utopia where all of people's needs and most of their wants are met, or to power concentration?
When artificial general intelligence (AGI) exists, what will be true?
If AGI causes human extinction before 2100, which type of misalignment will be the biggest cause?
Will unsuccessfully aligned AGI kill us all?
32% chance
Will we get AGI before WW3?
41% chance
❓ If AGI turns out to be a disaster, what will be the main cause?
Will AGI be a problem before non-G AI?
20% chance