
Conditional on not having died from unaligned AGI, I consider myself a full-time alignment researcher by the end of 2030
34% chance
I suspect that the primary mechanisms by which this market resolves NO would be burnout or running out of funding. However, do not limit yourself to these mechanisms when trading.
Relevant market: https://manifold.markets/AlanaXiang/will-i-consider-myself-a-fulltime-a
I do not intend to buy shares in this market (either YES or NO).
This question is managed and resolved by Predita.
People are also trading:
Will we get AGI before 2036?
57% chance
Will we get AGI before 2034?
51% chance
Will we get AGI before 2032?
43% chance
Will we get AGI before 2035?
54% chance
Will we get AGI before 2038?
63% chance
Will I still work on alignment research at Redwood Research in 3 years?
85% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
34% chance
Conditional on there being no AI takeoff before 2050, will the majority of AI researchers believe that AI alignment is solved?
52% chance
Will tailcalled think that the Brain-Like AGI alignment research program has achieved something important by October 20th, 2026?
33% chance
Will taking annual MRIs of the smartest alignment researchers turn out alignment-relevant by 2033?
7% chance
