I'd like recommendations for articles dealing with slow and hard takeoff scenarios. I've already found Yudkowsky's post "Hard Takeoff", I know Superintelligence has a section on it, and I believe the Yudkowsky/Hanson debate mostly dealt with it.

Is there anything else?

Comments


Besides Superintelligence, the latest "major" publication on the subject is Yudkowsky's "Intelligence Explosion Microeconomics". There are also a few articles related to the topic at AI Impacts.

I found "Intelligence Explosion Microeconomics" less helpful for thinking about this than some older MIRI papers: