"A Disneyland with no children" apocalypse where optimization competition eliminates any pleasure we get from life.
A hell apocalypse where a large number of sentient lifeforms are condemned to very long-term suffering, possibly in a computer simulation.
I don't understand why you exclude risks of astronomical suffering ("hell apocalypses").
Below you claim that those risks are "Pascalian", but this seems wrong.
We could add "qualia apocalypses" here: humans are alive but become p-zombies, perhaps after a flawed uploading process.
Intelligence apocalypses - humans go extinct, but no other form of intelligence appears. Or humans survive, but their creative intelligence is permanently damaged and IQ never rises above 80, perhaps because of global contamination by arsenic.
Gene-allele apocalypses - many interesting alleles in the human genome disappear. The survivors look like humans, but many interesting traits are lost.
Primate apocalypses - all great apes go extinct, including humans; new intelligence could appear on Earth only after 10 million years or more.
Mammal apocalypses.
Vertebrate apocalypses.
Values apocalypses - human values erode and are replaced by other values, like Nazism. This has probably happened several times in history.
Evolution apocalypses - evolution simply ends; humans exist almost forever, but nothing new happens: no super-AI, no star travel. Just the end of complexity growth. AI may appear, but it will be as boring as Windows 7.
Individuality apocalypses - humans all become very similar to each other. This has already happened with globalisation.
Children apocalypses - humans simply stop reproducing above the replacement rate.
Art apocalypses - humans lose interest in art, or the ability to create really interesting art. Some think this has already happened.
Wireheading-euphorium-superdrug apocalypses - new ways of brain stimulation completely distract humans from real life. ~~They spend all their time on FB.~~
Wrong-upgrade apocalypses - basic human drives are edited at birth so that people are not aggressive, but they also lose interest in space exploration (S. Lem wrote a novel about this).
Have you seen my map "Typology of X-Risks"? http://lesswrong.com/lw/mdw/a_map_typology_of_human_extinction_risks/ I am interested in creating maps that cover all topics related to x-risks.
Thanks for this summation.
Maybe we can divide item 7 into "our universe apocalypse" and "everything that (physically) exists apocalypse", since the two might not be equal.
Of course, there might be things that exist necessarily and thus cannot be "apocalypsed out", and it would also be strange if the principle that brought our universe into existence could operate only once.
So while a multiverse apocalypse might be possible, I think there will always be something (physical) existing (though I don't know whether this thought can really comfort us if we get wiped out...).
By the way, how do you (up)vote here?
Cheers
The upvote for comments is in the lower left of the comment. The upvote for posts is harder to find: It's at the bottom left of the post, above the text box for commenting.
Also, there could be a rule that only accounts with positive karma (i.e., not brand-new accounts) can upvote. I'm not sure.
(Slow response because I am also still learning the site features: I didn't see the "letter" icon under my karma score.)
Where should one put the current civilization-destroying socialist catastrophe? Economic? Knowledge? Human? I hope it's recoverable.
Maybe it is the worst current risk, because of sheer politically fed irrationality (on a scale not seen since the religion-dumbed Dark Ages): from political correctness and general statism, to Keynesian-inspired megadebts, to political "scientific consensus" (e.g. on global warming), etc. It is also worth noting that modern pseudoreligious movements commonly use the word "science" as a true religious power word to justify anything (Marxists, Bolsheviks, Nazis... coincidentally, all socialist branches).