How can I maximise my chances of having a decent life, given the very high likelihood that GAI will make all our intellectual labour useless in the next few years?

For example, I graduated from a good university a few years ago and am working as a software engineer at a multinational company, but my capabilities are middling at best. I am distressed that I will likely not be able to afford a house in the few years left before GAI renders me unable to earn a living. I am not a genius; it is very unlikely that I could join an AI research company and contribute meaningfully to AI research.

Assuming I can set aside a modest sum of money (100-200k), should I attempt to, for example, invest in companies that will likely be able to monetise GAI?

Or is there something else I should be doing to prepare for the time I have basically zero human capital? 

Should I attempt to move to (and get citizenship from) a country with a larger amount of natural resources, assuming that human capital will become worthless quickly?

Is it reasonable to find potential outs (e.g. physician-assisted death) in case we cannot earn a living (and if unfriendly AI is basically confirmed)?

FlorianH

82

Some hypotheses - much simplified, and valuable only within a limited range of the possible future states of the world:

  1. Invest in natural resources. Once labor as a bottleneck is overcome, natural resources become the binding constraint and will earn a significant scarcity rent. There are reasons why resource markets might not yet fully price in that possibility (somewhat speculative).
  2. Move to a place that combines: (i) resource richness and (ii) a good likelihood of upholding basic social structures, including economic redistribution, even in the face of tempting rewards for an extractive elite class.

Last but not least:

  3. Mindfulness. Embrace how much of a cosmic joke our individual lives and self-centered aspirations represent - or something of that sort.

Fully on-board with #3. Change your attitude about what constitutes "a decent life", such that pretty much all existence is positive-value, and moments of joy are worth much more than weeks of depression cost.

#1 and #2 are less obvious. One of the reasons it's called the singularity is that EVERYTHING becomes hard to predict. A lot of people are assuming that the concepts of ownership and financial capital remain consistent enough that investments made now retain their power after the big changes. I think they're mostly wrong...

FlorianH
Largely agree with you that "EVERYTHING becomes hard to predict"; it is partly what I meant to allude to with the introductory caveat in my comment. I imagine un-graspably transformative superintelligence well within our lifetime, and cannot give much more advice for that scenario, yet I still keep a non-zero probability on the world and socio-economic structures remaining - for whichever reasons - more recognizable, in which case #1 and #2 still seem reasonably natural defaults. But yes, they may apply only in a reasonably narrow band of imaginable AI-transformed futures.

Lichdar

70

I would say to do everything possible to stop GAI. We might not win, but it is better to have tried. We might even succeed.

cousin_it

70

I mean, you're not alone in this. Lots of people will have the same problems. So one possible direction is to participate in movements for AI restrictions, job guarantees, basic income and so on.

The actions you suggest might represent a laudable contribution to a public good, but they don't directly answer the (self-concerned) question the OP raises. Given the largeness of the world and the public-goods nature of the projects you mention, his own action will only marginally change the probability of a better structure of society in general. That may still be worth it from a fully altruistic standpoint, but it has asymptotically zero probability of improving his personal material welfare.

(If I may, an analogy: One could compare it to a situation where...
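To make the "asymptotically zero" point concrete, here is a toy calculation. All numbers are purely illustrative assumptions (the headcount `N`, the personal stake, and the marginal effect `delta_p` are invented for the sketch, not estimates):

```python
# Toy free-rider arithmetic with purely illustrative numbers: if one person
# among N like-minded activists shifts the probability of a good global
# outcome by roughly 1/N, the expected *personal* payoff of their effort is
# tiny even when the personal stakes are huge.
N = 10_000_000           # assumed number of people pushing in the same direction
personal_stake = 1e6     # assumed personal value ($) of the good outcome to the OP
delta_p = 1 / N          # assumed marginal effect of one person's contribution

expected_personal_benefit = personal_stake * delta_p
print(f"Expected personal benefit: ${expected_personal_benefit:.2f}")  # $0.10
```

On these assumptions, a million-dollar personal stake yields about ten cents of expected personal benefit, which is the free-rider problem in one line.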

M. Y. Zuo
Isn't that true for everything with global scale? The typical moderately-above-average person has, almost by definition, a very slim chance of moving the needle in a positive direction to any noticeable degree.
FlorianH
Absolutely! That's why we have free-rider problems and insufficient contribution to public goods all over the world. The thing you can do in society's best interest is not typically in your own (material) best interest, unless you're a perfect altruist.
M. Y. Zuo
Assuming actual bona fide geniuses are 1 in a thousand, that's 8 million of them; most of the rest of the ~8 billion population still gets through life seemingly fine. They're public-minded enough not to tear down their own hometowns and neighbourhoods, at least. So it doesn't seem that serious of an issue?
FlorianH
Meditations on Moloch
M. Y. Zuo
Yes, most people are not exactly overflowing with virtue, and in fact will more likely than not compete in a race to the bottom if given the motivation, but how does that relate?
FlorianH
You had suggested that the issues with free-riding/insufficient contributions to public goods might not be so much of a problem. The linked post suggests otherwise, as it beautifully highlights some of the horrors that come from these issues. Its point is: even if humans are not all bad in themselves, within larger societies there tend to arise strong incentives for the individual to act against the interest of society.
M. Y. Zuo
Yes, but how does that equate to it being a serious issue for someone like the OP, who is not a super-genius and can't work a way out of it? It's like saying black holes are a serious issue to them because of the possibility of a rogue one swallowing up the Earth. Which, in one sense, is true, but worrying about it seems entirely futile, just pounding sand.

Tomás B.

61

I bought index funds. I would say this has the advantage of being robust to AGI not happening, but with birth rates as they are I am not so sure that's true! If we survive, Hanson's economic growth calculations predict the economy will start doubling every few months. Provided the stock market can capture some of this, learning to live on very little and putting everything in index funds should be fine even with modest amounts of invested capital. You really want to avoid burning your capital in this future, so live as modestly as possible, both so you can acquire capital and so you can use as little of it as possible until the market prices in such insane growth. However, I doubt property rights will be respected.
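For a rough sense of what "doubling every few months" would mean for even a small pot of capital, here is a toy compounding sketch. All parameters are illustrative assumptions, not predictions: `doubling_months` is an assumed economic doubling time, and `capture` stands in for whatever fraction of each doubling an index portfolio actually captures:

```python
# Back-of-envelope compounding under Hanson-style growth. All parameters are
# illustrative assumptions, not claims about actual returns.

def future_value(principal: float, doubling_months: float, years: float,
                 capture: float = 1.0) -> float:
    """Portfolio value if the economy doubles every `doubling_months` months
    and the portfolio compounds at 2**capture per economic doubling."""
    doublings = years * 12 / doubling_months
    return principal * (2 ** capture) ** doublings

# Example: $100k invested, economy doubling every 6 months, half of each
# doubling captured -> 2**0.5 growth per doubling, 10 doublings in 5 years.
print(f"${future_value(100_000, 6, 5, capture=0.5):,.0f}")  # $3,200,000
```

On these assumptions, even capturing half of each doubling turns $100k into millions within a few years, which is the sense in which modest invested capital could suffice, provided the market captures the growth and property rights hold.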

Matt Goldenberg

21

I think this post has decent financial advice if you believe in near-term GAI:

https://www.lesswrong.com/posts/CTBta9i8sav7tjC2r/how-to-hopefully-ethically-make-money-off-of-agi

nippynige

10

Zero human capital? I'm sorry to read that you might think this, but surely it's simply not true. Personally, if I were in your situation I would invest those funds in myself. Perhaps a relatively future-proof vocation: something physically creative but difficult to replace (in the short term). I'm certainly no expert, but believe that skills such as dance teaching, hairdressing, building renovation, antique restoration, watch and clock repair, blacksmithing, etc. might retain their utility way into the future.