This short text describes the idea of a Waker - a new way of experiencing reality / consciousness / subjectivity / mode of existence. Sadly, it cannot be attained without advanced uploading technology, that is, technology which allows far-reaching manipulation of the mind. Despite that, the author doesn't find it premature to start planning retirement as a posthuman.

A Waker is based on the experience of waking up from a dream. Slowly we realize the unreality of the world we were just in; we notice the discrepancies between the dreamscape and "the real world": we no longer attend high school, one of our grandparents passed away a few years ago, we work at a different place, and so on. Even though the world we wake up in is new and different, we quickly remember who we are, what we do, who our friends are, and what this world looks like, and within a few seconds we have perfect knowledge of that world and accept it as the real world, the place we have been living in since birth. Meanwhile, the dream world becomes a weird story, and we typically feel some kind of sentiment toward it. Sometimes we're glad to escape that reality, sometimes we're sad; either way, we mostly treat it as something of little importance - not a real world we lost forever, but a silly, made-up world.

A Waker's subjective experience would differ from ours in that she would always have the choice of waking up from her current reality. When she did, she would find herself in a bed, or a chair, or lying on the grass, having just woken up. She would remember the world she had just been in, probably better than we usually remember our dreams; nevertheless she would see it as a dream and feel no strong connection to that reality. At the same time, she would start "remembering" the world she had just woken up in. Unlike in our case, this would be a world she had never actually lived in, yet she would acquire full knowledge of it and a sense of having spent her whole life there. Despite all that, she would remain fully aware of being a Waker. Her connection to the world she lives in would be different from ours and, at first glance, somewhat paradoxical. She would feel how real it is; she would find it more real than any of the "dreams" she has had; she would be invested in life goals and relationships with other people; she would be capable of real love. And yet she would be fully able to wake up and enter a new world, where her life goals and relationships might be replaced by ones that feel exactly as real and important. There is an air of openness, an ease of giving away all you know, that is completely alien to us early-21st-century people.

The worlds a Waker wakes up in would show a level of discrepancy similar to that of our dreams: most people would stay in place, and the time and the Waker's age would be roughly the same. She would be able to sleep and dream regular dreams, after which she would wake back up in the same world she fell asleep in. What is important is that a Waker cannot go back to a previous dreamworld. She can only move forward, just as we do - unlike the consciousnesses in Hub Realities, posthumans who can choose the reality they live in.

I hope you enjoyed this, and that some of you will decide to fork into the Waker mode of existence when posthumanism hits. I'd be very glad if anyone with other ideas for novel subjectivities were willing to share them in the comments.

Yawn, it's been a long day - time to Wake up.

15 comments

In my opinion, this idea isn't so terrible as to justify such a high downvoting percentage, but perhaps you could improve the presentation.

I think the idea would be more plausible if it were tied to some compelling rationale - what is the motivation for posthumans to expend computation on human dreams? That computation has an opportunity cost in other experiences and lives that could be lived.

Yeah. I think the downvoting results less from the post being inappropriate or off-topic and more from its being a very unusual mixture. The inferential gap between it and 'normal' posts is too large. One is thrown straight in, and that can provoke rejection.

I'm sorry for the overly light-hearted presentation. It seemed suited to presenting what is, to simplify greatly, a form of fun.

A Waker's reality doesn't really rely on dreams, but on waking into new realities, and on a paradoxical form of commitment held equally toward the reality she lives in and whatever random reality she will wake up in.

Its rationale is purely a step in exploring new experiences, a form of meta-art. Once human and transhuman needs have been fulfilled, posthumans (and here, at least, I expect a future me) will search for entirely new ways of existing, new subjectivities. That is what I consider posthumanism: meddling with the most basic imperatives of conscious existence.

I see it as just one possibility to explore, something to let copies of myself experience. (These are not independent copies, however; I imagine a whole cluster of myselves, interconnected and gathering understanding of each other's perceived realities. The copies living Wakers' lives would be less concerned with the existence of the other copies; rather, their experiences would be watched by higher-level copies.)

"...she would have investment in life goals, relationships with other people, she'll be capable of real love..."

If she is choosing to spontaneously give up these life goals, relationships, and love, perhaps she is not experiencing them fully. I understand that posthuman life isn't necessarily a bed of roses, but at the point that you are able to create these new realities at will, shouldn't we also expect these new realities to be pretty good? At least good enough that we won't feel any need to abandon everything and start anew very often. Of course, it might be that the human brain needs a refresh every century or so, so I won't bet that no one will ever want this.

Well, creating new realities at will and switching between them is an example of a Hub World, and I expect that would indeed be the first thing new posthumans would go for. But that type of existence is stripped of many of the restrictions which, in a way, make life interesting and give it structure. So I expect some posthumans (among them, a future me) to create curated copies of themselves to gather entirely new experiences, like the Waker's subjectivity. (Their experiences would be reported to some top-level copy.)

You see, a Waker doesn't consider waking to be abandoning everything, the way we would. She doesn't feel abandonment, just as we don't feel we have abandoned everything and everyone in a dream. She is perfectly aware that the current world and the world to come feel exactly as real as each other.

To put it another way: for a Waker, staying in one reality forever feels the way staying in a dream and never waking up to experience actual reality would feel to us.

This is horrifying to me and I doubt that real posthumans exploring Fun Space will actually do this, just like they wouldn't eat every meal possible in order to explore Food Space. It's an unsophisticated human fetish.

I don't think it is any more horrifying than being stuck in one reality, treasuring memories. It is certainly less horrifying than our current human existence, with its prospects of death, suffering, boredom, heartache, etc. Your fear seems to be simply of something different from what you're used to.

But you're always stuck in one reality.

Let's take a step back, and ask ourselves what's really going on here. It's an interesting idea, for which I thank you; I might use it in a story. But...

By living your life in this way, you'd be divorcing yourself from reality. There is a real world, and if you're interacting solely with these artificial worlds you're not interacting with it. That's what sets off my "no way, no how" alert, in part because it seems remarkably dangerous; anything might happen, your computing infrastructure might get stolen from underneath you, and you wouldn't necessarily know.

Disclaimer: This comment may sound very crackpottish. I promise the ideas in it aren't as wonky as they seem, but it would be too hard to explain them properly in so short a space.

By living your life in this way, you'd be divorcing yourself from reality.

Here comes the notion that in posthumanism there is no definite reality. Reality is a product of experiences and of how your choices influence those experiences, and in posthumanism you can modify it freely. What we call reality is a very local phenomenon.

Anyhow, it's not the case that your computing infrastructure would be in danger: either it would be protected by some powerful AI, much better suited to protecting it than you are, or there would be other copies of you handling the maintenance in "meatspace". (Again, I strongly believe it's only our contemporary perspective that makes us feel that the reality in which the computations are performed is more real than virtual reality.)

What's more, a Waker can be perfectly aware that there is a world beyond her experience, and may occasionally leave her reality.

I don't see how making our past less memorable is desirable -- you might choose to fade certain memories, but in general there's no obvious benefit to making all memories weaker. It seems that you would be destroying things (memories) that we (apparently) valued, and doing it for no particular reason.

I can see that if you got really really bored you might like to cycle through variations on your favorite realities without losing novelty, but in that case it seems like you would want to try almost everything else first... you are basically giving up on personal progress in favor of hedonism.

You might also question, once you've reached the point of being a preferential Waker (that is, you aren't doing it as some sort of therapy, but because you honestly prefer it), whether personal identity across 'wakes' is a real thing anymore.

gjm

What happens to all the other people in the worlds the Waker leaves? I think I see five possibilities, all of which seem dreadful in one way or another.

  • Their life goes on but the Waker has mysteriously vanished from it. In this case, a world with Wakers in sounds pretty bad. No one could rely on anything involving other people. (Unless Wakers chose to "wake up" extremely infrequently; say, only in circumstances like those in which someone in our world might choose suicide. But it seems clear that this isn't what you have in mind.)
  • They are discarded, no longer being required. So we have, potentially, billions of people who exist only for the amusement of this Waker, and get turned off when the Waker loses interest. That seems morally dubious to me. I guess someone who has a problem with it wouldn't choose to be a Waker, but people change over time. How would it be to have a growing conviction that you're responsible for billions of deaths?
  • They were never real in the first place, because they were only simulations. If you take the idea of being a Waker seriously, I think you have to reject this one out of hand. (So do I.)
  • They were never real in the first place, because the Waker's computational substrate worked out exactly what other people would do without actually instantiating any. It seems unlikely that this is possible. At least some of those other people need to pass a very stringent version of the Turing test.
  • They were never real in the first place, because the Waker found interactions with them plausible only because the computational substrate diddled their brain to make it seem that way. This seems basically indistinguishable from wireheading.

Any answer that begins "They were never real in the first place" (including those last three, but also any other possibilities I've overlooked) is open to the more general objection that a Waker's existence is (and is known to be) unreal and inauthentic and doesn't involve any interactions with actual other people.

[EDITED to add: Oh, I thought of another two possibilities but they're also awful:]

  • They are immediately replaced by someone else exactly the same as them, so no one else even notices. Someone else exactly the same would do the same as they did, and someone else almost exactly the same would likely do the same very soon. Someone else exactly the same except suddenly deprived of the Waker's power? For that to work, it would need to be the case that Wakers might lose their powers at any moment, which seems like it would take most of the fun out of it. -- And if you do replace them with "someone else exactly the same", isn't that indistinguishable from saying that every time you attempt to Wake there's a 50% chance that it won't work?
  • They are immediately replaced by someone else near enough for practical purposes. It seems like that would have to be very near, especially if there were other people to whom the Waker was very close. And a world in which people are constantly being created from scratch -- and put into situations someone else similar just decided they'd rather leave the world than deal with -- seems pretty unpleasant. As with the second option above, I think some Wakers would find themselves with big moral problems.

There are, of course, many possible variants. The one I focus on is largely solipsistic: all the other people are generated by an AI. Keep in mind that the AI needs to fully emulate only a handful of personas, and they're largely recycled in the transition to a new world. (Option 2, then.)

I can understand your moral reservations; we should, however, keep the distinction between a real instantiation and an AI's persona. Imagine the reality-generating AI as a skilful actor and writer. It generates a great number of personas with different stories, personalities, and apparent internal subjectivity. When you read a good book, you usually cannot tell whether the events and people in it are true or made up; the same goes for a skilful improv actor: you cannot tell whether you are dealing with a real person or just a persona. In that sense they all pass the Turing test. However, you wouldn't say a writer kills a real person when he ceases to write about some fictional character, or that an actor kills a real person when she stops acting.

Of course, you may argue that this makes the Waker's life meaningless if she is surrounded by pretenders. But that seems silly; her relationships with other people are the same as yours.

gjm

My reservations aren't only moral; they are also psychological: that is, I think it likely (whether or not I am "right" to have the moral reservations I do, whether or not that's even a meaningful question) that if there were a lot of Wakers, some of them would come to think that they were responsible for billions of deaths, or at least to worry that they might be. And I think that would be a horrific outcome.

When I read a good book, I am not interacting with its characters as I interact with other people in the world. I know how to program a computer to describe a person who doesn't actually exist in a way indistinguishable from a description of a real ordinary human being. (I.e., take a naturalistic description such as a novelist might write, and just type it into the computer and tell it to write it out again on demand.) The smartest AI researchers on earth are a long way from knowing how to program a computer to behave (in actual interactions) just like an ordinary human being. This is an important difference.
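To make the contrast concrete, here is a minimal sketch of the easy half (in Python; the character "Anna" and her description are invented purely for illustration). It replays a pre-written, novelist-style description on demand, and has nothing at all to say in an actual interaction:

```python
# A "person-describer": replays a canned, novelist-style description on
# demand. Producing a convincing *description* of a non-existent person
# is this easy; producing convincing *behaviour* is not.

# Hypothetical pre-written description (any text a novelist might write).
CANNED_DESCRIPTION = (
    "Anna is a wiry, grey-eyed engineer of forty-one who laughs at her "
    "own jokes a beat too early and keeps a dog-eared atlas on her desk."
)

def describe_person() -> str:
    """Return the pre-written description, verbatim, whenever asked."""
    return CANNED_DESCRIPTION

def interact(utterance: str) -> str:
    """Responding to novel interaction like a real person is exactly
    what this program cannot do."""
    raise NotImplementedError("behaving like a person is the hard part")

if __name__ == "__main__":
    print(describe_person())  # reads just like a description of a real person
```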

It is at least arguable that emulating (let's say) at least dozens of people with enough fidelity to stand up to the kind of inspection our hypothetical "Waker" would be able to give them requires a degree of simulation that would necessarily make those emulated someones persons. Again, it doesn't really matter that much whether I'm right, or even whether it's actually a meaningful question; if a Waker comes to think those emulated people are persons, then they're going to see themselves as a mass murderer.

[EDITED to add: And if our hypothetical Waker doesn't come to think that, then they're likely to feel that their entire life involves no real human interaction, which is also very very bad.]

Reminds me of this comic.

Fine for a SF short story, but horrible as a mode of life.