
You can have some fun with people whose anticipations get out of sync with what they believe they believe.

I was once at a dinner party, trying to explain to a man what I did for a living, when he said: "I don't believe Artificial Intelligence is possible because only God can make a soul."

At this point I must have been divinely inspired, because I instantly responded: "You mean if I can make an Artificial Intelligence, it proves your religion is false?"

He said, "What?"

I said, "Well, if your religion predicts that I can't possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for me to build an AI; or, if I build an AI, that disproves your religion."

There was a pause, as he realized he had just made his hypothesis vulnerable to falsification, and then he said, "Well, I didn't mean that you couldn't make an intelligence, just that it couldn't be emotional in the same way we are."

I said, "So if I make an Artificial Intelligence that, without being deliberately preprogrammed with any sort of script, starts talking about an emotional life that sounds like ours, that means your religion is wrong."

He said, "Well, um, I guess we may have to agree to disagree on this."

I said: "No, we can't, actually. There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree. If two people disagree with each other, at least one of them must be doing something wrong."

We went back and forth on this briefly. Finally, he said, "Well, I guess I was really trying to say that I don't think you can make something eternal."

I said, "Well, I don't think so either! I'm glad we were able to reach agreement on this, as Aumann's Agreement Theorem requires."  I stretched out my hand, and he shook it, and then he wandered away.

A woman who had stood nearby, listening to the conversation, said to me gravely, "That was beautiful."

"Thank you very much," I said.

 

Part of the sequence Mysterious Answers to Mysterious Questions

Next post: "Professing and Cheering"

Previous post: "Belief in Belief"

Comments


Nice job, but the mention of Aumann's theorem looks a bit like a sleight of hand: did the poor fellow ever learn that the theorem requires the assumption of common priors?
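For readers who haven't seen it, here is an informal restatement of the theorem (my own paraphrase, glossing over the measure-theoretic details), which makes the common-prior assumption explicit:

```latex
% Aumann's Agreement Theorem (1976), informally restated.
% Setup: agents 1 and 2 share a common prior P over a state space \Omega,
% and each receives private information given by a partition \mathcal{I}_i
% of \Omega. At the true state \omega, agent i's posterior for an event A is
q_i = P\bigl(A \mid \mathcal{I}_i(\omega)\bigr), \qquad i \in \{1, 2\}.
% Claim: if the values q_1 and q_2 are common knowledge between the agents,
\text{common knowledge of } (q_1, q_2) \;\Longrightarrow\; q_1 = q_2.
```

Drop the common prior and the conclusion no longer follows, which is exactly the point about the assumption doing real work here.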

I know that if I'm at a party (of most types), for example, my first goal ain't exactly to win philosophical arguments ...

Funny, I've always thought that debates are one of the most entertaining forms of social interaction available. Parties with a lot of strangers around are one of the best environments for them - not only don't you know in advance the opinions of the others, making the discussions more interesting, but you'll get to know them on a deeper level, and faster, than you could with idle small talk. You'll get to know how they think.

This story is related to the phenomenon whereby the most intelligent and educated religious folks are very careful to define their beliefs so that there can be no conflict with observations, while ordinary people are more prone to allow their religion to have implications, which are then subject to challenges like Eliezer's. It is fun to pick holes in the less educated views, but to challenge religion overall it seems more honest to challenge the most educated views. But I usually have trouble figuring out just what it is that the most educated religious folks think exactly.

That was cruel. Fun, but cruel.

A woman who had stood nearby, listening to the conversation, said to me gravely, "That was beautiful."

And people wonder why men argue so often.

A few questions and comments:

1) What kind of dinner party was this? It's great to expose non-rigorous beliefs, but was that the right place to show off your superiority? It seems you came off as having some inferiority complex, though obviously I wasn't there. I know that if I'm at a party (of most types), for example, my first goal ain't exactly to win philosophical arguments ...

2) Why did you have to involve Aumann's theorem? You caught him in a contradiction. The question of whether people can agree to disagree, at least it seems to me, is an unnecessary distraction. And for all he knows, you could just be making that up to intimidate him. And Aumann's Theorem certainly doesn't imply that, at any given moment, rectifying that particular inconsistency is an optimal use of someone's time.

3) It seems what he was really trying to say was something along the lines of "while you could make an intelligence, its emotions would not be real the way humans' are". ("Submarines aren't really swimming.") I probably would have at least attempted to verify that this was what he meant rather than latching onto the most ridiculous meaning I could find.

4) I've had the same experience with people who fervently hold beliefs but don't consider tests that could falsify them. In my case, it's usually with people who insist that the true rate of inflation in the US is ~12%, all the time. I always ask, "so what basket of commodity futures can I buy that consistently makes 12% nominal?"

Meanwhile, over at the next table, there was the following conversation:

"I believe science teaches us that human-caused global warming is an urgent crisis."

"You mean if it's either not a problem or can be fixed easily, it proves science is false?"

Technically, it proves his belief about science is false.

If he'd said "Science teaches us that human-caused global warming is an urgent crisis." then "You mean if it's either not a problem or can be fixed easily, it proves science is false?" applies. And yes, it in fact would.

And then Science would (metaphorically) say, "My bad, thanks for that new evidence, I reject my prior theory and form a new one that accounts for your data and explains this new phenomenon that causes symptoms as if global warming were an urgent problem."

"Technically, it proves his belief about science is false."

True, though in the same way, Eliezer's success in producing an AI, even according to the dodgy specifications of his dinner companion, would only prove his belief about God wrong, not his belief IN God wrong.

The AI data point would contradict Mr Dinner's model of God's nature only at a single point, His allegedly unique intelligence-producing quality.

I am quite impressed at your capability of signaling your prodigious intelligence. Less pompously, moments like that make for fond memories.

Hmmm... and I thought you were going to suggest that, if you succeeded in making an AI, then you must be a god. I would've loved to be there to offer him that option instead. LOL!

Of course, even if what you said is what we really mean, I'm not sure which one is more effective at getting people to think, but your story shows that it's usually good (and at least entertaining) to try being more direct, every once in a while. I just find it easier to break through the social convention of politeness with humor.

"Funny, I've always thought that debates are one of the most entertaining forms of social interaction available."

We may not have rationality dojos, but in-person debating is as good an irrationality dojo as you're going to get. In debating, you're rewarded for 'winning', regardless of whether what you said was true; this encourages people to develop rhetorical techniques and arguments which are fully general across all possible situations, as this makes them easier to use. And while it may be hard to give public demonstrations of rationality, demonstrations of irrationality are easy: simply talk about impressive-sounding nonsense in a confident, commanding voice, and people will be impressed (look at how well Hitler did).

(FYI, the MRA who posted is not Ames.)

MRA, the difference between winning an argument with someone, versus pushing them into the dirt - well, there's a number of differences, really. The three most important are: First, I didn't force him to talk to me. Second, losing an argument makes you stronger. (Or rather, it gives you a precious chance to become stronger; whether he took advantage of it was up to him. Winning is a null-op, of course.)

Third and above all, in factual arguments there is such a thing as truth and falsity.

Robin sort-of generalized it so that it doesn't. http://www.overcomingbias.com/2006/12/why_common_prio.html
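There is also a constructive companion result worth knowing here: Geanakoplos and Polemarchakis showed that agents with a common prior who alternately announce their posteriors, each refining their information partition by what the other's announcement reveals, cannot disagree forever. The toy four-state example below is my own illustration of that dialogue process, not anything from the linked post:

```python
from fractions import Fraction

def posterior(info, event, prior):
    """P(event | info) under the shared prior, in exact arithmetic."""
    total = sum(prior[s] for s in info)
    hit = sum(prior[s] for s in info if s in event)
    return Fraction(hit, total)

def cell(partition, state):
    """The cell of `partition` containing `state`."""
    return next(c for c in partition if state in c)

def refine(partition, announce):
    """Split each cell by the value `announce(s)` takes on its states,
    i.e. incorporate what the announcement reveals about the state."""
    new = []
    for c in partition:
        groups = {}
        for s in c:
            groups.setdefault(announce(s), set()).add(s)
        new.extend(groups.values())
    return new

def dialogue(prior, event, part1, part2, true_state, max_rounds=10):
    """Agents alternately announce posteriors; each announcement refines
    the other's partition. Returns the final pair of posteriors."""
    p1 = [set(c) for c in part1]
    p2 = [set(c) for c in part2]
    q1 = q2 = None
    for _ in range(max_rounds):
        p2 = refine(p2, lambda s: posterior(cell(p1, s), event, prior))
        p1 = refine(p1, lambda s: posterior(cell(p2, s), event, prior))
        q1 = posterior(cell(p1, true_state), event, prior)
        q2 = posterior(cell(p2, true_state), event, prior)
        if q1 == q2:
            break
    return q1, q2

# Four equally likely states; the event of interest is {0, 1}.
prior = {s: Fraction(1, 4) for s in range(4)}
event = {0, 1}
part1 = [{0, 1}, {2, 3}]   # agent 1 learns whether the state is "low"
part2 = [{0, 2}, {1, 3}]   # agent 2 learns the state's parity
q1, q2 = dialogue(prior, event, part1, part2, true_state=0)
print(q1, q2)  # both agents end up with posterior 1
```

Starting from posteriors of 1 and 1/2, one exchange is enough here: agent 1's announcement reveals which of agent 1's cells obtains, and agent 2's refined posterior then agrees. With distinct priors per agent, the same protocol can stall at unequal posteriors.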

My big question, though, is whether this exchange led to a lasting change in the fellow's opinion as to the possibility of AI. In practice it seems to me that most of the time, when people decisively lose an argument, they still return to their original position within a few days just by ignoring that it ever happened.

This post's presence so early in the core sequences is the reason I nearly left LW after my first day or two. It gave me the impression that a major purpose of rationalism was to make fun of other people's irrationality rather than trying to change or improve either party. In short, to act like a jerk.

I'm glad I stuck around long enough to realize this post wasn't representative. Eliezer, at one point you said you wanted to know if there were characteristically male mistakes happening that would deter potential LWers. I can't speak for all women, but this post exemplifies a kind of male hubris that I find really off-putting. Obviously the woman in the penultimate paragraph appreciated it in someone else, but I don't know if it made her think, "This is a community I want to hang out with so I, too, can make fools of other people at parties."

Do you occasionally see other comments/posts that give you this same vibe?

Before I say anything I would like to mention that this is my first post on LW, and being only part way through the sequences I am hesitant to comment yet, but I am curious about your type of position.

What I find peculiar about your position is the fact that Yudkowsky did not, as he presented here, engage the argument. The other person did, asserting "only God can make a soul", implying that Yudkowsky's profession is impossible or nonsensical. Vocalizing any type of assertion, in my opinion, should be viewed as a two-way street, letting potential criticism come. In this particular example the assertion was of a subject that the man knew would be of large interest to Yudkowsky, certainly disproportionately more than say whether or not the punch being served had mango juice in it.

I'd like to know what you expect Yudkowsky should have done given the situation. Do you expect him not to give his own opinion, given the other person's challenge? Or was it instead something in particular about the way Yudkowsky did it? Isn't arguing inevitable and all we can do is try to build better dialogue quality? (That has been my conclusion for the last few years). Either way, I don't see the hubris you seem to. My usual complaints of discussions is that they are not well educated enough and people tend to say things that are too vague to be useful, or outright unsupported. However I rarely see a discussion and think "Well the root problem here is that they are too arrogant", so I'd like to know what your reasoning is.

It may be relevant that in real life I am known by some as being "aggressive" and "argumentative". You probably could have inferred that from my position, but I'd like to keep everything about my position as transparent as possible.

Thank you for your time.

If I were the host I would not like it if one of my guests tried to end a conversation with "We'll have to agree to disagree" and the other guest continued with "No, we can't, actually. There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree." In my book this is obnoxious behavior.

Having fun at someone else's expense is one thing, but holding it up in an early core sequences post as a good thing to do is another. Given that we direct new Less Wrong readers to the core sequence posts, I think they indicate what the spirit of the community is about. And I don't like seeing the community branded as being about how to show off or how to embarrass people who aren't as rational as you.

What gave me an icky feeling about this conversation is that Eliezer didn't seem to really be aiming to bring the man round to what he saw as a more accurate viewpoint. If you've read Eliezer being persuasive, you'll know that this was not it. He seemed more interested in proving that the man's statement was wrong. It's a good thing to learn to lose graciously when you're wrong, and to learn from the experience. But that's not something you can force someone to learn from the outside. I don't think the other man walked away from this experience improved, and I don't think that was Eliezer's goal.

I, like you, love a good argument with someone who also enjoys it. But to continue arguing with someone who's not enjoying it feels sadistic to me.

If I were in this conversation, I would try to frame it as a mutual exploration rather than a mission to discover which of us was wrong. At the point where the other tried to shut down the conversation, I might say, "Wait, I think we were getting to something interesting, and I want to understand what you meant when you said..." Then proceed to poke holes, but in a curious rather than professorial way.

If I were the host I would not like it if one of my guests tried to end a conversation with "We'll have to agree to disagree" and the other guest continued with "No, we can't, actually. There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree." In my book this is obnoxious behavior.

I'd find it especially obnoxious because Aumann's agreement theorem looks to me like one of those theorems that just doesn't do what people want it to do, and so ends up as a rhetorical cudgel rather than a relevant argument with practical import.

Agreed. If this was Judo, it wasn't a clean point. EY's opponent simply didn't know that the move used on him was against the sport's rules, and failed to cry foul.

Storytelling-wise, EY getting away with that felt like a surprising ending, like a minor villain not getting his comeuppance.

I attended a lecture by noted theologian Alvin Plantinga, about whether miracles are incompatible with science. Most of it was "science doesn't say it's impossible, so there's still a chance, right?"-type arguments. However, later on, his main explanation for why it wasn't impossible that God could intervene from outside a closed system and still not violate our laws of physics was that maybe God works through wavefunction collapse. Maybe God creates miracles by causing the right wavefunction collapses, resulting in, say, Jesus walking on water, rising from the dead, unscrambling eggs, etc.

Recalling this article, I wrote down and asked this question when the time came:

"The Many-Worlds Interpretation is currently [I said "currently" because he was complaining earlier about other philosophers misrepresenting modern science] one of the leading interpretations of quantum mechanics. The universe splits off at quantum events, but is still deterministic, and only appears probabilistic from the perspective of any given branch. Every one of the other branches still exists, including ones where Jesus doesn't come back. If true, how does this affect your argument?"

I wanted to see if he would accept a falsifiable version of his belief. Unfortunately, he said something like "Oh, I don't like that theory, I don't know how it would work with a million versions of me out there" and ignored the "if" part of the question. (I would have liked to point this out, but the guy before me had abused his mic privileges so I had to give it back.)

(Also, is that a fair layman's representation of many-worlds? I'm normally very wary of using any sort of quantum physics-based reasoning as a non-quantum physicist, but, well, he started it.)

Would you put aside your convictions and adopt religion if a skilful debater put forward an argument more compelling than yours?

To the extent the answer is "No" my atheism would be meaningless. I hope the answer is "Yes", but I have not been so tested (and do not expect to be; strong arguments for false theses should not exist).

Where do people get the impression that we all have the right not to be challenged in our beliefs? Tolerance is not about letting every person's ideas go unchallenged; it's about refraining from other measures (enforced conformity, violence) when faced with intractable personal differences.

As for politeness, it is an overrated virtue. We cannot have free and open discussions, if we are chained to the notion that we should not challenge those that cannot countenance dissent, or that we should be free from the dissent of others. Some people should be challenged often and publicly. Of course, the civility of these exchanges matters, but, as presented by Eliezer, no serious conversational fouls or fallacies were committed in this case (contemptuous tone, ad hominems, tu quoque or other Latinate no-nos, etc.).

Mark D,

How do you know what the putative AI "believes" about what is advantageous or logical? How do you know that other humans are feeling compassion? In other words, how do you feel about the Turing test, and how, other than through their behavior, would you be able to know what people or AIs believe and feel?

"I believe science teaches us that human-caused global warming is an urgent crisis."

"You mean if it's either not a problem or can be fixed easily, it proves science is false?"

Science has been proved false many times. Those things proven to be false are no longer science. OTOH, most religious beliefs are dogmatic. They can't be discarded from that religion without divine intervention/prophecy.

Rereading the post, I don't understand why the fellow didn't just say "I defy your ability to build an AI" in response to your first question. Maybe he was intimidated at the moment.

Rereading the post, I don't understand why the fellow didn't just say "I defy your ability to build an AI"

Because he wanted the ability to retain his religious belief in the face of successful AI - and he can anticipate, in advance, exactly which experimental results he'll need to excuse.

I think the idea of argument is to explore an issue, not "win" or "lose". If you enter an argument with the mentality that you must be right, you've rather missed the point. There wasn't an argument here, just a one-sided discussion. It was a bludgeoning by someone with training and practice in logical reasoning on someone without. It was both disgusting and pathetic, no different than a high-school yard bully pushing some kid's face in the dirt because he's got bigger biceps. Did the outcome of this "argument" stroke your ego?

All-in-all, I'm not sure this is a story you should want to share. To put this in uncomplicated terms, it makes you sound like a real a$$hole.

I've already seen plenty of comment here on just how awkward this post is to be so early in the Sequences, and how it would turn people off, so I won't comment on that.

However: Seeing this post, early in the sequences, led me to revise my general opinion of Eliezer down just enough that I managed to catch myself before I turned specific admiration into hero-worship (my early, personal term for the halo effect).

I seriously, seriously doubt that's the purpose of this article, mainly because if Eliezer wanted to deliberately prevent himself from being affective-death-spiraled this article would read more subtly.

That said, if it is agreed that it would be good for a post like this to exist early in the Sequences (that's a pretty big if), I would hope that it could be written to invite fewer pattern-matches to the stereotype of "socially-oblivious, obsessed-with-narrow-intellectual-interest geek/nerd/dork".

I've already seen plenty of comment here on just how awkward this post is to be so early in the Sequences, and how it would turn people off, so I won't comment on that.

So early in the sequences? It would seem to be worse later in what we now call the sequences. At the time this was written it was just a casual post on a blog Eliezer had only recently started posting on. Perhaps the main error is that somehow someone included it in an index when they were dividing the stream of blog posts into 'sequences' for reference.

I have to heartily disagree with those that seem to think it impolite to disagree with the religious. Remember this same person is going to go out and make life and death decisions for himself and others. Notice also that it was the theist who started the debate.

There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree. If two people disagree with each other, at least one of them must be doing something wrong.

This seems like one of those things that can be detrimental if taught in isolation.

It may be a good idea to emphasize that only one person in a disagreement doing something wrong is far less likely than both sides in a disagreement doing something wrong.

I can easily imagine someone casually encountering that statement, and taking it to instead mean this:

There's a thing called "Aumann's Agreement Theorem" that says rationalists can't agree to disagree. Therefore if I apply the label "rationalist" to myself, I can use the words "Aumann's Agreement Theorem" to prove that anyone who disagrees with me is wrong.

JL, I’ve programmed in several languages, but you have me correctly pegged as someone who is more familiar with databases. And since I’ve never designed anything on the scale we’re discussing I’m happy to defer to your experience. It sounds like an enormously fun exercise though.

My original point remains unanswered however. We’re demanding a level of intellectual rigour from our monotheistic party goer. Fair enough. But nothing I’ve seen here leads me to believe that we’re as open minded as we’re asking him to be. Would you put aside your convictions and adopt religion if a skilful debater put forward an argument more compelling than yours? If you were to still say “no” in the face of overwhelming logic, you wouldn’t justifiably be able to identify yourself as a critical thinker. And THAT’S what I was driving at. Perhaps I’m reading subtexts where none exist, but this whole anecdote has felt less like an exercise in deductive reasoning than having sport at someone else’s expense (which is plainly out of order).

I don’t really have any passion for debating so I’ll leave it there. I’m sure EY can pass along the email address I entered on this site if you’re determined to talk me out of my wayward Christianity.

Best of luck to you all