Sam Harris is offering a substantial amount of money to anyone who can show a flaw in the philosophy of 'The Moral Landscape' in 1,000 words or fewer — or, failing that, to whoever makes the best attempt.
http://www.samharris.org/blog/item/the-moral-landscape-challenge1
Up to $20,000 is on offer, although that's only if you change his mind. Whilst we know that this is very difficult, note how few people offer large sums of money for the privilege of being disproven.
In case anyone does win, I will remind you that this site is created and maintained by people who work at MIRI and CFAR, which rely on outside donations, and with whom I am not affiliated.
Note: Is this misplaced in Discussion? I imagine it could easily be overlooked in an open thread by the sorts of people who could put this information to good use.
I think his beliefs are worked out and make sense, but aren't articulated well. What he's really doing is trying to replace morality-speak with a new, slightly different and more homogeneous way of speaking in order to facilitate scientific research (i.e., a very loose operationalization) and political cooperation (i.e., a common language).
But, I gather, he can't emphasize that point because then he'll start sounding like a moral anti-realist, and even appearing to endorse anything in the neighborhood of relativism will reliably explode most people's brains. (The realists will panic and worry we have to stop locking up rapists if we lose their favorite Moral System. The relativists will declare victory and take this metaphysical footnote as a vindication of their sloppy, reflectively inconsistent normative talk.)
This is not true. He recognizes this point repeatedly in the book and in follow-ups, and his response is simply that it doesn't matter. He's never claimed to have a self-justifying system, nor does he take it to be a particularly good argument against disciplines that can't achieve the inconsistent goal of non-circularly justifying themselves.
Check out his response to critics. That should clarify a lot.
What do you mean by 'utility' here? If 'utility' is just a measure of how much something satisfies our values, then the obviousness seems a lot less mysterious.
Yeah, I plan to do basically that. (Not just as a tactic, though. I do agree with him on most of his points, and I do disagree with him on a specific just-barely-core issue.)
I did read his response to critics, in addition to skimming through his book. As far as I remember, his position really does seem vague and inconsistent, and he never properly addresses things like the supposed is-ought problem. He just handwaves it by saying it does not matter, as you point out, but that is not what I would call addressing it properly.
Utility always means satisfying preferences, as far as I know. The reason his answer is not obvious is that it assumes that what is desirable for the aliens must necessarily be desirable for us. In other words, it assumes a universal morality rather than a merely "objective" one (he assumes a universally compelling moral argument, to put it in Less Wrong terms). My greatest frustration in discussing morality is that people always conflate the ability to treat a moral issue objectively with the ability to create a moral imperative that applies to everyone, and Harris seems guilty of this here as well.