The Craft and the Community: Post 1 – 2

1. Raising the Sanity Waterline

Even if we eliminated harmful and insane practices and beliefs like drug criminalization or the Blank Slate dogma (Yudkowsky bashes religion; I, however, really love to bash those two, as you may have noticed by now), there would still be a lot of other problems left. Those beliefs are merely symptoms of a disease that goes much deeper. I’m speaking a bit hyperbolically, but as a general rule it’s accurate: humans are crazy.

Not everyone, and not batshit-insane like the inner circle of a psychiatric ward, but nonetheless…

Anyway, merely treating the symptoms wouldn’t be as effective as curing the sickness itself. So Yudkowsky thinks that general rationality skills – Occam’s Razor, spotting Mysterious Answers, Bayesian notions of evidence, etc. – could act as panaceas and raise the sanity waterline, so that crazy beliefs go “underwater”.
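
To make “Bayesian notions of evidence” a bit more concrete, here is a minimal sketch in Python; the numbers are entirely made up for illustration, and the example is mine, not Yudkowsky’s. The point is the textbook update: evidence that is nine times likelier under a hypothesis than under its negation still leaves a 1% prior well below 10%.

    # Toy Bayesian update; all numbers are illustrative assumptions.
    prior = 0.01            # P(H): initial credence in the hypothesis
    p_e_given_h = 0.90      # P(E|H): probability of the evidence if H is true
    p_e_given_not_h = 0.10  # P(E|~H): probability of the evidence if H is false

    # Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    posterior = p_e_given_h * prior / p_e
    print(round(posterior, 3))  # 0.083

Asking “how likely is this evidence under the alternatives?” before getting excited is a decent chunk of what raising the waterline is supposed to make habitual.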

Sounds like a nice strategy, and it would get rid of loads of bullshit at a single blow. The problem, of course, is that many forms of madness are incurable. On top of that, most people worldwide have IQs below 100 (only European countries average around 100 and China around 105; even the USA scores something like 94, since its population is only about 80% white, and African and Latin American countries score a whole SD below the mean), and probably only a fraction of those folks can understand the more complex stuff about rationality.

2. A Sense That More Is Possible

Most aspiring rationalists just don’t seem that impressive. Sure, they probably have IQs around 2 SD above average and aren’t obviously crazy, but that’s about it.

The problem is that there are no systematized practices or courses that allow you to level up in rationality, and developing the techniques on your own or just reading Yudkowsky only works to some degree.

The conclusion:

Why are there schools of martial arts, but not rationality dojos? … Is it more important to hit people than to think?

No, but it’s easier to verify when you have hit someone.  That’s part of it, a highly central part.

But maybe even more importantly—there are people out there who want to hit, and who have the idea that there ought to be a systematic art of hitting that makes you into a visibly more formidable fighter, with a speed and grace and strength beyond the struggles of the unpracticed.  So they go to a school that promises to teach that.  And that school exists because, long ago, some people had the sense that more was possible.  And they got together and shared their techniques and practiced and formalized and practiced and developed the Systematic Art of Hitting.  They pushed themselves that far because they thought they should be awesome and they were willing to put some back into it.

Now—they got somewhere with that aspiration, unlike a thousand other aspirations of awesomeness that failed, because they could tell when they had hit someone; and the schools competed against each other regularly in realistic contests with clearly-defined winners.

But before even that—there was first the aspiration, the wish to become stronger, a sense that more was possible.  A vision of a speed and grace and strength that they did not already possess, but could possess, if they were willing to put in a lot of work, that drove them to systematize and train and test.

Why don’t we have an Art of Rationality?

Third, because current “rationalists” have trouble working in groups: of this I shall speak more.

Second, because it is hard to verify success in training, or which of two schools is the stronger.

But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.

And conversely they don’t look at the lack of visibly greater formidability, and say, “We must be doing something wrong.”

“Rationality” just seems like one more hobby or hobbyhorse, that people talk about at parties; an adopted mode of conversational attire with few or no real consequences; and it doesn’t seem like there’s anything wrong about that, either.

A good comment by Vladimir Golovin:

“Because they don’t win? Because they don’t reliably steer reality into narrow regions other people consider desirable?

I’ve met and worked with several irrationalists whose models of reality were, to put it mildly, not correlated with said reality, with one explicit, outspoken anti-rationalist with a totally weird, alien epistemology among them. All these people had a couple of interesting things in common.

On one hand, they were often dismal at planning – they were unable to see obvious things, and they couldn’t be convinced otherwise by any arguments appealing to ‘facts’ and ‘reality’ (they universally hated these words).

On the other hand, they were surprisingly good at execution. All of them were very energetic people who didn’t fear any work or situation at all, and I almost never saw any of them procrastinating. Could this be because their minds, due to their poor predictive ability, were unable to see the real difficulty of their tasks and thus avoided auto-switching into procrastination mode?

(And a third observation – all these people excelled in political environments. They tended to interpret their surroundings primarily in terms of who is kin to whom, who is a friend of whom, who is sexually attracted to whom, what others think of me, who is the most influential dude around here, etc., etc. What they lost due to their desynchronization with factual reality, they gained back thanks to their political aptness. Do rationalists excel in political environments?)”

Here is my theory of why rationality isn’t effective in the real world:

The first and biggest part is, of course, that seeing the truth actively destroys your passions and motivation. Lovecraft was right, and I have to quote him:

The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the light into the peace and safety of a new dark age.

Probably the greatest problem for our motivation (maybe only for mine, YMMV) is that life has no inherent meaning. There is no great conflict, no higher purpose, no clear Good vs. Evil; it’s more like Evil vs. Evil, or hypocrites vs. hypocrites. Once you’ve eliminated the Story Bias (as Tyler Cowen calls it), you just can’t take idealism and romanticism – which were essential parts of my identity – seriously anymore, which obviously leads to akrasia in many of us. I mean, what would Frodo do if it somehow turned out that Sauron is actually a nice guy and had good reasons for his actions?

Once you realize that your goals, thoughts and behavior are shaped by evolution and that most people are evil and hypocritical by design, you just come to the conclusion “Fuck this shit”, which – in my humble opinion – is an entirely rational and understandable response.

Secondly, even if you somehow manage to remain idealistic and passionate, you still have to acknowledge that the world is really complicated. Even if you’re convinced that, e.g., x-risk reduction is the right thing, you don’t know how to pursue it. Donating to SIAI? Oh, but Yudkowsky seems a bit overconfident, FAI is probably too hard anyway, etc., so it’s probably better to follow e.g. Wei Dai’s singularity strategies, like cloning thousands of von Neumanns or uploading them or whatever. (Wei Dai actually declined to author some academic papers on UDT because he thinks it might increase the risk of UFAI.) So what does look like a safe and reasonable strategy? Basically just talking, reading and encouraging discussions until you know more about what the right strategy might be. Which may take a long time.

Thirdly, rationality is probably positively correlated with procrastination. Some people are just energetic, happy and have lots of will-power. I don’t know, maybe there is something like “practical” rationality that can help with this sort of thing, but genes seem to play a big role here. Of course, taking drugs can help, too.

Fourthly, you become ever more confused about basic metaphysical stuff like reductionism/physicalism or ethics, especially ethics in a Big World. You aren’t even sure if your model of the universe makes sense on the most basic of all levels. You don’t know how to reconcile utilitarianism with infinite worlds, etc. Just read the above quote by Lovecraft again, or better yet, memorize it and chant it every day.
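
To make the infinite-worlds worry concrete, here is the standard toy difficulty (a sketch of one well-known problem, not a survey of the literature). Total utilitarianism ranks actions by summed utility,

$$U(a) = \sum_{i=1}^{\infty} u_i(a),$$

but in a world with infinitely many morally relevant beings both options can sum to infinity: if action a gives every being utility 2 and action b gives every being utility 1, then U(a) = U(b) = ∞, and the naive sum can’t even recover the obvious judgment that a is better for literally everyone.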

Fifthly, just thinking about things consumes precious time and prevents you from achieving real-world success.

Sixthly, an accurate assessment of your chances of success often induces doubt and inaction, while delusion produces greater variability of outcomes among non-rationalists. Nobody would run for president, play the lottery, start a religion or try to become the next superstar if they weren’t deluded and didn’t vastly overestimate their chances of success.
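
A minimal sketch of that point in Python, with numbers invented purely for illustration: the calibrated agent computes a negative expected value and abstains, the overconfident agent computes a positive one and attempts, and across many deluded attempters the rare wins produce the visible superstars.

    # Toy model of long-shot attempts; all numbers are illustrative.
    true_p = 1e-6        # actual probability of becoming "the next superstar"
    payoff = 10_000_000  # value of success, in arbitrary units
    cost = 50            # cost of one attempt

    rational_ev = true_p * payoff - cost
    print(rational_ev)   # -40.0 -> the calibrated agent declines

    deluded_p = 1e-3     # inflated subjective probability of success
    deluded_ev = deluded_p * payoff - cost
    print(deluded_ev)    # 9950.0 -> the deluded agent attempts

    # The deluded agent's true expected value is still -40.0, but his
    # outcome is now almost always -50 and very rarely +9,999,950:
    # delusion doesn't raise the mean, it raises the variance.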

Which leads us to the last point: the average rationalist is way cooler than the average non-rational person. Admittedly, that doesn’t mean much, but we shouldn’t fall prey to base rate neglect: there are probably fewer than 10,000 rational people in the world, so of course there are more amazing non-rational folks out there in absolute numbers. Just because Sparta lost doesn’t mean that the average Spartan warrior wasn’t formidable. And on that self-congratulatory note, I would like to end this rant.

 


2 Responses to The Craft and the Community: Post 1 – 2

  1. muflax says:

    Minor pet peeve: Spartan soldiers sucked. Lots of muscles, no flexibility or organization. A bunch of overweight bureaucrats would kick Sparta’s ass any time. (And did, see Athens.)

    Can’t really think of examples of “excellent warriors who still lost a lot due to bad luck or an extremely shitty situation” that aren’t horribly political, though.

    (Pretty much anyone involved in the Eastern Front of WW2, for example. The whole region is full of “did everything right, still got fucked hard”. And maybe it’s another instance of the kind of double-bind Vladimir described: you need a lot of passion and sheer fanaticism to have any chance at all of getting something meaningful done, but this limits your rational decision-making in the process and might easily be your downfall in the end. (*cough*Hitler*cough*))

    I wonder if that is just a human problem, or more general. Solutions like that seem to crop up a lot in Game Theory (e.g. in Chicken), and the (seemingly) low advantage of “rational” adaptations may support that. (Though being dangerously sane is still a fairly new strategy and worth a shot.)
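
    (A quick sketch of the Chicken point, with payoffs invented purely for illustration: the only pure-strategy equilibria are the asymmetric ones, so the player who can credibly commit to going straight, e.g. by being visibly crazy, gets the good outcome.)

        # Chicken with made-up payoffs; higher is better.
        # Each player swerves ("S") or goes straight ("D").
        payoff = {  # (row move, col move) -> (row payoff, col payoff)
            ("S", "S"): (0, 0),
            ("S", "D"): (-1, 1),    # the swerver loses face
            ("D", "S"): (1, -1),
            ("D", "D"): (-10, -10), # crash
        }

        def pure_nash(payoff):
            # Equilibria where neither player gains by deviating alone.
            moves = ("S", "D")
            return [(r, c) for r in moves for c in moves
                    if all(payoff[(r2, c)][0] <= payoff[(r, c)][0] for r2 in moves)
                    and all(payoff[(r, c2)][1] <= payoff[(r, c)][1] for c2 in moves)]

        print(pure_nash(payoff))  # [('S', 'D'), ('D', 'S')]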

    (I agree otherwise. Also, I like your summaries, saves me the trouble of reviewing the Sequences myself. ;))

    • wallowinmaya says:

      Ah, thanks. I don’t know shit about history; I really had the movie/legend more in mind.

      Yeah, good point about Game Theory. I guess our ability to (somewhat irrationally) ignore ideas also helps us in cases like Pascal’s Mugging and infinite ethics.

      Cool, I’m glad these posts are of some use!
