Eliezer Yudkowsky quotes:

  • Nothing you'll read as breaking news will ever hold a candle to the sheer beauty of settled science. Textbook science has carefully phrased explanations for new students, math derived step by step, plenty of experiments as illustration, and test problems.

  • The purpose of a moral philosophy is not to look delightfully strange and counterintuitive or to provide employment to bioethicists. The purpose is to guide our choices toward life, health, beauty, happiness, fun, laughter, challenge, and learning.

  • There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.

  • If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.

  • A scientist worthy of a lab coat should be able to make original discoveries while wearing a clown suit, or give a lecture in a high squeaky voice from inhaling helium. It is written nowhere in the math of probability theory that one may have no fun.

  • I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'The Rain Man;' it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.

  • The systematic experimental study of reproducible errors of human reasoning, and what these errors reveal about underlying mental processes, is known as the heuristics and biases program in cognitive psychology. This program has made discoveries highly relevant to assessors of global catastrophic risks.

  • Textbook science is beautiful! Textbook science is comprehensible, unlike mere fascinating words that can never be truly beautiful. Elementary science textbooks describe simple theories, and simplicity is the core of scientific beauty. Fascinating words have no power, nor yet any meaning, without the math.

  • The media thinks that only the cutting edge of science, the very latest controversies, are worth reporting on. How often do you see headlines like 'General Relativity still governing planetary orbits' or 'Phlogiston theory remains false'? By the time anything is solid science, it is no longer a breaking headline.

  • Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.

  • We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.

  • Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.

  • I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.

  • When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.

  • Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.

  • Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.

  • I want to carry in my heart forever the key word of the Olympics - 'passion.'

  • Transhumanists are not fond of death. We would stop it if we could. To this end, we support research that holds out hope of a future in which humanity has defeated death.

  • The human species was not born into a market economy. Bees won't sell you honey if you offer them an electronic funds transfer. The human species imagined money into existence, and it exists - for us, not mice or wasps - because we go on believing in it.

  • In our skulls, we carry around 3 pounds of slimy, wet, greyish tissue, corrugated like crumpled toilet paper. You wouldn't think, to look at the unappetizing lump, that it was some of the most powerful stuff in the known universe.

  • I wouldn't be surprised if tomorrow was the Final Dawn, the last sunrise before the Earth and Sun are reshaped into computing elements.

  • If our extinction proceeds slowly enough to allow a moment of horrified realization, the doers of the deed will likely be quite taken aback on realizing that they have actually destroyed the world. Therefore I suggest that if the Earth is destroyed, it will probably be by mistake.

  • Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.

  • I don't care where I live, so long as there's a roof to keep the rain off my books, and high-speed Internet access.

  • My parents were early adopters, and I've been online since a rather young age. You should regard anything from 2001 or earlier as having been written by a different person who also happens to be named 'Eliezer Yudkowsky.' I do not share his opinions.

  • You cannot 'rationalize' what is not rational to begin with - as if lying were called 'truthization.' There is no way to obtain more truth for a proposition by bribery, flattery, or the most passionate argument - you can make more people believe the proposition, but you cannot make it more true.

  • [...] intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists. After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.

  • Litmus test: If you can't describe Ricardo's Law of Comparative Advantage and explain why people find it counterintuitive, you don't know enough about economics to direct any criticism or praise at "capitalism", because you don't know what other people are referring to when they use that word.

  • Why does any kind of cynicism appeal to people? Because it seems like a mark of maturity, of sophistication, like you've seen everything and know better. Or because putting something down feels like pushing yourself up.

  • Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.

  • To confess your fallibility and then do nothing about it is not humble; it is boasting of your modesty.

  • Trying and getting hurt can't possibly be worse for you than being... stuck.

  • The important graphs are the ones where some things are not connected to some other things. When the unenlightened ones try to be profound, they draw endless verbal comparisons between this topic, and that topic, which is like this, which is like that; until their graph is fully connected and also totally useless.

  • "Like that's the only reason anyone would ever buy a first-aid kit? Don't take this the wrong way, Professor McGonagall, but what sort of crazy children are you used to dealing with?" "Gryffindors," spat Professor McGonagall, the word carrying a freight of bitterness and despair that fell like an eternal curse on all youthful heroism and high spirits.

  • I'm lazy! I hate work! Hate hard work in all its forms! Clever shortcuts, that's all I'm about!

  • - With respect, Professor McGonagall, I'm not quite sure you understand what I'm trying to do here. - With respect, Mr. Potter, I'm quite sure I don't. Unless - this is a guess, mind - you're trying to take over the world? - No! I mean yes - well, NO!

  • The human brain cannot release enough neurotransmitters to feel emotion a thousand times as strong as the grief of one funeral. A prospective risk going from 10,000,000 deaths to 100,000,000 deaths does not multiply by ten the strength of our determination to stop it. It adds one more zero on paper for our eyes to glaze over.

  • He'd met other prodigies in mathematical competitions. In fact he'd been thoroughly trounced by competitors who probably spent literally all day practising maths problems and who'd never read a science-fiction book and who would burn out completely before puberty and never amount to anything in their future lives because they'd just practised known techniques instead of learning to think creatively.

  • Hermione's eyes lit up with a terrible light of helpfulness and something in the back of Harry's brain screamed in desperate humiliation.

  • Without thinking about it at all, Harry stepped in front of Hermione. There was an intake of breath from behind him, and then a moment later Hermione brushed past and stepped in front of him. "Run, Harry!" she said. "Boys shouldn't have to be in danger."

  • "But it is cute. It's such a boy thing to do." "Drop dead." "Aw, you say the most romantic things."

  • Many have stood their ground and faced the darkness when it comes for them. Fewer come for the darkness and force it to face them.

  • The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else.

  • For an instant Harry imagined... Just for an instant, before his imagination blew a fuse and called an emergency shut down and told him never to imagine that again.

  • People form close friendships by knowing private things about each other, and the reason most people don't make close friends is because they're too embarrassed to share anything really important about themselves.

  • Physiologically adult humans are not meant to spend an additional 10 years in a school system; their brains map that onto "I have been assigned low tribal status". And so, of course, they plot rebellion, accuse the existing tribal overlords of corruption, plot perhaps to split off their own little tribe in the savanna, not realizing that this is impossible in the Modern World.

  • If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.

  • The people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.

  • To be clever in argument is not rationality but rationalization.

  • The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.

  • World domination is such an ugly phrase. I prefer to call it world optimisation.

  • An anthropologist will not excitedly report of a newly discovered tribe: 'They eat food! They breathe air! They use tools! They tell each other stories!' We humans forget how alike we are, living in a world that only reminds us of our differences.

  • By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.

  • A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.

  • When something is universal enough in our everyday lives, we take it for granted to the point of forgetting it exists.

  • My successes already accomplished have mostly been taking existing science and getting people to apply it in their everyday lives.

  • Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.

  • The obvious choice isn't always the best choice, but sometimes, by golly, it is. I don't stop looking as soon as I find an obvious answer, but if I go on looking, and the obvious-seeming answer still seems obvious, I don't feel guilty about keeping it.

  • Moore's Law of Mad Science: Every eighteen months, the minimum IQ necessary to destroy the world drops by one point.

  • You are personally responsible for becoming more ethical than the society you grew up in.

  • There were mysterious questions, but a mysterious answer was a contradiction in terms.

  • Crocker's Rules didn't give you the right to say anything offensive, but other people could say potentially offensive things to you, and it was your responsibility not to be offended. This was surprisingly hard to explain to people; many people would read the careful explanation and hear, "Crocker's Rules mean you can say offensive things to other people."

  • If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.

  • When you are older, you will learn that the first and foremost thing which any ordinary person does is nothing.

  • There is no justice in the laws of nature, no term for fairness in the equations of motion. The Universe is neither evil, nor good, it simply does not care. The stars don't care, or the Sun, or the sky. But they don't have to! WE care! There IS light in the world, and it is US!

  • Every mystery ever solved had been a puzzle from the dawn of the human species right up until someone solved it.

  • To worship a sacred mystery was just to worship your own ignorance.

  • Science has heroes, but no gods. The great Names are not our superiors, or even our rivals, they are passed milestones on our road; and the most important milestone is the hero yet to come.

  • There is light in the world, and it is us!

  • If cryonics were a scam it would have far better marketing and be far more popular.

  • If people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing.

  • Not every change is an improvement, but every improvement is a change; you can't do anything BETTER unless you can manage to do it DIFFERENTLY; you've got to let yourself do better than other people!

  • Most Muggles lived in a world defined by the limits of what you could do with cars and telephones. Even though Muggle physics explicitly permitted possibilities like molecular nanotechnology or the Penrose process for extracting energy from black holes, most people filed that away in the same section of their brain that stored fairy tales and history books, well away from their personal realities: Long ago and far away, ever so long ago.

  • I see little hope for democracy as an effective form of government, but I admire the poetry of how it makes its victims complicit in their own destruction.

  • Lonely dissent doesn't feel like going to school dressed in black. It feels like going to school wearing a clown suit.

  • I ask the fundamental question of rationality: Why do you believe what you believe? What do you think you know and how do you think you know it?

  • Part of the rationalist ethos is binding yourself emotionally to an absolutely lawful reductionistic universe - a universe containing no ontologically basic mental things such as souls or magic - and pouring all your hope and all your care into that merely real universe and its possibilities, without disappointment.

  • You couldn't change history. But you could get it right to start with. Do something differently the FIRST time around. This whole business with seeking Slytherin's secrets... seemed an awful lot like the sort of thing where, years later, you would look back and say, 'And THAT was where it all started to go wrong.' And he would wish desperately for the ability to fall back through time and make a different choice. Wish granted. Now what?

  • Remember, if you succeed in everything you try in life, you're living below your full potential and you should take up more difficult or daring things.

  • Our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted.

  • If you handed [character] a glass that was 90% full, he'd tell you that the 10% empty part proved that no one really cared about water.

  • Reality has been around since long before you showed up. Don't go calling it nasty names like 'bizarre' or 'incredible'. The universe was propagating complex amplitudes through configuration space for ten billion years before life ever emerged on Earth. Quantum physics is not 'weird'. You are weird.

  • It is triple ultra forbidden to respond to criticism with violence. There are a very few injunctions in the human art of rationality that have no ifs, ands, buts, or escape clauses. This is one of them. Bad argument gets counterargument. Does not get bullet. Never. Never ever never for ever.

  • Maybe you just can't protect people from certain specialized types of folly with any sane amount of regulation, and the correct response is to give up on the high social costs of inadequately protecting people from themselves under certain circumstances.

  • Rationality is the master lifehack which distinguishes which other lifehacks to use.

  • If you want to build a recursively self-improving AI, have it go through a billion sequential self-modifications, become vastly smarter than you, and not die, you've got to work to a pretty precise standard.

  • I don't want to rule the universe. I just think it could be more sensibly organised.

  • I only want power so I can get books.

  • Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us.

  • You will find ambiguity a great ally on your road to power. Give a sign of Slytherin on one day, and contradict it with a sign of Gryffindor the next; and the Slytherins will be enabled to believe what they wish, while the Gryffindors argue themselves into supporting you as well. So long as there is uncertainty, people can believe whatever seems to be to their own advantage. And so long as you appear strong, so long as you appear to be winning, their instincts will tell them that their advantage lies with you. Walk always in the shadow, and light and darkness both will follow.

  • My experience is that journalists report on the nearest-cliche algorithm, which is extremely uninformative because there aren't many cliches, the truth is often quite distant from any cliche, and the only thing you can infer about the actual event was that this was the closest cliche.... It is simply not possible to appreciate the sheer awfulness of mainstream media reporting until someone has actually reported on you. It is so much worse than you think.

  • By and large, the answer to the question "How do large institutions survive?" is "They don't!" The vast majority of large modern-day institutions - some of them extremely vital to the functioning of our complex civilization - simply fail to exist in the first place.

  • We underestimate the distance between ourselves and others. Not just inferential distance, but distances of temperament and ability, distances of situation and resource, distances of unspoken knowledge and unnoticed skills and luck, distances of interior landscape.

  • Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so? Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?

  • Boys," said Hermione Granger, "should not be allowed to love girls without asking them first! This is true in a number of ways and especially when it comes to gluing people to the ceiling!

  • Okay, so either (a) I just teleported somewhere else entirely (b) they can fold space like no one's business or (c) they are simply ignoring all the rules.

  • Every time someone cries out in prayer and I can't answer, I feel guilty about not being God. - That doesn't sound good. - I understand that I have a problem, and I know what I need to do to solve it, all right? I'm working on it.

  • The strength of a theory is not what it allows, but what it prohibits; if you can invent an equally persuasive explanation for any outcome, you have zero knowledge.

  • This is one of the primary mechanisms whereby, if a fool says the sun is shining, we do not correctly discard this as irrelevant nonevidence, but rather find ourselves impelled to say that it must be dark outside.

  • If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool.

  • If I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.

  • That which the truth nourishes should thrive.

  • I'm wondering if there's a spell to make lightning flash in the background whenever I make an ominous resolution.

  • After all, if you had the complete decision process, you could run it as an AI, and I'd be coding it up right now.

  • There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model.

  • Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.

  • Singularitarians are the munchkins of the real world. We just ignore all the usual dungeons and head straight for the cycle of infinite wish spells.

  • What people really believe doesn't feel like a BELIEF, it feels like the way the world IS.

  • ...there's something in science like the shine of the Patronus Charm, driving back all sorts of darkness and madness...
