Quote of the day from Eliezer Yudkowsky:
Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.
Though I have friends aplenty in academia, I don't operate within the academic system myself.
A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.
I wouldn't be surprised if tomorrow was the Final Dawn, the last sunrise before the Earth and Sun are reshaped into computing elements.
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.
Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.
My parents were early adopters, and I've been online since a rather young age. You should regard anything from 2001 or earlier as having been written by a different person who also happens to be named 'Eliezer Yudkowsky.' I do not share his opinions.
In our skulls, we carry around 3 pounds of slimy, wet, greyish tissue, corrugated like crumpled toilet paper. You wouldn't think, to look at the unappetizing lump, that it was some of the most powerful stuff in the known universe.
I want to carry in my heart forever the key word of the Olympics - 'passion.'
If I could create a world where people lived forever, or at the very least a few billion years, I would do so. I don't think humanity will always be stuck in the awkward stage we now occupy, when we are smart enough to create enormous problems for ourselves, but not quite smart enough to solve them.
Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.
Transhumanists are not fond of death. We would stop it if we could. To this end, we support research that holds out hope of a future in which humanity has defeated death.
The media thinks that only the cutting edge of science, the very latest controversies, are worth reporting on. How often do you see headlines like 'General Relativity still governing planetary orbits' or 'Phlogiston theory remains false'? By the time anything is solid science, it is no longer a breaking headline.
The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.
To be clever in argument is not rationality but rationalization.
There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
The obvious choice isn't always the best choice, but sometimes, by golly, it is. I don't stop looking as soon as I find an obvious answer, but if I go on looking, and the obvious-seeming answer still seems obvious, I don't feel guilty about keeping it.