Eliezer Yudkowsky (via abundance-mine)
Said the guy who literally believes he’s smarter than the rest of humanity and wants to build an AI that will take over not just the world, but the *universe*, and fill it with whatever One True Way all of humanity *really* wants (which it will discover).
And thinks there’s a good case to be made that children under two are not “sentient,” but that this is too much of a shock to our morals and values, and that we are not rational enough to question those unless we’re a rare scintillating jewel of an intellect.
And thinks that in a morally more-enlightened world than our own, people might think of rape as basically just a pleasant surprise.
And is a big fan of evolutionary psychology as an explanation for why men and women are literal aliens to each other, and for racial IQ differences and “achievement gaps”, and has said outright he thinks that rich people really are smarter, more engaged and more alive than others (and that he really hopes to fix that because gosh isn’t it unfair).
And has routinely boosted his own “Machine Intelligence Research Institute” (formerly the “Singularity Institute,” until someone there realized how that made them all sound) as the single most ethical destination for all charitable donations.
And…you know what, I don’t even want to go *on* here. XD But this sort of quote that sounds really profound and insightful (and clashes horribly with the rest of what he says/does) is sort of his bag, I guess…
I was unfamiliar with the source, but that sounds…special. :/
Yeah, it’s pretty horrible. I know someone who fell in with his cult of personality, and it’s changed them beyond recognition. They fell in with him after researching how best to invest their money… and, of course, concluding that the Singularity Institute was the best possible way to do it. They seriously believe they’re saving the world. They’ll probably be insulted if they read this and find that I no longer believe them, not even remotely, after doing my own research.
Another fun thing his group has done is attempt to convince people that real actual disasters facing the planet right now are not actually important, compared to the possibility of the Singularity. So everyone should forget about climate change and feeding the world and all this other important stuff, and instead focus on making sure that a super-intelligent computer won’t be able to do anything bad to us. Also? Those problems won’t be problems once the Singularity comes. Because the Singularity is essentially so advanced as to be magic, and will be able to manipulate people into doing whatever it wants, and even create elements out of nowhere so that we will no longer have shortages of anything, and everything will be fine.
Honestly I think part of the reason that people flock to groups like this is that the state of the environment is horrible, we’re facing catastrophic events in the next couple hundred years, even possible extinction… and it’s better to think that the real threat is a computer that will probably never exist — and to also believe that this computer will be their savior if they can get it to be on their side instead of destroying humanity or something.
And the reason this doesn’t make sense to anyone with common sense? Is apparently because we’re not logical enough. Human brains, you see, weren’t built for understanding a situation like the Singularity, so we dismiss it out of hand. Instead, we need to learn to think Logically And Rationally, and then the Singularity will make sense as this horrible threat, perhaps the worst threat facing the world today, and we will want to pour all our time, money, and energy into the Singularity Institute.
The last conversation I had with this person was harrowing and felt like having my mind shredded. So does reading their website. There’s, for lack of a better word (and they’d laugh at me), a really fucked-up feeling, almost an ‘energy’, around this, that I don’t trust and actually fear quite a bit. It’s intense, it’s deliberately constructed, and it’s as foul as foul can get. I badly miss the person in question, but I can’t talk to them as long as they still have traces of that feeling attached to them. It feels like my mind is being shredded by cheese wire and I have to stay away from things that make me feel like that.
Please avoid getting sucked in. It looks laughable, but it’s also quite sinister. Be aware, also, that they have a lot of PR aimed at getting upper-middle-class and wealthy geeks to give them a lot of money, time, and energy, and that they often succeed. Don’t be sucked in. Please. Even if you can’t see the kind of mind-patterns I can see, this is bad news, it’s far more than something mildly disturbing to laugh at.