Thursday, April 20, 2017

Our Best Minds

Why don't people like scientists? Because they never bring you any good news. The polar ice caps are melting. You can get cancer from just about everything. There is no such thing as Bigfoot. Okay, that last one may turn out to be good news if you happen to harbor fears of being attacked by Sasquatch, but mostly it really isn't what we want to hear. Please don't call us until you find the secret to molecular transport or a cure for the common cold. 
Did they listen? No. They're scientists. They're far too busy discovering things and blabbing about them as if we could all use this worrisome knowledge. Take the group out of Princeton that recently published a study in Science magazine: they found that if people teach machines how to be human, the machines end up acting like humans. Pretty cool so far, right? This Artificial Intelligence takes big batches of words and analyzes how they are used in all kinds of places, including Al Gore's Internet. The AI then sets about making connections between words and phrases and assigning meanings to them that help it generate "human" responses. A simple example? “Flower” is more likely to be associated with “pleasant” than “weapon” is. That, of course, makes perfect sense. What makes less sense is that the trained AI also had a habit of associating typically Caucasian-sounding names with things it considered “pleasant” far more often than African-American names. The AI also shied away from pairing female pronouns with mathematics, instead often associating them with artistic terms. 
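For the curious, the word-association idea can be sketched in a few lines of code. The three-number vectors below are made up purely for illustration (real systems learn vectors with hundreds of dimensions from enormous piles of text), but the comparison is the same kind of similarity arithmetic that reveals which words an AI treats as "close":

```python
import math

# Toy word vectors, invented for this illustration only. In a real
# system these numbers would be learned automatically from text.
vectors = {
    "flower":   [0.9, 0.8, 0.1],
    "weapon":   [0.1, 0.2, 0.9],
    "pleasant": [0.8, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity: closer to 1.0 means the words are more associated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

flower_score = cosine(vectors["flower"], vectors["pleasant"])
weapon_score = cosine(vectors["weapon"], vectors["pleasant"])
print(f"flower~pleasant: {flower_score:.2f}")  # comes out high
print(f"weapon~pleasant: {weapon_score:.2f}")  # comes out lower
```

With these made-up numbers, "flower" lands much closer to "pleasant" than "weapon" does, just as the article describes. Run the very same arithmetic on vectors learned from real human writing, swap in people's names instead of flowers, and out come the uncomfortable associations the study reported.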
Whoops. The AI seems to be racist and sexist. This opens a whole case of worm cans, all of which threaten to wriggle out onto the floor and make the future as big a mess as our present. Or worse. The good news? This study may go a long way toward explaining the temperament of Arnold Schwarzenegger. Maybe we should send some scientists over to his house for a couple of weeks to keep an eye on him. Maybe this will help them with some of those other worries they have us going on about. Or maybe they just discovered this to make us fret some more. Maybe Skynet is already self-aware. 
