This story appeared in the September/October 2020 issue of Discover magazine. We hope you’ll subscribe to Discover and help support science journalism at a time when it’s needed the most.
1. You were learning before you were born: By eavesdropping on their mothers while in the womb, babies pick up the sound patterns of their native tongue.
2. After birth, infants recognize these rhythms and are more attentive to them than the sounds of other languages, prepping for eventual fluency.
3. But infants’ first postnatal teachers are their mothers’ faces: Babies can distinguish their mothers from other women within hours of birth, and discern emotions within days.
4. That ability helps junior figure out whether a new toy is safe or a stranger is to be trusted.
5. This focus on mom has a cost, though. As a child becomes more familiar with their mother’s face, faces unlike hers become harder to tell apart, especially those belonging to people of other races — an effect best countered by interracial interactions at a young age.
6. Ever learned something by rote? Before the printing press was invented, books were so rare that scholars committed vast tracts to memory, keeping passages in order by associating them with rooms in a house that they could imagine visiting during recitation.
7. The connection between verbal communication and learning runs deep. To figure out how early hominins taught each other to make stone implements, a University of California, Berkeley, psychologist and a University of Liverpool archaeologist had students instruct their classmates in flint toolmaking.
8. They found that, in comparison with verbal explanation, silent imitation just didn’t cut it.
9. Which is not to say that proto-humans were using fancy turns of phrase 2.5 million years ago. But gene-culture co-evolution theory holds that language may have evolved because of the advantages of verbalization during toolmaking instruction.
10. Tools are a key part of wild chimpanzee learning as well: Mothers share specially fashioned twigs with their young to teach them how to scoop up delicious termites.
11. Many animals are good learners — some even better than people. In 2014, Katholieke Universiteit Leuven neurobiologists taught rats and humans to identify complex patterns. When the researchers then modified the designs’ spacing and orientation, the rats recognized the patterns better than the humans did.
12. Still, rats can’t compete with the Amazing Learned Pig, which became the talk of London in 1785 by performing arithmetic and spelling people’s names.
13. The Learned Pig interpreted its trainer’s body language in order to “answer” his questions. This phenomenon, known as unconscious cueing, continues to complicate legitimate research on animal cognition.
14. Microsoft dealt with a different type of learned pig in 2016, when the company set up a Twitter chatbot that learned to make conversation based on what people told it. The internet being the internet, the bot turned foul-mouthed overnight.
15. AI’s vulnerability to bias based on its training materials is especially troubling when one considers that computers now often decide whether you get a job or a loan.
16. But, originally, machine learning was all fun and games: Arthur Samuel launched the field at IBM in the ’50s by programming a computer to replay the games in Lee’s Guide to the Game of Checkers.
17. The machine taught itself enough strategy to beat human players. About half a century later, another IBM machine bested humans at their own game by becoming a Jeopardy! champ.
18. Named Watson after IBM’s founding chairman, the computer prepped by cramming source materials ranging from The New York Times to the World Book Encyclopedia.
19. Executives proudly announced that they’d next give Watson medical training by having it read all the latest journals.
20. However, although machines never forget a fact and are skilled at connecting the dots, good MDs must read between the lines, applying journal findings in ways the authors didn’t intend or explain. Dr. Watson is unlikely to see you anytime soon.
Jonathon Keats is a contributing editor to Discover. His most recent book is You Belong to the Universe: Buckminster Fuller and the Future.