After passing through the outer and middle ear, sound waves enter a fluid-filled, spiral-shaped chamber. This is the cochlea. Thirty thousand tiny hair cells line a membrane that runs through the center of the chamber—miniature machines that convert sound into electrical signals, which then travel along the auditory nerve and into the brain. For the vast majority of deaf and hard-of-hearing people, these hair cells are damaged or missing.

For children born profoundly deaf, or who lose hearing before they can talk, the prospects of effective speech are virtually nonexistent. Lip-reading, too, is daunting: If you have never heard, you have no mental sound to link with lip movements. Even if that link could be made, around a third of spoken English is indistinguishable on the mouth and lips. (Look in a mirror and try to tell the difference between words like “pat” and “bat,” or “to” and “do.”)

Sign languages lack a written form, so much of deaf history has been lost to time. When the history was recorded, the voices of the deaf were rarely included; mostly the accounts come from philosophers, poets, and artists who have interacted with deaf people and, as one Deaf studies expert put it, mused about “this alternative way of being and thinking in the world.” Aristotle believed that “hearing is the sense of sound, and sound the vehicle of thought; hence the blind are more intelligent than deaf-mutes.” References to the deaf in the Old Testament suggest a sympathy towards the non-hearing, but the same cannot be said for early Christians. Paul’s Epistle to the Romans reads: “Faith cometh by hearing, and hearing by the word of God.” A deaf person, under such a view, could not have faith.

Charles-Michel de l’Épée

It’s not until the 18th century, and the advent of schools for the deaf, that deaf voices become more widely heard. The most famous of these early schools was started in France by Charles-Michel de l’Épée, a hearing abbot who chanced upon deaf sisters and learned enough rudimentary sign language to communicate with them. The deaf have come to revere the abbot, and until recently recounted his tale in epic terms. Carol Padden, a linguist at the University of California San Diego, relates one version she heard: “The Abbé de l’Épée had been walking for a long time through a dark night. He wanted to stop and rest overnight, but he could not find a place to stay, until at a distance he saw a house with a light.” De l’Épée found a house with two young women, who did not speak. Only after their mother appeared and told de l’Épée that they were deaf did he understand. His life’s purpose—to help deaf people—became clear. “The story of the Abbé de l’Épée is almost always a preface to an official deaf-club event,” Padden writes.

The story might be apocryphal, but reverence for de l’Épée is well placed because the Parisian school he started catalyzed deaf education. In 1817, Laurent Clerc, one of the school’s most prominent students, and Thomas Hopkins Gallaudet, a hearing American scholar with an interest in the deaf, founded what became the American School for the Deaf in Hartford, Connecticut. It would become the birthplace of modern-day American Sign Language. Deaf schools later proliferated, each with its own quirks, customs, and dialects. Much as hearing people say where they’re from when meeting a new person, Deaf individuals began including where they went to school.

Yet acceptance of sign was far from widespread. Educators at the Clarke Schools for the Deaf in Northampton, Massachusetts, banned sign and attempted to teach students to speak. They took their cue from Alexander Graham Bell, the man famous for inventing the telephone and infamous, at least in the Deaf community, for his abhorrence of sign. Bell believed that sign prevented deaf people from integrating into society, an opinion drawn from personal experience: his wife was deaf and his mother lost her hearing when he was a child. If deaf children found it hard to communicate with one another, went Bell’s thinking, they’d be less likely to pair up, and less likely to produce more deaf children.

The most important battle in the war between the two approaches—oralist and signing—was fought in 1880, when deaf educators from across the globe convened in Milan, Italy, for the Second International Congress on Education of the Deaf. There was only a single deaf delegate, and most attendees fell squarely in the oralist camp. They agreed upon “the incontestable superiority of speech over signs in restoring deaf-mutes to social life” and voted to use the oral method. Today, Deaf people see the conference as the bleakest moment in their history, an event that closed deaf schools around the world and drove sign languages underground.

The Dark Age of deafness didn’t wane until the 1970s, when William Stokoe, a teacher at Gallaudet, debunked the notion that ASL was nothing more than pidgin English. Others would later show that ASL was distinct from spoken English, with its own grammar and syntax. People started talking of Deaf culture, the D capitalized to distinguish the culture from the condition. And that culture began, slowly, to diffuse into the hearing world. The National Theatre of the Deaf took its shows on the road to help de-stigmatize sign. The deaf actress Linda Bove got her start there, before going to Sesame Street, where she introduced two generations of children to sign language and the idea that being deaf was something to celebrate.

For many adults, the introduction came from the 1986 movie Children of a Lesser God, in which deaf actress Marlee Matlin plays a former Deaf-school student working as a custodian at her alma mater. She falls for a hearing speech teacher, played by William Hurt, who tries unsuccessfully to get her to speak instead of sign. Eventually, Hurt is forced to accept Matlin for who she is: a beautiful and proud Deaf woman. The movie earned Matlin the Best Actress Oscar and made millions aware of the Deaf cause. “The subject matter is new and challenging, and I was interested in everything the movie had to tell me about deafness,” wrote critic Roger Ebert. (Sensitivities still ran high. Many Deaf people excoriated Matlin, who has some residual hearing, for speaking when she returned to the Oscars to present an award.)

The irony of this period is that it contains not just the flowering of the Deaf civil rights movement, which was based on the idea that deaf people didn’t need a cure, but also the emergence of something approximating just that—a cure. In 1984, the U.S. Food and Drug Administration approved cochlear implants. The devices come in two parts. A hearing-aid-shaped computer, worn on the ear, digitizes incoming sound. The digitized signal is then sent, by radio, to electrodes implanted in the cochlea, which stimulate the auditory nerve. At first, the FDA approved implants for adults only. But some argued that if children received implants just before they would normally begin speaking, the hearing parts of their brains might develop along the same trajectory as other children’s. They might learn to hear and speak.
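The basic idea behind the processor (digitize sound, split it into frequency bands, and drive electrodes along the cochlea according to the energy in each band) can be sketched in miniature. What follows is a toy illustration under loose assumptions of my own (eight channels, a 200–3,800 Hz range, a crude single-frequency correlation standing in for real band-pass filters), not the signal-processing strategy any actual implant uses:

```python
import math

def toy_band_levels(signal, sample_rate, n_electrodes=8):
    """Toy illustration only, not a real implant algorithm. Estimates the
    energy near a handful of band-center frequencies and returns one
    normalized "stimulation level" per hypothetical electrode, ordered
    low to high, mimicking a filter bank feeding electrodes that sit at
    different positions along the cochlea."""
    lo, hi = 200.0, 3800.0  # rough speech range (an assumed choice)
    edges = [lo * (hi / lo) ** (i / n_electrodes)
             for i in range(n_electrodes + 1)]
    levels = []
    for i in range(n_electrodes):
        center = math.sqrt(edges[i] * edges[i + 1])  # geometric band center
        # Correlate the signal with a sinusoid at the band center:
        # a crude single-bin DFT in place of a real band-pass filter.
        re = im = 0.0
        for k, s in enumerate(signal):
            angle = 2 * math.pi * center * k / sample_rate
            re += s * math.cos(angle)
            im += s * math.sin(angle)
        levels.append(math.hypot(re, im))
    peak = max(levels) or 1.0
    return [lvl / peak for lvl in levels]

# A pure tone at one band's center should light up mainly that electrode.
sr, n = 8192, 256
lo, hi, bands = 200.0, 3800.0, 8
center4 = math.sqrt((lo * (hi / lo) ** (4 / bands)) *
                    (lo * (hi / lo) ** (5 / bands)))
tone = [math.sin(2 * math.pi * center4 * k / sr) for k in range(n)]
levels = toy_band_levels(tone, sr)
```

Run on a pure tone near 1 kHz, the sketch yields a list of eight levels with a single dominant channel: the one whose band contains the tone. Real processors are far more sophisticated, but the mapping from "energy in a band" to "current on an electrode" is the core idea.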

To be clear: Cochlear implants don’t fix the biology of the ear. They’re a prosthesis, a workaround. People with implants cannot hear like non-deaf people. Users typically remove the processor to sleep and shower, and the batteries need to be replaced. When the processors are out or the batteries have run down, users are once again deaf.