This week, we’re examining women’s pioneering role in computer programming. To learn more about tech history made in New York, check out our upcoming exhibit Silicon City, opening Friday, November 13. The show traces the story from the early inventions that paved the way for the computer—think electricity, the telegraph—to the Big Apple’s contemporary role as a burgeoning epicenter for cutting-edge tech firms.

Who was the world’s first computer scientist? It’s not who you think. Here’s a clue: this person was not a bespectacled man. In fact, she was a countess who described her own approach to mathematics as “poetic.” A fitting description, considering that she, Ada Lovelace, was the daughter of the famed Romantic poet Lord Byron. Ada’s mother vowed her daughter would be a mathematician, not a poet. Despite her mother’s best efforts, Ada found the right brain in the left. As a teenager, Lovelace met the Cambridge mathematician Charles Babbage at a London salon, where he showed her plans for a machine he believed capable of advanced mathematical calculations unfathomable to Victorian Britain. The machine was never built; nonetheless, Lovelace wrote about it in a scholarly journal, and her theoretical prose transcended the machine itself. She envisioned a future device that could perform any and all calculations. Nearly a century before it was invented, Lovelace foretold the computer.

Before social media, before “brogrammers,” before PCs, computer programming was women’s work. Today, their defining contributions to the field have been largely forgotten. More than half a century after they pioneered the field, women can scarcely be found in it. At one of the world’s leading tech firms (you guessed it: Google), women make up a mere 17 percent of the technical staff. And believe it or not, those numbers are high. So how did this happen?

One of the world’s first fully electronic general-purpose computers, the Electronic Numerical Integrator And Computer (ENIAC), was built during World War II. Its mission: calculate artillery firing tables for the U.S. Army’s Ballistic Research Laboratory. Among the first programs run on ENIAC was a calculation testing the feasibility of the hydrogen bomb. Operated by a team of six women who are considered the first modern computer programmers, the machine was unveiled to the public in 1946, after the war. At the time, creating software was likened to clerical work, and thus deemed well suited to women, whereas hardware work, physically building computers and their computational mechanisms, was seen as masculine.

But then came UNIVAC and Grace Hopper. The UNIVAC was the first commercial computer, and Hopper, who had joined forces with several of the “ENIAC girls,” was among its first programmers. A mathematics professor and member of the Naval Reserve, Hopper revolutionized programming through her work on COBOL, a language that used words, not numbers; suddenly computers could be given ever more complex commands. And nearly six decades later, the language is still in use today.

During the 1960s, it became clear that software, not hardware, would be the future of the field. While hardware was, and remains, important in keeping machines running, software controlled the devices at an executive level. As computers grew in importance, men began entering the field, though women continued to play a central role in it. Young women were readily encouraged to enter computer science; some estimates suggest 50 percent of programmers were women. “It’s just like planning a dinner,” Grace Hopper told Cosmopolitan magazine in 1967 in an article titled “The Computer Girls.” Since the work was considered “feminine,” it is no surprise that it was likened to domestic tasks. But the Cosmopolitan article also cuts through the condescending comparison, declaring that while the glass ceiling remained firmly intact in other fields, women could readily rise to the top in computer science, a rare opportunity for the “fairer sex.”

By the 1980s, however, more “masculine” descriptions of computer science had replaced Hopper’s dinner-party parallel; the narrative had changed. It was during this decade that computers were first marketed to home consumers on a mass scale, and who was the target audience? Men and boys. Parents began buying computers for their sons, not their daughters. Boys’ early exposure gave them an advantage: they arrived at college with an advanced knowledge of computers and how to program them.

Among the first personal computers was IBM’s 5150. These devices were essentially toys, capable of a few simple games and rudimentary word processing. Through relentless campaigns, advertising firms gave the computer a makeover. Over time, it was transformed into a “masculine” machine, fit only for the minds of logically minded men. Hollywood followed suit, inventing the nerdy, antisocial boy genius whose image became synonymous with programming. Technology companies began employing exclusionary measures to keep women out: they established exclusive professional organizations and developed personality tests that cast men as the “programming type,” fueling discriminatory hiring practices. With the ideal of the male programmer reinforced across multiple platforms, it became the norm.

In 2015, nearly 70 years after women pioneered the field, their history and their contributions to computer science are largely forgotten. By the 1980s, the absence of women had become an integral part of the industry’s culture, and it still is today. In recent years, however, there has been a growing awareness of this gender disparity and a newfound effort to bridge it. The first step in conquering contemporary inequality is to understand its roots: to bring women into computer science, we must first understand how and why they were cast out. We must prevent history from repeating itself. Learn more about this history, and the women and men who made it, in Silicon City.