It’s not always easy to know when we’re in the presence of “genius.” In part, that’s because we barely agree on what it means. In Roman times, genius was not something you achieved but rather an animating spirit that attached itself to people and places. In the 18th century, Romantics gave genius its modern meaning: someone with special, almost divine abilities. Today, we’re quick to anoint a “marketing genius” or a “political genius,” oblivious to the fact that true genius requires no such modifier. In truth, real geniuses transcend the confines of their particular domains. They inspire and awe. Which is precisely why we should use the word sparingly, lest it lose some of its magic. That’s not the only misconception. Here are some others.

Myth No. 1

Genius is mostly about genetics.

In 1869, a British polymath named Francis Galton published a popular book called “Hereditary Genius.” As the title suggests, Galton argued that genius is determined by genetics, or what he called “inheritance.” That idea stuck. “Genes appear to have a big role in our intelligence and talents,” one website declares. Others breathlessly report that scientists have uncovered a gene that makes some people brilliant.

The truth, though, is that genius is not transmitted genetically like blue eyes or baldness. Genius parents don’t beget genius babies, and there’s no “genius gene.” Genetics is part of the mix, but only part. “Much of the literature concludes that hereditary factors play a minor role at best in the determination of creativity,” University of Minnesota psychologist Niels Waller wrote in Psychological Inquiry.

There are other factors, too. One is hard work — the “drudge theory” of genius. Others suggest that attitude matters as well. A study of young musicians found that it was not the number of practice hours students racked up that determined their success but rather their “long-term commitment.” In other words, genius requires a certain mind-set, an unflappable persistence.

Myth No. 2

Geniuses are smarter than the rest of us.

This myth is baked right into the Merriam-Webster entry, which defines a genius as “a very smart or talented person: a person who has a level of talent or intelligence that is very rare or remarkable.”

But many of history’s most eminent figures possessed only modest IQs. William Shockley, co-inventor of the transistor and a Nobel laureate, had an IQ of about 125 — respectable but hardly spectacular. The great physicist Richard Feynman also scored 125 — hardly what you’d expect from the subject of a biography titled “Genius.”

Genius, particularly creative genius, is less about raw intelligence and more about elevated vision. A creative genius, says artificial-intelligence expert Margaret Boden, is someone with “the ability to come up with ideas that are new, surprising and valuable.” Yes, some intelligence is required to do that, but beyond a certain point — an IQ of, say, 120 — greater intelligence yields fewer measurable gains in creativity, many psychologists believe.

Nor is genius necessarily about encyclopedic knowledge or impressive education. The share of Americans who completed at least four years of college jumped from about 6 percent in 1950 to 32 percent in 2014, yet we have not seen a commensurate increase in creative output. In fact, many geniuses either dropped out of college or, like the renowned British scientist Michael Faraday, never attended. Albert Einstein was a famously mediocre student. During his annus mirabilis, in 1905, when he published four papers that rocked the foundation of physics, his overall knowledge of physics was eclipsed by that of others working in the field. Einstein’s genius rested not with amassed knowledge but, rather, with his ability to make leaps of understanding that others couldn’t. Einstein wasn’t a know-it-all. He was a see-it-all.

Myth No. 3

Geniuses can pop up anywhere and at any time.

We tend to think of geniuses as the intellectual equivalent of shooting stars, beautiful to behold but essentially random. The Atlantic’s CityLab analyzed the birthplaces of MacArthur “genius” grantees and found that “winners were born all over the map.”

In fact, if you plot the appearance of genius over time and across the globe, you notice an interesting pattern. Geniuses do not appear randomly — one in Bolivia, another in Brooklyn — but, rather, in groupings. Genius clusters. Certain places, at certain times, produce a mother lode of brilliant minds and good ideas. Think of ancient Athens or Renaissance Florence or Paris in the 1920s — or, arguably, Silicon Valley today. These places were, in some ways, quite different, but they also shared certain characteristics. For starters, almost all were cities. The density and intimacy of an urban setting nurture creativity. All of these places, too, possessed an outsize degree of tolerance and “openness to experience,” the trait that psychologists have identified as the single most important for creativity. As Plato said, “What is honored in a country will be cultivated there.” Geniuses are less like shooting stars and more like flowers, a natural outcome of a creative ecology.

Myth No. 4

Geniuses are grumpy loners.

Pop culture is full of brilliant characters who fit this description. In “Searching for Bobby Fischer,” the main character is taught to play chess by genius misanthrope Bruce Pandolfini. The brainy William Forrester lives like a recluse for much of “Finding Forrester.” The Christian Science Monitor writes that “our culture loves the myth of the tortured, solitary genius — the man scribbling or painting or composing in a threadbare European garret.”

It is true that geniuses (especially writers and artists) are more likely to suffer from mental illness, particularly depression, compared with the population at large. But they are rarely loners. They seek out kindred spirits who can, at the very least, reassure them that they are not going crazy. Thus, the advent of the genius support group. Freud had his Wednesday Circle, Einstein the Olympia Academy. The French Impressionists held weekly meetings, outdoor painting sessions and other informal gatherings, all aimed at bolstering their spirits in the face of the regular rejection they received at the hands of the old guard. One study, by psychologist Dean Simonton of the University of California at Davis, examined the interpersonal relations of some 2,000 scientists. The more eminent the scientist, Simonton found, the more interactions he or she had with other eminent scientists.

Geniuses do cherish times of solitude, and they often toggle between those moments and more sociable ones. David Hume, the Scottish philosopher, would spend weeks holed up in his study, reading and pondering, but then he would emerge and head straight to the local pub, “absolutely and necessarily determin’d to live, and talk, and act like other people in the common affairs of life.” Conversely, Beethoven would regularly escape bustling Vienna for long, solitary walks in the verdant Wienerwald, where he found musical inspiration.

Myth No. 5

We’re smarter now than ever.

College attendance rates and IQ scores are higher than ever, leading many to conclude that we’re living in the golden age of genius. Despite grumbling from “geezers,” a Berkeley sociologist writes, “Americans have been getting more ‘intelligent’ over the generations.” And the Telegraph reports that “humans have been getting steadily more intelligent for at least 100 years.” This misconception is so popular that it even has its own name, the “Flynn effect.”

Don’t bet on it. Admittedly, comparing creative output across the centuries is tricky. We need the perspective of time. People of every era believe that theirs is golden. We are no exception. Sure, we’ve seen tremendous advances in digital technology and the emergence of possible geniuses such as Steve Jobs and Elon Musk. But the jury is out on our goldenness. In the sciences, momentous leaps, such as Darwin’s theory of evolution or Einstein’s general theory of relativity, have been replaced by impressive but incremental advances — important, yes, but nothing that alters our understanding of the universe and our place in it.

Over the past 70 years, the scientific community has published exponentially more research papers, “yet the rate at which truly creative work emerges has remained relatively constant,” historian J. Rogers Hollingsworth writes in the journal Nature. We are producing a greater number of competent scientists, talented ones even, but not necessarily more geniuses.

We are also producing an unprecedented amount of data, but that is not to be confused with creative genius. After all, if genius were simply a function of the amount of data at your fingertips, then every smartphone owner would be another Einstein. In fact, there’s some evidence that the flood of gigabytes that washes over us every day may be hindering genuine breakthroughs. At least one study found that it’s more difficult for us to detect patterns when bombarded with an excess of data. This is troubling, for if there is anything that distinguishes genius, it is the ability to look at what everyone else is looking at — and see something different.

Twitter: @Eric_Weiner

Five myths is a weekly feature challenging everything you think you know.