Genetic engineering has improved over the decades, but for years the process was expensive, time-consuming, and inexact. That all changed in 2012, when a paper in Science unveiled the revolutionary gene-editing system known as Crispr-Cas9. Described as a “game changer,” this new technology offers precision genome-editing capabilities that are cheap, easy, and fast. Laboratories around the world have rushed to embrace it. But like any new technology, this one has also raised ethical and security concerns. What does it mean for human society, or for the planet in general, to introduce potentially permanent changes to Homo sapiens or any other population? At a recent meeting in Washington, an international gathering of scientists went so far as to call for a halt to the use of Crispr-Cas9 on the human genome until more is known about the potential risks. Jennifer Doudna, the UC Berkeley scientist who co-discovered the technique, has done the same.

This is not the first time scientists have worried about genetic engineering. In the 1970s, scientists and policy makers developed national and local entities to oversee the safety of experiments involving genetic engineering (namely, the Recombinant DNA Advisory Committee at the National Institutes of Health and the network of Institutional Biosafety Committees at individual institutions). These oversight entities focus on biosafety and biocontainment, but not on the ethics of the experiments—i.e., whether or not they should be done at all. So far, scientists have been the primary voices in these discussions, which is akin to having the hammers watch the nails.

Indeed, despite the important work bioethics has done over the years, there is a gaping hole in the study and oversight of life-sciences research, especially when it comes to asking whether ethical or security risks outweigh potential benefits. This area is ripe for research and discussion and should grow into an interdisciplinary academic discipline of its own. In addition to the sciences, the humanities have much to contribute and should be an integral part of the deliberations, helping policy makers understand what it means to be human in a rapidly changing world.

There have been some efforts to include humanists in the conversation. For example, the global “Beings 2015” summit, held in Atlanta in May, discussed the ethics of biotechnology. Margaret Atwood, a science fiction author who has written about dystopian futures, and Ruha Benjamin, a Princeton University professor who writes about the tension between scientific innovation and equity, were among the speakers. Humanists provide unique perspectives on the potential impacts of scientific innovations, both good and bad, on human societies and on the planet’s ecosystems. If the National Institutes of Health were to develop oversight panels to evaluate whether research proposals are ethical, those panels should include humanists with training in both basic science and bioethics. Of course, such oversight panels should be established in every country conducting this kind of research.

In the meantime, technology continues its inexorable charge ahead. Scientists have already used Crispr-Cas9 to edit the genomes of a variety of organisms, including bacteria, yeast, crops, research mice, livestock, and human cells, with the goal of improving food, developing new therapeutics, and eliminating diseases. These alterations are expected to benefit humanity, of course, but there is no reason to rule out more malevolent uses, not to mention unintended consequences. Crispr-Cas9 has the potential to genetically alter entire populations of humans or animals: such changes can “speed through a population exponentially faster than normal,” as a recent Nature article put it. “Not since J. Robert Oppenheimer realized that the atomic bomb he built to protect the world might actually destroy it,” remarked a recent New Yorker article about Crispr, “have the scientists responsible for a discovery been so leery of using it.”

The reasons should be obvious.

Chinese scientists have already used the new technology to edit the genomes of human embryos, raising the specter of genetically altered humans and the genetically altered progeny they’d produce. Even the potential for such uses is enough to conjure up chilling sci-fi scenarios that now seem to have an eerie echo in science fact. In his 1932 novel Brave New World, Aldous Huxley warned of a future in which society-wide genetic planning has created a new caste system, with lesser groups engineered for low intelligence and satisfaction with dismal lives. In the 1997 film Gattaca, society is split between the genetically enhanced and the old-fashioned humans. The unenhanced, not surprisingly, face lives of discrimination and menial labor. It is easy to imagine how a society could slide down this path. Parents rich enough would certainly want their children to be born with enhanced intelligence, physical beauty and prowess, and reduced risk of disease. The poor would inevitably be discriminated against and forced to fill undesirable jobs. This sad situation already exists to some extent, but one could imagine it becoming exponentially worse once genetic enhancements become the norm. If, as some think, inequality creates violent clashes and outbursts, how will human civilization cope with yet another social division?

Advances in science are moving fast, and a future of genetically altered organisms, including humans, is becoming a very real possibility. Experts other than scientists must engage with these issues as a reality of the near future, if not the present. It’s no use hoping the technology will monitor itself or remain the stuff of science fiction. Even if some governments prohibit certain areas of research, that won’t stop wealthy individuals or corporations, not to mention less scrupulous governments or militaries, from funding experiments of their own. And with the world ever more interconnected, changes to one population could easily affect others. Crispr-Cas9 has the potential to change the world as we know it, which is why we need to oversee this technology with extreme caution.