It’s 2039, and the Jacksonville Jaguars are fresh off their win in Super Bowl LXXIII, their third straight championship victory. The team’s genetic consultants are meeting to go over the genotyping data of the management’s favorite draft picks. The team is looking to add an outside linebacker to their defense, but their favorite pick has a worrisome genetic variant that suggests he’ll be slow to recover from joint injuries. A promising running back, who could be part of a trade deal, has a rare combination of DNA mutations that appear to have a protective effect against concussions, but there is some evidence that players with these mutations are more likely to be extreme risk-takers. The consultants briefly glance at the data on fast-twitch muscle variants; this information isn’t so useful with the players’ college performance record already in hand, but the genetic team likes to check for any surprising results.
Could this really happen? Will genetic testing become a routine part of our society’s decision-making, influencing professional sports, the military, parole boards, political campaigns, and our own health, education, and career choices?
You could be forgiven for believing that we’re almost there. The National Institutes of Health recently funded BabySeq, a project to sequence the genomes of several hundred newborns and determine the utility of having complete genetic information from birth. Companies are now offering a bewildering array of genetic tests directly to consumers, tests that go beyond estimating your genetic risk for devastating diseases like Alzheimer’s or breast cancer. These companies promise to tell you how much coffee you can safely drink, if you’ll go bald, how hard it will be to quit smoking, whether you should go for endurance sports, what kind of earwax you have, and information about hundreds of other traits. Many genetic tests now cost less than a new iPhone, and Harvard geneticist George Church recently complained that too few of us are taking advantage of the information that genetics has to offer. The personal genetics industry seems poised to become a big part of our lives. But how trustworthy is this information?
Before answering that question, let’s step back and examine what it means to say that your DNA predicts something about you.
THE GOAL OF THE research behind genetic tests is not to figure out the detailed function of every gene. Instead, the goal is to find those parts of our DNA that make us different from each other. If you pick two individuals and compare their genomes, you’ll find millions of places, out of 3.2 billion, where their DNA isn’t the same. These differences include small “misspellings,” where one letter of DNA’s chemical alphabet has been substituted for another, as well as places where stretches of the DNA text have been duplicated or deleted, or where new material has been added. Researchers have cataloged close to 100 million of these mutations, or, as scientists call them, genetic variants, in humans around the world.
The recent explosive progress in DNA analysis technology has made the process of identifying genetic variants almost trivial, and much less expensive than it was even five years ago. The hard part is figuring out which genetic differences matter. Large studies involving thousands of human subjects are generally required to get the statistical power necessary to link a genetic variant to a particular physical trait. These studies work by asking if the presence of the variant correlates with the presence of the trait, and in most instances, scientists have no idea why a genetic variant has the effect it does. Over the past decade, biomedical researchers have committed major resources to finding meaningful genetic variants, and thousands of studies reporting links between variants and traits have been published. However, as David Dobbs reports, many of these results turn out to be useless, either because they are statistical flukes or because they are only relevant to a limited subgroup of people.
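The statistical logic behind these association studies can be sketched in a few lines of code. The example below is entirely hypothetical: the counts, the variant, and the trait are invented for illustration, and real studies involve thousands of subjects and many statistical safeguards. It applies a Pearson chi-square test to ask whether people carrying a variant show a trait more often than chance alone would predict.

```python
# Hypothetical 2x2 study: rows are variant carriers / non-carriers,
# columns are counts of subjects with / without the trait.
table = [[30, 70],   # carriers: 30 show the trait, 70 don't
         [10, 90]]   # non-carriers: 10 show the trait, 90 don't

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
n = sum(row_totals)

# Pearson chi-square: compare each observed count to the count that
# independence (no variant-trait link) would predict.
chi_sq = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / n
        chi_sq += (table[i][j] - expected) ** 2 / expected

# 3.841 is the chi-square critical value for p < 0.05 at one degree
# of freedom; a larger statistic suggests a real association.
print(f"chi-square = {chi_sq:.2f}, association: {chi_sq > 3.841}")
```

With these made-up numbers the statistic comes out well above the threshold, so the variant and trait look associated; with thousands of variants tested at once, many such "hits" are exactly the statistical flukes described above, which is why large samples and replication are essential.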
Even when the impact of a genetic variant on a physical trait is genuine, there are important reasons why that variant might not tell you very much. Genetic variants don’t act alone; the impact of any one variant often depends on what else is in your genome. And then there is the environment. What effect your genetic variants have is intimately tied up with the foods you eat, the air you breathe, the drugs you take, your physical activity, and even the people you surround yourself with. It can be fiendishly difficult to predict what happens when different parts of your DNA interact with each other and with the environment, as some of my lab colleagues demonstrated in a study published in 2010. Studying genetic variants in a relatively simple organism, baker’s yeast, they found that “there were no simple rules” that could predict what influence these genetic variants had in the presence of other variants and in different environments. If researchers can’t easily say how genes influence the behavior of yeast in the lab, you can imagine how difficult it is to predict the influence of genes on humans in the wild.
Slowly, researchers are sifting out actionable genetic information, and some research institutions, like my home institution, Washington University, are aggressively using genetic data to help physicians make better medical decisions. But what about decisions in society at large?
IT’S NOT DIFFICULT TO imagine the explosive effects that genetically based decision-making could have on our society. Would voters have re-elected a 73-year-old Ronald Reagan in 1984 if a genetic report suggested that he was highly likely to develop Alzheimer’s disease within five years? How would a parole board handle a sexual offender with a stellar behavior record in prison, but who had a genetic variant linked with a high risk for recidivism?
In 2008, President George W. Bush signed the Genetic Information Non-Discrimination Act, which bans employers from using genetic information in their hiring and promotion decisions. The act is a useful step forward, but genetic information will likely be too tempting to ignore. Candidates for president voluntarily release their medical history to the public; in the future, they could be pressured into releasing their genetic information as well. U.S. law enforcement now routinely collects DNA from prisoners; it wouldn’t take much for an analysis of that DNA to influence sentencing and parole hearings. And aspiring pro football players might feel pressured to “voluntarily” submit their genetic data to stay competitive in the NFL draft (although for their own safety, athletes probably should be routinely tested for some genetic risks).
Aside from the obvious ethical quandaries, there is a paradoxical challenge facing society as genetic tests become routine: Some genetic results might not be reliable enough, and at the same time, others could be too predictive. A candidate’s political campaign could be sunk by the suggestion of a moderate risk for Alzheimer’s, while a convicted sexual offender up for parole could be doomed by a genetic variant that almost always correctly predicts who will be a repeat offender. For the foreseeable future, how we interpret genetic tests will come down to hype, confusion, and, sometimes, scientifically sound knowledge. Legally, ethically, and educationally, our society has only begun to prepare for the role that genes could soon play in our decisions.