On this particular night, Wellford waited the 10 minutes he thought it would take his daughter to get home, but he didn't hear anything. At first, he figured it was no big deal. He rationalized that she must have gotten delayed. But then 20 minutes went by and the phone was still silent. At 30 minutes, Wellford tried calling his daughter. Nobody answered. He waited a few minutes and dialed the number a second time. The phone rang. The voicemail picked up. This was the point where Charles Wellford really started to worry.

Charles Wellford is a professor of criminology and criminal justice at the University of Maryland. He's a scientist who studies how social systems work, an expert in the process of homicide investigations. He knows far more about crime than the average American, but that doesn't stop him from being scared in a very normal, average way.

"Our daughter lives about a mile from us, in a rural area. One night, while her son and husband were away, she comes over to visit. She's over 40 now, but still, when she leaves, I say, 'give me a call when you get home'."

"So I went upstairs and I got a revolver and got in my car and drove out there," he told me. "I pull up, and her car is there and all the lights are on everywhere. Now I'm convinced – somebody was in the house. Someone else was there when she got home. I get the gun and I start walking towards the house. And that's when my daughter comes out of the barn," he said.

"She'd just started doing chores and she'd forgotten to call."

This is more than just a story about a jumpy father, worried for his child's safety. It's a story that illustrates how complicated and flawed the science on gun use and gun violence is in the United States.

If you were studying gun use, and you wanted to know how often guns were used in self-defense, how would you categorize Charles Wellford's experience?

If you look at real-world research, Wellford said, the answer is far from consistent. Some research papers would classify his story as an example of defensive gun use. Others wouldn't. And that difference in definitions is part of why we don't have solid answers to the big questions about gun violence, gun ownership, and the effects of gun laws.

Wellford doesn't study guns himself. But in 2004, he served as chairman of a National Academy of Sciences panel that reviewed a huge amount of gun violence research and presented a sort of "state-of-the-field" report summarizing what we know, what we don't know, and why.

The results were less than glowing. In the executive summary, the committee wrote that, despite lots of research, it was still impossible to answer some of the most pressing questions surrounding gun violence. The report does its best to praise researchers for the good work they have produced – this isn't a situation where we know absolutely nothing about gun use, gun ownership, and the impact of gun laws. But the committee members I spoke with were also critical of the field, saying that the confidence politicians, lobbyists, and activists put in this research is seriously premature. Gun violence research suffers from a lack of consistently recorded data and, for that matter, a lack of data in general. As John Pepper, associate economics professor at the University of Virginia and the study director on the 2004 report, put it, "The data are just terrible."

Worse, critics say the methods used to analyze that data are also deeply flawed in many cases. What you end up with, researchers told me, is a field where key pieces of the puzzle are missing entirely and where multiple scientists are reaching wildly different conclusions from the exact same data sets. For instance, because of those differences in the definition of "defensive gun use," some researchers will tell you that Americans use a gun to defend themselves something like 1.5 million times every year. Others say it happens maybe 200,000 times annually.

That kind of variability does not create an environment where it is easy to craft evidence-based policy, and the situation has not improved since 2004, Wellford said.

A couple of months ago, I wrote a short piece here at BoingBoing, briefly addressing these issues. That piece was written quickly, mostly by reading a few review analyses. Because gun violence – and how to deal with it – continues to be a major issue in our society, I wanted to come back to these questions and dig a little deeper. We know that gun violence research is deeply flawed. We know that it cannot currently answer the questions we need it to answer. But why? What, specifically, is missing? What about this field is broken? And how do we fix it?

According to scientists who do gun research, scientists who were involved in the National Academies review, and scientists who study the way other scientists do research, there are two key problems. First is the issue of missing and poorly matched data. Second, there are also serious problems with the mathematical models scientists use to analyze that data, and with the type of conclusions they attempt to draw from it. In this first of a two-part series, I'm going to focus on the data.

About 11,000 Americans died at the end of a gun in 2010. We know that because the basic, Clue-esque information on who is killed, where, and with what gets documented by local law enforcement agencies – all of which is, in turn, compiled by the FBI into the Uniform Crime Report. This system has been around since 1930.

The other primary source of this kind of information is the CDC's National Violent Death Reporting System. It's been around since 2002 and collects more-detailed information than the Uniform Crime Report. For one thing, it includes suicides. When I say that guns killed 11,000 people in 2010, I'm only talking about deaths that were classified as homicides. Another thing the CDC records do is link deaths to other pieces of information – like previous domestic violence calls – that can help researchers understand what led up to the death. Unfortunately, only 18 states participate in that system.

In 1989, the FBI also started collecting more-detailed reports of crimes – including crimes that might involve a gun, but not be homicides – as part of the National Incident-Based Reporting System. But that system is still used by only a small minority of law enforcement agencies.

Taken all together, these reporting systems give scientists a place to start. But it's just that. A place to start. It's a nice diagram of your street. It's not a road map showing you the way to your cousin's house in Cleveland.

One of the big problems is something that you've already seen here – definitions. How one person collecting data classifies a type of crime can be different from how somebody else does it, and neither of those might really capture the details of specific cases.

Mark Hoekstra is an associate professor of economics at Texas A&M University. He's been studying the effects of stand-your-ground laws – legislation that changes the way the law expects people to act when they feel threatened. Historically (and this is dating back to English common law), you were expected to remove yourself from a threatening situation, rather than attacking the person you felt threatened by … unless the situation happened within your own home. Stand-your-ground laws basically expand the places and situations where it's legally acceptable to go straight to "fight" without first attempting "flight".

So what happens when a state institutes a stand-your-ground law? A good way to study this, as you might guess, is to start by looking at the rates of justifiable homicides and the rates of criminal homicides and see how each changes after the law takes effect. The good news is that the FBI has a standardized definition of what "justifiable homicide" means.

The problem: The FBI definition doesn't necessarily capture the full story of what's going on. The FBI calls justifiable homicide "the killing of a felon during commission of a felony", Hoekstra said. There are only about 200-300 of those reported annually in the entire country, he said. But nobody knows whether that is because justifiable homicide is actually rare, or whether it's more common, but not captured by the reporting system. Remember, what's happening here is that somebody puts another tick mark under one category or another. The details of how specific shootings happened and why don't usually make it into the record.

It's easy to imagine lots of situations that wouldn't fit neatly into the FBI definitions. "Like one guy breaks a beer bottle and hits the other guy with it, and the guy who got hit shoots and kills the first guy," Hoekstra said. "According to the FBI handbook, that's not legally justifiable. But you don't know the specific details of the case. In reality, you can imagine a situation where that scenario was deemed justifiable. You can also imagine a situation where it would be criminal and the guy would go to prison."

That makes it difficult for people like Hoekstra to study justifiable homicide, and it makes it difficult for lay people, like you and me, to understand what's going on when we hear about stuff like this in the news or see statistics repeated on a Facebook JPEG. There's a lot of room for people and organizations to take a concept – what happens when states institute stand-your-ground laws, say – and fiddle with different ways of counting until they end up being able to make the statement they want to make. What's more, those folks can all probably make a decent case for why they chose to tally up the numbers the way they did. It's not really as simple as someone lying to you and someone not. At least, not always. When data and definitions don't capture the full story, it leaves room for reasonable (and unreasonable) people to group the numbers in different ways.

Whether you think it's the guns or the people that kill people, you're bound to agree that homicide isn't the only kind of violence guns end up involved in. Guns are part of burglaries. They're used as a threat in some kinds of rape. They're used to harass and intimidate victims of domestic violence. Sometimes, people who are shot with guns don't die. Sometimes, people shoot themselves, whether accidentally or intentionally.

All of those things are, presumably, affected in some way by the availability of guns and by the regulations that we place on guns. This isn't just about people killing one another. But research on gun violence tends to focus on homicide. And there's a very good reason for that.

"Start with deaths and go down from there to shooting yourself in the hand," Charles Wellford explained. "As you go down that continuum, the comprehensiveness and quality of the data decreases."

There's a lot we just don't know when it comes to how guns are used and misused in a whole range of violent events. The simple explanation is that a dead body is hard to hide. Murders get reported to police. The police generally follow up on those cases and report them to the FBI. Other crimes are much more of a patchwork, said John Donohue, professor of law at Stanford Law School. People may or may not call the cops to report domestic violence or an assault by someone they know. If the cops are called, the situation may or may not be taken seriously enough that it's logged in any meaningful way. And if the violent incident in question isn't technically a crime – shooting yourself in the foot, for instance, or drunkenly blowing a hole in your mother-in-law's garage on the 4th of July – there's no reason why that information would be reported to the FBI's Uniform Crime Report, to begin with.

All those things matter very much to the people who are trying to figure out how guns are used in our society and how gun use changes over time. But there's not really a solid, nationwide, uniform way of tracking any of it. So what we say we know about gun violence is almost always just a synonym for what we know about gun murders.

And that's not the only information that is just flat-out missing.

Think about right-to-carry laws, which allow licensed individuals to pack heat in a holster or handbag, or even just slung over their shoulder at a JC Penney. Scientists like Donohue and Hoekstra study the effects of those laws by analyzing data on crime statistics – murders, and whatever else happens to be available in the states they're researching. That information can help them get an idea of what's going on. But to really understand how specific concealed-carry laws affect those crime statistics, you would need to know what people are actually doing with their newfound rights. How many people were carrying guns last year? How about this year? How often do they carry them? Where do they take them? That data simply doesn't exist, Wellford told me.

Another thing we don't have is reliable, long-term data on where the guns that are actually used in crimes come from. One of the ways we legislate gun use is through registration programs and systems that limit who can buy a gun legally. But if we don't know whether guns used in crimes are purchased legally, illegally, or purchased legally and then sold or given illegally to a third party, we have no idea how to craft those laws or even if they make any difference at all.

Finally, consider the question of whether more guns in the hands of law-abiding citizens serves as a deterrent to criminals. That's a pretty basic argument that many people make, and scientists try to answer that question using lots of different methods. (For the record, the National Academies report came to the conclusion that the research is currently inconclusive on this. Right now, we don't know whether having more guns means less crime, or more crime, or whether it has any effect at all. The research is all over the place and nobody has made a strong enough case to be conclusive.)

But here's one thing nobody has ever done: Find out what the criminals think. That same issue also came up when John Pepper was involved in a National Academies panel considering research on the death penalty. "If you think about whether it has a deterrent effect, we know almost nothing, because we know almost nothing about how offenders perceive the risk of execution," he told me. And the same is true of the risk of being shot by a potential victim.

Sixty years ago, nobody really knew how America had sex. Sure, scientists could guess sex was happening, based on the basic population numbers collected in the census. But who was doing it, when, with whom … that was all lost in the mists of incredibly awkward conversations that nobody wanted to have. Figuring out ways to collect and compile that data was a daunting task. And, in fact, a lot of people likely would have thought it was pretty invasive for scientists and government entities like the CDC to even want to know the answers to those questions.

But here we are, in 2013, and even if we don't know exactly what people get up to between 9:35 and 9:37 on a Wednesday night, we do know a lot more about American sex habits. More importantly, we know how those sex habits affect other parts of people's lives, and we know a lot more about how public policy affects both sex and quality of life. That matters. It's uncomfortable, potentially invasive research that actually makes us aware of rapes and sexual assaults that go unreported in crime statistics. It's that research that helps us track STD rates, and makes sure we notice when those patterns change for the better or worse. Research on sex means that we know more about teen sex, teen pregnancy rates, and how to reduce the latter.

"We made progress," Charles Wellford told me. "There are lots of examples of difficult measurement issues and we didn't just throw our hands up and walk away from them."

We can solve the problems with gun violence data, scientists say, but it's going to take funding and it's going to take political willpower. There are a few key solutions that the researchers I spoke with suggested.

First, we need to expand the crime reporting systems that track a broader range of incidents and collect more detailed accounts of what actually happened in those incidents. That means expanding the CDC's National Violent Death Reporting System from 18 states to 50. And it means getting more local law enforcement agencies using the FBI's National Incident-Based Reporting System. The basic Uniform Crime Report has been useful, they say, but it's time to bring this kind of reporting into the 21st century.

The harder task is going to be finding ways to collect a new kind of data. Wellford calls it the "left side variables". If you think about the relationship between crime and guns as an equation, he said, all we really have right now is the information on the right-hand side of that equation. We have data on the occurrence of gun violence. What we're missing is all the stuff that connects people to those guns.

"None of the surveys used to study other crimes, where you could include information about guns and then link that up to other things we care about like crime, labor markets, schooling outcomes … we just don't have the data," John Pepper said. "Take a simple question about correlation between gun ownership and crime, or gun ownership and suicide. We can't even answer that."

There are two ways to study questions like those. If you had a survey or some reports that could tell you how many gun owners in the state of Virginia had committed suicide, then you could compare that to suicides among people in Virginia who didn't own guns. Alternately, you could take broadly aggregated data about how many suicides happen in the state of Virginia and broadly aggregated data about gun ownership rates in the state of Virginia, and compare those statistics to other states. You can easily tell that the former method is going to produce a much more accurate estimate of the relationship between gun ownership and suicide than the latter. But right now, we have no way to do the former – the individual-level data just isn't collected.

Creating a system that allows scientists to gather that data might be objectionable to some people who own guns. But think of it this way. Right now, whatever your beliefs on guns happen to be, it's incredibly difficult to back them up with solid science. If you want to be able to make any kind of statement about gun ownership and the effects thereof – and have anybody who doesn't agree with you 100% actually take you seriously – then you should support better data. This should be the first step. Because right now, we don't know enough to know definitively what effects guns have, or what effects gun policies have.

Better data would help that. But, unfortunately, it's not the only thing that needs fixing. In my next post on gun violence research, I'll focus on the way scientists analyze data. To avoid misleading conclusions, we need good mathematical models. But some experts say we don't have those. So what does that mean for the research scientists are publishing? And what does it tell us about the usefulness of evidence-based policy making, in general? Stay tuned.