What would happen if scientists stopped trusting each other? Before trying to answer this question, I'll explain why it has been on my mind. Science fraud, questionable research practices, and replication have received a lot of attention lately. One issue common to all of these discussions is trust. Scientists are asking: can we trust other scientists to be honest? Is peer review based on trust? Is the act of discussing these issues itself eroding trust? What can we do to restore trust?

But what is trust, in a scientific context, and where does it come from? Let's consider the most common kind of scientific communication, the experimental paper. In a paper, the authors assert that they did a certain experiment, that they found certain results, and that these results imply a certain conclusion.

Now, scientists are generally not supposed to take the last step of this chain on trust. We're supposed to be skeptical of claims that results imply particular conclusions, and we're expected to evaluate conclusions critically on the strength of the results. Applying this kind of critical analysis is a big part of peer review.

By contrast, scientists are expected to trust that the authors are telling the truth about the methods and the results. We can challenge the authors' interpretations, and we can even dismiss the results as meaningless artifacts, but we can't suspect the authors of lying about the hard facts of what they did and what they found. But why not? Why should I believe the authors? They might well have an incentive to lie. 'Nice' results get published in hot journals, and that gets you promotions, grant money, and influence. So why believe someone when they claim to have got some nice results?

I think there are three possible reasons to believe. One reason is what I'll call idealistic trust. In this case, we trust a given person because of who they are. We believe that they would not deceive us: 'I can't believe a fellow scientist would do such a thing'. Such trust is implicit. I think it's this idealistic kind of trust that people are worried about when they speak of trust in science being damaged by scandals and fraud cases.

But there's another kind of trust. I can reasonably trust your data if I can be confident that, were it fraudulent, this fraud would be discovered, sooner or later, and that you would be punished for it. In other words, I can trust you if I believe that it is not in your interests to lie.
We could call this pragmatic trust. Unlike idealistic trust, this doesn't require me to have a high opinion of you as a person. I might see you as a crook who would happily commit fraud if you thought that you could get away with it - but so long as I believe that you wouldn't get away with it, I can trust you.

I think that both idealistic and pragmatic trust exist in science today. But it would seem that if one of these kinds of trust were to decline, the other would need to be bolstered to maintain trust in science overall. If we can't trust each other for idealistic reasons, we'd need tougher investigation and enforcement of misconduct, to make pragmatic trust work.

Or is there an alternative? There's transparency. Transparency removes the need for trust, by allowing readers to see the evidence with their own eyes. For example, if the authors provide the raw data, instead of just the summary results, this might allay my doubts. It's easy to fabricate numbers in a spreadsheet. It's harder, perhaps impossible, to fabricate fMRI data, microscope images, or handwritten questionnaires. As fraud-busting statistician Uri Simonsohn put it in the title of one of his papers, when it comes to raw data we should Just Post It.

But - would even this be enough? It may be difficult to conjure data out of thin air, but that doesn't mean it's difficult to misrepresent data. I might conduct an experiment using certain methods, and then present the results as if I had used different methods, implying a false conclusion. For instance, I might secretly 'spike' some of my samples in such a way as to change the results of my tests. Such subterfuge really happens. When it's been suspected, it has sometimes led to Orwellian measures - people have resorted to placing cameras around the lab to record what scientists are up to.
It seems to me that if scientists stopped trusting each other, such Big Brother measures would be the only way that scientists would be able to convince each other of their claims. I suspect that few researchers would be willing to work under such conditions.