Experiments are an exercise in confidence leeching. Things will go wrong, you will screw up, you will break stuff. The most advanced thing I’ve ever done with my own two hands is attempt to train a rat in a box to press a lever. It didn’t. I failed even at that basic task. The thought of trying to lower a piece of glass roughly the thickness of a human hair through the brain until it touches the surface of a single neuron’s body, without breaking the glass, the neuron, or your own mind in the process, doesn’t compute. Getting good data can be a process of confidence-sapping, head-banging floundering.

Analysing the data is even worse. Statistics is an exercise in keyboard-snapping frustration, compounded by most scientists’ truly terrible training in statistics, which is no fault of their own. Which were you trained in: the cookbook; the t-test/ANOVA-solves-all; or absolutely bugger all? Few people deeply understand the principles behind the statistical analyses they do. Even fewer use them correctly; judging by the torrents of anger statisticians aim at each other, that includes most statisticians too. The already low confidence of most working scientists faced with doing statistics is now compounded by the many high-profile pronouncements that we’ve all been doing it wrong for decades. But with no clear guidance on what to do instead: p-values but no significance level; p-values but more stringent significance levels; no p-values but confidence intervals; no p-values but effect sizes; none of that Fisher or Neyman-Pearson rubbish, use Bayes (factors), a different arbitrary number scale to interpret instead. Oh for the heady days of Rutherford’s dictum: “if you need statistics, you’ve done the wrong experiment”.
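To make the conflicting guidance concrete, here is a minimal sketch of the same two-group comparison reported three of the ways listed above: a p-value, a confidence interval, and an effect size. The data are simulated and the group means are my own hypothetical numbers, purely for illustration.

```python
# Hypothetical illustration: one comparison, three competing summaries.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 30)   # simulated control group
b = rng.normal(0.5, 1.0, 30)   # simulated treated group

# 1. The classical route: a p-value from a two-sample t-test.
t_stat, p = stats.ttest_ind(a, b)

# 2. The estimation route: a 95% confidence interval for the mean difference.
diff = b.mean() - a.mean()
pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
se = pooled_sd * np.sqrt(1 / len(a) + 1 / len(b))
df = len(a) + len(b) - 2
crit = stats.t.ppf(0.975, df)
ci = (diff - crit * se, diff + crit * se)

# 3. The effect-size route: Cohen's d with a pooled standard deviation.
d = diff / pooled_sd

print(f"p = {p:.4f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), d = {d:.2f}")
```

All three numbers describe the same data; which one you are told to report depends on which reform camp you last read.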

Reading scientific papers is worse. To know what is known, we have to read the literature. Every new research paper we read reveals to us something we didn’t know before. Broaching a new research topic — say the sub-unit composition of the GABAb receptor, the response of dopamine neurons to reward or lack thereof, the algorithms of hierarchical clustering — is like swallowing a firehose of our own ignorance.

The mere existence of the literature is worse. Science is a crushing flood of papers. More papers are published in your own research field than you can ever read. And more are published every year, every month, every day. You can never catch up. The scale of your ignorance is writ large in your PubMed and Google search results.

And finally, there are your peers. Scientists spend much, perhaps all, of their time with other scientists. This is not healthy. Other scientists are smart. They know things you don’t know; can do things you can’t do; can understand things you can’t understand. All of them. Thousands of them. Go to the annual Society for Neuroscience meeting and stand in the half-mile-long poster hall: there are about 10,000 scientists in that single room who know things you don’t.