It may come as a shock to some, but scientific pronouncements aren’t always as trustworthy as they seem. Consider the tale of saturated fat. Until about the middle of the 20th century, the Western world was largely unconcerned with the effects of saturated fat on health. Fatty fish, butter, and eggs were seen as components of a healthy diet. Then studies funded by the vegetable oil lobby in the US began declaring saturated fat a killer and the root cause of most heart disease. This idea persisted for another 50 years, and persists in some places to this day, even though many medical researchers now regard the original evidence as deeply flawed, and some recent research suggests saturated fat may not be the villain it was made out to be.

There were two causes for this failure of science. First, the initial research was funded by a lobby group with a vested interest in portraying saturated fat as unhealthy. Second, honest researchers made genuine mistakes and drew error-strewn conclusions, in part because little effort was made to replicate the studies’ findings and make them more robust.

Put simply, these scientific ideas weren’t tested rigorously or openly enough. What is exciting right now is that crypto technology is playing a part in making sure this doesn’t happen again. It’s about time, since it is becoming ever clearer that this problem is eroding the very foundations of what we thought we knew to be true.

Back to scientific basics

It has long been known that observational anomalies can occur in science, and that it is vital to test and retest findings. This goes to the core of the philosophy of science, as explained by the philosopher Karl Popper and the statistician Ronald Fisher:

Only when certain events recur in accordance with rules or regularities, as in the case of repeatable experiments, can our observations be tested—in principle—by anyone… Only by such repetition can we convince ourselves that we are not dealing with a mere isolated ‘coincidence,’ but with events which, on account of their regularity and reproducibility, are in principle inter-subjectively testable… Non-reproducible single occurrences are of no significance to science. [1]

We may say that a phenomenon is experimentally demonstrable when we know how to conduct an experiment which will rarely fail to give us statistically significant results. [2]

Thus, that which is not reproducible is not useful for scientific advancement. And as we saw with the saturated fat example, scientists often have incentives to generalize from a narrow observation and create flimsy new laws of science.

The answer to this problem is obvious: the more times an experiment is replicated, the more trustworthy any findings become.

Enter cryptographically-backed platforms

Advances in crypto technology enable efficient, decentralized verification of information on a massive scale. This is well suited to researchers making their premises, data samples, and analyses available for replication and double-checking. Researchers collaborating with others, or investigating other researchers’ results, can easily parse the underlying data. More importantly, the immutable nature of ledger technology ensures that researchers cannot quietly move the goalposts to suit their results.
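The “no moving goalposts” property boils down to a simple cryptographic idea: a researcher publishes a hash of a pre-registered analysis plan before collecting data, and anyone can later check that the published plan matches it. A minimal sketch in Python follows; the ledger itself is omitted, and all names and the example plan are illustrative, not any platform’s actual API:

```python
import hashlib
import json

def commit(plan: dict) -> str:
    """Return a SHA-256 commitment to a pre-registered analysis plan.

    Serializing with sorted keys makes the hash deterministic, so the
    same plan always produces the same commitment."""
    payload = json.dumps(plan, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def verify(plan: dict, commitment: str) -> bool:
    """Check that a revealed plan matches the earlier commitment."""
    return commit(plan) == commitment

# A researcher records this hash on an immutable ledger *before* the study.
plan = {"hypothesis": "drug X lowers LDL", "n": 200,
        "test": "two-sided t-test", "alpha": 0.05}
ledger_entry = commit(plan)

# Later, reviewers verify the published plan against the ledger entry.
assert verify(plan, ledger_entry)

# Any post-hoc tweak (e.g. quietly loosening alpha) is detectable.
tweaked = dict(plan, alpha=0.1)
assert not verify(tweaked, ledger_entry)
```

Because the ledger entry is immutable, the commitment itself serves as timestamped evidence of what the researcher promised to test.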

The more interesting aspects of this challenge are being tackled by a specific crypto paradigm built on Directed Acyclic Graphs (DAGs), which work like blockchains but are more amenable to large and complex datasets. Companies like CyberVein are making it possible to record large datasets on a ledger efficiently, and with minimal fees.

To do so, CyberVein employs DAGs, which work like blockchains but do not require every node to store and confirm a full copy of the entire transaction history, as happens with the Bitcoin blockchain. It also uses a different consensus model known as Proof-of-Contribution (PoC), which is more efficient than the more common Proof-of-Work or Proof-of-Stake mechanisms. As a company spokesman explains:

On CyberVein, nodes are only required to store data shards relevant to their own transaction history and the smart contracts they are parties of. With this approach, CyberVein is able to store entire databases as smart contracts which are permissioned to their owners and participants, without congesting the rest of the ledger.

In practice, this means that users of CyberVein will be able to record experimental data directly onto their DAG database. The data can then be reused in connection with other research (making citation easier, or even permitting reuse in different analyses), and peer review becomes easier because reviewers have decentralized access to the relevant data.
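The structural difference between a linear blockchain and a DAG ledger described above can be sketched in a few lines: in a chain, every block points to exactly one predecessor, while in a DAG each new entry may reference and thereby confirm several earlier entries, so the ledger can grow in parallel. A toy illustration, assuming nothing about CyberVein’s actual data structures:

```python
import hashlib

class Entry:
    """A ledger entry that references any number of parents by hash."""
    def __init__(self, data: str, parents: list):
        self.data = data
        # Sorting makes the hash independent of parent ordering.
        self.parent_hashes = sorted(e.hash for e in parents)
        self.hash = hashlib.sha256(
            (self.data + "".join(self.parent_hashes)).encode("utf-8")
        ).hexdigest()

# The genesis entry has no parents; later entries may confirm several at once.
genesis = Entry("genesis", [])
a = Entry("experiment A: dataset hash", [genesis])
b = Entry("experiment B: dataset hash", [genesis])  # added in parallel with A
c = Entry("replication citing A and B", [a, b])     # one entry confirms both

# As in a blockchain, tampering with any upstream entry would change
# its hash and break every downstream reference.
assert genesis.hash in a.parent_hashes
assert set(c.parent_hashes) == {a.hash, b.hash}
```

Because entries `a` and `b` do not reference each other, they can be appended concurrently, which is what makes DAG ledgers more amenable to large, frequently updated datasets than a strictly linear chain.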

With these leaps forward in collective computing, new solutions are being found every day. Science is in the middle of a replication crisis, but it looks as though another branch of technology could be coming to the rescue.

*

[1] Karl Popper (1959), The Logic of Scientific Discovery.

[2] Ronald Fisher (1935), The Design of Experiments.

Image(s): Shutterstock.com
