Opinion: Science’s Acute Replicability Issue and the Crypto Antidote

July 8, 2018 15:52 UTC

It may come as a shock to some, but scientific pronouncements aren’t nearly as trustworthy as they seem.

Consider the tale of saturated fat. Until about the middle of the 20th century, the Western world was largely unconcerned about the effects of saturated fat on health: fatty fish, butter, and eggs were seen as components of a healthy diet. Then studies funded by the US vegetable oil lobby began declaring saturated fat a killer and the root cause of most heart disease. That idea persisted for another 50 years, and persists in some places to this day, even though many medical researchers now consider the original evidence deeply flawed and the case against saturated fat far weaker than was claimed.

There were two causes for this failure of science. First, the initial research projects were funded by a lobby group with a vested interest in portraying saturated fat as unhealthy. Second, honest researchers made many genuine mistakes and drew error-strewn conclusions, partly because there was little effort to replicate study findings and make them more robust.

Put simply, the scientific ideas weren’t tested rigorously or openly enough. What is exciting right now is that crypto technology is playing a part in making sure this doesn’t happen again. And not a moment too soon, since it is becoming ever clearer that this problem is eroding the very foundations of what we thought we knew to be true.

Back to scientific basics

It has long been known that observational anomalies can occur in science, and how important it is to test and retest any finding. This goes to the core of the philosophy of science, as explained by the philosopher Karl Popper and the statistician Ronald Fisher:

“Only when certain events recur in accordance with rules or regularities, as in the case of repeatable experiments, can our observations be tested—in principle—by anyone.… Only by such repetition can we convince ourselves that we are not dealing with a mere isolated ‘coincidence,’ but with events which, on account of their regularity and reproducibility, are in principle inter-subjectively testable.”

“Non-reproducible single occurrences are of no significance to science.”

Karl Popper (1959) “The logic of scientific discovery”

“We may say that a phenomenon is experimentally demonstrable when we know how to conduct an experiment which will rarely fail to give us statistically significant results.”

Ronald Fisher (1935) “The Design of Experiments”

Thus, that which is not reproducible is of no use to scientific advancement. And as we saw with the saturated fat example, scientists often have incentives to generalise from a narrow observation into flimsy new laws of science.

The answer to this problem is clear: the more an experiment is replicated, the more trustworthy its findings become.

Enter cryptographically-backed platforms

Advances in crypto technology enable decentralised verification of information efficiently and at massive scale. This is ideally suited to researchers making their premises, data samples, and analyses available for replication and double-checking. Researchers collaborating with others, or investigating someone else’s results, can easily trace the data through the whole process. More importantly, the immutable nature of ledger technology ensures that researchers cannot quietly move the goalposts to suit their results.
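That goalpost-moving safeguard boils down to cryptographic commitment: if a hash of the study design and data is recorded on an immutable ledger before the analysis runs, any later alteration becomes detectable. A minimal sketch in Python illustrates the idea; the record fields are purely illustrative assumptions, not any platform’s actual schema:

```python
import hashlib
import json

def commit_record(record: dict) -> str:
    """Return a SHA-256 commitment for a research record.

    Serialising with sorted keys makes the hash deterministic,
    so any later change to the data or analysis plan produces
    a different digest.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A researcher commits premises and analysis plan before gathering results
# (hypothetical example record).
preregistration = {
    "hypothesis": "saturated fat intake is uncorrelated with heart disease",
    "sample_size": 500,
    "analysis": "two-sided t-test, alpha=0.05",
}
digest = commit_record(preregistration)

# Moving the goalposts after the fact is detectable: the digests differ.
tampered = dict(preregistration, analysis="one-sided t-test, alpha=0.10")
assert commit_record(tampered) != digest
```

Once the digest is on a ledger that no single party controls, reviewers can recompute it from the published materials and confirm nothing was changed after the fact.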

The more interesting aspects of this challenge are being tackled by a specific crypto paradigm built on Directed Acyclic Graphs (DAGs), which are like blockchains but more amenable to large and complex datasets. Companies like CyberVein are making it possible to record large datasets on a ledger efficiently, and with minimal fees.

CyberVein’s platform allows this by combining two ingredients: DAGs, which work like blockchains but do not require every node to carry and confirm a full copy of the entire transaction history (as happens with the Bitcoin blockchain), and a consensus model known as Proof-of-Contribution (PoC), which is more efficient than the more common Proof-of-Work or Proof-of-Stake mechanisms. As a company spokesman explained: “On CyberVein, nodes are only required to store data shards relevant to their own transaction history and the smart contracts they are parties of. With this approach CyberVein is able to store entire databases as smart contracts which are permissioned to their owners and participants, without congesting the rest of the ledger.”
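To make the structural difference concrete, here is a toy DAG ledger in Python. It is a hand-rolled illustration, not CyberVein’s actual data model: each entry carries only its own data shard and references the hashes of the earlier entries it confirms, so a verifier needs just the shards in an entry’s ancestry rather than the full transaction history.

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class DagEntry:
    """One entry in a DAG-style ledger. Unlike a block in a linear
    blockchain, an entry may confirm several earlier entries at once."""
    payload: str     # the data shard this entry carries
    parents: tuple   # digests of earlier entries it confirms

    @property
    def digest(self) -> str:
        material = self.payload + "".join(sorted(self.parents))
        return hashlib.sha256(material.encode("utf-8")).hexdigest()

# Build a tiny DAG: two independent datasets, then an analysis entry
# that references (and thereby confirms) both of them.
genesis = DagEntry("genesis", ())
data_a = DagEntry("dataset A: blood lipid panel", (genesis.digest,))
data_b = DagEntry("dataset B: dietary survey", (genesis.digest,))
analysis = DagEntry("regression over A and B", (data_a.digest, data_b.digest))

# A node verifying the analysis only needs the shards it references,
# not a full copy of every other entry on the ledger.
assert set(analysis.parents) == {data_a.digest, data_b.digest}
```

Because each digest depends on the parents’ digests, editing any ancestor entry changes every digest downstream of it, which is what makes the recorded data tamper-evident.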

In practice, this means that users of CyberVein will be able to record experimental data directly onto its DAG database. That data can then serve other research (making citation easier, or even allowing outright reuse of the data in a different analysis) and simplify peer review, since reviewers gain decentralised access to the relevant data.

With these leaps forward in collective computing, new solutions are being found every day. Science has a replicability crisis at the moment, but it looks like another branch of science could be coming to the rescue.

Featured image from Shutterstock.


Gerald Fenech is an established journalist with more than 15 years experience in the financial, economic and business fields. Since August 2017 he has been heavily involved in cryptocurrency and blockchain journalism writing for several news sites and avidly following the crypto space.