by Jesse Grushack, Blockchain Strategist at ConsenSys



What comes to mind when you think about the word data? Digital documents? Filing cabinets? Hard drives and memory sticks? Data comes in many shapes and sizes, but at its digital core, data is just a bunch of 1s and 0s.



Where did this concept come from? And who are the nerds who understand that 10010101 actually equals 149? We need to go back to the very beginning, when Alan Turing used some of the first computing machines to help win World War II, as shown in the 2014 film The Imitation Game. Data was fed in on punch cards; string together enough of these cards and you needed hundreds of people to decipher what was being computed.
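To see why 10010101 equals 149, note that each bit contributes a power of two. A quick sketch in Python:

```python
# Each bit position contributes 2**i, counting from the rightmost bit.
bits = "10010101"
value = sum(int(b) * 2**i for i, b in enumerate(reversed(bits)))
print(value)  # 149  (128 + 16 + 4 + 1)

# Python's built-in int() performs the same base-2 conversion:
assert value == int(bits, 2)
```

The same rule scales to any bit string, which is all a computer's memory ultimately holds.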



When digital computers were invented, we started to generate much more information, which required even more programmers. To save space, we began to store data on magnetic tapes. However, in order to read the data, someone (or, more likely, 50 people) had to translate it to paper.



The 1970s made Structured Query Language (SQL) a reality. All of a sudden, ‘big data’ could be stored in tables and organized so that it was readable without needing hundreds of programmers. Companies and organizations started building massive SQL databases. The problem was that these databases were not linked in any way to databases at other organizations. These groups began stockpiling as much data as they could, because as long as the data never left the walls of their own system, they didn’t need to double-check it.
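A minimal sketch of what "data stored in tables" looks like, using Python's built-in sqlite3 module (the table and column names here are illustrative, not from the article):

```python
import sqlite3

# An in-memory database; a real organization would use a persistent server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.execute("INSERT INTO customers (name, city) VALUES (?, ?)", ("Ada", "London"))
conn.execute("INSERT INTO customers (name, city) VALUES (?, ?)", ("Alan", "Cambridge"))

# One declarative query replaces roomfuls of people reading tapes by hand.
rows = conn.execute("SELECT name FROM customers WHERE city = 'London'").fetchall()
print(rows)  # [('Ada',)]
```

The query works only against this one database; nothing in SQL itself links it to a table kept by another organization, which is exactly the siloing problem described above.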

