We used to believe in truth, and indeed the world we live in is founded on the accurate interpretation of facts. Then came post-truth, and now we are asked to believe rather than to know. We could all see the photographs of Donald Trump’s inauguration, and read the attendance data from the police and park rangers. But the President of the United States told us he had the biggest inauguration crowd ever, and demanded that this be believed.
Data = truth
Since then the market value of truth has plunged even further, drowned out by fake news and fake fake news. We still collect and generate truth – also known as data – at such an incredible rate that no one can actually measure it, but who believes what we collect? Some analysts reckon we create 2.5 quintillion bytes of data every day (that’s 2.5 × 10¹⁸). Another way of looking at how much data – also known as truth – we deal in is the number of searches conducted every day: currently standing at around 5 billion, although that figure has probably jumped upwards since I looked a few minutes ago. Google alone processes 40,000 searches every second of every hour of every day.
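For readers who like to check the arithmetic, here is a minimal sketch of the scale of those figures (the figures are the ones quoted above; a US quintillion is 10^18):

```python
# Sanity-check the data-scale figures quoted in the text.
QUINTILLION = 10 ** 18                    # a US quintillion is 10^18
bytes_per_day = 2.5 * QUINTILLION         # "2.5 quintillion bytes every day"

google_per_second = 40_000                # "40,000 searches every second"
seconds_per_day = 60 * 60 * 24            # 86,400 seconds in a day
google_per_day = google_per_second * seconds_per_day

print(f"{bytes_per_day:.1e} bytes/day")               # 2.5e+18
print(f"{google_per_day:,} Google searches/day")      # 3,456,000,000
```

At 40,000 per second, Google alone comes to roughly 3.5 billion searches a day, which sits comfortably inside the "around 5 billion" total cited above.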
And everything is stored, forever.
But what do these avalanches of data mean, if truth is so devalued? In the financial world we have been led to think that the truth is always in the numbers, but that’s far from certain. It’s now 10 years since the shocking and apparently sudden collapse of Lehman Brothers in the USA – something the regulators said was completely unavoidable, because the truth was in the figures, and the figures said that Lehman was hopelessly broken.
Now, in this anniversary year, the data from the collapse has been raked over again, and it appears that Lehman’s bankruptcy was very far from unavoidable, because the company was essentially solvent. Stressed, but solvent. The Federal Reserve had already rescued the government-sponsored enterprises Freddie Mac and Fannie Mae because they were ‘Too big to fail’, but it allowed Lehman Brothers to go to the wall for political and ideological reasons, rather than strictly because of the facts.
What happens when crypto becomes “too big to fail”?
And what has this got to do with the cryptosphere? Well, it is clear that we are increasingly under the scrutiny of governments and regulators, but is it the data or the politics that they will assess if an enterprise goes under? Right now there’s no business in the cryptoworld which is ‘Too big to fail’, but there surely will be sooner or later. And then, in the trillions of gigabytes of data, how will anyone be able to pick through the story of an individual ICO or STO and decide whether the business was conducted truthfully… or otherwise?
Right now individual enterprises are small and the crypto ecosystem accounts for only a fraction of financial transactions worldwide. But the time will come when an offering actually becomes ‘Too big to fail’, and when it does, will regulators comb through the data with clear heads or – as in the case of Lehman Brothers – make their decisions on the basis of their political beliefs?