Your faithful Sciency News correspondent began using computers when 16 kilobytes was a good amount of memory, hard disks were the size of refrigerators, and monitors were a luxury. By contrast, today's disks hold over a terabyte, storage of petabytes and exabytes is not unusual, and perhaps the zettabyte is not so far away. That's growth of many orders of magnitude in only three decades – an extraordinary achievement.
Now IT has introduced “big data”. But this understated term is far too often overstated.
We need to look closely at something as big as the hype of big data. Does it measure up? Is the hype being overhyped? Will overinflated expectations be trashed in the trough of disillusionment like a fragile metaphor in the hands of a poor writer caught in the vortex of a slow tsunami? Does all this raise more questions than it answers?
Talking with a friend at a recent party, I could tell he was exaggerating many things, especially the size of his big data. Sure, he drove a red sports car and he was consulting the heck out of big data. But it was clear that it was smaller than claimed.
To get a grip on the issue I contacted a big data analytics expert who I will call “Andy” to protect his ignorance. Andy described big data as “any amount of data too big to fit in a single Excel page so long as the client couldn’t make sense of it.”
After this exhaustive investigation it is clear the whole industry has a big image problem. Going forward, I'll be looking back at this article in the context of the future.