There are many examples of applications that will drive us into the Brontobyte era because of sensor data. The self-driving car, which I discussed before, will generate approximately 1 Gigabyte of sensor data per second. Airplanes already generate 2.5 billion Terabytes of data combined. The London Underground video surveillance system produces approximately 2 Petabytes of data every day. Or what about the Square Kilometre Array (SKA) Telescope currently being built in Australia and South Africa? This revolutionary radio telescope will require computing power roughly three times that of the most powerful supercomputer of 2013, and it will generate a staggering 1 Exabyte of data every day.
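To put those numbers on a common scale, here is a quick back-of-the-envelope calculation in Python. It is only a sketch: it assumes decimal (powers-of-ten) prefixes and that the self-driving car streams its sensor data around the clock.

```python
# Rough comparison of the daily data volumes mentioned above.
GB, PB, EB = 10**9, 10**15, 10**18      # decimal (SI-style) byte prefixes
SECONDS_PER_DAY = 24 * 60 * 60

daily_bytes = {
    "Self-driving car (1 GB/s, non-stop)": 1 * GB * SECONDS_PER_DAY,
    "London Underground surveillance":     2 * PB,
    "SKA Telescope":                       1 * EB,
}

for source, volume in daily_bytes.items():
    print(f"{source:38s} ~ {volume / PB:10,.2f} PB per day")
```

Run it and the car comes out at roughly 0.09 Petabytes (86.4 Terabytes) per day, the London Underground at 2 Petabytes, and the SKA at a full 1,000 Petabytes per day.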
What started with just a few kilobytes of data (the Apollo Guidance Computer had only 4 KB of memory) has quickly expanded to Gigabytes and Terabytes. And now we are approaching the era of Zettabytes, Yottabytes, Brontobytes and even Geopbytes. A Geopbyte is 10^30 bytes, which is an unfathomable amount of data. As a comparison: 1 Geopbyte is as much data as roughly 15 million trillion 64 GB iPhone 5S handsets. Truly Big Data.
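A quick sanity check of that iPhone comparison, again assuming decimal prefixes, confirms the order of magnitude:

```python
# How many 64 GB iPhone 5S handsets would it take to hold one Geopbyte?
GEOPBYTE = 10**30          # bytes
IPHONE_5S = 64 * 10**9     # 64 GB of storage per handset

print(f"{GEOPBYTE / IPHONE_5S:.2e} handsets")   # -> 1.56e+19
```

That works out to about 15.6 million trillion handsets (a million trillion being 10^18), which matches the figure above.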
It should be clear that we have entered the data era, and the amount of data generated is expected to grow exponentially from here on: the volume of data created on a daily basis will multiply many times over in the coming years. In such a data-driven world, data becomes a commodity and we will need to get used to massive amounts of it.
All that data also needs to be stored, of course, and until recently that looked like a problem. Currently the biggest data centre, the Utah Data Center, is estimated to be capable of storing 12 Exabytes of data. Although that is a lot, it is far from enough to hold the data flood coming our way: the SKA Telescope alone would fill it in only 12 days.
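The arithmetic behind that claim is straightforward, using the two estimates above:

```python
# How long would the SKA Telescope take to fill the Utah Data Center?
EB = 10**18                    # bytes in an exabyte
utah_capacity = 12 * EB        # estimated storage capacity
ska_output_per_day = 1 * EB    # SKA's estimated daily data volume

print(utah_capacity / ska_output_per_day, "days")   # -> 12.0 days
```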
New technologies are therefore required, and HP is currently working on one that would be capable of storing 100 Terabytes of data on a smartphone. HP has developed a new architecture, which it has dubbed 'The Machine', and claims it is one of the biggest technological breakthroughs of the past decades. The Machine does not rely on a few powerful processor cores; instead it uses many clusters of smaller, specialised cores that each execute specific tasks. Everything is connected with photonics, transferring data via optics instead of copper wires, and its memory stores data using ions instead of electrons. According to HP, The Machine can work through 160 Petabytes of data in only 250 nanoseconds, while using 80% less energy than conventional systems.
Of course, HP is not the only one working on new data storage technologies. Graphene is a wonder material that also holds a lot of promise for data storage; graphene sticky notes, for example, might be capable of storing 32 GB of data on an ordinary sticky note. Biological data storage offers massive possibilities as well: a bioengineer and a geneticist at Harvard have managed to store 700 Terabytes of data in a single gram of DNA. Although these technologies are still far from market-ready, they show that the challenge of data storage might prove not to be a challenge after all.
In a world so full of data, the term Big Data will no longer be necessary. Currently, Big Data is primarily a buzzword, seen as a hype or a trend, which helps people and organisations understand this new paradigm. Once we are used to the new normal and Yottabytes, Brontobytes and Geopbytes have become common language in boardrooms, there will be no point in talking about Big Data any more; it will become 'just' Data again. Data that will drive the economy, help save lives and improve societies.
I really appreciate that you are reading my post. I am a regular blogger on the topic of Big Data and how organizations should develop a Big Data Strategy. If you wish to read more on these topics, then please click ‘Follow’ or connect with me via Twitter or Facebook.
You might also be interested in my book: Think Bigger – Developing a Successful Big Data Strategy for Your Business.
This article originally appeared on Datafloq.