Where on Earth to put the internet? Stick it in our genes
There is a frighteningly large data crunch looming, but techies might just have come up with a back-up plan
In 1956, IBM announced what has since become known as the world’s first hard disk drive. The “350 Disk Storage Unit” stood almost 2m high and weighed slightly less than a ton. Its 50 magnetic disks gave it a capacity of a whopping 3.75 megabytes – roughly the size of a single MP3 music file.
Development of the drive was almost cancelled by IBM’s board, which feared it would damage its punchcard business, but fortunately for the company, TJ Watson, its president, reversed the decision. Several businesses were happy to pay $3,200 a month to rent the system, equivalent to $30,000 (R433,750) today.
Things have improved a little since then. Phones that fit in our pockets are capable of storing up to one terabyte – almost 300,000 times the capacity of IBM’s fridge-sized hardware. The ability to store ever-growing quantities of data cheaply has been one of the critical factors in the technological revolution, alongside rapid improvements in processor power. The increased capacity available on our personal devices is nothing, however, compared with the explosion of information stored at the cavernous central data centres operated by IT companies.
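That comparison can be sanity-checked with a line of arithmetic (using the binary terabyte, 2^40 bytes; the decimal figure of 10^12 bytes gives a similar result):

```python
# Compare a modern 1 TB phone with IBM's 3.75 MB "350 Disk Storage Unit".
# Binary terabyte used here; the decimal definition is within ~10% of this.
PHONE_BYTES = 2**40
IBM_350_BYTES = 3.75 * 10**6

ratio = PHONE_BYTES / IBM_350_BYTES
print(f"{ratio:,.0f}x")  # roughly 293,000x
```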
The hundreds of millions of photos posted to Facebook daily, and the hundreds of hours of video uploaded to YouTube every minute, must be saved somewhere – and typically backed up in multiple places, in case one copy fails. This explosion in data comes at a heavy cost. Google is spending $13bn this year on building new data centres in the US alone – about a third of its 2018 profits. Facebook’s capital expenditure, a big part of which goes on such infrastructure, doubled that year to just shy of $14bn, and the company expects these costs to reduce its profit margins by a third in 2019.
That is not to mention the cost of running the centres, some of the most power-hungry buildings on Earth. Apple now owns more than 7,000 acres of land, much of it for data centres, up from just 584 acres in 2011. The internet has to be put somewhere. And yet, in all likelihood we are only at the beginning of this explosion.
As much as data drives our world today, we are rapidly approaching a future in which hundreds of billions of sensors in factories and cities are constantly recording and feeding back information to a central location. Driverless cars, which will monitor the 3D environment hundreds of times a second, will generate thousands of gigabytes per day, according to some estimates, and may have to retain it all so that investigators can work out what went wrong in the event of a crash.
Improving healthcare may require vast quantities of real-time data from medical sensors. There are examples closer to the present day: Google’s “cloud” video games service, announced last week, relies on advanced games running within data centres rather than on consoles. According to IDC, an IT researcher, the world generated about 33 zettabytes (a zettabyte being one thousand million million megabytes) of data in 2018. That is 50% higher than a year earlier, but nothing compared with 2025, when IDC predicts the annual figure will reach 175 zettabytes.
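Those units and growth figures are easy to check with a couple of lines of arithmetic:

```python
# A zettabyte is 10**21 bytes; a (decimal) megabyte is 10**6 bytes,
# so a zettabyte is 10**15 - one thousand million million - megabytes.
ZETTABYTE = 10**21
MEGABYTE = 10**6
assert ZETTABYTE // MEGABYTE == 10**15

# Growth implied by the IDC figures quoted above: 33 ZB in 2018 to 175 ZB in 2025.
growth = 175 / 33
print(f"{growth:.1f}x growth")  # roughly 5.3x
```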
It is getting cheaper and more efficient to store data, but not fast enough. At current rates it will become increasingly expensive to keep up with the pace at which new information is being created, and it will soon become a problem.
Last week, researchers at Microsoft and the University of Washington presented a potential way out of this jam: storing data in DNA. For the first time, they demonstrated a technique that could convert information – in this case the word “hello” – from binary computer code into its genetic equivalent and retrieve it again, fully automatically.
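The researchers’ actual codec (and the chemistry surrounding it) is far more involved, but the underlying idea can be sketched in a few lines: each pair of bits maps to one of DNA’s four bases – A, C, G and T – so any sequence of bytes can be written out as a strand and read back. This is a toy illustration of the principle, not the Microsoft/UW scheme:

```python
# Toy bits-to-bases codec: 2 bits per nucleotide (not the real research codec,
# which adds redundancy and error correction for the messy chemistry).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a strand of A/C/G/T, 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Turn a strand of A/C/G/T back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hello")  # 5 bytes -> a 20-base strand
assert decode(strand) == b"hello"
```

In the real system, the encoded sequence is chemically synthesised into physical DNA and later sequenced back into bits – the expensive, slow steps the prototype automates end to end.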
Turning data into DNA is not a new concept (the idea has been around since the 1960s), but being able to easily turn it back into data is a vital breakthrough if it is to help solve the looming storage crunch.
We are some way away from that: the prototype cost $10,000 and took 21 hours to carry out the simple task. However, the researchers said automating the process is a vital step towards turning DNA storage into a reality.
The advantages of such a system over today’s data storage would be twofold. First, DNA lasts for thousands of years, compared with roughly 30 years before magnetic tape degrades, and less for other digital media. Second, it would be far more space-efficient: everything in a warehouse-sized data centre would fit into something that could be held in your hand.
Other storage systems have been considered. In 2018, Belgian scientists discovered a way to store data within powder, which they said represented a more environmentally friendly alternative to existing storage hardware.
Either way, unless we solve the looming storage crunch, many of the possibilities that the forthcoming explosion in data will create could be lost, or at least be hugely expensive. And if we can avoid covering Earth in data centres, all the better.
– © Telegraph Media Group Limited