Actually, at 6:45AM by my calculations.
According to ZDNet’s Dan Farber, quoting an IBM whitepaper, by 2010, “the world’s information base will be doubling in size every 11 hours.”
Every 11 hours? That’s quite a statement. Let’s see what this means. The largest storage system in the universe is the universe. (Let that sink in for a moment). When I grew up, I was taught that there were approximately 10^79 electrons in the universe. Let’s use them all! 10^79 bits of storage, stored using the spin state of the electrons, in a giant quantum computer.
I have no idea how much data we will have on January 1st, 2010, so let’s assume, for sake of argument, that a virus wipes out all the data in the world on New Year’s Eve, and we start the year with only 1 bit of data, and it doubles every 11 hours. So after 22 hours, we have 2 bits of data, after 33 hours 4 bits, and then after almost two days we get our first byte (8 bits). This isn’t too bad, is it?
The equation is 2^x = 10^79. Solving for x is a simple exercise in logarithms: x = 79 · log2(10) ≈ 262.43. We can only double that many times before hitting the universal limit, so we exhaust all of the storage in the entire universe on May 1st at 6:45AM. Of course, maybe we’ll just Zip it all up and last until dinner time?
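If you want to check the arithmetic yourself, here is a quick sketch that reproduces the May 1st, 6:45AM figure from the post's assumptions (1 bit at midnight on January 1st, 2010, doubling every 11 hours, 10^79 bits available):

```python
import math
from datetime import datetime, timedelta

# Number of doublings needed to fill the universe:
# solve 2**x == 10**79, i.e. x = 79 * log2(10)
doublings = 79 * math.log2(10)
print(round(doublings, 2))  # 262.43

# One doubling every 11 hours, starting from 1 bit
# at midnight on January 1st, 2010:
exhausted = datetime(2010, 1, 1) + timedelta(hours=11 * doublings)
print(exhausted.strftime("%B %d, %I:%M %p"))  # May 01, 06:45 AM
```

The fractional doubling at the end is a slight cheat (you can't double 0.43 times), but for a back-of-the-envelope estimate it lands squarely on the morning of May 1st.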
I think I’ll call in sick that day.
But seriously, I wonder if this “every 11 hours” figure is a typo? Doubling “every 11 months” would be easier to imagine, and would carry us to around the year 2250.
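The same arithmetic with the doubling period stretched to 11 months gives the 2250 estimate:

```python
import math

# The same 262.43 doublings exhaust the universe's 10**79 bits,
# but at one doubling every 11 months the growth takes centuries:
doublings = 79 * math.log2(10)
years_of_growth = doublings * 11 / 12   # about 240 years
print(int(2010 + years_of_growth))      # 2250
```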
Steven G. Johnson says
You wrote: Let’s use them all! 10^79 bits of storage, stored using the spin state of the electrons, in a giant quantum computer. Actually, since you’re optimistic enough to posit storing the data coherently in qubits, the information capacity is more like 2^(10^79) classical bits due to entanglement.
Rob says
I’d hate to debug one of those. The unpredictability of multi-threaded programming is bad enough. Imagine a bug in a program that depends on quantum entanglement.
But it seems that all the data we would store would come from Man or the machines that Man builds. So we may be more limited on the production side than the storage side. How many bits can we produce, given the sources of energy available to us?
hAl says
I think you should inform your IBM colleagues to change the info in their whitepaper from 11 hours to 11 years, which is most likely what they meant.
I wouldn’t want IBM storage services to try to build storage in the electrons of my body…
Rob says
hAl,
For the record, I have no evidence that IBM is planning on harvesting the electrons from customers (or competitors) to build quantum storage devices.
This much is sure: we’re moving to high definition in television, DVD, cable and broadcast. There is an increasing number of digitization projects around the world to encode large libraries of texts. Our ability (and willingness) to place security cameras in public places is increasing.
So, there is no doubt that we’re going to see a rapid increase in storage demand. But I don’t see what mechanism would lead to sustained exponential growth. Maybe a 100x increase. But then what? Certainly an engineering breakthrough like a storage device that uses quantum entanglement would enable a vastly greater level of storage. But I don’t see how that level of storage capacity would lead to a proportionate growth in information supply.
Think of it this way: CPU cycles are more plentiful and cheaper. We’re moving to dual-core on the client. Multi-core will increase in computational power. But the average client machine spends most of its time idle as we read pages or type. CPU cycles are so cheap that Microsoft is able to use them for all sorts of dynamic UI embellishments that would have been considered a frivolous waste of resources a few years ago.
So the question is this: If storage were infinite and free, what would you do with it? What applications does it enable that we do not have today? Maybe I’d actually back up my photos. Or bolder, maybe I’d back up the entire internet onto a quantum USB drive and carry it with me everywhere. With sufficient storage, bandwidth is irrelevant. At least until you need to update the data.
Anonymous says
Cost and size limitations are no excuse for not backing up your photos. Laziness is!
Magnetic and optical media are dirt cheap nowadays, yet almost nobody backs up. Makes me think that the user, not the computer, is the limiting factor!
Incidentally, all the storage in the universe won’t do us any good as long as the bottleneck in computing is disk access. Not even a top-of-the-line disk array can supply data fast enough for a modern processor.