How to fit 1,000 terabytes on a DVD (theconversation.com)
142 points by clicks on June 20, 2013 | hide | past | favorite | 40 comments


I certainly wouldn't mind returning to a world where inexpensive, easily-labelled optical disks could hold relatively large amounts of data.

When HDDs were generally no more than a few gigabytes, I could basically stick a compressed backup of my PC on 1-2 CDs, write a date on them, and shove them in a box. It was incredibly convenient and offered good peace of mind. In fact, I just recently recovered some important data from the late 90s off one such CD.


Either you take very good care of your CDs, or they made better-quality ones back in the 90s, because most of them degrade after 10 years or less.

http://en.wikipedia.org/wiki/CD-R#Lifespan


I've never seen a serious study on the subject (at least not in English), only anecdata, so I've never been sure how much weight to give claims about how common CD-R degradation is. It wouldn't surprise me if 90s-era discs were higher quality, though -- for my own anecdata, I've had only one batch of CD-Rs ever show evidence of degradation, and it was one of the last I ever bought and seriously used, in the early 2000s.

I've never seen a problem with DVD-Rs, but I didn't use nearly as many of those. I haven't written new optical disks for practically anything other than OS installation media since ~2003/2004, and in the last few years I've rarely even done that.

(And I never made the hop to BD-R. From a data storage perspective, I found it pretty much obsolete on arrival.)


My anecdote re. BD-R: I got the Verbatim "good" discs (made in Taiwan, good dye technology - sorry, I forget the specifics) a while ago and used them to back up data (that fortunately had backups in other places as well). Months later, I tried to read them. No dice. I was shocked, as I thought BD-Rs were built more robustly than DVD±Rs. BTW, this was true of all 3 discs that I tried, though they came from the same 50-pack.


Indeed it would be wonderful, but exactly how viable this technique is remains to be seen.

Back when I started using CD burners, a typical big hard drive was far smaller than a single CD. (A CD burner was usually hooked up to a dedicated Mac -- always a Mac -- with an unusually large and expensive external hard drive.) If this technology becomes available within, say, five years, it should still be far ahead of our 4TB SSDs for a good 12 years (assuming Moore's Law in both cases, which is probably optimistic for SSDs). For comparison, CD-Rs probably stopped being useful for backups about fifteen years after I started using them in 1992 (by 2007 it was one large or a few small projects per CD).


It would be interesting to see how difficult it would be to make micro-CD drives, where the disc has a housing just like an old floppy, but is about the size of an SD card... Super cheap, super dense, and ideally better protected than current CDs...


Is the article using "DVD" to mean "a DVD-sized disk", or does it actually mean DVD? Either way, it's impressive.


DVD-sized disk.


    [...]using a two-light-beam method, with different
    colours [...]

    The two beams were then overlapped. As the second
    beam cancelled out the first in its donut ring, the
    recording process was tightly confined to the centre
    of the writing beam.
How do they get the two beams (of different frequency) to cancel each other out?


This isn't optical interference (in which case, as you say, beams with different wavelengths couldn't affect each other).

Instead, the recording medium has a response that looks kinda like intensity(lambda1) - intensity(lambda2). They combine a spot (bigger than they want) of wavelength lambda1 with a "doughnut" of wavelength lambda2. The net response of the recording medium is then "spot minus doughnut", and if they choose the parameters right, that looks sufficiently like a smaller spot.
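The "spot minus doughnut" idea can be sketched numerically. Here's a toy 1-D model (illustrative only: the real medium's response is a chemical threshold effect, and the Gaussian/ring profiles and the width parameter here are my assumptions, not values from the paper):

```python
import math

SIGMA = 0.3  # assumed beam width, arbitrary units


def spot(x):
    # writing beam: Gaussian intensity profile, peak 1 at the centre
    return math.exp(-x * x / (2 * SIGMA * SIGMA))


def doughnut(x):
    # inhibition beam: zero at the centre, peaking in a ring (normalised to 1)
    u2 = (x / SIGMA) ** 2
    return u2 * math.exp(-u2 / 2) * math.e / 2


def response(x):
    # net media response: spot minus doughnut, floored at zero
    return max(spot(x) - doughnut(x), 0.0)


def fwhm(profile, lo=-1.0, hi=1.0, n=2001):
    # full width at half maximum of a sampled profile
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    peak = max(profile(x) for x in xs)
    above = [x for x in xs if profile(x) >= peak / 2]
    return above[-1] - above[0]


# the net response is much narrower than the writing spot alone
print(fwhm(spot), fwhm(response))
```

The subtraction leaves a feature narrower than either beam's own focal spot, which is the effect the article is describing.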

You can see the full article at http://www.nature.com/ncomms/2013/130619/ncomms3061/full/nco... for more details.


That makes sense, thanks.


I don't know. Purple doesn't even have a wavelength[1]. I'd love to see the technical paper instead of the simplified explanation.

EDIT: oh, the technical side is available for free[2]. Let's see.

[1] http://en.wikipedia.org/wiki/Purple

[2] http://www.nature.com/ncomms/2013/130619/ncomms3061/full/nco...


Well you probably already know this now since you found the technical paper, but for anyone else:

It looks like 'purple' was a simplification made for the posted article. The actual paper mentions the light as ultraviolet at 375 nm.


""" The key to 3D deep sub-diffraction OBL is the development of a unique material with two chemical activation channels. One is for photopolymerization and the other is for photoinhibition. For this aim, the material should be designed to satisfy the following requirements. First, it should include an initiator that is highly photosensitive to two-photon absorption generated by a writing beam, which allows for the near-threshold fabrication. Accordingly, it is possible to achieve a minimum degree of photopolymerization required for building solidified structures with a feature size smaller than the focal spot of the writing beam. Second, it should exhibit an effective inhibition of the 2PP process, which is """


Sure, you can put 1,000TB on it, but what's the read speed?

There is a reason both the PS4 and Xbox One will require you to install games even though the Blu-ray disc can hold all that data and more. High-density optical formats are slow to read.

At 16x, Blu-ray only reads at 72MB/s.


Higher densities bring faster reads. I would assume this technology would be many times faster than the raw linear read performance of Blu-ray, as Blu-ray was much faster than DVDs, and DVDs much faster than CDs.

72MB/sec, BTW, is substantially faster than best-case USB 2.0, which many people still use for backups and file transfers.

The primary limiting factor for games isn't the 72MB/sec linear read speed, it's seek time/random read performance.
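A quick sanity check on both figures, assuming Blu-ray's nominal 1x data rate of 36 Mbit/s and USB 2.0's 480 Mbit/s signalling rate:

```python
bd_1x_mb_s = 36 / 8      # Blu-ray 1x: 36 Mbit/s = 4.5 MB/s
usb2_mb_s = 480 / 8      # USB 2.0: 480 Mbit/s = 60 MB/s, before protocol overhead

print(16 * bd_1x_mb_s)   # 72.0 MB/s at 16x
print(usb2_mb_s)         # 60.0 MB/s theoretical best case
```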


This might help save billions of tax payer dollars spent by the NSA.


Government budgets don't go down; they'll just find a way to spend more along their predicted curve.


Nah, if things get tough, Google, Facebook, Yahoo, Microsoft, etc. will store it, and the NSA can access the data on demand.


So according to Brewster Kahle's estimates for storing all US phone calls (http://blog.archive.org/2013/06/15/cost-to-store-all-us-phon...), it would only take 272 of these theoretical DVDs per year!

Of course there's a big tradeoff in latency (on the order of 10s of seconds to switch DVDs) and throughput (unknown), but properly indexed I'm sure it would still be extremely useful, and extremely cheap.

Imagine fitting all of that data in this little box: http://gizmodo.com/5321357/sony-finally-popping-400+disc-blu...


The idea of cloud storage or streaming audio/video sounds even less enticing.


This would be for spooling archival data to, not to stream from. This will have great impact on the NSA's noble desire to keep a living record of all human communications!


It's too bad they don't address the media question. While it's great to write 9 nanometer dots, if your media fills them back in after a while, well, it's not as useful.

I've got media from the 80's (gold backed) that is still readable with no errors, and some that is aluminum (silver) backed and is readable with error recovery.


Otherwise known as a petabyte?


True; but this is a PopSci article. Most people who read such media are probably not familiar with the term. They'll probably think of a petting zoo when they read "petabyte".


Using this with archival-grade DVD-Rs would make big science much cheaper, more accessible, and more reproducible. Imagine the LHC fitting all their data into a briefcase, and just sending it to whoever asked for it. Of course, the real problem would be writing bandwidth.

Using archival grade media and write redundancy + ECC, you could decrease the size of what you put on a disk to just 10TB and probably hit a sweet spot between massive amounts of storage, exceptional reliability, and increased bandwidth.


Wouldn't that have huge problems with durability? A single scratch could render a lot of data unreadable.


Would it really be so hard to engineer these "disks" with an outside layer of plastic protection?

I have a mock-up here: http://i.imgur.com/DbV5ByR.jpg


Yes, and it's the same with CDs / DVDs. That's why the information is stored redundantly, to be able to losslessly read all data in the presence of a limited amount of scratches.
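As a toy stand-in for that redundancy, here's the simplest possible scheme: triple-replicate each byte and recover by majority vote. (Real CDs and DVDs use cross-interleaved Reed-Solomon codes, which are far more space-efficient; this just illustrates trading capacity for scratch tolerance.)

```python
from collections import Counter

COPIES = 3  # store every byte three times


def encode(data: bytes) -> bytes:
    # replicate each byte COPIES times
    return bytes(b for b in data for _ in range(COPIES))


def decode(blob: bytes) -> bytes:
    # recover each byte by majority vote across its copies
    return bytes(Counter(blob[i:i + COPIES]).most_common(1)[0][0]
                 for i in range(0, len(blob), COPIES))


payload = b"archival data"
damaged = bytearray(encode(payload))
damaged[4] ^= 0xFF                         # simulate a scratch hitting one copy
print(decode(bytes(damaged)) == payload)   # True: the vote recovers the byte
```

Any single corrupted copy of a byte is outvoted by the other two; the cost is that usable capacity drops to a third.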


That is an issue with any storage media where storage density increases relative to size.


Quick back of the envelope: 1 of these ~= 1333 CDs or 212 DVDs. The increase from a CD to a DVD was by a factor of ~6.26; this is 33.8 times that factor of increase... but no word on the speed of reading/writing in this article, which I assume will be slow.


Your back of the envelope is off by a factor of 1000.


He only had access to a small envelope.


4.7GB * 212 ~= 1000GB. But the article says 1000TB!
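Running the corrected numbers (1,000 TB, with 4.7 GB per single-layer DVD and ~0.75 GB per CD, as in the estimate above):

```python
capacity_gb = 1000 * 1000    # 1,000 TB expressed in GB

print(round(capacity_gb / 4.7))    # 212766 DVDs -- not 212
print(round(capacity_gb / 0.75))   # 1333333 CDs -- not 1333
```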


That is fine for writing, but how are they going to read the thing? Also, isn't exposure to direct sunlight going to obliterate everything inside?


They could still use this tech for HDDs: right now an HDD uses similar tech to a DVD, just with more precision, and this would make things more precise still.

(Though I don't really know about it being obliterated when exposed to sunlight.)


I wonder if researchers are also looking into metamaterials as a way to increase storage density on optical discs?


Spendthrifts! They could have used a CD and increased the capacity from 600MB to one petabyte.


I wonder how long it would take to burn one CD on a typical desktop computer.


I can store 1024 terabytes on a DVD today, entirely in software, no innovative hardware based on new engineering principles necessary:

   dd if=/dev/zero bs=1024 count=1T | pv -c -W | gzip -c9 | pv -c -W > big.gz
I'm pretty sure the resulting big.gz will fit on a DVD with plenty of room to spare.

You may need to sudo apt-get install pv if you don't have that incredibly useful utility already. You may also need several hours of CPU time...

EDIT: Downvoted within two minutes? HN needs to get a sense of humor...



