High Speed Data Compression - Cold Fusion Of The Internet?

from the hyperbole-of-the-internet? dept

Here's an article about some researchers who claim they've created a new way to compress data that they believe can revolutionize data transfer and storage. They say it challenges theoretical assumptions about compression, and have even called it "cold fusion" for the internet - which could be more accurate than they realize. I'm waiting until I see/hear more about this. The language people are using to describe it seems way too extreme at this point.


Reader Comments



  1. identicon
    mhh5, 9 Jan 2002 @ 6:06pm

    be skeptical... very skeptical..

    "The company has thus far demonstrated the technology only on very small bit strings, but if the formula can be scaled up to handle massive amounts of data, the consequences could be enormous. "

    WHAT?!? Who tests "revolutionary" compression algorithms on "very small bit strings"? How much processor time does this new technique take? There's gotta be some trade-off here - like it takes 10 supercomputers an average human lifespan to compress one 650MB disc.... (if they're going to exaggerate the benefits, I might as well exaggerate some drawbacks.)

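The skepticism above has solid footing: a simple pigeonhole-counting argument shows why no lossless compressor can shrink *every* input, which is exactly the kind of theoretical assumption the article says is being challenged. Here's a minimal sketch in Python (the helper name and the choice of 8-bit strings are illustrative, not anything from the article or the company's claims):

```python
# Pigeonhole argument: a lossless compressor must map distinct
# inputs to distinct outputs (otherwise decompression is ambiguous),
# but there are strictly fewer bit strings *shorter* than n bits
# than there are n-bit strings - so some n-bit input cannot shrink.

def count_strings_up_to(length):
    """Number of distinct bit strings with length in [0, length]."""
    return 2 ** (length + 1) - 1  # 1 + 2 + 4 + ... + 2^length

n = 8
inputs = 2 ** n                                # all 8-bit strings: 256
shorter_outputs = count_strings_up_to(n - 1)   # strings of length 0..7: 255

# One fewer possible output than inputs: at least two inputs would
# have to share a compressed form, making decompression impossible.
assert shorter_outputs < inputs
print(inputs, shorter_outputs)  # → 256 255
```

The same count holds for any n, so a scheme demonstrated only on "very small bit strings" has shown nothing about the general case - it can only have compressed a favorable subset of inputs.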

  2. identicon
    free, 21 Aug 2003 @ 3:57pm

    Re: be skeptical... very skeptical..

    you're just a pessimist - they said it was "high-speed". i mean, c'mon, we had the same development hurdles when the microprocessor was first invented: early designs had a 1-bit or 2-bit bus and were very expensive to make (in the thousands per processor). 8-bit buses didn't happen (cheaply) until shortly before the Intel 8086.

    so even if this technology is expensive (in CPU cycles) currently, if scalable, it will no doubt revolutionize the entire industry.

    i have a feeling the scalability challenge is more related to the actual underlying principles involved (i.e. quantum mechanics, physics, etc.) rather than the CPU cycles used to achieve the result. what if, for example, the mathematics being used has limitations when it comes to larger chunks of data?


