Use Whatever Works, Even 'Alleged' Quantum Computing Algorithms -- Then Publicize It

from the sufficiently-advanced-technology-(or-magic) dept

When faced with really hard computational challenges, throwing more and more hardware at the problem can be a Sisyphean task. Often, the best answers to tough problems turn out to be solutions that are simply "good enough" to work. But there's always demand for better -- as well as a desire to make creative solutions sound cool and interesting. Some research groups promote their work with somewhat academic problems -- like pattern-matching software that can play games like Jeopardy! and chess. But there are also much more commercially driven pattern-recognition problems that remain unsolved, and when solutions aren't quite forthcoming, the research behind them can become more about grabbing attention than about producing useful algorithms. And adding a bit of mystery can make almost any research sound more appealing to the public.

A recent example of tackling a hard problem and making it sound mysterious comes from Google's research into quantum computing algorithms, for which Google has partnered with D-Wave, a small Canadian startup, to take on the computational challenge of image recognition. The mystery, though, is that there may or may not be any quantum computing involved in this research at all. D-Wave claims to have a silicon-based chip that can simulate certain quantum mechanical scenarios, but the company hasn't yet published any peer-reviewed papers on its apparent breakthroughs in quantum computing. D-Wave admits it's not sure whether its technology is truly simulating quantum behavior, but presumably that determination could be made if the company really wanted to know for certain. So while D-Wave says it's still evaluating its own technology, skeptics question the validity of its claims.

In the meantime, though, a reportedly faster image recognition algorithm seems to rely on D-Wave's chip. Google has presented research concluding that its "quantum" algorithm outperforms the classical algorithms currently in use in its own data centers. However, these results don't necessarily make the achievement notable: a better algorithm could just as easily be classical. And given that the D-Wave chip hasn't been fully characterized, it's not clear how it compares to other chips. In the end, though, Google can brag about its cutting-edge research, even if the progress can't be fully measured.



Reader Comments



  1.
    Marcus Carab (profile), 28 Dec 2009 @ 6:59pm

    I have been trying to get to the bottom of the D-Wave claim (in an admittedly somewhat cursory way) for a while... I can't help but believe that if the scientific community thought there was any real chance that they had really created a working quantum chip with real-world applications, then it would be huge news. There should already be a hundred other initiatives to use their technology...

    And yet, at the same time, I don't see Google backing a bunch of shifty con-scientists: they must believe it is at least possible that they really have a quantum chip, otherwise they wouldn't put themselves in the awkward position of backing a startup that later turns out to have been twisting the facts.

    I suppose, to be fair, D-Wave does openly admit that there is no proof of quantum activity in their chip... but they also don't do anything to quell the layman's excitement over a supposed breakthrough that is very dubious to the experts...


  2.
    Michael Ho (profile), 28 Dec 2009 @ 9:15pm

    Google isn't really "backing" D-Wave...?

    From what I've seen, Google is only a research partner for D-Wave, not an investor? So if Google is getting free access to potentially cool new technology, there's no real drawback for them if D-Wave ends up NOT having any quantum activity in its chips. The worst-case scenario seems to be that they have some specialized chips that may not be as good at solving complex calculations as quantum computers are supposed to be....


  3.
    Marcus Carab (profile), 29 Dec 2009 @ 12:19am

    Re: Google isn't really "backing" D-Wave...?

    You're right, there's no real loss for Google at this point. However, if Google were to make a bigger deal out of the fact that they are using quantum computers and then it became clear that these chips are not in fact "quantum", it might be damaging to Google's reputation for embracing and understanding emerging technologies.

    But really this is all just me, reliving over and over again the extreme-excitement-followed-by-profound-disappointment I felt when I first heard of D-Wave's "breakthrough".


  4.
    Richard (profile), 8 Jan 2010 @ 3:16am

    What's going on here

    First, we need a little clarity on what quantum computing is and isn't.

    1. All quantum computing (QC) is analogue computing (in the proper sense of computing by analogy, rather than the incorrect usage implying "continuous").

    2. QC created a big stir a few years back when Shor's factorisation algorithm was published. The key feature of this algorithm is parallelism in Hilbert space. Feynman had suggested some years earlier that there was a potentially enormous speedup if a way could be found to do such a parallel computation. The difficulty is how to "come back out" of Hilbert space without simply losing all your results. Shor showed how this could be done for a particular algorithm (factorisation). Because factorisation is critical to encryption schemes like RSA, and because the speedup is so enormous, the impact at the time was big - even though only one algorithm had been found. An unfortunate side effect of all this is that a lot of uninformed commentators assume that QC can be implemented for any computation you like and that the speedup will always be colossal. Neither of these is true. There have been a few more QC algorithms devised since - but none achieves the same speed bonus as Shor's.
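
    To make the reduction concrete, here is a minimal classical sketch - plain Python, purely illustrative, not anyone's actual implementation. Shor's insight is that factoring N reduces to finding the period r of f(x) = a^x mod N; everything around the period-finding step is cheap classical arithmetic, and the quantum speedup lies entirely in replacing the brute-force find_period loop below.

    # Classical scaffolding of Shor's reduction: factoring via period finding.
    # A quantum computer's only job would be to find r exponentially faster
    # than this brute-force loop does.
    from math import gcd
    from random import randrange

    def find_period(a, N):
        """Smallest r with a^r = 1 (mod N) - the step a quantum computer speeds up."""
        r, value = 1, a % N
        while value != 1:
            value = (value * a) % N
            r += 1
        return r

    def factor(N):
        """Return a nontrivial factor of an odd composite N via period finding."""
        while True:
            a = randrange(2, N)
            if gcd(a, N) > 1:
                return gcd(a, N)          # lucky guess already shares a factor
            r = find_period(a, N)
            if r % 2:
                continue                  # need an even period; try another a
            y = pow(a, r // 2, N)
            for f in (gcd(y - 1, N), gcd(y + 1, N)):
                if 1 < f < N:
                    return f              # nontrivial factor recovered from r

    print(factor(15))                     # prints 3 or 5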


    Now let's consider D-Wave's claims:

    1. D-Wave uses adiabatic QC. This is a separate development which addresses minimisation problems. Since many problems can be expressed in this form, it is more general than other forms of QC. In adiabatic QC we start with a simple system in its ground state. We then gradually change the system into the required one and hope that it remains in its ground state (that's what adiabatic means here). The new state should be the required minimum. There is a potentially large speedup here - but it can't be quantified in the same way as Shor's algorithm. Really it's not much different from the speedup you get from a classical analogue computer - such as a wind tunnel! (There's a toy numerical sketch of the adiabatic idea after this list.)

    2. However (as with all QC) there are serious practical difficulties. Can we be sure that, in changing the system, we haven't added any energy? D-Wave have to take extraordinary precautions against this - and yet it still isn't clear that they are effective.

    3. The sting in the tail is that D-Wave's system may also make sense as a classical analogue computer - and may get useful results even if the adiabatic conditions aren't actually being enforced. You may still get A minimum even if it isn't THE minimum - and there isn't a way of telling the difference. (The sketch after this list shows this failure mode in miniature.)

    4. Because of the extreme environmental requirements of superconductivity, D-Wave's chips aren't coming to your desktop PC in the foreseeable future.

    5. D-Wave aren't making as much progress as they predicted: they are currently more than 2 years behind the schedule they laid out less than 3 years ago.

    6. Google has plenty of money to gamble on things "just in case they work". Many major corporations wasted money on cold fusion several years ago on exactly the same principle.
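
    For the curious, points 1 and 3 can be seen in a toy numerical experiment - a generic two-level textbook example, emphatically not a model of D-Wave's actual hardware. Interpolate H(s) = (1 - s)*H0 + s*H1, start in the ground state of H0, and check whether the state tracks the ground state of H1. Sweep slowly and it does; sweep too fast and the state ends up partly excited - A minimum, but not THE minimum.

    # Toy adiabatic evolution on a single qubit (numpy/scipy sketch).
    import numpy as np
    from scipy.linalg import expm

    H0 = -np.array([[0, 1], [1, 0]], dtype=complex)  # ground state: (|0> + |1>)/sqrt(2)
    H1 = np.array([[1, 0], [0, -1]], dtype=complex)  # ground state: |1>

    def anneal(T, steps=2000):
        """Sweep H(s) = (1-s)H0 + sH1 over total time T; return P(final ground state)."""
        psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # ground state of H0
        dt = T / steps
        for k in range(steps):
            s = (k + 0.5) / steps                 # linear schedule, s: 0 -> 1
            H = (1 - s) * H0 + s * H1
            psi = expm(-1j * H * dt) @ psi        # exact step for piecewise-constant H
        return abs(psi[1]) ** 2                   # overlap with H1's ground state |1>

    print(anneal(50))    # slow sweep: ~1.0 (adiabatic - tracks THE minimum)
    print(anneal(0.5))   # fast sweep: well below 1 (lands in A state, not THE ground state)

    The hard part, which this sketch sweeps under the rug, is exactly point 2: keeping a real physical system cold and isolated enough that the slow-sweep behaviour actually holds.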


  5.
    Michael Ho (profile), 8 Jan 2010 @ 2:22pm

    Re: What's going on here

    Interesting... Thanks for the info.

    I didn't realize that D-Wave was using superconducting processors...
    http://www.dwavesys.com/index.php?page=quantum-computing

    Why don't more high-performance computers use superconducting materials? Cray used to cool their supercomputers with liquid nitrogen back in the day... but I haven't seen much of that anymore...? (maybe I'm just not looking, tho)


