There's something I don't get with this debate around "is this quantum or classical": I thought quantum computing meant breaking NP complexity. So, to determine whether it "is" quantum computing, one would expect any big dataset to easily show the difference in computing time...
Now, if I understood correctly, the problem is that the algorithms compared (i.e. annealing) are statistical in nature, so we're not actually comparing "fully" NP-complete algorithms, and so the expected difference is not as big as between an O(n) and an O(x^n) algorithm.
As I understand it, the problem is comparing a relatively advanced technology (classical computing) to a relatively immature technology (quantum computing). Classical computing is fast enough to simulate quantum annealing at the speed the quantum chip itself can process (because the technology is so new). This has led people to doubt whether the chip is actually doing the quantum process or simply simulating it. Or at least that's my grasp of the situation.
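To make the "classical machines can run annealing-style heuristics just fine" point concrete, here's a minimal sketch of classical simulated annealing in Python. This is my own toy example (the energy function and cooling schedule are made up for illustration), not D-Wave's actual algorithm; the point is just that annealing is a statistical search that accepts occasional uphill moves, and a classical CPU executes it directly:

```python
import math
import random

def simulated_annealing(energy, neighbor, state, steps=10_000, t0=1.0):
    """Classical simulated annealing: a statistical heuristic, not an
    exact solver. It samples candidate states and accepts worse ones
    with a probability that shrinks as the temperature cools."""
    best = state
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        candidate = neighbor(state)
        delta = energy(candidate) - energy(state)
        # Always accept improvements; accept regressions probabilistically.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = candidate
            if energy(state) < energy(best):
                best = state
    return best

# Toy instance: minimize a 1-D "energy landscape" with many local minima.
random.seed(0)
f = lambda x: (x - 3) ** 2 + 2 * math.sin(5 * x)
step = lambda x: x + random.uniform(-0.5, 0.5)
result = simulated_annealing(f, step, state=random.uniform(-10, 10))
```

Nothing here is quantum: there's no tunneling and no superposition, just thermal-style random exploration. The debate is about whether the D-Wave chip's physics gives it an asymptotic edge over heuristics like this one.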
Quantum computing means efficiently solving problems in BQP (bounded-error quantum polynomial time), not all of NP. Integer factorization is in NP and also in BQP, but some problems can just as well be in NP and not in BQP. And NP-complete is not a subset of BQP, as far as anyone knows.
Could someone here confirm if this is correct?