## Quotes for "Quantum Computing since Democritus"

But why would Schrödinger be interested in this dialogue? Well, Schrödinger was interested in a lot of things. He was not an intellectual monogamist (or really any kind of monogamist).

Well, on its face, that seems sort of boring. But as you get more accustomed to this sort of work, I think what you'll find is...it's still boring!

I think quantum computing is really the poster child here. It's fine to tell people to “Shut up and calculate,” but the question is, what should they calculate?

I probably should tell you explicitly that I’m compressing a whole math course into this chapter. On the one hand, that means I don't really expect you to understand everything. On the other hand, to the extent you do understand – hey! You got a whole math course in one chapter! You're welcome.

As Bertrand Russell put it: “To choose one sock from each of infinitely many pairs of socks requires the Axiom of Choice, but for shoes the Axiom is not needed.” (What's the difference?)

This result is known as Löb's Theorem (again with the umlauts), though personally I think that a better name would be the “You-Had-the-Mojo-All-Along Theorem.”

do we ever really talk about the continuum, or do we only ever talk about finite sequences of symbols that talk about the continuum?

“hypercomputation.” I’ve read some of this stuff, and it's mostly along the lines of, well, suppose you could do the first step of a computation in one second, the next step in a half second, the next step in a quarter second, the next step in an eighth second, and so on. Then in two seconds you'll have done an infinite amount of computation! Well, as stated it sounds a bit silly, so maybe sex it up by throwing in a black hole

You see: every time you set up a Yahoo! Mail account, you're directly confronting age-old mysteries about what it means to be human...

So, for example, there's a function f such that TIME(f(n)) = TIME(2^f(n)). Duuuuude.

To turn this idea into a proof, the main thing one needs to show is that simulating the iterative procedure is pretty much the only way to compute f: or more precisely, any algorithm to compute f needs at least t(n − i) steps for some i. This then implies that f has no fastest algorithm.

If NP problems were feasible, then mathematical creativity could be automated. The ability to check a proof would entail the ability to find one. Every Apple II, every Commodore, would have the reasoning power of Archimedes or Gauss.

The key point is that, while this is a very large buttload of variables and relations, it's still only a polynomial

As I said before, the error probability of a BPP algorithm can easily be made smaller than the probability of an asteroid hitting the computer. And that's good enough for most applications: say, administering radiation doses in a hospital, or encrypting multibillion-dollar bank transactions, or controlling the launch of nuclear missiles. But what about proving theorems? For certain applications, you really can't take chances.
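A minimal sketch of the amplification trick being alluded to (the `noisy_test` subroutine is a made-up stand-in, not anything from the book): repeat a two-sided-error randomized algorithm many times and take a majority vote, and a Chernoff bound drives the error down exponentially in the number of repetitions.

```python
import random

def noisy_test(truth, error=1/3):
    """Stand-in for a BPP algorithm: returns the correct answer
    with probability 2/3 and the wrong one with probability 1/3."""
    return truth if random.random() > error else not truth

def amplified(truth, reps=1001):
    """Run the noisy test many times and take a majority vote.
    By a Chernoff bound, the failure probability drops
    exponentially in reps -- here it is far below the odds
    of that asteroid strike."""
    votes = sum(noisy_test(truth) for _ in range(reps))
    return votes > reps // 2

print(amplified(True))   # correct answer, with overwhelming probability
```

The point is that the exponent does all the work: going from error 1/3 to error 2^(-Ω(reps)) costs only a linear number of repetitions.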

Sometimes you see BPP algorithms called “Monte Carlo algorithms,” and ZPP algorithms called “Las Vegas algorithms.” I’ve even seen RP algorithms called “Atlantic City algorithms.”

Once again, it would seem there's a crucial difference between quantum theory and classical probability theory, which allows certain ideas (like those of Sipser–Gács–Lautemann, Adleman, and Impagliazzo–Wigderson) to work for the latter but not for the former.

Of course, there's something that needs to be proved here, but the something that needs to be proved can be proved, and I’ll leave it at that.

One example is what's called the Blum–Blum–Shub generator.

Blum et al. were able to show that, if we had a polynomial-time algorithm to distinguish f(x) from a random string, then (modulo some technicalities) we could use that algorithm to factor N in polynomial time.
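A toy sketch of how such a generator works (the tiny primes 11 and 23 are illustrative placeholders only; real security requires enormous random primes, both congruent to 3 mod 4, with N = pq kept hard to factor):

```python
def bbs_bits(seed, n_bits, p=11, q=23):
    """Toy Blum-Blum-Shub generator. p and q must be primes
    congruent to 3 mod 4; N = p*q. These values are toys --
    real use needs primes hundreds of digits long.
    Iterate x_{i+1} = x_i^2 mod N and output the least
    significant bit of each x_i."""
    N = p * q
    x = seed % N
    bits = []
    for _ in range(n_bits):
        x = (x * x) % N
        bits.append(x & 1)
    return bits

print(bbs_bits(3, 8))
```

Distinguishing this bit stream from uniform randomness is, per the Blum et al. result quoted above, as hard as factoring N (modulo those technicalities).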

working for GCHQ (the British NSA) in the early 1970s. Of course, they couldn't publish their work, so today they don't get much credit! Let that be a lesson to you.

As you probably know, there are two-qubit states that can't be written as the tensor product of one-qubit states. The most famous example is the EPR (Einstein–Podolsky–Rosen) pair: (|00⟩ + |11⟩)/√2.
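A quick numeric check of this fact (a numpy sketch, not from the book): reshape the four amplitudes a|00⟩ + b|01⟩ + c|10⟩ + d|11⟩ into the 2×2 matrix [[a, b], [c, d]]; the state is a tensor product of one-qubit states exactly when that matrix has rank 1 (equivalently, ad − bc = 0).

```python
import numpy as np

# Amplitudes of a two-qubit state in the basis |00>, |01>, |10>, |11>.
epr = np.array([1, 0, 0, 1]) / np.sqrt(2)                # (|00> + |11>)/sqrt(2)
product = np.kron([1, 0], [1/np.sqrt(2), 1/np.sqrt(2)])  # |0> tensor |+>

def is_product_state(psi):
    """True iff the two-qubit state psi is a tensor product of
    one-qubit states, i.e., its 2x2 amplitude matrix has rank 1."""
    return np.linalg.matrix_rank(psi.reshape(2, 2)) == 1

print(is_product_state(epr))      # False: the EPR pair is entangled
print(is_product_state(product))  # True: |0>|+> is separable
```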

And indeed, in 1998 Abrams and Lloyd used exactly this observation to show that, if quantum mechanics were nonlinear, then one could build a computer to solve NP-complete problems in polynomial time. Of course, we don't know whether NP-complete problems are efficiently solvable in the physical world. But in a survey I wrote years ago, I explained why the ability to solve NP-complete problems would give us “godlike” powers – arguably, even more so than the ability to transmit superluminal signals or reverse the Second Law of Thermodynamics. The basic point is that, when we talk about NP-complete problems, we're not just talking about scheduling airline flights (or for that matter, breaking the RSA cryptosystem). We're talking about automating insight: proving the Riemann Hypothesis, modeling the stock market, seeing whatever patterns or chains of logical deduction are there in the world to be seen.

There's one sense in which the No-Cloning Theorem has almost nothing to do with quantum mechanics. Namely, a precisely analogous theorem can be proved for classical probability distributions. If you have a coin that's heads-up with some unknown probability p, there's no way to convert it into two coins that are both heads-up with probability p independently of each other. Sure, you can look at your coin, but the information about p you get that way is too limited to enable copying. What's new in quantum mechanics is simply that the no-cloning principle applies, not only to mixed states, but also to pure states – states such that, if you knew the right basis in which to measure, then you could learn which state you had with absolute certainty. But if you don't know the right basis, then you can neither learn these states nor copy them.

The first explicit QKD proposal was by Bennett and Brassard in 1984, and is creatively known as BB84.

Recently, though, Paul Christiano and I proposed a new public-key quantum money scheme called the “hidden subspace scheme,”

To me it's sort of astounding that it took until the 1990s for anyone to really seriously ask this question, given that all the tools for asking it were in place by the 1960s or even earlier. It makes you wonder, what similarly obvious questions are there today that no one's asking?

From my perspective, Richard Feynman won the Nobel Prize in Physics essentially for showing BQP is contained in PP.

What the lower bound of Bennett et al. tells us is that, if quantum computing supports the existence of parallel universes, then it certainly doesn't do so in the way most people think! As we've seen, a quantum computer is not a device that could “try every possible solution in parallel” and then instantly pick the correct one. If we insist on seeing things in terms of parallel universes, then those universes all have to “collaborate” – more than that, have to meld into one another – to create an interference pattern that will lead to the correct answer being observed with high probability.

In science, you can always cook up a theory to “explain” the data you've seen so far: just list all the data you've got, and call that your “theory”! The obvious problem here is overfitting. Since your theory doesn't achieve any compression of the original data – i.e., since it takes as many bits to write down your theory as to write down the data itself – there's no reason to expect your theory to predict future data. In other words, your theory is worthless.

All ye who would claim the intractability of finite problems: that way lieth the P versus NP beast, from whose 2^n jaws no mortal hath yet escaped.

Intuitively, H(D) measures the minimum number of random bits that you'd need to generate a single sample from D – on average, if you were generating lots of independent samples. It also measures the minimum number of bits that you'd need to send your friend, if you wanted to tell her which element from D was chosen – again on average, if you were telling her about lots of independent draws.
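In symbols, H(D) = −Σ_x p_x log₂ p_x. A minimal sketch (not from the book) that makes the two interpretations concrete:

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits: H(D) = -sum_x p_x * log2(p_x).
    Zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in dist if p > 0)

# A fair coin costs one random bit per sample to generate
# (and one bit per draw to describe to your friend)...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin costs fewer bits on average.
print(entropy([0.9, 0.1]))   # about 0.469
```

The second number is an average: any single draw still takes at least one bit to report, but long runs of the likelier outcome can be compressed.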

But in the last 20 years, there's been the question – why should we trust physics? We trust it in life-and-death situations every day, but should we really trust it with something as important as proving the Four-Color Theorem?

MA, for “Merlin-Arthur.” Babai was imagining a game where “Merlin,” an all-powerful but untrustworthy proving wizard, supplies a polynomial-size certificate, and then “Arthur,” a skeptical, polynomial-time king,

And indeed, it's known that if one-way functions exist,

The famous PCP Theorem says that every NP problem admits PCPs – and furthermore, PCPs with polynomially long proofs! This means that every mathematical proof can be encoded in such a way that any error in the original proof translates into errors almost everywhere in the new proof.

computer science is what mediates between the physical world and the Platonic world. With that in mind, “computer science” is a bit of a misnomer; maybe it should be called “quantitative epistemology.” It's sort of the study of the capacity of finite beings such as us to learn mathematical truths.

There are these staggering, mind-bending mysteries about truth, proof, computers, physics, and the very limits of the knowable, and for ease of reference, we bottle the mysteries up using inscrutable sequences of three or four capital letters. Maybe we shouldn't do that. But we're going to do it anyway, starting with QMA (Quantum Merlin-Arthur), the quantum generalization of MA.

There is also a Quantum Cook–Levin Theorem – which is a great name, since both Cook and Levin are quantum computing skeptics

Very recently, Andy Lutomirski proposed a candidate problem that he (and I) conjecture should give such a separation, but no one has been able to prove it yet. If you can, that'd be great.

This is just a general fact about quantum measurements. If you could predict the outcome with certainty, then the state wouldn't be disturbed at all by the measurement.

Here's the thing: there seem to be all these bits near the event horizon of the black hole. Why the event horizon? Because if you're standing outside a black hole, then you never actually see anything fall through the event horizon. Instead, because of time dilation, all the infalling objects will seem to get eerily frozen just outside the event horizon – approaching it, Zeno-like, but never reaching it.

The other funny thing about this is that, in classical general relativity, the event horizon doesn't play a particularly special role. You could pass through it and you wouldn't even notice. Eventually, you'll know you passed through it, because you'll be sucked into the singularity, but while you're passing through it, it doesn't feel special. On the other hand, this information point of view says that as you pass through, you'll pass a lot of bits near the event horizon. What is it that singles out the event horizon as being special in terms of information storage?

In the case that these bits are stored in a black hole, apparently if there are n bits on the surface, then it takes on the order of n^(3/2) time for the bits to evaporate via Hawking radiation. So, the time-order of retrieval is polynomial in the number of bits, but it still isn't particularly efficient. A black hole should not be one's first choice for a hard disk.

For in MA, Merlin can give Arthur the polynomial-size circuit for solving #P problems, and then Arthur just has to verify that it works. To do this, Arthur just runs the interactive protocol from before, but where he plays the part of both the prover and the verifier, and uses the circuit itself to simulate the prover. This is an example of what are called self-checking programs. You don't have to trust an alleged circuit for counting the number of solutions to a formula, since you can put it in the role of a prover in an interactive protocol.

(In slogan form, the fact of computational hardness is what makes proving computational hardness so hard!)

and hence QIP = IP = PSPACE. So, ultimately, quantum interactive proof systems turned out to have exactly the same power as classical ones. Amusingly, in the classical case, the big surprise was that these systems can simulate PSPACE, while in the quantum case, the big surprise was that PSPACE can simulate them!

So the modern form of the Doomsday Argument, which was formalized by Bostrom,

STUDENT: It seems as though there's a tension between wanting to be right yourself and wanting to design policies that try to maximize the number of people who are right.

Indeed, I'd say we can be much more confident of that inequality than of the more familiar conjecture BPP ≠ BQP, which is “merely” based on stuff like the presumed classical hardness of factoring, nothing as “robust” as the infinitude of the polynomial hierarchy.

STUDENT: What happens if you have a “you-oracle” and then decide to do whatever the simulation doesn't do?
SCOTT: Right. What can we conclude from that? If you had a copy of the Predictor's computer, then the Predictor is screwed, right? But you don't have a copy of the Predictor's computer.
STUDENT: So this is a theory of metaphysics which includes a monopoly on prediction?

SCOTT: Let me put it this way: the simulation has to bring into being a copy of you. I’m not saying that the simulation is identical to you. The simulation could bring into being many other things as well, so that the problem it's solving is “you-hard” rather than “you-complete.”

This suggests one of my favorite proposals for how to solve NP-complete problems in polynomial time: why not just start your computer working on an NP-complete problem, then board a spaceship traveling at close to the speed of light and return to Earth to pick up the solution?

One of the things you'd expect such a theory to answer is whether CTCs can exist. So quantum gravity seems “CTC-hard,” in the sense that it's at least as hard as determining if CTCs are possible!

We're assuming that Nature gives us this stationary distribution for free. Once we set up the CTC, its evolution has to be causally consistent to avoid grandfather paradoxes. But that means Nature has to solve a hard computational problem to make it consistent! That's the key idea that we're exploiting.

Around 2008, John Watrous and I were able to solve the problem. Our result was that BQP_CTC = P_CTC = PSPACE. In other words, if CTCs existed, then quantum computers would be no more powerful than classical ones.
STUDENT: Do we know anything about other classes with closed timelike curves?

If so, then when you run your CTC computer, you might always just get one of these spurious, easy-to-find, computationally uninteresting fixed points.

Because now we're doing something new and exotic: asking Nature to find a fixed point of a given physical evolution, but not specifying which fixed point. That being so, if there are “doofus” fixed points lying around – ones that don't correspond to any fixed point of the original circuit C being simulated, and don't require solving any hard computational problem – then why shouldn't Nature be lazy and choose one of those, rather than one of the “hard” fixed points? If so, then in the presence of CTCs, “mysterious” computer failures would be the norm rather than an exotic aberration.

the “Evolutionary Principle”: the principle that “knowledge can only come into existence via evolutionary processes” (or, translated into computer-science terms, that NP-complete and similar problems shouldn't be “solvable as if by magic”).

STUDENT: As far as the spread of mass is concerned, I think that people believe that is finite, because of the Big Bang.
SCOTT: That's a misconception about the Big Bang. The Big Bang is not something that happens at one point in space; the Big Bang is the event that creates spacetime itself.

It's the vacuum energy. Again, this is physics. People don't define things, they observe them. They don't actually know what this vacuum energy is, they just know it's there.

You can't actually build a working computer whose radius is more than 20 billion light years or whatever. It's depressing, but true.

STUDENT: Why is it 10 billion light years?
SCOTT: Because that's the distance such that something that far away from you will appear to be receding away from you at the speed of light, if there's no countervailing influence of gravity.
STUDENT: So it's just a coincidence that that distance happens to be about the current size of the observable universe?

STUDENT: Where would you ever get an oracle?
SCOTT: You just define it. Let A be an oracle...