Parallel computers don't "multiply their performance" in computation theoretic terms. If you took half as long by computing on two Turing machines simultaneously you didn't decrease the time complexity of the algorithm. You still churned through about the same number of total operations as if you had done it serially.
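To make that concrete, here's a minimal sketch (hypothetical example, not from the original discussion) of summing a list serially versus split across two workers: the wall-clock time can drop, but the total operation count stays roughly the same.

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000_000))

# Serial: one worker performs all n additions.
serial_ops = len(data)

# Parallel: two workers each sum half, plus one step to combine.
half = len(data) // 2
with ThreadPoolExecutor(max_workers=2) as pool:
    parts = list(pool.map(sum, (data[:half], data[half:])))
total = sum(parts)

# Still ~n operations overall, just spread across two workers.
parallel_ops = half + (len(data) - half) + 1
```

The elapsed time may halve, but `parallel_ops` is essentially equal to `serial_ops`: the work was redistributed, not reduced.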
The distinction between total operations and total time matters, because your energy requirements scale with the total number of operations (effectively the time complexity of what you're computing), not with the wall-clock time it takes.
An optical computer, for example, has a limit on how densely it can pack parallel computation into the same space, because at some point your lenses overheat and melt from the sheer intensity of the light passing through them. It's possible QCs are subject to similar limitations, and that, despite the math saying they can compute certain functions in polynomial time, doing so requires pushing exponential amounts of energy into the same space.
Good points. I just wanted to draw attention to the idea that the Turing machine is just one model of "computing". Whatever is proven about Turing machines is proven about Turing machines, not about all mechanical devices in general, such as quantum computers and "light computers".
I'm wondering: is there a universal definition of "computing"? Saying that "computing is what Turing machines do" somehow seems circular. :-)
Well, Turing machines are supposed to be that definition. The Church-Turing thesis still hasn't been shown to be false. Proving something of Turing machines that doesn't hold for one of those classes of devices would mean refuting it. Basically we'd need to find some problem that, memory not being an issue, one class of computer can solve in finite time while another can't solve in any finite amount of time.