Very visual example of how Shor's algorithm works to solve factoring. Nothing more than basic arithmetic required.
The big takeaway for me was that it's not just "try every combination at once," as the pop literature on the subject suggests. QC doesn't really do that. To get QC to work any better than a traditional computer for some task, you need to get lucky and stumble across an algorithm that QC can excel at for that task. Just from reading Scott Aaronson's article, it seems likely that most tasks simply don't have a QC optimization, so perhaps QC won't change much at all. (Well, except cryptography, which may change everything...)
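The reduction at the heart of Shor's algorithm really is just basic arithmetic: factoring N reduces to finding the period r of f(x) = a^x mod N. The quantum part is only the period-finding step; everything else is classical number theory. A minimal sketch, with the period found by brute force just to show the arithmetic (the values N = 15, a = 7 are illustrative choices, not from the article):

```python
from math import gcd

def period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), found by brute force.
    This is the step a quantum computer does exponentially faster."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Factor N via the period of a^x mod N (classical sketch)."""
    assert gcd(a, N) == 1
    r = period(a, N)
    if r % 2:              # need an even period; otherwise pick a new a
        return None
    y = pow(a, r // 2, N)  # a square root of 1 mod N
    if y == N - 1:         # trivial root, no factor; pick a new a
        return None
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))  # -> (3, 5)
```

Here a = 7 has period 4 mod 15, so y = 7^2 mod 15 = 4, and gcd(3, 15), gcd(5, 15) recover the factors 3 and 5.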
Yes, this was a bit of a surprise for me too: "In fact, in general solving NP-hard problems in polynomial time with quantum computers is not believed to be possible."
After quickly skimming it, it seems well done. It leans toward mathematical rigor, which may not be ideal for some people. On the other hand, it'd be pretty hard to avoid the math: quantum computing is linear algebra incarnate.
The only thing that caught my eye as off was totally minor. They say the many-controlled-Z gate used by Grover's algorithm can be done in O(n^2) constant-sized gates, with an argument-by-reference, but with that type of argument you might as well give the tight bound of Θ(n).
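For context on that Θ(n) bound: with n-2 clean ancillas, a Z controlled on n qubits decomposes into 2(n-2) Toffolis plus one CZ, i.e. Θ(n) constant-sized gates. A brute-force statevector check of that construction (the tiny simulator below is written for illustration; it's not from the paper):

```python
import numpy as np

def toffoli(state, c1, c2, t):
    """Flip bit t of each basis index where bits c1 and c2 are set."""
    new = state.copy()
    for i in range(len(state)):
        if (i >> c1) & 1 and (i >> c2) & 1:
            new[i ^ (1 << t)] = state[i]
    return new

def cz(state, a, b):
    """Phase -1 on basis states where bits a and b are both set."""
    new = state.copy()
    for i in range(len(state)):
        if (i >> a) & 1 and (i >> b) & 1:
            new[i] = -new[i]
    return new

def multi_controlled_z(state, n):
    """Z controlled on qubits 0..n-1, using clean ancillas n..2n-3.
    Total gate count: 2(n-2) Toffolis + 1 CZ = Theta(n)."""
    anc = list(range(n, 2 * n - 2))
    state = toffoli(state, 0, 1, anc[0])       # anc[0] = q0 AND q1
    for k in range(2, n - 1):                  # AND in the remaining controls
        state = toffoli(state, k, anc[k - 2], anc[k - 1])
    state = cz(state, anc[-1], n - 1)          # the actual phase flip
    for k in range(n - 2, 1, -1):              # uncompute the ANDs (reverse order)
        state = toffoli(state, k, anc[k - 2], anc[k - 1])
    state = toffoli(state, 0, 1, anc[0])
    return state

n = 4                                  # 4 "data" qubits, 2 ancillas
state = np.zeros(1 << (2 * n - 2))
for i in range(1 << n):                # uniform superposition on data,
    state[i] = 1 / np.sqrt(1 << n)     # ancillas held at |0>
out = multi_controlled_z(state, n)
# Only |1111> (index 15) should have picked up a minus sign.
print(out[0b1111] < 0, all(out[i] > 0 for i in range(15)))
```

For n = 4 that's 5 gates total, and only the all-ones basis state is negated, which is exactly what the multi-controlled Z should do.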
It's possible to have a gentle ramp-up on the math content while learning quantum mechanics. "The Theoretical Minimum" by Susskind and Hrabovsky is structured like that: They teach you the math as part of, and motivated by, teaching the physics, and the books don't lag or resort to hand-waving.
I've recommended this here before, but David Mermin's "Quantum Computer Science: An Introduction" is also good (like this paper, it's targeted at computer scientists who have a little math knowledge and little-to-no physics background). There's a dead-tree version, but the lecture notes the book is based on are available online: http://www.lassp.cornell.edu/mermin/qcomp/CS483.html
Quantum Computation and Quantum Information by Nielsen and Chuang is the standard introduction to quantum computing. Besides that, you can find some old CS191 Vazirani lectures on the internet.