I just watched a video of a postdoc filling up a whiteboard with equations. He then erased the board, drew two Bloch spheres, and said that all that math represented rotating the left sphere around the Y axis until it looked like the right one.

Although that’s the first time I’ve seen anyone summarize the math visually like that, it occurred to me that one step is still missing: on a quantum computer, how do you actually rotate around the Y axis? I’ve never seen a quantum computing course or paper that shows how to go from math to visual to circuit.

For the record, I know how to rotate around the Y axis.
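For the curious, here’s that math in runnable form. This is a minimal NumPy sketch of my own, not any quantum SDK’s API: the standard Ry(θ) rotation matrix applied to the |0⟩ state.

```python
import numpy as np

def ry(theta):
    """Rotation about the Y axis of the Bloch sphere by angle theta."""
    return np.array([
        [np.cos(theta / 2), -np.sin(theta / 2)],
        [np.sin(theta / 2),  np.cos(theta / 2)],
    ])

ket0 = np.array([1.0, 0.0])  # |0>, the north pole of the Bloch sphere

# Rotating |0> by pi about the Y axis carries it to |1>, the south pole.
print(ry(np.pi) @ ket0)
```

Rotating by π/2 instead lands on the equator, an equal superposition of |0⟩ and |1⟩.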

Imagine a classical computing course that shows you how a classical computer adds 2 + 2. The professor talks about transistors and logic gates and yada yada yada, and you’re still left with the question: but how do I actually use a computer to add 2 + 2? How do I use a command line or an IDE to actually see the 4 on screen?

Quantum computing courses all seem to be structured this way. There’s a lot of math, but after the initial tutorials there’s no actual circuit building. You see math + circuit really early on — maybe with some visuals — but then the math takes over.

You could argue that the math is fundamental. You’re right. But for “quantum computing” courses, where’s the quantum computer? Let’s take the math, visualize it, build a circuit, and run it. Here’s the math to rotate around the Y axis, here’s what it looks like on a Bloch sphere, and here’s how we actually build the circuit and do actual quantum computing.
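To make that last step concrete without assuming any particular SDK, here’s a hedged sketch: a toy “circuit” as a list of gates, simulated and sampled with plain NumPy. The function names are mine, not any library’s; on a real stack you’d express the same thing in a framework such as Qiskit or Cirq and run it on hardware or a simulator.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Y-axis rotation gate as a 2x2 unitary matrix."""
    return np.array([
        [np.cos(theta / 2), -np.sin(theta / 2)],
        [np.sin(theta / 2),  np.cos(theta / 2)],
    ])

def run_circuit(gates, shots=1000):
    """Apply gates left to right to |0>, then sample measurement outcomes."""
    state = np.array([1.0, 0.0])  # start in |0>
    for gate in gates:
        state = gate @ state
    p1 = np.abs(state[1]) ** 2  # Born rule: probability of measuring 1
    ones = rng.binomial(shots, p1)
    return {"0": shots - ones, "1": ones}

# A one-gate circuit: rotate |0> halfway to |1> around the Y axis,
# which should give a roughly 50/50 histogram of 0s and 1s.
counts = run_circuit([ry(np.pi / 2)], shots=1000)
print(counts)
```

Those counts are exactly the histogram a real backend would hand back, which is the payoff of the whole math-to-visual-to-circuit pipeline.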

Throw in the kitchen sink: matrices, bra-ket notation, Bloch spheres, circuit diagrams, histograms… all of it. Go through algorithms gate by gate, explaining what each is actually doing. Where I come from, that’s called teaching.

You wouldn’t teach classical computing courses without classical computers, so why would you teach quantum computing courses without quantum computers?