Opinion 155: Scott Aaronson's "Theories" and "Theorems" are Egregiously Wrong When Taken at Face Value, and When Interpreted Correctly are "Right" but Uninteresting

By Doron Zeilberger

Written: Oct. 13, 2016

If you think that the title of this Opinion (what Scott A. would call "Opinion") is a bit harsh, you should read Scott's "response" to my Opinion 125, written on Alan Turing's birthday. In his famous "blog", in response to a query by Peter Sheldrick (June 26, 2012):

Hey Scott, what's your take on Dr. Z's newest opinion for Turing's 100th anniversary http://sites.math.rutgers.edu/~zeilberg/Opinion125.html?

Peter: Like most of Zeilberger's "opinions", that one strikes me as so egregiously wrong that one wonders to what extent he himself believes it, and to what extent he just likes throwing bombs at the "mathematical establishment". Let M be a Turing machine that enumerates all ZF proofs, and halts iff it finds a proof of 0=1. Then "M never halts" is a perfectly-meaningful statement (it even makes a falsifiable prediction about an actual computer program - what more could you want?), which happens not to be provable in ZF assuming ZF's consistency. By contrast, "M tastes like chicken" is a meaningless statement. Putting both of these statements into the same category ("meaningless") strikes me as a bizarre twisting of language in the service of an ideology (something I don't like even for good ideologies!). Indeed, a moment's thought reveals that, if Zeilberger's suggestion were adopted, mathematicians would immediately start working around it with circumlocutions ("meaningless in the sense of not provable in ZF" vs. "meaningless in the sense of meaningless").
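For concreteness, the machine M that Scott describes can be sketched in a few lines of Python. This is only a hedged illustration: `is_zf_proof_of_contradiction` is a hypothetical stub (a real proof checker is mechanical, since proof-checking is decidable, but far longer), and the `max_candidates` cutoff exists only so the sketch can actually be run.

```python
from itertools import count, product

ALPHABET = "01()=,ZF "  # toy symbol set; a real encoding would cover ZF's full language

def is_zf_proof_of_contradiction(candidate):
    """Hypothetical placeholder: a real implementation would mechanically
    check whether `candidate` encodes a valid ZF proof of 0=1.
    Proof-checking is decidable, so such a checker can in principle be written."""
    return False  # stub: accepts nothing

def M(max_candidates):
    """Enumerate all finite strings in order; halt iff one is a ZF proof of 0=1.
    With a real checker (and ZF consistent), the unbounded loop never halts;
    the max_candidates cutoff is only here so this sketch terminates."""
    examined = 0
    for length in count(1):
        for letters in product(ALPHABET, repeat=length):
            if is_zf_proof_of_contradiction("".join(letters)):
                return "halted: found a ZF proof of 0=1"
            examined += 1
            if examined >= max_candidates:
                return "still running after {} candidates".format(examined)
```

The point of the sketch is that "M never halts" is a claim about this very finite program, which is what makes Scott call it falsifiable.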

Scott, (I was almost tempted to write "Scott", but I resisted this temptation):

At the time I decided to ignore your dismissive "response", because I had better things to do than to convince people that their "God" ("infinitary" mathematics) is a priori meaningless, and it is a waste of time to argue about religion. But one thing made me change my mind.

In the October 2016 entry of Jean-Paul Delahaye's otherwise wonderful "Logique & Calcul" column of "Pour La Science", he described, in his usual inimitable style, the contents of your "paper" with Adam Yedidia, and it occurred to me that most people (including Delahaye, whom I admire) still take "undecidability" in the platonic, Gödelian, naive sense, as saying that there exist "true but unprovable" statements, tacitly assuming that every "statement" in mathematics, including those that involve quantifiers over "infinite" sets, makes perfect sense. This kind of meaninglessness is much worse than

"M tastes like a chicken" ,

because it is not obviously meaningless (to people indoctrinated by the current mathematical religion).

As I have said before, there is a quick dictionary to turn all this undecidability babble, and the obsession with related problems like the "busy beaver", into purely meaningful, albeit uninteresting, statements. Every statement that involves quantifiers over "infinite" sets, even such a "trivial" statement as

n+1 = 1+n, for EVERY natural number n,

(tacitly assuming that you have an "infinite" supply of them) is a priori meaningless, but many of them (including the above, and the statement that "for all" integers x,y,z > 0 and n > 2, x^n + y^n - z^n ≠ 0) can be made a posteriori meaningful, by proving them for symbolic n (and x,y,z). So the right dictionary (for statements that involve quantifiers over "infinite" sets) is:
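What "proving it for symbolic n" means can be illustrated in a few lines of Python (a minimal sketch, not Zeilberger's actual machinery): represent each side of n+1 = 1+n as a formal polynomial in the symbol n, a finite object, and check that the two representations coincide. No quantification over an infinite set is involved.

```python
from collections import Counter

def poly(*terms):
    """A formal polynomial in the symbol n, stored as {power: coefficient}.
    Each term is a (coefficient, power) pair; zero coefficients are dropped."""
    p = Counter()
    for coeff, power in terms:
        p[power] += coeff
    return {k: v for k, v in p.items() if v != 0}

# n + 1 and 1 + n as symbolic (finite!) objects:
lhs = poly((1, 1), (1, 0))   # n + 1
rhs = poly((1, 0), (1, 1))   # 1 + n

# one finite check, valid for symbolic n, hence for every instance:
assert lhs == rhs
```

In this reading, a "provable" universal statement is simply a verified identity between finite symbolic expressions, which is the a posteriori sense of the dictionary below.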

Provable : a priori meaningless (taken literally), but a posteriori meaningful, when interpreted correctly (for symbolic n)

Undecidable: not even a posteriori meaningful, impossible to make sense of symbolically

I am not saying that you are not brilliant, you sure are (and you are also a brilliant speaker, as I found out from your stimulating and engaging talk at AviFest last week), but you are wasting your talent on uninteresting research. Perhaps even worse than "undecidability" is your main research area of "quantum computing", which once again is a challenging intellectual mathematical game, but with empty content. The history of science and mathematics is full of people who had superstitious beliefs: Kepler believed in Astrology, Newton in Alchemy, but they did many other things besides. The great debunker, Gil Kalai (who debunked the Bible Code, along with co-debunkers Dror Bar-Natan and Brendan McKay), has recently pointed out (unfortunately in his understated, gentle way) the shortcomings of research in "quantum computing", and my impression is that he is right. It is indeed amazing how, in our current "enlightened" age that (allegedly) abhors superstition, such superstitious people as you (and many others, e.g. MIT cosmologist Max Tegmark, another admittedly brilliant, but nevertheless superstitious, scientist) can be full professors at MIT.

But then again, it supplies some comic relief, and some of us still enjoy Mythology and Theology. Still, it is not nice to be dismissive of people who do not share your superstitions.