In 1879 Gottlob Frege published his Begriffsschrift: A formula language, modeled on that of arithmetic, of pure thought. This was when logic was still limited to Aristotelian syllogisms and few people saw any connection between it and mathematics. The only other significant attempt to bridge the two had been George Boole's An Investigation of the Laws of Thought. Instead of using Boole's rather opaque algebraic notation, Frege invented his own. It is utterly unlike any other notation system: often described as "two-dimensional", it arranges operations on the page something like a flowchart. Along with a set of informally justified axioms, Frege used this system to derive a long series of logical formulae.

Sadly, this gloriously weird system never caught on... for us.

In a timeline not too far away, however, it wasn't forgotten. The logicist project that Frege helped launch spread its use. The Principia Mathematica of that universe is even longer but somewhat more readable. When the alternate Turing and Church worked on the Entscheidungsproblem, they described computation in Frege's terms. And as digital computers proliferated in their wake, the new programming languages naturally followed the established notation, making only a few tweaks here and there to turn propositional statements into instructions a computer could execute.