
Technically, firmware is "hardware", so I would expect it to remain. I assume "disappearing binaries" is about the information being removed, not some physical calamity that struck every information storage device yet narrowly missed anything that didn't look like executable machine code. If firmware did remain, we would have a good head start: we should be able to boot any system that can run off firmware, and at least have some primitive OS and text-editing capability. The interesting problem is that if we have source code but most of it is stored electronically, it is mostly useless to us until we can restore the systems able to read it.

Since there is so much potentially usable software lying around, it would make the most sense to reproduce the compilers that could rebuild that software. Society would most likely not try to reinvent programming languages from scratch until the ones we just lost had been recovered. Since we would have lost the internet and digital information storage, our best bet would be to target the best-documented languages, the ones for which we have books. Without a doubt, a C compiler would be the first and most important high-level tool in the rebuilding effort. Once you have that, progress can be made very quickly: you can rebuild entire OSes, many software tools, and compilers for a great many languages. There is a reason this 40-year-old language still tops the TIOBE index. It is the "English" of the programming world: awkward, annoying, ubiquitous, and powerful.
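As a rough illustration of why C in particular would come first: even a naive, non-optimizing compiler can translate it almost mechanically into machine instructions. The function below is invented for this answer, and the x86-64 translation in the comments is one plausible naive rendering under the System V calling convention, not the output of any real compiler.

```c
/* A tiny C function and, in comments, one plausible naive x86-64
 * translation (AT&T syntax) -- showing how directly C maps onto
 * the machine, and why a hand-built, non-optimizing C compiler is
 * a realistic early goal. */
long add_scaled(long a, long b, long k)  /* a in %rdi, b in %rsi, k in %rdx */
{
    return a + b * k;  /*  mov  %rsi, %rax    # rax = b        */
                       /*  imul %rdx, %rax    # rax = b * k    */
                       /*  add  %rdi, %rax    # rax = a + b*k  */
                       /*  ret                # result in %rax */
}
```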

Since there are so many C/C++ experts in the world, once you have a system that can accept text and store bytes on disk, building a compiler should actually not be that hard. Most likely, a bunch of folks would be improving the "IDE" in raw assembly/machine code, probably reinventing it from scratch just to boost their own productivity. Many parts of a minimal OS would be brute-forced just to get this first C compiler up and running. But I'm pretty sure that getting the first self-hosting build of the compiler would be the moral equivalent of the starter motor on a giant engine finally spinning the flywheel up to the point where it keeps itself turning.
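To give a flavor of that starter stage, here is a hypothetical sketch of a "stage zero" compiler: a recursive-descent translator for integer arithmetic that emits x86-64 assembly text. Every name in it is invented for illustration. The point is the size: something this small could be hand-translated to machine code if need be, and each later stage would be written in whatever subset the previous stage accepts, until the real C compiler can compile itself.

```c
/* bootstrap0.c -- a hypothetical "stage zero" compiler sketch:
 * recursive-descent translation of integer expressions with
 * + - * / and parentheses into x86-64 assembly (AT&T syntax). */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

static const char *p;                   /* cursor into the source text */

static void expr(void);                 /* forward declaration */

static void skip(void) { while (isspace((unsigned char)*p)) p++; }

/* factor := NUMBER | '(' expr ')' -- leaves its value on the stack */
static void factor(void)
{
    skip();
    if (*p == '(') {
        p++;
        expr();
        skip();
        if (*p++ != ')') { fprintf(stderr, "expected ')'\n"); exit(1); }
    } else if (isdigit((unsigned char)*p)) {
        char *end;
        long n = strtol(p, &end, 10);
        p = end;
        printf("    push $%ld\n", n);
    } else {
        fprintf(stderr, "unexpected character '%c'\n", *p);
        exit(1);
    }
}

/* term := factor (('*' | '/') factor)* */
static void term(void)
{
    factor();
    for (skip(); *p == '*' || *p == '/'; skip()) {
        char op = *p++;
        factor();
        printf("    pop %%rcx\n    pop %%rax\n");
        if (op == '*')
            printf("    imul %%rcx, %%rax\n");
        else
            printf("    cqto\n    idiv %%rcx\n"); /* quotient lands in rax */
        printf("    push %%rax\n");
    }
}

/* expr := term (('+' | '-') term)* */
static void expr(void)
{
    term();
    for (skip(); *p == '+' || *p == '-'; skip()) {
        char op = *p++;
        term();
        printf("    pop %%rcx\n    pop %%rax\n");
        printf("    %s %%rcx, %%rax\n", op == '+' ? "add" : "sub");
        printf("    push %%rax\n");
    }
}

int main(int argc, char **argv)
{
    if (argc != 2) { fprintf(stderr, "usage: %s EXPRESSION\n", argv[0]); return 1; }
    p = argv[1];
    printf(".globl main\nmain:\n");
    expr();
    skip();
    if (*p) { fprintf(stderr, "trailing garbage: '%s'\n", p); return 1; }
    printf("    pop %%rax\n    ret\n"); /* expression value becomes the exit status */
    return 0;
}
```

Something like `./bootstrap0 '2 + 3 * 4' > out.s && cc out.s && ./a.out; echo $?` would print 14. A real stage zero would of course grow functions, variables, and control flow before attempting the jump to compiling C proper.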

In fact, this process would most likely happen in many places all over the world. It's entirely possible that Russia or Eastern Europe would produce the first working C compiler post-catastrophe, given the number of hackers/virus writers there who have to understand low-level code. Although China has a lot of hackers, they tend to take higher-level pathways into systems; I would be surprised if they created an early C compiler from scratch (although a big group of enterprising university students might accomplish it through sheer force of will). US and Western European hackers would have the advantage of the most C books and reference manuals available to them, written in a language they read easily.

Now, if firmware is also zapped, things get much, much harder, along the lines of toggling in boot code on front-panel switches, as described in other answers. That is so depressing I can't even contemplate it. But I assume the two timelines merge once you get to a basic console: keyboard, monitor, and a persistent store, whether disk, tape, flash, etc.

Although many languages have self-hosting compilers, most of those compilers could be rebuilt from scratch in C, and most of the original language designers could aid in the effort. Overall, I think the rebuild would proceed much faster than people might imagine (from basic console to self-hosting C compiler in six months or less). In almost all cases, folks would probably decide that it's better to simply replace what was lost and regain the functionality than to run off the rails redesigning things.

A redesign would occur only if you also lost the source code. Perhaps the information is retained in books; but if all electronic executables and source were gone, then I think we would see significant redesign and a shortcut to more advanced techniques. C would still be rebuilt from scratch, because of its status as a kind of lingua franca, and possibly Java and a few other major languages would be revived (though necessarily as clean-room implementations). On the other hand, it would be much harder to restore Linux, Windows, or OS X from books alone, without any source code.

Interestingly, we could take this opportunity to eliminate a lot of nagging flaws from our languages, tools, and operating systems. Perhaps we wouldn't get C back exactly, but a kind of enhanced C99 with a lot of the legacy cruft removed. On the other hand, it would be to everyone's benefit to implement an exact C99 compiler, so that people around the world could exchange C sources as the digital world was rebuilt, and that pressure for compatibility would discourage innovation in the language itself. For the same reason, Linux would most likely become the de facto OS of the new era, simply because many portions of it could be restored from books and from the knowledge locked away in certain high-level wizards.

Proprietary software would probably fail to compete until the majority of its functionality had been replaced by open rebuilds. So the rebuild would most likely occur under a very open model, unless some countries noticed they were progressing much faster than others and could gain a competitive advantage by closing off their progress from the rest of the world. At the end of the day, global commerce would force countries to re-establish international standards, so it is hard to say how long such walls could survive.

Although many failed languages would simply not be reproduced (except perhaps by their loving creators), the most popular languages would surely be revived because of the value stored in programmers already proficient in them. The same is true of tools. However, it would take a long time to rebuild something like Microsoft Office or Adobe Photoshop, let alone Windows Server 2012. These tools may never exist again, and perhaps there would be a new arms race to reinvent each software category from scratch.

Every technology with a published standard would out-compete proprietary alternatives lacking one, simply because a standard represents intellectual effort preserved in text that does not need to be redone. But the weakest standardized technologies might be displaced by better alternatives, simply because the weight of legacy would have been lifted and would no longer be such a great advantage for bad old solutions.