
maybe all the major/preferred algorithms of interest to this audience have been mentioned at this point. however, a few more deserve mention for completeness, and some analysis of what counts as a significant algorithm is relevant here.

in CS & IT fields there is a phenomenon noticed long ago in AI called "moving the goalposts": the field advances relatively quickly, but people mentally adjust to "the new normal" and, in retrospect, treat real or even breakthrough advances as mundane or unremarkable once accomplished, i.e. downplayed or minimized. this shows up in this question in the way algorithms move from R&D into "deployment". quoting the author of the question in later comments:

> In fact, a negligible fraction of all the code that gets written is implementing anything that is interesting from an algorithmic point of view.

but this is problematic, and basically a TCS-centric redefinition of the word "algorithm". presumably the interesting algorithms are the advanced ones. does that mean that once a problem is reduced to an advanced algorithm, it's no longer "interesting"? and "advanced" is clearly a moving target. so there is a way to define "algorithms" narrowly, or broadly. it seems the TCS definition changes with context, but note that even in TCS there is a trend toward the broad definition, e.g. in the so-called "algorithmic lens".

sometimes the most ubiquitous algorithms are also the most overlooked! the internet and WWW form a large environment/near-ecology for algorithms. still relatively young at only about two decades old (invented ~1991), the web has grown massively and exponentially in a short amount of time. WWW site growth has probably even outpaced the famous exponential Moore's law.

the internet/WWW are supported by many sophisticated algorithms. the internet has complex routing algorithms built into routers (again powering multi-billion-dollar corporations such as Cisco), and some advanced theory is applicable there, e.g. in routing algorithms. these algorithms were the subject of emerging, cutting-edge research decades ago, but are now so fine-tuned & well understood that they're somewhat invisible.
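to make this concrete: link-state protocols such as OSPF essentially run Dijkstra's shortest-path algorithm over a graph of the network. a minimal sketch in Python follows; the 4-router topology and link costs are invented purely for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path costs from source, as computed in link-state
    routing (e.g. OSPF). graph maps node -> list of (neighbor, cost)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry, already found a shorter path
        for v, cost in graph[u]:
            nd = d + cost
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# hypothetical 4-router topology with symmetric link costs
net = {
    'A': [('B', 1), ('C', 4)],
    'B': [('A', 1), ('C', 2), ('D', 5)],
    'C': [('A', 4), ('B', 2), ('D', 1)],
    'D': [('B', 5), ('C', 1)],
}
print(dijkstra(net, 'A'))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

each router runs this computation independently over the shared link-state database, which is part of why the system tolerates individual failures so well.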

we should not so soon forget that decades ago, leading researchers were not even sure the internet would work or was possible (as seen in early packet-switching research, a radical new design pattern at the time, departing from the prior circuit switching), and even a few years ago there were fears it would fail to scale at some point and begin to break down under overwhelming spikes in volume.

it also uses sophisticated error detection/correction. the internet is probably the largest, most fault-tolerant system ever built by humans, and it is still growing.
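for one concrete flavor of this: IPv4, TCP and UDP headers all carry the 16-bit ones'-complement checksum of RFC 1071. a minimal sketch in Python (the example bytes below are arbitrary, just for illustration):

```python
def internet_checksum(data: bytes) -> int:
    """16-bit ones'-complement checksum of RFC 1071,
    as carried in IPv4/TCP/UDP headers."""
    if len(data) % 2:
        data += b'\x00'  # pad to an even number of bytes
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]     # sum 16-bit words
        total = (total & 0xFFFF) + (total >> 16)  # fold carry back in
    return ~total & 0xFFFF

header = b'\x45\x00\x00\x1c\x00\x01\x00\x00\x40\x11'  # arbitrary bytes
print(hex(internet_checksum(header)))
```

a receiver recomputes the sum over the received header (checksum field included) and expects the result to come out zero; anything else means the packet was corrupted in transit and gets dropped.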

next, there is a strong case to be made that the algorithms powering the WWW are advanced. HTTP & web servers are highly tuned/optimized and also use advanced security/encryption protocols (HTTPS). the rendering logic of a web page has become extremely advanced with HTML5 & CSS3, along with the JavaScript programming language.
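worth noting that underneath all the tuning, HTTP itself is a plain-text protocol. a minimal sketch of an HTTP/1.1 GET over a raw TCP socket in Python (example.com as a stand-in host; HTTPS wraps this same exchange in a TLS session, see the `ssl` module):

```python
import socket

host = 'example.com'  # stand-in host for illustration
with socket.create_connection((host, 80)) as sock:
    request = (f'GET / HTTP/1.1\r\n'
               f'Host: {host}\r\n'
               f'Connection: close\r\n\r\n')
    sock.sendall(request.encode('ascii'))
    response = b''
    while chunk := sock.recv(4096):  # server closes when done
        response += chunk

# first line is the status line, e.g. "HTTP/1.1 200 OK"
print(response.split(b'\r\n')[0].decode())
```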

the relatively new CSS has various principles similar to OOP, such as reusability and inheritance. speaking of typesetting, TeX is an important, internally complex scientific typesetting system (not so different from a programming language) invented by Knuth that can now be rendered on web pages (and is used in possibly hundreds of thousands of scientific papers or more).

another relatively new area of algorithms building on the internet, still emerging, is those based on collective intelligence. stackexchange software itself is an example of a sophisticated collective-intelligence system. social networking also exhibits the key features of collective intelligence, and features are continually being added to increase that intelligence (for example, facebook "Likes" are only a few years old). the field of rating systems is based on collaborative filtering algorithms and is still evolving based on new research and applications.
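to give a flavor of that last point, here is a minimal user-based collaborative-filtering sketch in Python, predicting a missing rating by a similarity-weighted average over other users (cosine similarity over co-rated items; all names and ratings invented for illustration):

```python
import math

# toy user -> {item: rating} matrix, invented for illustration
ratings = {
    'alice': {'item1': 5, 'item2': 3, 'item3': 4},
    'bob':   {'item1': 4, 'item2': 3, 'item3': 5, 'item4': 3},
    'carol': {'item1': 1, 'item2': 5, 'item4': 4},
}

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    common = ratings[u].keys() & ratings[v].keys()
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def predict(user, item):
    """Similarity-weighted average of other users' ratings for item."""
    sims = [(cosine(user, other), r[item])
            for other, r in ratings.items()
            if other != user and item in r]
    total = sum(s for s, _ in sims)
    return sum(s * r for s, r in sims) / total if total else None

# alice hasn't rated item4; estimate it from bob & carol (~3.4 here)
print(predict('alice', 'item4'))
```

production recommenders use far more sophisticated variants (matrix factorization etc.), but the weighted-neighbors idea above is the core of the classic approach.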

so in short, all these revolutionary successes transforming daily human experience go quite far beyond mere "field goals". as the title of the question states, all these core algorithms are deployed, now so ubiquitous and invisible as to be, in the IT expression, "part of the plumbing".