On Connection Speeds and Appropriate Technology

It was Christmas Day of the year 2000.

In the previous year, my parents had secured a sub-prime mortgage on a house in the City to join in the American Dream of home-ownership, backed by the wages of a Pizza Hut delivery driver and an assistant manager at an Osco Drug. These were the glory days of the dot-com bubble, when the promises of trickle-down economics seemed to have finally arrived.

In the midst of this plenty, my parents decided to join the masses and bought their first internet-accessible computer as a family Christmas gift: the budget-line Compaq Presario 5184 boasting the following specs:

CPU: AMD K6-2 380 MHz

RAM: 64 MB

VRAM: 4 MB

Video: Unified Memory Architecture (UMA)

HDD: 6 GB

Networking: 56 kbps modem

Operating System: Windows 98

With four of us kids, we were each limited to using this fine machine in 30-minute chunks on rotation, to be exercised during non-parental hours. As my father was a news and politics junkie (as well as a collector of email addresses), our allotments were always less than ideal, particularly during the school year.

At some point afterward, my older brothers discovered and installed Napster.

O, Napster!

With a 56k modem, connected to a high-speed peer (green peers or bust!), we could download a 3MB MP3 in approximately fifteen minutes.

To children who grew up listening to Casey Kasem and American Top 40, this was nothing short of a miracle! Instead of hoping to catch our favorite songs, we could now listen to them whenever we wanted--and all we had to do was wait fifteen minutes!

That experience of dial-up has shaped my intuitive expectation of reasonable connection speeds ever since. Nowadays I possess a 30 Mbps connection, and every download is as miraculous to me as that first Napster download.

However, we now live in an era where a 3 MB payload is considered acceptable, if not average, for delivering fundamentally text-based web pages.

Three megabytes is a tad abstract. Let's look at some rough equivalencies for a minute.

Moby Dick; or, The Whale as an uncompressed plain-text file from Project Gutenberg, header included, racks up a whopping 1,219,547 bytes (around 1.2 MB). This means that, roughly, people have become habituated to loading the informational equivalent of two and a half Moby Dicks per webpage.

A picture is worth more than a thousand words.

In 2005, the average long-form article at major newspapers was 1,200 words long.

Assuming that an average English word has 5.5 characters (including its trailing space), an average article runs about 6,600 characters--or, at one byte per ASCII character, about 6.6 kilobytes.

If a typical 3 MB webpage carried one average long-form article, and the text alone was what the user wanted, roughly 2.5 kB would have to load for every word of desired content.

That is about 450 words' worth of bytes per word actually read.
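These back-of-envelope figures are easy to check mechanically. Here is a quick sketch; all the constants are the assumptions stated above, not measurements:

```python
# Back-of-envelope check of the payload arithmetic above.
# All constants are the essay's stated assumptions, not measurements.
PAGE_BYTES = 3_000_000       # a "typical" 3 MB webpage
MOBY_DICK_BYTES = 1_219_547  # Project Gutenberg plain-text file, with header
ARTICLE_WORDS = 1_200        # average long-form article, circa 2005
BYTES_PER_WORD = 5.5         # average English word in plain ASCII

moby_dicks_per_page = PAGE_BYTES / MOBY_DICK_BYTES         # ~2.5
article_bytes = ARTICLE_WORDS * BYTES_PER_WORD             # 6,600 bytes
payload_per_word = PAGE_BYTES / ARTICLE_WORDS              # 2,500 bytes
words_loaded_per_word = payload_per_word / BYTES_PER_WORD  # ~455

print(f"{moby_dicks_per_page:.1f} Moby Dicks per page")
print(f"{payload_per_word:,.0f} bytes loaded per word of content")
print(f"~{words_loaded_per_word:.0f} words' worth of bytes per word read")
```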

Even Dickens would have trouble filling that gap.

It was the best of As,

It was the worst of As,

It was the A of wisdom,

It was the A of foolishness...

While lamentable without context, the trend is entirely intelligible when analyzed: Wirth's law, Parkinson's law, advertising, trackers, images, video, etc.--generally, the things that have accreted around the web experience and allow it to exist at all in the current Molochian economic context.

It would be nice to undo this trend in commercial activity. Unfortunately, doing so would require a paradigm in which frugality with bandwidth generates more value than the necessary evil of frivolous consumption. This seems extremely unlikely.

But what if, outside that for-profit context, different incentives produced different paradigms? What if we treated non-commercial plain-text transfer at the speed of comprehension as a municipal right? Or considered the transmission of information beyond the rate at which the recipient can comprehend it to be wasteful? In other words, what if we put humans first and transmitted information at their rate rather than at the rate Moore's Law permits?

Let's consider that long-form article again. Encoded purely as plain text, without decoration, its ~6,600-byte payload would load in about one second on a V.90 56k connection (roughly 7 kB/s)--the same speed I had as a child in 2000.

Let's assume that an average American adult reads at about 300 words per minute--give or take 100. One second of transfer at dial-up speed thus delivers about 1,270 words (7,000 bytes at 5.5 bytes per word), or roughly four minutes of reading: an instant backlog.

If we were to load pure text content at, or slightly above, the rate at which the average American adult consumes it, we could effectively limit a connection to ~30 bytes/second without hitting a hard wait. Of course, skimming or deep navigation would be problematic, but this could be mitigated or adopted as a feature; less skimming nominally means more comprehension and engagement with the author. In an era of hot takes, enforced patience might be a good thing to bake in.
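One way to picture such a comprehension-rate connection is a sender that drips plain text at a chosen words-per-minute. This is a hypothetical sketch--the `drip` function and its parameters are my invention, not any existing protocol--assuming the 5.5-bytes-per-word figure from above:

```python
import sys
import time

def pace_bytes_per_sec(wpm: int, bytes_per_word: float = 5.5) -> float:
    """Bytes/second needed to match a reading rate: ~27.5 at 300 WPM."""
    return wpm * bytes_per_word / 60

def drip(text: str, wpm: int = 300) -> None:
    """Hypothetical sender: emit text in small bursts at reading speed."""
    rate = pace_bytes_per_sec(wpm)  # ~30 B/s for the average reader
    chunk = 16                      # a few characters per burst
    for i in range(0, len(text), chunk):
        sys.stdout.write(text[i:i + chunk])
        sys.stdout.flush()
        time.sleep(chunk / rate)    # throttle to the reader's pace
```

At 300 WPM this throttles the stream to roughly 27.5 bytes per second; at 1000 WPM, to roughly 92--the ~30 and ~100 bytes/second figures above.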

Even if we assumed that everyone wanted to read at 1000 WPM using something like rapid serial visual presentation, this would only increase the payload rate to ~100 bytes/second.

Given no other constraints, this means that one second of transfer on a basic dial-up connection could satisfy the human-speed requirements of ~40 people reading at 1000 WPM, or ~85 at 300 WPM, both figures including ~40 bytes of TCP/IP packet-header overhead per reader per second. A single residential 30 Mbps line, like the one I currently have? More than 20,000 people at 1000 WPM, and 53,000 at 300 WPM (the latter well above the maximum process count for my CPU).
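The sharing arithmetic can be sketched the same way. Here I assume each reader draws payload at their reading rate plus roughly 40 bytes per second of TCP/IP header overhead (about one small packet per second); slightly different header assumptions will land in the same ballpark rather than matching the figures above exactly:

```python
# Back-of-envelope capacity: how many readers can one link feed?
# Assumes each reader needs payload at reading speed plus ~40 B/s of
# TCP/IP header overhead (roughly one small packet per second).
def readers_per_link(link_bps: int, wpm: int,
                     bytes_per_word: float = 5.5,
                     header_overhead: float = 40.0) -> int:
    link_Bps = link_bps / 8                          # bits to bytes
    per_reader = wpm * bytes_per_word / 60 + header_overhead
    return int(link_Bps / per_reader)

print(readers_per_link(56_000, 300))       # dial-up, average readers: ~100
print(readers_per_link(30_000_000, 1000))  # 30 Mbps, speed readers: ~28,000
```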

Maybe it's time to reevaluate what a non-commercial World Wide Web could look like, given broader attention to human factors and ergonomics.

txti