I am noticing a disturbing trend. On every device I use (e.g. iPhone 6 Plus, desktop, tablet), websites seem to be getting slower and slower to interact with. This isn’t true for every website I visit, but for all of the bigger and more heavily trafficked websites it definitely is.

Specifically, I have noticed the following:

- Websites take forever to show initial content. The screen stays white for a long time before anything appears. I can see that tons of data is being downloaded and the CPU spikes hard; sometimes fans spin up. Then, all at once, the website is shown, often ~2 seconds after the initial request. Why does my phone have to download and parse a megabyte of JavaScript in order to then download and parse a bunch of JSON, convert it to HTML, and only then render it? In what universe does this make sense? Why not just download and show the HTML directly? Browsers are crazy fast at rendering plain HTML, and web servers are crazy good at compressing and serving it. Can anyone explain this insanity? Why do I need another rendering engine built in JavaScript?

- After the initial page load, the page begins to jitter and convulse as re-flow occurs over and over again. This usually continues for about 2 more seconds. Can anyone actually read anything on a web page while re-flows are happening because content is being ajaxed in along with various other useless gizmos?

- Once the page is fully loaded, several modal pop-ups, pop-overs, and sliders appear. Those usually take another 2 seconds to dismiss, if I can dismiss them at all. Some of them can’t be dismissed, or are broken on certain devices, and actually cover the main content. It reminds me of all the porn pop-ups from the nineties. Why would anyone do that to their users?

- As soon as I scroll the page, it gets jittery and starts to stutter all over again, and of course my CPU spikes too. Why does scrolling cause the page to have convulsions and fits as re-flows happen again? What possible business case does this serve?

- If I leave one of these websites open long enough, my browser or computer sometimes runs out of memory, and the browser crashes or needs to be restarted because it becomes unresponsive. I have gigabytes of RAM. What could a silly web page be doing that uses up all that RAM? Are these JavaScript objects that are never garbage collected? Your website should never do anything that requires all of my RAM. Not in this universe or any other. Why would someone do this?
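For what it’s worth, the JavaScript-then-JSON-then-HTML round trip described above can be sketched in a few lines. This is a hypothetical, minimal illustration (the function names and data are invented, and the network fetch is faked with a local value) of the extra work a client-rendered page does before the browser ever sees markup that the server could simply have sent in the first response:

```typescript
// Hypothetical sketch of what a client-rendered page does *after* the
// browser has already downloaded and parsed the JavaScript bundle.

interface Article {
  title: string;
  body: string;
}

// Step 1: the page fetches JSON over the network (a second round trip).
// Faked here with a local value so the sketch stays self-contained.
function fetchArticleJson(): Article {
  return { title: "Hello", body: "World" };
}

// Step 2: JavaScript converts the JSON into an HTML string...
function renderToHtml(article: Article): string {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

// Step 3: ...which the browser then parses and lays out, exactly as it
// would have done had the server sent this HTML directly.
const html = renderToHtml(fetchArticleJson());
console.log(html);
```

A server-rendered page collapses steps 1–3 into the original HTTP response, which is why plain HTML paints so much faster.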
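As for where all that RAM goes: one common answer is a long-lived reference that is never released, so the garbage collector cannot reclaim anything it points to. A minimal, hypothetical sketch (all names invented) of an “analytics” buffer that records every event and never trims itself:

```typescript
// Hypothetical leak: a module-level buffer that grows for the life of
// the page. Because this array stays reachable forever, every object
// pushed into it can never be garbage collected.
const eventLog: { type: string; payload: number[] }[] = [];

function recordEvent(type: string): void {
  // Each event drags a sizable payload along with it.
  eventLog.push({ type, payload: new Array(1000).fill(0) });
}

// Simulate a page that records an event on every scroll tick.
for (let i = 0; i < 10_000; i++) {
  recordEvent("scroll");
}

// Ten thousand events, each pinning ~1000 numbers in memory, and
// nothing ever removes them.
console.log(eventLog.length); // 10000
```

Leave a tab like this open for a day and the buffer only grows; the fix is simply to cap or flush the buffer, which is exactly what pages like the ones described above apparently never do.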

I don’t know why in the world anyone would build a website with the above characteristics, yet I see the practice spreading like cancer. People may have reasons for these shenanigans that sound good in theory, but the reality is that these methodologies are producing a horrible experience for users, and it is widespread. Whatever propeller-head engineering decisions led to the current sad state of the internet surely cannot be good ones, can they?