When you think of the top 100 sites in the world, you think of high-traffic domains and pages coded to perfection. In reality, even the most popular sites in the world have errors hidden behind the scenes that are still visible in your browser’s developer tools. These errors can directly affect your experience as a user, create inaccurate tracking data and security vulnerabilities, and even cost the company revenue.

We found that most of the top 100 sites had several errors that their IT teams could easily monitor and prevent. If errors happen on these popular sites, they can happen on your company’s website too. We’ll show you the most common errors and how to avoid them.

Overview of Errors Found on the Top 100 Alexa Sites

How We Found The Errors

We used Alexa's ranking to identify the top 100 websites by number of visitors. We visited each of these websites in Google Chrome with all extensions disabled to capture the most native experience, then recorded the errors displayed in the developer tools console.

You can see these errors yourself by opening your browser’s developer tools feature. In Chrome it’s in the menu under More tools -> Developer tools. Here is what it looks like at Huffington Post:

Chrome Developer Tools showing a list of HuffingtonPost errors

That’s a mess to read! We’ll make it easier to understand by grouping these errors into common themes. Then, we’ll explain what causes them and how to avoid them on your own site.

Failed Tracking and Ads

The most common errors we saw by far were failed tracking pixels and ad timeouts. A single website might have dozens of tracking pixels or ads, and a common pattern was failed DNS resolution or external ad network errors. Ad-heavy sites not only loaded slowly in the browser, but also had numerous 404 and timeout issues behind the scenes when connecting to ad network URLs. Users probably don’t mind if tracking or ads are broken, but these failures mean lost revenue opportunities for the business.

List of Walmart.com errors

Numerous sites were guilty of having these errors, but the worst of them included: Walmart.com, Buzzfeed.com, Twitch.tv, HuffingtonPost.com, BestBuy.com, NYTimes.com, WashingtonPost.com, CNN.com, and eBay.com.

Most sites had at least one failed ad resource. The common error was HTTP 404, which indicates a missing file. This could mean that the ad network is no longer available or that the location of its embedded pixel changed. When marketing stops working with an ad network, they often fail to alert the developers who would remove the pixel from the code. The result is a pixel that returns a 404.

Ad pixels also commonly timed out, which may explain why sites loaded with multiple ad pixels tend to be slower than sites without them. Some sites, such as Buzzfeed and HuffingtonPost, seemed to continue loading in the browser for several seconds. The sites' content appeared, but behind the scenes the browser kept trying to reach the unresponsive embedded pixels.

The impact on a business is threefold. First, slow sites have been shown to reduce user engagement, and users may leave for another site. Second, the business loses revenue through missed ad impressions or clicks, as well as the opportunity to retarget users with personalized content. Lastly, these sites lose the ability to accurately track which content drives engagement, and thus to promote or invest in their best content.
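One defensive pattern is to cap how long the page waits on any single tracking request, so an unreachable ad endpoint can't keep the page "loading" indefinitely. Here is a minimal sketch; the pixel URL is a hypothetical placeholder, not how any of the audited sites load their pixels:

```javascript
// Race a request against a timer so a dead endpoint fails fast.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error("timed out")), ms);
  });
  // Whichever settles first wins; always clear the timer afterward.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage in page code (assumes a browser with fetch available):
// withTimeout(fetch("https://ads.example.net/pixel.gif"), 2000)
//   .catch((err) => console.warn("tracking pixel skipped:", err.message));
```

Failures are logged and swallowed instead of blocking, so a broken pixel degrades gracefully rather than hanging the page.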

Deprecated SSL Certificate Warnings

While SSL deprecation isn't an error in itself, these warnings indicate that the sites use Symantec-issued SSL certificates that will no longer be trusted by Google Chrome. Pages that pull cross-site resources from external domains secured with Symantec certificates will also generate an error in Chrome. The issue is therefore severe both for site owners and for any domains serving them external resources, especially those unaware of Google's initiative to stop recognizing Symantec-issued certificates.

SSL deprecation error in Chrome

The initiative stems from a 2015 Google audit, which found that Symantec had issued unauthorized test certificates under its trademarked name, violating the baseline requirements for Certificate Authorities.

Beginning in October 2018, these sites will generate an error in Chrome because the browser will no longer recognize the certificates. Several top Alexa sites are under warning of SSL deprecation. HuffingtonPost.com, Amazon.com, Twitch.tv, PayPal.com, Target.com, USPS.com, CapitalOne.com, AOL.com, Salesforce.com, Etsy.com, Quora.com, Intuit.com, and WashingtonPost.com are just a few of the sites that either use a Symantec-issued SSL certificate or link to another domain that uses one.

CORS Errors and Web Resources

To speed up a website, it's common to pull web resources from cloud hosts such as a CDN. Web fonts and CSS files are often served from a CDN because it can deliver them faster than the site's own server. Browsers load external resources cautiously to guard against cross-site scripting attacks: if no cross-origin checks were performed, an attacker could silently inject a CSS or JS file, leaving users vulnerable to account theft. Here is an example CORS error we found:

Font from origin 'https://ABCDEFG.cloudfront.net' has been blocked from loading by Cross-Origin Resource Sharing policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://sub.domain.com' is therefore not allowed access.

We found that CORS errors were common on sites that use CDNs, like Amazon CloudFront, without properly configured permissions. Office.com, Microsoft.com, Xfinity.com, CNN.com, Thesaurus.com, Target.com, Dailymail.co.uk, and Forbes.com were all guilty of extensive cross-site permission errors. CNN pulls content from all of its partner sites, and it failed to load resources from nine different domains.

To fix it, site developers must configure CORS on the server hosting the resource so the browser knows the requesting origin is trusted. For example, if you are using a web server like Apache, you can add this header to your .htaccess file:

Header set Access-Control-Allow-Origin "https://your-domain.com"

MIME Mismatch

A common theme among big retail sites such as Walmart was the following error:

Refused to execute script from 'https://tpc.googlesyndication.com/safeframe/1-0-23/html/%3Cscript%20type=%22text/javascript%22%20src=%22https://c.betrad.com/durly.js?;ad_w=300;ad_h=250;coid=329;nid=111025;%22%3E%3C/script%3E' because its MIME type ('text/html') is not executable, and strict MIME type checking is enabled.

Walmart wasn't the only site guilty of this error. Slickdeals.net, Blackboard.com, CBSsports.com, BankofAmerica.com, Salesforce.com, BBC.com, Netflix.com, and Lifedaily.com had MIME errors on their sites and could not load embedded resources.

The issue is the MIME type the external server declares when JavaScript requests a file. Here, the response is labeled text/html, so the browser refuses to execute it as a script. These little errors can interrupt backend operations such as marketing, ads, and customer tracking. Strict MIME checking is triggered by the X-Content-Type-Options: nosniff response header, and removing it relaxes the behavior, although executing arbitrary data is a security concern. A better option is to serve scripts with a Content-Type of application/javascript instead of the default text/html.

Double ID Elements in the DOM

Most developers know that each DOM element in HTML needs a unique ID, but duplicates can sometimes slip through QA since browsers will still render the elements. The issues don't present themselves until JavaScript looks up the ID or a POST event happens on a form containing non-unique IDs.

Facebook error with double element IDs

Facebook.com, NYTimes.com, and WellsFargo.com were guilty of having HTML elements with non-unique IDs.

Having two elements with the same ID affects functionality on the site. It can cause errors when JavaScript references the element, or backend errors when your processing page retrieves values from a user's POST submission.
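One way to catch this before it ships is a quick scan for duplicate id attributes in rendered markup. The sketch below uses a regex for brevity; a production check should use a real HTML parser:

```javascript
// Return the list of id values that appear more than once in an
// HTML string. Regex-based, so it's a rough QA check, not a parser.
function findDuplicateIds(html) {
  const seen = new Map();
  const idAttr = /\bid\s*=\s*["']([^"']+)["']/g;
  let match;
  while ((match = idAttr.exec(html)) !== null) {
    seen.set(match[1], (seen.get(match[1]) || 0) + 1);
  }
  return [...seen].filter(([, count]) => count > 1).map(([id]) => id);
}
```

Running a check like this against rendered pages in CI would flag the duplicates before JavaScript or form handling trips over them.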

Miscellaneous But Notable Errors

Some sites had small errors that occurred on only one or two domains but are still notable for how simple they are to fix. Perhaps the developers overlooked them, or felt they weren't critical.

NIH.gov had a simple jQuery error that can be fixed by loading the jQuery library before the script that makes the "ready" function call.

Uncaught ReferenceError: $ is not defined

Incidentally, visiting the site now shows no such error, so NIH developers have recently found and corrected it.
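A minimal sketch of that ordering fix, with an illustrative jQuery CDN URL:

```html
<!-- Load jQuery first so that $ is defined for later scripts -->
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script>
  // Safe: jQuery has already loaded when this runs
  $(document).ready(function () {
    // page initialization goes here
  });
</script>
```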

HuffingtonPost.com has errors connecting to a service on 127.0.0.1 on unassigned ports 4387-4389. From the underlying code, it appears the site tries to connect to a localhost service whose ports are closed. The ports could have been closed for security reasons, with the service turned off but the connection code left in place. Another possibility is that the site is checking for bots running on the visitor's machine.

Site Winners with No Home Page Errors

Not every site had numerous errors. It's interesting to note that sites running on older code bases with less reliance on JavaScript made this list. Of the top 100 sites on Alexa, sites reporting no errors in Chrome include:

Reddit.com

Wikia.com (although it does not 301 redirect to a secure version of the site)

Wikipedia.org

Discordapp.com (Discord)

Slack.com

StackOverflow.com

StackExchange.com

Patch.com

If you notice a pattern, it's that sites not running ads on their home page (or anywhere at all, like Wikipedia) had no errors. Stack Overflow and Stack Exchange are where many developers go to find solutions to these errors, so they should be commended for having none. These sites also load fast and perform better than their peers.

Worst Offenders

While the best of the bunch are few, we did find several offenders with numerous errors on their site. Some of the worst offenders include:

HuffingtonPost.com - 34 errors on the home page

Buzzfeed.com - 13 errors after checking 3 pages

Xfinity.com - 12 errors on the login page

Twitch.tv - 10 errors on the home page

Forbes.com - 9 errors on the home page

Monitor Errors on Your Own Site

You can find most kinds of errors using Chrome's or Firefox's developer tools. Checking this manually is a time-consuming process, though: you need to recheck each time you make a change to your website, and test on a variety of browsers and devices.

If you have an important web app that many users depend on, then you should use an error monitoring solution like Rollbar. It tracks errors encountered by real users and can alert you when new errors arise or happen frequently. In the screenshot below, you can see the most common errors by the total number of times they occurred, how many unique IPs are affected, when it last occurred, and more.

Rollbar is easy to integrate with only a few lines of code, and it works with all of the major browsers. It offers many more features to help you troubleshoot and fix problems quickly, including telemetry of what led up to the error, source map support for stack traces, and more. Catch these errors before your users do!

If you haven’t already, sign up for a 14-day free trial of Rollbar and stop flying blind in production.