So what if you have great content?

So what if you have great links?

If your website has technical problems, they are going to hurt your rankings and traffic.

Put it this way: how can you do good SEO if your pages aren't indexed?

By my own admission, I did not start out on the technical side of SEO. Instead, I was a content writer learning my way around on-page optimization. The thought of servers, crawling, spidering, and the like was not on my mind.

I was a marketer — not a “techie.”

But search has evolved. So much so that marketers have to understand technical concepts and developers have to understand a little about marketing.

Does it mean that marketers now have to do developers’ work and vice versa?

Absolutely not.

It does mean, however, that if you are a digital marketer, you have to have some basic understanding of the technical issues that can harm a website.

The following technical checklist is created for the non-technical person.

If that is you, understand that you don’t have to fix these items per se, but you need to identify when something is wrong.

Once items have been identified, they can be shared with the developer or webmaster working on the website.

(Note: This checklist is not a full compilation of all technical items, but it does include key areas that every marketer should check out.)

1. Robots.txt File

The robots.txt file gives directives to search engine crawlers, and every website should have one in its root directory (i.e., example.com/robots.txt).

It must be formatted correctly: it should block only the files or directories you don't want crawled, and it should also reference the location of your XML sitemap.

To learn more about this file and how to set it up the right way, read Google’s recommendations.
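A minimal robots.txt might look like the following. The blocked directories here are purely illustrative, and the Sitemap line points crawlers to your XML sitemap:

```text
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```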

A word of caution: if you're redesigning your website, it's common to have the dev site blocked in robots.txt with Disallow: /.


Be sure this disallow is removed before the redesigned site is launched.

Otherwise, you risk taking your website out of the index. I have seen it happen way too many times.

How to check:

Google Search Console > Crawl > robots.txt Tester

2. Canonical Link Elements

Sometimes website owners accidentally create different URLs that generate identical or near identical content.

The canonical link element solves that problem.

Many websites use canonical link elements to ensure the right page – or, in other words, the preferred version of the page – is indexed.

There are some guidelines to follow with canonical link elements: they should reference a URL that is indexable and does not redirect, and that URL needs to be the full, absolute path (i.e., http://www.example.com/).

The following is an example of a canonical tag:
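The canonical link element is placed in the page's <head>; here, http://www.example.com/ stands in for the preferred version of the page:

```html
<link rel="canonical" href="http://www.example.com/" />
```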

How to check:


Screaming Frog

3. Redirects

Redirects communicate to search engines that a webpage has moved to a new location, which is helpful when a page is deleted or its URL has changed.

Although there are different types of redirects, for SEO purposes a 301 redirect is recommended, because it tells the search engines that the move is permanent.

It's also important that a page redirects directly to its final destination.


For example, if Page A redirects to Page B then redirects to Page C, there is a redirect chain.

Instead, Page A should redirect to Page C and Page B should also redirect to Page C.

The goal is to minimize the number of redirects a visitor (or crawler) has to follow.
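If your site runs on Apache, for example, a direct 301 can be added to the .htaccess file; the paths below are illustrative:

```apache
# Send the old URL straight to its final destination, in a single hop
Redirect 301 /old-page/ https://www.example.com/new-page/
```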

How to check:

Redirect-checker.org

Screaming Frog

4. Duplicate Content

Google defines duplicate content as content that might be replicated within your website or on other web domains.


What happens with duplicate content? Google filters it out of the search results, so only pages with distinct information show.

Filtering isn’t a penalty, but if your pages aren’t showing up in search results, you won’t be getting organic traffic. It still hurts.

Duplicate content is often unintentional.

For example, say a website migrates from a non-secure (HTTP) domain to a secure (HTTPS) one. If the proper redirects are not set up, the non-secure and secure URLs can look like duplicate content in the eyes of the search engines.
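If the site runs on Apache with the mod_rewrite module, for example, a site-wide HTTP-to-HTTPS 301 looks like this (a sketch; your server setup may differ):

```apache
# Force every HTTP request to its HTTPS equivalent with a single 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```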

Duplicate content also happens in the ecommerce environment, when product pages are found under multiple URLs.

Depending on the cause, duplicate content can be addressed with canonical link elements, as mentioned above, or with redirects.

How to check:

Siteliner.com

Site:domain search
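For the site: search, query Google for your domain plus a quoted snippet from the page in question; if several URLs surface the same passage, you likely have duplicates (example.com is a placeholder):

```text
site:example.com "a unique sentence copied from the page"
```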

5. Mobile-Friendliness

Mobile-friendliness is something every website must strive for, especially with Google’s mobile-first index.


Rankings aside, you want visitors to have a good experience no matter the device.
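One quick check in the page source is the viewport meta tag, which mobile-friendly, responsive pages generally include in the <head>:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```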

Google provides a way for you to check the mobile-friendliness of your pages.

You can also see an overall view of the mobile performance in Google Search Console.

How to check:

Google Mobile-Friendly Tool

Google Search Console

6. Page Speed

Google cares about page speed.

Your visitors care about page speed.

You need to care about it, too.

Pages that are slow to load turn visitors off and cause them to hit the back button.

User experience isn’t the only issue, though.

Large, slow-to-load pages run the risk of being crawled only partially or skipped completely. No one wants that to happen.

Google recommends that above-the-fold content load in one second or less, and a general rule of thumb is that the entire page should load within four seconds.


There are many things that can be done to speed up slow loading pages, such as:

Compressing images.

Leveraging browser caching.

Minifying JavaScript and CSS.

The majority of these items will require the help of a developer.
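As one example, browser caching on an Apache server can be leveraged with the mod_expires module in .htaccess (the cache lifetimes below are illustrative, not recommendations):

```apache
<IfModule mod_expires.c>
  # Tell browsers how long they may reuse a cached copy of each file type
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```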

How to check:

GT Metrix

Google PageSpeed Insights

7. Your Performance Versus Competitors

While you are reviewing your own site's health, you can (and should) run the same checks on your competitors' sites.

For example:

Are their pages loading fast?

Do they appear mobile friendly?

It’s always good to know where you stand compared to your competitors.


You can also use tools to glean information about competitors, identify their weaknesses, and find ways to outperform them.

Nacho Analytics lets you see your competitors' analytics, which can help you spot opportunities, and SpyFu (which also owns Nacho Analytics) provides a nice organic snapshot of competitors.

How to check:

This checklist

Nacho Analytics

SpyFu

Closing Thoughts

Even if you aren’t technical, you should always know the health of your website and what areas can be improved. Simply run through this checklist and discuss the findings with your developer or web team.



Image Credits

Featured Image: damedeeso/depositphotos.com

Screenshots taken by author, September 2018