# Google Webmaster Hangouts Notes – 3 May 2019 – Part 1

Welcome to MarketingSyrup! This post is part of my Google Webmaster Hangouts Notes. I cover them regularly to save you time.

Here are the notes from May 3rd. Since I already had so much content before I even got through the whole video, I decided to break these notes into 2 parts. This is the first part, enjoy! The timestamps of the answers are in brackets.

## There’s no minimum number of words you need in a blog post (1:56)

Google doesn’t count the words in a blog post at all. So there’s no optimal number of words you need to use; it’s more important to cover the topic and provide useful information to your readers.

Google looks at different factors to determine the quality of the content, and the number of words and lines is not one of them.

Kristina’s note: Sometimes it’s just easier to give copywriters a number to make sure you won’t end up with 100-word posts. Though in some cases it’s possible to cover a topic in 100 words, it happens quite rarely.

The number of words in a blog post also depends on its type. If it’s a guide that is supposed to cover a topic in detail, a few paragraphs won’t be enough. I remember when one blog manager didn’t like my guide of 2000+ words… until it became the most popular post on that blog.

But if you just answer a question or provide some tips, people don’t want to read an excessive amount of text. The bottom line here: use common sense.



## Use the XML sitemap file with <lastmod> dates to speed up re-indexing of your pages (4:56)

The best way to let Google know that a page has been changed and needs to be re-indexed is to specify the last modification date for that page in your XML sitemap. Additionally, ping the sitemap in Google Search Console.

Note that Google doesn’t use the change frequency and priority parameters in XML sitemaps anymore.

But don’t overuse this method. If you start adding <lastmod> to pages which haven’t actually been updated, Google might eventually start ignoring it.
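As a sketch, a sitemap entry with <lastmod> looks like this (the URL and date here are placeholders; the date uses the W3C datetime format the sitemap protocol expects):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://website.com/updated-page/</loc>
    <!-- Set lastmod only when the page content has actually changed -->
    <lastmod>2019-05-03</lastmod>
  </url>
</urlset>
```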

## Google can show multiple pages from one website if they are strong and relevant (9:09)

If there are a few pages from a single website which Google sees as very relevant for a particular term, they can all rank high in Google.

But if these ranking pages are constantly replacing one another in search, it means that none of them is strong enough, and Google is trying to understand which of them should be shown.

Kristina’s note: This is known as self-cannibalization: pages of one website compete against each other. Sometimes such things are hard to avoid (for example, I have a series of Google Webmaster Hangouts Notes which cannot be named differently). But in most cases, it’s easy to avoid.

## Using parameters in some URL types (internal search, pagination) helps Googlebot better optimize crawl budget (10:52)

From Google’s point of view, both variants – website.com/search/searchterm and website.com/search?q=searchterm – will work.

But John Mueller still recommends using a query parameter (like q= in the example above) as it makes it easier for Google to understand that this part of the URL might vary.

Moreover, parameters help Google learn faster how to optimize crawl budget on the website. This is also applicable to pagination created with parameters (e.g. ?page=3) vs pagination being part of the URL (e.g. /page3).
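To illustrate why parameters are easier to interpret, here is a small sketch (the URLs are made up for illustration): a query string can be parsed with standard tooling into named, varying parts, while a path-based URL gives a crawler no standard way to tell which segments are fixed and which vary.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URLs for illustration
param_url = "https://website.com/search?q=sneakers&page=3"
path_url = "https://website.com/search/sneakers/page3"

# The query string self-describes its varying parts:
params = parse_qs(urlparse(param_url).query)
print(params)  # {'q': ['sneakers'], 'page': ['3']}

# The path-based URL is just anonymous segments; nothing marks
# 'sneakers' or 'page3' as values that vary:
print(urlparse(path_url).path.split("/"))
```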

Kristina’s note: This is really interesting and new to me. I don’t think you need to change your current pagination, especially if you don’t have crawl budget issues. But it’s still good to know if you’re developing a new website which is going to have lots of paginated URLs (e.g. an eCommerce store) since rel=prev/next won’t help anymore.

As for site search, you’ll definitely need to use parameters, as this way you’ll also be able to track what people are looking for within your website. This can be configured in Google Analytics. I’ll soon write a post about it, so subscribe and get it delivered to your inbox.