# Google Webmaster Hangouts Notes – 5 March 2019 – Part 1
Hey! Welcome to MarketingSyrup! This post is part of my Google Webmaster Hangouts Notes. I cover them regularly to save you time 🙂

This is Part 1 of my notes on Google Webmaster Hangouts from 5 March. When I found myself in the middle of the video with 1,000+ words of notes, I decided to break them into 2 parts. There is definitely lots of worthwhile info in this video, so stay tuned for Part 2, I’ll post it soon! You can subscribe and get the notes delivered directly to your inbox.

As usual, you can find the timestamps for each answer in brackets. And here is the full video:

Sharing the same IP address with other websites is not a concern in most cases (2:03)

If your website is on shared hosting and another website sharing the same IP as yours gets hit by Google, this should not negatively influence your website. The same applies to using shared CDNs.

There are some extreme cases, though, when almost all websites on the same IP address are spammy and only a few are not. In such a scenario it might be tricky for Google to separate those few quality websites from the spammy ones. But again, this is an extreme case, and Google has become much better at understanding such things.

If a website used to perform well in Google, it doesn’t automatically mean that it’ll perform well in the future (4:00)

Google’s algorithms change, and so do user expectations. So even if a website was doing well, things might change over time. This means it’s always important to keep improving your website and think about your users.

JSON-LD can be added via GTM but this is not an ideal solution (11:33)

While you can use GTM for structured data, there are some considerations.

In order to properly pick up your markup added via Google Tag Manager, Google needs to render JavaScript, process the script files from GTM, and index all the information. This is much harder than extracting info from markup added in a more straightforward way.

Moreover, Google can process structured data added with GTM for some pages but not for others if it takes too many resources.

Moreover, many testing tools don’t process markup added via GTM, which makes it harder to debug the implementation.
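For context, structured data is usually injected through a GTM Custom HTML tag along the lines of the sketch below (the product values are made-up placeholders). Googlebot only sees this markup after rendering the page’s JavaScript, which is why markup placed directly in the HTML is easier for Google to process:

```html
<!-- Hypothetical GTM Custom HTML tag: builds a JSON-LD block at runtime.
     Googlebot must render the JavaScript before it can see this markup. -->
<script>
  var data = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",   // placeholder value
    "sku": "12345"               // placeholder value
  };
  var el = document.createElement('script');
  el.type = 'application/ld+json';
  el.text = JSON.stringify(data);
  document.head.appendChild(el);
</script>
```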

There are many signals Google uses to determine which page is canonical, so in some cases the user-declared canonical might be ignored (15:46)

Google uses multiple factors to determine which page is canonical:

- rel=canonical
- Redirects
- Internal and external links
- Sitemaps
- Hreflang links

So even if you have a rel=canonical pointing to a particular URL, if you never use this URL on your website, chances are Google might just ignore this canonical or instead pick another page that is strongly linked internally. This relates to the fact that Google can index URLs with UTM tags if they are linked internally.
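As an illustration (the URLs are hypothetical), a rel=canonical is more likely to be respected when the other signals point the same way:

```html
<!-- On https://example.com/shoes?utm_source=news -->
<link rel="canonical" href="https://example.com/shoes">

<!-- Internal links should use the canonical URL too;
     otherwise Google may pick the linked variant instead: -->
<a href="https://example.com/shoes">Shoes</a>
```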

You can use either canonical or hreflang for pages whose main content is in different languages while additional content remains in the same language (17:55)

If you have a website in one language but part of the content can be translated into other languages (e.g. the UI can be switched from English to German while user-generated content stays in English), you have 2 options to handle this:

- Canonical – you can choose a particular language version as canonical and have it indexed by Google.
- Hreflang – this option is preferable if you want Google to show all your language versions in search. Note that in this case indexing might take much longer if you have extensive content and a large number of pages.

The choice is up to you and depends on how you want users to access your website.
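If you go with the hreflang option, each language version references the others; a minimal sketch with hypothetical URLs:

```html
<!-- Placed in the <head> of every language version -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page">
<link rel="alternate" hreflang="de" href="https://example.com/de/page">
<!-- Fallback for users whose language isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page">
```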

It’s more user friendly to add actual authors’ names to blog posts rather than having a generic author (21:47)

With regard to E-A-T (Expertise, Authoritativeness, Trustworthiness), having a real author’s name displayed on blog posts might have an indirect influence on rankings, since it’s more user-friendly than a generic author (like ‘Marketing’, ‘Admin’, etc.).

It’s fine to use different schema markup languages for a desktop version and AMPs (22:59)

It’s perfectly fine if a desktop page version uses microdata but an AMP version uses JSON-LD (or vice versa) to implement schema markup.

There are also no issues with having different markup languages within the same site version. For example, a desktop blog can have Article markup added with microdata while desktop product pages on the same website have Product markup added with JSON-LD.
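For reference, here is the same Article markup expressed both ways (the headline is a placeholder); either form is valid, and the two can coexist on different templates of one site:

```html
<!-- Microdata: markup woven into the visible HTML -->
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
</article>

<!-- JSON-LD: markup kept in a separate script block -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline"
}
</script>
```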

Hiding content with display: none in responsive design is not viewed as spammy by Google (24:32)

If you have a responsive website and hide part of the content only on some devices (e.g. smartphones) for better user experience, it’s OK from Google’s point of view.

However, it’s against Google’s guidelines to hide content on all devices or to use keyword-optimized text of the same colour as the background (e.g. white text on a white background).
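A typical responsive pattern that is fine under Google’s guidelines looks like this (the class name is made up):

```html
<div class="sidebar-extras">Extra details shown on larger screens</div>

<style>
  /* Hide the element only on small screens for usability;
     it stays visible to desktop users and to Googlebot. */
  @media (max-width: 600px) {
    .sidebar-extras { display: none; }
  }
</style>
```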


A website can quickly recover from a manual action once it’s resolved, but with one exception (26:08)

Usually, if a manual action is resolved, then the website will quickly become visible in search again.

But there’s one exception: if a site is removed for pure spam reasons, it is removed from Google’s index completely. So once the reconsideration request is approved, Google will need to recrawl the whole website, which can sometimes take a few weeks.

Note that visibility might also change once you remove the reason for the manual action. For example, if unnatural links helped you rank higher, you got a manual action for them, and you then removed those links, your website might rank lower after the manual action is resolved, because the links that used to help no longer exist.

Low traffic pages are not automatically low quality pages, so think twice before removing them (30:20)

Before going and deleting low-quality pages from a website:

- Make sure they really are low quality. Some pages may have low traffic, but that alone doesn’t make them low quality; they might simply target less popular searches. So use a combination of metrics to determine page quality before deleting anything.

- Think about other ways to handle low-quality content: update and expand it, combine multiple pages, or 301-redirect them to related pages.

That’s it for the first part. I’ll be posting the second part of these notes soon.


I cook digital marketing dishes. Take 3 tablespoons of on-page SEO, add 2 pinches of backlinks and sprinkle it all with paid advertising. Season to taste with actionable data from Analytics and bake until golden brown. Serve hot.