that is styled using the CSS “100vh” viewport-height value.

We conducted an experiment to see if it could be a problem: http://botbenchmarking.com/testgsc/visibility.html

The result? Fetch and Render was able to see just the main &lt;div&gt;. The other parts of the website were omitted by F&R. This screencast illustrates the problem:

The problem can run even deeper: it can affect any website that always scales its content to the viewport.

To see what I mean, open http://botbenchmarking.com/testgsc/visibility.html in your browser and press “Ctrl minus” a few times. I bet you can still see just the main div. Once you scroll down, you can see the rest of the content, but Google’s Fetch and Render tool can’t “see” it.
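To picture the kind of layout involved, here is a minimal, hypothetical sketch (not the test page’s actual source) of a viewport-scaled page:

```html
<!-- Hypothetical reduction of the problem: the first block always fills
     exactly one viewport (100vh), pushing everything else below the fold. -->
<style>
  .main { height: 100vh; }
</style>
<div class="main">Main div (the only part Fetch and Render shows)</div>
<div>Content below the fold (missing from the rendered screenshot)</div>
```

However tall or small the window is, the first div always fills it exactly, so the rest of the content never appears in the initial viewport.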

As far as we know, it doesn’t affect indexing.

But then the question is: does this issue affect ranking, or only the way Fetch and Render sees your content? For the time being, I don’t know the exact answer. Have you encountered similar issues with Fetch and Render? I’d be happy to hear your feedback.

How to check if Googlebot “sees” a drop-down menu – Check it out in Chrome 41!

John Mueller gave us some interesting hints in the JavaScript SEO group on how to check whether Googlebot can see drop-down menus.

The takeaway is clear: If your menu is inside the Document Object Model (DOM) before clicking any menu item, it should be picked up by Googlebot. Otherwise, it’s very likely it won’t be picked up at all.

To check this in Chrome 41, just open “Inspect element” and search for any menu-specific fragment.

[Side note: technically, you can use any browser for DOM inspection. But is there any reason not to use the browser behind Google’s Web Rendering Service? Since we know Google uses Chrome 41 for rendering, let’s take advantage of it to make your websites crawlable and indexable :)]

To illustrate what I mean, let’s use an eBay.com example:

At eBay.com, there is a drop-down menu.

Let’s check if Google can see this and follow the menu links.

We can pick the Electronics -> Additional categories -> iPhone category.

Click “Inspect element” (or press Ctrl + Shift + I), go to the “Elements” tab, press Ctrl + F, and search for “iPhone”. Indeed, it’s in the DOM, so Google can see it!

[Side note: if you are unsure about investigating the proper node, just edit the internal text “iPhone” and see if the menu was updated.]
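The same DOM check can also be scripted. Here is a minimal sketch (the function name is my own invention, and the HTML string stands in for the live page markup):

```javascript
// Sketch: check whether a menu label already exists in the DOM before
// any clicks. In the DevTools console you would pass the live markup,
// e.g. document.documentElement.outerHTML; here a small HTML string
// stands in for it so the snippet is self-contained.
function menuItemInDom(html, label) {
  return html.toLowerCase().includes(label.toLowerCase());
}

const initialDom = '<nav><ul><li>Electronics</li><li>iPhone</li></ul></nav>';
console.log(menuItemInDom(initialDom, 'iPhone'));   // true: the label is in the DOM
```

If the function returns false on the initial markup, the menu item is most likely injected only after a click, and Googlebot probably won’t pick it up.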

Here is a screencast illustrating how you can accomplish this:

Our quick investigation showed that Google can interpret eBay’s menu properly. However, this isn’t always the case; there are tons of websites with menus and content hidden under tabs that Google can’t handle.

To avoid such pitfalls, always check your menus and hidden content using the Inspect tool in Chrome 41.

Using Chrome 41 to check the errors Googlebot gets

The following scenario is very common: Google Fetch and Render shows you that it can’t render a page correctly.

To investigate the source of the rendering issues, open the page in Chrome 41, look at the console, and see which errors occurred.

To demonstrate the scenario, I created a simple page.

The page contains a very simple script that takes advantage of the common “let” declaration (part of the ES6 JavaScript syntax).

I checked whether Googlebot supports it by fetching the page in Google Search Console and opening it in Chrome 41. To compare the results, I also checked whether the newest version of Chrome renders it correctly.

The script is very simple: if a browser supports ES6, it replaces a paragraph with the text “Yeah, your browser supports ES6”.
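The page’s exact source isn’t reproduced here, but a minimal version of such a script could look like this (a hypothetical reconstruction; the element id is a placeholder of mine):

```javascript
// Hypothetical reconstruction: a top-level "let" outside strict mode.
// In Chrome 41 this declaration alone throws "Uncaught SyntaxError:
// Block-scoped declarations ... not yet supported outside strict mode",
// so the paragraph is never updated. ES6-capable browsers run it fine.
let message = 'Yeah, your browser supports ES6';

// Guarded so the snippet also runs outside a browser (e.g. in Node):
if (typeof document !== 'undefined') {
  document.getElementById('es6-status').textContent = message;
}
```

The key point is that the error happens at parse time: in a pre-ES6 engine, nothing in the script runs at all, not just the offending line.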

Chrome 61 renders this page perfectly:

What about Fetch and Render?

OK, so let’s open this page in Chrome 41 and see which errors occur:

I don’t want to go too deep into the programming details. If you are not familiar with console errors, your best option is probably to send the error details to your developers or to Google them. They will surely know how to deal with errors like this.

If that describes you, go ahead and skip to the next section. However, if you know what I’m talking about, please keep reading.

What does this error mean?

For those who want to know what this error is about, I’ve prepared a short explanation.

Chrome 41 throws a SyntaxError:

Line 7: “Uncaught SyntaxError: Block-scoped declarations (let, const, function, class) not yet supported outside strict mode”

The error indicates it’s caused by the “let” declaration on line 7:

OK, but what’s wrong with the “let” declaration?

Let’s check CanIUse.com:

Bingo! As you can see, it’s not fully supported by browsers older than Chrome 49. CanIUse.com indicates those Chrome versions can use it only in strict JavaScript mode.

How to make it work?

You can enable strict mode in your JavaScript code.

Alternatively, you can use a tool like Babel that transpiles your code to ES5. During transpilation, “let” declarations are replaced with “var” declarations (which are fully supported by Googlebot).
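Both fixes can be sketched in a few lines (hypothetical snippets; your actual code and build setup will differ):

```javascript
// Fix 1: opt in to strict mode, where Chrome 41 accepts "let".
'use strict';
let greeting = 'Yeah, your browser supports ES6';

// Fix 2: what transpiling to ES5 effectively does: "let" becomes
// "var", which pre-ES6 engines (including Chrome 41) understand.
var greetingEs5 = 'Yeah, your browser supports ES6';

console.log(greeting === greetingEs5);   // true
```

Note that the `'use strict';` directive must be the first statement of the script (or function) to take effect.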

Crawling budget – browsers are not like Googlebot

Google can properly render a large number of modern websites. However, there is a more fundamental issue: Googlebot doesn’t act like a real browser. What does a standard browser do? It downloads all the resources (JS, CSS files, images, videos, etc.) and shows the rendered view to users.

Googlebot acts differently. It aims to crawl the entire internet, grabbing only valuable resources and sending them to the indexer. The World Wide Web is huge, so Google must (and does) optimize its crawlers for performance. This is why Googlebot sometimes doesn’t visit all the pages webmasters want it to.

Google algorithms try to detect if a resource is necessary from a rendering point of view. If not, it probably won’t be fetched by Googlebot.

Another factor also plays a role: Googlebot adjusts its crawling speed to a website’s performance. If Google detects that the response time is very slow, or that Googlebot’s requests make the website noticeably slower, it can lower the crawling speed.

If a website contains many time-consuming scripts, Chrome 41 will execute them and render the page, whereas Google’s Fetch and Render tool and the indexer will probably stop executing a script after ~5 seconds, which many SEOs refer to as the “5-second rule”.

The future

There is strong pressure on Google to upgrade Googlebot’s features and keep them up to date.

Google promises that, in the future, its rendering engine will always be based on the newest version of Chrome.

When we spoke with Ilya Grigorik from Google, he said, “Last thing we want is every developer running an outdated version of Chrome. Fingers crossed, we’ll have better solutions in 2018+ :)”

More big news is coming. John Mueller says Google is working to make Google Search Console’s Fetch and Render show not only a rendered image, but also a rendered DOM snapshot: “We’re working on including a copy of the rendered HTML/DOM in a tool that should hopefully be out soon.” Once it’s finished, troubleshooting will be much easier, because you’ll be able to see exactly which elements Google was able to pick up.

Summary

The news that Google’s Web Rendering Service uses Chrome 41 gave us an enormous amount of insight into how Google sees websites. Using Chrome 41, we can discover (with high probability) whether Google has problems fetching content. If there are any rendering problems in Chrome, just look in the DevTools console for any errors that might have occurred.

We have to be aware that there is a slight difference between Chrome 41 and Googlebot (or, to be more precise, Google’s Web Rendering Service) regarding supported features. We also have to keep in mind that Googlebot doesn’t act like a standard browser: other factors, like crawl budget and website speed, play a significant role here. These factors can really affect crawling and indexing, and an SEO shouldn’t forget that.

If you’re interested in knowing more about JavaScript-based websites and SEO, then you absolutely must read my Ultimate Guide to JavaScript SEO.