At its I/O developer conference in May, Google announced that its web crawler, Googlebot, will be “evergreen,” meaning it will always run the latest version of Chromium. This update enables Googlebot to crawl most modern websites and render features that modern browsers support, such as those built with JavaScript.

Although this update was a long time coming, there is still some uncertainty about what the evergreen Googlebot is capable of. In the third episode of #AskGoogleWebmasters, Webmaster Trends Analyst John Mueller addressed the question of whether Googlebot can detect client-side JavaScript redirects. Mueller explained, “We support JavaScript redirects of different types and follow them similar to how we’d follow server-side redirects.”
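For context, a client-side JavaScript redirect is simply script that reassigns the page's location after the page loads, which the evergreen Googlebot can now follow much like a server-side 301/302. The sketch below is illustrative only: in a real page you would assign to `window.location`, but here a stand-in `location` object is used so the snippet runs outside a browser, and the URLs are hypothetical.

```javascript
// Minimal sketch of a client-side JavaScript redirect.
// In a browser this would be: window.location.href = "...";
// or window.location.replace("..."), which omits the history entry
// and so reads more like a permanent redirect.
function redirectTo(location, url) {
  // Assigning href triggers navigation in a real browser.
  location.href = url;
  return location.href;
}

// Stand-in for window.location, since this runs outside a browser:
const fakeLocation = { href: "https://example.com/old-page" };
console.log(redirectTo(fakeLocation, "https://example.com/new-page"));
// → https://example.com/new-page
```

Because Googlebot now executes this script during rendering, it discovers the destination URL the same way a user's browser would.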

Why we should care

Prior to the evergreen Googlebot, using JavaScript might have forced brands to compromise functionality or user experience so that Googlebot’s dated version of Chromium could actually render the content. It also left loopholes for malicious actors to use tactics such as sneaky redirects to send viewers to pages that are hidden from Google.

Now that Googlebot is capable of rendering most modern JavaScript features, brands are free to make use of them without having to worry about whether doing so will negatively impact their SEO.

Learn more about the evergreen Googlebot and JavaScript

Here are some more resources to expand your knowledge of Googlebot and JavaScript.