Google’s Search Crawlers to Natively Render JavaScript-based Pages by @MattGSouthern

Starting in Q2 2018, Google’s search engine crawlers will begin to render JavaScript-based webpages without the assistance of the AJAX crawling scheme.

For site owners, that means it will no longer be necessary to provide rendered versions of these pages to Googlebot.

Googlebot currently relies on the AJAX crawling scheme to render JavaScript-based webpages when rendered versions are not provided.

Thanks to advancements by Google’s engineers in JavaScript rendering, Googlebot is now able to render these pages natively.

In the second quarter of 2018, Google will switch from relying on the AJAX crawling scheme to relying entirely on Googlebot’s own rendering.

Here’s exactly what is going to change

The AJAX crawling scheme currently works by identifying pages with either a “#!” in the URL or a “fragment” meta tag, and then crawling them via a URL that carries an “?_escaped_fragment_=” parameter in place of the fragment.

Currently, in order for Googlebot to crawl the page, the “escaped” URL has to return a fully rendered version of the “#!” page.

When Google switches over in Q2 2018, Googlebot will begin to render the “#!” URL on its own. These URLs will continue to be supported, but site owners will no longer need to provide rendered versions.
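To make the mapping concrete, here is a minimal sketch in Python of how a “#!” URL corresponds to the “?_escaped_fragment_=” URL that Googlebot currently requests. The function name and the example URL are illustrative, and the escaping of the fragment is only approximated; the authoritative rules are those of the AJAX crawling scheme specification.

```python
from urllib.parse import quote

def escaped_fragment_url(hashbang_url):
    """Sketch of the AJAX crawling scheme's URL mapping:
    "#!" fragment -> "?_escaped_fragment_=" query parameter."""
    base, _, fragment = hashbang_url.partition("#!")
    separator = "&" if "?" in base else "?"
    # The fragment is passed (suitably escaped) as the value of
    # _escaped_fragment_; the server is expected to answer that URL
    # with a fully rendered HTML snapshot of the "#!" page.
    return f"{base}{separator}_escaped_fragment_={quote(fragment, safe='=&')}"

print(escaped_fragment_url("https://example.com/products#!category=shoes"))
# -> https://example.com/products?_escaped_fragment_=category=shoes
```

After the Q2 2018 change, Googlebot simply renders the “#!” URL itself, so the snapshot served at the escaped URL is no longer needed.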

There shouldn’t be any significant changes for websites using AJAX crawling, Google says. If there are any issues, Google will notify the affected sites directly.