How do search engines render pages on the web?

While it's easy to focus only on indexing, indexing is only half the equation.

Indexing generally looks like this:

  • The crawler finds a page through a sitemap or general crawling, then visits that specific page

  • The crawler parses the page, extracting and organizing its content

  • The crawler takes in this content and attempts to rank the page for queries related to it
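The steps above can be sketched as a toy pipeline. Everything here is illustrative — the `pages` dict stands in for a real crawl, and `build_index`/`rank` are simplified names, not anything a search engine actually exposes:

```python
from collections import defaultdict

# Hypothetical stand-in for a real crawl: URL -> page text.
pages = {
    "/widgets": "blue widgets for sale",
    "/gadgets": "gadgets and blue gizmos",
}

def build_index(pages):
    """Organize crawled content into an inverted index: word -> set of URLs."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def rank(index, query):
    """Naively 'rank' pages by how many query words each one contains."""
    scores = defaultdict(int)
    for word in query.split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

index = build_index(pages)
print(rank(index, "blue widgets"))  # ['/widgets', '/gadgets']
```

Real ranking involves hundreds of signals, but the shape is the same: content is organized at index time so it can be matched against queries later.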

Traditionally, this has been the sole focus of SEOs, as it's the beginning of the ranking process.

But this isn't the end of the crawler's search phase.

While this is speculation, our intuition is that the weight assigned to indexing will decrease relative to rendering.

What's the difference between indexing and rendering?



Notice that this is the same content, but the rendered version is a visual representation of the indexed version.

So it is appropriate to think of indexing and rendering as two sides of the same coin.

Why does this difference matter?

Rendering is important because it’s what the user actually sees.

While a search engine can grasp the general structure of a website from the HTML, the rendered version is final and may display something slightly different from what the engine expected.

Plus, it's hard to derive the user experience from HTML alone. When a page is rendered, you're experiencing everything a user will, and you can infer far more about the user experience.

Try answering these questions without having a rendered page to check against:

  • Is the page blocked by an ad?

  • Is the page slow? Or does it have slow-loading components?

  • Is the page content hidden behind click walls?

  • Is content overlapped or otherwise misplaced?

When Does Rendering Happen?

Back in the old days, rendering took a couple of weeks.

Now it happens within a few seconds.

How long does it take for Google to render a page?

The average is somewhere around 5 seconds. It's safe to assume that a page will be rendered within a few minutes of being placed in the render queue.

The short answer is this: rendering takes place after indexing and the timeline is dynamic but short.

So a search engine understands the context of a page from indexing before it decides how to prioritize that page in the render queue.

What is Googlebot Evergreen?

In May 2019, Googlebot's Web Rendering Service was updated.

Before this update, the Web Rendering Service (WRS) was on Chrome version 41.

This version of Chrome was excellent for compatibility, but it was hell for websites that relied on modern JavaScript features.

Evergreen means that Googlebot uses the most recent stable version of Chrome for rendering.

This means that Googlebot sees the page how most of your users are going to see it.

So what is a web rendering service?

The basic instruction set for rendering looks like this:

  • The page is found via a sitemap or crawler

  • The page is appended to a list of pages to be crawled when crawl budget is available

  • The page content is crawled, then indexed

  • The page is added to the render queue

  • The page is rendered

When a page reaches the front of the render queue, the search engine spins up a headless browser for it.

Unless you're a seasoned web-scraper, you're probably not very familiar with this term.

A headless browser is a browser without a graphical user interface (GUI).

While it may seem impossible for an engine to visually understand something without actually seeing it, the technology is there.
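A minimal sketch of the idea in Python, using the standard library's HTML parser on a hardcoded page instead of a real browser engine (actual headless browsers, like headless Chrome, also execute JavaScript and compute layout — this only shows the "no GUI, content still understood" part):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """A toy 'headless' reader: no GUI, just the parsed content."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        # Collect only the visible text nodes.
        if data.strip():
            self.text.append(data.strip())

# Hypothetical page markup for illustration.
html = "<html><body><h1>Widgets</h1><p>Blue widgets for sale.</p></body></html>"
parser = TextExtractor()
parser.feed(html)
print(parser.text)  # ['Widgets', 'Blue widgets for sale.']
```

Nothing is ever drawn to a screen, yet the program "knows" what a visitor would read — a headless browser does the same thing at full fidelity.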

What is pre-rendering?

Pre-rendering is the act of caching HTML duplicates of JavaScript pages to later serve to Google. This can be done with libraries like Puppeteer.

It's generally no longer necessary to use pre-rendering libraries, as the lag time between indexing and rendering is now so short.

The easiest way to figure out if you still need to pre-render is to A/B test. Try running the system on a select few pages.

Compare the indexing and rendering of those pages against several that weren't pre-rendered.

If the results are the same, you're good to go.
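One way to run that comparison — a sketch that assumes you've already collected rendered snapshots for both groups of pages (for example, from Google's URL Inspection tool); the URLs and markup here are made up:

```python
import hashlib

def fingerprint(html):
    """Hash a rendered snapshot so two versions can be compared cheaply."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

# Hypothetical snapshots: what Google rendered for each group.
prerendered = {"/a": "<h1>Widgets</h1>", "/b": "<h1>Gadgets</h1>"}
plain       = {"/a": "<h1>Widgets</h1>", "/b": "<h1>Gadgets</h1>"}

same = all(fingerprint(prerendered[u]) == fingerprint(plain[u]) for u in prerendered)
print(same)  # True -> pre-rendering isn't buying you anything
```

If the fingerprints match across the two groups, the pre-rendering step is doing no work for you and can be dropped.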

Why render in the first place?

You're probably thinking: if a site doesn't use JavaScript, why would Google need to render it?

Rendering gives search engines the capacity to rank content based on how a real user would interact with a page.

So when they're ranking the different factors that comprise a page's overall score, they're getting the same experience your real customers are.

We're moving to a rendered future

The decrease in time between indexing and rendering doesn't bode well for indexing as the main source of understanding for search engines.

We believe it's probably going to become a non-factor as we move toward rendered web content as the primary mode of search engine discovery.