Click Depth: Why and How It Matters for a Website’s Pagination and SEO

Much has been written about on-page SEO factors and how they affect search engine rankings.

Unfortunately, not every article on on-page SEO factors lists click depth (also called page depth) as a ranking factor.

John Mueller (Webmaster Trends Analyst at Google) has described click depth as an important ranking signal, even more important than URL structure. According to Mueller, Google pays more attention to how many clicks it takes to reach your content from the homepage than to how your URLs are structured.

In this article, we’ll understand the concept of click depth in SEO and how it affects your website’s internal linking, pagination, and crawl budget.

Let’s dive in!

What is click depth, and how does it help with SEO?

The term “click depth” describes the number of clicks it takes to navigate from your website’s homepage to another page.

For example, a page reached with a single click from your homepage has a click depth of one.

In the eyes of Google, a website’s homepage is its cornerstone content, and every other page derives its importance from how easily it can be reached. Therefore, your goal should be to make it easy for users to navigate from the homepage to the important content on your website.

Let’s take the example of a 300-page website to understand how click depth from the homepage affects search rankings.

The anatomy of a 300-page website and its SEO performance

Imagine you have a site with pages numbered 1 through 300. For this example, let’s assume they are blog posts with unique content. Page number 1 is the homepage, and page 300 is the last page of the website.

Now suppose the site is paginated so that these pages are connected with a simple “next page” link at the bottom of each blog post. (See the image attached below for a better understanding.)

From a human point of view, the pagination scheme is simple: you click the “next page” link and land on blog post number 2. You click “next page” again and go to page 3. To visit a deep page (for example, page 300), you must keep clicking “next page” until you reach it. This looks like the holy grail of website pagination, doesn’t it?

However, from the crawler’s point of view, the site looks like this:

The image above is a visual representation of the discovery path a search engine crawler must follow to crawl the entire 300-page website. The tail shown in the diagram is a “tunnel”: a long, contiguous series of pages that the crawler must traverse one at a time. For each page, the “next page” link is the only route to the following page, so it takes 299 clicks to get from the homepage to the last page. Google sees this as a click depth of 299, which is very bad for SEO and makes for a poor user experience.
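This crawl tunnel is easy to reproduce in code. The sketch below (a minimal model for illustration, not anything Google publishes) builds the 300-page “next page” chain as a link graph and measures click depth with a breadth-first search:

```python
from collections import deque

def click_depths(links, start):
    """Breadth-first search: the minimum number of clicks needed to
    reach every page from `start`."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# A 300-page site where each page links only to the next one.
chain = {page: [page + 1] for page in range(1, 300)}

depths = click_depths(chain, start=1)
print(depths[300])  # 299 clicks to reach the last page
```

Every page added to the chain pushes the deepest content one click further from the homepage, which is exactly why this scheme scales so badly.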

According to Google, the optimal click depth from the homepage should not exceed three clicks. This way, search engine crawlers can easily find your content, and your crawl budget is spent efficiently.

Here’s what Google’s John Mueller has to say about the importance of click depth in SEO:

“What matters to us… is how easy it is to actually find the content. So especially if your homepage is generally the strongest page on your website, and it takes several clicks from the homepage to actually get to one of these stores, then that makes it a lot harder for us to understand that these stores are actually quite important.

“On the other hand, if it’s one click from the homepage to one of these stores, then that tells us that these stores are probably quite relevant, and we should probably give them some weight in search results as well. So it’s more about how many links you have to click through to actually get to that content than what the URL structure itself looks like.”

Essentially, if a page takes more than three clicks to reach, search engines treat it as less important. Pages buried deeper in your website’s silos will therefore have more trouble being crawled than pages with a click depth of one or two.

In practice, deep pages tend to have a lower PageRank because search engines are less likely to find them, which leads to crawling problems. These pages are less likely to perform and rank well compared to pages that are easy to reach from the homepage.

In our example of the 300-page website, the last page has a click depth of 299. The main takeaway from this kind of website pagination is that the traditional “next”/“previous” form of pagination is simply inefficient: it is good for neither SEO nor the user experience.

If I were to make a list of issues with “next” and “previous” pagination, here are my key points:

- When a site’s content is buried deep behind a long chain of links, it signals to search engines that the content is not important to users. Hence, bad SEO.
- Even worse, if one of the links in the “next”/“previous” chain returns an error, crawlers cannot reach any of the pages deeper in the pagination.
- Website visitors do not click through to the 300th page, or anywhere near it. That’s a bad user experience.

So, how do we improve the click depth of a 300-page website?

Enter the midpoint link pagination scheme.

The midpoint link pagination scheme is best for sites with a large number of pages. Here, the pagination on the homepage looks like this:

In the image above, page 201 is the centre point of the pagination, which cuts the click depth from 299 to just a few clicks. In addition, this scheme allows a crawler to navigate from any page to any other page in just a few steps.

This is the crawl chart for the midpoint pagination strategy:

Notice how easily your user can go from page no. 2 all the way to the last page of your website. This is an incredible improvement in crawlability compared to the earlier “next”/“previous” pagination scheme.
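Since the article’s diagram isn’t reproduced here, the sketch below models one plausible version of the midpoint scheme: each page links to its neighbours, to the first and last pages, and to the midpoints between itself and each end of the series. The exact link targets are my assumption for illustration; the measurement is the same breadth-first search as before:

```python
from collections import deque

N = 300  # total number of paginated pages

def midpoint_links(page):
    """Assumed pagination bar for `page`: previous/next, first/last,
    and the midpoints between `page` and each end of the series."""
    candidates = {page - 1, page + 1, 1, N, (1 + page) // 2, (page + N) // 2}
    return {p for p in candidates if 1 <= p <= N and p != page}

def click_depths(links, start):
    """Breadth-first search: minimum clicks from `start` to every page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links[page]:
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {page: midpoint_links(page) for page in range(1, N + 1)}
depths = click_depths(site, start=1)
print(max(depths.values()))  # far fewer than the 299 clicks of the plain chain
```

Because every page can halve the remaining distance to its target, the deepest page ends up only about log2(300) clicks from the homepage instead of 299.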

What is the relationship between click depth and page rank?

If a website’s pages sit at a poor click depth, this can negatively impact search engine rankings.

The reason is simple: Google rarely crawls pages that are many clicks away from the homepage, so those pages may never be indexed. That hurts rankings, because unindexed pages receive little to no search traffic.

Click depth and PageRank are closely related, as both are used to evaluate the importance of a page. The PageRank algorithm does this by counting the quality and number of links pointing to a page. Click depth, in turn, influences PageRank indirectly: the more clicks it takes Googlebot to reach a page from the homepage, the less link equity flows down to it.
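To make that relationship concrete, here is a toy PageRank computation over a hypothetical 13-page site: a homepage linking to three category pages, each of which links to three posts and back to the homepage. The site layout and damping factor are illustrative assumptions, not a model Google has published:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Plain power-iteration PageRank over a dict of {page: [outlinks]}."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:  # share this page's rank among the pages it links to
                share = damping * ranks[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:     # dangling page: spread its rank evenly across the site
                for target in pages:
                    new[target] += damping * ranks[page] / n
        ranks = new
    return ranks

# Homepage -> 3 categories (depth 1) -> 9 posts (depth 2).
site = {"home": ["cat0", "cat1", "cat2"]}
for c in range(3):
    site[f"cat{c}"] = ["home"] + [f"post{c}_{p}" for p in range(3)]
    for p in range(3):
        site[f"post{c}_{p}"] = []  # posts link nowhere

ranks = pagerank(site)
print(ranks["home"] > ranks["cat0"] > ranks["post0_0"])  # True: rank falls with depth
```

In this toy model the homepage ends up with the highest rank, the categories with less, and the depth-two posts with the least, because the homepage’s link equity is split and damped at every hop.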

So, how do we improve the PageRank of the 300-page website?

The answer to this question lies in optimising the internal link graph. Regardless of the number of pages, your website can be optimised for PageRank by internally linking to its most important pages.

Kevin Indig, VP of SEO & Content, wrote a great article on optimising the internal linking graph of a website with more than 1,000 pages. Kevin’s article illustrates a TIPR model that takes a Robin Hood approach to improving the PageRank of weak pages: linking to them internally from stronger pages.

What are some strategies to improve your website’s click depth?

To improve a website’s click depth, all you need to do is make all your pages accessible within three to four clicks from the home page.

To begin with, this can be done by visualising your website as a tree diagram to understand the overall structure of the website. (See the image attached below for reference.)

Notice how much easier it is to understand the structure of the website once we visualise it as a tree diagram and put it on paper. For large websites, you can improve click depth by adding internal links from strong, useful pages to the underperforming ones.
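One way to put this into practice is a quick audit script. The sketch below (assuming you have already extracted your site’s internal link graph, for example from a crawl) flags every page more than three clicks from the homepage, plus any orphan pages, as candidates for new internal links; the page names are hypothetical:

```python
from collections import deque

def pages_too_deep(links, home, max_depth=3):
    """Return pages whose click depth from `home` exceeds `max_depth`,
    plus pages that no internal link path reaches at all."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    too_deep = [p for p, d in depths.items() if d > max_depth]
    unreachable = [p for p in links if p not in depths]
    return too_deep, unreachable

# Hypothetical crawl data: one post is buried four clicks deep.
graph = {
    "home": ["blog", "about"],
    "blog": ["post-a"],
    "post-a": ["post-b"],
    "post-b": ["post-c"],
    "post-c": [],
    "about": [],
    "orphan": [],  # no internal links point here at all
}
too_deep, unreachable = pages_too_deep(graph, "home")
print(too_deep, unreachable)  # ['post-c'] ['orphan']
```

Linking “post-c” and “orphan” directly from the homepage, a sidebar, or another depth-one page would bring both within the three-click target.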

Internal links improve your website’s click depth by:

- reducing the number of clicks it takes for a user to reach a page, which also eases the crawler’s job
- redistributing PageRank across the entire website by linking to low-performing pages from top-performing pages
- reducing the bounce rate, because users can easily navigate to the linked pages

In addition to internal links, there are other strategies to improve a website’s click depth:

- Sidebars that link to the top-performing pages and articles
- Breadcrumb links to navigate back through the previous pages to the homepage

Conclusion

Click depth (sometimes referred to as “page depth”) should be an important consideration when designing pagination for a heavily structured website. With the right approach, SEO can make the most of Googlebot’s crawl budget, improving the visibility of the site’s content.

Is there another way to improve the click depth of a large website? Let me know in the comment section.