Why filter pages are holding your SEO efforts back and how to avoid it



When it comes to eCommerce, product filtering is essential for a good user experience. Whether online shoppers are browsing through product categories or looking for a specific item on an eCommerce store, they are likely to use the site's filtering system. But if your e-store's filter pages are not set up properly, they can hurt your site's SEO efforts.

On the other hand, did you know that a well-configured filtering system can also help your search engine ranking? By creating filters that are SEO-friendly, you can make it easier for customers to find the products they want on your site.

In this article, we'll show you how to set up your filters in a way that won't waste your SEO efforts.

What's wrong with filter pages? The problem of duplicate content

The ultimate purpose of the filtering system on an eCommerce store is to help customers reduce the number of displayed products on a given list. To do so, users need to choose the filter criteria that match their buying intent.


Simply put, filter pages are the URLs you get after applying some filter options. In different industries, these URLs can look pretty similar. The differences lie in the names of the available filter values and categories.


In these stores, the filter-page URLs both contain the domain and collection name, followed by identifiers starting with pf (product filter) and then the applied values, for example opt_size=7 (option size: 7) and pt_type=Protein%20Bars (product type: Protein Bars). (Source: Red Dress, Protein Package)

The filtering system is meant to help both usability and SEO, so how can it hold back your search engine ranking? The real problem arises with inappropriate indexing.

Because filter pages reflect the chosen filter values, the number of possible URLs is unbounded. Any combination of product filters generates a different URL, even when the resulting content is the same.


Just change the order of the applied filters and you'll have two URLs, although they lead to the same filter page. (Source: Red Dress)
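As an illustration, with hypothetical parameter names modeled on the Red Dress URLs above, both of the following addresses render the exact same filtered collection, yet search engines see two distinct URLs:

```
https://example-store.com/collections/dresses?pf_opt_size=7&pf_opt_color=red
https://example-store.com/collections/dresses?pf_opt_color=red&pf_opt_size=7
```

(The domain and parameter names here are illustrative, not taken from a real store.)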

When not indexed properly, filter pages can create duplicate content on the website, which makes it difficult for search engines to determine the relevant pages for related searches. This impedes SEO efforts and diminishes the site's visibility in search engine result pages (SERPs). Sadly, many online stores are unaware of this issue, even though the solution is not that complicated.

How to fix the issue of duplicate content caused by filter pages

In terms of properly indexing filter pages, “less is more" is a rule of thumb. To avoid duplicate content, you can prevent some filter pages from being indexed. Our recommendation is that an indexable filter page should meet at least one of the following requirements:

  • It features legitimate URL addresses
  • OR it generates a source of additional substantial traffic to your website

Otherwise, follow the suggested solutions below to ensure that a large number of filter pages does not have a negative influence on your site.


Use canonical URLs to avoid duplicate content

Canonical URLs rely on the rel="canonical" link element to signal to the search engine which content is original and should be indexed, and which is replicated and shouldn't be. Used this way, you determine the best URL to show up on the SERPs instead of leaving that choice up to search engines like Google. Here is an example from Conversion Giant:

For this page:

https://www.shoes.com/boys-shoes?size=10boys

the canonical would look like this:

<link rel="canonical" href="https://www.shoes.com/boys-shoes" />

You have now essentially told Google that for all filter parameter pages, only count the main URL listed in the canonical tag.
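Concretely, that means every filter variation of the collection carries the same canonical tag. A minimal sketch of the <head>, reusing the shoes.com URLs from the example above (the color filter shown is a hypothetical addition):

```
<!-- On /boys-shoes?size=10boys, /boys-shoes?color=blue, or any other
     filter combination, the head points back to the unfiltered page: -->
<head>
  <link rel="canonical" href="https://www.shoes.com/boys-shoes" />
</head>
```

Because every variant declares the same canonical, Google consolidates their ranking signals onto the one URL you actually want in the SERPs.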

Better fix: handle duplicate content with a “nofollow" rather than a “noindex" tag

A “noindex" tag, added via the robots meta tag, is often used to instruct Google's bots not to index a certain page. In other words, you're telling Google not to include that page in its library of search results. It can also help save your crawling budget.

Read more about crawlability in: 4 SEO Considerations Besides Keywords For Dummies


Still, there are some serious problems with the “noindex" tag. When added to a filter page that brings in a good amount of traffic, it excludes that page from the SERPs and costs you those visitors. The same thing happens if you tag a Filter by Category/Brand page with “noindex": the entire category/brand page gets the tag too, and as a result, Google won't index it.

Keep in mind that the internal links on your store instruct Google to associate specific anchor texts with the pages they point to. For instance, say you have Filter by Brand options like M&M's Protein, Nocco, and Redbull. You want the search engines to bring up the filter pages for these brands when users search for “M&M's Protein", “Nocco", “Redbull", or any related keywords. With the “noindex" tag, you mistakenly take away the ranking of these pages.

That's why using a rel=”nofollow” tag is more effective and less risky.

Rather than messing around with “noindex” tags, consider using a rel=”nofollow” tag in your filter links. Let's look at another example from Conversion Giant:

<a href="https://www.shoes.com/boys-shoes?size=10boys" rel="nofollow">Size 10</a>

Here, you have instructed Google's bots not to follow this link to the filtered URL (with the Size filter applied). The page /boys-shoes/ still gets indexed via links in the main menu or from the boys' apparel page.

Robots.txt disallow can also help with duplicate content issues

Search bots check the robots.txt file first when they visit your site to identify which pages they may crawl and which they may not. All the “not for crawling" paths are listed in the robots.txt file.

The file lives right after the domain name, for example domainname.com/robots.txt, and consists of a sequence of directives. If you want to block specific pages with filtering results, enter the filter URL parameter next to the “Disallow" directive. Example rules may look like this:

Disallow: *?color=red

Disallow: *?size=8

If dynamic filters are enabled on your website, they allow combined or overlapping values. Hence, you may need to set custom patterns that cover the value combinations you want to block.
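For instance, wildcard patterns can block a parameter wherever it appears in the query string, so combined filters are covered regardless of their order. A sketch, assuming hypothetical color and size parameters:

```
User-agent: *
Disallow: /*?*color=
Disallow: /*?*size=
```

Here the leading /*? matches any path that carries a query string, and the inner * lets the rule apply whether the parameter comes first or after another filter, e.g. both ?color=red&size=8 and ?size=8&color=red.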

In conclusion

Product filtering is an important part of the eCommerce experience, but it can be difficult to make sure that it's also SEO-friendly. In this article, we've shown you how to make sure your product filtering is helping you rank higher in search results. We hope you found this information helpful!


If you're looking for more ways to improve your eCommerce site, be sure to subscribe to our blog today to get useful tips and tricks for eCommerce. Join the discussion with the Boost team on our social channels Facebook, Twitter, LinkedIn, and YouTube.

Author

Ellie Ho
eCommerce Enthusiast
A digital nomad who is engrossed with online selling, best practices, and tips to grow sales through website optimization. Loves learning, sharing, and connecting with similar minds.