The impact of Algolia on SEO

Last updated 28 April 2017

If implemented well, Algolia doesn’t have any direct positive or negative impact on SEO. But there are several small, indirect positive impacts, because search engines prefer websites with a good UX.

There are a few things to take into account when implementing search through the front end with JavaScript, as we recommend. These apply to any search provider.

Many duplicate pages

If search results pages from any provider, not just Algolia, get indexed, you can end up with duplicate content, because faceted pages often show similar results.

This can lead to a situation where link equity is spread across pages, and no one page has enough to rank highly in the search result pages. This is solved with the canonical tag. It tells the search engine crawlers, “Yes, this page is a duplicate, so give all of the link equity to this other page.”
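For example, a faceted results page can point crawlers at the broader page it duplicates. A minimal sketch (the URLs are placeholders):

```html
<!-- In the <head> of a faceted page such as /search?q=shoes&brand=acme -->
<!-- Point crawlers at the page that should collect the link equity -->
<link rel="canonical" href="https://www.example.com/search?q=shoes" />
```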

Google also doesn’t recommend indexing search results pages. They reserve the right to penalize these pages, and many SEO experts recommend disallowing them via your robots.txt and noindex’ing them. By letting search engines crawl lower-value pages like search results, you use up your crawl budget, which might in turn not be allocated to more important pages.
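A minimal sketch of the robots.txt approach, assuming your search results live under a /search path (adjust to your own URLs):

```
# robots.txt — keep crawlers from spending crawl budget on search results pages
User-agent: *
Disallow: /search
```

The noindex approach instead adds `<meta name="robots" content="noindex">` to the results pages themselves; note that crawlers can only see that tag on pages they are still allowed to crawl.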

Concerns About JavaScript-Delivered Content

At Algolia, we recommend serving all search results through the front end with JavaScript. We recommend this because it is much better for the user experience, and happy users are returning users.

In the past, there was a big concern that web crawlers were unable to see this content. But in 2015, Google announced that they can indeed see it. It is important to note, however, that Google still recommends having individual URLs for the “pages” that the JavaScript creates.

To do this, you can use the browser APIs that allow you to manipulate the browser history. This is also a good UX principle independent of the SEO benefits (as is generally the case!). Note that our InstantSearch.js library does this automatically if you enable URL sync.
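A minimal sketch of the History API part, where `searchInput` and `refine()` are placeholders for your own input element and Algolia search call:

```js
// Update results, then give this "page" of results its own URL
searchInput.addEventListener('input', (event) => {
  const query = event.target.value;
  refine(query); // placeholder for your Algolia query

  const url = new URL(window.location);
  url.searchParams.set('q', query);
  history.pushState({ q: query }, '', url);
});

// Restore the matching results when the user navigates back or forward
window.addEventListener('popstate', (event) => {
  refine(event.state ? event.state.q : '');
});
```

With InstantSearch.js, enabling URL sync (exposed as the urlSync option at the time of writing) gives you this behavior without writing it yourself.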

However, if you want to be 100% safe, you can pre-render pages on the backend and serve them to the search engine crawlers. This will become increasingly unnecessary as web crawlers get better at executing JavaScript.
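A very rough sketch of that idea, assuming a Node/Express server and a hypothetical renderPage() helper that returns the pre-rendered HTML for a given URL:

```js
const express = require('express');
const { renderPage } = require('./prerender'); // hypothetical pre-rendering helper

const app = express();
const BOT_PATTERN = /googlebot|bingbot|baiduspider/i;

app.get('/search', async (req, res) => {
  const userAgent = req.headers['user-agent'] || '';

  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers get fully rendered HTML
    res.send(await renderPage(req.originalUrl));
  } else {
    // Regular users get the JavaScript-driven experience
    res.sendFile(__dirname + '/public/search.html');
  }
});

app.listen(3000);
```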
