If implemented well, Algolia has no direct positive or negative impact on SEO. But it has several small indirect benefits, because search engines favor websites with a good UX.
Many duplicate pages
By indexing search results from any provider, not just Algolia, you can end up with duplicate content, because faceted pages often show similar results.
This can lead to a situation where link equity is spread across many pages, and no single page has enough to rank highly in the search result pages. The fix is the canonical tag. It tells search engine crawlers, "Yes, this page is a duplicate, so give all of the link equity to this other page."
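As a sketch (the URLs are hypothetical), a faceted results page can point crawlers at its canonical version with a single tag in its `<head>`:

```html
<!-- On a faceted page such as /search?q=shoes&brand=nike (hypothetical URL),
     this consolidates link equity onto the main category page -->
<link rel="canonical" href="https://www.example.com/shoes" />
```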
Google also recommends against indexing search results pages. It reserves the right to penalize them, and many SEO experts recommend disallowing these pages via your robots.txt and noindex'ing them. By letting search engines crawl low-value pages like search results, you use up your crawl budget, which might then not be allocated to more important pages.
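For example, assuming your search results live under a /search path (adjust this to your own URL structure), the two directives look like this:

```
# robots.txt — keeps crawlers away from search result pages
User-agent: *
Disallow: /search
```

```html
<!-- In the <head> of each search results page -->
<meta name="robots" content="noindex" />
```

One subtlety: a crawler that is blocked by robots.txt never fetches the page, so it won't see the noindex tag. Test both directives against your own setup rather than treating this as a one-size-fits-all recipe.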
To do this, use the browser APIs that let you manipulate the browser history. This is also good UX independent of the SEO benefits (good UX and SEO generally go hand in hand!). Note that we do this automatically with our InstantSearch.js library if you enable URL sync.
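As a minimal sketch of the manual approach (the function, state shape, and query parameter names are hypothetical, not part of any Algolia API), you can serialize the search state into a query string and push it onto the history stack:

```javascript
// Hypothetical helper: turn a search state into a URL query string.
function searchStateToQuery(state) {
  const params = new URLSearchParams();
  if (state.query) params.set('q', state.query);
  for (const facet of state.facets || []) {
    params.append('facet', facet);
  }
  return '?' + params.toString();
}

// In the browser, sync the state to the address bar without a page reload:
// history.pushState(state, '', searchStateToQuery(state));
//
// ...and restore it when the user navigates back (renderSearch is a
// hypothetical function that re-renders your results from a saved state):
// window.addEventListener('popstate', (event) => renderSearch(event.state));
```

Because each state gets its own URL, users can bookmark and share specific searches, and the back button behaves as expected.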
We're always looking for feedback to help improve our documentation! Please let us know what's working (or what's not!) - we're constantly iterating thanks to the feedback we receive.