
Troubleshooting indexing issues

The crawler reports indexing issues when it can’t send the extracted data to Algolia. The most common causes, and their solutions, are described below.

Records exceed the maximum for your Algolia plan

Your Algolia plan limits how many records you can import and how large each record can be. If you exceed these limits, the crawler generates one or more of these error messages:

  • Algolia error: Record too big
  • Algolia's record quota exceeded
  • Extractors returned too many records
  • Records extracted are too big

Solution

Reduce the number or size of the records extracted by the crawler, for example by splitting long pages into smaller records with helpers.splitContentIntoRecords() (see the sketch below), or upgrade your Algolia plan.
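
As an illustration, the helper can be called from a recordExtractor. This is a minimal sketch assuming the standard crawler configuration format; the index name, URL pattern, selector, and maxRecordBytes value are placeholders to adapt to your setup:

```js
new Crawler({
  // ...rest of your configuration...
  actions: [
    {
      indexName: 'crawler_pages', // placeholder index name
      pathsToMatch: ['https://www.example.com/**'], // placeholder URL pattern
      recordExtractor: ({ url, $, helpers }) => {
        // Instead of returning one potentially oversized record per page,
        // let the helper split the page content into several smaller records.
        return helpers.splitContentIntoRecords({
          $elements: $('body'),
          baseRecord: { url: url.href, title: $('head > title').text() },
          maxRecordBytes: 10000, // keep each record below this size
        });
      },
    },
  ],
});
```

If you split pages this way, consider enabling distinct on a shared attribute (such as the URL) in your index settings so each page appears only once in search results.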

Data isn’t sent to Algolia

If some of your data isn’t showing up in your indices, first check that the data extraction actions in your crawler configuration are set up correctly.
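
For example, you can temporarily reduce an action to a minimal recordExtractor and confirm that its records reach the index. This is a sketch with placeholder selectors and attribute names:

```js
new Crawler({
  // ...rest of your configuration...
  actions: [
    {
      indexName: 'crawler_pages', // placeholder index name
      pathsToMatch: ['https://www.example.com/**'], // placeholder URL pattern
      recordExtractor: ({ url, $ }) => {
        const title = $('head > title').text().trim();
        // Returning an empty array skips the page entirely, so check that
        // your selectors actually match elements on the crawled pages.
        if (!title) return [];
        return [
          {
            objectID: url.href,
            url: url.href,
            title,
            description: $('meta[name="description"]').attr('content') || '',
          },
        ];
      },
    },
  ],
});
```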

Algolia access issues

Some indexing issues may be due to your Algolia permissions.

Solution

  • Ensure that the appId in the crawler configuration matches the Algolia application that has the Crawler add-on.
  • Ensure that the apiKey in the crawler configuration has the necessary ACL and no index restrictions that would prevent it from accessing the crawler’s indices (see the sketch after this list).
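
One way to inspect a key’s permissions is with the algoliasearch JavaScript client. This sketch assumes the v4 client and uses placeholder credentials; getApiKey must be called with a key that can inspect other keys, such as the admin API key:

```js
const algoliasearch = require('algoliasearch');

// Placeholders: use your application ID and a key allowed to inspect other keys.
const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');

client.getApiKey('YOUR_CRAWLER_API_KEY').then((key) => {
  console.log('ACL:', key.acl); // should include write permissions, e.g. addObject
  console.log('Index restrictions:', key.indexes); // should not exclude the crawler indices
});
```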