Data, Record Size, and Usage Limits
Record size limits
Records can’t exceed a certain size limit. This limit may depend on your plan—see the Algolia pricing page for details. If you try to index a record that exceeds the limit, Algolia returns a "Record is too big" error.
There are techniques to help you break up your records into smaller ones if needed.
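One common technique is to split a long text attribute into several smaller records that share a parent key, which you can then deduplicate at query time. The sketch below assumes a hypothetical size limit and attribute names (`content`, `parentID`); the actual limit depends on your plan.

```python
import json

# Hypothetical per-record limit in bytes; the real limit depends on your plan.
MAX_RECORD_BYTES = 10_000

def record_size(record):
    """Approximate a record's size as the length of its serialized JSON."""
    return len(json.dumps(record).encode("utf-8"))

def split_record(record, text_attr="content"):
    """Split an oversized record into smaller ones by chunking a long text
    attribute into paragraphs. Each chunk keeps the parent's other attributes
    plus a shared parentID you can deduplicate on at query time."""
    if record_size(record) <= MAX_RECORD_BYTES:
        return [record]
    base = {k: v for k, v in record.items() if k != text_attr}
    chunks = []
    for i, paragraph in enumerate(record[text_attr].split("\n\n")):
        chunk = dict(base)
        chunk["objectID"] = f"{record['objectID']}-{i}"  # unique ID per chunk
        chunk["parentID"] = record["objectID"]           # shared key for dedup
        chunk[text_attr] = paragraph
        chunks.append(chunk)
    return chunks
```

With this approach you would typically configure the index to deduplicate on the shared key (for example via Algolia's `attributeForDistinct` setting) so searches return one result per parent document.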
Index size limits
For regular plans and infrastructure, the total index size limit is 100 GB. This represents 80% of a regular server’s RAM capacity (128 GB), leaving the remaining 20% to handle your indexing tasks. If the index size exceeds the 100 GB capacity, performance degrades severely: data swaps back and forth between RAM and disk, which is a costly operation.
You can monitor your total index size on the dedicated graph of the Usage page.
There is no limit on the number of records an index can have, only on the memory capacity of the hardware.
Special infrastructure options are available to go well beyond this limit. To learn more, reach out to your account manager or the support team.
Indexing usage limits
Maximum indexing operations
Algolia counts the number of operations performed each month. When you exceed your plan’s limit, you’re charged for the extra operations at your plan’s over-quota rate.
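The over-quota charge is simple arithmetic: operations beyond the monthly quota, billed at a per-unit rate. The sketch below uses placeholder numbers and a hypothetical per-1,000-operations price, not Algolia's actual rates.

```python
def over_quota_charge(operations_used, included_operations, price_per_1000):
    """Charge for operations beyond the monthly quota, billed per 1,000
    extra operations. All rates here are illustrative placeholders."""
    extra = max(0, operations_used - included_operations)
    return (extra / 1000) * price_per_1000
```

For example, using 1.25 million operations on a hypothetical plan that includes 1 million would leave 250,000 extra operations to be billed.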
Indexing rate limit
Algolia delays or rejects indexing operations whenever a server is overloaded. If Algolia determines that indexing operations can negatively impact search requests, it takes action to favor search over indexing. This rate limit exists to protect the server’s search capacity.
Algolia counts a search operation whenever you perform a search. In search-as-you-type implementations, this happens on every keystroke. If you query several indices on each keystroke, one keystroke triggers as many operations as there are queried indices, unless you use the multipleQueries method.
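The multiplication above can be made concrete. In a search-as-you-type UI, each keystroke produces one growing prefix, and each prefix is searched against every index, so typing a query of length n against k indices costs n × k operations. This is a back-of-the-envelope sketch with hypothetical helper names:

```python
def typed_prefixes(query):
    """Prefixes sent as a user types, one per keystroke:
    "shoe" -> ["s", "sh", "sho", "shoe"]."""
    return [query[: i + 1] for i in range(len(query))]

def search_operations(query, num_indices):
    """Total search operations for typing `query` against `num_indices`
    indices, one query per index per keystroke."""
    return len(typed_prefixes(query)) * num_indices
```

Typing a 5-character query against 3 indices would therefore trigger 15 search operations, which is why batching per-keystroke queries matters at scale.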