
Algolia Search has always been exceptionally fast. Very few APIs let you call them many times per second and respond faster than your eye can notice, almost as if you weren’t querying a remote service in a data center hundreds or thousands of miles away. When you work on the front end, this is quite freeing. No need to debounce requests or care about loading states; you can rely on the service and focus on crafting experiences that feel instant.

With the introduction of NeuralSearch, we Algolia engineers have faced a new set of challenges: maintaining unparalleled performance while adding one of the most ambitious features we’ve designed since the first release of Algolia Search.

And this isn’t just about optimizing back-end services and cloud infrastructures. When it comes to speed, every link in the chain counts.

The cost of added features

No matter how much you optimize, adding to an existing system is never free. Some changes are subtle or cheap enough to be imperceptible to the end user. And then, there are significant changes that challenge existing designs. These are usually the most interesting, exciting, and innovative features, but they’re also the ones that stack up and compel you to reconsider what you always took for granted.

When talking about NeuralSearch, it’s important to mention that we’re adding an entirely new engine, based on neural hashes, to the existing keyword-based Algolia search engine to deliver more relevant, semantic results.

The introduction of NeuralSearch is challenging because it brings complex LLMs onto the critical path of search. Thanks to our neural hashes, the performance impact is contained compared to classical vector-based solutions, but it still invites us to reconsider early choices, optimize differently, and progressively adopt new designs to solve new problems.

Speeding up the user interface

Algolia provides InstantSearch, a family of UI libraries that fully integrate with the Algolia APIs to help customers quickly build their search & discovery interfaces without handling the unappealing work they’re not experts in. These libraries are designed to receive new content from Algolia whenever the end user interacts with the UI, so naturally, they’re sensitive to the performance of the underlying API. If it takes longer than usual to get search results, no amount of front-end optimizations will make them come faster.
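For context, a typical InstantSearch setup looks roughly like the sketch below, using instantsearch.js widgets. The app credentials, index name, and container selectors are placeholders.

```ts
// Minimal InstantSearch setup with a category filter (placeholder credentials,
// index name, and container selectors).
import algoliasearch from "algoliasearch/lite";
import instantsearch from "instantsearch.js";
import { searchBox, hits, refinementList } from "instantsearch.js/es/widgets";

const search = instantsearch({
  indexName: "products",
  searchClient: algoliasearch("YOUR_APP_ID", "YOUR_SEARCH_API_KEY"),
});

search.addWidgets([
  searchBox({ container: "#searchbox" }),
  hits({ container: "#hits" }),
  refinementList({ container: "#categories", attribute: "category" }),
]);

search.start();
```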

Still, we believe that speed is an end-to-end concern. Building a platform of independent yet composable services means every component in the system requires careful consideration of its individual performance while staying mindful of how it eventually connects to other components. Although the front-end libraries can’t make the Algolia API respond faster, they can account for new constraints and seek to balance them out.

Actively seeking optimizations

With this in mind, we started looking into ways to compensate for when Algolia takes longer to respond. How does this affect the UI? Sure, we can’t force search results to come faster, but is that the whole story?

Interactivity is a crucial component of how you design a search UI library. Unlike a blog or a marketing site, a search UI involves a strong, two-way relationship with the user: they type, click, tap, reset, refine, and expect the interface to reflect those changes.

Eventually, we realized that slower network responses didn’t only affect how fast new search results showed up, but also how fast user-initiated actions were reflected on screen. For example, say you have a search experience for electronics with a filtering UI component that lets you select specific categories you wish to see. When the end user clicked a category refinement and Algolia was slower than usual, it took a while for the refinement to appear selected.

When clicking a category refinement on a throttled network (Good 2G), the UI remains idle, unresponsive to the user’s interaction until the network request finally settles.

From a user experience perspective, this is confusing. The UI feels stuck: you’re not sure whether you clicked, so you might click again thinking you didn’t do it right, which can cancel out your initial action. All of this creates friction and erodes user trust. It doesn’t matter that the state is “technically consistent.” What feels natural is to see immediate, incremental updates of parts of the UI based on interactions, and to eventually reach a consistent state once the updated search results arrive.

This problem surfaced a design limitation: InstantSearch derives and updates its entire state from API responses. In most cases, this works fine because Algolia Search is optimized end-to-end to respond faster than your eye can notice. But when Algolia takes a bit longer to respond because of network latency or internal slowdowns, the clicked refinement doesn’t show up as selected until there’s a response.

With these new constraints in mind, we decided to challenge our initial choices and introduce optimistic UI to the design of InstantSearch.

Optimistic what?

Optimistic UI is a pattern that increases the speed perception of a user interface. Humans are exceptionally good at noticing delays, especially following an interaction. Research shows that 100ms is the limit for users to feel that a system is reacting instantaneously—any longer, and it breaks the connection between action and reaction.

When browsing Instagram and hitting the “Like” button on a post, you expect instant reactivity. The interface should give you immediate feedback and confirm with a visual or haptic response.


When hitting the “Like” button on Instagram, you get an immediate visual response even though the like action may not have been fully dispatched or processed on the back end.

Technically, the whole thing may take time: the network request needs to go through, the back end needs to process the operation (or even queue it if the service is busy), and, provided everything goes well, respond to the front end. But if the front end waits for this response to confirm the interaction visually, the user may sense a delay.

This is where optimistic UI comes in. Instead of cautiously making the user wait in the rare event that the operation may fail, you adopt a confident attitude that it will eventually succeed and immediately reflect it in the UI. If something goes wrong, you can always revert that state and let the user know.

When doing so, you refocus the experience around the user and decide to trust your system: if you’re building a performant, scalable, resilient service and your logs confirm it, it makes sense to design around positive outcomes.
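In code, the pattern boils down to updating the UI before the request settles and rolling back if it fails. Here is a minimal sketch; sendLike, setLiked, and showToast are hypothetical placeholders for your API call and UI updates.

```ts
// Minimal sketch of the optimistic UI pattern. `sendLike`, `setLiked`, and
// `showToast` are hypothetical placeholders, not real APIs.
declare function sendLike(postId: string): Promise<void>;
declare function setLiked(postId: string, liked: boolean): void;
declare function showToast(message: string): void;

async function onLikeClick(postId: string): Promise<void> {
  setLiked(postId, true); // optimistic: reflect the action immediately

  try {
    await sendLike(postId); // the network request settles in the background
  } catch {
    setLiked(postId, false); // the request failed: revert the optimistic state
    showToast("Couldn't like this post. Please try again.");
  }
}
```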

Making InstantSearch optimistic

InstantSearch uses a unidirectional data flow: it performs requests to the Algolia Search API, then derives its UI state from the response. The piece in charge of this state logic is the search helper, an internal dependency that performs search requests and keeps track of the state.

[Diagram: InstantSearch’s unidirectional data flow]

When the end user selects a refinement, InstantSearch updates the search parameters with the new facet refinement and schedules a search. When the helper receives a response from Algolia, it emits an event to which InstantSearch reacts by rendering with the helper’s new state—this is when the UI finally updates.
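To make the flow concrete, here is a rough sketch using the algoliasearch-helper library directly. Event payload shapes may differ slightly across helper versions, and renderUI is a hypothetical render function.

```ts
import algoliasearch from "algoliasearch/lite";
import algoliasearchHelper from "algoliasearch-helper";

declare function renderUI(results: unknown): void; // hypothetical render function

const client = algoliasearch("YOUR_APP_ID", "YOUR_SEARCH_API_KEY");
const helper = algoliasearchHelper(client, "products", { facets: ["category"] });

// The UI only re-renders when Algolia responds...
helper.on("result", ({ results }) => {
  renderUI(results);
});

// ...so this interaction isn't reflected on screen until the response arrives.
helper.toggleFacetRefinement("category", "Phones").search();
```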

This model conflicts with optimistic UI. It “plays it safe” by using API responses as the single source of truth, but it does so at the expense of the user and of how they perceive their interactions.

To make InstantSearch optimistic, we needed to perform two changes:

  • When scheduling a new search, immediately render with the search parameters about to be sent, instead of waiting and using the previous ones computed from the current search results. This change accounts for intent and lets reality eventually catch up.
  • Revert the state on error in case reality doesn’t align with our hopes.

Immediately rendering with the current state

The search helper keeps track of not only one but two pieces of state:

  • The current search parameters with which to perform the next search request. They update whenever a change, such as the end user clicking a refinement, happens in InstantSearch.
  • The latest received search results, which contain a copy of the search parameters from which they originated. They update whenever the Algolia API responds.

The current search parameters are often “ahead” of the ones in the latest received search results because they represent the next search to perform. Using only the search parameters from the latest search results gives us the safety of a single source of truth, but introduces the network-dependent delay we wanted to eliminate. Although the mismatch eventually resolves itself, it relies on constant new searches and immediate responses that we can’t guarantee.

[Diagram: current search parameters vs. the parameters embedded in the latest search results]
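Roughly, these two pieces of state map to properties exposed by the algoliasearch-helper library (property names given here as an assumption based on the helper’s documented API).

```ts
import algoliasearchHelper from "algoliasearch-helper";

declare const helper: ReturnType<typeof algoliasearchHelper>;

const pendingParameters = helper.state; // parameters for the next search to perform
const latestResults = helper.lastResults; // last received results, which embed
                                          // the parameters they were computed from
```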

We decided to keep relying on our single state, but patch its search parameters with the current ones as soon as we schedule a new search from a user interaction. This creates a temporary state where the search parameters and the search results aren’t strictly coherent but better reflect the end user’s mental model.

When clicking a category refinement on a throttled network (Good 2G), the UI immediately reflects it. When Algolia responds, the search results and subcategories update as well.
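As a rough illustration of that patching idea (not InstantSearch’s actual internals), here is what optimistic rendering could look like when using the helper directly; renderRefinements and renderHits are hypothetical UI functions.

```ts
import algoliasearch from "algoliasearch/lite";
import algoliasearchHelper from "algoliasearch-helper";

declare function renderRefinements(searchParameters: unknown): void; // hypothetical
declare function renderHits(results: unknown): void; // hypothetical

const helper = algoliasearchHelper(
  algoliasearch("YOUR_APP_ID", "YOUR_SEARCH_API_KEY"),
  "products",
  { facets: ["category"] }
);

function toggleCategory(value: string) {
  helper.toggleFacetRefinement("category", value);
  renderRefinements(helper.state); // optimistic: reflect the click right away
  helper.search(); // the hits catch up when Algolia responds
}

helper.on("result", ({ results, state }) => {
  renderRefinements(state); // eventually consistent with the actual results
  renderHits(results);
});
```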

Reverting the state on error

Although optimistic UI relies on an ideal near future, we must account for when it doesn’t work out. It can happen when the Internet connection fails or in the rare event that the Algolia API is unreachable.

The canonical way to handle errors optimistically is to revert to the original state and possibly let the user know something went wrong. A typical example is the iOS Messages app: when you send a text, the app optimistically adds it to the conversation flow (and shows a progress bar to reflect the send status), but if the message doesn’t go through, it recovers gracefully by displaying a warning icon next to it that you can tap to try again.

[Image: handling search errors]

In the case of InstantSearch, when the end user selects a refinement that we immediately reflect, but the associated search results never come, we just want to unselect the refinement to reflect the actual state.


To do so, we decided to save the latest “correct” search parameters received from a successful API response as a backup. Whenever an error occurs, we can set the helper state back to them so it aligns with the results we’re displaying.
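A minimal sketch of that backup-and-revert logic, again using the helper directly (event payload shapes may vary across versions; renderRefinements and showErrorBanner are hypothetical):

```ts
import algoliasearchHelper from "algoliasearch-helper";

declare const helper: ReturnType<typeof algoliasearchHelper>;
declare function renderRefinements(searchParameters: unknown): void; // hypothetical
declare function showErrorBanner(message: string): void; // hypothetical

// Keep the last search parameters that produced a successful response.
let lastSuccessfulState = helper.state;

helper.on("result", ({ state }) => {
  lastSuccessfulState = state; // back up the parameters we know are "correct"
});

helper.on("error", ({ error }) => {
  helper.setState(lastSuccessfulState); // roll back the optimistic update
  renderRefinements(helper.state);
  showErrorBanner(error.message);
});
```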

Speed is an end-to-end concern

When considering performance within platforms like Algolia, the big picture is what matters. Speed is an end-to-end concern, where every component in the system needs careful consideration as a unit but also as a part of a whole.

Optimistic UI is a small change with a tangible impact in our quest to design and grow the most reliable, UX-focused search & discovery platform.

About the author
Sarah Dayan

Principal Software Engineer

