
How AI search unlocks long tail results

You’ve optimized your efforts to drive traffic to your site, but now the hard part begins: optimizing on-site search and discovery to boost conversions. Most on-site search engines still require visitors to use simple one-word queries like “wine” or “Champagne” — and you can easily optimize for those — but fail miserably for longer queries or misspellings. Our own data suggests that half or more of all queries are in the long tail. What if your search engine acted more like a person — actually understood a search query like “I’d like a good bubbly for Easter Sunday” — and delivered great results?

If you’re not optimizing search for 100% of your catalog, you’re wasting all that effort on generating website traffic and leaving money on the table. This is where AI search comes in.

Let me explain some terminology first. Fat head, chunky middle, and long tail could be the name of a comedy trio. Instead, they refer to different categories of search terms. There’s the “fat head” of search (the most common queries), the “chunky middle,” where millions of on-site search queries can drive meaningful volume on the site, and the “long tail,” where billions of distinct search queries each generate relatively little volume.

Image: search volume distribution across the fat head, chunky middle, and long tail

Most retailers only have the resources to optimize on-site search for fat head queries. It makes perfect sense to spend time optimizing this part of your catalog; that’s where the most popular search queries are coming from. However, if you only optimize for the fat head, you’re missing a huge opportunity.

AI search offers a solution to optimizing all three segments simultaneously. It can improve revenue everywhere — without adding any additional work or overhead. In this blog, we’ll look at how AI search works to unlock the long tail. 

Note: this is not an article about SEO, generating organic traffic, or SEO strategy. This article is about how to improve on-site long tail search optimization.  

Query types and keyword search 

Sometimes shoppers know exactly what they want, and sometimes they want to go on a journey of discovery. There are different kinds of queries, and they’re getting longer and more complex, driven in part by voice search. Some search query types include:

  • Broad searches like “pens”
  • Exact searches like “Apple iPhone 14 Pro”
  • Feature-related searches like “men’s brown loafers”
  • Compatibility searches like “appetizers for a gluten-free dinner”
  • Concept searches like “something to be visible while running at night” 
  • Symptom-related searches like “alternative medicine to manage ringing in my ear”

Full-text keyword search engines can do a good job matching the first three above but, without help, will struggle with the last three.

Keyword search engines look for exact or closely matching phrases. Built-in typo tolerance, synonym libraries, query categorization, and NLP algorithms can help process these kinds of search queries. For example, a search for something like “men’s size 14 basktball shoes” — even with the misspelling “basktball” — can be parsed and filtered to deliver great results.
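To illustrate the typo-tolerance piece, here is a minimal sketch of matching a misspelled query term against an index vocabulary using Levenshtein (edit) distance. The vocabulary, typo budget, and query are illustrative assumptions, not Algolia’s actual implementation.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def correct(term: str, vocabulary: list[str], max_typos: int = 2) -> str:
    """Return the closest vocabulary word within the typo budget, else the term itself."""
    best = min(vocabulary, key=lambda w: edit_distance(term, w))
    return best if edit_distance(term, best) <= max_typos else term

vocabulary = ["basketball", "baseball", "shoes", "loafers"]
print(correct("basktball", vocabulary))  # -> basketball
```

Real engines precompute index structures rather than scanning the whole vocabulary per query, but the underlying “within N edits” idea is the same.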

Keyword search is fantastic for fat head and many chunky middle queries. But what about more complex queries such as a symptom search? 

Here’s an example: let’s say you sell aspirin and related products on your website. How many ways can your customers search for those products? Here are ten off the top of my head:

  • Baby aspirin 
  • Headache meds
  • Back pain
  • Joint pain
  • Soreness in my neck
  • Best medicine for back pain
  • Aspirin alternative
  • Something to reduce inflammation
  • NSAID for shoulder and back 
  • Doctor recommended OTC meds for preventing heart attacks

There are easily another hundred long tail queries for these products alone. It’s impossible to pre-determine all of the long tail keywords someone might write. This is where AI vector search can help.

Vector search for long tail keywords

Vector embeddings are one of the main technologies behind AI search. Vector search is a type of search engine that understands concepts and similarities between objects in a search index. A vector search engine understands that headache, aspirin, muscle soreness, Nurofen, NSAID, and other similar terms and ideas are related. For example, a pharmacy using vector search can propose “aspirin” when someone types in “headache” or “muscle soreness”, because the search engine knows that all these terms are close in meaning.

We published a longer explanation of how vector search engines work, but briefly, vectors are mathematically generated numbers that represent words. Each of these vectors is plotted in a “vector space”, and the search engine uses machine learning algorithms to cluster millions of data points across thousands of dimensions, building an understanding of concepts based on how near the words are to each other within the vector space. This can work across different languages, too.
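A toy illustration of “nearness in vector space”: cosine similarity between hand-made three-dimensional embeddings. Real embeddings have hundreds or thousands of dimensions and are learned by a model; the numbers below are invented for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Invented embeddings for illustration only
embeddings = {
    "headache": [0.9, 0.1, 0.0],
    "aspirin":  [0.8, 0.3, 0.1],
    "loafers":  [0.0, 0.2, 0.9],
}

query = embeddings["headache"]
ranked = sorted(embeddings, key=lambda w: cosine(query, embeddings[w]), reverse=True)
print(ranked)  # "aspirin" ranks far above "loafers" for a "headache" query
```

With learned embeddings, the same ranking logic is what lets “headache” surface aspirin products even though the word “headache” never appears in the product titles.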

vector space diagram
Image via Medium showing vector space dimensions. Similarity is often measured using Euclidean distance or cosine similarity.

One of the fascinating things about vector search is that it doesn’t rely on keywords or specific search phrases at all. It can frequently deliver great results even if the search term isn’t anywhere to be found on your website. An example might be to search for the term “monarch” and get results for “king” or “queen”. In other words, vector search unlocks the long tail; customers can type in just about anything to get good results. 

In practice, there can be billions of points and thousands of dimensions. Vectors can also be added and subtracted to find meaning and build relationships. One example is espresso – caffeine + steamed milk = decaf cappuccino. Machines might use this kind of calculation to determine an answer or relationship. Search engines could use this capability to determine the largest mountain ranges in an area, infer gender relationships between words, or identify diet cola alternatives. Those are just a few examples, but there are thousands more! Even if a long tail search phrase has never before been used on your site, a vector search engine can determine relevance.
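The espresso analogy can be sketched with invented low-dimensional vectors. The dimensions here stand for roughly (caffeine, milk, coffee flavor); real embedding dimensions are not interpretable like this, so treat the whole setup as a hypothetical.

```python
import math

# Invented 3-d vectors: (caffeine, milk, coffee flavor)
vocab = {
    "espresso":         [1.0, 0.0, 1.0],
    "caffeine":         [1.0, 0.0, 0.0],
    "steamed_milk":     [0.0, 1.0, 0.0],
    "decaf_cappuccino": [0.0, 1.0, 1.0],
    "black_tea":        [0.6, 0.0, 0.0],
}

def nearest(vec, vocab):
    """Find the vocabulary entry closest to vec by Euclidean distance."""
    return min(vocab, key=lambda w: math.dist(vec, vocab[w]))

# espresso - caffeine + steamed_milk, computed element-wise
v = [e - c + m for e, c, m in zip(vocab["espresso"],
                                  vocab["caffeine"],
                                  vocab["steamed_milk"])]
print(nearest(v, vocab))  # -> decaf_cappuccino
```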

Pure vector search has been available for several years, but hasn’t been widely adopted because, as it turns out, it’s expensive to scale and slow. Speed matters to online buyers. Both Amazon and Google have published studies in which they demonstrated that slower page load speeds negatively impact e-commerce conversion rates and reduce customer engagement. 

Amazon showed that every 100 millisecond delay loses 1% of revenue. Similarly, Google showed a 500 ms delay reduced engagement by 20%.

Keyword search is faster than vector search and often still preferred for single-word queries or exact phrase matches. However, it’s impossible to write rules and synonyms for every possible query. 

What if you were able to get the best of both keyword and vector search without the tradeoff of speed or accuracy?

AI combined with keywords

AI is very powerful, but even more so when combined with traditional keyword search technologies. Together, this hybrid combination delivers better results for any type of query. Keyword search is precise; vector search is smart. The combination of the two offers the best of both worlds.
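One simple way to blend the two signals is a weighted sum of normalized keyword and vector scores. The weights and scores below are illustrative assumptions; production hybrid engines (including Algolia’s) use more sophisticated rank fusion than this sketch.

```python
def hybrid_score(keyword_score: float, vector_score: float, alpha: float = 0.5) -> float:
    """alpha=1.0 is pure keyword ranking; alpha=0.0 is pure vector ranking."""
    return alpha * keyword_score + (1 - alpha) * vector_score

# An exact-phrase match scores high on keywords; a conceptual match
# scores high on vector similarity. Both surface near the top.
candidates = {
    "beer cooler bag":        {"keyword": 0.2, "vector": 0.95},
    "insulated can koozie":   {"keyword": 0.0, "vector": 0.90},
    "cold brew coffee maker": {"keyword": 0.4, "vector": 0.30},
}
ranked = sorted(candidates,
                key=lambda p: hybrid_score(candidates[p]["keyword"],
                                           candidates[p]["vector"]),
                reverse=True)
print(ranked)  # conceptually relevant products outrank the keyword-only match
```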

The trick is making it fast across different datasets while determining relevance from both keyword and vector search results. At Algolia, we do that with a technology we call neural hashing. We convert vectors into binary hashes — a smaller, more portable data type that can be run on commodity hardware without any cost overhead. Hashes retain 96% or more of the original vector accuracy. Combined with keywords, we’re able to deliver some mind-blowing results from the fat head to the long tail. See the example below, in which we do a long tail search for “something to keep my beer cold”. Good luck with that on a keyword search engine! Results are fast, too, even on the largest catalogs with millions of SKUs.
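To show why binary hashes are so much cheaper to compare, here is a sketch of the general idea: quantize each float dimension to one bit (here, simply by sign) and compare hashes with Hamming distance, which is far faster than floating-point math. This sign-based scheme is a common textbook approach, not Algolia’s proprietary neural hashing.

```python
def binarize(vec) -> int:
    """Pack a float vector into an int whose bits are the dimension signs."""
    bits = 0
    for x in vec:
        bits = (bits << 1) | (1 if x > 0 else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits — a fast proxy for vector distance."""
    return bin(a ^ b).count("1")

# Invented 4-d vectors for illustration
v1 = [0.8, -0.1, 0.5, 0.3]    # e.g. "aspirin"
v2 = [0.7, -0.2, 0.4, 0.1]    # e.g. "headache" — similar direction
v3 = [-0.6, 0.9, -0.3, -0.2]  # unrelated concept

h1, h2, h3 = binarize(v1), binarize(v2), binarize(v3)
print(hamming(h1, h2), hamming(h1, h3))  # the similar pair differs in fewer bits
```

Hamming distance on packed integers maps to a couple of CPU instructions (XOR plus popcount), which is why hashed comparisons scale to large catalogs on commodity hardware.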

sample AI longtail search
In this sample data set of 22,000 products, hybrid search could find products that match the unusual query in single-digit milliseconds.

There are many different complementary technologies at play including NLP, typo tolerance, stemming, and query categorization. This is a big differentiator between Algolia and some other possible solutions out there. Only Algolia offers deep expertise with all of these search technologies to deliver better results at query time, faster and at scale.

For your customers, an API-First solution such as Algolia AI search means faster, more accurate search, discovery, and recommendations. Better search results mean a better user experience and a higher likelihood of on-site conversion and brand loyalty.

No effort, better results, more revenue

Long tail phrases present a challenge for on-site ecommerce optimization. It’s virtually impossible for your team to determine search intent and write rules, synonyms, and keywords for every possible query combination. Hybrid search offers retailers a smart solution even if you don’t have the exact right keywords on your site. 

Additionally, as your catalog changes, new products and new content are added, or terms take on new meaning, the AI hybrid search engine will adjust. It doesn’t require any additional headcount or operations. The hybrid engine will automatically match keywords or concepts — sometimes a mix of both — depending on the query or search phrase.

Better search results for head terms and long tail keywords will drive higher conversion rates and higher customer satisfaction.

See how AI search improves long tail search results. Sign up to be notified when hybrid search is available.


About the author
Michelle Adams

Chief Revenue Officer at Algolia

