
What is Natural Language Processing, and how is it leveraged by search tools/software?

Language is one of our most basic ways of communicating, but it is also a rich source of information and one that we use all the time, including online. What if we could use that language, both written and spoken, in an automated way? That’s what natural language processing sets out to do.

Natural language processing, or NLP, takes language and processes it into bits of information that software can use. With this information, the software can then do myriad other tasks, which we’ll also examine.

Why is natural language processing necessary?

But first, why is natural language processing even necessary? First off, massive amounts of information are created and shared every day through natural language. Billions of social media posts are published daily. Trillions of searches happen on search engines great and small. Call transcripts. Emails. Classifieds. News articles. Some of these, like search queries, benefit directly from NLP. Others, like news articles, can be processed via NLP to create value.

Let’s look at a couple of examples in more detail. We’ll start off by looking at news articles.

Jones to Assume Presidency of Acme Corp.

Marcus L. Jones today announced that he was to become the 4th President in Acme Corp. history. He will lead the widget maker into its next chapter as it examines expansion into new markets, such as Europe, Mexico, and Canada.

Now think about all of the things we may want to do with this text. For example, we may want to know which companies, subjects, countries, and other key entities are mentioned so that we can tag and categorize similar articles. One way to do that is to first decide that only nouns and adjectives are eligible to be considered as tags. For this we would use a part-of-speech tagger, which specifies what part of speech each word in a text is.
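To make this concrete, here is a minimal sketch of part-of-speech tagging in Python using the open-source spaCy library and its small English model (the choice of library is our assumption; any POS tagger works along the same lines):

```python
import spacy

# Assumes the small English model has been downloaded:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

text = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history.")
doc = nlp(text)

# Print each token with its part of speech.
for token in doc:
    print(f"{token.text:<12} {token.pos_}")

# Keep only nouns, proper nouns, and adjectives as tag candidates.
candidates = [t.text for t in doc if t.pos_ in ("NOUN", "PROPN", "ADJ")]
print(candidates)
```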

NLP normalization and tokenization

But even once we identify those words, things are tricky, because is “widget” different from “widgets”? Of course not! So we need to use some normalization, which will collapse words to their core so that different variations can be considered equivalent. And normalization can be complex, in cases such as “Europe” and “EU,” or “Marcus L. Jones” and “Marcus Jones.”
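One common way to do that collapsing is lemmatization, which maps each word to its dictionary form. A minimal sketch, again assuming spaCy:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

for word in ["widget", "widgets", "markets", "announced"]:
    # token.lemma_ holds the normalized (dictionary) form of the token.
    print(word, "->", nlp(word)[0].lemma_)
```

Word-level lemmatization covers "widgets" versus "widget"; pairs like "Europe" and "EU," or "Marcus L. Jones" and "Marcus Jones," usually need entity-level normalization, such as a curated synonym list.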

This approach also ignores that the items relevant for tagging and categorization may not be single words, but could be phrases, such as "Acme Corp." Identifying these items is the job of tokenization. Tokenization breaks a larger text into smaller pieces. It may break a document into paragraphs, paragraphs into sentences, and sentences into "tokens." (We won't say words here, because "Acme Corp." can be a token but isn't a word, and "isn't" is a word, but would often be broken down into two tokens: is and n't.) Tokenization can be very difficult. For example, even something as "simple" as identifying sentences in a paragraph is tricky, because what happens when you have a sentence like the first one in the article? Is "Marcus L." a sentence because it ends with a period and is followed by a word with a capital letter?
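Here is a small sketch of sentence and word tokenization, again assuming spaCy; note how the contraction splits into two tokens, just as described above:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history. He will lead the widget maker "
        "into its next chapter.")
doc = nlp(text)

# Sentence segmentation: a well-trained model should not split after "L."
for sent in doc.sents:
    print("SENTENCE:", sent.text)

# Word-level tokenization: a contraction becomes two tokens.
print([t.text for t in nlp("It isn't two words.")])
# Typically: ['It', 'is', "n't", 'two', 'words', '.']
```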

Using NLP for named entity recognition

Altogether, identifying key concepts is what is known as named entity recognition. Named entity recognition is not just about identifying nouns or adjectives, but about identifying important items within a text. In this news article lede, we can be sure that Marcus L. Jones, Acme Corp., Europe, Mexico, and Canada are all named entities.
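A sketch of named entity recognition with spaCy's pretrained model (an assumed choice; trained NER models differ in which labels they emit):

```python
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history. He will lead the widget maker "
        "into its next chapter as it examines expansion into new markets, "
        "such as Europe, Mexico, and Canada.")
doc = nlp(text)

# Each entity span carries a label such as PERSON, ORG, or GPE (places).
for ent in doc.ents:
    print(f"{ent.text:<20} {ent.label_}")
```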

Finally, we may want to understand the connections between words. This helps our programs understand who the "he" in the second sentence refers to, or that "widget maker" is describing Acme Corp.
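One building block for this is dependency parsing, which records how each word attaches grammatically to the rest of the sentence; resolving a pronoun like "he" back to "Marcus L. Jones" usually needs a dedicated coreference component on top. A sketch, again assuming spaCy:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("He will lead the widget maker into its next chapter.")

# token.dep_ is the grammatical relation; token.head is the word it attaches to.
for token in doc:
    print(f"{token.text:<10} {token.dep_:<10} head={token.head.text}")
```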

Natural language processing for search

Natural language processing for search queries is just as important, but with its own challenges and needs. A good way to illustrate this is to also discuss an important factor of natural language processing: the fact that there are thousands of natural languages spoken in the world. Different languages will have different needs, and while English is the language that much NLP software starts with, it is not indicative of all languages.

As an example, English rarely compounds words together without some separator, be it a space or punctuation. In fact, it is so rare that we have the word portmanteau to describe it. Other languages do not follow this convention, and words butt up against each other to form a new word entirely. In German, the word "Hundehütte" means "dog house." It's not two words but one, yet it refers to these two concepts in a combined way.

A naive search engine will match Hundehütte to Hundehütte well enough, but it won't match that query word to the phrase "Hütte für große Hunde," which means "house for big dogs." Natural language processing comes in to decompound the query word into its individual pieces so that the searcher can see the right products. This illustrates another area where the deep learning element of NLP is useful, and how NLP often needs to be language-specific.
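To illustrate the idea (and only the idea), here is a toy dictionary-based decompounder in Python; production systems rely on language-specific lexicons and statistical or neural models rather than a hand-written greedy split like this:

```python
# Hypothetical toy vocabulary for illustration only.
VOCAB = {"hund", "hunde", "hütte", "haus"}

def decompound(word, vocab=VOCAB):
    """Greedily split a lowercased compound into known vocabulary pieces."""
    word = word.lower()
    parts, start = [], 0
    while start < len(word):
        # Try the longest matching piece first.
        for end in range(len(word), start, -1):
            if word[start:end] in vocab:
                parts.append(word[start:end])
                start = end
                break
        else:
            return [word]  # no split found; keep the word whole
    return parts

print(decompound("Hundehütte"))  # ['hunde', 'hütte']
```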

Natural language processing structures data for programs

Going through all of these steps for natural language processing (altogether known as the natural language processing pipeline) returns information that is structured in a way that software can understand. By now you've seen that there is a lot of information hidden inside language. A 40-word paragraph can refer to a company, a person, three regions, and much more about those items. Humans are very good at identifying the important parts of language and understanding how it fits together, but bad at taking hundreds, thousands, or millions of texts and finding trends or grouping them together. Most software programs are the reverse: they can find trends or categorize texts, but they are bad at the text itself. That's why we use tailored software, natural language processing, to structure the text in a way that those programs can use. (By the way, why not combine all the steps into a single program? Having small, focused programs can make each step better, and allows us to combine different tools for different purposes.)
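As a final sketch, here is what the end of such a pipeline might look like: one pass over the article that emits a small structured record a tagging system or search index could consume (again assuming spaCy; the exact fields are our own illustrative choice):

```python
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("Marcus L. Jones today announced that he was to become the 4th "
        "President in Acme Corp. history. He will lead the widget maker "
        "into its next chapter as it examines expansion into new markets, "
        "such as Europe, Mexico, and Canada.")
doc = nlp(text)

# Turn unstructured text into a structured record downstream programs can use.
record = {
    "entities": [(ent.text, ent.label_) for ent in doc.ents],
    "noun_lemmas": sorted({t.lemma_.lower() for t in doc
                           if t.pos_ in ("NOUN", "PROPN")}),
    "sentence_count": sum(1 for _ in doc.sents),
}
print(record)
```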

Conclusion

Above, we looked at the examples of a news article and a search query, and how we could use natural language processing to transform the text into something software can work with. Think now about the other examples of textual content we discussed, like call transcripts, classifieds, or emails. What kind of processing might these texts need?

For more information on how Algolia's search and discovery APIs leverage NLP, or to learn more about how we can help you implement this powerful technology within your site or app for a more engaging user experience, please contact our team of experts.

About the author

Dustin Coates, Product and GTM Manager
