Oct 10th 2022 engineering
We invited our friends at Starschema to write about an example of using Algolia in combination with PostgreSQL. Enjoy this two-part series by Software Developer Mátyás Budavári!
If you would like to skip ahead to part two, where the implementation is covered, click here.
Hello, I’m Mátyás, a software developer at Starschema. This post is about audit logging in one of our products and how we can leverage the speed and distributed nature of Algolia to let our moderators search the audit data without having access to the underlying PostgreSQL database.
Our tool lets the users of our enterprise partners access all their reports from different Business Intelligence (BI) software and other resources in one centralized place. The most popular BI tools we work with are Tableau and PowerBI. BI tools help communicate, present, and analyze data with powerful interactive visualizations that can even work with real-time data. Managing multiple data sources and dashboards, however, is often a frustrating and time-consuming experience, full of noise that makes it harder to derive insights and leads to flawed decisions.
We developed a service that improves the data analysis and presentation process by enabling individuals and teams to curate and collaborate in customized shared workspaces. Users can focus on analyzing data and communicating insights without having to shuffle between applications and authenticate multiple times. The contents of these reports are still served from the original tools; we provide a unified experience on top of them. In addition, our users can modify the visible name of these reports and add tags and other metadata that make it easier for their teams to organize them.
We collect usage data periodically, which enables us to fine-tune the application based on how our users actually use it. In addition to collecting these metrics, we must comply with security requirements: we need to be able to tell who could access certain reports at a given time, and what changes were made in the application, by whom, in a specific time range. We collect so-called audit logs to be able to extract this information. This data must be handled with care, and only people with elevated privileges should have access to the raw data.
In a recent project, we needed to develop an optional extension that extracts anonymized audit log information from our PostgreSQL database into an external system, so that a team of content managers can access that data quickly. These audit logs can contain JSON metadata, and it is hard to set up advanced search features for them with PostgreSQL alone. It’s better to index the data so that queries stay fast.
We need data about the usage distribution to present to the creators of the BI reports. Both Tableau and PowerBI provide usage metrics on their reports, but we’d like to show the usage information in a unified manner. We can show them detailed information on how their content is accessed through our application.
Since report owners often want to know the peak usage periods of their insights, we need audit information to see which reports were accessed, by whom, through our application.
Note that the external BI sources manage permissions in their own way. For this reason, creators need to set up audience access in the tool they use to publish the reports; we synchronize this information, and users only see the BI reports they are allowed to access.
We mainly use our application in internal, protected networks. Due to strict policies, it’s usually not possible to connect dedicated analytics services, so we save our analytics data in our own database. However, our analytics team usually does not have access to that database. We don’t send out actual user data or report metadata. Sometimes we can send out anonymized data and explain what specific identifiers represent through a different medium, if it’s relevant and we’re allowed to share.
All in all, we need a way to query the usage logs fast without access to the original database.
We decided to use Algolia as the search index. We don’t want to host, manage, and maintain our own search index service per installation; it’s much better to have a managed, centralized place where we can send our data. Our focus groups can use it to compose the reports they need from the data we send there.
For security reasons, and out of respect for our clients, this is an opt-in feature.
The application’s backend service is written in Go. We store the connection information and report metadata for the BI tools, along with the usage information, in a PostgreSQL database.
We aim for a solution as close to the database as possible. We want to leverage the features PostgreSQL provides out of the box, without any extensions. And we don’t want to modify our application code too much to support this data gathering, though we’re open to running small microservices that communicate with Algolia.
To send our data to Algolia, we need to write some kind of code outside PostgreSQL.
We want to create the data load component as a separate service with the single purpose of sending data to the index. Due to the privacy of our users and reports, we cannot expose our database to the above-mentioned external auditors. We can only send them a selected set of information.
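To make that “selected set of information” a bit more concrete, here is a minimal sketch in Go of what an anonymized audit record could look like once it leaves the database: opaque identifiers plus the JSON metadata attached to the log line. Every field name here is a hypothetical placeholder; the real schema is set up in part two.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// AuditRecord is a hypothetical shape for one anonymized audit entry.
// Only opaque identifiers and free-form metadata leave the database;
// no actual user data or report metadata is included.
type AuditRecord struct {
	ObjectID   string            `json:"objectID"` // Algolia expects an objectID per record
	Action     string            `json:"action"`   // e.g. "report.viewed", "report.renamed"
	ReportID   string            `json:"reportId"` // opaque report identifier
	ActorID    string            `json:"actorId"`  // anonymized user identifier
	OccurredAt time.Time         `json:"occurredAt"`
	Metadata   map[string]string `json:"metadata"` // JSON metadata attached to the log line
}

func main() {
	rec := AuditRecord{
		ObjectID:   "42",
		Action:     "report.viewed",
		ReportID:   "rep_123",
		ActorID:    "user_456",
		OccurredAt: time.Now().UTC(),
		Metadata:   map[string]string{"source": "tableau"},
	}
	out, _ := json.MarshalIndent(rec, "", "  ")
	fmt.Println(string(out))
}
```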
The audit log table can grow large over time in proportion to usage, and we don’t want to query the whole log table every time we need to send out data. It’s better to have a small queue that takes care of the data load. Every log line matters, but we can’t rely on these small services being available at all times, so we have to keep track of which rows still need to be processed.
We plan to create a new table in PostgreSQL that stores the filtered data that has yet to be sent to Algolia. This table will act as a LIFO (last-in, first-out) queue, prioritizing new data over older entries.
We can send our already collected data into Algolia upon creating this new queue table.
We aim to attach a trigger to the source table so that, on every insert statement, a PostgreSQL trigger function copies the necessary fields into this new table.
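As a rough sketch of the last two paragraphs, the Go snippet below creates a hypothetical audit_log_queue table and attaches a trigger to an assumed audit_log source table. All table, column, and function names are placeholders, not our actual schema; part two of the series walks through the real setup.

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq" // any PostgreSQL driver works; lib/pq is just one option
)

// ddl sketches the queue table and trigger described above. The object names
// (audit_log, audit_log_queue, enqueue_audit_log) are hypothetical placeholders.
const ddl = `
CREATE TABLE IF NOT EXISTS audit_log_queue (
    id         BIGSERIAL PRIMARY KEY,
    action     TEXT        NOT NULL,
    report_id  TEXT        NOT NULL,
    actor_id   TEXT        NOT NULL,
    metadata   JSONB,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);

-- Copy only the fields we are allowed to share from the source table.
CREATE OR REPLACE FUNCTION enqueue_audit_log() RETURNS trigger AS $$
BEGIN
    INSERT INTO audit_log_queue (action, report_id, actor_id, metadata, created_at)
    VALUES (NEW.action, NEW.report_id, NEW.actor_id, NEW.metadata, NEW.created_at);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- EXECUTE FUNCTION requires PostgreSQL 11+; older versions use EXECUTE PROCEDURE.
DROP TRIGGER IF EXISTS audit_log_to_queue ON audit_log;
CREATE TRIGGER audit_log_to_queue
    AFTER INSERT ON audit_log
    FOR EACH ROW EXECUTE FUNCTION enqueue_audit_log();
`

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/app?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// A parameter-less Exec lets the driver send the multi-statement DDL in one call.
	if _, err := db.Exec(ddl); err != nil {
		log.Fatal(err)
	}
	log.Println("queue table and trigger are in place")
}
```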
We plan to write a new service that periodically reads the new table, takes the last few items, and sends them to Algolia. It would be even better if multiple such mini-services could run simultaneously, working on the queue independently without interrupting each other.
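A consumer along those lines could look like the sketch below. It relies on PostgreSQL’s FOR UPDATE SKIP LOCKED so several instances can drain the queue concurrently without blocking each other, and it pushes each batch to Algolia with the Go API client. The table, index, and field names are assumptions carried over from the earlier sketches, not our actual implementation.

```go
package main

import (
	"database/sql"
	"encoding/json"
	"log"
	"strconv"
	"time"

	"github.com/algolia/algoliasearch-client-go/v3/algolia/search"
	_ "github.com/lib/pq"
)

// AuditRecord mirrors the hypothetical audit_log_queue table from the earlier sketch.
type AuditRecord struct {
	ObjectID   string          `json:"objectID"`
	Action     string          `json:"action"`
	ReportID   string          `json:"reportId"`
	ActorID    string          `json:"actorId"`
	Metadata   json.RawMessage `json:"metadata"`
	OccurredAt time.Time       `json:"occurredAt"`
}

// drainBatch pops up to `limit` rows off the queue and pushes them to Algolia.
// FOR UPDATE SKIP LOCKED lets several consumers run at the same time: each one
// only sees rows that no other consumer is currently holding.
func drainBatch(db *sql.DB, index *search.Index, limit int) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	defer tx.Rollback() // no-op once the transaction has been committed

	rows, err := tx.Query(`
		DELETE FROM audit_log_queue
		WHERE id IN (
			SELECT id FROM audit_log_queue
			ORDER BY id DESC -- newest first, the LIFO behaviour described above
			LIMIT $1
			FOR UPDATE SKIP LOCKED
		)
		RETURNING id, action, report_id, actor_id, metadata::text, created_at`, limit)
	if err != nil {
		return err
	}
	defer rows.Close()

	var batch []AuditRecord
	for rows.Next() {
		var (
			rec  AuditRecord
			id   int64
			meta []byte
		)
		if err := rows.Scan(&id, &rec.Action, &rec.ReportID, &rec.ActorID, &meta, &rec.OccurredAt); err != nil {
			return err
		}
		rec.ObjectID = strconv.FormatInt(id, 10)
		rec.Metadata = json.RawMessage(meta)
		batch = append(batch, rec)
	}
	if err := rows.Err(); err != nil {
		return err
	}
	if len(batch) == 0 {
		return tx.Commit()
	}

	// Push the batch before committing, so a failed upload leaves the rows on the queue.
	if _, err := index.SaveObjects(batch); err != nil {
		return err
	}
	return tx.Commit()
}

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/app?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	client := search.NewClient("YourApplicationID", "YourAdminAPIKey")
	index := client.InitIndex("audit_logs") // hypothetical index name

	for {
		if err := drainBatch(db, index, 100); err != nil {
			log.Println("sync failed:", err)
		}
		time.Sleep(30 * time.Second)
	}
}
```

Because the rows are deleted and uploaded inside the same transaction, a failed push simply rolls them back onto the queue for the next run, and SKIP LOCKED keeps concurrent consumers from processing the same rows twice.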
In the next part of this article series, we’ll put the plan in motion by setting up the required triggers, seeding some random test data, and finally setting up the Producer and Consumer code to handle syncing data from PostgreSQL to our Algolia Index.
Software Developer Team Lead