In this blog post I’d like to share our team’s experience and lessons learned from implementing Algolia in a B2B webshop running on SAP Commerce Cloud 2011.
What is Algolia?
Algolia is a cloud-based search & discovery platform, or search engine, that offers AI recommendation and ‘searchandising’ capabilities in eCommerce environments. In a SAP Commerce context, it can replace the Apache SOLR server to provide (faceted) search to users of the webshop storefront.
Why replace SOLR?
As Apache SOLR is shipped with SAP Commerce Cloud, it comes at no additional cost, and the platform has built-in integrations to interact with the SOLR server. So why replace it? Searching with Apache SOLR is fast, reliable and highly configurable if you know what you’re doing. And that’s just the thing: it’s quite difficult for business users to leverage the power of SOLR in a SAP Commerce environment. Dealing with schema.xml files, configuring search behaviour in the SAP Commerce backoffice, managing indices and analysing query breakdowns in the SOLR server console can prove challenging even for backend developers with a technical background, let alone marketers.
In contrast, Algolia offers a user-friendly and intuitive environment where developers and business users can configure, test and demo the search behaviour. This, together with tech support and extensive documentation, allows the business to build the search experience they desire with less dependence on SAP Commerce developers, who in turn can focus more on developing functionality in other areas. On top of that, Algolia has smart (AI) capabilities and a rule engine that can further drive commercial goals.
The first design decision we faced was whether to have the frontend interact with Algolia via the backend, or directly. Algolia recommends the latter option, and offers ready-to-go React widgets that handle a lot of the logic around searching, facet filtering and pagination. As we have a frontend that’s based on the Accelerator but augmented with React, we went for the direct approach.
As for pushing data and configuration to an index, Algolia offers a Java API Client SDK (Software Development Kit) that can easily be included in the project via Maven. For a moment we were afraid we’d have to essentially recreate all of the logic around indexing SOLR search data, but luckily SAP Commerce Cloud offers a generic searchservices extension where most of this logic is abstracted. The whole structure of search configurations, indexing cronjobs and field value providers can be used to keep Algolia indices in sync pretty much the same way as with SOLR.
To connect the generic searchservices extension with the specific Algolia API Client we implemented some searchservices interfaces and added some additional business logic. We implemented both one-phase and two-phase indexing: two-phase by creating a temporary index next to the existing one and switching over when successful; one-phase by adding records to the existing index and removing the records with an ‘old’ operationId when successful.
We housed all of this logic in a generic extension that could be re-used in the future, and created a separate extension for project-specific logic and configuration.
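To make the operationId-based cleanup concrete, here is a minimal sketch with plain JDK collections. The class and method names are illustrative assumptions, not the actual searchservices interfaces, and the index is modeled as a simple list of maps rather than a real Algolia index reached through the Java API client.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the operationId-based cleanup: every record pushed
// to the index carries the id of the indexing run that produced it. After a
// run completes successfully, records written by older runs are removed.
public class OperationIdCleanup {

    public static List<Map<String, Object>> removeStaleRecords(
            List<Map<String, Object>> indexRecords, String currentOperationId) {
        List<Map<String, Object>> fresh = new ArrayList<>();
        for (Map<String, Object> record : indexRecords) {
            // Keep only records written by the current (successful) run.
            if (currentOperationId.equals(record.get("operationId"))) {
                fresh.add(record);
            }
        }
        return fresh;
    }
}
```

In the real setup this deletion would be expressed as a delete-by-query against the live index, but the bookkeeping idea is the same.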
With some of the ready-made Algolia React widgets injected in our frontend, and the generic searchservices extension connected to the API Client, we had a basis of indexing and searching data.
Challenge 1: Personalized pricing
During the implementation process there were two main challenges. The first one had to do with personalized pricing. In our B2B shop, users see different prices for products due to specific discounts offered to buyer companies. Those prices, subject to daily change, are fetched on-the-fly by the webshop from an external system.
Although Algolia suggests it’s possible to index multiple pricing tiers or groups, in our case the complexity and volatility of the pricing system ruled this out. This meant that for rendering a search result lister page with twenty products, no matter how fast Algolia returned the results, we’d have to wait for the price call to finish. In practice, this added multiple seconds to a page load that otherwise took milliseconds, a real performance killer.
The solution we opted for was to load the search result page first and decorate the product tiles with price data fetched in an asynchronous call to the backend. In fairness, this solution would have given the same performance boost in the previous SOLR setup.
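A minimal sketch of that decoration flow, with illustrative names (not the actual project code): the tiles are rendered immediately with a placeholder, and a stand-in for the slow external price call fills in the real values once it completes.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;

// Sketch of asynchronous price decoration. The lister page is built from the
// search result with a placeholder price, and the slow personalized-price
// call fills in the real values when it returns.
public class PriceDecoration {

    // Stand-in for the call to the external pricing system (assumed name).
    static CompletableFuture<Map<String, Double>> fetchPrices(List<String> productCodes) {
        return CompletableFuture.supplyAsync(() -> {
            Map<String, Double> prices = new HashMap<>();
            for (String code : productCodes) {
                prices.put(code, 9.99); // dummy personalized price
            }
            return prices;
        });
    }

    // Render tiles immediately with a placeholder, then decorate them with
    // the fetched prices once the async call completes.
    public static Map<String, String> renderAndDecorate(List<String> productCodes) {
        Map<String, String> tiles = new HashMap<>();
        for (String code : productCodes) {
            tiles.put(code, "loading"); // page is usable before prices arrive
        }
        fetchPrices(productCodes)
                .thenAccept(prices -> prices.forEach(
                        (code, price) -> tiles.put(code, price.toString())))
                .join(); // joined here only so the sketch is deterministic
        return tiles;
    }
}
```

In our shop the two halves live in different systems, of course: the tiles are rendered by the React frontend and the price call goes to the SAP Commerce backend, but the load-first-decorate-later shape is the same.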
Challenge 2: Dynamic faceting
The second challenge we faced was around facets. With a diverse B2B product catalog and a detailed classification system, we had about 1000 facets configured in our SOLR setup. The idea was that with ‘dynamic faceting’, users get served facet filters that apply to the items in the search result, and new possible facets pop up as they drill down and refine with filters.
The Algolia React widgets setup, however, required all possible facet widgets to be rendered up front, only deciding afterwards which ones to display based on the search result. This caused major performance issues.
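The direction we wanted instead can be sketched as follows, under the assumption that the facet counts of the current search response are available: derive the facets to render from the response, so that only facets actually occurring in the result are shown. The names below are illustrative, not a real widget API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of dynamic faceting: instead of mounting a refinement
// widget for every one of the ~1000 configured facets up front, keep the full
// facet list in configuration and render only the facets that have a non-zero
// count in the current search response.
public class DynamicFacets {

    public static List<String> facetsToRender(List<String> configuredFacets,
                                              Map<String, Integer> facetCountsInResult) {
        List<String> visible = new ArrayList<>();
        for (String facet : configuredFacets) {
            Integer count = facetCountsInResult.get(facet);
            if (count != null && count > 0) {
                visible.add(facet); // preserves the configured ordering
            }
        }
        return visible;
    }
}
```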
Once we had indexed our full product catalog for the first time, we could start testing and configuring search behaviour. One of the main advantages of Algolia is the short testing feedback loop. You can enter a search query in the index interface, see exactly how the query was broken down, what the results are and how they are scored and ranked, and make adjustments. The scoring system, however, is not always easy to understand and is perhaps a little less fine-grained than SOLR’s in some respects.
Despite the above difficulties, good gains were made, although some search queries just didn’t yield the desired results due to limitations of the underlying data. As goes for all software systems: garbage in, garbage out. For these queries, it wouldn’t matter which search engine was used. Even if Algolia performed better with AI applied, it would perform better still with improved data. We’ve since started initiatives to improve our product data, to make it easier to distinguish between main products and accessories, for example.
If you really want to expand the search and discovery features of your website, and empower business users to build these experiences, it makes sense to replace the default SOLR search with a next-gen search engine: a paid service with an intuitive UI, analytics, AI capabilities and tech support. That doesn’t mean SOLR is bad; it may just be harder and take more development capacity to unlock its potential, and it has its limits. The built-in abstractions in SAP Commerce Cloud combined with the Algolia SDK, together with the Algolia frontend widgets, make the initial switch relatively painless, after which project-specific issues can be tackled.
Pros
– Intuitive and feature-rich UI, rule testing, very handy demo feature for business users
– Able to manually modify records from UI (for testing purposes)
– Insightful API logs
– Supports one-phase and two-phase indexing
– Search behaviour can be tested directly on the index, or in demo environments
– Easily move around data, search configurations, rules etc. between indices and environments
– Able to re-create records or modify specific values in records when indexing
Cons
– Dynamic faceting not supported out-of-the-box
– Having a lot of classification-based facets is slow with React widgets
– Facet (rendering) metadata not included; it can be added as custom data but is not manageable
– Could offer a more detailed breakdown of query and hits (like the SOLR analyser tool)