Enhancing search experience with SearchCentral

A UX case study on improving search discoverability and evaluating search engines through an A/B test—balancing interface clarity with smarter, faster answers for CPF members.

Category

MVP

Date

Feb 2025

Context

We aimed to enhance the CPF website's search by comparing two engines, SearchCentral (in-house) and SearchSG (third-party), through a structured A/B test. To ensure a fair comparison, both versions were matched in layout, colour, and responsiveness, and both carried a "Beta" tag for transparency. Alongside the engine evaluation, we explored UX enhancements inspired by best practices from government and global sites.

Problem

CPF needed a data-backed comparison to determine which search engine better served members' needs on the CPF website by delivering more accurate, relevant, and efficient results.

Goals

To improve the CPF search experience by:

  • Gathering data to inform future decisions on search strategy and design

  • Improving usability and visibility of the search feature

  • Evaluating engine performance through an unbiased A/B test

Research & benchmark

Business challenges

  • Low discoverability of the search function reduced user engagement and risked invalidating test outcomes

  • Limited customisability of SearchSG made it harder to present an apples-to-apples test

SearchSG at a glance

https://www.search.gov.sg/

Semantic search

  • What it offers: Uses AI to understand user intent and rank results based on relevance.

  • Why it fits our usage: Members often use natural phrasing and varied keywords.

  • Benefit: Delivers faster, more accurate answers to user queries.

Smart suggestions

  • What it offers: Provides predictive search, typo correction, and synonym recognition.

  • Why it fits our usage: Reduces failed searches from typos or uncommon terms.

  • Benefit: Improves search success rate with minimal user effort.

Auto indexing

  • What it offers: Automatically crawls and updates indexed pages from CPF’s website.

  • Why it fits our usage: CPF manages a large volume of dynamic content.

  • Benefit: Keeps results fresh and relevant without manual updates.
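
To illustrate the smart-suggestion behaviour described above, here is a minimal TypeScript sketch of a debounced suggestion fetch. It is purely illustrative: the /api/suggestions endpoint, response shape, and limits are hypothetical placeholders, not SearchSG's actual API.

```typescript
// Illustrative only: endpoint and response shape are hypothetical,
// not SearchSG's actual API.
interface Suggestion {
  label: string; // text shown in the dropdown
  url: string;   // target page on the CPF website
}

const DEBOUNCE_MS = 200;
let debounceTimer: ReturnType<typeof setTimeout> | undefined;

function onSearchInput(query: string, render: (items: Suggestion[]) => void): void {
  // Debounce keystrokes so we only query once the member pauses typing.
  if (debounceTimer !== undefined) clearTimeout(debounceTimer);
  debounceTimer = setTimeout(async () => {
    if (query.trim().length < 2) {
      render([]); // too short to suggest anything useful
      return;
    }
    const res = await fetch(`/api/suggestions?q=${encodeURIComponent(query)}`);
    if (!res.ok) {
      render([]); // fail quietly; the member can still press Enter to search
      return;
    }
    const items: Suggestion[] = await res.json();
    render(items.slice(0, 5)); // cap the dropdown at five suggestions
  }, DEBOUNCE_MS);
}
```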

Benchmark comparison

Define & align

User needs

  • Result accuracy: Members want to get answers to their CPF-related questions quickly and correctly.

  • Minimal effort: Members prefer a direct experience without needing to apply filters or navigate complex interfaces.

  • Reliability: Reassurance that the feature is trustworthy and continuously improving.

Task flow

Layout comparison and alignment

To ensure a fair evaluation of both search experiences, I conducted a careful alignment of the UI between SearchCentral and SearchSG. This was essential to isolate the effectiveness of the search engines themselves without the influence of design disparities.

Some of the alignment efforts are listed below (non-exhaustive); a sketch of shared layout tokens follows the list.

  • Search bar placement: Positioned consistently within the global navigation across both variants

  • Colour theme and typography: Aligned to CPF’s digital design guidelines for visual consistency

  • Consistent layout structure: Page templates and spacing standardised across the two versions
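
As a rough sketch of how this alignment can be kept consistent in code, the snippet below defines shared presentation settings once and applies them to both variants so that only the engine differs. Token names and values are illustrative assumptions, not CPF's actual design guidelines.

```typescript
// Illustrative sketch: token names and values are placeholders,
// not CPF's actual design guidelines.
type SearchVariant = "SearchCentral" | "SearchSG";

const sharedSearchTokens = {
  placement: "global-navigation", // same slot in the header for both variants
  accentColour: "#005555",        // placeholder brand colour
  fontFamily: "inherit",          // inherit the site's typography
  showBetaTag: true,              // "Beta" label shown on both variants
  maxResultsPerPage: 10,
} as const;

// Both variants receive identical presentation settings; only the engine
// identifier differs, which keeps the A/B test apples-to-apples.
function buildSearchConfig(variant: SearchVariant) {
  return { engine: variant, ...sharedSearchTokens };
}
```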

Ideate & design

How might we...

... increase discoverability of the search bar across all devices?

... encourage members to try the improved search experience?

... make it clear that this search feature is being tested and improved?

... make it easier for members to recognise the search feature on mobile?

Launch & measure

A/B test execution

To evaluate the effectiveness of the enhanced search experience, the analytics team conducted a controlled A/B test across the CPF website over a 6-month period.

Members were randomly assigned to experience one of two search engines: SearchCentral or SearchSG.

  • Traffic was evenly split between the two search variants (a minimal bucketing sketch follows this list).

  • Designs were matched as closely as possible in theme and layout to ensure a fair “apples-to-apples” comparison.

  • A “Beta” label was included to inform users of the experimental feature and invite feedback.

  • Search widget placements and icon enhancements were consistent across variants.
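
One common way to implement such a split is deterministic, hash-based bucketing on an anonymous visitor ID, so a member keeps seeing the same variant across visits. The sketch below shows this approach as an assumption about the mechanism; the actual assignment logic sat with the analytics team.

```typescript
type SearchVariant = "SearchCentral" | "SearchSG";

// Simple FNV-1a hash; deterministic, so the same visitor ID always
// lands in the same bucket across visits.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash;
}

// Even 50/50 split between the two variants.
function assignVariant(visitorId: string): SearchVariant {
  return fnv1a(visitorId) % 2 === 0 ? "SearchCentral" : "SearchSG";
}

// Example: assignVariant("anon-visitor-123") returns the same variant every visit.
```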

Metrics tracked

While results remain internal, the analytics team tracked key performance indicators using:

  • Adobe Analytics for SearchCentral

  • Google Analytics for SearchSG

Key metrics monitored included the following (an illustrative click-tracking sketch follows this list):

  • Click-through rates (CTR) from search results

  • Query-to-resolution pathways

  • Engagement on suggested results
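
As a loose illustration of how a result click could be captured for these metrics, the sketch below defines a hypothetical event shape and a placeholder sendToAnalytics transport; the real wiring goes through the Adobe Analytics and Google Analytics tag libraries.

```typescript
// Hypothetical event shape and transport, for illustration only.
interface SearchClickEvent {
  variant: "SearchCentral" | "SearchSG";
  query: string;      // what the member searched for
  resultUrl: string;  // the result that was clicked
  position: number;   // 1-based rank of the clicked result
  suggested: boolean; // true if the click came from a suggested result
}

// Placeholder transport; in practice this would forward to Adobe Analytics
// (SearchCentral) or Google Analytics (SearchSG) via their tag libraries.
function sendToAnalytics(eventName: string, payload: SearchClickEvent): void {
  console.debug(eventName, payload);
}

function trackResultClick(event: SearchClickEvent): void {
  sendToAnalytics("search_result_click", event);
}
```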

Outcome

While figures remain confidential, insights from the A/B test informed CPF’s future search direction, shaping decisions around engine selection, UI clarity, and discoverability strategies. Weighted CTRs and mean reciprocal ranks helped quantify each engine’s benefits, and the resulting engine selection lowered related costs by roughly one third.
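
For readers unfamiliar with these metrics, here is a minimal sketch of how weighted CTR and mean reciprocal rank (MRR) can be computed from logged search sessions. The log shape and weighting are assumptions for illustration; the actual analysis was performed by the analytics team.

```typescript
// Hypothetical log shape for one search session.
interface SearchSession {
  impressions: number;     // how many results were shown
  clicks: number;          // how many results were clicked
  firstClickRank?: number; // 1-based rank of the first clicked result, if any
  weight: number;          // e.g. query volume, so popular queries count more
}

// CTR across sessions, weighted by query volume.
function weightedCtr(sessions: SearchSession[]): number {
  let weightedClicks = 0;
  let weightedImpressions = 0;
  for (const s of sessions) {
    weightedClicks += s.weight * s.clicks;
    weightedImpressions += s.weight * s.impressions;
  }
  return weightedImpressions === 0 ? 0 : weightedClicks / weightedImpressions;
}

// Mean reciprocal rank: average of 1 / rank of the first clicked result;
// sessions with no click contribute 0.
function meanReciprocalRank(sessions: SearchSession[]): number {
  if (sessions.length === 0) return 0;
  const total = sessions.reduce(
    (sum, s) => sum + (s.firstClickRank ? 1 / s.firstClickRank : 0),
    0,
  );
  return total / sessions.length;
}
```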

Takeaway

This project reinforced how findability and relevance go hand-in-hand in delivering a meaningful search experience. While the core aim was to evaluate the search engine’s accuracy, it became clear that discoverability and interface clarity are just as critical.

Search must evolve to remain context-aware, fast, and intuitive.

  • Accuracy means little without discoverability — users must easily notice and trust the search interface

  • Small design tweaks, like icons and widget placement, can drive significant engagement

  • A/B testing enables confident, data-driven improvements without sacrificing user trust
