
Service Review: How To Review A Service

4 min read
Ben Logan
Director

How well does your service work for your customers? And how does it compare to competitors in the market? For one of our clients we investigated exactly that. Adopting a customer-focused service design perspective ourselves, we explored, used and reviewed all relevant channels and touchpoints.

What is a Service Review?

A service review is a review of an organisation’s and its competitors’ overall service delivery. It allows an organisation to highlight UX shortcomings and gaps in the market from a customer experience perspective.

How to Review a Service?

In User Experience we are used to conducting Expert Reviews [1] of digital systems. Marrying this rather formal approach with the idea of a Service Safari [2], we defined a process that would allow us to focus on the right problem and keep everything comparable.

A service can be reviewed using the following process:

  • Shaping approach and categories with the client
  • Establishing a framework for the review
  • Conducting the review
  • Presenting the results

The study comprised an evaluation – more in the sense of a classical competitor review – as well as an exploration and documentation of best practices and opportunities throughout all services.

1. Shaping approach and categories

In order to get everyone on board with the upcoming study, we made sure that stakeholders representing all customer-facing sections of our client’s business were present during a kick-off workshop.

Everyone contributed to brainstorming and reflecting on the objectives of the study, which we distilled into the following review categories:

Awareness

  • Communication of key aspects of the service to the user (USP, mechanics of the service)
  • Communication of pricing and payment model
  • Access through different routes (e.g. search, main URL, shortcuts)

DIY through digital interface

  • Registration, purchase and payment of packages
  • Posting and editing content through the service
  • Navigation through IVR (interactive voice response)

Communication

  • Frequency and timeliness of communication
  • Quality of communication through different channels (e.g. e-mail headers, content, CTAs)
  • Upsell strategy
  • Waiting time, need for referral within support (e.g. online chat, phone calls)
  • Quality of advice, including hints and guidance beyond the questions asked

Luckily, our approach was also informed by interviews and focus groups with end users conducted prior to this project.

2. Establishing a framework

In order to keep aspects comparable across services and phases, we defined a customer journey that would guide our review – even if the actual journey deviated for each service.

Service Review Journey

Each step and each contact with customer support was scored on ease of use, satisfaction and clarity, using an extensive set of 100 aspects in total. These statements or heuristics were based on the kick-off workshop and learnings from previous projects. Breaking the scores down into the three core categories allowed us to communicate shortcomings per service and phase in a comprehensible way.
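As a rough illustration of how such a framework stays comparable, the Python sketch below records individual aspect scores and rolls them up into the three core categories per service and phase. The services, phases, aspect statements and the 1–5 scale are invented for the example and are not taken from the actual study.

```python
from collections import defaultdict
from statistics import mean

# Illustrative only: services, phases, aspects and the 1-5 scale are
# made up, not taken from the actual study.
observations = [
    # (service, phase, core category, aspect statement, score)
    ("Service A", "Awareness", "clarity",      "Pricing is explained before sign-up",   4),
    ("Service A", "Awareness", "ease of use",  "Service is easy to find via search",    5),
    ("Service A", "DIY",       "ease of use",  "Registration requires minimal input",   2),
    ("Service B", "DIY",       "satisfaction", "Payment confirmation arrives promptly", 3),
]

# Roll the individual aspect scores up to one figure per service,
# phase and core category, so results stay comparable across services.
rollup = defaultdict(list)
for service, phase, category, _aspect, score in observations:
    rollup[(service, phase, category)].append(score)

for (service, phase, category), scores in sorted(rollup.items()):
    print(f"{service} | {phase:<10} | {category:<12} | {mean(scores):.1f}")
```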

3. Conducting the review

To conduct the review, we went through the whole journey ourselves. This included online research, calls to customer support, registration and use of the online interface, as well as communication through chat, phone and email.

Of course, we were aware of the subjective nature of a score, which can lead to a dangerous evaluator effect. A first run through all phases on all services was therefore double-checked. The scores were treated like severity ratings and accompanied by detailed comments and rationale. All scores were compared across competitors in order to make the final comparison more robust.
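One simple way to support such a cross-competitor check is to flag aspects where a service’s score deviates strongly from the average across all reviewed services. The sketch below illustrates the idea with invented scores and an assumed threshold; it is not the scoring logic used in the study.

```python
from statistics import mean

# Illustrative only: aspects, services, scores and the threshold are invented.
scores = {
    "Pricing is explained before sign-up": {"Service A": 4, "Service B": 2, "Service C": 3},
    "Registration requires minimal input": {"Service A": 2, "Service B": 5, "Service C": 4},
}

THRESHOLD = 1.5  # deviations larger than this trigger a second look

for aspect, per_service in scores.items():
    avg = mean(per_service.values())
    for service, score in per_service.items():
        if abs(score - avg) >= THRESHOLD:
            print(f"Re-check {service}: {score} vs. average {avg:.1f} on '{aspect}'")
```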

In addition to this evaluative approach, good and bad practices, issues and opportunities were explored and documented throughout the review.

4. Presentation of results

We used visual artefacts to communicate our findings on different scales.

Service Review Map

The detailed scoring framework allowed us to visualise gaps and shortcomings throughout the journey. During two follow-up stakeholder workshops, these visualisations served as a starting point to discuss the position of the brand on a more strategic level.

We also documented more detailed user journeys and flows, in order to highlight obstacles and diversions throughout the process of using the service.

Service Review Flow

Because we logged all interactions with the web interface and customer support, we could also draw a detailed picture of the timeliness of contact and of attempts to upsell.
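As an illustration of how such a log can be analysed, the sketch below computes the gaps between consecutive logged contacts and counts upsell attempts. The entries are invented examples, not data from the review.

```python
from datetime import datetime

# Illustrative log for one service: channels, events, timestamps and
# upsell flags are invented, not observations from the actual review.
log = [
    {"channel": "web",   "event": "registration completed", "time": "2018-03-01 10:00", "upsell": False},
    {"channel": "email", "event": "welcome email received",  "time": "2018-03-01 10:07", "upsell": True},
    {"channel": "phone", "event": "support call answered",   "time": "2018-03-02 09:41", "upsell": False},
]

# Timeliness: gaps between consecutive logged contacts.
times = [datetime.strptime(entry["time"], "%Y-%m-%d %H:%M") for entry in log]
gaps = [later - earlier for earlier, later in zip(times, times[1:])]

print("Gaps between contacts:", [str(gap) for gap in gaps])
print("Upsell attempts:", sum(entry["upsell"] for entry in log))
```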

Where appropriate, we gave full detail by calling out specific usability issues at the interface level.

The discussion of the results with stakeholders helped to uncover operational issues and articulate (conflicting) business goals. Digital products very often evolve over years, and individual areas are kept in silos within companies.

Service Review Comments

Adding a level of business logic to the customer journey maps helped to guide a discussion about the future strategy of the overall service – rather than fixing individual issues.
