How To Improve Call Center KPIs with Research

By Shivam Vora | Apr 22, 2015

Efficiency and effectiveness are fundamental to designing products and applications that provide exceptional user experiences. In a design or UX context, efficiency can be described as the speed with which users can complete key tasks when interacting with a product. Intuitive navigation paradigms, tooltips, and hot keys are a few ways to increase efficiency. Effectiveness has to do with completing key tasks and achieving intended goals. The effectiveness of a product or application is affected by how information is presented to the user. Are users drawn to the most important elements on the page? Can users quickly understand what each function does?

Research, Research, Research

To design applications that are efficient and effective, the first step is to conduct user research to gather insights into the existing processes a user has to navigate. As Steve Jobs put it, it is important to focus on the user first and work backwards.

One of ChaiOne’s recent projects involved redesigning a call center application for a large energy company. Customers would call when they had questions or concerns about their bill, wanted to make payments, and/or wanted to view and understand rate plans. For each call, agents were also required to sell products to customers.

Metrics and Key Performance Indicators

Before ChaiOne began redesigning the application, key performance indicators were provided. The subjective metrics were to design an application that was more intuitive, efficient, easy to use, visually appealing, and satisfying. The performance metrics centered on reducing average call handle time for common call types such as billing inquiries and taking payments.

To meet the subjective metrics and better understand how agents perceived the existing application, ChaiOne distributed surveys at the beginning of the process to gather a benchmark. When the project was completed and the designs were final, surveys were distributed again to measure improvements from the existing application to the redesigned app.
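The before-and-after survey comparison can be sketched in a few lines. This is a hypothetical illustration: the question names and 1-5 Likert scores below are invented placeholders, not ChaiOne's actual survey data.

```python
# Sketch of the pre/post survey benchmark comparison.
# All scores below are made-up illustrative data on a 1-5 Likert scale.
from statistics import mean

baseline_scores = {
    "intuitive":          [2, 3, 2, 3, 2],
    "easy_to_use":        [3, 2, 3, 2, 3],
    "visually_appealing": [2, 2, 3, 2, 2],
}
post_redesign_scores = {
    "intuitive":          [4, 5, 4, 4, 5],
    "easy_to_use":        [4, 4, 5, 4, 4],
    "visually_appealing": [5, 4, 4, 5, 4],
}

def score_deltas(before, after):
    """Return the change in mean score for each survey question."""
    return {q: round(mean(after[q]) - mean(before[q]), 2) for q in before}

print(score_deltas(baseline_scores, post_redesign_scores))
```

A positive delta on every question is the signal that the redesign improved the subjective metrics relative to the benchmark.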

To meet the performance metrics, ChaiOne used a strategy grounded in data. To gather a baseline, our research team randomly sampled common call types from a database and documented the transcripts. For example, since taking payments was a common call type, those calls were sampled to establish a baseline for how long it took agents to take payments for customers. Word-for-word agent and customer transcripts were documented, with the goal of understanding overarching themes regarding inefficiencies as well as gathering a benchmark. In short: how long was it taking agents to complete key tasks, such as taking payments and providing customers with relevant information on price and rate plans?
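The baselining step described above, randomly sampling calls of one type and averaging their handle times, could look something like this. The call records, field names, and durations are assumptions for illustration, not the energy company's data.

```python
# Illustrative sketch: sample calls of a given type from a log and
# compute a mean handle-time baseline. All records are placeholder data.
import random
from statistics import mean

call_log = [
    {"type": "take_payment",     "handle_time_s": 310},
    {"type": "take_payment",     "handle_time_s": 285},
    {"type": "billing_question", "handle_time_s": 420},
    {"type": "take_payment",     "handle_time_s": 330},
    {"type": "billing_question", "handle_time_s": 390},
    {"type": "rate_plan_info",   "handle_time_s": 510},
]

def baseline_handle_time(log, call_type, sample_size, seed=0):
    """Randomly sample calls of one type; return mean handle time (seconds)."""
    eligible = [c for c in log if c["type"] == call_type]
    rng = random.Random(seed)  # seeded so the sample is reproducible
    sample = rng.sample(eligible, min(sample_size, len(eligible)))
    return mean(c["handle_time_s"] for c in sample)

print(baseline_handle_time(call_log, "take_payment", sample_size=3))
```

In practice the sample would come from a much larger call database, and each sampled call would also be transcribed to surface the qualitative themes mentioned above.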

Discovery Field Research

At the beginning of the process, before ChaiOne started redesigning the application, field research was conducted to 1) better understand how agents interacted with the current application and 2) identify which areas of the application were problematic. The ChaiOne User Research team conducted interviews and contextual inquiries and dissected the existing workflows, examining, for example, how many clicks it took agents to complete key tasks.

Data-Driven Design – Design based on research, not assumptions

Once we had a better understanding of the inefficiencies in the current application, it was time to begin designing. Key takeaways from the research included:

  - Agents were interacting with several applications at once while trying to hold a conversation with customers.
  - The help program was difficult to navigate, so answering customer questions took a long time.
  - Agents had a hard time explaining products and rate plans because that information was difficult to find in the app.

Before redesigning the application, our team brainstormed ideas for meeting the subjective and performance metrics, then used these key takeaways from the discovery research to guide the redesign.

Changes Executed to Meet Metrics

In order to meet metrics, we made several changes including:

  1. Condense customer information into a single program
  2. Reconfigure customer information to minimize clicking and scrolling
  3. Use typography that maximizes reading speed and scanning
  4. Use a color palette and contrast suitable to the environment to reduce eye fatigue
  5. Add hot keys for power users to navigate faster
  6. Design the interface so that users perceive it to load as fast as possible
  7. Eliminate unnecessary pop-up warnings
  8. Give appropriate system feedback in a reasonable amount of time
  9. Create an easier way to view billing history
  10. Show easy-to-read graphs and give agents the ability to email the graphs to customers

Test, Test, Test

Once the designs were close to being completed, it was time for user testing to ensure that the designs were meeting business objectives and user needs. The goal of the testing was also to benchmark the performance and subjective metrics.

We conducted several sets of usability tests. To measure the performance metrics, certain aspects of the application were prototyped. The results from the usability testing were very positive: call handle time dropped in every tested case, and agents completed tasks more efficiently with the redesigned application.
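Quantifying that improvement comes down to comparing the baseline handle times against the times measured with the prototype. The task names and timings below are invented for illustration; the source does not publish the actual figures.

```python
# Hedged sketch: percent reduction in mean handle time per task.
# Baseline and redesigned times (seconds) are illustrative placeholders.
baseline_s = {"take_payment": 310, "billing_question": 405, "rate_plan_info": 510}
redesign_s = {"take_payment": 240, "billing_question": 300, "rate_plan_info": 350}

def pct_reduction(before, after):
    """Percentage reduction in mean handle time for each task."""
    return {task: round(100 * (before[task] - after[task]) / before[task], 1)
            for task in before}

print(pct_reduction(baseline_s, redesign_s))
```

Reporting the reduction per call type, rather than one aggregate number, makes it clear which redesigned workflows delivered the biggest wins.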

Energy and Utilities

These lessons carry over to designing for energy and utilities more broadly. When designing applications for this industry, the focus should be on intuitive navigation paradigms, readable text, and streamlined functionality. Important information should be visible and easy to find. Simply placing necessary information where consumers will find it can increase efficiency and effectiveness, in some cases by upwards of 22%.