Yoigo Assistant

Why voice?

As telco clients, Yoigo users are used to talking to their company on the phone. Many still require telephone assistance to complete their purchases. Most of the time, they simply call to verify information that is already available on the web or to look for more personal advice. Unfortunately, call centres are usually busy, causing long waits for users and high costs for the company. We wanted to explore voice user interfaces to assist our web users and personalise their experience.

Let’s start… at the beginning

We defined the use cases that best fitted this channel based on the following criteria:

  • The interaction already exists as a human-to-human conversation and is relatively brief, with few branches.
  • Users might have to navigate a lot to complete this interaction on the web.
  • Users might want to multitask while completing this interaction.

Looking into real conversations

Before writing new scripts and dialog flows, we decided to listen to and transcribe the real-life conversations that our users were already having with human agents on the telephone. We learned a lot about how agents recommended products based on the clients’ input and about the questions our clients asked most frequently. We also “faked” some of these conversations ourselves, in order to empathise with the user and identify the parts of the conversation that an AI might have more difficulty understanding. Once we were able to look at all these transcriptions together, some patterns started to emerge.

Finding the right voice for Yoigo

In order to start talking, we also needed a voice. We had some brand guidelines that defined Yoigo as a witty and easy-going brand, but we needed more detail in order to hear how it talks.

We created a persona based on these characteristics and added some language guidelines and examples of what to say and what not to say.

Time to design the flows

Based on our analysis, we started to write our dialogs and training phrases. We tried to keep these interactions short and simple, identifying possible crossroads and questions that might lead from one flow to another. We also prepared different dialog scripts for visual and spoken interfaces, such as Google Assistant or Alexa.
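To give a feel for what “dialogs and training phrases” means in practice, here is a minimal sketch of how one intent with channel-specific responses could be represented. It is purely illustrative: the intent name, phrases and answers below are invented and are not the assistant’s real content or code.

```python
# Hypothetical sketch of one dialog intent; names and texts are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class Intent:
    name: str
    training_phrases: list[str]
    # Separate response texts per surface: spoken answers stay short,
    # on-screen answers can carry a little more detail.
    responses: dict[str, str] = field(default_factory=dict)


check_data_usage = Intent(
    name="check_data_usage",
    training_phrases=[
        "How much data do I have left?",
        "What's my remaining data this month?",
        "Data left on my plan",
    ],
    responses={
        "voice": "You have 3 GB left until the 28th.",
        "screen": "You have 3 GB of data left. Your allowance renews on the 28th.",
    },
)


def respond(intent: Intent, surface: str) -> str:
    """Pick the response variant that matches the device surface."""
    return intent.responses.get(surface, intent.responses["voice"])


if __name__ == "__main__":
    print(respond(check_data_usage, "screen"))
```

Keeping voice and screen variants side by side made it easier to write each flow once and still respect the constraints of each surface.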

First user testing

In our first test we discovered that users enjoyed the assistant… too much. Although we had limited our first MVP to only three kinds of interaction, users asked many more questions that the assistant wasn’t prepared for. We decided to add a bunch of short question-and-answer interactions and a general fallback that consulted the “help” tool on the Yoigo website whenever the assistant didn’t have a custom-made answer.
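A rough sketch of that fallback behaviour is shown below. It assumes a hand-written set of question-answer pairs plus a web help search that can be queried over HTTP; the endpoint URL, field names and helper function are hypothetical, not the actual Yoigo integration.

```python
# Hypothetical sketch of the fallback described above; endpoint and field names are invented.
import requests

HELP_SEARCH_URL = "https://example.com/yoigo-help/search"  # placeholder endpoint


def handle_query(query: str, custom_answers: dict[str, str]) -> str:
    """Answer from the hand-written Q&A set, or fall back to the website's help search."""
    if query.lower() in custom_answers:
        return custom_answers[query.lower()]

    # General fallback: look the question up in the web help tool
    # and read back the top result, if there is one.
    resp = requests.get(HELP_SEARCH_URL, params={"q": query}, timeout=5)
    results = resp.json().get("results", [])
    if results:
        return results[0]["summary"]
    return "I'm not sure about that yet, but you can find more details in the help section."
```

This kept the hand-written answers in control of the conversation while giving the assistant something useful to say for the long tail of unexpected questions.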

