I gave a 5-minute talk on ‘UX for conversational interfaces’. Here are the individual slides plus running commentary:
In the next four and a half minutes I’ll cover:
- The things I looked at before I started the project.
- What was different when we got into the flow.
- And I’ll finish with a few specifics on interaction design.
Day 0, coming completely fresh to the field of AI. First, what does it mean?
The term AI was coined at a conference in 1956 by American scientist John McCarthy. In short, AI refers to a machine that mimics human cognitive functions such as learning and problem solving.
Natural language processing (NLP) is a branch of AI and refers to a computer program’s ability to understand human language as it is spoken. NLP is also one of the biggest challenges for conversational interfaces because of accents, grammar, slang, different languages, you name it.
Conversational interfaces have been defined as any UI that mimics chatting with a real human. The idea is that instead of communicating with a computer on its own inhuman terms—by clicking on things —you interact with it on yours, by just telling it what to do.
Two types of conversational interfaces:
- Voice assistants
- Chatbots
On the right I have placed a few of the players in the field, like Watson, OK Google, Slack, etc. There are a lot more.
So, what was different when we got into the flow of the project?
I found myself adjusting and tweaking conventional UX methods to better suit working with conversations and this was crucial for a successful collaborative design process.
More than the specifics of interaction design (the final point I will cover), I take this as the most valuable learning and something I am looking forward to expanding on in the future.
As a UX designer creating a chatbot, you are essentially being asked to convert all the information you collect into conversations. This is because language is the main tool you will be using to create value for your users.
This is the key difference in terms of process, outputs and tools. The process is led by copy and logic, and this changes things for UX.
Finally, 4 interaction challenges:
A bot’s main form of interaction is natural language.
The way to interact is to ask questions, and people are really uncomfortable with this because it’s an unfamiliar way to interact with a machine. We’re used to pressing buttons to command things, not having to explain to them what we want to do.
The first interaction challenge is getting the conversation started. We got around this by being clear and assertive: introduce the bot, explain what it is and what it’s here to do.
Make it clear this is a machine you are talking to. There’s no point in pretending there’s a human behind it, because people will figure out they are talking to a machine and will be annoyed that it pretended otherwise. That said, don’t speak like a machine either.
Second challenge: when things go wrong. Machines don’t understand humans because of all the nuances in language.
So there will be lots of errors. Design them in.
For these situations we came up with a 3 level approach:
- First message, “I’m sorry, I don’t know anything about <last thing they typed>”
- Second and third levels vary, to acknowledge that this has happened before and to offer the user different options to manage it.
Although fairly simple, this pattern generated really positive responses because it acknowledged that users were struggling repeatedly and offered them recognition and control.
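The three-level pattern above can be sketched as a small function. This is an illustrative sketch only; the function name, exact copy for levels two and three, and the keywords offered are my own assumptions, not from the talk.

```python
# Hypothetical sketch of the three-level error pattern: the reply escalates
# as the user's consecutive failures accumulate.

def fallback_reply(last_input: str, failure_count: int) -> str:
    """Pick an error message based on how many times in a row the bot failed."""
    if failure_count <= 1:
        # Level 1: a simple apology that echoes back what the user typed.
        return f"I'm sorry, I don't know anything about '{last_input}'."
    elif failure_count == 2:
        # Level 2: acknowledge this has happened before.
        return ("Sorry, I'm still not getting it. "
                "Try rephrasing, or type 'help' to see what I can do.")
    else:
        # Level 3: offer the user a way out of the loop.
        return ("I keep missing what you mean. "
                "Type 'menu' for options, or 'human' to talk to a person.")
```

The caller would track `failure_count` per conversation and reset it to zero whenever the bot understands an input.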
Third, sometimes users will need to speak to a real person, e.g. they hit a dead end, something has gone wrong, or something didn’t happen as expected. There are all sorts of situations where users will want help.
We addressed this by adding permanent help (an icon), and the bot should also respond to phrases like “I need to speak to someone”.
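A minimal way to sketch that second part is naive keyword matching for the handoff intent. The phrase list and function name here are assumptions for illustration; a production bot would more likely use its NLP engine’s intent classification.

```python
# Illustrative sketch: detect that a user wants a human, via simple
# substring matching on a handful of assumed trigger phrases.

HANDOFF_PHRASES = (
    "speak to someone",
    "talk to a person",
    "real person",
    "human",
    "agent",
)

def wants_human(message: str) -> bool:
    """Return True if the message looks like a request for a real person."""
    text = message.lower()
    return any(phrase in text for phrase in HANDOFF_PHRASES)
```

When this matches, the bot would hand the conversation off instead of running its usual fallback.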
4th and final challenge, on keeping the conversation going.
People responded well to a conversation that was fast, dynamic, simple and purposeful.
To achieve this we used:
- assertive and clear copy
- buttons pre-empting options
- and pre-suggested text
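Those three ingredients can be pictured as fields on a single outgoing message. This is a hypothetical payload shape; the field names (`text`, `quick_replies`, `input_placeholder`) are invented for illustration and would depend on the chat platform you build on.

```python
# Hypothetical message payload combining clear copy, buttons that
# pre-empt likely answers, and pre-suggested text in the input field.

def make_prompt(question, options, suggestion=None):
    return {
        "text": question,                # assertive, clear copy
        "quick_replies": options,        # buttons pre-empting likely answers
        "input_placeholder": suggestion, # pre-suggested text
    }

msg = make_prompt("Which day works for you?",
                  ["Today", "Tomorrow"],
                  "e.g. next Monday")
```

Buttons keep the conversation fast because tapping is quicker than typing, while the placeholder nudges users toward input the bot can actually parse.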
On the dynamic side of things, we used animation to guide users’ attention and to give the conversation a real flow. A couple of examples:
- We used a chat indicator to signify that the chatbot was typing. This brought much delight to users.
- We played around with timing and ways to introduce information at specific points in the journey.
We found that animation was fundamental to keeping the conversation engaging.
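One way to sketch the typing-indicator timing is to delay each reply in proportion to its length, capped so long messages don’t stall the flow. The `show_typing`/`send` channel API here is assumed for illustration, as are the pacing numbers; the talk doesn’t specify an implementation.

```python
import time

def reply_with_typing(channel, text, chars_per_second=40, max_delay=2.0):
    """Show a typing indicator, pause briefly, then send the reply.

    `channel` is any object with hypothetical show_typing() and send()
    methods; the pacing constants are illustrative guesses.
    """
    channel.show_typing()
    # Delay scales with message length so longer replies "take longer
    # to type", but is capped to keep the conversation fast.
    time.sleep(min(len(text) / chars_per_second, max_delay))
    channel.send(text)
```

The cap matters: the indicator should feel human, not make users wait.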
To summarize:
People need consistent interaction patterns that make it easy for them to converse with a bot and achieve what they want to do in an appropriate and timely manner. Start with these and build layers of complexity once you have nailed the basics.
Finally, I would love to hear any comments or feedback. Get in touch.
Action shots from Twitter 🙂