Voice UI design for a cooking app on Amazon Alexa
In this side project, I explored the idea of using Alexa to cook by developing Chefy, a skill that helps you keep your recipes clean while cooking.
Problem
Cooking can make your recipes (either on paper or on your phone) dirty. Could using a voice-enabled device help keep things clean?
Research
I looked at Alexa skills that compete with Chefy. I paid attention to their invocation names (how the interaction gets started), intents (the instructions the skill is designed to act upon), prompts (the questions it asks the user to keep the conversation going), and how each skill handles the responses users give.
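To make these terms concrete, the pieces a skill is built from can be sketched in plain Python. The intent names, sample utterances, and reprompt below are illustrative inventions for this sketch, not taken from Chefy or any competing skill (a real skill would define its interaction model in the Alexa developer console and handle requests with the ASK SDK):

```python
# Sketch of an interaction model: an invocation name, intents mapped to
# sample utterances, and a reprompt for when the user's reply isn't
# understood. All names here are hypothetical.

INTERACTION_MODEL = {
    "invocation_name": "chefy",  # "Alexa, open Chefy"
    "intents": {
        "NextStepIntent": ["next step", "what's next", "continue"],
        "RepeatStepIntent": ["repeat that", "say that again"],
        "IngredientIntent": ["list the ingredients", "what do i need"],
    },
    "reprompt": "Sorry, I didn't catch that. You can say 'next step' or 'repeat that'.",
}

def resolve_intent(utterance: str) -> str:
    """Return the intent whose sample utterances match, or a fallback."""
    text = utterance.lower().strip()
    for intent, samples in INTERACTION_MODEL["intents"].items():
        if any(sample in text for sample in samples):
            return intent
    return "FallbackIntent"  # the skill would answer with the reprompt
```

For example, `resolve_intent("could you repeat that")` resolves to the repeat intent, while an unrecognised utterance falls through to the fallback and triggers the reprompt.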
I also conducted interviews and on- and off-site usability testing using a Wizard of Oz method.
Wizard-of-Oz tool (using an Amazon parcel)
Remote usability test
Solution
I created this voice user flow to visualise how users interact with Chefy while cooking.
It maps out how all the intents in the skill relate to one another and outlines the system's responses to various inputs, giving a clear overview of the actions available to the user at each point in the conversation.
Learn what the user and system would actually say. View the script here.
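A voice user flow of this kind behaves like a state machine: each state carries the system's prompt and the intents that move the user onward. The states, prompts, and transitions below are invented for illustration, not Chefy's actual script:

```python
# Illustrative state machine for a cooking voice flow. States, prompts,
# and intent names are hypothetical, not Chefy's real script.

FLOW = {
    "welcome": {
        "prompt": "Welcome to Chefy. Which recipe would you like to cook?",
        "transitions": {"ChooseRecipeIntent": "ingredients"},
    },
    "ingredients": {
        "prompt": "Here are the ingredients. Say 'start' when you're ready.",
        "transitions": {"StartCookingIntent": "step"},
    },
    "step": {
        "prompt": "Here is the next step.",
        "transitions": {"NextStepIntent": "step", "StopIntent": "goodbye"},
    },
    "goodbye": {"prompt": "Happy cooking!", "transitions": {}},
}

def advance(state: str, intent: str) -> str:
    """Follow a transition if the intent is valid here; otherwise stay put
    so the skill can reprompt from the same state."""
    return FLOW[state]["transitions"].get(intent, state)
```

Keeping the user in the same state on an unrecognised intent is what lets the skill reprompt gracefully instead of derailing the conversation.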
Voice user flow
Designing conversational UI
Designing conversational UI is not easy. If we ask another person a simple question, we expect to be understood and answered. A bot can only approximate that understanding, yet because a bot or voice assistant speaks in human-like language, we instinctively expect human-level comprehension from it. We must design around these limitations so the interaction still feels conversational. Michael Beebe, a former CEO of Mayfield Robotics, put it this way:

“When something speaks back to you in fluent natural language, you expect at least a child’s level of intelligence… So setting that expectation right keeps it more understandable.”