Voice and Text Conversations

Applying Sound User-Centered Design Principles to Emerging Technology

As is typical with a new technology, companies are starting to explore the benefits of Chatbots and Assistants for their business and customers. Many innovators start by launching simple examples that run on new devices or in new channels, leaving the script to developers…or anyone brave enough to try. When companies are ready to offer more complex experiences, conversation designers (also called “voice UX” or “VUI” designers for voice applications) should be brought onto the team to take a user-centered approach to intents, specific prompts, and dialog management.

See More About Our Voice and Text Conversation Services


Intents and Scripts

Understanding why users want to use your Assistant/Chatbot, and what the business wants it to do, is key to success (think of the business limits as guardrails). Both comprehensive and lightweight research methods are available to understand which intents must or should be supported. Analyzing how users (customers and employees) talk, how they group requests or information, and which unusual situations need to be handled smoothly leads to better experiences. From there, scenarios, flow charts, error handling strategies, slot-filling strategies, and scripts/prompt lists can be developed to deliver a useful, usable, and engaging experience. Without that, users won’t come back.
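
For example, a minimal slot-filling sketch might look like the Python below. The intent name, slots, and re-prompts are hypothetical and not tied to any particular bot platform; a real strategy would also cover confirmation and error handling.

    # Required slots per intent, and the re-prompt used when a slot is still empty
    # (all names and wording here are illustrative).
    REQUIRED_SLOTS = {
        "book_appointment": ["service", "date", "time"],
    }

    REPROMPTS = {
        "service": "Which service would you like to book?",
        "date": "What day works for you?",
        "time": "And what time would you like?",
    }

    def next_prompt(intent, filled_slots):
        """Return the re-prompt for the first missing slot, or None when the intent is complete."""
        for slot in REQUIRED_SLOTS[intent]:
            if slot not in filled_slots:
                return REPROMPTS[slot]
        return None

    # "Book me a haircut tomorrow" fills service and date, so the bot asks for the time.
    print(next_prompt("book_appointment", {"service": "haircut", "date": "tomorrow"}))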

Tuning and Testing

With voice conversations, prompts need to be tuned or recorded to sound as natural as possible. That can mean using voice actors who can be coached to understand the context and get the intonation right, or using SSML (Speech Synthesis Markup Language) to adjust prosody variables (like rate and pitch) so the artificial voice approximates the right intonation. The goal with artificial speech should be to sound appropriate and understandable, not to fool users into believing they’re speaking to a human.
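
As an illustration, a voice prompt tuned with SSML might be assembled like the sketch below (shown in Python; the prompt wording and the rate, pitch, and pause values are hypothetical and would be adjusted by ear and by user feedback). The resulting string would then be sent to whichever text-to-speech engine the platform provides.

    # Wrap the prompt in SSML and nudge the prosody so the synthesized voice
    # speaks slightly slower and a touch higher, with a short pause before the follow-up question.
    ssml_prompt = (
        "<speak>"
        '<prosody rate="95%" pitch="+5%">'
        "Your appointment is confirmed for Tuesday at three."
        "</prosody>"
        '<break time="300ms"/>'
        "Is there anything else I can help with?"
        "</speak>"
    )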

Testing can involve getting user feedback on the scripts before launch, evaluating the experiences and suggestions of actual users, asking users to evaluate the sound of prompts, and gathering metrics and analytics. Using this feedback, scripts can be adjusted, intents added or expanded, guardrails moved, slot-filling and error strategies tweaked, and prompts re-tuned or re-recorded to continue to optimize the user’s experience and success.


Our Experience

Lyn Bain worked in telecom for 12 years designing voice-based products like Voice Mail, Information Services, Directory Assistance Automation, and Call Management Features. She is an inventor on 3 U.S. patents for voice-based products. When those products had onscreen interactions for the customer or front-line employee as well as conversational interactions, Lyn designed for both. As the possible interactions grow more complex, the experience Lyn brings to understanding user intents and behaviors, linguistics, business goals and processes, and writing understandable interactive scripts will be essential to success.

Nate Hall has produced 3 independent films that won awards at multiple film festivals, and has worked with companies such as Qualcomm and Verizon on industrial and tradeshow videos. His experience writing scripts and working with voice actors to achieve natural dialog on set and in recording studios helps when tuning voice prompts with SSML or working with voice talent to record custom prompts. In addition, Nate’s work on mobile and web applications means that experiences enhanced with onscreen information or interactions can be handled by the same team.

© Chili Interactive, LLC