
SAP - Kaleidoscope


Voice integration exploration


While at SAP’s Innovation Center, I was tasked with exploring the possibility of integrating a voice UI (specifically, Amazon Alexa) into an existing SAP application to enhance the product experience.

My role:

  • User Research

  • Secondary Research

  • Ideation & Brainstorming

  • Script Writing

  • User Testing

  • Supporting engineering during implementation

Process:

  • Research

  • Brainstorming and ideation

  • Creating a first draft of the Alexa script to define the conversation flow

  • User testing with a low-fidelity prototype to understand user expectations and validate decisions

  • Regular weekly validation testing of the product as it was developed
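The conversation-flow draft mentioned in the script step can be thought of as a small state machine. The sketch below is purely illustrative — the states, prompts, and trigger phrases are invented, not the actual Kaleidoscope script:

```python
# Hypothetical sketch of a scripted conversation flow as a state machine.
# States, prompts, and trigger phrases are invented for illustration only.

FLOW = {
    "greeting": {
        "prompt": "Welcome to the Innovation Center. Say 'show projects' to begin.",
        "transitions": {"show projects": "project_list"},
    },
    "project_list": {
        "prompt": "Say a project name to hear more about it.",
        "transitions": {"go back": "greeting"},
    },
}

def step(state: str, utterance: str) -> str:
    """Advance the flow if the utterance matches a scripted transition;
    otherwise stay in the current state (mirroring a rigid command script)."""
    transitions = FLOW[state]["transitions"]
    return transitions.get(utterance.strip().lower(), state)
```

Drafting the flow this way makes each dead end visible: any utterance not listed as a transition leaves the user stuck in the same state, which is exactly what user testing later surfaced.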

The application: A touch UI application that allows visitors to the Innovation Center (ICN) to browse through and learn about various projects that the ICN has been working on.

Starting point: observe and interview users of the current application


Conducted market research

Insights gathered:

  • Some devices use Natural Language Processing (NLP) and Machine Learning, and are therefore more capable of carrying out a “conversation” with a user.

  • Other devices can only recognize specific commands that are programmed in; Alexa is one of these.
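The difference between the two device categories can be shown with a short sketch. The phrases and responses are hypothetical, not the actual skill — the point is that exact command matching has no tolerance for paraphrase:

```python
# Illustrative sketch: a device without NLP matches utterances against an
# exact list of programmed commands, so even close paraphrases fall through
# to a fallback. (Hypothetical phrases, not the real Kaleidoscope skill.)

COMMANDS = {
    "tell me about the projects": "Here is an overview of our projects.",
    "open the next project": "Opening the next project.",
}

def handle(utterance: str) -> str:
    """Return the programmed response for an exact command, else a fallback."""
    return COMMANDS.get(utterance.strip().lower(),
                        "Sorry, I didn't understand that.")
```

An exact phrase works, but a natural paraphrase like "what projects do you have?" hits the fallback — which is why scripted commands feel brittle to first-time users.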

We did some exploratory research in which we played the role of Alexa ourselves.

This helped us create a wishlist:

  • Have the system greet possible users in the vicinity, drawing them in.

  • Make the nature of the product more obvious, even before the user interacts with it.

  • Have Alexa act as a help chat, for support.

  • Give users the ability to use voice commands for the entire experience.

User testing

We created a script for Alexa and tested it with a low-fidelity prototype.


User testing feedback

  • The application name “Kaleidoscope” (or “Kali”) did not work well in a voice UI.

  • Users were looking at Alexa as more of a guide.

  • Users were not inclined to use Alexa for actions they could perform with a simple touch.

  • If Alexa did not respond or understand them the first time, users were unlikely to try using the VUI again.

  • The need to re-activate the skill multiple times within a session was a big negative.

  • Better onboarding was required.

Insights and learnings

  • It’s hard to use a device with such specific commands in a space where first time users “happen upon” it.

  • The need for such a feature itself was in question. What value is this adding to the experience?

Limitations with Alexa:

  • Conversation initiation

  • Need to re-activate skill

  • No Natural Language Processing

In the end, we decided to de-prioritize this project in favor of other, more value-adding endeavors.