Conversational Shopping is an experimental project I worked on at Nordstrom from September 2017 to May 2019, exploring features for Google Assistant and Google Home, as well as applications with Dialogflow. I helmed the design of this project (and also the product management for a six-month phase in late 2018).
In the fall of 2017, Google approached Nordstrom with a co-marketing opportunity in exchange for our developing an app for its voice assistant platform. I was the lead UX Designer and worked with—and often led—a cross-functional team (Product Management, Engineering, Program Management, UX Writing, UX Research) for the duration of this project.
Performing Preliminary Research
When this project began, the proliferation of Amazon Alexa and Google Home (now in more than 50 million homes) fueled speculation that conversational UI would become the platform for everything. Nevertheless, there was considerable doubt on the team that soft goods (the bulk of Nordstrom’s inventory) could sell in a chat-oriented interaction, as conversational shopping had not been successful beyond replenishment hard goods (such as toothpaste and toilet paper, and even then with extremely low retention rates). Moreover, browsing for fashion generally necessitates visual review, something a voice interface like the Google Home or Amazon Alexa did not provide at the conception of this project.
Through my initial market research, I learned that chatbots were once heralded by the media as the "killer of apps" and the new interface du jour. After a number of high-profile (and a flurry of low-profile) chatbots fizzled upon launch, Wired proclaimed, "Chatbots are dead." Digit CEO Ethan Bloch retorted, "I'm not even sure if we can say 'chatbots are dead,' because I don't even know if they were ever alive." Despite these high-profile declarations, voice assistants on mobile phones had legitimately ballooned in popularity, with about half of all smartphone users saying they engage with them.
With such mixed results, I felt it was necessary to first conduct exhaustive secondary research. I recorded 1) every use case in which a voice assistant was popular, and 2) every chatbot reported to resonate with users.
I then analyzed those use cases to identify the natural strengths of chat/voice over traditional visual interfaces:
Inherent progressive disclosure
The ability to bring disparate pieces of site technology into one linear, guided flow (i.e. technological hand-holding)
The alacrity and immediacy of voice interaction and feedback
The flexibility of natural language processing
The hands-free interaction model, allowing for multitasking
Hyperpersonalization
The efficient, to-the-point nature of conversation when humans aren't involved
Lack of social pressure (to purchase, to perform, etc.)
I was also able to identify the natural disadvantages of chat:
Users' low expectation that the feature will work as intended, let alone more efficiently than an alternative approach
The blurry line between what is possible and what is not, when presented with what's essentially a command line interface
The admittedly high rate of misunderstanding a user's intent or returning a robotically irrelevant response
The screen real estate consumed by the chat interface itself (rather than showcasing merchandise, for example)
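As a concrete illustration of the first strength above, inherent progressive disclosure: a conversational flow reveals one question at a time rather than presenting a full form. A minimal sketch (all question names and prompts here are hypothetical, not from the actual product):

```python
# Minimal sketch of progressive disclosure in a guided chat flow.
# A traditional form shows every field at once; a conversational
# flow surfaces only the next unanswered question.
# All keys and prompts are invented for illustration.

QUESTIONS = [
    ("category", "What are you shopping for today?"),
    ("size", "What size do you usually wear?"),
    ("budget", "Is there a price range you'd like to stay within?"),
]

def next_prompt(answers):
    """Return the next unanswered question, or None when the flow is done."""
    for key, prompt in QUESTIONS:
        if key not in answers:
            return prompt
    return None

# Usage: the bot only ever asks for the single piece of
# information it is missing.
answers = {"category": "jeans"}
print(next_prompt(answers))  # asks about size next
```

The same pattern also delivers the "technological hand-holding" strength: disparate site features collapse into one linear, guided sequence.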
We also conducted a brainstorm with the wider UX team, based on these natural strengths and disadvantages.
Framing the User Problem
Because executives came to us with an emerging technology in search of a design application, the user problem was as yet nonexistent—we had to go hunting for it. They also mandated that a user must be able to convert/transact within the feature. Since Google was interested in promoting the app on the world stage, we felt the value proposition needed to be sufficiently buzzworthy and magical. It also needed to be enticing enough to attract shoppers to a brand-new channel and way of shopping. To find a compelling use case, I led the UX team in multiple IDEO-inspired daylong exercises: 1) recording logs of all the times in the past year we purchased apparel, 2) analyzing any and all pain points in those purchases, and 3) brainstorming possible solutions where voice could be a particularly useful medium.
After reviewing all the secondary research insights and brainstorm ideas, we took time to consider Nordstrom's competitive advantages and strategic vision, as well as existing technologies (to expedite product shipment), and then homed in on the most promising customer problem: Customers love the personalized styling advice from and "fashion authority" of Nordstrom's seasoned and hip employees, but they don't like the pressure to buy from a commissioned salesperson. What if we could make a chat assistant a robot stylist, so users could get similar advice without the pressure? We recognized, however, that most machine learning models require a large volume of human data before such complex interactions can be automated. Could we create an app that churns out sufficiently compelling personalized recommendations in the short timespan allotted to our team for development? We decided to try.
Four Sprints in Four Weeks
Inspired by Google Ventures' book Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days, we developed four rapid iterations on this basic idea for a Nordstrom Google Assistant app in four weeks, and tested them in the research lab every Friday of that month.
Storyboarding the experience of shopping for essentials with a Nordstrom app on Google Assistant
Version 1
Essentials - Shop wardrobe basics bestsellers, curated by Nordstrom’s fashion authority (this required no personalization at all; machine learning was limited to natural language processing). [See above for storyboard of this experience.]
Version 2
Skincare Regimen - Tell us your skin concerns and we’ll recommend a skincare regimen perfect for you.
Version 3
Denim Fit - Looking for jeans designed with your body in mind? Shop jeans by fit by answering a series of questions about body shape.
Version 4
Complete Your Look - Have an item but don’t know what to wear with it? Tell us what it is, and we’ll complete your look. [See below for screenshots of the prototype.]
Screenshots of the prototype for our 4th experiment "Complete Your Look"
Because the feature was chat-based, we were able to usability-test it by mimicking an AI chat with human texting over iOS iMessage (in UX Research terminology, a "Wizard of Oz" methodology). As the user went through the experience, the UX Writer would respond by copying and pasting directly from her script into the text messages, and I would create and export assets on the fly to add the Google Assistant "UI" (tiles, menus, suggestion bubbles, etc.) to the chat.
Major findings from the Wizard of Oz usability studies:
It is not efficient for users to browse large amounts of product within a chat environment, since so little information is displayed at a time (chat bubbles take up most of the screen real estate). Product sets need to be narrowed down and relevant for individual customers.
It can be very tiresome to go through the back-and-forth of “Do you have it available in size 29?” “No.” “What about size 30?” and so forth, through all the possible questions a user might have.
Users distrust subjective recommendations from a bot, so they did not care for the bot’s tips on flattering denim styles for different body shapes.
Though many of these technologies already exist on our website in various forms, many users don’t know they exist. The Complete Your Look iteration expanded the use case of an existing technology and guided users on exactly how to leverage it for their pain point.
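The size back-and-forth finding suggests a straightforward design remedy: answer the first size question with the full availability picture, rather than one size per turn. A hypothetical sketch (the product and inventory data are invented):

```python
# Hypothetical sketch: instead of answering "Do you have size 29?" /
# "What about 30?" one turn at a time, the bot answers the first
# size question with full availability. Inventory data is invented.

INVENTORY = {
    "slim-fit jeans": {"26": 0, "27": 3, "28": 0, "29": 5, "30": 2},
}

def size_reply(product, asked_size):
    """Answer a size question and volunteer the other in-stock sizes."""
    stock = INVENTORY[product]
    in_stock = sorted(s for s, qty in stock.items() if qty > 0)
    if stock.get(asked_size, 0) > 0:
        return "Yes, size {} is in stock.".format(asked_size)
    return "Size {} is sold out, but sizes {} are available.".format(
        asked_size, ", ".join(in_stock))
```

One turn of conversation now does the work of five, which is exactly the "efficient, to-the-point" strength identified in the secondary research.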
Ultimately, of the four versions, the two that yielded the most positive reviews were the Skincare Regimen and the Complete Your Look iterations.
Behind the scenes of our Wizard of Oz usability studies (user & researcher pictured in lower left corner of the television)
Unfortunately, engineering estimated that shipping either of these iterations would take an unacceptable amount of time (12-18 months), owing to the work of integrating our checkout flow into Google's platform. Executives reconvened and asked us to pivot. We went back to the drawing board, and the engineers found a natural language processing service (Dialogflow) that could be plugged into the Nordstrom app itself (thus bypassing the need to integrate our checkout into Google Assistant), which would drastically reduce engineering scope and timelines. Executives felt this was a better, less risky investment, though it also meant the loss of a high-profile partnership with Google. We commenced an alternative design within the Nordstrom Android app (Nordstrom's Android user base is smaller, making it the ideal petri dish for experimental ideas).
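Dialogflow's core job in this architecture was mapping a shopper's free-form utterance to a known intent. As an illustration only (this is not Dialogflow's actual API; the intents and keyword sets are invented), a toy keyword-scoring matcher conveys the idea:

```python
# Toy illustration of intent matching, the core job an NLP service
# like Dialogflow performed for us. NOT Dialogflow's API: the
# intents, keyword sets, and threshold are all invented.

INTENTS = {
    "product.search": {"find", "show", "looking", "search", "shop"},
    "order.status": {"order", "package", "shipped", "tracking"},
    "store.hours": {"hours", "open", "close", "closing"},
}

def match_intent(utterance, threshold=1):
    """Score each intent by keyword overlap; fall back below threshold."""
    words = set(utterance.lower().split())
    scores = {name: len(words & kws) for name, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else "fallback"
```

A production NLP service generalizes far beyond keyword overlap (training phrases, entity extraction, contexts), but the contract is the same: utterance in, intent plus parameters out, with a fallback when confidence is too low.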
To build something that could ship in two months, Product Management asked that we design a purely proof-of-concept MVP, to test whether users would engage with voice if given a visual affordance. We added a microphone icon to the top navigation bar of the Nordstrom app; tapping it let users essentially dictate their product searches into the app (or answer a limited set of questions and navigate the app). I felt this fell drastically short of “minimally viable” (and asserted as much), but as a team player, I disagreed and committed, designing the requested UI/UX.
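Functionally, the Alpha amounted to a thin layer between speech transcription and the existing search box. A hypothetical sketch of that layer (the filler-word list is invented):

```python
# Hypothetical sketch of the thin layer the Alpha amounted to:
# take transcribed speech and reduce it to a plain search query
# by stripping conversational filler. The filler list is invented.

FILLER = {"show", "me", "find", "please", "can", "you", "i", "want",
          "some", "a", "an", "the", "for"}

def to_search_query(transcript):
    """Strip filler words from a dictated phrase to form a search query."""
    tokens = transcript.lower().strip("?.!").split()
    return " ".join(t for t in tokens if t not in FILLER)
```

Framed this way, the usability risk is visible in the code: the feature adds a speech step in front of search without adding any conversational value on top of it.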
Our four guides to innovate at the edge of technology in this intrapreneurial venture
Basic Alpha user flow
Since I had a hunch that Alpha would be a failure, I started designing Beta before we even started engineering work on Alpha. As a result, when the dev version of Alpha was ready, we were able to concurrently usability test the dev version of Alpha and the Principle prototype of Beta (see below for video).
Beta user flow
For Beta, users understood how to use the overall interface, but many were concerned that the bot wouldn’t actually understand many of their queries and would instead respond unintelligently, wasting users’ time as they experimented to learn the bot’s limitations.
Interestingly, in our in-person usability tests people said their experience with Alpha (a fully working, developed prototype) was great, but ultimately they did not use it out in the wild.
The A/B test results seemed to indicate that Nordstrom’s in-app voice functionality was destined to follow the crowd of other conversational features into the chatbot graveyard. I was hesitant to invest more deeply in the experience by adding a multi-turn conversational flow through the Beta design.
After some reflection, I realized that though the trend of chatbots quickly rose and fell in the past few years, one aspect of the trend stuck with users: platform-centric voice assistants, ready for commands at the tap of a finger or a vocalized trigger phrase ("Hey Siri," "Ok Google," or "Alexa"). Reviewing more publicly available market research revealed that the greatest motivation for using these products was the hands-free benefit, allowing users to multitask (e.g. setting a timer while cooking, or asking for the weather while getting dressed).
I then decided to pivot to an experience that optimized for the value proposition of hands-free, instead of forcing a user to unlock a screen and open an app to then use voice. I also thought about the value propositions of must-have applications (a list I gathered in my startup years): saves you time, saves you effort, saves you money, and makes you money.
What if we developed a feature that aimed for several of these? We felt the first two aligned most with Nordstrom’s core philosophy of customer service. An online personal shopper can (1) give you a running start on your shopping journey by narrowing results for you and (2) do the heavy lifting by rifling through all the merchandise to find what you're looking for. Furthermore, they can (3) help you fulfill your vision of what you need to buy, while not telling you what to wear or how to wear it, and (4) give you low-touch help, minimizing sales pressure.
After multiple rounds of usability studies, we landed on this voice-initiated personal shopping experience.
Investment in R&D for conversational shopping was sunset just before we launched this iteration to the public in spring 2019, but many of these insights paved the way for other kinds of product development at Nordstrom.