Project stats
Role: Lead product designer
Team: Me, 1 PM, 3 front-end, 3 back-end
Duration: ~7 months
The problem
How can we add a brand new way of talking to customers (voice) to a platform built around text and messaging?
How it was solved
Phase 1 — Proof-of-concept
A very limited feature set to introduce the concept of voice calls and gather stakeholder and agent feedback.
Phase 2 — The real MVP
A more robust voice offering with all the minimum out-of-the-box features contact centers require.
Phase 3 — AI and automation
Adapting LivePerson’s existing conversational AI features to voice scenarios.
Chat agents
Uses just about any text-based communication channel (web chat, SMS, WhatsApp, etc.)
Can handle a few to a dozen conversations at the same time
Response times are generally not immediate, giving the agent a moment or two to formulate an answer
Voice agents
Daily call volume varies greatly depending on whether the agent works in sales (25-70+ calls) or support (up to 20)
An agent can only ever handle one call at a time
Unsurprisingly, response times are instant once the call is answered
Blended agents
Some contact centers have "blended" agents that handle both channels simultaneously
To preserve quality of service (and the agent's overall sanity), blended agents are typically restricted to 1 voice call and 0-2 chats
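For the technically inclined, a rule like that usually boils down to a simple capacity check. Here's a rough TypeScript sketch with invented names; this is not LivePerson's actual routing logic:

```typescript
// Hypothetical capacity model, for illustration only.
type Channel = "voice" | "chat";

interface AgentLoad {
  activeCalls: number; // voice calls currently connected
  activeChats: number; // chat conversations currently assigned
}

// Blended agents: at most 1 voice call and 0-2 concurrent chats.
const BLENDED_LIMITS = { maxCalls: 1, maxChats: 2 };

function canAccept(load: AgentLoad, channel: Channel): boolean {
  if (channel === "voice") {
    // Never stack a second call on top of an active one.
    return load.activeCalls < BLENDED_LIMITS.maxCalls;
  }
  // A new chat is fine as long as the agent is under the chat ceiling.
  return load.activeChats < BLENDED_LIMITS.maxChats;
}

// Example: an agent on one call with one chat can take another chat, but not another call.
console.log(canAccept({ activeCalls: 1, activeChats: 1 }, "chat"));  // true
console.log(canAccept({ activeCalls: 1, activeChats: 1 }, "voice")); // false
```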
Due to the sheer size of this initiative and the effects it would have on the entire platform, an ultra-lite version of the call controls needed to be built and demoed to stakeholders, select customers, the entire C-suite, and even our board of directors.
Maximized call controls
The enlarged call controls can be moved anywhere the agent wants on their screen.
Minimized call controls
Anchored in the upper-right corner of the Agent Workspace, the minimized (and consolidated) controls are visible and interactable from anywhere in the platform.
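In implementation terms, "visible from anywhere" just means the controls live outside the routed page content, so navigation never removes them. A rough sketch of that idea, with invented element names rather than the actual Agent Workspace code:

```typescript
// Illustrative only: the minimized controls are created once at the app-shell level
// and positioned with fixed CSS, so navigating inside the workspace never removes them.
function mountMinimizedCallControls(): HTMLElement {
  const widget = document.createElement("div");
  widget.id = "minimized-call-controls"; // hypothetical id
  widget.style.position = "fixed";       // stays put while the agent scrolls or navigates
  widget.style.top = "16px";
  widget.style.right = "16px";           // anchored to the upper-right corner
  widget.style.zIndex = "1000";          // floats above any page content
  widget.textContent = "Active call · mute · hold · hang up";
  document.body.appendChild(widget);     // attached to <body>, not to any routed page
  return widget;
}

// Called once when the agent workspace boots; route changes re-render page content,
// but this node lives outside that content and remains interactable everywhere.
mountMinimizedCallControls();
```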
Made with the future in mind
My first concepts for our new dialer were designed with the “FINALfinal_final.jpeg” version in mind to preempt any future technical "surprises."
How did Phase 1 go?
This initial phase was less driven by real user metrics or KPIs than by whether we could introduce the concept well enough to earn buy-in from all parties involved and show off our ability to effectively add voice as a communication channel to the Agent Workspace.
Here are the stakeholders we gathered direct quotes from, either through interviews or presentations:
A LivePerson chat agent with prior voice experience
A customer that had requested voice integration
A member of our sales team
A member of the LivePerson C-suite
Our proof-of-concept suitably proven, it was time to flesh out and build what would be a more “traditional” and telephony-system-agnostic voice tool for contact centers.
Maximized call controls
Call history
Internal directory
Minimized call controls
Specific flows
Placing and Ending a call
Answer, record, mute, hold
Blind/Cold transfer
Warm transfer
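For anyone curious about the difference between the two transfer flows, here's a simplified sketch with invented types and method names; this is not the LivePerson or Avaya API:

```typescript
// The key difference is whether the agent speaks with the transfer target
// before handing the caller off.
interface CallLeg {
  hold(): void; // park this leg (the caller hears hold music)
  drop(): void; // disconnect this leg only
}

interface Telephony {
  consult(target: string): CallLeg;                                  // dial another agent/queue privately
  completeTransfer(caller: CallLeg, target: CallLeg | string): void; // bridge caller to target, release the agent
}

// Blind/cold transfer: the caller is sent straight to the target with no introduction.
function blindTransfer(t: Telephony, caller: CallLeg, target: string): void {
  t.completeTransfer(caller, target);
}

// Warm transfer: park the caller, brief the receiving agent, then connect the two.
function warmTransfer(t: Telephony, caller: CallLeg, target: string): void {
  caller.hold();                          // step 1: the caller waits on hold
  const consultLeg = t.consult(target);   // step 2: the agent briefs the target privately
  t.completeTransfer(caller, consultLeg); // step 3: caller and target are joined; the agent drops off
}

// Tiny mock so the sketch runs and prints the order of operations.
const mock: Telephony = {
  consult: (target) => {
    console.log(`consulting ${target}`);
    return { hold: () => console.log("hold"), drop: () => console.log("drop") };
  },
  completeTransfer: () => console.log("transfer completed"),
};

warmTransfer(mock, { hold: () => console.log("caller on hold"), drop: () => {} }, "agent-4021");
// Prints: caller on hold, consulting agent-4021, transfer completed
```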
It was our collective good fortune that LivePerson already had a whole suite of chat AI tools, including real-time alerts, bot delegation prompts, and knowledge base article recommendations. However, our demos and user observations showed that while the alerts and recommendations were helpful and accurate, they were not designed around the "urgency" of a live, instant voice conversation.
Voice conversations are fluid, natural, and lightning-fast, and we needed to apply the time-crunch of live calls to (nearly) all of LivePerson’s existing AI chat features.
Rather than just sticking to a single script for the whole conversation, the Guided Workflow tool can update its steps if the topic of conversation abruptly changes.
Orgs can create their own automated workflows that agents can run directly within the widget (e.g., flight booking, customer credit card background approval).
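As a rough illustration of that behavior (hypothetical topics and step names, not the real Guided Workflow implementation), the visible steps are derived from the currently detected topic instead of being fixed when the call starts:

```typescript
// Illustrative sketch only: a mid-call topic change swaps in a different workflow
// instead of forcing the agent to abandon a single fixed script.
type Topic = "flight_booking" | "refund" | "card_approval";

const WORKFLOWS: Record<Topic, string[]> = {
  flight_booking: ["Confirm traveler details", "Search flights", "Take payment", "Send itinerary"],
  refund: ["Locate order", "Check refund eligibility", "Issue refund", "Confirm timeline"],
  card_approval: ["Verify identity", "Run background approval", "Communicate decision"],
};

let currentTopic: Topic = "flight_booking";

function onTopicDetected(topic: Topic): string[] {
  if (topic !== currentTopic) {
    currentTopic = topic;         // the conversation changed direction mid-call
  }
  return WORKFLOWS[currentTopic]; // the widget re-renders these steps
}

console.log(onTopicDetected("flight_booking")); // initial script
console.log(onTopicDetected("refund"));         // abrupt topic change, new steps appear
```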
Due to the rapid-fire nature of voice conversations, in-line alerts would disappear out of view almost as soon as they came in.
Alerts were moved to their own section of the Copilot widget in a timeline view, and new alerts would always appear within the widget no matter what page was open.
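Conceptually, the change amounts to appending alerts to a persistent timeline that the always-visible widget renders. A small sketch with invented names, not Copilot's real code:

```typescript
// Illustrative only: alerts accumulate in a persistent timeline that the Copilot widget
// renders, so a fast-moving call can't push them out of view before the agent sees them.
interface CopilotAlert {
  receivedAt: Date;
  message: string;
}

const alertTimeline: CopilotAlert[] = [];

function pushAlert(message: string): void {
  alertTimeline.push({ receivedAt: new Date(), message });
  renderTimeline(alertTimeline); // the widget stays mounted no matter which page is open
}

function renderTimeline(alerts: CopilotAlert[]): void {
  // Newest first, so the latest recommendation is always at the top of the widget.
  [...alerts]
    .sort((a, b) => b.receivedAt.getTime() - a.receivedAt.getTime())
    .forEach((alert) => console.log(`${alert.receivedAt.toLocaleTimeString()}: ${alert.message}`));
}

pushAlert("Customer mentioned a cancellation fee: suggested knowledge base article");
```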
Instead of letting the agent freeze with indecision, LivePerson just tells them what to say.
Fills up as the voice conversation goes on.
Primarily used as an area for the agent to receive topic summaries, bot updates, and messages from their manager(s).
Turn the Unified Workspace into a true BYOPS (bring-your-own-phone-system) platform
The team and I partnered with Avaya from the beginning of this project, but the integration was always meant to accommodate any phone system you throw at it.
Fortunately, voice calls are handled in nearly the same way across all major voice providers.
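That similarity is what makes a "bring your own phone system" model plausible: the workspace can talk to one narrow adapter interface, and each provider supplies its own implementation. The sketch below is purely illustrative (hypothetical names, not LivePerson's integration layer):

```typescript
// Illustrative adapter sketch: the call controls depend on this interface only,
// and each phone system (Avaya or otherwise) plugs in behind it.
interface VoiceProviderAdapter {
  name: string;
  placeCall(phoneNumber: string): Promise<string>; // returns a provider-side call id
  answer(callId: string): Promise<void>;
  hold(callId: string): Promise<void>;
  hangUp(callId: string): Promise<void>;
}

// A stand-in implementation used here only so the example runs.
const demoProvider: VoiceProviderAdapter = {
  name: "demo",
  placeCall: async (phoneNumber) => {
    console.log(`dialing ${phoneNumber}`);
    return "call-1";
  },
  answer: async () => console.log("answered"),
  hold: async () => console.log("on hold"),
  hangUp: async () => console.log("hung up"),
};

// The call controls never care which provider is behind the adapter.
async function demo(provider: VoiceProviderAdapter): Promise<void> {
  const callId = await provider.placeCall("+1-555-0100");
  await provider.hold(callId);
  await provider.hangUp(callId);
}

demo(demoProvider);
```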
Managing your phone system within LivePerson
To combat whiplash from needing to manage both the phone system and LivePerson at the same time, some of the larger day-to-day features (directories, dispositions, etc.) will be manageable directly within LivePerson.
Improved workflow automations based on real-time transcription
Rather than the system just hearing a trigger word and presenting “Step 1” of a workflow to the agent, entire processes can be queued up from the live transcript and initiated by the agent when ready.
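A rough sketch of that direction (invented trigger phrases and function names, purely illustrative): transcript segments are matched against workflow triggers, and matching workflows are queued for the agent to launch when the moment is right:

```typescript
// Illustrative only: whole workflows are queued for the agent to start with one click,
// instead of surfacing only "Step 1" the moment a keyword is heard.
interface WorkflowTrigger {
  workflow: string;
  phrases: string[];
}

const TRIGGERS: WorkflowTrigger[] = [
  { workflow: "Book a flight", phrases: ["book a flight", "change my flight"] },
  { workflow: "Credit card approval", phrases: ["apply for a card", "credit card application"] },
];

const queuedWorkflows: string[] = [];

// Called for every real-time transcript segment as the call goes on.
function onTranscriptSegment(text: string): void {
  const lowered = text.toLowerCase();
  for (const trigger of TRIGGERS) {
    const matched = trigger.phrases.some((phrase) => lowered.includes(phrase));
    if (matched && !queuedWorkflows.includes(trigger.workflow)) {
      queuedWorkflows.push(trigger.workflow); // queued, not auto-started
    }
  }
}

// The agent initiates the queued processes when the moment is right.
onTranscriptSegment("I'd like to change my flight and maybe apply for a card");
console.log(queuedWorkflows); // ["Book a flight", "Credit card approval"]
```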