
Argos Assistant

The goal was to make a quick, easy, voice-led user experience that let the Argos tone of voice come through in every interaction. We wanted the Argos Assistant to feel fluid and uncomplicated, whether you were using it at home or on the go. The skill is able to learn via NLP, as well as perform error handling, validation and response.




UX Manager


Alex Ayres | Louise Webber | Chris Crispin |  Kat West | Zubar Miah


Talking is far easier than whipping out your phone and navigating through a bunch of screens and typing letters with your thumbs. We’d already seen a big demand for voice to add value in the shopping journey and to facilitate a smart home, as over 10% of consumers in the UK own a smart home device.

We wanted to build a valuable service and product for our customers, whilst keeping our MVP minimal and focused on the most important things, like easily finding the right product and making reservations, so that users could easily understand how to use it.

Argos Assistant
The problem & process

Our user problems were two pronged:

  1. I am a customer who is time poor, and I want to be able to reserve products quickly, easily and hands-free.

  2. I am a customer who is elderly or sight disabled. I find online shopping tedious and taxing and the process of browsing a website error prone, cumbersome and time consuming.

From there, we knew that creating the new Argos Assistant required close collaboration between different teams, specifically Google AI, our in-house Engineering team, and the UX team. It was important to bring everyone on the journey, showing the value of applying design thinking to create a solution that solved the user problems while also being technically feasible.

Creating scripts, table reads & VUI

By reading scripts aloud, we improved the flow, making the conversation feel more natural. We worked with our UX copywriter to craft several versions of the final voice interaction. By running through flows that were unscripted on the user's part, we teased out responses that had not been anticipated in the original design, helping us reduce errors.

We realised that NLP is not as advanced as we'd like it to be: the service struggled with brand names and more complex language, which resulted in errors and abandonment by test users.

We designed "earcons", branded sounds that identify the Argos Assistant when it is in use. We also iterated on the Assistant's welcome screen (when used via mobile), to ensure that the look and feel reflected our brand and complemented the VUI.
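At its simplest, an earcon is just a short, distinctive audio cue. The stdlib-only sketch below writes a two-note chime to a WAV file; the real Argos earcon was produced to match the brand, and the pitches and durations here are invented for illustration.

```python
# Purely illustrative: generate a tiny two-note "earcon" chime as a WAV file.
# All parameters (notes, duration, sample rate) are made-up example values.
import math
import struct
import wave

def write_earcon(path: str, notes=(660.0, 880.0), note_secs=0.15, rate=22050):
    frames = bytearray()
    for freq in notes:
        n = int(rate * note_secs)
        for i in range(n):
            fade = 1.0 - i / n  # linear decay so each note doesn't click
            sample = int(12000 * fade * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", sample)  # 16-bit signed little-endian
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

write_earcon("earcon.wav")
```

A rising interval like this is a common choice for "I'm listening" cues, since it reads as a question rather than a statement.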


The Invisible Interface

It's not easy to design something you can't see, much like the backend that powers the front end. Mapping out the flows, you realise how many calls our systems would have to make. Through tireless testing and iteration we were able to design an MVP solution that addressed the aforementioned user pain points.
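One way to make an invisible interface mappable is to write the conversation down as an explicit state machine. The sketch below is illustrative only; the states and intents are hypothetical stand-ins, not the production Argos flow.

```python
# Illustrative sketch: a voice reservation flow as a tiny state machine.
# State and intent names are hypothetical examples.
FLOW = {
    "welcome":      {"search_product": "results"},
    "results":      {"pick_item": "store_choice", "search_product": "results"},
    "store_choice": {"confirm_store": "confirm"},
    "confirm":      {"yes": "reserved", "no": "results"},
    "reserved":     {},
}

def advance(state: str, intent: str) -> str:
    """Return the next state; an unmapped intent keeps us put (a re-prompt)."""
    return FLOW[state].get(intent, state)

# Walking the happy path surfaces, one transition at a time, the backend
# calls the flow implies (product search, stock check, reservation).
state = "welcome"
for intent in ["search_product", "pick_item", "confirm_store", "yes"]:
    state = advance(state, intent)
```

Laying the flow out like this is what exposes the call volume: each transition typically maps to at least one backend request, so the table doubles as a checklist of the services the skill depends on.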

Testing and establishing success metrics

We tested different iterations of the solution ahead of launch. This helped us refine the final solution and prove our hypotheses against our established customer problem statements. We tested the Argos Google Action with real consumers in a contained, ring-fenced environment.

Over several weeks we ran an internal Argos audit: employees from all departments reported issues and feature gaps, which were then prioritised for resolution.


Ahead of launch we established a series of success metrics, which included number of users (new user adoption) and initial engagement, in order to track incremental growth, reduction of calls to our call centres and change on AOV (average order value).


New users in its first week


Call centre call reduction, adding up to £3M in incremental savings


We saw our Average Order Value (AOV) go up from £44

Next steps: Improving the MVP

We'll be focusing on three main areas: Content, Tech and Reducing Contact. Our next steps are:

  • Continue to answer customer queries and meet customer intent by optimising our content to match.

  • Optimise our help services to accommodate voice search, where queries tend to be around 2x the length of keyword searches. Make sure we are catering to voice customers, who have higher intent and search in question form.

  • Explore scalable solutions that enable us to respond to intent across a variety of channels and touchpoints, serving customers from a platform-agnostic point of view, not just on Google Home devices.

  • Intelligent query handling to free up our customer contact agents to help customers when it really matters.

  • Integrating bots within our Live Chat and social messaging APIs.



Obstacles, early adoption & growing pains

We were among Google's first partners, and Google dictated many of the design requirements, making it hard for the Argos brand to shine through.

Google's platform had lots of bugs, and we needed to incorporate a lot of workarounds to get a working version. The system was changing underneath us as we designed and coded: we would test a few days later and realise that behaviours had changed. Google was optimising in parallel to us, which made things really tricky.

Argos did not have the underlying services to support many of the new features we wanted to launch within the Argos Assistant, making it harder to deliver personalised experiences quickly.

We were also constrained by a very restrictive CMS that did not allow us to build pages for editorial use, so making the page look modern was a challenge with very limited developed components and lots of restrictions.

I learned many valuable lessons during this process. Next time we'll avoid starting to code without structure or direction, and take the time to untangle data and run a proper kickoff. For Day 2, I will encourage my team to think more about the essence of conversational design and how we can optimise and explore the channel further, while working with our partners at Google to better understand the underlying tech and its limitations.
