The public sector digital transformation trap and how to avoid it

#1 Chatbot Trouble

It’s common to hear stories where organisations have jumped into implementing a solution (often digital), perhaps dazzled by the patter of a shrewd salesperson or the desire to look good, without really thinking about the people they’re designing the solution for or the problem they’re trying to solve.



These stories usually end in some expensive solution that doesn’t quite do what it’s meant to do - which is a waste of time and money. Perhaps organisations like to sweep these potentially embarrassing stories under the carpet - never to see the light of day for fear of looking bad.



I’ve noticed a pattern in these stories.



So, working on the basis that we learn more by hearing where things went wrong than we do by hearing the perfectly polished case study, I thought I’d start a blog post series to uncover the reality of what goes on inside many organisations.



I’ve interviewed people anonymously to share their stories.



This first interview is from an anonymous source in a public service organisation that implemented a chatbot service on their website.



It wasn’t an overwhelming success - but why...?


What would you like to tell us about?

A chatbot function on our website. Its purpose, I guess, is to reduce or change the type of demand on our stretched services - so that simple transactional needs are dealt with automatically by the chatbot, freeing up the humans to deal with more complex issues.


[Image: illustration of a hand holding a mobile phone, representing a chatbot conversation with a person]

Can you start by setting the scene?

So, we know there is an absolute need to review the website content, its structure and the way in which people access our services. 

We know this because we’ve looked at data analytics, surveyed people, carried out in-depth interviews, and journey-mapped how people move through life events and how they did (or didn’t!) get the support we offer.

In one particular area, for example, we have over 100 pages of content, only four of which are commonly looked at by people accessing the service. The structure of the website doesn’t help people to find the information they need or enable them to self-serve a full end-to-end digital journey. Of the 30 or 40 customer needs we tested to see if it’s possible to make a complete digital journey, we failed on every single one. 

Analytics tell us that, currently, people often use the website simply to find out how to contact the organisation by phone to get advice or access services. This invites phone contact for even the most mundane of things.

All this is happening in the context of raised citizen expectations of digital public services. When we built the website, it was all about providing information to people - and we organised it in a way that made sense to staff. Now, customers expect to also be able to access help and support digitally - and, if possible, complete the entire journey online.  Covid-19 has accelerated this shift.

So, there is an evidence base and ongoing work to review our website and the way in which people interact with it and our services more widely.




OK, so what happened with this chatbot? How did it come about?

This project, I think, would best be described as a whim. A senior manager saw other similar organisations with chatbots - and wondered why we didn’t offer that here.



So, without much further ado, the idea of a chatbot turned into a project. And we went from zero to “well, not hero” - delivering an output in a short timescale. No links were made with the in-depth work that was ongoing. It was literally a case of, ‘let's implement this gadget.’

A project board was set up to deliver this solution, led by people who are very good at getting things done, but none of them knew or thought about user testing, setting it up as a proper project, or connecting it to other ongoing work. 

So you have this project group set up with one person essentially tasked to do the actual implementation alongside their day job, and a project board to lead it. The capability of these leaders around user-centred ways of working and digital wasn’t as mature as it could have been.

The person in the implementation role had limited ability to influence and engage the website content owners to change or update information on the website, because no one responded to requests to update their content.

And anyway, it wouldn’t be easy to change their workflow to enable a truly user-centred, end-to-end digital redesign in such a short period of time.

This member of staff is now spending time pointing the bot to the right places on the website, based on what people are typing into it, to help it learn.

How well does the chatbot work?

I have made a series of simple support requests using this chatbot. It hasn't been able to correctly respond to a single one of my questions. 



I also had a go earlier today and it just took me to a web page that didn't give me the information that I was after.  Or if it did, it was really hard to understand.

That’s part of the problem.  The website is out of date and doesn’t give customers what they need.

Even if the individuals responsible for their topic areas had updated their information, we know from the in-depth work to date that the structure of the information on the website isn’t helpful. So asking for information to be updated and still presenting it in the same unhelpful structure is unlikely to be all that beneficial.

We know, for example, that people want to use the website to find out if they are eligible for our support. If they are, how to access it. And if they’re not eligible, what else might be available to help them.

None of the work has been done to restructure the content to meet this need. So, adding a chatbot - it's just like putting lipstick on a pig if it’s not designed properly.

[Image: lipstick on a pig]


How well does the chatbot achieve the intended outcome?

It’s hard to answer that because the intended outcome was never stated at the start. 

[Image: a red Ferrari]

If the intended outcome was - ‘let’s have a chatbot that makes us look as if we are keeping up with our contemporaries’, then I guess it's been successful.


But it’s a bit like having a Ferrari parked in the driveway, yet when you lift the bonnet (or hood for our friends in the USA!), there’s nothing mechanical underneath so you can’t actually get anywhere!

If the intended outcome was - ‘to better help people access support and find the information they need, when they need it’ - we can’t say for certain, because nobody is testing it.

We’ve done the traditional public service thing. We’ve delivered a thing, so that’s it. Job done.

Personally, I’d say we’ve achieved little of any value. All we’ve done is install a second search option - effectively adding another layer of complication by giving people two ways to search.

When testing it, it just left me feeling frustrated.

To answer this question, we need to do user research to find out how people are using it and whether it’s helping them achieve their goals.

You and I have proved in our small sample of two that it potentially makes things worse. It adds confusion and compounds customers’ frustration when they try using it and reach a dead end, get taken to an unhelpful web page or get sent into a chatbot loop. It’s just a solution without understanding the problem.


Were there positive things that came out of this?

Reputation. The reputational boost of being an organisation that has a chatbot. But it’s a bit like the Ferrari on the drive: it looks good from afar, but when you try using it, it goes nowhere.

We can say we've done it. And, you know, if you go high enough up in an organisation - sometimes symbols matter. Even if those symbols aren't as effective as they could be. 

There is some interesting and useful information that has come out of this, though: we can now see what questions people are asking the bot.

As an organisation, if you were to do this again, how would you approach this work differently?

I would apply a project management approach. And I’d start with an understanding of the problem we’re trying to solve and the outcome we want to achieve so that we can measure success. 

I would build a team of people who understand user-centred design (and that’s not just because it’s you, Jo!). And I would build on the work that’s already been done, including more user research and making sense of our data analytics, to understand the problem we’re trying to solve here and what people need from our services.

There would absolutely be service engagement - I would involve a wider team beyond the core project delivery team and encourage input from multiple services. These services need to be engaged in this work if we’re to provide a comprehensive chatbot that answers people’s questions and responds to their needs. They will also be responsible for keeping content up to date in a way that enables customers to meet their information needs and access the support they need.

We need to give more prominence to the design principles we have in place in the organisation and to act on them.

I think we need to zoom out a bit and start by thinking about our customers: what they need from our services, including the digital and non-digital channels, and how that all feeds into the service design. Then, if we need a chatbot, it can be included in the way the service is designed. A chatbot is a ‘nice to have’ - it’s not essential when we haven’t ‘fixed the plumbing’ yet.

This would obviously take a lot longer to do - but would probably result in a whole end-to-end service that is a lot more effective than what we have currently.

We also need to think about what happens afterwards - after customers use the chatbot or website to do their thing - so we don’t create bottlenecks elsewhere in the customer’s journey, or leave customers phoning us because we’ve failed to keep them informed of progress.

What would you say the organisation's greatest learning point is from this whole experience?

The organisation hasn’t learned a thing because it hasn’t looked. When we say ‘the organisation’, we’re basically talking about the leaders who make the decisions. They haven’t looked. And if they didn’t consider the nature of the problem they were trying to solve, how can they reflect and learn? So, the danger is that this approach will be repeated. You could argue this has been the approach of many public services for a long time.

There is a widespread lack of understanding of how to manage an improvement project - how to take a user-centred view of services and the benefits of working that way. By ignoring the user-centred view, they’re building all sorts of unnecessary complexity which is inefficient and ineffective - causing people to feel frustrated.

[Image: ServiceWorks Analysis - a triangle with ‘solution’ at its centre and ‘user’, ‘problem’ and ‘solution provider’ at its corners]

How can you avoid this happening in your organisation - or if it’s happened already, how can you think differently next time?

When we think about solutions to design challenges, we think about fit. How well does the intended solution fit:

  • the problem

  • the solution provider (the organisation)

  • the people who use the service?

Solution - problem fit: is the solution the right fit for the problem?

This is why in a design process, we’d spend time at the start in ‘discovery’, trying to understand the problem well enough.

The risk if you don’t do this: if you design a solution without a good understanding of the problem, how will you know whether you’ve been successful?

You could just be wasting your time.

Solution - user fit: is the solution the right fit for the user?

The solution needs to meet the needs, capabilities and goals of the user. This is why in a design process, we’d spend time doing user research to gain a deep understanding of these things.

The risk if you don’t do this: a solution that isn’t a good fit for the user won’t meet user needs, won’t get used, excludes people and creates service failure.

You could just be wasting money and resources.

Solution - solution provider fit: is the solution the right fit for the organisation providing it?

Without a good fit here, the solution is unlikely to be sustainable or even viable in the first place. This is the ‘fit’ that many organisations focus on while making assumptions about what the user needs or what the problem really is.

It’s clear that in the case of this frustrating chatbot, the organisation did just this - focused on its own need to look good while making assumptions about the problem and user needs.

The aim isn’t to get a perfect fit across all three, but to have a good enough fit and to be mindful of this concept when designing services for people. Understanding these things takes time and resources, but results in solutions that should work better for people.

Public sector digital transformation isn’t about putting a digital layer over a badly designed service. It’s about putting people before tech, designing services around the needs of the people who will use them, and including a digital channel in the end-to-end service if required.

Adopting user-centred ways of working takes a different set of skills and capabilities, which we’re helping organisations to develop through our brilliant courses and expert coaching.

Was this valuable?

Stay in the loop, join our community and never miss another post!


If you’re interested in developing your skills to design and improve your services in a way that puts the user at the centre of the design process, check out all our courses or sign up for our next learning and development programme, Service Design in Practice, starting every March and September.

Open for enrolment now!


If you have a story like this you’d like to share, we’d love to hear from you!

We want to put a spotlight on your digital endeavours - especially if you’re from a not-for-profit organisation like a charity, council, housing association or public service.

Perhaps you have a similar story to this one - where your organisation has jumped straight to a digital solution without much thought about the people who will use it.

Alternatively, you might have a story with a positive outcome - we know it’s not all doom and gloom. We’d love to hear from you - either way.


How it works

We’ll organise a 45-minute interview with you.

Then we’ll turn it into a blog post, share it back with you for final edits and publish it.

You don’t have to reveal your identity or the organisation in the published post.

Easy!
