
How do you get your users to actually use your union digital project?

Digital change projects are only partly about making sure a new bit of technology works. Even the best-specified new tech tools won't be any use if people don't agree to change their current behaviour and engage properly with a new system or process.

When you're trying to get someone to do something they wouldn't otherwise do, you've got two options. You can motivate them to get over the barriers to taking the action you need. Or you can make the barriers much easier to get over.

You don't have to take my word for this. Stanford Uni's BJ Fogg has a model to describe it. He proposes that for someone to do something new, they need three things: a prompt to do it, the motivation to do it, and the ability to do it (ease of use).

BJ Fogg behaviour model

Even if something is easy to do, if a user has no motivation to do it, they probably won’t do it. And conversely, for even a keen activist, there’s a limit to how labour-intensive or counter-intuitive you can make a task before they drop out. 
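That trade-off can be sketched as a toy function. To be clear, this is my own illustrative rendering, not Fogg's formalism: the 0–1 scales, the multiplication and the threshold are all invented just to show the shape of the idea.

```python
# Toy sketch of the BJ Fogg behaviour model (B = MAP): a behaviour
# happens when a prompt arrives while motivation and ability together
# put the user above their "action line". All numbers are illustrative.

def behaviour_occurs(motivation: float, ability: float, prompted: bool,
                     action_line: float = 0.25) -> bool:
    """Return True if the prompted behaviour clears the action line.

    motivation and ability are on an arbitrary 0-1 scale; the key
    property is the trade-off - high ability compensates for low
    motivation, and vice versa.
    """
    if not prompted:
        return False  # no prompt, no behaviour, however keen the user
    return motivation * ability >= action_line

# An easy task still fails without motivation...
print(behaviour_occurs(motivation=0.1, ability=0.9, prompted=True))   # False
# ...and a keen activist still drops out if the task is too hard.
print(behaviour_occurs(motivation=0.9, ability=0.1, prompted=True))   # False
# Both together, plus a prompt, clear the line.
print(behaviour_occurs(motivation=0.7, ability=0.7, prompted=True))   # True
```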

Start by thinking about who your users are. Who needs to change their behaviour for this to go well? Who is actually doing the day-to-day work in this system? You might have internal users for your project, external ones, or in many cases both.

When you’ve found out who your users are, have a look at them through this prism:

  • What is the action you need from them? 
  • What’s the context they’ll be in when you are prompting them to act? 
  • Can they clearly see why they’re being asked to do it? 
  • And have you made it painless enough for them to do? 

Look at the problems that you are trying to tackle with your intervention. Who is each of them a problem for? Look at the opportunities that the change could create. Who benefits from each one? How might these factors help you establish more motivation for your users?

Testing prototypes is also useful here. You can start with best guesses on common practice for doing something, but always check your assumptions with real users. Showing it to a few people, even in a paper format, could remove barriers you didn't realise would be a problem. It's easy to think it is obvious how to do something, when sometimes it's only really obvious to you, given the insight you already have in this area.

And if you try something and it doesn’t seem to work in testing, don’t just cross the idea off. Always try to ask probing questions to find out why your users don’t like it, or weren’t able to understand a particular part of the interface or step in the process.

It might be you’re onto a hiding to nothing and you need to come at the problem again with something totally different. But it might equally be that you can increase the conversion rate to something useful by tweaking one of those axes – motivation or usability.

Putting it into practice: WorkSmart employer search tool

As an example, on the WorkSmart app project, we needed to find a way to identify users' industries, jobs and workplaces. Our plan is that if we can find out more about what users have in common in a work context, we can offer better-tailored content and ultimately start putting worker communities together.

This is a trickier thing to do than it sounds at first. The way people describe their job isn't the same as the way your union might describe them. There are almost as many job titles out there as jobs. They also won't describe their location, employer and industry the same way as you, or as each other.

Standard classifications don't always help here – SIC codes, ISIC, NACE, SOC codes, ISCO. The one thing the standards have in common is that they're confusing for most people. A worker might say they are a barista for Starbucks, but their employer thinks they are a temporary team member, employed by their agency subcontractor, and a union thinks they're a hospitality sector worker in their southern region.

So we asked real workers how they described where they worked. And we found it was more likely to be in terms of the particular workplace. For example, “I work at Starbucks in Bermondsey”.

We then looked at how we could make the act of classification easier for the worker, using conventions that are already out there and familiar to them. We hit upon the Google Maps API. People are used to searching for a business and picking it off a list, whenever they use a website store finder for example. Google lists most employers in its database, and employers themselves will curate their own keywords, categories and locations.

We built in an API query to Google, to return a list of workplaces that looked relevant to a search. The user can quickly confirm which is right. This is great because we can then also automatically pull in other information that Google links to that workplace – the business category and the location. We only have to ask for one piece of info to get useful data that could help us create different types of community.
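As a rough sketch of that step: the function below turns a Places-style search response into candidate workplaces the user can confirm, keeping the category and location data we get for free. The field names (`results`, `name`, `types`, `formatted_address`) follow the Google Places Text Search JSON shape as I understand it, and the sample data is invented for illustration; our actual implementation differs.

```python
# Minimal sketch of the workplace-lookup step, assuming a response in
# the shape of the Google Places Text Search API. The sample below is
# invented - it is not real API output.

def parse_workplaces(places_response: dict) -> list[dict]:
    """Turn a Places-style search response into candidate workplaces
    for the user to confirm, keeping the extra data each result
    carries: the business category and the location."""
    candidates = []
    for result in places_response.get("results", []):
        candidates.append({
            "name": result["name"],
            # First entry in "types" as a rough business category.
            "category": (result.get("types") or ["unknown"])[0],
            "location": result.get("formatted_address", ""),
        })
    return candidates

# Invented sample: what a search for "Starbucks Bermondsey" might return.
sample = {
    "results": [
        {"name": "Starbucks", "types": ["cafe", "food"],
         "formatted_address": "Bermondsey St, London SE1"},
    ]
}
for workplace in parse_workplaces(sample):
    print(workplace)
```

The point of the design is in that last dict: one search box for the user, three useful fields for us.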

We tested this mechanism with real users, first on paper and then with a bare bones working tool, and we found they COULD do it pretty easily. We’d made a start on one side of our axis.

The next problem was that they didn’t WANT to do it. Making it easy to do wasn’t enough.

There are good reasons for this. People are understandably worried about sharing data, especially data that could be sensitive to them like their employer. When we tested putting this field into the initial account creation form, people simply stopped signing up for accounts.

So we looked at motivation too. We tested ways to ask for the information at the right time. People were more likely to give us the data if they could see a clear reason for doing it (that it’s not just a data grab). 

We incorporated the field into our salary checker tool. Where people wanted to compare their salaries with others in a similar situation, they could see more logic to providing that information. They also hopefully have a little more trust in us by that point, as it’s not the first thing they’re likely to try out with us. Context and trust were both ways to tackle their level of motivation in doing the task.

There's still a big drop-off in people providing this information, but we've moved on from an initial situation where everyone dropped out, so that's real progress. Now we're back to working on ways to increase ease of use and motivation, and push take-up even further.