How To Drive and Measure User Experience - Part 2
September 19, 2019

Ron van Haasteren
TOPdesk


Periodic measurement means examining your services regularly, through a survey, for example. Because periodic measurements can be fairly general, how you phrase your survey questions to users matters. "How do you rate our services?" will not suffice. You must dive into the various aspects or themes of the service so that you can gauge the authentic user experience.

Start with How To Drive and Measure User Experience - Part 1

According to the well-known SERVQUAL research model, customers typically consider five main themes when experiencing a service. These are:

Reliability: to what extent does the service desk stick to agreements? Are they dependable and accurate?

Assurance: how do users experience the expertise and courtesy of the service desk? Do they convey trust and confidence?

Responsiveness: how quickly are services supplied?

Empathy: to what extent do users feel that service desk employees care and personalize their approach?

Tangibles: where can users find the physical objects they use? Are the services clearly presented?

Here are some examples of actual questions I have used with customers to gauge these five themes:

Reliability:

Agreements made are always honored by employees.

The service quality is the same no matter which employee is helping me out.

Assurance:

The employees are skilled enough and have sufficient knowledge to answer my questions.

The employees are always courteous in helping me.

Responsiveness:

My matters are always processed promptly.

I am always kept updated about the status of my calls.

Empathy:

Employees can put themselves in my shoes; they understand my pain.

I always feel approached individually; employees are not following a script.

Tangibles:

It is clear to me which products and services the department can supply.

Verbal and written communication is proper and professional.
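As a minimal sketch (the data structure and function names here are illustrative assumptions, not part of any TOPdesk product), responses to statements like these, scored on a 1-5 Likert scale, can be rolled up into an average per SERVQUAL theme:

```python
from statistics import mean

# Hypothetical survey responses: each respondent maps a SERVQUAL theme
# to the 1-5 Likert scores they gave for that theme's statements.
responses = [
    {"reliability": [4, 5], "assurance": [4, 4], "responsiveness": [3, 4],
     "empathy": [5, 4], "tangibles": [4, 3]},
    {"reliability": [3, 4], "assurance": [5, 5], "responsiveness": [4, 4],
     "empathy": [3, 3], "tangibles": [5, 4]},
]

def theme_averages(responses):
    """Average score per theme across all respondents."""
    totals = {}
    for response in responses:
        for theme, scores in response.items():
            totals.setdefault(theme, []).extend(scores)
    return {theme: round(mean(scores), 2) for theme, scores in totals.items()}

print(theme_averages(responses))
```

A per-theme breakdown like this makes it easier to see which of the five areas needs attention, rather than a single overall satisfaction number.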

Now how do you approach this? Let me share some tips that helped us achieve a 40 percent response rate:

Keep the research simple. Do not create a massive survey no one wants to fill out; a short survey increases response rates.

Communicate your goal. Make it clear what you want to achieve and what's in it for the user as a result. You want to provide the best service possible, but you need their input to do that.

Mind the timing. The moment of measurement, the time of day, and the duration of the survey are critical. Send the emails when users have time to respond. Incentives help, too.

Continuous measurement provides more regular detail about a specific service. In these cases, the user has just received a particular service, the move to a new workstation, for example. Immediately afterward, ask the user how the experience was. This type of measurement can be service-specific or even operator-specific. Because the interaction happens at the point of service, it gathers near real-time results throughout the year; this is accomplished by sending a short digital survey to the user immediately after the contact.

These surveys are quick and simple to encourage the user to complete them. In this situation, the customer has just experienced the service, so they can easily remember what they did or didn't like. Because of this, you can be specific with your survey, leading to some concrete feedback.

An excellent metric for continuous measurement is the customer effort score, which measures the effort required by the customer to find a solution to their incident or issue. The customer effort score asks the question: How much effort did you experience accessing this service? You want the experience to be as effortless and pain-free as possible.
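As a rough sketch (the 1-7 scale and the averaging formula are assumptions; organizations score customer effort in different ways), the customer effort score is often reported as the average of responses to the effort question:

```python
from statistics import mean

# Hypothetical responses to "How much effort did you experience
# accessing this service?" on a 1-7 scale (1 = very low effort).
effort_scores = [2, 1, 3, 2, 5, 1, 2]

def customer_effort_score(scores):
    """Average reported effort; on this scale, lower is better."""
    return round(mean(scores), 2)

print(customer_effort_score(effort_scores))
```

Tracked per service over time, a rising average flags where users are starting to struggle, which is exactly the early-warning signal continuous measurement is meant to provide.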

Continuous measurement provides the most up-to-date overview of your performance and helps identify whether or not you are on track to meet your department's goals. This can turn into a satisfaction key performance indicator (KPI). Satisfaction KPIs are often used by organizations that are in control of their services: services are defined in a service catalogue, so the customer effort score can be measured for each service and the necessary improvements implemented.

The Votes Are In, What Now?

Once the results are in, you must communicate these within your department. Give your team members credit when they are doing well, and together — without judgment — attempt to identify areas needing improvement.

From here, as a team, come up with a plan and work together on it. Doing so encourages team building, as your team feels involved and motivated to make the improvements. Try to see this exercise as a way to gain valuable input and engagement from the user. If a user provides feedback, see it as a gift rather than a complaint.

The next step is to communicate with the user. This also lets you make a great impression with your department, because doing so tells the user: We are listening to you. We value what you have to say.

There is no need to share all of the ins and outs required to address the concern. Provide users with the information they need in an easy-to-digest format. For bulk actions that affect multiple people or most of the organization, consider placing a poster by the coffee machine, for example, or sending an organization-wide communication.

The types of results shared may include updates on system outages or network updates, changes to interacting with the service desk, or even instructional information for managing a particular process, solution, or technology platform. Anyone requiring individual attention should be communicated with directly, in a non-public format, either through the service portal, electronic communication, or face-to-face.

Taking these steps lets users know that they and their feedback are valued. Action must be taken immediately once a user reports an issue. Quick responses to user issues show that you can take immediate action when required.

Takeaways

Gaining insight into the user experience, by actively asking for it, can be tremendously helpful to your efforts. Your metrics can influence behavior within your service department. Striving for a better user experience goes hand-in-hand with taking emotional metrics into account, as well as actively discussing them with your team.

When defining metrics, first focus on the metrics that matter most for your organization. This requires investigations and surveys to identify them. Implement these metrics in a phased approach. You cannot do everything at once, and you want to do things the right way.

Finally, be transparent about what you are trying to achieve and let your users know. When users feel like their feedback matters, they will be much more willing to keep giving their feedback, which helps you improve the organization for everyone.

Ron van Haasteren is the Global Culture Strategist at TOPdesk
