How To Drive and Measure User Experience - Part 2
September 19, 2019

Ron van Haasteren
TOPdesk


Periodic measurement means examining your services regularly, through a survey, for example. Because periodic measurements can be pretty general, how you phrase your survey questions to users matters. "How do you rate our services?" will not suffice. You must dive into various aspects or themes of the service so that you can gauge authentic user experience.

Start with How To Drive and Measure User Experience - Part 1

According to the well-known SERVQUAL research model, there are usually five main themes the customer thinks of when experiencing a service. These are:

The service desk's level of reliability: to what extent do they stick to agreements? Are they dependable and accurate?

The level of assurance: how are the expertise and courtesy of the service desk experienced? Moreover, do they convey trust and confidence?

The level of responsiveness: how quickly are services supplied?

The level of empathy: to what extent do I get the feeling that the service desk employees care, and are they personalizing their approach towards me?

Tangibles: where can I find the physical objects I use? Are the services clear?

Here are some examples of actual questions I have used with customers to gauge these five themes:

Reliability:

Agreements made are always honored by employees.

The service quality is the same no matter which employee is helping me out.

Assurance:

The employees are skilled enough and have sufficient knowledge to answer my questions.

The employees are always courteous in helping me.

Responsiveness:

My matters are always processed promptly.

I am always kept updated about the status of my calls.

Empathy:

Employees can put themselves in my shoes; they understand my pain.

I always feel approached individually; employees are not following a script.

Tangibles:

It is clear to me which products and services the department can supply.

Verbal and written communication is proper and professional.
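The statements above can be scored in a straightforward way: tag each statement with its SERVQUAL theme and average the answers per theme. The sketch below assumes a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree) and uses illustrative response data; the statement wording is taken from the examples above, but the data structure and function names are my own.

```python
# Minimal sketch: scoring SERVQUAL-style survey statements per theme.
# Assumes a 1-5 Likert scale; the responses below are invented examples.
from statistics import mean

# Each statement is tagged with the SERVQUAL theme it probes.
QUESTIONS = {
    "Agreements made are always honored by employees.": "reliability",
    "The employees are always courteous in helping me.": "assurance",
    "My matters are always processed promptly.": "responsiveness",
    "Employees can put themselves in my shoes.": "empathy",
    "It is clear to me which products and services the department can supply.": "tangibles",
}

def theme_scores(responses):
    """Average the 1-5 Likert answers per SERVQUAL theme.

    `responses` is a list of {statement: score} dicts, one per respondent.
    """
    by_theme = {}
    for answers in responses:
        for statement, score in answers.items():
            theme = QUESTIONS[statement]
            by_theme.setdefault(theme, []).append(score)
    return {theme: round(mean(scores), 2) for theme, scores in by_theme.items()}

# Two hypothetical respondents:
survey = [
    {"Agreements made are always honored by employees.": 4,
     "My matters are always processed promptly.": 2},
    {"Agreements made are always honored by employees.": 5,
     "My matters are always processed promptly.": 3},
]
print(theme_scores(survey))  # {'reliability': 4.5, 'responsiveness': 2.5}
```

Averaging per theme rather than per statement makes it easy to spot which of the five themes, reliability or responsiveness in this toy example, needs attention first.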

Now how do you approach this? Let me share some tips that helped us achieve a 40 percent response rate:

Keep the research simple. Do not create a massive survey that no one wants to fill out; a short survey increases response rates.

Communicate your goal. Make it clear what you want to achieve and what's in it for the user. You want to provide the best service possible, but you need input to do that.

Get the timing right. The moment of measurement, the time of day, and the duration of the survey are critical. Send the emails when users have time to respond. Incentives also help.

Continuous measurement provides more regular detail about a specific service. In these cases, the user has just received a particular service, the move to a new workstation, for example. Immediately afterward, ask the user how the experience was. This type of measurement can be service-specific or even operator-specific. The interaction happens at the point of service, gathering near real-time results throughout the year; this is accomplished by sending a short digital survey to the user immediately after the contact.

Keep these surveys quick and simple to encourage users to complete them. In this situation, the customer has just experienced the service, so they can easily remember what they did or didn't like. Because of this, you can be specific with your survey, leading to concrete feedback.

An excellent metric for continuous measurement is the customer effort score (CES), which measures the effort required by the customer to find a solution to their incident or issue. The customer effort score asks the question: how much effort did you experience accessing this service? You want the experience to be as effortless and pain-free as possible.

Continuous measurement provides the most up-to-date overview of your performance and helps identify whether or not you are on track to meet your department's goals. It can feed a satisfaction key performance indicator (KPI). Satisfaction KPIs are often used by organizations that are in control of their services: the services are defined in a service catalogue, the customer effort score is measured for each service, and the necessary improvements are implemented.
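Turning post-contact survey answers into a per-service KPI can be sketched as follows. The services, effort values (here 1 = very low effort, 5 = very high effort) and the KPI target are illustrative assumptions, not figures from the article; the point is simply to group responses by catalogue service and flag services whose average effort exceeds the target.

```python
# Hypothetical sketch: a per-service customer effort score (CES) KPI.
# Scale and target values are invented for illustration.
from collections import defaultdict
from statistics import mean

CES_TARGET = 2.0  # KPI: average effort per service should stay at or below this

def ces_per_service(responses):
    """`responses` is a list of (service, effort) tuples from post-contact surveys."""
    by_service = defaultdict(list)
    for service, effort in responses:
        by_service[service].append(effort)
    return {svc: round(mean(vals), 2) for svc, vals in by_service.items()}

responses = [
    ("workstation move", 1),
    ("workstation move", 2),
    ("password reset", 4),
    ("password reset", 3),
]
scores = ces_per_service(responses)
needs_improvement = [svc for svc, ces in scores.items() if ces > CES_TARGET]
print(scores)             # {'workstation move': 1.5, 'password reset': 3.5}
print(needs_improvement)  # ['password reset']
```

Because each response is tied to a specific catalogue service, the same data can drive both the department-wide KPI and the per-service improvement list.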

The Votes Are In, What Now?

Once the results are in, you must communicate these within your department. Give your team members credit when they are doing well, and together — without judgment — attempt to identify areas needing improvement.

From here, come up with a plan as a team and work on it together. Doing so encourages team building, as your team feels involved and motivated to make the improvements. Treat this exercise as a way to gain valuable input and engagement from the user. If a user provides feedback, see it as a gift rather than a complaint.

The next step is to communicate with the user. This lets your department make a great impression, too, because it tells the user: we are listening to you. We value what you have to say.

There is no need to share all of the ins and outs of addressing the concern. Provide users with just the information they need, in an easy-to-digest format. For bulk actions that affect multiple people or most of the organization, consider placing a poster by the coffee machine, for example, or sending an organization-wide communication.

The types of results shared may include updates on system outages or network updates, changes to interacting with the service desk, or even instructional information for managing a particular process, solution, or technology platform. Anyone requiring individual attention should be communicated with directly, in a non-public format, either through the service portal, electronic communication, or face-to-face.

Taking these steps makes users feel that they and their feedback are valued. Once a user reports an issue, take action immediately; quick responses show that you can act when required.

Takeaways

Gaining insight into the user experience, by actively asking for it, can be tremendously helpful to your efforts. Your metrics can influence behavior within your service department. Striving for a better user experience goes hand-in-hand with taking emotional metrics into account, as well as actively discussing them with your team.

When defining metrics, first focus on the metrics that matter most for your organization. This requires investigations and surveys to identify them. Implement these metrics in a phased approach. You cannot do everything at once, and you want to do things the right way.

Finally, be transparent about what you are trying to achieve and let your users know. When users feel like their feedback matters, they will be much more willing to keep giving their feedback, which helps you improve the organization for everyone.

Ron van Haasteren is the Global Culture Strategist at TOPdesk
