
Making the Service Desk More Efficient: A Case Study: Part III

In Parts I and II of this case study, we saw that a multinational company (ABC), the developer of a telecommunications billing application, started with a three-member Service Desk in a South Asian country. The company had adopted ITSM guidelines two months earlier. The Service Desk supported users of a very complex application that allowed subscribers to be assigned service packages, bills to be generated, and revenue to be collected.

Service Desk Performance Metrics

The following three metrics received attention during the first management review:

- Calls closed directly by the Service Desk at the first call: actual figure 9%, target 20% minimum.
- Escalation of issues by the customer: actual figure 5%, target less than 1%.
- Customer Satisfaction Survey: actual average score 3.9 out of 5, target 5 out of 5.
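The comparison of actuals against targets can be sketched as a small script. The figures are those from the management review above; the function and variable names are illustrative, not part of any ITSM tool.

```python
# Sketch: checking Service Desk metrics against their targets.
# For the first and third metrics a higher figure is better;
# for the escalation rate a lower figure is better.

def meets_target(actual, target, higher_is_better=True):
    """Return True if the actual figure meets the target."""
    return actual >= target if higher_is_better else actual <= target

metrics = [
    # (name, actual, target, higher_is_better)
    ("First-call closure rate (%)", 9.0, 20.0, True),
    ("Customer escalation rate (%)", 5.0, 1.0, False),
    ("CSS average score (out of 5)", 3.9, 5.0, True),
]

for name, actual, target, higher in metrics:
    status = "met" if meets_target(actual, target, higher) else "MISSED"
    print(f"{name}: actual {actual}, target {target} -> {status}")
```

Running this shows all three metrics missing their targets, which is exactly what triggered the improvement work described in this series.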

In Part I, we looked at the improvements to the first metric.

In Part II, we looked at escalation of issues by the customer.

In Part III, we shall look at the Customer Satisfaction Survey (CSS) scores and how to improve them.

Customer Satisfaction Survey

The target here was set at 5 out of 5 because the customer should be delighted, not just satisfied, and there should have been no reason to complain! So what had gone wrong? Why was the average score only 3.9 and not 5? Remember, we are talking about only 6 customers here. They were major customers, each with 100 to 150 users of the application and a large subscriber base on their telecommunications network, in the range of 100,000 to 500,000 subscribers, including both pre-paid and post-paid subscribers.

The IT Service Manager decided to analyze the CSS scores. He wanted to identify the reasons for receiving lower scores and to formulate an improvement plan. The CSS form contained the usual questions about the service received, including quality of service, promptness, and attitude.

The IT Service Manager came up with many points for improvement. We shall discuss here the issues related to the CSS process and analysis of customers’ comments.

1. Frequency of CSS

The frequency of the CSS was quarterly. The IT Service Manager found that the CSS was greatly influenced by the last incident or problem handled by the Service Desk and apparently did not reflect the service provided throughout the quarter.

2. Method of Conducting CSS

The CSS form was being mailed to the customer, with a follow-up telephone call to prompt its completion. It was found that some customers were not putting much effort into completing the CSS, often delegating it to junior staff.

3. Analysis of Customers' Comments

General comments from the customers were analyzed. It was found that 3 out of 6 customers mentioned that competitors' products offered more flexibility and features, and that they wished to have these functionalities.

Two customers mentioned that comments in earlier CSS forms they had completed had not been acted upon by ABC.

Course of Action

The following course of action was taken over a period of 6 months:

  1. CSS was conducted after each incident or problem using a simpler form. Another template targeting more general comments about the overall service was introduced for the quarterly survey.
  2. The CSS forms were completed in direct meetings with the customer representatives. The quarterly CSS meeting was held with a senior manager of the customer’s organization.
  3. In these meetings, the comments received from the customer were addressed, and solutions and their implementation discussed.
  4. It was found that ABC's next version of the application, planned for release within 2 months, actually contained the features desired by customers and offered by competitors. Documentation of the new version was given to the customers, along with a discount if they upgraded within a certain timeframe.

After 6 months, it was seen that the average CSS score was 4.8, an improvement indeed!