A multinational company (we shall call it ABC), developer of a telecommunications billing application, ran a three-member Service Desk in a country in South Asia. The team had adopted ITSM guidelines only two months earlier and served six major, high-profile customers in the country. The billing system managed the entire life cycle of subscribers to the mobile phone services provided by these customers. As you can imagine, it was a very complex application: it allowed subscribers to be assigned service packages, bills to be generated and revenue to be collected.
ABC had recently moved from a Help Desk to a Service Desk, and the team experienced teething problems in its new role. Earlier, the team had received, prioritized and dealt with calls from customers, mostly routing them to other departments for resolution, and then tracked the resolution time and closure of each call ticket. Routine analysis was performed on call closure data, SLA adherence and customer feedback.
In its new role as a Service Desk, the team had to be the single point of contact for the customer for all types of queries and issues. Internally, it now had to record and prioritize all these calls, engage the appropriate department, and track every open issue to closure, whether a query, a service request, an incident or a problem. There were also customer requests for bespoke customization, which had to be routed to the development team for feasibility studies and, once approved, tracked until they were implemented in the customer's system.
Clearly the Service Desk now had a much wider, and critical, scope of work because of its customer-facing role. The first month's analysis of call handling data led the IT Service Manager to look critically at several metrics that were not meeting their targets. This case study covers the three metrics which were highlighted. In this first part, we look at the action taken to improve one of them.
Service Desk Performance Metrics
The following were the three metrics which received the spotlight during the first management review:
- Calls closed directly by the Service Desk at the first call: actual 9%, target at least 20%.
- Issues escalated by the customer: actual 5%, target under 1%.
- Customer Satisfaction Survey: actual average score 3.9 out of 5, target 5 out of 5.
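As an illustration of how the first two figures might be computed from a month's call log, here is a minimal sketch. The call records and field names are invented for illustration; they are not from the case study.

```python
# Hypothetical call records; field names are invented for illustration.
def first_call_closure_rate(calls):
    """Percentage of calls closed by the Service Desk at the first call."""
    return 100.0 * sum(c["closed_first_call"] for c in calls) / len(calls)

def escalation_rate(calls):
    """Percentage of calls that the customer escalated."""
    return 100.0 * sum(c["escalated"] for c in calls) / len(calls)

# A sample month of 100 calls matching the actual figures above.
sample_month = (
    [{"closed_first_call": True,  "escalated": False}] * 9    # closed at first call
    + [{"closed_first_call": False, "escalated": True}] * 5   # escalated by customer
    + [{"closed_first_call": False, "escalated": False}] * 86 # all other calls
)

print(f"First-call closure: {first_call_closure_rate(sample_month):.1f}%")  # 9.0%
print(f"Escalations:        {escalation_rate(sample_month):.1f}%")          # 5.0%
```

With these definitions, the targets translate directly into threshold checks on the monthly log.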
Calls closed at the first call
The target here was set at 20% because 50% of the calls received by the Service Desk were queries about the application package and the bespoke software, mostly related to functionality, settings and configuration. Management reasoned that the Service Desk should have enough knowledge of the application to handle and close at least 20% of all calls during the initial call.
The IT Service Manager spoke to the Service Desk team members and analyzed call details to establish why this target was being missed. The two main reasons were:
1. There were frequent change requests to add new functionality to the application, catering to the changing needs of customers in a very competitive market. However, ABC's Service Desk staff were not always up to date on the details of these changes, and hence frequently had to refer to the Software Development team for information.
2. The Service Desk staff lacked confidence and routinely made no attempt to answer queries and close calls during the initial interaction with the customer. They preferred to play it safe by involving the technical teams.
Course of Action
After much brainstorming the following course of action was implemented over a period of two months:
1. The release procedure for bespoke development linked to the application, and specific to a customer, now included a briefing session to make the Service Desk staff aware of the new functionality being provided and other relevant details.
2. An indexed database of frequent queries (and their answers), categorized by application module, was designed and populated by the technical teams in a month-long project. Keywords could be entered to check whether a similar query had been answered before.
3. The Service Desk gained an additional team member to ease the workload and allow staff to attend training regularly. This new member came from the Software Development team and knew the application well enough to strengthen the team's knowledge bank. In fact, it became standard practice to assign a software development engineer to the Service Desk on a rotation basis (six-month stints) to increase the team's technical knowledge.
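The indexed query database in point 2 can be sketched as a simple inverted index mapping keywords to stored question-and-answer entries. The sketch below is a hypothetical illustration: the module names and queries are invented, and the real database was whatever tool ABC's technical teams built.

```python
import re
from collections import defaultdict

class QueryIndex:
    """Minimal inverted index over frequent queries (illustrative only)."""

    def __init__(self):
        self.entries = []              # list of (module, question, answer)
        self.index = defaultdict(set)  # keyword -> set of entry ids

    def add(self, module, question, answer):
        entry_id = len(self.entries)
        self.entries.append((module, question, answer))
        # Index every word in the module name and question, ignoring punctuation.
        for word in re.findall(r"\w+", (module + " " + question).lower()):
            self.index[word].add(entry_id)

    def search(self, *keywords):
        """Return entries matching every given keyword."""
        ids = None
        for word in keywords:
            matches = self.index.get(word.lower(), set())
            ids = matches if ids is None else ids & matches
        return [self.entries[i] for i in sorted(ids or [])]

# Invented sample entries for illustration.
faq = QueryIndex()
faq.add("billing", "How do I regenerate a subscriber bill?", "Use the rebill option ...")
faq.add("packages", "How do I assign a service package?", "Open the subscriber profile ...")

for module, question, _ in faq.search("subscriber", "bill"):
    print(module, "-", question)  # billing - How do I regenerate a subscriber bill?
```

Even a structure this simple lets Service Desk staff check for a previously answered query before referring a call to the technical teams, which is exactly the behavior point 2 aimed to encourage.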
Three months after these action points were implemented, 20.5% of all calls were being closed satisfactorily at the first call itself. Target achieved, but further improvement needed!
In Part 2 of this post, we will look at the other metrics and see what action was taken to improve the performance against their targets.