How to Measure or Grade the Performance of IT Staff
The measurement or grading of MSP technician productivity is a hotly contested topic. While many support combining several measurable metrics, others argue that no single dimension paints an accurate picture of performance. The IT industry lacks a standard method of grading techs.
Some professionals argue that measuring an IT expert’s lines of code is pointless. Counting closed tickets, meanwhile, is prone to manipulation: techs may cherry-pick simpler tickets or split one issue into several new tickets. Likewise, assessing productivity by daily commits can encourage small, frequent commits without any real increase in output.
According to expert discussions held on various platforms, including Reddit, most metrics are easy to manipulate.
In information technology, exceptional performance has a direct bearing on the timely completion of services and projects. Users depend on the work of IT specialists for the availability of quality services. Conversely, poor performance leads to increased system malfunctions, security breaches, and unreliable information produced by the systems.
Measuring or Grading Techs: Here are the Divergent Views
Discussions about the topic emanate from MSPs’ need to determine the productivity levels of individual techs. One employer who participated in a discussion on Reddit sought a way to assess the value of the various tickets handled by an in-house IT expert. The employer noted that the difference in the work required from one ticket to another complicates any attempt at measuring or grading performance.
Another professional expressed dislike of the time-entry metric because IT tasks vary so widely in nature, noting that techs could easily manipulate it. For this reason, the Reddit user preferred a combination of surveys and the service level agreement (SLA) as a performance benchmark.
The gist of the argument is that a competent IT specialist provides a favorable customer or user experience by delivering results. According to the commenter, personality plays an important role in improving end-user experience irrespective of technical skills. Asking end-users, including customers, for feedback can provide insights into the tech expert’s overall performance.
Meanwhile, some commenters urge managed service providers (MSPs) to set daily targets. This approach aims to improve the accuracy of grading performance using tickets and time entries. One commenter suggested that IT companies begin with manageable daily utilization targets of around 80 to 85 percent, which equates to roughly six and a half hours of an eight-hour day.
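A utilization target like the one above is simple arithmetic: billable time divided by the length of the workday. The function name and the eight-hour workday below are illustrative assumptions, not from any specific MSP tool.

```python
# Hypothetical utilization check: billable hours as a share of the workday.
# An 8-hour workday is assumed here; adjust to the company's schedule.

def utilization(billable_hours: float, workday_hours: float = 8.0) -> float:
    """Return billable time as a percentage of the workday."""
    return 100.0 * billable_hours / workday_hours

# A tech logging 6.5 billable hours in an 8-hour day:
print(f"{utilization(6.5):.1f}%")  # → 81.2%
```

A tech at 6.5 billable hours sits just inside the 80 to 85 percent band the commenter proposed.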
Also, employers need to check tickets for clarity, troubleshooting results, and other technical factors. Participants in the Reddit discussion also recommended conducting shadowing or mentoring sessions, which make it easier to see how techs handle tasks throughout the day.
Key Performance Indicators (KPIs)
Most participants in the discussion agreed that gauging performance based on clients’ feedback is an effective option. One commenter suggested reviewing tech experts’ overall billable time to determine whether they keep notes appropriately. Likewise, MSPs need to compare the work done against the billed time to ensure the two match.
Another participant stated that one IT firm encouraged techs to define their own KPIs, which the MSP used to grade performance. However, manipulation still undermined the effectiveness of the approach.
Employing CSAT (customer satisfaction score) is another suggestion raised in the forum. Commenters recommended taking advantage of feedback systems to track the nature of customer or user responses. At the same time, most professionals agree that metrics and goals should follow the SMART model: specific, measurable, achievable, relevant, and time-bound.
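One common way to compute CSAT, assumed here for illustration, is the percentage of survey responses rated 4 or 5 on a 5-point scale. The sample ratings below are made up.

```python
# CSAT as the share of "satisfied" responses (4 or 5 on a 5-point scale).
# This is one common convention, not a formula prescribed in the discussion.

def csat(ratings: list[int]) -> float:
    """Return the percentage of satisfied (4+) responses."""
    satisfied = sum(1 for r in ratings if r >= 4)
    return 100.0 * satisfied / len(ratings)

# Six survey responses for a hypothetical tech, four of them satisfied:
print(f"CSAT: {csat([5, 4, 3, 5, 2, 4]):.0f}%")
```

Tracking this score per tech over time, rather than as a one-off snapshot, fits the SMART emphasis on measurable, time-bound goals.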
The Reddit discussion also highlighted the importance of communicating with IT staff to assess their abilities and motivation levels. Such an assessment may also reveal that techs feel overworked and suffer from burnout. Techs given an excessive number of tickets are more likely to experience burnout, negatively impacting their overall performance.
Hence, over-relying on the ticketing system to gauge productivity could compel IT professionals to prioritize volume rather than improve customer experience or quality results.
When it comes to tickets, some professionals urged MSPs to categorize them by type or subtype, marking each one as tier 1 (T1), tier 2 (T2), or tier 3 (T3). Once the classification is complete, managed service providers should set minimum and maximum working times for each tier: for instance, 35 minutes for T1 tickets, an hour and a half for T2, and eight hours for T3.
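The tier scheme above can be sketched as a small lookup table with a check against each tier's time ceiling. The figures mirror the examples in the text; the function and field names are hypothetical.

```python
# Per-tier maximum handling times in minutes, using the figures from the
# discussion: 35 min for T1, 90 min (an hour and a half) for T2, 480 min
# (eight hours) for T3. These are illustrative targets, not a standard.
TIER_MAX_MINUTES = {"T1": 35, "T2": 90, "T3": 480}

def exceeds_target(tier: str, minutes_spent: int) -> bool:
    """Return True if a ticket ran over its tier's time ceiling."""
    return minutes_spent > TIER_MAX_MINUTES[tier]

# A T1 ticket that took 50 minutes blew past its 35-minute target:
print(exceeds_target("T1", 50))
```

Flagged overruns then become conversation starters rather than automatic penalties, since a hard T1 ticket may legitimately take longer than the target.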
Monitoring time entries is a surefire way to ensure compliance by the techs. These actions enable MSPs to gain an in-depth understanding of ticket volume, capacity, and classification. In turn, it becomes easier to plan more effectively for the future. MSPs can hire new IT staff knowing how they fit into the tech team and the company’s day-to-day operations.
When assessing performance based on tickets, it is vital to check whether techs prioritize urgent ones. If not, MSPs must establish the reasons for not prioritizing urgent tickets.
One of the discussion participants once worked at a midsize MSP that measured performance by ticket re-open rates, that is, the number of times clients call back about the same issue. This approach paints a more accurate picture than ticket volume or time entries.
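A re-open rate reduces to the fraction of closed tickets that were re-opened at least once. The sketch below assumes a simple list of ticket records; the field names are hypothetical, since the discussion did not describe a specific ticketing schema.

```python
# Re-open rate: share of closed tickets the client came back about.
# The "status" and "reopen_count" fields are assumed for illustration.

def reopen_rate(tickets: list[dict]) -> float:
    """Return the fraction of closed tickets re-opened at least once."""
    closed = [t for t in tickets if t["status"] == "closed"]
    if not closed:
        return 0.0
    reopened = sum(1 for t in closed if t.get("reopen_count", 0) > 0)
    return reopened / len(closed)

tickets = [
    {"status": "closed", "reopen_count": 0},
    {"status": "closed", "reopen_count": 2},
    {"status": "open", "reopen_count": 0},
]
print(reopen_rate(tickets))  # 1 of 2 closed tickets re-opened → 0.5
```

Open tickets are excluded from the denominator so that in-progress work does not distort the rate.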
Additionally, management at the IT firm considered the complexity of the issues handled by techs. Some issues entail a simple fix, while others involve a cumbersome troubleshooting process. The same MSP also evaluated performance using customer surveys.
Many professionals viewed surveys as the best option, effectively making them the benchmark for measuring information technology (IT) staff performance, since most clients provide honest feedback about a particular IT specialist’s work.