Keep Notes On My Performance - Alpha Assessment

The report from the alpha assessment of GDS's Keep Notes On My Performance service on 21 April 2016.

Assessment stage: Alpha
Assessment result: Met
Service provider: Government Digital Service (GDS)

About the service

Service Manager: Simon Everest

Digital Leader: Chris Ferguson

The performance review (PR) process is the annual evaluation of civil service staff against objectives and competencies in order to rank them against a guided distribution, ranging from ‘exceeded’ through to ‘requires improvement’. Currently, managers review their staff based on evidence and feedback written up in a Word document to derive an indicative mark. This process is unpopular with many users (both managers and staff), and one particular difficulty reported was the collection and management of the evidence used to form the basis of competency and objective-setting discussions.

The Performance Notes alpha was developed to address this problem.

Detail of the assessment

Lead Assessor: D Williams

The team demonstrated evidence consistent with the alpha stage of the Digital by Default Service Standard. The team should progress to the beta stage of development so the service can be developed further and tested with more users.

Researching and understanding user needs [points 1, 2, 10]

The team demonstrated a deep understanding of civil servants’ current experience of performance reviews, and of their needs from potential new services. They have good evidence of the potential value of a simple tool for better note keeping between reviews.

The team has used a good variety of research approaches and sources, including comments on government blog posts. Their analysis has been thorough, and they’ve produced good visualisations of what they’ve learned.

The team have made good efforts to research with all types of civil servants, rather than focusing on users who are particularly interested in, or upset about, performance reviews. Although the service does not have specific requirements for assisted digital support, the team have considered the needs of civil servants with lower digital skills and access.

However, given the importance of disability and accessibility in performance reviews, with disputes often centred on fair treatment, we would have hoped for more specific research with disabled people.

Running the service team [point 3]

The panel was impressed by the team, which is in high spirits, working well together, and fully committed to common agile practices that have been iterated as the team has progressed through alpha.

There have been some issues obtaining design resource, but this has become less of a problem since the recent addition of a content designer. The panel was advised that a data analyst will join the team shortly.

The team have been very careful to engage with stakeholders for support and to keep ‘joined up’ with other activity at GDS.

Designing and testing the service [points 4, 5, 11, 12, 13, 18]

It’s good to see that the team prototyped and tested a number of possible elements of the service and have settled on an initial one to focus on for beta, which is exactly what an alpha phase is all about. Each of the prototypes we were shown was well realised and showed promise. The chosen element, Notes, is already in good shape and seems like it will be very useful for a large majority of the audience (in effect, most of the Civil Service!).

Since these are specialist internal tools intended to be in use over a number of months, they cannot be assessed against users ‘being able to succeed first time’. Indeed, it will be interesting to see how the tools are tested during the beta phase, since the performance review process is so protracted.

Our only concern is that ‘Notes’ caters for only one element of the multiple user needs uncovered by the research, so we look forward to seeing what other tools the team works on during the beta phase. ‘Objectives’ certainly seems like a sensible choice for one of the other strands.

It’s clear that all efforts were made to design and develop the tools using as many of the existing design patterns as possible, but the specialist nature of the tools means that few existing patterns were applicable. Likewise, the GOV.UK style guide has been adhered to as much as possible, and we have no concerns in this area.

Finally, although the team has been sharing front-end development between a couple of team members with the necessary skills, it will be important to assign one person to work on the production-ready code for the public beta so that it is as robust as possible.

Technology, security and resilience [points 6, 7, 8, 9]

The performance notes are likely to contain sensitive personal data. As the ICO guidance explains, such data “needs to be treated with greater care than other personal data” because it “could be used in a discriminatory way, and is likely to be of a private nature” (https://ico.org.uk/for-organisations/guide-to-data-protection/key-definitions/).

The team must be clear about how they will collect consent from beta participants, and how they will process the notes during beta.

Overall, the team have demonstrated a well-considered technical approach to designing and building the prototype.

In addition, they were able to give a thorough and reasoned outline of how they will build the beta and we do not have any concerns there.

Improving take-up and reporting performance [points 14, 15, 16, 17]

The team have good plans for encouraging take-up of the new tool through beta. However, we expected to see a clearer plan for how participants would be invited to take part in the initial beta.

The team will start beta with Fast Streamers, and we expect the team to get valuable feedback from that group. However, we are concerned that the insights gained from Fast Streamers may not apply to other user groups. We recommend that the team picks a very different group of users and brings them into the beta as soon as possible.

Defining and measuring ‘success’ will be difficult for performance notes. Before their beta assessment, the team will need to clearly define what a successful outcome is for a user and for the service, and how they will measure that.

Recommendations

The panel advise that the service team should:

  • Engage with another (less technically competent) user group to explore their needs and to confirm that the service will meet these.
  • Investigate other tools that could be provided as part of this service.

Evidence of having addressed the above points should be presented to the panel at the beta assessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 22 December 2016