Biometric Residence Permits (BRPs) - Beta Assessment

The report from the beta assessment of the Home Office's Biometric Residence Permits service on 17 August 2015.

Department / Agency:
Home Office

Date of Original Assessment:
14/01/2015

Date of Reassessment:
17/08/2015

Assessment Stage:
Beta

Result of Original Assessment:
Not Pass

Result of Reassessment:
Pass

Lead Assessor:
A. Lister

Service Manager:
P. Smith

Digital Leader:
M. Parsons


About the Service

The service offers a structured contact route for individuals who are encountering problems either with enrolling their biometric features as part of an application for permission to remain in the UK, or with the BRP secure token of leave issued to those whose applications are successful. The problems encountered range from delayed or missed delivery or collection of the card, and lost or stolen cards, to errors on the card itself.

Reassessment Report

17th August 2015

The Biometric Residence Permits service has been reviewed against the 14 points of the Service Standard which were not passed at the original assessment.

After consideration, the assessment panel has concluded that the Biometric Residence Permits service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria, and should proceed to launch as a beta service on a service.gov.uk domain.

Reasons

It is clear that the service team has made substantial progress since the initial beta assessment. The panel was particularly impressed by the service’s move away from proprietary form building software to in-house development. The benefits of doing this were clear in the quality of the service demonstrated and the ability to make changes to the service quickly and easily.

The team has responded to recommendations in the previous assessment by increasing user research, including with people who have the most challenging needs and the case workers who support them. They have identified initial user needs for support; the proposed support is by telephone through an in-house contact centre and face to face through existing Home Office contracts with third parties. The team plans to test both methods of support and explore other potential assisted digital user journeys in public beta. The team is already making changes to the on-screen part of the service in response to feedback from third parties supporting users. Members of the team, including the service manager, have viewed user research and can talk confidently about their users’ needs.

The service uses standard design patterns, and the team is well engaged with the broader government design community. The team is also making good use of the Performance Platform to instrument and monitor the end-to-end service.

Recommendations

Although the panel appreciates that engaging with a broad range of users and their representatives is difficult, it would urge the team to challenge policy on this. The relatively easy wins presented by tech-savvy Tier 2 and Tier 4 applicants should not be considered indicators of a successful service in isolation. Disadvantaged users, including refugees, those without digital skills and applicants who can legitimately apply without the need for proficiency in English, must be part of future research plans, and appropriate, sustainable support must be provided where a need is identified.

The agile delivery process needs to be more precise, with clear definition of tasks in the development backlog, mapping of those tasks to user stories, and assessment of the effort required. It should be clear which tasks will be delivered in the current sprint, how findings from user testing will be fed back into the backlog, and what the estimated delivery time for a user story is. If the team practises these methods consistently, it will gain the necessary experience and confidence.

The team should make sure that a designer, content designer and user researcher are all part of the development team throughout the life of the service. This will support continuous improvement and rapid iteration.

The designer and front-end developer should participate in the current cross-Government discussions about departmental style guides and toolkits, and determine the best use of toolkits and style guides for the Home Office.

The team also needs to consider failover scenarios for handling emails when parts of the system not directly under the team’s responsibility are not available. There should be operational procedures in place to notify GOV.UK of planned maintenance and for emergency situations.

Failover is currently monitored using infrastructure capabilities. It would be useful to consider in-application monitoring.
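
In-application monitoring need not be heavyweight. As a purely illustrative sketch (the framework, endpoint name and email relay below are assumptions for illustration, not part of the assessed service), a simple health-check endpoint that reports on the application's own dependencies, such as the email hand-off, would let existing infrastructure monitoring see failures inside the application as well as around it:

    # Illustrative only: a hypothetical Flask health-check endpoint that checks
    # the email relay used for hand-off and reports overall application health.
    import smtplib

    from flask import Flask, jsonify

    app = Flask(__name__)

    SMTP_HOST = "smtp.example.internal"  # hypothetical relay; not from the report

    def email_relay_available(timeout=3):
        # Try a lightweight NOOP against the relay; any connection error means unhealthy.
        try:
            with smtplib.SMTP(SMTP_HOST, 25, timeout=timeout) as smtp:
                smtp.noop()
            return True
        except OSError:
            return False

    @app.route("/healthcheck")
    def healthcheck():
        # Aggregate individual checks so monitoring can poll a single URL.
        checks = {"email_relay": email_relay_available()}
        healthy = all(checks.values())
        status = 200 if healthy else 503
        return jsonify(status="ok" if healthy else "degraded", checks=checks), status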

Summary

This is an important service with international exposure. The team has embraced the difficult recommendations set out in the original beta assessment and demonstrated that it’s possible to successfully challenge the status quo to create a high-quality digital service that meets user needs. The panel looks forward to seeing the service at its live assessment.


Summary of Original Report

14th January 2015

After consideration the assessment panel has concluded that the Biometric Residence Permits service should not be given approval to launch on the service.gov.uk domain as a beta service.

Reasons

The service team demonstrated a thorough understanding of the business processes and their potential challenges for service users. However, it was apparent that actual, evidenced user needs were not currently at the heart of the service’s design. The project’s history was outlined, but it remained unclear to the assessment panel when things had been built and how frequent the iterations had been.

The points against which the service did not pass are explained below:

User needs (Points 1 and 20)

User needs are the foundation of the service standard. Although the Home Office team demonstrated that some research had been completed, the number of users and the frequency with which they were engaged indicates that a great deal of work is required to put users at the heart of the service’s design.

User needs were expressed as assumptions about potential difficulties users are likely to encounter, based on experience of non-digital provision of the service.

There was a significant disconnect between research, the generation of actionable insight, and the integration and measurement of the impact of change. Specific areas of concern included:

  • Performing a discovery phase to understand the needs of people wishing to stay in the country and establish the criteria for success of the service.

  • The number of users engaged and the timing of user research sessions.

  • Beyond face to face sessions and surveys, the research methods used to identify user needs and actionable insights were not clear.

  • User stories, written in language real users would understand, are the most important tool in an iterative delivery; these were not evidenced at the assessment.

  • There was a lack of research into ‘out-of-country’ applicants, who will start to receive BRPs later this year.

  • Findings from user research carried out before the current researcher joined the team were not apparent.

It was evident that funding was available to take the service forward but not clear how funds and resource would be allocated to support and sustain continuous iteration.

The Team (Points 2 and 6)

The team has struggled to maintain continuity over the period of development. Meaningful handovers have not happened and the current team relies on documentation stored in Redmine.

Security, Privacy, Tools and Standards (Points 15, 17 and 25)

The service employs proprietary software already in use at the Home Office, for reasons of economies of scale across multiple services, despite the fact that it is always difficult, and frequently impossible, to develop a useful and usable service when pre-existing technology is the starting point for a delivery.

Proprietary software has been employed because it offers ‘drag-and-drop form design without the need for specialist skills such as coding’. However, developers are also being used to customise the service. This means that the off-the-shelf solution does not meet the needs of the service and, consequently, the users. The assessment panel felt that the developers would be better employed building the service rather than customising an existing supplier’s product.

Live-like end-to-end testing of the service, including any physical hand-in and hand-off points, is vital. The service demonstrated at the assessment had been observed in use by potential users, but the research was constrained to user behaviour in the digital space. For example, the assessment panel would expect to see a user working from information on physical correspondence they had received during the course of the application and permit issue process. This would include the letter with prompts to use the error notification service, which was mentioned but not shown.

The service had been tested using BrowserStack, but little testing had been done on physical devices other than laptops and desktops. Tablet and mobile devices now dominate as the preferred means of internet access and should be prioritised in research and testing.

Accessibility appears to present challenges because the changes to the proprietary software are prioritised by the vendor rather than the Service Manager. The assessment panel would expect to see any outstanding accessibility issues rectified and supported by appropriate evidence before reassessment.

In the event of the service being compromised or failing, a number of parties share responsibility. The Service Optimisation Team manage the general running of the service but the service itself is hosted by the software vendor, with the Service Manager ultimately responsible for service availability. This is not necessarily a problem but the assessment panel would like the team to demonstrate a consistent, shared knowledge of who does what and when in the event of service failure.

Improving the service (Points 14 and 19)

The Service Standard requires frequent releases of the service - at least every two weeks - so that improvements identified in user research can move into the public-facing service as quickly as possible. On achieving Beta status, this service moves to the Service Optimisation Team, which is also responsible for the Home Office’s other digital services. At the assessment it was not clear that the working relationship between the Service Manager and the Service Optimisation Team was well defined; because of this, it was not possible for the team to commit to the actual frequency of service releases. However, the Home Office has subsequently explained its service management model more clearly and will also make this clear during any follow-up assessment.

Similarly, some improvements may only be undertaken by the software vendor. Such work requires chargeable evaluation and the provision of requirements by the Home Office, further extending lead times for releases.

Design (Points 9 and 13)

The assessment panel felt there was insufficient evidence to demonstrate that the service was intuitive and simple to use.

A thorough understanding of the non-digital steps in the process was shared but these steps had not been part of end-to-end service testing.

The functionality of the service is not in line with either GOV.UK or general web standards; for example, within the service the browser back button does not work. Recommendations from the GOV.UK design team have been provided.

Similarly, a number of content design issues of varying severity need to be addressed. Again, observations and recommendations from the GOV.UK content design team have been provided.

Assisted digital and channel shift (Point 10)

As with the core of the digital service, the team proposed assisted digital support based on assumptions, rather than proven user need. The Service Manager showed a good understanding of assisted digital principles but was unable to provide any evidence of research with assisted digital users. Proposed channels and number of transactions are based on a related service and current use of paper, rather than potential assisted digital users of this new digital service.

For the beta assessment the assessment panel expects to see evidence that the team has undertaken user research to identify user needs and likely demand for each channel for this specific service. The team should also have a clear plan for ongoing user research and testing assisted digital support during public beta.

Analysis and benchmarking (Points 7, 8, 18, 21, 22, 23 and 24)

The Home Office team had undertaken user research and provided videos of the service being tested. However, there was no clear evidence of how actionable insight had been derived from the research and used to formulate production stories that were then prioritised, built and released.

Similarly, low research numbers and constraints around implementing analytics within the proprietary software meant that no meaningful evaluation based on in-service analytical data could be undertaken.

It was encouraging to see the cost per transaction being clearly understood in terms of the overall operational overhead as well as the cost of operating the digital service. The cost of out-of-country transactions will need to be evaluated to ensure the data published on the performance platform is accurate. To get this aspect of the service right, early engagement with the GDS performance platform team is advised.
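
For reference, a hedged sketch of the calculation (this reflects the general Performance Platform convention rather than anything stated in this report):

  \text{cost per transaction} = \frac{\text{total cost of providing the service over a period}}{\text{number of completed transactions in that period}}

On that basis, out-of-country transactions would need to be reflected in both the total cost and the transaction count for the published figure to be accurate.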

The take up and completion rates shared by the Home Office team were ambitious and did not appear to have a logical basis given the absence of substantial user research and service testing. The fact that many current users communicate by email does not directly equate to the desire or ability to use a wholly online service. However, once the service achieves Beta status, the real data generated will enable realistic estimates and trajectories to be established.

Recommendations

For reassessment, the team needs to:

  • Put the case for using proprietary software, which is clearly obstructing development, iteration and control of the service, rather than using the currently employed developers to build this relatively simple service in a way wholly aligned with the Service Standard.

  • Complete end-to-end user research (including non-digital elements) with a broader, more representative range of potential users, including out-of-country users. This must also include potential in-country users of an assisted digital service.

  • Demonstrate how user research is used to generate actionable insight.

  • Show how insight from user research, analytics data, technical and security requirements, etc, is used to create production stories which are developed, tested and integrated into the public facing service.

  • Undertake user research across a range of devices and describe the actionable insight generated.

  • Describe the plan for ongoing user research and frequent, rapid iteration of the service.

  • Show how in-service analytics will be developed and used to support improvement.

  • Engage with the GDS performance platform team to agree how key metrics will be shared.

  • Explain how the current and future resource models – including the Service Optimisation Team – are managed and led in a way that ensures the service can be iterated frequently and rapidly under the direct control of the Service Manager.

  • Evidence that the known accessibility issues have been resolved – including those in the proprietary software; and complete the outstanding corrections and recommendations from the GOV.UK design and content design teams.

Summary

The team shows obvious and genuine commitment to improving users’ experience of their service and, in turn, users’ experience of the Home Office as a whole. Many of the elements necessary to achieve this are in place. As it is a relatively simple service, there is scope, with the right commitment, to address the points described above and put the service forward for reassessment in the next eight to ten weeks. To do this, the recommendations need to be fully addressed. The panel looks forward to seeing the service for reassessment.

Digital Service Standard criteria

Criteria   Passed     Criteria   Passed
1          Yes        2          Yes
3          Yes        4          Yes
5          Yes        6          Yes
7          Yes        8          Yes
9          Yes        10         Yes
11         Yes        12         Yes
13         Yes        14         Yes
15         Yes        16         Yes
17         Yes        18         Yes

Published 22 December 2016