Apply for a postal vote & Apply for a proxy vote Alpha Assessment

The report for the Apply for a postal vote & Apply for a proxy vote alpha assessment on 6 April 2022

Service Standard assessment report

Apply for a postal vote & Apply for a proxy vote

From: Central Digital & Data Office (CDDO)
Assessment date: 06/04/2022
Stage: Alpha
Result: Not Met
Service provider: DLUHC

Service description

The service covers England, Scotland and Wales.

Currently there is only a paper-based user journey for electors who want to vote by post or proxy. New online application forms will be introduced for electors who want to apply online.

The Elections Bill amends existing legislation to enable the identity of absent vote applicants (electors using postal and proxy ways of voting) in Great Britain to be verified. The identity verification process will apply to paper applications as well as to applications made online.

Electors using a postal vote on a long-term basis in Great Britain will need to reapply every three years; currently electors can apply to vote by post indefinitely.

A person may be appointed to act as a proxy for a maximum of four electors (no current restriction), and within that four, no more than two may be electors who are not overseas electors or service voters.

Postal and Proxy Voters in Scotland and Wales

Whilst the Elections Bill makes identity verification a legal requirement for voting in UK-wide general elections, the requirement does not apply to devolved polls and referendums in Scotland and Wales, for which separate applications are in place. This means that a voter in Scotland or Wales who wishes to vote by post or proxy in all elections will need to complete two applications: one through the online absent vote application service, covering their vote in GB general elections and UK-wide referendums, and another through the relevant Scottish or Welsh government services, covering devolved polls and referendums.

Please note: Northern Ireland is not in scope for this service, and is the subject of a separate Discovery.

Service users

Postal voting

Groups | Locations
Vote by post for convenience | England, Wales, Scotland
Overseas UK voters | Europe, Commonwealth
Over 70s | England, Wales, Scotland
People with severe disabilities | England, Wales, Scotland
Low digital skills | England, Wales, Scotland
Users with access needs | Europe, Commonwealth
Users without a permanent address | England, Wales, Scotland
Users whose first language is not English | England, Wales, Scotland
Electoral Registration Officers | England, Wales, Scotland

Proxy voting service users

Groups | Locations
Away on polling day (single election only) |
For being away on polling day, medical issue or disability, or not being able to vote in person because of work or military service | England, Wales, Scotland
Education (full-time course away from home) | England, Wales, Scotland
Armed Forces | England, Wales, Scotland
Crown/British Council | England, Wales, Scotland
Spouses (marriage and civil partnership) of proxy applicants (armed forces, educational courses, employment) | England, Wales, Scotland
Low digital skills | England, Wales, Scotland
Overseas UK voters | Europe, Commonwealth
Users with access needs | Europe, Commonwealth
Users whose first language is not English | England, Wales, Scotland
Users without a permanent address | England, Wales, Scotland
Electoral Registration Officers | England, Wales, Scotland
Scottish and Welsh electors | Scotland, Wales

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used a range of methods to understand how well their designs performed in usability testing
  • the team has a well-segmented understanding of the users of their service. They worked hard to include users from across these segments in their usability testing
  • the team gave a lot of thought to how the service will affect back-office processes, and demonstrated an in-depth understanding of the staff who will be affected by their service
  • the team engaged with the universal barriers framework as a way to understand how different users may experience their service

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct significant testing with users with access needs. There are concerns that the online implementation of the service creates a number of overlapping accessibility issues which will significantly disadvantage users with disabilities
  • consider proxy voters as a primary user group and research with them accordingly. The panel is concerned that proxy users were not involved in testing or research, though the service uses their data, relies on their compliance, and imposes legal duties on them. The team needs to understand how to design for proxy users so that their needs are met

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has reviewed the 7 existing proxy application forms to identify the data captured across the entire set
  • the team has been working with EROs to explore how the service will interact with their existing processes, such as exporting data to local CRMs or enabling them to turn off the portal’s templated messages
  • the team has been speaking with other teams, including GDS, the GOV.UK sign in team and the DfT Blue Badge service

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how the proxy elements of the proxy voting service work with existing ERO processes. For example, what if the proxy details provided by the elector are incorrect? When will the mistake come to light? How can the details be changed? Are you confident you’re not introducing security/fraud risks?
  • make sure the service is making electors aware of any legal obligations they may be subject to, such as ensuring the contact details of the proxy are correct and checking that the proxy isn’t already acting on behalf of too many electors

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered a wide range of routes into the services and ways to promote the services, including social media and community groups
  • the team has considered non-digital routes for applying and for support
  • the team has worked with EROs to understand what their needs are from the service

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the offline support mechanisms available to users, such as the ‘get help applying’ phone line support and associated help scripts that may be used
  • actively test how users find and experience the routes that are offered as alternatives to a digital application

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have looked at the time it takes electors to read and comprehend information on a page
  • the team has provided support (‘get help applying’ and ‘continue on paper’ information throughout the journey)

What the team needs to explore

Before their next assessment, the team needs to:

  • explore reducing the amount of information on the start page and making it clearer to electors what their alternative application routes are, especially for those who may not be able to provide signatures
  • make sure that the right information is given at the right time, based on user research insights. For example, are the ‘Get help applying’ and ‘Continue your application on paper’ links needed on every page in the service?
  • explore whether there are simpler approaches to some pages, such as the ‘Upload a photo or a scan of your […]’ page, which currently has 3 details components and a link, and the ‘If we have questions about your registration …’ page, which has lengthy hint text and may be simplified by turning it into “How can we contact you if we have questions about your registration?”

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has accessibility testing plans in place for Beta
  • the team has tested potentially challenging points, such as the signature upload, with a range of users
  • the team has provided additional support for the signature upload, including images
  • the team understands the fallback scenarios if a user is unable to meet any of the document-based identity confirmation requirements

What the team needs to explore

Before their next assessment, the team needs to:

  • design and test with users with access needs to make sure that the online service and the wider service design account for their needs as fully as possible
  • explore how to ensure that the digital signature capture element of the service, and its alternative pathways, are as painless as possible for all users
  • explore, design and test the offline support mechanisms available to users, such as the ‘get help applying’ phone line support and the associated help scripts that may be used

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team appears to have a good mix of skills and capabilities
  • DLUHC have developed good ways of working with their supplier over the development of this service and the other two which have recently been assessed

What the team needs to explore

Before their next assessment, the team needs to:

  • have a plan for their eventual transition to live running
  • work on knowledge management and transfer to ensure that they are able to continue their good work on the service when the incumbent suppliers are rolled off

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has established clear ways of working, and can describe and reflect on these
  • the team is empowered to solve problems and deliver the service, but also has clear escalation routes and governance
  • the team has a clear view on what problems they are trying to solve with this service, and have aligned around a common goal

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is committed to continuous research and incremental design improvements
  • the team is able to learn from past work and adapt its approach to solve new problems

What the team needs to explore

Before their next assessment, the team needs to:

  • understand how they may need to change the design of their service in future to match all of the ‘proxy voting’ use cases
  • continue to review and make changes to their content based on user feedback, and with dedicated content design input

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service retains as little data as possible before handing it off to ERO services
  • the team is thinking about how proactive activity monitoring can be integrated into the service to protect users from fraud
  • the team has thought deeply about the trade-offs in the verification approach

What the team needs to explore

Before their next assessment, the team needs to:

  • evaluate the cost-benefit of exposing counter-fraud data to the EROs

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified KPIs and has benchmarked these as far as possible
  • the team has identified which of these KPIs are a priority for measuring service performance
  • the team plans to run analytics across the three election services to get a more holistic view of how their services are performing

What the team needs to explore

Before their next assessment, the team needs to:

  • continue exploring ways to benchmark their performance in the existing service

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team chose technologies that can scale quickly to cope with the necessarily “spiky” traffic profile of the service
  • the service is decoupled from downstream consumers to prevent cascading failure and allow consumers to process messages at their own rate
  • the authentication parts of the service use off-the-shelf components to provide features like a second factor

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has published the code that it is safe to publish
  • the team’s private code is kept private for good reasons

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether it would be possible in future to publish the remaining closed-source code, and how to make that happen

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used off-the-shelf design system components
  • the service uses platform-provided features for login and for issuing client certificates rather than deploying its own
  • the team architected their service around queues for decoupling, meaning the user will experience a faster frontend journey rather than waiting for synchronous calls
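The report does not name the queueing technology the team used. As a minimal, hypothetical sketch of the decoupling pattern described above, the frontend can enqueue an application and acknowledge it immediately, while a downstream consumer (for example, an ERO integration) drains the queue at its own rate; Python's standard library queue illustrates the idea:

```python
import queue
import threading

# Illustrative only: the queue stands in for whatever message broker the
# service actually uses, and `processed` stands in for the ERO handoff.
application_queue = queue.Queue()
processed = []

def submit_application(application: dict) -> str:
    """Frontend handler: enqueue and acknowledge without waiting."""
    application_queue.put(application)
    return "received"  # the user gets a fast response

def consumer_worker():
    """Downstream consumer: drains the queue at its own pace."""
    while True:
        application = application_queue.get()
        if application is None:  # sentinel to stop the worker
            break
        processed.append(application)  # stand-in for handing off downstream
        application_queue.task_done()

worker = threading.Thread(target=consumer_worker)
worker.start()

submit_application({"elector": "A", "type": "postal"})
submit_application({"elector": "B", "type": "proxy"})

application_queue.join()     # wait for the consumer to catch up
application_queue.put(None)  # stop the worker
worker.join()
```

Because the frontend never blocks on the consumer, a slow or briefly unavailable downstream system does not slow the user's journey, which is the benefit the panel highlighted.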

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure it’s following the GOV.UK Design System throughout both services, except where they have user research to support deviation (for example, the findings for the ‘before you start’ information on the start pages). This includes content patterns, such as using ‘First name’, not ‘First (given) name’

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses AWS Lambda as a way of coping with irregular traffic profiles
  • the support plan in the run-up to the registration deadline is tailored to expected demand from the public
  • the service leans on other teams to provide out-of-hours security monitoring in their platform accounts
  • the service stores state in such a way that in the unlikely event of an inability to publish downstream, a user should be able to refresh and try again
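The report does not describe the team's storage implementation. As a hypothetical sketch of the refresh-and-retry behaviour described above, the key property is that the application state is saved before any attempt to publish downstream, so a failed publish loses nothing and a retry is idempotent:

```python
# Illustrative only: `saved_state` stands in for a persistent store and
# `published` for the downstream system.
saved_state = {}
published = []

def save_application(ref: str, application: dict) -> None:
    """Persist the user's application before attempting to publish."""
    saved_state[ref] = application

def publish(ref: str, downstream_available: bool) -> bool:
    """Attempt to hand the saved application downstream.

    Returns True on success. On failure the saved state is untouched,
    so the user can refresh and trigger another attempt.
    """
    if not downstream_available:
        return False  # nothing lost: saved_state still holds the data
    if ref not in (p["ref"] for p in published):  # idempotent: no duplicates
        published.append({"ref": ref, **saved_state[ref]})
    return True

save_application("APP-1", {"elector": "A", "type": "postal"})
publish("APP-1", downstream_available=False)  # downstream outage: safe failure
publish("APP-1", downstream_available=True)   # user refreshes; retry succeeds
```

The idempotency check also means a double refresh cannot create a duplicate application downstream.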

What the team needs to explore

Before their next assessment, the team needs to:

  • run a light incident drill to make sure incident management processes are in place and well understood

Updates to this page

Published 29 June 2022