Check or Amend your Property Data - Alpha Assessment

The report from the alpha assessment for VOA's Check or Amend your Property Data service on 18 April 2016.

Stage: Alpha
Result: Met
Service provider: Valuation Office Agency

The service met the Standard because:

  • The alpha has been effectively used to determine the overall approach for beta development.
  • The team has a well defined set of user groups, and has tested the service with assisted digital (AD) users.
  • The team has thoroughly considered service performance metrics beyond the four mandatory key performance indicators (KPIs).

About the service

Service Manager: Margaret Whitby
Digital Leader: Mark Dearnley

The service will allow users to search for and find their business property and view a summary breakdown of how the rateable value for that property is determined by the VOA. Where the rateable value is believed to be inaccurate, users can challenge the rate and provide evidence to support this challenge.

Detail of the assessment

Lead Assessor: Tom Moore

Researching and understanding user needs [points 1, 2, 10]

The service team has used the existing research of the ‘check’ service team, and supplemented this with new user research for the ‘amend’ component demonstrated at the assessment. This research includes lab sessions, business visits, focus groups, and remote sessions.

The team has a good understanding of a clearly defined set of user groups. These range from large businesses acting through an agent or estates manager, to smaller businesses and individuals that may need to pay business rates on much smaller properties.

For assisted digital (AD) support, the team has identified some users with lower digital skills and access, mostly in micro and small businesses. The team has good evidence of the likely scale of demand for AD support, drawn from the related ‘check’ service and from the number of users downloading the current form compared with those asking for a printed copy.

AD user research has been limited, and the proposed support options are either aimed at people with some digital skills and access (eg FAQs, webchat), or are based on evidence of needs from the ‘check’ service (eg phone support). The service team must do more research with users who have the lowest levels of digital skill and access, and fully test the proposed support options to make sure they work well for users.

At the time of the alpha assessment, the team had done no research or accessibility testing with disabled users. While the panel recognises that full accessibility testing is not possible for simple prototypes, the service does include elements that could be a barrier for some users (eg a letter with a PIN). The team must start researching with disabled users as soon as possible, and not wait until they commission an accessibility audit.

Running the service team [point 3]

Responsibility for the overall service is split between teams within VOA. These teams are co-located and are working in a way that allows findings and experience to be shared. The setup also allows coverage of roles within the team, as required.

The lack of a dedicated content designer in the early stages of the project is apparent in the service. The interface of a service like this is mostly words; content design is one of the most important roles and must be continuously involved.

Technical leadership and expertise is a concern. Most of the technical expertise is external, and the delivery manager and architect roles are shared. With insufficient expertise on the team, the service manager will be reliant on advice from suppliers and advisors outside the team. That advice may not always be available, may favour the supplier, or may fail to take account of the priorities and domain knowledge of the service team.

Designing and testing the service [points 4, 5, 11, 12, 13, 18]

The overall service design follows a three-stage process: check, challenge and amend.

The team had tested both wireframe and HTML prototypes with users and could describe iterations made to specific features within the service. There was, however, limited evidence of any substantial iteration to the overall user flow. For instance, the service design is greatly complicated by the need for users to set up a separate account with the VOA. The VOA account offers no discernible benefit to the user over logging in with Government Gateway and presenting records linked to the user’s Gateway account. The team hadn’t demonstrated any alternatives to this user journey; experimenting with alternative journeys will allow the team to find the best possible route.

The service broadly follows GOV.UK design patterns, although there were instances where new patterns were being tested. This is fine, but the team must make sure that any new features are thoroughly tested with users and are accessible. In the beta these will need to be developed so they work on mobile and with JavaScript disabled.

The team should be careful, as the hierarchy of information throughout the service is slightly confused. Calls to action such as ‘Save for later’ and ‘Back’ buttons are often needlessly duplicated on the same page, or placed in unusual positions. The various confirmation pages often fail to highlight the most important piece of information; for instance, once a change has been submitted, the information about needing a PIN to log back in is buried in the third bullet point at the bottom of the page.

The team will also need to develop the content of the service. Early prototype testing should have focussed more on the clarity of the content. Content within the service was at times verbose, leaving users unclear on what to do next and on which of the information they provide would be made publicly available (eg applications for hardship relief). It’s imperative that content is reviewed as part of each sprint, and that wording is tested as part of user research.

Technology, security and resilience [points 6, 7, 8, 9]

The team has evaluated the risk of fraud and has developed a strategy for mitigating this risk, involving authentication steps and transaction monitoring.

A risk remains that any user, including those without authority, can easily link themselves to a property and view details that are not published on the public register. The service manager and SIRO have accepted the impact of this risk on the basis that the penalty scheme for making such a false declaration provides an adequate deterrent, and the data that can be accessed is not very sensitive.

This risk, its mitigations and its impact should be reassessed continually, and the panel will want to look again at its status at the beta assessment. The team will need to consider carefully which data can be accessed via the online service, given the ability for individuals to self-assert their entitlement to view the information. Additionally, the team will need to consider, and if appropriate mitigate, the risk of a small number of Gateway accounts being used to download all the non-published data for all properties.

The tax platform provides a good technology base for the service, and the team is committed to using the available features to automate builds and to follow a process for promoting code from development, via staging, to production. An environment for performance testing is also available.

The team is committed to coding in the open and intends to publish its code continuously during the next phase of development. No exclusions to this policy have been identified, so at the beta assessment the panel hopes to see most, if not all, of the service code in public repos.

The team explained that they intend to use GOV.UK Notify to keep users informed on the progress of their applications, and GOV.UK Verify to establish the identity of applicants in the future. Additionally, we discussed the potential to publish the public version of the data as a public register, and to consume registers from other departments.
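As an illustration of the kind of integration the team described, the sketch below shows how a progress update might be sent through GOV.UK Notify’s official Python client (notifications-python-client). This is a minimal sketch only: the API key, template ID, email address and personalisation fields are placeholders, not details of the VOA service.

```python
# A minimal sketch of sending a progress update email via GOV.UK Notify.
# All identifiers below are illustrative placeholders, not VOA service details.
from notifications_python_client.notifications import NotificationsAPIClient

client = NotificationsAPIClient("api-key-goes-here")  # API key issued to the service by Notify

response = client.send_email_notification(
    email_address="ratepayer@example.com",                # the applicant's email address
    template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",   # a Notify email template (placeholder ID)
    personalisation={
        "reference": "CHK-000123",                        # hypothetical application reference
        "status": "Your check case has been received",
    },
)
print(response["id"])  # Notify returns an ID that can be used to query delivery status
```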

Improving take-up and reporting performance [points 14, 15, 16, 17]

The service in development is new, and is offered online only, with telephone support.

The team is looking to promote the service via local authorities, and to promote the channel through a number of business groups (eg the Federation of Small Businesses, the British Retail Consortium, licensing bodies).

In addition to the four mandatory KPIs, the team has given considerable thought to other meaningful metrics.

Recommendations

The service team must:

  • Remove the need to create an account with VOA. Login to Government Gateway is sufficient to declare connections to properties, see properties and submit updates.
  • Begin user research and accessibility testing with disabled users. Although some accessibility testing must wait until the service is fully developed, research with disabled users to understand access needs and potential barriers must be continuous, and some accessibility testing is possible with prototypes.
  • Where possible, ensure continued involvement in and influence over all aspects of the service (eg design of AD support, API connections). These elements shouldn’t be developed independently of the service that will rely upon them.
  • Engage with the GOV.UK Verify team to clarify the identity needs for the service and set the service up for Verify login if appropriate: govukverify_engagement@digital.cabinet-office.gov.uk
  • Decide which data to show to logged-in and non-logged-in users. Given that the alpha has found no way to check an individual’s entitlement to view the “private” data in real time, any data shown to logged-in users must be considered “public”, ie available to anyone who is motivated to view it. Alternatively, if a method can be found to determine entitlement progressively before showing sensitive details, this would also meet the Standard.
  • Determine the impact of the entire set of information available via the service being downloaded and published on the internet. Implement countermeasures as appropriate.

The service team should also:

  • Engage with the Registers team and GOV.UK Notify.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 3 January 2017