Renew a patent, trademark or design alpha assessment report

The report for IPO's 'Renew a patent, trademark or design' alpha assessment on 19 July 2018.

From: Central Digital and Data Office
Assessment date: 19 July 2018
Stage: Alpha
Result: Not met
Service provider: IPO

To meet the Standard the service should:

  • set a clear outcome for the wider service
  • use this outcome to target further user research, to ensure that the key user needs are understood across all key user groups
  • research and develop appropriate support routes for users with low digital skills, and increase research with users with access needs

About the service

Description

This is part of a wider service about Intellectual Property (IP) rights. This system covers renewing UK patents, trade marks and designs. It consolidates three separate renewal processes (two online, one paper) into one system. Where these are renewed online, and within the given deadline, the new system will return a result automatically.

Service users

The users of this service are the citizens, businesses and intermediaries in the UK and overseas who apply for or manage intellectual property rights (patents, trademarks and designs).

User needs

The panel was pleased to see that the team used a good variety of methods to learn about their users during alpha. They have identified usability problems and iterated the prototype in response. However, surveys were the only method used during discovery, and the panel is concerned that this means the personas reflect wants rather than needs. The team seems to have learnt much during alpha, and it would be good to see the personas iterated to take into account what they’ve learnt.

The team has created a long list of user needs. It was unclear which of those needs this alpha will meet and how the team will know they’ve met them. It seemed the alpha was trying to meet the needs of a broad and disparate range of users but had focussed research on bulk users. The panel was concerned that in doing so they may be solving problems for some users and creating them for others. It would be good to see the user needs refined and iterated so that the team has a clearer idea of the scope of the service before going into beta.

The panel was pleased to see that the team had taken into account the needs of users who are renewing IP in multiple countries, and are working with colleagues to make the journey consistent across national boundaries.

The panel was pleased to see that the team had considered assisted digital, digital inclusion and accessibility at alpha. Given that the personas include a broad age demographic it’s important for the team to do more research with a broad range of users with different digital skills and to plot them on the digital inclusion scale. Doing this will give the team a clearer understanding of the problems their users may experience with a digital service. It may also help them to identify problems that the new service can solve for this group. The same goes for accessibility. It’s important that the team do more research with real users of the service who have a range of access needs.

Before the team goes into private beta, the panel recommends that they use what they’ve learnt from alpha to refine their personas and list of user needs, and that they prioritise the personas: whose problem are they solving first, how will that affect the others, and how can they mitigate any negative effects? This will help the team create a shorter, more focussed list of user needs that they will be able to use to measure the success of their private beta.

Team

The team is made up of civil servants, who are all permanently assigned to the team. They will get more resources for the beta.

The team is co-located, and discusses user research findings as a team. They include the wider business in what they are doing – including by running breakfast sessions and inviting staff to observe user research.

The team is part of a complex wider governance structure. The service owner feels empowered, but there has not yet been an instance where they have needed to make a significant decision without the wider governance. The chair of the wider governance and the CTO have both attended show and tells.

The panel was concerned that the scope of the work to date was too narrow, and hadn’t explored the key problems users face, instead focussing on replacing existing solutions. The team has prioritised work for an MVP, sensibly excluding functionality such as user authentication where there is no clear user need at present. There is no clear plan for the private beta yet.

Technology

The alpha service is an HTML prototype only, which has been built on the GOV.UK Prototype Kit. The architecture for the beta has been designed, showing a three-tier structure with the frontend running on AWS and the API layer on an IBM cloud, linking to on-premises backend services using an enterprise service bus.

This separation of concerns between frontend and business logic is sensible, allowing the frontend application to remain simple. However, although the on-premises part of this structure is out of scope of this assessment, the team should consider closely whether this complex architecture and the use of an ESB is appropriate given the services they provide. Equally, the use of proprietary IBM systems at the API and backend layers needs careful examination during the beta to ensure that no preferable open-source alternatives exist.

The frontend itself will be written in Scala using an existing forms platform developed by Equal Experts, and the API layer will be in Java. The team have committed to putting all code in the open via GitHub, and they should ensure that they do so from the start of development work. They plan to use GOV.UK Notify to send email confirmations, and GOV.UK Pay will be investigated as an option for the payment process.
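For illustration, a request to GOV.UK Notify's send-email endpoint carries a recipient address, a template ID and a personalisation object. The sketch below builds such a payload as a plain dictionary; it is a minimal sketch only, and the template ID, function name and `renewal_reference` personalisation key are hypothetical placeholders, not the service's actual values.

```python
# Sketch: assembling the JSON body for a GOV.UK Notify email request
# (POST /v2/notifications/email). The template ID and the
# "renewal_reference" personalisation key are placeholders for
# illustration, not values from the real service.

def build_confirmation_payload(email_address: str, renewal_reference: str) -> dict:
    """Build the request body for a renewal confirmation email."""
    if "@" not in email_address:
        raise ValueError("invalid email address")
    return {
        "email_address": email_address,
        # Placeholder template ID - each Notify email template has its own UUID.
        "template_id": "00000000-0000-0000-0000-000000000000",
        # Personalisation keys must match the placeholders in the template.
        "personalisation": {
            "renewal_reference": renewal_reference,
        },
    }
```

In practice the team would send this via Notify's official client library rather than constructing raw requests, but the shape of the payload is the same.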

The team are aware of issues around handling data. Most of the data for this service is public via the registers; care has been taken to restrict the personal data requested from the user to what is required to send confirmations, and data is not stored at the frontend. Security tests are planned for the beta.

The organisation has a documented ITIL-based process for service management and support at different stages and will be following this as the service progresses from alpha. The team follow a Kanban process for development. They have multiple environments for each part of the service. However, they are currently tied to a three-week release cycle; they must ensure that they develop the capability to release more frequently, especially as they iterate the frontend in response to user research.

Design

It is good to see that the team has tried different approaches before settling on this journey. However, the service team should test the full end-to-end journey, including payment pages, to ensure that users can get through first time.

The panel is concerned that the start of the service may be confusing to users. The start page is hidden in several pages of guidance. The team should engage with the GOV.UK content team to get a start page that is more in line with GOV.UK.

While the team has designed a service that is largely consistent with GOV.UK patterns and styles, the panel would like to see evidence of varied approaches tested before deviating from the design patterns. An example is the ‘Check renewal details’ pages. Aside from deviating from design patterns, they are not consistent with the ‘Check errors’ pages in the previous step.

It was good to hear the team are doing the hard work to make it simple for users by showing them what fines or fees are due, rather than expecting the users to know that information.

The service team should ensure that all the information users need is available at the time of need. For example, in the initial step ‘What do you want to renew?’, information relevant to all of the options is hidden within the detail of individual radio buttons.

The panel recommends the team get in touch with the GOV.UK Notify team for patterns around uploading a spreadsheet. The Notify team has done extensive research and iteration on this interaction, which would benefit the service team.

It is unclear to users that changing their contact details does not change the details on the register. It is good that the team noticed this in research, and they should aim to make that journey clearer for users.

Analytics

The team are planning to measure the mandatory KPIs, using the existing online service for patents as a baseline. They will publish the data on the Performance Platform.

However, there was no clear outcome for the service as a whole. The team must work through the guidance on measuring success, and make sure they have clearly articulated the outcomes that they want to achieve from the service (which is likely to be wider than just renewals), and then use this to set appropriate KPIs for this part of the journey.

The team must also consider how they will collect analytics. They currently use Matomo for other services (with different needs). The team must consider whether this is the right solution for this service.

The team should engage a performance analyst for the service.

Recommendations

To pass the reassessment, the service team must:

  • refine the user needs, to avoid a focus on bulk users
  • ensure that all user research participants are plotted on the digital inclusion scale, and use this to ensure that digital inclusion needs are better understood so that appropriate support can be offered
  • increase the focus on access needs
  • gather evidence from the existing products - including analytics from the existing online services, and data on the existing assisted digital need
  • engage with the GOV.UK content team
  • address the findings of the content review
  • develop the ability to release the frontend more frequently
  • articulate the key outcomes the wider service is trying to achieve, and use this to set KPIs for this system. We recommend doing this via a performance framework workshop

The service team should also:

  • refine the personas to better reflect the user needs, based on the comments above
  • start planning for how they will run private beta
  • understand the wider service - including researching with staff (especially where they process applications or respond to calls, etc.)
  • develop a plan for what happens when the system is down
  • get some time from a performance analyst

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Not met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Not met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Not met
13 | Ensuring consistency with the design and style of GOV.UK | Not met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Not met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met

Updates to this page

Published 11 February 2020