New Secure Access - alpha

The report from the alpha reassessment for DfE’s New secure access service on 8 December 2017.

Service Standard assessment report

DfE Sign In

From: Central Digital and Data Office
Assessment date: 22 September 2017
Stage: Alpha
Result: Not met
Service provider: Department for Education

To meet the Standard the service should:

  • Ensure the team have explored the broad user needs for their service, and the problems they are trying to solve.
  • At present, the team is focused on refreshing or replacing an existing solution, rather than developing a user-centred solution to well-understood needs in schools and related organisations.
  • The team should engage with GDS, and its Government as a Platform (GaaP) programme in particular, on the potential to use GOV.UK Platform as a Service (PaaS).

About the service

Description

The service intends to make it easier for users in schools and related organisations to securely access DfE digital services. As currently defined, it will replace an existing service to provide role-based access control (RBAC) for a number of services DfE offers.

Service users

The users of this service are teachers and administrators in schools, colleges, academy trusts and DfE itself.

Detail

User needs

Currently, users access DfE services using a username and password that is generated for them. Users are unable to self-serve to manage their own usernames and passwords, which can result in failed logins when they forget their credentials or when the credentials expire after 30 days; users then need to have their username and password reset. DfE Sign In aims to resolve these problems by giving users access to approximately 60 different services through a single account whose username and password they can manage themselves.

The team has conducted research to identify the key users of the service. This involved a feedback survey to understand pain points, followed by interviews with 49 users. The users generally fall into two categories - users and approvers - and are found across the following job types: back office managers, school business managers, and some teachers. The team has produced personas to help build empathy, and to ensure that the service considers these users throughout its development.

Overall the research conducted was thorough, and the whole team was involved in all aspects, which was pleasing to see. The research is currently undertaken by the team’s designer as a dual role, and although this has produced some interesting developments at the prototyping stage, the panel felt it was reducing the effectiveness of both the design work and some of the research. Splitting out this role would enable the team to broaden the scope of its research, while allowing the designer to concentrate on developing the prototype.

The user needs for this service are largely based around the need to self-serve; however, the panel felt it would be worthwhile to research other methods of authentication and password management. The panel also felt the team should widen its scope to understand broader user needs by researching how services are made available to schools, users’ awareness of those services, why they currently sit across so many domains, and how they could be better joined up.

Team

The service team is made up of a small number of permanent civil servants, with the bulk of the team provided by a supplier. Aside from the service manager, the civil servants are relatively inexperienced and would benefit from additional experienced digital practitioners among the permanent staff. The current model will need significant effort to ensure effective knowledge transfer from the supplier’s team, and may not be scalable or sustainable when the team moves into Beta.

The team would benefit from separating the design and user research roles, as this blend of roles can potentially undermine the effectiveness of prototyping and insight gathering. Particularly at the Alpha stage, where multiple hypotheses are tested and rely on critical assessment during user research, a combined role can become conflicted.

There was some confusion over the establishment of a ‘devops’ role. Devops is a way of working more than a specific role, and the role as defined was more closely aligned to a traditional ‘webops’ person, provisioning and managing development environments and supporting technology. It’s possible that the service would be a good candidate for GDS’s PaaS service, which could mitigate the need for a dedicated webops role, potentially creating room for an additional user researcher or designer. We would strongly encourage the team to discuss the potential to use PaaS with GDS.

The panel were concerned that for a combined Discovery/Alpha project, the work completed was too narrow, and hadn’t more broadly considered the problems users faced and potential solutions, instead focusing on the development of a platform or portal solution. More detail on this is included in the user research and design sections of this report.

In terms of agile working practices, the team has established a good cadence, and is clearly working effectively together, deploying iterative improvements to the service based on research findings. They have access to appropriate tools and technology, and are supported by senior leaders who empower the service manager to take appropriate decisions.

Technology

The DfE Sign In team demonstrated good coverage of the main functional and non-functional concerns for a service of this kind. The security aspects were adequately addressed, and building on a well-proven open source project brings additional security benefits; however, a more precise, service-specific threat model is needed for Beta. Bringing a dedicated security specialist into the team should cover those gaps.

The team has chosen a modern toolset of open source technologies and has a proven CI pipeline it intends to use and evolve in Beta. There is a good understanding of the team structure needed to support development in Beta, as well as of the technology and automation stack. The technical team has an identifiable lead who is well supported by the business.

The current choice of hosting is Azure; however, the service team is developing a cloud-agnostic deployment model.

The team has looked at common components and services from both government and industry, and is open-minded about re-use. As strong consideration of GaaP services is a necessity for passing Beta, we recommend the team meets GaaP engagement managers in GDS to work out the suitability of adopting as many shared government services as possible. GOV.UK Verify at level of assurance 1 (LOA1) looks like a good fit for the objectives the service is addressing, so closer analysis should be done before deciding on integration with Verify.

The panel was pleased to hear that the team has adopted a coding-in-the-open-by-default policy, and it would certainly be beneficial to demonstrate the codebase’s progress in an open repository during Beta.

Design

The service is an identity management system for services that DfE and its agencies provide to school staff and administrators. The team demonstrated two approaches they had taken in the design of the service — direct access to the target service from a GOV.UK start page, and a journey that leads to a services portal listing all the available services.

One reason the team put forward for a portal is to raise awareness of the existing services. As the list of services is behind the login, and therefore not available to users without current access or via a search engine, this doesn’t feel like the best way to advertise these services. Additionally, users are task-focused: when they log in, they will be looking to complete a task, and presenting them with a long list of services with undescriptive names will hinder them in completing it.

GOV.UK should be the starting point for all services, and they should be discoverable from search engines.

The prototypes currently use DfE branding. The service should be part of a seamless journey from GOV.UK to the target service, and should therefore be branded as part of GOV.UK.

It is recommended that DfE should conduct discovery into the services they are offering to schools, how the names of those services (eg COLLECT, Key 2 Success) affect the discoverability and awareness of the services and how they can offer a better joined up service to schools.

Analytics

At this early stage, the team have considered approaches to measuring the service and understanding success measures. As a component in other services’ user-journeys, this will be more challenging to measure than a more straightforward end-to-end user journey, but the panel was confident that the team were considering this as part of their development.

Recommendations

To pass the reassessment, the service team must:

  • Review the team make-up, and seek to separate the user research and design roles, potentially by adding an experienced civil servant into either a dedicated user research or designer role. This would improve the balance of the team, whilst helping to ensure that there is a robust challenge between design choices and insight from research.
  • Ensure that user research considers the whole experience and end-to-end journeys of users, rather than constraining research to the RBAC approach. This should include rough prototypes of alternative approaches to the problem, including third-party solutions such as password managers, and the use of organisations’ existing authentication mechanisms (eg Google or Microsoft Office 365). It would also be useful for the team to understand what lessons were learned during the ‘accessing GaaP services’ project in GDS, which considered similar issues across government.
  • Contact GaaP engagement managers in GDS, look at the potential for adoption across all products, and let the engagement managers know where any of the common services do not meet the DfE Sign In requirements and why.
  • Re-assess the need to build a custom user management UI component if user research shows no additional benefit beyond trivial user and access management; in that case, standard RBAC management software or a service could be used instead of building one.

The service team should also:

  • Work with a security specialist to build a suitable threat model, and work to secure against critical threats.
  • Design and test complex administrative scenarios, for example where multiple approvers each approve or reject an approval request.
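The multi-approver scenario in the final recommendation can be made concrete with a small sketch. This is illustrative only: the class, names, and resolution policy (any rejection overrides any approval) are assumptions for the example, not part of DfE Sign In, and the real rules would need to come from user research.

```python
# Illustrative sketch of a multi-approver access request.
# Assumed policy (hypothetical): any rejection rejects the request;
# otherwise a single approval is enough; no decisions means pending.

class ApprovalRequest:
    def __init__(self, requester, approvers):
        self.requester = requester
        # approver -> True (approved), False (rejected), or None (no decision)
        self.decisions = {a: None for a in approvers}

    def record(self, approver, approved):
        if approver not in self.decisions:
            raise ValueError(f"{approver} is not an approver for this request")
        self.decisions[approver] = approved

    @property
    def status(self):
        votes = [v for v in self.decisions.values() if v is not None]
        if any(v is False for v in votes):
            return "rejected"   # any rejection overrides approvals
        if any(v is True for v in votes):
            return "approved"   # otherwise one approval suffices
        return "pending"        # no decisions recorded yet

req = ApprovalRequest("teacher@school.example", ["head", "business_manager"])
req.record("head", True)
req.record("business_manager", False)
print(req.status)  # prints "rejected" under this assumed policy
```

Designing and testing such conflicting-decision cases explicitly would surface exactly the kind of ambiguity the panel asks the team to resolve.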

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Not met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform n/a
18 Testing the service with the minister responsible for it n/a

Updates to this page

Published 30 July 2018