Platform as a Service (PaaS) - Alpha Assessment

The report from the alpha assessment of GDS's Platform as a Service (PaaS) on 23 June 2016.

Stage: Alpha
Result: Met
Service provider: Government Digital Service (GDS)

The service met the Standard because it:

  • Identified the common need across government for centralised and automated hosting, deployment and testing of code so service teams can focus on development.

  • Assessed and tested multiple solutions during the alpha whilst keeping focus on their minimum viable product at this early stage.

About the service

Service Manager: Rory Hanratty

Digital Leader: Chris Ferguson

The service provides a central infrastructure platform and command line interface for ease of deployment.
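As an illustration of the kind of deployment workflow the platform is intended to enable, the sketch below assumes a Cloud Foundry-style command line interface (the cf tool); the API endpoint, organisation, space and application names are hypothetical examples, not details taken from the assessment.

    # A minimal sketch, not the team's actual tooling: it assumes a
    # Cloud Foundry-style CLI ("cf") is installed, and uses hypothetical
    # endpoint, organisation, space and application names.
    import subprocess

    def cf(*args: str) -> None:
        """Run a cf command and stop if it fails."""
        subprocess.run(["cf", *args], check=True)

    # Sign in to the platform's API endpoint (example URL only).
    cf("login", "-a", "https://api.example.cloud.service.gov.uk", "--sso")

    # Target an organisation and space for the deployment.
    cf("target", "-o", "example-org", "-s", "sandbox")

    # Push the code in the current directory; the platform builds, stages
    # and routes the application without the team managing any servers.
    cf("push", "example-app")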

Detail of the assessment

Lead Assessor: Scott Colfer

User research assessor: Angela Collins-Rees

Technical assessor: Al Davidson

Design assessor: Tim Paul

User needs

The service team were able to articulate key user needs and demonstrated an adequate amount of user research to meet the alpha standard. However, the team did not maintain regular user research throughout the whole alpha period. Without the intense effort during discovery and at the start of alpha, and again in the last month, the team would not have met point 1.

The team now have a permanent user researcher in place and, at the assessment, presented a robust research plan for their private beta. The panel was reassured to see that the team have a clear plan for addressing the research questions that will inform the private beta.

The panel believe GDS must lead the way on user research and set an example for other departments. Although the team understood the key user needs, the panel judge that the service would have benefited from having a researcher on the team throughout the alpha. However, the panel also acknowledge that technical platforms, such as PaaS, may require a slightly different team configuration compared with public-facing services.

Team

The team have not met the standard for point 3 because of the absence of a user researcher during part of the alpha. Platforms should be held to the same standard as public-facing services, and government departments should make professions like user research, content design and interaction design available to these teams throughout development.

The panel note that the PaaS team have worked hard to recruit a user researcher and recognise the need for additional roles in the team, such as a content designer.

Technology

The team made sensible technical choices during alpha, using open source products and trialling alternatives with users. Where they have needed to change code, they have submitted patches back to the public version of the product, ensuring the code is published and free to use.

The team have taken the security of their potential clients very seriously, engaging with both in-house and external penetration testers, and with a CESG architect. The panel notes that while there are still some data isolation issues to be addressed in beta, the team are sensibly restricting usage to services containing only public data until that work is complete.

The panel is confident the team are able to test their service fully and will be able to iterate with confidence during beta.

Design

The team demonstrated that most users are able to use the service first time, without assistance. However, further work is needed to understand the needs of less technically experienced users, to ensure the right type of support and documentation is provided. For example, consider designers who are familiar with editing front end prototypes but less familiar with setting up servers from the command line.

It’s crucial that the supporting technical documentation and command line messages are treated with the same level of attention as the rest of the service. The current documentation format is not consistent with the GOV.UK style. The team needs the support of an interaction designer and content designer for this. This is a requirement across the whole of the GaaP programme.

The team should revisit the service proposition (including the name) to check that it’s well understood across government, particularly in relation to other services like Crown Hosting.

Analytics

The team has been measuring KPIs during alpha and the panel is confident they will be iterated and improved based on what’s learned during beta.

Recommendations

User needs

The team needs a dedicated user researcher to ensure the platform is built to meet user needs and appropriate user testing takes place to confirm the needs are being met.

The panel was pleased to learn the team had identified some high-level user needs from their discovery research and recommends the team presents them at the start of their next assessment.

The panel was concerned to learn that there had been a significant period when a user researcher was unavailable to support the team’s ongoing work. While the remaining members of the team were pragmatic in their approach to gathering user feedback, this raised concerns about the research approach, the user groups interviewed and the subsequent knowledge gaps.

The panel learnt that, prior to the assessment, the team had conducted a round of usability testing with 15 junior developers. The panel would like to see further evidence of a much deeper understanding of their users, not just those building the service but also those maintaining and supporting it. As part of their research and usability testing the team must include users and potential users with access needs and those who use assistive technology.

The service currently does not support .NET applications. This will prevent some areas of government from using the PaaS. The panel recommend the team engage with these areas of government to understand their needs and incorporate them into their wider programme of work.

Team

GDS should ensure that it has the people and skills available for its teams to meet the service standard. A lack of the right people and skills was a significant blocker for the PaaS team, who were forced at times to do the best they could with the resources they had.

Technology

While the team have so far concentrated their efforts on making the service highly available, the panel recommend that they devote similar effort during beta to their support model, onboarding process and other non-technical aspects of their service wrapper. The panel also recommend continued thinking about the isolation of logging data, in relation to the kinds of client service they can support, and user testing of the command line interface itself as well as the documentation.

Analytics

The team should consider how to measure the amount of money saved as a result of service teams using the PaaS.
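One way such a measure could be framed, sketched below purely as an illustration: compare each tenant service's estimated cost of running its own hosting against its share of the platform's running costs. All figures and field names are hypothetical assumptions, not data from the assessment.

    # Illustrative sketch of a possible savings measure; every figure and
    # field name here is a hypothetical assumption, not assessment data.
    from dataclasses import dataclass

    @dataclass
    class TenantService:
        name: str
        self_hosting_cost: float  # estimated annual cost if the team ran its own hosting
        paas_usage_cost: float    # the service's annual share of platform running costs

    def estimated_annual_saving(tenants: list[TenantService]) -> float:
        """Sum, across tenant services, of avoided hosting cost minus PaaS usage cost."""
        return sum(t.self_hosting_cost - t.paas_usage_cost for t in tenants)

    # Hypothetical worked example with made-up figures.
    tenants = [
        TenantService("service-a", self_hosting_cost=120_000, paas_usage_cost=30_000),
        TenantService("service-b", self_hosting_cost=80_000, paas_usage_cost=25_000),
    ]
    print(f"Estimated annual saving: £{estimated_annual_saving(tenants):,.0f}")  # £145,000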

Next steps

To pass the next assessment, the service team must:

  • Ensure that a user researcher is available to the team during beta and broaden their research to include users with access needs

  • Focus on the broader service design of the PaaS, with particular attention to the design of onboarding and documentation

  • Define the ‘market’ for PaaS and ensure that the technology and the service wrapper are resilient enough for growth

The service team should also:

  • Understand the needs of development team members less proficient in deployment (eg designers)

  • Put together a strategy for marketing and encouraging the adoption of PaaS

  • Plan how they will manage the expectations of early adopters who want to use PaaS but need to know what they can use and when they can use it

Digital Service Standard points

Point  Description                                                                       Result
1      Understanding user needs                                                          Met
2      Improving the service based on user research and usability testing               Met
3      Having a sustainable, multidisciplinary team in place                             Not met
4      Building using agile, iterative and user-centred methods                          Met
5      Iterating and improving the service on a frequent basis                           Met
6      Evaluating tools, systems, and ways of procuring them                             Met
7      Managing data, security level, legal responsibilities, privacy issues and risks   Met
8      Making code available as open source                                              Met
9      Using open standards and common government platforms                              Met
10     Testing the end-to-end service, and browser and device testing                    Met
11     Planning for the service being taken temporarily offline                          Met
12     Creating a simple and intuitive service                                           Met
13     Ensuring consistency with the design and style of GOV.UK                          Met
14     Encouraging digital take-up                                                       Met
15     Using analytics tools to collect and act on performance data                      Met
16     Defining KPIs and establishing performance benchmarks                             Met
17     Reporting performance data on the Performance Platform                            Met
18     Testing the service with the minister responsible for it                          Met
Published 22 December 2016