Award a Contract for Goods and Services - Alpha Assessment
The report from Crown Commercial Service's Award a Contract alpha assessment on 3 November 2020
Service Standard assessment report
Award a Contract for Goods and Services
From: Government Digital Service
Assessment date: 03/11/20
Stage: Alpha
Result: Not met
Service provider: Crown Commercial Service (CCS)
Service description
This service aims to let a user undertake the end-to-end process of awarding a contract. The process of deciding which framework, or lot within a framework, to use is the subject of a separate service.
Service users
This service is for procurement professionals in central government, and potentially for those in related areas (including local government, health and the third sector). It aims to provide a simple, digital, end-to-end user journey that facilitates the purchase of goods and services where the price is not known or fixed up front and further procurement activity or competition is needed.
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team had undertaken 35 moderated user research sessions, demonstrating four iterations of the clickable prototype. The team had also used other research methods such as in-depth interviews and surveys
- they had segmented their users into six persona types
- they had identified needs from users such as a need for a more lightweight pre-market approach
- the team had begun to engage with the disability network in the civil service as a potential pool of users for accessibility research
- they involved the whole team in research by allowing team members to observe remote user research sessions
What the team needs to explore
Before their next assessment, the team needs to:
- research the end-to-end user journey, covering all the steps from a user's first contact with CCS onwards. Diary studies may be appropriate where long timescales and complex process flows militate against covering the end-to-end journey in 45-minute sessions. The team should also undertake focused research on the points where one journey transitions to the next, to ensure the experience is seamless for the user. This could involve running joint user research sessions with researchers working on other areas of the end-to-end journey
- treat the commercial organisations providing services as users too, and undertake research with them
- involve users who have had no previous contact with CCS in research (including surveys, moderated sessions with prototypes and so on). The team had involved existing or known users to understand the needs for this service but there is a risk here of ‘survivorship bias’
- undertake further research on what makes a buyer's own procurement service more attractive to potential users, and show how the team intends to meet those needs. This is key to increasing take-up of this service
- consider the difference between organisational and user needs. Increasing the spend undertaken through CCS, or making CCS a default route to market, are organisational needs; the team should research what user needs CCS must address in order to meet these organisational needs. For example, user needs might be for a process that is quick, that complies with legislation, or that enables demonstrable cost savings to be made. Until user needs are understood, it will be difficult to provide services that meet them
- produce a research plan for the next phase. This includes a plan to work with a range of accessibility users in both the buyer and supplier groups to ensure the service can be used by everyone
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team had used a service designer to undertake work on the user journey, and had an understanding of the five steps (from 1. Discovery to 5. Award) that a user would travel through
What the team needs to explore
Before their next assessment, the team needs to:
- recognise that the current service does not solve a whole problem for the user. As the service team acknowledged, the service presented only covers the middle part of the end-to-end user journey, and no thought has yet been given to how to integrate with the other parts of that journey. Asking the user to re-key, or effectively re-run, their journey from the "Find a Thing" service as the first step of this service is not a viable route forward. As a priority, the service team should work with the teams responsible for Find a Thing, Buy a Thing and Conclave to jointly agree a roadmap towards an end-to-end user journey that provides functionality to users in segments they can use without any additional support, help or incentive. This may mean concentrating resources elsewhere to ensure that Find a Thing, or the Conclave solution for identity assurance and account management, is available before this service can be launched
- as part of understanding the full end to end user journey, the service team need to understand the points in the journey where users may depart and move procurement to other platforms, in order to design a service that can handle this
- undertake user research with users at the stage before their first interaction with CCS, to understand how users will discover the existence of procurement services and what their needs will be at that point. Similarly, the team needs to undertake research at the contract award and post-award steps to understand user needs at these points
- take responsibility for how the user journey will work across the organisations or areas responsible for its different parts, even if elements currently sit outside the scope of the service; the service standard makes this clear. This means that when obstacles to users are identified (for example, 32-page user manuals, or access to training), the service team must work with other areas to resolve them
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- with evidence from user research, detail the needs of their users across different channels, including existing channels (digital service, contact centre, training) and other channels. This work should identify pain points and the points at which users are likely to switch from one channel to another
- show how users of the four interlinked services (Find a Thing, Buy a Thing, Procure a Thing and Conclave) are made aware of the existence of the other services and redirected to the most appropriate service for their needs. For example, this might involve work on the start page to redirect users to other services where appropriate, or it might mean ensuring that users have been through Find a Thing before starting their journey on this service. This work should resolve questions such as how users who land on Buy a Thing get redirected to this service, and how people who land on this service but would be better served by Buy a Thing get redirected there
- in addition to understanding how the four new services will interlink, the service team needs a clear plan for how existing services will be switched off and migrated to the new services. The plans proposed involve using the new services to surface some frameworks while legacy systems continue to manage other frameworks. It is not clear whether the vision is for this service eventually to cover all frameworks, or how users navigating between different frameworks will be able to move between different systems (with different identity management and so on)
- present a plan showing how users will be supported in private beta and beyond, and how this support will be provided. This should include details of how support is signposted in the digital service, and how requests for support are triaged on other channels. For example, consider whether the intention is to return users struggling with the digital service to the digital service, or to resume their journey using a contact centre operative to key their information for them
- consider how to get front line support staff involved in the research. This could include scenario-based research sessions with end users where support staff are on-hand (with call centre scripts ready). Other ways could include service demos and playing back user research to these teams
- show how any steps not currently provided by the digital service (for example, award and post-award) will be handled
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the look is a variation of the GOV.UK Design System and still retains the feel of a GOV.UK service. It is good to see that the team remains focused on making its variation of the design system meet the WCAG 2.1 AA standard. However, it would help this and other Crown Commercial Service assessments to see a record of changes to their design system, showing the improvements they are making
What the team needs to explore
Before their next assessment, the team needs to:
- build a coded (HTML) prototype. It was difficult to assess this service fairly from an interaction point of view because the prototype was static with some basic click-throughs. Although a basic prototype is a great first step in the design process to get feedback and iterate quickly, it has its limitations. The next step is to build a coded prototype with the expected interaction, to gain more accurate user research findings and to allow accessibility and mobile testing (see the sketch after this list)
- undertake testing with users on the name of the service
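The usual tool for this kind of coded prototype is the GOV.UK Prototype Kit (Node.js). Purely as an illustration of the step up from a static click-through, the sketch below shows a minimal interactive prototype in Python/Flask: one question page that posts back and branches. The route, field names and copy are all hypothetical, not the team's actual design.

```python
# A minimal coded-prototype sketch (hypothetical route, field names and copy).
# In practice the team would more likely use the GOV.UK Prototype Kit and
# govuk-frontend components; this only illustrates the level of interactivity
# a coded prototype adds over a static click-through.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
<h1 class="govuk-heading-l">Which framework are you buying from?</h1>
<form method="post">
  <select class="govuk-select" name="framework">
    <option value="rm1234">RM1234 (example)</option>
    <option value="other">Another framework</option>
  </select>
  <button class="govuk-button">Continue</button>
</form>
"""

@app.route("/", methods=["GET", "POST"])
def choose_framework():
    if request.method == "POST":
        # Branch the journey on the answer, as the real service would.
        if request.form.get("framework") == "other":
            return "<p>This prototype only covers one framework so far.</p>"
        return "<p>Next step: define your requirements.</p>"
    return render_template_string(PAGE)

if __name__ == "__main__":
    app.run(debug=True)
```

Even a prototype this small can be tested with a keyboard and a screen reader, which a static click-through cannot.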
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team had discussed the prototype with users of assistive technology, and tested the prototype with users with dyslexia
What the team needs to explore
Before their next assessment, the team needs to:
- test the guidance and support copy for this service with a range of users, both potentially new users and existing users
- research, understand and be able to explain the issues that could prevent their users from using the service. These could include digital literacy or technical issues. For example, the team mentioned one department whose network security controls caused problems; the use of old or non-standard browsers and enhanced security controls is not uncommon in a public sector context. This may also affect the approach taken to identity verification in future: for example, Google IDs are prohibited for official use by a number of government departments
- research with, and cater for, users with assisted digital needs. The service team currently believes it is not required to cater for users with assisted digital needs because its users fit the service standard description of "users that you don't have to support: [users who] work in the public sector and use your service as part of their work". Regardless of whether they have assisted digital needs, the users of this service are likely to have a range of needs; for example, some needs may relate to the legal requirements for privacy and security in procurement exercises. Researching these needs will avoid problems when the service goes into private beta and later phases. Without research to understand potential problems, it would be easy for support routes such as the contact centre to become swamped with queries. Beyond this, the service team laid out a clear desire for this service to be used in future by individuals who work in other sectors (for example, charities). Inclusion work is best built into a service from the beginning, not added at a later stage: if the service team intends the service to be used by a wider audience, then it should be researching with users with assisted digital needs and providing options for them from the outset. As was mentioned at the assessment, one way to gather evidence on assisted digital needs is to add a question on this to contact centre scripts and contact forms
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- a full range of roles and skills was evidenced, including product, delivery, user research, content design, service design and other roles
What the team needs to explore
Before their next assessment, the team needs to:
- resolve who the Service Owner is for future stages of work. A Service Owner should have overall responsibility for developing, operating and continually improving the service. A Service Owner in this context is not the same as an ITIL/ITSM service owner: they have a much wider set of responsibilities than maintaining agreed service levels, including taking the service forward into future phases (public beta and live)
- ensure that members of the multidisciplinary team are able to concentrate on the work of one service. A number of roles were working across multiple services
7. Use agile ways of working
Decision
The service did not meet point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service team had worked well to ensure that teams working in an agile manner were communicating well with more traditional governance routes, and that risks and issues were proactively managed
What the team needs to explore
Before their next assessment, the team needs to:
- develop an MVP that defines a minimum set of functionality for a minimum set of user personas. This should provide a clear end-to-end path that delivers value for users, whilst noting functionality that can be added subsequently. As examples:
- users from third sector or health backgrounds may have different needs and be a good candidate for later inclusion
- a single framework or framework lot (and probably one with the simplest set of requirements for additional steps) could be the best initial test at private beta, before additional frameworks are added
- functionality such as group/multiple rather than individual access could be added in as a subsequent step, leaving the individual with access to manage collaboration via email or other methods until this functionality is added
8. Iterate and improve frequently
Decision
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the thin slice clickable prototype presented had been through three iterations
What the team needs to explore
Before their next assessment, the team needs to:
- be able to evidence changes made in response to research feedback and testing. These should include changes made to designs, code and overall approaches
- be able to describe pain points and difficulties in the user journey through the end to end service, and show multiple options tested to resolve these
- build code testing into the iterative development cycle, rather than delaying it until SIT and UAT at the final stage of delivery
9. Create a secure service which protects users’ privacy
Decision
The service did not meet point 9 of the Standard.
What the team has done well
The panel was impressed that:
- alongside data protection specialists within CCS, the team is already considering the GDPR implications of the data they plan to process and store in CaT
- the team fully understands the importance of auditability in procurement and is designing the platform to ensure that a full audit trail of each procurement can be recorded
- the team is consulting with CCS Security and Infrastructure Architects to help understand the threats to the proposed design of the system. Existing best-practice templates will be used for the secure configuration of services within AWS
What the team needs to explore
Before their next assessment, the team needs to:
- test technical assumptions and designs to the point where the team is fully able to describe the architecture for private beta (including tactical choices and the implications of those choices). The technical solution design proposed for this service is complex: the service will integrate with many different components, some already in use at CCS and some still in design and development. The alpha phase is a chance to test and de-risk complex issues and environments. The team has done this to an extent by interrogating the documentation for the key eSourcing SaaS tool (Jaggaer), which has uncovered issues the team now needs to consider workarounds for. For the next assessment, the panel would like the team to explain the option selected, and to take this good work a step further by considering how they are de-risking other assumptions and integrations. For example:
- how will the agreements API, the service API, the Jaggaer system and Salesforce together maintain the integrity needed to function properly?
- what tactical solution will the service use for identity and access management (IDAM) if the Conclave service is not ready, and how will that link to other CCS systems?
- how will content from the CMS (WordPress) and content from Commercial Agreements (sourced from the Agreements API and Jaggaer) work together on the frontend?
Alpha is the perfect opportunity to test and de-risk assumptions; without doing so, serious and unforeseen risks can be encountered in later phases, after significant investment in a direction has been made. To meet the standard at an alpha assessment, the panel needs an understanding of how the underlying systems connect and what the security and privacy implications of the integrations are. For example, if the team selects the robotics route for the Jaggaer API workaround, how will credentials for the relevant authorised users be shared? And where will the logic sit for deciding which users are permitted to see and edit which procurements? This may need to live in multiple places (the service API and Jaggaer), which increases the risk of potential issues (see the sketch after this list)
- continue to work with security architects to consider and mitigate possible threats and fraud vectors as the architecture evolves
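To make the "where does the permission logic sit" question concrete, here is a minimal, hypothetical sketch of a single permission check paired with an audit record. Every name here (the rule, the fields, the IDs) is an assumption for illustration; the underlying point is that if Jaggaer enforces its own copy of the same rule, the two implementations can drift apart.

```python
# Hypothetical sketch: one place where "who may see/edit this procurement"
# could live. If Jaggaer also enforces its own copy of this rule, the two
# implementations must be kept in sync, which is the risk the panel highlights.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEvent:
    procurement_id: str
    user_id: str
    action: str          # e.g. "view", "edit"
    allowed: bool
    timestamp: str

def user_may_edit(user_id: str, procurement: dict) -> bool:
    # Assumed rule: only named members of the buying team may edit.
    return user_id in procurement.get("team_members", [])

def record_decision(user_id: str, procurement: dict, action: str) -> AuditEvent:
    # Record every access decision, supporting the full audit trail
    # the team is already designing for.
    allowed = user_may_edit(user_id, procurement)
    return AuditEvent(
        procurement_id=procurement["id"],
        user_id=user_id,
        action=action,
        allowed=allowed,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

procurement = {"id": "proc-001", "team_members": ["buyer-42"]}
print(record_decision("buyer-42", procurement, "edit"))   # allowed=True
print(record_decision("buyer-99", procurement, "edit"))   # allowed=False
```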
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team had thought through how to measure success on the proposed service in detail and proposed KPIs in addition to the mandatory KPIs
What the team needs to explore
Before their next assessment, the team needs to:
- once the technical approach is confirmed, firm up the approach to collecting data through the Jaggaer system and Google Analytics. This should include how this data will be stored, how it will be joined up, and how users' permission will be gathered, particularly for Google Analytics
- provide further detail for some steps, as the proposed KPIs only cover some specific steps in the service. For example, for KPI 7 (conversion rate): how will users be tracked across the service, and drop-outs detected, given the likely significant delays between activities? (see the sketch after this list)
- develop a plan to collect data across the entire user journey including other CCS services (for example Find a Thing, Buy a Thing)
- confirm which of the KPIs will be published externally, and where this will be
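As an illustration of the KPI 7 drop-out question, the hedged sketch below counts users who reach each step of the journey from timestamped events, with no upper limit on the time between steps. The step names and event shape are hypothetical; in practice the events would come from the service's own data store or joined Jaggaer/analytics data.

```python
# Hypothetical sketch: measuring conversion across journey steps when weeks
# or months can pass between activities. Because events are keyed by user
# rather than by session, long gaps between steps do not break the funnel.
from collections import defaultdict

STEPS = ["start", "requirements", "evaluation", "award"]  # assumed step names

events = [
    {"user": "u1", "step": "start"}, {"user": "u1", "step": "requirements"},
    {"user": "u1", "step": "evaluation"}, {"user": "u1", "step": "award"},
    {"user": "u2", "step": "start"}, {"user": "u2", "step": "requirements"},
    {"user": "u3", "step": "start"},
]

# Which users ever reached each step?
reached = defaultdict(set)
for e in events:
    reached[e["step"]].add(e["user"])

started = len(reached["start"])
for step in STEPS:
    users = len(reached[step])
    print(f"{step:>12}: {users} users ({users / started:.0%} of starters)")
```

Users who appear at "start" but never at "award" are the drop-outs to investigate.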
11. Choose the right tools and technology
Decision
The service did not meet point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team plans to use technologies that are already in use at CCS and are in line with existing CCS standards
- the team carefully considered how best to reuse existing solutions within CCS. Buying over building can be more cost effective if extensive customisation is not required (more on this point in the next section)
What the team needs to explore
Before their next assessment, the team needs to:
- test technical assumptions and designs. As detailed in section 9, the technical solution design proposed for this service is complex. This service will bring together new technology and new integrations with existing systems (Jaggaer), and how these will work together in practice is unproven. The alpha phase is an excellent opportunity to test the proposed design, ensure that assumptions are correct, and reveal unconsidered possibilities. We recommend that the team implements the flow in code (from the Service Manual: just complex enough to let you test different ideas, not production-quality code); a hedged example of such a spike follows this list. This will give a much clearer picture of how much logic will need to be supplemented or replicated in the Award a Contract for Goods and Services API and how much can be used directly from the Agreements and Jaggaer APIs. It will help the team understand how much Jaggaer customisation is needed, be more confident that the approach designed will be cost effective, and understand constraints
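As an example of a "just complex enough" spike, the sketch below calls a hypothetical Agreements API endpoint and checks whether the fields the journey needs are actually returned, surfacing any gaps the service's own API would have to fill. The base URL, endpoint path and field names are all assumptions, not the real Agreements API.

```python
# Hypothetical alpha spike: does the Agreements API return everything the
# journey needs, or will the service API have to supplement it? Throwaway
# code, just complex enough to test the assumption.
import requests

AGREEMENTS_API = "https://agreements.example.internal"  # assumed base URL
REQUIRED_FIELDS = {"number", "name", "lots", "startDate", "endDate"}  # assumed

def check_agreement(agreement_id: str) -> set:
    """Return the required fields that the API response does not provide."""
    resp = requests.get(f"{AGREEMENTS_API}/agreements/{agreement_id}", timeout=10)
    resp.raise_for_status()
    return REQUIRED_FIELDS - resp.json().keys()

if __name__ == "__main__":
    missing = check_agreement("RM1234")
    if missing:
        print(f"Service API must supplement: {sorted(missing)}")
    else:
        print("Agreements API covers the journey's needs for this agreement")
```

Running the same check against the Jaggaer API would show how much logic must be replicated in the service API versus used directly.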
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team intends to code in the open for all the custom built elements of CaT. The team will use the Crown Commercial Service GitHub organisation
What the team needs to explore
Before their next assessment, the team needs to:
- ensure any code written is open
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team is committed to following GDS API technical and data standards for any APIs built as part of this service
- the Open Contracting Data Standard will be used for all relevant data structures within the service APIs
- Expressions of Interest (EOIs) and Requests for Proposals (RFPs) created in the service will be published automatically to Contracts Finder
- the team plans to use Government as a Platform components, including GOV.UK Notify, GOV.UK PaaS and the GOV.UK Design System (a hedged sketch of OCDS and Notify usage follows this list)
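To make these commitments concrete, the sketch below pairs the skeleton of an Open Contracting Data Standard (OCDS) release with an email sent through GOV.UK Notify's official Python client (notifications-python-client). The OCID prefix, API key, template ID and personalisation fields are placeholders; the client import and method are the real Notify API.

```python
# Hypothetical sketch: an OCDS-shaped release for a new tender, plus a
# notification via GOV.UK Notify's Python client. IDs and keys are placeholders.
from notifications_python_client.notifications import NotificationsAPIClient

# Skeleton of an OCDS release announcing a tender.
release = {
    "ocid": "ocds-xxxxxx-000-00001",   # placeholder OCID prefix
    "id": "000-00001-tender",
    "date": "2020-11-03T00:00:00Z",
    "tag": ["tender"],
    "initiationType": "tender",
    "tender": {
        "id": "tender-1",
        "title": "Example requirement",
        "status": "active",
    },
}

# Notify a supplier contact that an RFP has been published.
client = NotificationsAPIClient("api-key-goes-here")
client.send_email_notification(
    email_address="supplier@example.com",
    template_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    personalisation={"tender_title": release["tender"]["title"]},
)
```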
What the team needs to explore
Before their next assessment, the team needs to:
- as part of their research on wider users, consider user needs for the data collected by the platform (for example, commercial organisations researching markets, journalists, other government departments). Although procurements by their very nature can be commercially sensitive, there is a great deal of information that could be made openly available. It's fantastic that the service will automatically publish information to Contracts Finder, and it will be wonderful to hear at the next assessment how this fits into the bigger picture for the continual improvement of transparency in public sector procurement
14. Operate a reliable service
Decision
The service did not meet point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the design of the custom parts of CaT uses services which can be scaled based on demand across multiple availability zones
- the team has considered how users might still be able to continue with procurements if there is a service outage, and how the team will communicate with them. This will mostly be driven by existing Service Desk processes within CCS
What the team needs to explore
Before their next assessment, the team needs to:
- devise a detailed plan for how to manage private beta, and particularly how to gradually scale up usage throughout the private beta period so as to avoid a "big bang" release (a minimal gating sketch follows this list). This plan should include details of how users will be recruited (for example, it might be possible to recruit a limited number of early users through your mailing list or contact centre contacts)
- develop a plan for how the release of the new service will affect old or existing services, including what the alternative service provision will be if there are any issues with the new digital service. Given the potential length of time that some procurement activity can take, the team should also consider what commitment they will make to users about the availability of the service: both in terms of opening hours and support, and in terms of what will happen to a procurement if the private beta is removed or changed. For example, the team could declare up front that people should not use the private beta unless they expect their procurement to complete within six months, or commit to ensuring that procurements entered into the system complete even if the service is amended or taken offline
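One simple way to avoid a "big bang" is to gate entry to the private beta behind an invited cohort with a hard cap on concurrent procurements, raising the cap as confidence grows. The sketch below is a hypothetical illustration of that gating logic; the cohort, cap and identifiers are all assumptions.

```python
# Hypothetical sketch: gradually scaling up private beta usage by growing
# an invited cohort, rather than opening the service to everyone at once.
INVITED = {"buyer-001", "buyer-002", "buyer-003"}  # recruited via mailing list etc.
ACTIVE_PROCUREMENT_CAP = 2  # raise as confidence in the service grows

active_procurements = {"buyer-001": 1}  # procurements currently in flight

def may_start_procurement(user_id: str) -> bool:
    """Allow a new procurement only for invited users, under the current cap."""
    if user_id not in INVITED:
        return False  # signpost to existing channels instead
    return sum(active_procurements.values()) < ACTIVE_PROCUREMENT_CAP

print(may_start_procurement("buyer-002"))  # True: invited, under cap
print(may_start_procurement("buyer-999"))  # False: not yet invited
```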