Government Property Community

The report for the Government Property Community alpha assessment on 18 March 2021

Service Standard assessment report

Government Property Community

From: Government Digital Service
Assessment date: 18/03/2021
Stage: Alpha
Result: Not Met
Service provider: Office of Government Property (Cabinet Office)

Service description

Government Property Community is a virtual tool which brings together and communicates the full suite of career development, learning, training, and skills products which are available to the property profession. It is an online community for property professionals to learn from and interact with each other.

Service users

This service is for:

  1. Public Sector Property Professional
  2. Public Sector Property Line Manager
  3. Government Super User

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had sought to learn from data generated in both Discovery and Alpha and endeavoured to do research with the various identified user groups
  • different members of the team were involved in research during Alpha and were able to contribute to discussion in the assessment
  • the team had looked to build on research undertaken during Discovery in order to test assumptions and gather more evidence to make informed decisions

What the team needs to explore

Before their next assessment, the team needs to:

  • clearly articulate user needs (with evidence). Despite the extensive user research undertaken during Discovery, the team does not appear to possess clearly articulated user needs. This stems from the lack of clearly defined user needs generated from Discovery and from the actual problem that the service aims to solve (as noted above). Greater explanation and evidence of how needs were generated from data, and how these needs have been revisited and addressed in Alpha, is needed. The team showcased some user needs but appeared to refer to the solution rather than the problem they were trying to solve. For example, “I need to view all guidance and news related to government property matters, and access the skills tool”. The service that the team proposes to build is incorporated as part of the overarching need for this user group, which is problematic. Going forward, the panel would suggest referring to the GOV.UK guidance on formulating user needs. Being able to clearly demonstrate both the problem and the need would be beneficial for the next assessment and for the work to be undertaken. The guidance can be drawn on to create needs that will stand the test of time
  • embed a user researcher in the team. It is the panel’s understanding that research during Alpha was not undertaken by a user researcher. Notwithstanding the diligent efforts of the team to draw on user feedback, this is particularly problematic because a research specialist would have helped to better guide the research process during Alpha. For example, the use of focus groups to demo the service and gather feedback is not the most effective means of gathering insight. As an additional or alternative approach, usability testing and focused one-to-one discussions, in which individual users run through the designs themselves, would yield more valuable input. Examples from research artefacts provided by the team that appear to show the use of leading questions/statements include:

    • in the user research script for the focus groups, in the “home page” section, the text reads “user friendly, images used for target pages”
    • from the social media page: “This page is where you can easily find the Twitter pages for GPF, Cabinet Office and the LinkedIn page for the GPF”

  Both of these introductions include presumptions regarding usability and could lead research participants to a particular evaluation. These examples are not intended to take away from the work done by the team during Alpha, but rather to emphasise the need for a research specialist, who would be able to draw on their knowledge and expertise to test assumptions and gather valuable findings

  • spend more time on research during Alpha. It is the panel’s understanding that user research was conducted for no more than 2 weeks during Alpha. The panel feels this is not sufficient to derive a clear, evidence-based picture of how the users’ needs are being met (or not)

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is clearly very knowledgeable and passionate about improving the experience for the Government Property Profession (GPP) users. This work fits as part of a wider Capability Programme, which aims to address multiple challenges in the community and deliver a compelling, end-to-end career experience and talent offering for anyone working in Government Property
  • there has been very strong engagement with the user community across multiple departments and organisations, and the work has been focused on addressing problems on a cross-government level
  • there has been consideration of how other communities of practice and professions in government have approached similar problems, and this has clearly informed the team’s design and thinking. In addition, the team has engaged with Cabinet Office HR and other parts of the organisation to help align this work with broader government priorities

What the team needs to explore

Before their next assessment, the team needs to:

  • validate the whole problem and solution through further user research. As mentioned in Point 1, further research is required to better understand the problem explored in Discovery and to ensure that the proposed solutions actually meet the user needs. The panel felt that the Alpha started with a pre-conceived solution for the collaboration portal and the skills mapping tool, and there was little consideration or testing to validate whether other solutions could potentially better address the problem and needs (for example, community pages on Confluence, Slack or another collaboration tool, or a more complex end-to-end bespoke solution). In addition, the link between the problems and needs and the proposed features and solutions was not always clearly demonstrated. For example, the service team highlighted the requirement for functionality that will mimic a social network (such as likes, tagging, and integration with Twitter and LinkedIn). However, it wasn’t clear what user needs were actually driving this requirement, or how this solution would address the original problem statement: “finding information is time-consuming, and it is difficult for career professionals to navigate to the best option for them”. Before their next assessment, the team will need to validate and demonstrate through extensive user research how the two proposed solutions can comprehensively solve the problem for end users
  • consider the end-to-end journey for users and design with that in mind. The panel recognises that this work fits within a more complex programme of initiatives that spans digital, engagement and comms. It is therefore not possible for this service to cover the whole journey of GPPs accessing the information, network and talent development proposition of the Government Property Function. However, it would be beneficial to fully map out the end-to-end service journey for the potential priority users/use cases that this service wants to address. This will help to better define the value of the service, clarify the scope and identify areas for further research and development. In addition, it will help to ensure that development effort in Beta is spent on the areas which provide the greatest value for these priority use cases and users
  • prioritise and test their riskiest assumptions. During Alpha, the service team should clearly state and systematically test their riskiest assumptions to ensure that the new products actually meet user needs and are an improvement on the existing information site. For example, it was mentioned that a key driver for this work is that the low level of repeat visits to the existing GPP portal is due to its limited and poor functionality. However, it wasn’t clearly demonstrated how well the new services will address this problem (or what impact the problem has on users’ outcomes in terms of their career progression in the GPP). The assumption that the new product, with improved usability, will be adopted and used more should be validated in Alpha before actual solutions get built in Beta

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has engaged with professionals across many departments and agencies to understand the limitations and requirements they might have across multiple channels/organisations
  • the team is working closely with the Government Property Function comms team. They have considered how the non-digital channels and comms and engagement for these services will be handled to promote the service and keep the information on the platform up-to-date
  • the team has developed the skills tool as an additional channel to support the development and talent management of the GPP. There was consideration of how the tool will be used by employees for the first time, as well as throughout the year to support their personal development and training. The team has also thought about the value of the tool and the journey for line managers, although more testing and design for this should be completed throughout Alpha and Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • consider the end-to-end journey for priority users. As highlighted in Point 2, it wasn’t clear what the end-to-end user journey and experience should look like. Additionally, some segmentation of priority users/use cases would be beneficial to better understand and prioritise the necessary features and designs. The end-to-end service journey for priority users/use cases should be mapped and iterated throughout Alpha. It should include what the online and offline journeys will look like, and how the collaboration platform and skills mapping tool will integrate with other GPP tools and touchpoints. This journey should then be tested and iterated further throughout Alpha and Beta
  • test with users the handoff and connection between the collaboration platform and the skills mapping tool. The collaboration platform and skills map have a very different look and feel (the collaboration platform is a Salesforce native web application with little input from the GOV.UK Design System or government branding, whilst the skills map tool is more closely aligned and branded with GOV.UK). The panel appreciates the limitations that the software places on the design flexibility of the platform. However, there wasn’t sufficient evidence to validate whether this difference would be well understood by users, or how they will navigate to the skills map. This needs to be tested further to ensure the designs and end-to-end service work seamlessly

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understood what the main problems were with the old (e-PIMS) system and the improvements that needed to be made. The team engaged with the users in Discovery and Alpha to understand the challenges
  • the design system style guide was considered in the skills assessment tool

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct usability testing with users to understand what the issues are with the service, what works and what needs to be improved. A demo approach may lead to bias in the feedback as it prevents users from directly interacting with the system. The team is highly encouraged to use a 1:1 research approach rather than a focus group
  • as mentioned in Point 3, map and validate the most common user journeys and scenarios in which the service would be used. It is unclear how users, especially in the case of “Brian”, find out about the service, when they would use it and which features they would need to access at which points of need. The team is highly encouraged to identify the most important touchpoints from a user perspective
  • as mentioned in Point 2, use the Alpha stage to test the riskiest assumptions and to prototype and iterate a range of ideas. Although the Discovery evidenced the team’s understanding of user needs, there is no further evidence to show that the team co-designed, prototyped and tested more than one solution with users. The rationale behind why the team built a portal and a skills assessment tool was unclear. The team is highly encouraged to run an ideation session with user groups and to validate whether the proposed service is the ideal solution
  • make the service straightforward, as the service manual states: users should have to do as few things as possible and should be able to do what they need as seamlessly as possible. The two components (the community portal and skills assessment tool) seem disconnected, and the link between the two user journeys is unclear. This complicates the service, as users need to go through more than one channel to complete a task. The team is highly encouraged to look at the service manual guide on making services simple, to identify which areas could be improved and to validate them with users in Alpha
  • as mentioned in Point 3, there is a lack of consistency between both components in terms of look and feel. This can cause confusion for the users as it creates the impression that they are two independent services. There are also concerns about the overload of text and the number of bordered boxes on the pages in the skills assessment tool. These points can be explored further and validated through usability testing in Alpha and Beta
  • test all parts of the service that users interact with, as Point 4 of the Standard says: “test all the parts of the service that the user interacts with, online parts and offline parts”. There was no evidence that the team considered or mapped the offline components (for example the line manager’s offline experience with the tool), particularly how these impact or fit into the service, as mentioned in Point 3 above
  • explore the possibility of renaming the service, following the service manual guidance, to describe the task users are trying to achieve. This process will help the team better scope the service and demonstrate its value for users

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated an awareness of the government accessibility standards. The team were aware of the need to design an inclusive service and inquired about conducting research with users with disabilities
  • the skills assessment tool is being converted from an interactive PDF (which is not accessible) into a digital alternative that could be made accessible

What the team needs to explore

Before their next assessment, the team needs to:

  • have a plan for how they are going to tackle accessibility in Beta; teams should have this by the end of Alpha, and it was not evidenced here. The team is highly encouraged to review their Alpha designs and prototypes. The team took into account the need to ensure the service catered to users of different digital abilities. However, this appeared to be based on the assumption that property professional users (as government employees) have a particular skill set and ability with regard to digital use. This assumption needs to be rigorously tested with users and evidenced in the assessment, to ensure that accessibility criteria have been thought about in the research. New services and platforms can be intimidating for users, and the team’s thinking around supporting users with low digital confidence was unclear, as was the thinking around an assisted digital model. Furthermore, as the team is working with suppliers that provide out-of-the-box solutions, the thinking around accessibility considerations, such as whether the platform works with screen magnifiers, screen readers and speech recognition tools, was not clearly indicated in the assessment. This should be explored further with the suppliers
  • include people with disabilities in the research; at Alpha there needs to be evidence of this. The team demonstrated that they struggled to include them as part of the user groups. The panel recommends considering resources such as the Civil Service Disability Network or a departmental disability champion. Recruitment agencies are also able to source users for more inclusive research. The team needs to show an understanding of the pain points of particular groups when using the service

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is embedded and leading this work in the GP Function, so they have a lot of autonomy and information to make effective decisions
  • the Senior Responsible Officer (SRO) and Product Owner (PO) are empowered and very experienced in the area. The SRO sits on the programme board which is ultimately responsible for the project direction and has full buy-in from them. The SRO is in control of the roadmap and project timelines. The PO is responsible for the prioritisation of each story and works closely with everyone on the team
  • the permanent civil servants and the contracted supplier (Arcus) work well together and everyone shares a common vision for the outcomes that the project wants to achieve for users and the GPP
  • everyone has participated in user research and co-designing of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • find an experienced user researcher who can support the work and help the team to test their solutions in a non-biased way based on best practices in the research community. This has been highlighted in Point 1 and is essential for a sustainable multidisciplinary team at Alpha
  • plan the team for Alpha and Beta and ensure there is continuity in case of any change of commercial arrangements. It is understood that the contract with Arcus currently runs only until the end of March and, commercially, there is still uncertainty about what the replacement will be. This poses a significant risk to how the project is developed further, as Arcus holds all the design and technical expertise. It is recommended that all the technical and design work and decisions are documented in detail, and that there is an opportunity for formal knowledge transfer sessions in case of a change of suppliers
  • assign a technical owner of the solution from the Cabinet Office. Even if a supplier is responsible for building out the services and a lot of the functionality can be supported by Salesforce out of the box, a responsible technical lead who understands the tech stack and both solutions should be assigned and should work closely with the team from Alpha onwards. This will help to ensure that the product can be appropriately supported going forward
  • collaborate more and get support from the Cabinet Office Digital design communities to share best practices about the service design and content of their service

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses agile practices, tools and techniques. They work in weekly sprints, have daily stand-ups, regular planning meetings and retrospectives, and Show & Tells where the whole team and other stakeholders attend. They use Confluence for collaboration and Jira for their backlog story management

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how to get real value from working in an agile, iterative way, so that testing and iteration are embedded in the ways of working and there is more time and space to do so. This point is covered in more detail in the next section
  • proactively share best practices of working in agile with senior stakeholders who might not be so well versed in the benefits of this way of working. This could help to get buy-in to get more time for user research and testing, if there’s otherwise pressure on the timelines for delivery

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • throughout both Discovery and Alpha, the team has attempted to gather qualitative and quantitative user insights and to implement some changes based on them

What the team needs to explore

Before their next assessment, the team needs to:

  • properly structure the Alpha and Beta, and allow more time for testing and iterations. The panel felt that the service is being rushed through the delivery lifecycle without time for proper iterations or for substantial revisions and improvements based on data and user research. The Alpha stage was very short (4 weeks), and this has resulted in a limited amount of work and evidence to ensure the service really works for users and meets the GPP’s original goals. Overall, the project roadmap seems very tight. While the assessment panel appreciates that this is not a very complex service, it is recommended that more time is provided for testing, integrating and rolling out the service than is currently provisioned
  • show a clear map of iterations in response to user findings. It was unclear what changes were made as part of the design and research process, and why they were made. Transparent mapping of this would show the panel how users were being listened to and how the team was responding to feedback

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understand the nature and volume of the data, the fact that some of it counts as personally identifiable information and that none of it is regarded as sensitive
  • the service is reasonably well protected from compromise by third parties. Additional confirmation of email address is required on registration and at any time when a new device tries to connect to the service. Given that this service is not a highly attractive target for attack, this level of protection is adequate

What the team needs to explore

Before their next assessment, the team needs to:

  • continue discussions with the Data Protection Officer to confirm whether a DPIA is required
  • consider implementing password policies in line with National Cyber Security Centre guidance (see the sketch after this list)
  • consider extra protection for administration accounts such as multi-factor authentication
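
To make the NCSC recommendation concrete, below is a minimal, hedged sketch of the kind of password checks that guidance describes: favouring length over composition rules and screening candidate passwords against a deny-list of known common passwords, rather than forcing regular expiry. This is illustrative only, not the team’s implementation; the deny-list file name and the minimum length are assumptions.

```python
# Illustrative sketch only: password checks broadly in line with NCSC
# guidance (length over complexity, deny-list screening). The file name
# "common-passwords.txt" and the MIN_LENGTH value are hypothetical.

MIN_LENGTH = 8  # NCSC suggests length requirements rather than complexity rules


def load_denylist(path: str = "common-passwords.txt") -> set[str]:
    """Load a deny-list of known common or breached passwords, one per line."""
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}


def is_acceptable(password: str, denylist: set[str]) -> bool:
    """Accept a password if it meets the minimum length and is not on the deny-list."""
    return len(password) >= MIN_LENGTH and password.lower() not in denylist
```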

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered the KPIs that they should start measuring in Beta. It was also good to see that they’ve considered how to measure against the outcomes they want to achieve in Live
  • analytics and reports from Salesforce can be retrieved easily and provide a reliable source of information for the digital take up of the new platform

What the team needs to explore

Before their next assessment, the team needs to:

  • consider and validate further the performance metrics they select, based on the prioritised user needs and journeys that they establish. For example, if there is a strong user need for a more social-network style of collaboration, then a metric to reflect this might be useful
  • in Beta, establish some benchmark or target for the use of the new skills mapping tool and start measuring against that to assess whether the tool is successful and delivers value (a sketch of one such measure follows this list)
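
As an illustration of the kind of benchmark measurement the panel has in mind, the sketch below computes a repeat-visit rate from a hypothetical export of login events. The file name and column names (“portal_logins.csv”, “user_id”, “timestamp”) are assumptions, not the team’s actual data model.

```python
# Illustrative sketch only: computing a repeat-visit rate from a
# hypothetical CSV export of login events. Column and file names are assumed.
import pandas as pd

events = pd.read_csv("portal_logins.csv", parse_dates=["timestamp"])

# Number of distinct visit days per user, so several logins in one day count once
visit_days = (
    events.assign(day=events["timestamp"].dt.date)
    .groupby("user_id")["day"]
    .nunique()
)

# Share of users who came back on more than one day
repeat_visit_rate = (visit_days > 1).mean()
print(f"Repeat-visit rate: {repeat_visit_rate:.0%} of {len(visit_days)} users")
```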

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the solution is built on commodity public cloud facilities
  • the delivery partner and technology were chosen by competitive procurement. The use of Salesforce seemed somewhat arbitrary, given that Cabinet Office already use other cloud platforms and collaboration tools, but the choice of platform does seem to have been fair
  • the team recognises the requirement for participating departments or individuals to pay for Salesforce licences to use the portal and has already cleared this with some of the main participating departments
  • the plan is to use a single set of authentication credentials for the collaboration portal, the skills mapping tool and a property database to replace e-PIMS

What the team needs to explore

Before their next assessment, the team needs to:

  • understand more fully the reasons for product choices. There seems to be some evidence of user preference for having the collaboration portal and the skills mapping tool under one site but, given that the look and feel are different for each, the products may not necessarily need to be based on the same technology platform. There are other ways to provide seamless authentication that can work between multiple platforms
  • understand the real requirement for limiting login credentials and the options for doing this. Using the same single authentication to access the collaboration portal, the skills mapping tool and the property database may well be desirable, but the user research evidence to support this appears limited. An option of Single Sign-On provided by integration with corporate IT systems might be preferable to users in some departments (a hedged sketch of one such approach follows this list). The team noted that the number of organisations might be large (currently over 30, and it could increase). That might influence the choice of SSO options, but the effort involved and the potential convenience should be understood and explained
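
To illustrate what departmental single sign-on could look like in practice, here is a minimal sketch of an OpenID Connect login flow using Flask and Authlib. This is not the team’s design: the identity provider URL, client credentials and route names are all hypothetical placeholders, and a real deployment would need per-department configuration.

```python
# Illustrative sketch only: OpenID Connect single sign-on with Flask and
# Authlib. The issuer URL, client ID/secret and routes are hypothetical.
from flask import Flask, session, url_for
from authlib.integrations.flask_client import OAuth

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"

oauth = OAuth(app)
oauth.register(
    name="department_idp",
    server_metadata_url="https://idp.example.gov.uk/.well-known/openid-configuration",
    client_id="example-client-id",
    client_secret="example-client-secret",
    client_kwargs={"scope": "openid email profile"},
)


@app.route("/login")
def login():
    # Send the user to the departmental identity provider to authenticate
    redirect_uri = url_for("callback", _external=True)
    return oauth.department_idp.authorize_redirect(redirect_uri)


@app.route("/callback")
def callback():
    # Exchange the authorisation code for tokens; Authlib validates the ID token
    token = oauth.department_idp.authorize_access_token()
    session["user"] = token["userinfo"]
    return f"Signed in as {session['user'].get('email')}"
```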

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is willing and able to make any new code publicly accessible
  • the team recognised the limited amount of new source code being produced to meet the service requirement. What is produced may be of limited value to others, and would benefit little from coding in the open, but meeting the service standard will require it to be made available. For an Alpha assessment and with little code value at stake, this is acceptable

What the team needs to explore

Before their next assessment, the team needs to:

  • produce or propose a plan to review and publish new source code produced as part of the service

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team recognised the limited scope for reuse or contribution of solutions
  • the team looked at solutions in use by other professions in government

What the team needs to explore

Before their next assessment, the team needs to:

  • keep in touch with other government professions to share knowledge and experience of providing similar solutions

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered the reliability of the service and has made an informed decision about why they don’t need to consider service availability beyond what Salesforce support and availability will provide. Given that this is not a time-critical or essential service, this decision seems proportionate and appropriate

What the team needs to explore

Before their next assessment, the team needs to:

  • consider and develop a clear support model for Beta and beyond, which covers in particular who will be responsible for user complaints and queries
  • conduct usability testing to gather data on areas for improvement in service operability
