Submit data on social housing sales and lettings alpha reassessment report
The report for MHCLG's Submit data on social housing sales and lettings alpha reassessment on 3 September 2021
Service Standard reassessment report
Submit data on social housing sales and lettings
From: Central Digital & Data Office (CDDO)
Assessment date: 03/09/2021
Stage: Alpha
Result: Met
Service provider: MHCLG
Service description
This is a digital service for Local Authorities and Housing Associations to submit data about new social housing sales and lettings to the Ministry of Housing, Communities and Local Government (MHCLG).
The new service aims to make meeting the statutory obligation to submit data easier and less manual for data-providing organisations, saving time for providers and analysts and improving the quality of data available to MHCLG policy makers.
There are likely to be three main methods to submit data:
- Webform to submit an individual log
- Data upload to submit multiple logs at once
- API submission
Service users
The primary users of the digital service will be data providers. Groups within this include:
- front-line housing officers in Local Authorities and Private Housing Associations. These users collate data from multiple sources and “create logs” of new social housing sales and lettings via webforms
- middle managers of the above housing officer teams. These users are responsible for quality and timeliness of their organisation’s submission
- business analysts within Local Authorities and Private Housing Associations. These users will upload data in bulk (multiple logs created at once) if the organisation chooses that as a method of submission
Data consumers include the following groups:
- analysts and policy people within MHCLG who analyse the data to produce stats releases, insights and policy
- Local Authorities and Housing Associations, for example Housing Performance Managers (via data visualisations within the service)
- external organisations, for example NGOs and academia
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- during the reassessment, the user researcher demonstrated in detail how the team had addressed the concerns the panel raised at the initial assessment about the limited understanding of their primary users’ needs
- the team conducted additional usability testing exploring the end-to-end journeys for a chosen user type, taking the panel through unhappy paths across different devices
- using their prototype, the researcher and designer demonstrated how the service would work for a housing officer, and highlighted how the key pain points with content and design were addressed in the various scenarios presented, based on several rounds of usability testing and research findings gathered during the discovery phase
- the team worked with an accessibility SME (expert) to carry out research with participants with access needs, and presented the panel with a broad range of persona profiles of users with assisted digital needs
- the team spent considerable time addressing earlier accessibility and inclusion concerns about the service, and presented sufficient evidence that their iterations now meet the needs of users with medium to low digital confidence
- overall, the team has carried out a significant amount of research and has made a lot of progress since the initial alpha assessment
What the team needs to explore
Before their next assessment, the team needs to:
- continue to progress this work, as they will need to start deepening their understanding of other user groups and meeting their needs, particularly analysts and policy people within MHCLG who analyse the data to produce stats releases, insights and policy
The team should also:
- be mindful of how they use unmoderated usability testing in beta. One of the benefits of private beta is having controlled access to users. Ideally the team should use unmoderated testing only when they are highly confident in their journeys
- aim to observe live data entry during private beta, particularly as the users do not input personally identifiable information (PII)
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has plans for how to move users to the new tool - including changing templates and guidance at the start of private beta and comms plans for suppliers. The team is also working with the Regulator of Social Housing to make organisations aware of the service
- the team has identified the user need to gather data onsite, in places that may have no internet access. Based on this, they have redesigned and tested the paper form most used by data providers, added an option in the new service to let users print out a partly complete online form, and will iterate the service in public beta so it can be used offline on mobile and tablet
- the team has a content designer and is reviewing all parts of the service for content improvements, including letters, guidance and the service itself. The team intends to do further work on when to put help text in or out of the service
- the team has identified the common help desk issues (the system not accepting correct rent costs, and postcodes for new builds) and will fix these in the new service
- the team has identified when Core ISS can be retired - this can be done when bulk and single uploads are moved to the new tool
What the team needs to explore
Before their next assessment, the team needs to:
- design, test and iterate the entire journey, including signing in (and any related problems). The team knows that there are needs around organisations with multiple data providers and has plans to work on these journeys in beta. The team must be confident that these journeys work before moving to public beta
- complete their planned work on communications and improving other offline materials - working with MHCLG comms will make sure that anything the team proposes meets department guidelines (for example, the layout of print documents and any printing constraints should MHCLG be required to do print runs)
- do their planned work on assisted digital support - reworking the help desk categories, engaging with the help desk team on identifying assisted digital users, and testing the end-to-end model
- test their models for letting people collect data offline. This is a specific type of assisted digital. In particular, the team will need to make sure that any branching questions make sense in a printed document, be it available before signing in or as a printout of the in-progress ‘draft’
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has iterated on the service based on research with a range of users. This has included creating groupings for the task list based on users’ mental models
- the service has visible signposting to the help desk throughout the journey
- the team has made content improvements to the service and has a full-time content designer
What the team needs to explore
Before their next assessment, the team needs to:
- complete any changes noted from their design and content audits. There are still parts of the service that don’t meet GOV.UK design system patterns (for example, in Local Authority, when the user selects a new option, a new section appears at the bottom of the screen) or GOV.UK style (for example, actions that are in title case)
- complete their planned work to simplify all language and add supporting text where it is required (the team is considering testing alternatives of in-service hint text and out-of-service guidance)
- complete their planned fixes to make the service usable on mobile, for example, redesigning pages with many radio buttons
- continue their work on the service name to balance new users and those already familiar with the CORE ‘brand’. Some services have balanced this tension in guidance by putting the ‘brand name’ after the more descriptive title, for example ‘Get a vehicle log book (V5C)’
- continue to iterate on the response to the need for users to skip questions. The current solution (users can continue without responding) is not obvious, especially since the first part of the report does not let users skip. This functionality may also become trickier when the team adds more question-based routing (as mentioned in point 3)
- understand the constraints for GOV.UK templates when suggesting out-of-service changes. The team has identified useful information needed before signing in, for example, the paper form for offline use. However, start pages cannot be customised. The team may therefore be designing and testing options that are impossible to release. MHCLG’s GOV.UK content designers and the cross-government content community can assist in understanding the constraints
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the service has been usability tested with users with access needs (4 users over 4 rounds of testing)
- the team has done interviews, surveys and usability tests with a range of user types, both internal and external
- the team has tested the prototype with various tablets and mobile devices, and has identified changes that they’ll need to make in beta, for example, making the table of log items responsive and not having a large list of radio buttons on a single screen
- the team has iterated on patterns to make them more accessible, for example using an experimental new component of the task list pattern that lets users jump to the first incomplete task
- the service is booked to have an external accessibility audit in beta
- the team has done internal accessibility empathy training, tried the prototype out using accessibility personas and is considering these lenses in their user research
What the team needs to explore
Before their next assessment, the team needs to:
- continue to test all features with users with differing access needs. The team has an ambitious goal to deliver many features in private beta and will need to allow enough time and capability to keep them accessible. This will be particularly important for features not shown in this reassessment, such as bulk upload
- complete the planned external accessibility audit and make changes as required
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has done several rounds of usability testing and a card sort
- the service has been iterated based on feedback - examples from this work are mentioned in point 4
What the team needs to explore
Before their next assessment, the team needs to:
- continue work to improve help desk categories as mentioned in point 3 - this will help track problems in public beta
- be clear about their minimum viable product (MVP) and the work still needed to achieve it. A service can only progress to beta when the team has understood the riskiest assumptions of their MVP. The panel was shown a demo of the single upload journey and is comfortable with this part of the service continuing to beta. However, the team also mentioned including yet-to-be-designed features and functionality, such as the sign-up and bulk upload journeys, in the private beta. Normally a feature like bulk upload is designed and developed in public beta as a value-add. Alternatively, some teams have addressed multiple risky assumptions by running consecutive alphas (one example of this is the Animal Science Licensing service). Either way, as mentioned in the point 3 recommendations, the team must test the entire user journey with users before private beta
Next steps
Pass - Alpha
This service can now move into a private beta phase, subject to implementing the recommendations outlined in this report and getting approval from the GDS spend control team. The service must pass its beta assessment before launching its public beta.
The panel recommends this service sits a beta assessment in around 3 to 6 months' time. Speak to your Digital Engagement Manager to arrange it as soon as possible.
To get the service ready to launch on GOV.UK the team needs to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content