Legal advice for civil servants alpha assessment
The report from the alpha assessment for GLD's legal advice for civil servants service on 22 May 2017.
From: | Central Digital and Data Office |
Assessment date: | 22 May 2017 |
Assessment stage: | Alpha |
Assessment result: | Met |
Service provider: | Government Legal Department |
The service met the Standard because:
- The team had taken this large area of work and successfully defined a manageable minimum viable product with content focussed on Advice Lawyers and Civil Servants.
- Feedback is informing the team; for example, changes to the home page and login flow were made as a direct result of user research.
About the service
The service aims to reduce duplication of work by GLD lawyers and provide self-service access to some legal advice for public sector staff.
Detail of the assessment
The team were able to clearly explain the problem space - lawyers in government are asked the same question multiple times, and the time spent answering similar questions would be better spent on more complex areas and on responding to civil servants more quickly. They were also able to give a confident estimate of the scope of the service, with 3,000 or more advisory cases open at any one time.
The team had taken this large area of work and successfully defined a manageable minimum viable product with content focussed on Advice Lawyers and Civil Servants. Starting with this limited scope means the team can focus on meeting their needs before extending the service to cover other areas.
Researching and understanding user needs [points 1, 2, 10]
The team has done an impressive amount of user research, and, as in the previous assessment, identified a whole range of users and stakeholders of the system.
By narrowing down to the two priority users, and a priority content area (changing the law), the team have been able to focus their questions and research time to make good progress in identifying specific profiles and needs.
User needs identified for each persona/user profile are specific and relate to the whole service, and they are based on research that the team have carried out.
Accessibility user research sessions have been carried out with 10 people, but as yet no improvements have been made as a result. The team have attempted to find users with low digital skills and confidence, but have not had success in identifying anyone to take part in user research. We encourage the team to keep trying to find these people, and to make efforts to include them in research.
Needs have been clearly identified; however, some solutions have been reached too quickly - for example, content creators and lawyers are used to working with PDF files. The team must work to understand how they are going to change and improve this.
Feedback is informing the team; for example, changes to the home page and login flow were made as a direct result of user research. The process by which user research informs the team is well defined, with formal show and tells, prioritisation exercises and informal chats and meetings.
Some design patterns have been jumped to without alternatives being considered or the user need being thoroughly thought through. For example, the need for non-legal staff to understand legal phrases has been met with a glossary. However, the success of the service will depend on policy professionals being able to understand what lawyers are saying, and this could also be met by enforcing a more stringent plain English policy on published documents.
While lots of user research has been done with policy professionals, lawyers and legal experts in departments, so far there has been little research into the process of content production and curation - that is, making it easy for people to publish good web content. The team should make sure this research is done in the private beta stage, as it will be a critical part of a successful service.
Running the service team [point 3]
The panel were impressed with the team’s attitude and approach to agile as an organisation new to digital. They should continue to encourage all stakeholders to attend show and tells and other agile ceremonies to ensure their buy-in and confidence in the team’s delivery.
One area of concern was knowledge transfer. The team was reliant on an interim developer and fast stream placements to fill key roles; however, they were working to limit this risk by improving their documentation and planning for any change in their team. This technical risk would be further reduced by a technical lead overseeing decisions at a department level.
One key role the team were waiting to fill was that of content lead. As this is such a content-heavy product, it is essential to have a skilled content designer leading this work - ensuring compliance, keeping content up to date and making sure the content cycle is in place and working well.
At the moment, content is prioritised by what seems useful at the time, although there is no data to support this decision-making. There are multiple teams involved in creating, assuring and publishing content. It’s great to see the team working so closely with its stakeholders, but there is a big risk that there is no clear ownership of the content cycle across the service - ensuring it works and is consistent.
In beta, the panel recommends that a skilled content lead sets and takes ownership of this area of work, testing out various approaches until they are confident it works. Key areas are:
- Testing the content cycle, including document creation, the publisher process, and understanding where content gaps are and how to deal with them
- Keeping content up to date across the service
- Overall quality and consistency of content
- Discoverability and grouping of the content
The team should also get the support of a Performance Analyst to lead their measurement work and ensure they’re collecting the right metrics to improve.
In addition, the panel were concerned that the service would go into ‘business as usual’ delivery at some point in the future. It’s important that there is a team in place that can keep improving the service after it goes live, because internal IT teams are often not equipped to iterate a live service and there’s a risk that it will deteriorate after go-live.
Designing and testing the service [points 4, 5, 11, 12, 13, 18]
During alpha the team tested and iterated the production service, and as they go into beta they plan to test assumptions using wireframes and prototyping software. We would recommend testing high fidelity prototypes built with existing design patterns with users, to avoid rediscovering interaction issues that have already been resolved by other teams. Resources for prototyping can be found in the design section of the Service Manual.
The team had produced an alpha service that is generally consistent with GOV.UK design patterns, although we would encourage them to share any cases where the patterns do not work for their specific users’ needs with the cross-government design community. When defining new patterns, the team must be able to demonstrate a clear user need backed up with thorough user research.
We would suggest the team explore other ways of defining legal terms, which is currently handled with a tooltip. This will likely be solved when they have a content designer working to set best practice for writing legal content for the web.
We would also recommend exploring other ways of informing users of future content. Blogging, for example, could help meet this need and also help recruit participants for user research.
Technology, security and resilience [points 6, 7, 8, 9]
The panel were pleased to see an approach being adopted to ensure knowledge sharing and refactoring of the code base as they move towards private beta. The team have recognised the problems inherent in having had a single developer working on their project and are actively taking steps to address this.
Technology choices were made for sensible reasons and are based on open source technologies.
Currently the code is hosted on a private GitLab instance, from which a process to regularly publish snapshots is to be enabled. The panel would suggest that this is a difficult and perhaps unnecessary process, and would recommend that the team look to run on a public source code management tool, using private projects for secrets but primarily coding in the open.
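To illustrate one common way of keeping secrets out of an openly published codebase, the sketch below shows configuration being read from environment variables at deploy time rather than being committed to the repository. This is a minimal, hypothetical Python example; the setting names and structure are illustrative assumptions rather than a description of the team’s code.

```python
import os

# Illustrative sketch only: secrets are supplied through the environment (or a
# private, access-controlled store) so the public repository never contains them.

class Settings:
    """Application settings, with sensitive values provided at deploy time."""

    def __init__(self) -> None:
        # Non-sensitive defaults can safely live in the open codebase.
        self.service_name = "legal-advice"
        # Sensitive values come from the environment; fail fast if missing.
        self.database_url = self._require("DATABASE_URL")
        self.email_api_key = self._require("EMAIL_API_KEY")

    @staticmethod
    def _require(name: str) -> str:
        value = os.environ.get(name)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {name}")
        return value


if __name__ == "__main__":
    settings = Settings()
    print(f"Loaded settings for {settings.service_name}")
```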
The panel also noted the use of proprietary document formats in their workflow, and would recommend looking to use open standard alternatives.
The magic link style login process is interesting, and the panel would like to see research into its take-up and into the best settings for automatic logout and similar options. If this could be shared publicly with the rest of government, that would be beneficial.
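For context, the sketch below outlines how a magic link login typically works: the service emails the user a single-use, time-limited token instead of asking for a password. This is a minimal, hypothetical Python illustration - the in-memory token store, 15-minute expiry and example URL are assumptions, not the team’s implementation or recommended settings.

```python
import secrets
import time

# Minimal sketch of a magic link login flow. The in-memory store and 15-minute
# expiry are illustrative assumptions, not the service's actual settings.
TOKEN_TTL_SECONDS = 15 * 60
_tokens: dict[str, tuple[str, float]] = {}  # token -> (email, issued_at)


def request_login(email: str) -> str:
    """Issue a single-use token and return the login link to email to the user."""
    token = secrets.token_urlsafe(32)
    _tokens[token] = (email, time.time())
    return f"https://service.example/login?token={token}"


def verify_token(token: str) -> str | None:
    """Return the user's email if the token is valid and unexpired, else None."""
    entry = _tokens.pop(token, None)  # single use: removed on first attempt
    if entry is None:
        return None
    email, issued_at = entry
    if time.time() - issued_at > TOKEN_TTL_SECONDS:
        return None
    return email


if __name__ == "__main__":
    link = request_login("someone@example.gov.uk")
    print("Email this link:", link)
    token = link.split("token=")[1]
    print("Logged in as:", verify_token(token))
```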
Improving take-up and reporting performance [points 14, 15, 16, 17]
The panel were encouraged to see the team planning out what measurements would tell them whether the service was meeting needs and help them improve. The team were focussing on search terms (to understand where content may be missing), time spent on page, and a ‘further question’ section. The team should set up and test some of these measures in beta.
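As an example of how search terms can reveal content gaps, the sketch below logs queries that return no results so the content team can review them. It is a hypothetical Python illustration; the log format and search integration are assumptions, not part of the team’s measurement plan.

```python
import csv
from datetime import datetime, timezone

# Illustrative sketch: record searches that return no results, so the content
# team can see which topics users look for but cannot find.
ZERO_RESULTS_LOG = "zero_results.csv"


def log_zero_result_search(query: str, results_count: int) -> None:
    """Append zero-result queries, with a timestamp, to a CSV for review."""
    if results_count > 0:
        return
    with open(ZERO_RESULTS_LOG, "a", newline="") as log_file:
        csv.writer(log_file).writerow(
            [datetime.now(timezone.utc).isoformat(), query]
        )


# Example usage, called after running a search against the content index:
log_zero_result_search("settlement agreement template", results_count=0)
```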
Recommendations
To improve the service, the team should:
Researching and understanding user needs
- Carry out user research into the process of content production and curation, to inform the design of a successful content strategy
Running the service team
- Ensure their new content lead has ownership over the design, test and implementation of the full content cycle
- Have someone responsible for performance analysis and measurement of the service
Designing and testing the service
- Test high fidelity prototypes using existing design patterns
- Share any cases where the patterns do not work for their specific users’ needs with the cross government design community
- Explore other ways of defining legal terms, which is currently handled with a tooltip
- Explore other ways of informing users of future content
- Be consistent with other internal services like Civil Service Learning, for example with headers.
Technology, security and resilience
- Run on a public source code management tool, using private projects for secrets but primarily coding in the open
- Use open standard alternatives, instead of proprietary document formats in their workflow