Apply for a Budgeting Loan - Alpha Assessment
The report from the alpha assessment of the DWP's Apply for a Budgeting Loan service on 20 October 2015.
Department / Agency:
DWP
Date of Assessment:
20/10/2015
Assessment Stage:
Alpha
Result of Assessment:
Pass
Lead Assessor:
D. Williams
Service Manager:
Z. Gould
Digital Leader:
K. Cunnington
Outcome of service assessment
After consideration, the assessment panel has concluded that the Apply for a Budgeting Loan service is on track to meet the Digital Service Standard at this early stage of development.
Reasons
The service team demonstrated an intent to digitise the entire service rather than simply replicate the current paper form in an online format. It was encouraging that the team have challenged policy in order to remove steps that are not material to the decisions, and the paper forms will be amended to reflect these changes.
The team have clearly recognised the importance of user research as a critical part of the project and demonstrated a good understanding of the user needs. The involvement of the current team and others in user research is positively recognised, as is the intention to immerse all new team members in the future. The panel was impressed that the team are using feedback from support centres to inform user stories.
The team have also undertaken research with some assisted digital (AD) users, and have identified that they may have a slightly different set of needs and realised the importance of further research into different groups of AD users in beta.
Although there seemed to be some confusion regarding the cost per transaction, there is a clear understanding of the other 3 KPIs. The panel felt that this is advanced thinking for the alpha stage.
The team have considered and mitigated a number of external and internal threats to the service and minimised the data captured and retained on the service.
The team demonstrated good stakeholder engagement and communication, and show & tell sessions have received positive feedback particularly with regard to user testing informing better understanding of user needs.
The team has made an impressive start to research with AD users, and have clearly understood the importance of supporting this user group. Demonstrating commitment to the off-screen service, AD is an ongoing feature of the team backlog. The panel would encourage the team to continue to showcase this work across the department, including at show & tells.
In order to gain the best picture of AD user needs, a variety of methods and routes have been followed. A survey with partner organisations was used as a way to reach both those who support users and users themselves. Through this work, the team were able to identify local welfare rights and support groups, housing associations/charities and the Citizens Advice Bureau as key routes for support. Further research over the phone, through third parties and in pop-up format has given the team an early indication of the AD population and possible support routes to test. The team showed a good understanding of the differing needs of both working and pension age users, mapping both groups on the digital inclusion scale.
Building on the evidence so far, further research will be carried out at local community and Sure Start centres, Age UK, Citizens Advice Bureau and Jobcentres across various sites. The team also plan on pursuing research with their internal home visits team and joining up with other services within the department, who may be able to offer research opportunities and share knowledge.
It was very encouraging to see the team proactively looking for AD users in places they currently go for support, and approaching this with a ‘user first’ attitude.
The team have a digital take-up plan based on current research findings and should continue to develop this as they understand more about user behaviours, support needs and digital skills.
Recommendations
The panel has some concerns as the service moves into the beta stage and has the following recommendations on how to address these.
Point 1 - Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.
The emphasis on user research has been impressive, and there is a good understanding of user needs. However, the methods used in discovery when speaking with end users were more attitudinal than behavioural - for example, pop-up groups exploring what people thought of GOV.UK. The panel recommends that research in beta and beyond focuses on gathering behavioural data, i.e. what people do rather than what they say.
It might be helpful to users to investigate a method for calculating the loan amount based on responses to previous questions.
The team have collected valuable evidence around the use of third parties for support, and must feed this into the department-wide review of services to ensure this support is made sustainable. In addition, research suggested that pensioners ‘felt embarrassed’ about needing help to complete an online application, and so wanted to get this help from friends and family. As support from friends and family is not sustainable, the service must continue to put in place AD support to meet the needs of this group of users.
Point 2 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.
Lab-based user research should also provide the opportunity for more targeted sampling of users. The panel recommends that the recruitment briefs demonstrate that research is being carried out with users who have recently applied for a budgeting loan, so that they are able to proceed through the service with a personally relevant context of use. The sample should also continue to include users with lower digital confidence and users with AD needs, and the panel recommends that some of the testing is done on handheld devices.
As stated in point 1, the panel recommends that research methods are focused on gathering behavioural data. It would be appropriate to consolidate the existing evidence around user needs with more behavioural evidence. For example, contextual research in the homes of applicants would provide more grounded, behavioural data around the user needs.
Point 3 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.
A content designer needs to be made available for a minimum of 3 days per week.
It was of concern that the content designer had only observed one round of user research during discovery, as user research should be informing content as well as interaction design.
Point 4 - Build the service using the agile, iterative and user-centred methods set out in the manual.
The governance arrangements seem overbearing, are potentially distracting for the service manager, and may affect the success of the project. There are risks attached to being used as a ‘guinea pig’ for other business areas.
Point 6 - Evaluate what tools and systems will be used to build, host, operate and measure the service, and how to procure them.
The team are considering the use of Redis, and there are pros and cons to this approach. It is difficult to introduce a Redis cache without dealing with the single point of failure it can create: Redis leans towards consistency rather than availability, and a master/slave setup can still produce outages, so the design needs to be well thought out. The other approach being considered is to use hidden HTML fields, which is the panel’s preferred approach as it keeps the cluster entirely stateless and therefore more scalable.
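As an illustration of the stateless option the panel prefers, the sketch below serialises the answers collected so far into a signed hidden field, so no node in the cluster holds session state. This is a minimal sketch, not the team's implementation; the field contents, key handling and names are assumptions.

```python
# Minimal sketch of carrying multi-page form state in a signed hidden field,
# keeping the web tier stateless. Names and key management are illustrative.
import base64
import hashlib
import hmac
import json

SECRET_KEY = b"replace-with-a-managed-secret"  # assumption: key comes from config/secret store


def encode_answers(answers: dict) -> str:
    """Serialise answers and append an HMAC so the value can travel in a hidden field."""
    payload = base64.urlsafe_b64encode(json.dumps(answers).encode())
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{signature}"


def decode_answers(field_value: str) -> dict:
    """Verify the signature before trusting anything posted back from the browser."""
    payload, signature = field_value.rsplit(".", 1)
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        raise ValueError("tampered or corrupted form state")
    return json.loads(base64.urlsafe_b64decode(payload))


# Usage: each page renders encode_answers(...) into a hidden input and calls
# decode_answers(...) on the next POST, so any node can serve any request.
state = encode_answers({"loan_purpose": "furniture", "amount_requested": 300})
print(decode_answers(state))
```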
The team need to consider error handling from the back-end system, as this is partly asynchronous. If the hand-off is synchronous, there may be a latency cost while ensuring the message is successfully delivered to the back-end DRS systems; the team need to consider how this will affect peak performance, and it should be addressed in load testing. If asynchronous, the team need to consider retries, timeouts and error reporting. The user should not be left thinking they have submitted a form when in fact it has not been properly processed.
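The sketch below illustrates the retry/timeout handling described above, assuming a synchronous HTTP hand-off to the back-end system; the endpoint URL, payload shape and retry policy are illustrative assumptions rather than the team's actual integration.

```python
# Hedged sketch: only confirm success to the user once the back-end has
# acknowledged the submission; otherwise retry with back-off and then surface
# the failure rather than silently dropping the application.
import time
import requests

DRS_ENDPOINT = "https://drs.example.internal/submissions"  # assumption: illustrative URL


def submit_application(payload: dict, attempts: int = 3, timeout: float = 5.0) -> bool:
    """Return True only when the back-end acknowledges the submission."""
    for attempt in range(1, attempts + 1):
        try:
            response = requests.post(DRS_ENDPOINT, json=payload, timeout=timeout)
            response.raise_for_status()
            return True
        except requests.RequestException:
            if attempt == attempts:
                return False  # caller must show an error or queue for later, not a success page
            time.sleep(2 ** attempt)  # simple exponential back-off between retries
    return False
```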
The team have opted to use MongoDB as a back-end store. As the team do not have much experience with MongoDB, the panel recommends thinking very carefully about the sharding key. This key distributes the data amongst the nodes, and a poorly designed key can lead to very poor performance as the dataset grows. The team should consider getting advice from MongoDB’s schema design service and test the sharding under load with a full database.
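For illustration, the sketch below shards a collection on a hashed, high-cardinality key so writes spread evenly across shards, rather than a monotonically increasing key that funnels all inserts to one shard. The database, collection and field names are assumptions, and the commands must be run against a mongos in a sharded cluster.

```python
# Minimal sketch of choosing and applying a hashed shard key with pymongo.
from pymongo import MongoClient

client = MongoClient("mongodb://mongos.example.internal:27017")  # assumption: cluster router

# Enable sharding for the database, then shard the collection on a hashed key.
client.admin.command("enableSharding", "budgeting")
client.admin.command(
    "shardCollection",
    "budgeting.applications",
    key={"application_id": "hashed"},
)

# Test the choice under load with a production-sized dataset: an even chunk
# distribution is what keeps performance stable as the data grows.
print(client.admin.command("listShards"))
```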
Point 7 - Evaluate what user data and information the digital service will be providing or storing, and address the security level, legal responsibilities, privacy issues and risks associated with the service (consulting with experts where appropriate).
The team’s security architect has recommended an appliance called ShapeShifter, which constantly moves the HTML and JavaScript around in order to prevent bots from attacking the site. It is hard to see how the service could guarantee the integrity of the design across devices if the code is constantly changing. Presumably the appliance changes form names and tracks this at the back-end; if so, it must be stateful, making the site more difficult to scale. The panel recommends treading very carefully: ensure adequate performance testing is in place that includes the appliance if it is used, and establish whether the device introduces state, as this will make scaling more difficult. The panel was also unsure what this is hoping to achieve if the service already has DDoS protection at the front end.
Point 8 - Make all new source code open and reusable, and publish it under appropriate licences (or provide a convincing explanation as to why this cannot be done for specific subsets of the source code).
The team should make the service code more widely available, including to those outside government.
The team should also ensure the new development team are aware of the need to document code at the start of the beta.
Point 9 - Use open standards and common government platforms where available.
The panel would encourage the team to investigate and monitor the progress of the GDS Platform as a Service (PaaS) and Notifications projects.
Point 13 - Build a service consistent with the user experience of the rest of GOV.UK including using the design patterns and style guide.
The panel strongly feels that the team needs to have a front-end developer in the team for beta. There is also a need for more time from a content designer, as mentioned in point 3.
Point 14 - Encourage all users to use the digital service (with AD support if required), alongside an appropriate plan to phase out non-digital channels/services.
The team should continue to gather ‘turn-down’ reasons to help inform improvements to guidance provided to applicants using the online form.
Point 16 - Identify performance indicators for the service, including the 4 mandatory key performance indicators (KPIs) defined in the manual. Establish a benchmark for each metric and make a plan to enable improvements.
The panel discussed a ‘light-touch’ diary study of the full end-to-end user journey which could be used to inform contextually appropriate performance measures.
Summary
The panel was impressed with the cohesion and knowledge within the team. The team all demonstrated a passion and dedication to providing the best possible solution for service users, and a commitment to continue gaining further insight into their needs.
Digital Service Standard criteria
| Criteria | Passed | Criteria | Passed |
| --- | --- | --- | --- |
| 1 | Yes | 2 | Yes |
| 3 | Yes | 4 | Yes |
| 5 | Yes | 6 | Yes |
| 7 | Yes | 8 | No |
| 9 | Yes | 10 | Yes |
| 11 | Yes | 12 | Yes |
| 13 | No | 14 | Yes |
| 15 | Yes | 16 | Yes |
| 17 | Yes | 18 | Yes |