Update Charity Details - Beta Assessment

The report from the beta assessment of the Charity Commission's Update Charity Details service on 29 July 2016.

Stage Beta
Result Met
Service provider Charity Commission

The service met the Standard because:

  • The recommendations from the initial assessment were all addressed
  • The team have put in place a strong approach to assisted digital
  • The team have made good progress in other areas since the initial assessment

About the service

Service Manager: Jane Adderley

Digital Leader: Craig Wyna

Details of the reassessment

Lead assessor: David Illsley

Researching and understanding user needs [point 2]

Following on from the recommendations made by the original beta assessment panel, the team has done a great job in addressing identified gaps and improving its approach to the design of the service.

An accessibility audit highlighted a few potential issues, which the team has reported as fixed. One of the highlighted challenges was the service time out, which has been reported as addressed in guidance. Unfortunately, the team was not able to test this with real users and demonstrate the experience. However, as service adoption grows, the user base is likely to include more people who are new to the service, users who are reporting financials, and users with low digital skills or access needs. It is important to test the time out experience and communications with users (even if not in a real situation) and to look closely at time out analytics and calls related to that issue. At the live assessment, the panel would be interested to know what the team has learned and how the service was changed as a result.

Following the recommendation to develop assisted digital (AD) support, the team has built impressive co-browsing support, with trained call centre staff able to identify and support users with AD needs, and has put in place a plan for users who are not online at all. The AD support service has been tested by specialists from the Tinder Foundation, and identified AD calls have been analysed. The team has reported that the number of AD-related calls is currently very low. For the live assessment, the panel would be interested to know whether the situation has changed and how the service is being developed as a result of the call analysis. AD calls could be an excellent resource for recruiting users with access needs for qualitative testing of product iterations.

The team has the challenging task of simplifying the language in the service while the wider environment uses specialist jargon. The team has reported testing with users who have not yet completed a return, analysing help text on the pages, and getting feedback through call centre staff, all of which are valuable sources of information. It would be useful to report at the live assessment how the service's language has evolved as a result of user research.

The team has a good plan in place to collect evidence in order to influence changes in legislation and to support decision making. The panel would be interested to hear about the outcome of this.

Running the service team [point 3]

The team has expanded over the last few months, bringing in skilled, experienced contractors to help up-skill Commission staff, alongside a separate plan for digital transformation within the commission.

Designing and testing the service [points 5, 10, 12]

The team have recognised the technology lock-in and limitations raised by the panel in the previous assessment. As part of a broader digital transformation, the Commission are exploring open source alternatives. The team plan for this alternative approach to be in place in time for the live assessment.

The team are now carrying out load testing, with target user numbers based on averages from several years' worth of submission data.

When running concurrent sessions for load testing, the team should ensure that different test users are used, rather than reusing the same account many times concurrently, in order to avoid overly optimistic results caused by caching effects.
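The principle can be sketched as follows. This is a hypothetical illustration only: the account names and the placeholder session function are not taken from the service, and a real load test would replace `run_session` with the actual log-in and submission journey.

```python
import itertools
import threading

# Hypothetical pool of distinct test accounts. Giving each concurrent
# virtual user its own credentials avoids per-account caching effects
# that would make the results look overly optimistic.
TEST_ACCOUNTS = [f"test-charity-{n:03d}" for n in range(50)]

def run_session(account, results, lock):
    # Placeholder for the real "log in and submit" journey.
    with lock:
        results.append(account)

def run_concurrent_sessions(concurrency=20):
    """Run `concurrency` sessions, each with a distinct test account."""
    accounts = itertools.cycle(TEST_ACCOUNTS)
    results, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=run_session, args=(next(accounts), results, lock))
        for _ in range(concurrency)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

sessions = run_concurrent_sessions()
# With 20 concurrent sessions drawn from a pool of 50 accounts,
# no account is used twice at the same time.
assert len(set(sessions)) == len(sessions)
```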

The average the team was targeting may flatten out peak transaction rates: 1,900 submissions a day, evenly spread, is very different from 1,900 submissions arriving in the last 30 minutes of the business day. The team should gather metrics that give accurate peak transaction rates in future and use those to set load testing targets.
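As a rough illustration of why this matters, assuming (hypothetically) an 8-hour business day, the peak rate implied by the figures above is an order of magnitude higher than the evenly spread rate:

```python
# Illustrative arithmetic only; the 8-hour business day is an assumption,
# not a figure from the assessment.
DAILY_SUBMISSIONS = 1900
BUSINESS_DAY_SECONDS = 8 * 3600   # assumed 8-hour business day
PEAK_WINDOW_SECONDS = 30 * 60     # last 30 minutes of the day

even_rate = DAILY_SUBMISSIONS / BUSINESS_DAY_SECONDS  # evenly spread
peak_rate = DAILY_SUBMISSIONS / PEAK_WINDOW_SECONDS   # all in 30 minutes

print(f"evenly spread: {even_rate:.3f} tx/s")
print(f"peak window:   {peak_rate:.3f} tx/s")
print(f"ratio:         {peak_rate / even_rate:.0f}x")  # 16x under these assumptions
```

Load testing against the evenly spread figure would, under these assumptions, understate the required capacity by a factor of 16.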

Improving take-up and reporting performance [point 17]

The team have made good progress in understanding their KPIs and have worked with the Performance Platform team to get a page in place.

Recommendations

To pass the live assessment, the service team must:

  • Continue to iterate and improve the service, focussing on user needs
  • Continue to upskill staff and focus on building a strong digital team
  • Take control of their own technology, and have limited dependence on single-supplier technology

The service team should also:

  • Continue to work with the commission digital transformation programme and the GDS transformation support team
  • Review the performance testing approach to ensure it is realistic.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 18 January 2017