UKVI contact centre procurement beta assessment

The report from the beta assessment for UKVI's contact centre procurement on 26 April 2017.

From: Central Digital and Data Office
Assessment date: 26 April 2017
Stage: Beta
Result: Not met
Service provider: Home Office

To meet the Standard the service should:

  • Perform more user research and testing with more users in a private beta setting.
  • Manage users’ expectations: this is a new service that charges for something that was previously free. Explain clearly to users what they are paying for, and have research ready at reassessment to demonstrate that users understand why they’re being charged.
  • Fill the gaps in the team - visual designer, content designer, full-time user researcher, front-end developer and web ops - that are needed to run the service.

About the service

Description

This UKVI (UK Visas and Immigration) digital service allows users who have applied for a UK visa to:

  • ask a question about their application and pay
  • report a technical problem with the digital service

Service users

There are two distinct groups of service users:

  • users who are contacting UKVI from outside the UK; and
  • users who are contacting UKVI from inside the UK

Detail

User needs

Ask UKVI a question by email

For a service reaching the end of the private beta phase the amount of user research that had been completed was limited. This was largely due to the service team only having had a user researcher in place for the three weeks prior to assessment.

The team had completed two rounds of usability testing with 5 to 6 participants and a further two rounds with users with assisted digital needs. The team had identified other users, such as people applying for asylum and agents applying on behalf of someone else, but hadn’t tested the service with anyone from these groups.

The usability testing that had been completed was limited in scope. The team hadn’t tested ‘unhappy paths’, such as not receiving the email acknowledgement or being unable to complete the form in a single session. They also hadn’t tested any of the email response templates or how the end of an email exchange would be communicated to users. This meant it was hard for the service team to describe the scenarios where a user would have to start a new email enquiry, as opposed to continuing an email exchange. This is particularly important given that this is a paid-for service. The service team need to test the service all the way through to ‘query resolution’ in order to understand what impact charging for the service has.

The service team had analysed data from the existing email service and so had a good understanding of the kinds of queries the service would be used for. They gave examples such as finding out what kind of visa application to make and getting reassurance that everything was okay with a visa application if it hadn’t been processed within the expected time period. However, there was limited evidence to demonstrate why the service team had chosen an email contact form as the primary way to address these needs. Having a user researcher and designer involved earlier on would have allowed the team to explore other ways to meet these needs and understand how contacting via email would work alongside other channels. The overall view of the panel is that the development of the service was driven by a perceived need to replace the existing email contact service, rather than by evidence that this was the best way to meet users’ needs.

It was encouraging to see that the findings from the user research had been feeding directly into the product backlog. The user researcher also had an excellent understanding of the behaviour and needs of assisted digital users. The team had clear plans for the user research they would be doing in future, which included further usability testing, observational research and interviewing service users.

Team

The service is being built and run by third-party suppliers who are already delivering other parts of UKVI’s contact service. The developers are supplier side and work remotely, but there seems to be an effective workflow between user research and development. User stories and change requests are captured, sized and prioritised in a shared spreadsheet, and change requests generally have a one-day turnaround.

The teams are not co-located: the development team is managed by the supplier and is sent requirements by the Product Owner. The assessment team needs reassurance that this process will continue once the service is live.

There does seem to be some duplication of roles, and clarity is needed around who is performing which role.

There are no design roles (content, interaction or service) embedded in the team. The team have used the Home Office community to review interaction design and content, but these have been ad hoc spot reviews, and only since the user researcher joined the team 3 weeks ago.

The panel would like to have seen a plan that covers how the team will move into running the service publicly. This should be considered before the service transitions over to the supplier to run.

Technology

The contact and technical error forms are surfaced directly from the CRM software selected to run the call centre, Oracle Service Cloud. This is a hosted SaaS product customised under contract to UKVI by Sitel. The web service initiates an email dialogue, handled by an operator in Sitel’s call centres, where operators can interpret requests for information and provide emailed answers using templates.

Call centre staff have no access to personal information about users stored in Home Office systems. The UKVI team assured the panel that inadvertent disclosure of personal information or a credit card number (either in text or as a screenshot) could be scrubbed from Oracle with a support request, and Home Office security had visited and consulted with Sitel to ensure this process would operate effectively. If any personal data needs to be used as part of an email dialogue, the conversation is escalated to an internal Home Office team. UKVI are content this complies with data protection law, and relevant Home Office accreditors and the SIRO have been involved through the procurement and implementation of the call centre contract.

Oracle Service Cloud is a proprietary product. GDS has previously required departments choosing hosted SaaS platforms such as Oracle, Salesforce or Microsoft Dynamics to design and prove an export process for customer histories in case a migration to a competing product is required in future. However, since no personal information about casework is stored in the Oracle system, this requirement is not relevant at this time.

The panel still recommends the team consider mitigations for supplier lock-in and migration options, especially if the situation changes. Any content created for users (such as email templates) should be exportable, and it appears that HO/UKVI has ensured that all customisations to Oracle, including any intellectual property created by Sitel, are owned by the department.

The contract signed with Sitel is for two years, and HO/UKVI have a process in place to evaluate whether the contract has produced value for money and has met user needs before it is reprocured. HO/UKVI are in a position to judge future contract tenders using metrics and KPIs collected from this one.

There is no source code that can be opened, as the solution is largely ‘off-the-shelf’ with some customisations. The panel suggests that if there is a way to open up these customisations, the team should consider doing so, as they may prove useful to another government department or agency customising Oracle Service Cloud for other purposes.

For the limited customisations required to Oracle, the team are suitably empowered to push changes into production quickly (in only a few hours), have a suitable set of environments for testing changes (including dev, staging) and are load testing at a multiple of their required capacity. The SaaS will provide for most needs around uptime and disaster recovery, and some internal monitoring and escalation pathways exist for Oracle outages.

The team noted JavaScript may currently be required on the payment processing pages. The panel encourages the team to check whether this is the case, as the Service Standard requires services to be built with progressive enhancement, as explained here:

https://www.gov.uk/service-manual/technology/using-progressive-enhancement

At the next assessment we will look for evidence that the service is built with progressive enhancement.
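
As an illustration only (the panel did not review the service’s actual markup, and the form selector and field name below are hypothetical), progressive enhancement means the plain HTML form posts to the server and works with JavaScript disabled, with any script layering improvements on top:

```typescript
// Minimal sketch of progressive enhancement for a contact or payment form.
// The '#contact-form' selector and 'email' field name are assumptions, not
// taken from the service; the real pages are rendered by Oracle Service Cloud.
// Baseline: the <form> submits via a normal POST and works without JavaScript.
// This script only adds client-side validation when JavaScript is available.

function enhanceContactForm(): void {
  const form = document.querySelector<HTMLFormElement>('#contact-form');
  const email = form?.querySelector<HTMLInputElement>('input[name="email"]');
  if (!form || !email) {
    return; // nothing to enhance; the plain HTML form still submits normally
  }

  // Clear any previous error as the user types, so the form can be resubmitted.
  email.addEventListener('input', () => email.setCustomValidity(''));

  form.addEventListener('submit', (event: Event) => {
    // Simple client-side check; the server must still validate, because
    // users without JavaScript will bypass this entirely.
    if (!email.value.includes('@')) {
      event.preventDefault();
      email.setCustomValidity('Enter an email address in the correct format');
      email.reportValidity();
    }
  });
}

document.addEventListener('DOMContentLoaded', enhanceContactForm);
```

Whatever the client-side behaviour, the server-side validation and payment flow must work unchanged for users without JavaScript.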

There were still a few bugs we noted during the demo, including opening hours being displayed as Mon-Sun 00:00-23:59. The panel would expect these to be fixed before the public beta is ready to launch.

The panel asked if the team would be able to launch a holding page for an extended Oracle outage, and while Oracle includes this ability internally, the panel suggests the team check how this functions and look at whether an outage of the Oracle platform may also take down these pages.

UKVI wish to use the same payment processor for their online and telephone offerings, and as such, GOV.UK Pay is not yet a suitable solution covering these needs. The team has contacted GOV.UK Pay and will continue to engage with them, looking at the possibility of migrating to it when it adds telephone capabilities in the future.

Design

There was no discovery phase in which to truly assess the validity and scope of both services. There has not been a clear alpha phase in which different approaches were tried and tested with users. Without trying alternative prototypes, or exploring the end-to-end user journey, it’s hard to know if this is the best design solution to meet user needs for the GOV.UK part of the service.

Neither service seems to consistently follow the GOV.UK front-end design patterns. A design role on the team would help address this. Examples of the impact include not following the ‘one thing per page’ pattern, the back and cancel buttons on the payment page (where we advise against the use of multiple buttons), and the way the service asks for users’ names. The Sage payments platform places some limits on how closely the design can adhere to GOV.UK patterns, but the multiple green buttons, for example, could be addressed.

International email contact service

The panel have concerns about how the introduction of a charge for email contact will impact upon users who are self serving, how this impacts upon their expectations of the service, and how this service fits into broader end-to-end journeys that users are on. With a charge it becomes important to know what a user is getting for their money, how long that query lasts and at what point the line is drawn by contact staff and a new query must be raised. This needs both service and content design work.

There’s no clear understanding as to where the user journey starts and ends for the international contact service, and this is therefore not made clear to users through the service itself. As such, there is a risk that users will end up paying several times for an ongoing query, and there is no route from the start page for users who have already raised a query. The form has been tested from the GOV.UK start pages, but not to the end of the contact journey. Further exploration of this, through mapping of the end-to-end and unhappy paths, is needed to make the service design more robust.

Technical error service

This feedback form will be situated inside services for users to give feedback when things go wrong.

We did not see the form demonstrated in the context of a service, or any usability testing to that effect, so it’s hard to know how effective the design is. For example, users are asked to provide a screenshot of their issue, but how they know to do that before entering the feedback form is unclear. The form itself is very simple, but how this fits into other services needs more exploration and design work.

Analytics

The team have engaged with GDS on metrics. However, the team do not currently have direct access to any analytics and are reliant on the supplier to send reports to the Service Manager and the Product Owner on a weekly basis, which is not ideal. The current service receives 1.4 million queries per year, so real-time monitoring is the ideal. The team are not sure what levels of traffic the new service will attract. The tools being used to measure site traffic are Oracle’s out-of-the-box capabilities. A strong recommendation is that the service team (Product Owner and Service Manager) gain direct access to the figures and configure them so that improvements can be made based on real-time management information (MI).

The panel would recommend that there are two sets of metrics: one for the email enquiry form and one for the technical error form.

The team must test the service with the minister responsible for it at some point, to ensure that they are happy with the overall approach and to socialise the metrics that the service is hoping to achieve.

Recommendations

To pass reassessment, the service team must:

  • Test the end-to-end journey - including email responses and unhappy paths - with representative users of the service. Avoid explicit reference to contacting UKVI in usability testing scenarios to understand how the email service works alongside other channels.
  • Continue work to identify the different user groups for the service. Make sure that future research is conducted with a cross-section of these users to gain insight into the diverse contexts and challenges facing users.
  • Conduct research or draw on existing research to identify the specific use cases for the email service.
  • Review the shape of the team to reflect that the service is moving to live running. Also review processes for managing the service’s response if it becomes unavailable and how that’s communicated to users. Capture this in a plan that all suppliers are signed up to.
  • Make more visible to users the point at which a query becomes a new query and therefore incurs an additional cost. This was very unclear at the assessment; before the team go live, the upfront content on GOV.UK needs to make this very clear to users in order to manage expectations.

The service team should also:

  • Conduct user research with the agents responding to emails to understand what information they need to be able to respond to a query, to inform future iterations of the contact form.
  • Look at the assisted digital needs of users. Some work has been done here, but assumptions seem to have been made that assisted digital users would seek face-to-face help. Plotting users on the digital inclusion scale could help. Although assisted digital is not a mandatory requirement for overseas user groups, findings have already surfaced that a significant proportion of the user base could have assisted digital needs, so it would be wise to understand how those users are supported.
  • Give the Service Manager and Product Owner direct access to metrics so that they are able to identify and respond to problems with the service and make improvements.

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Not met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Not met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Not met
11 Planning for the service being taken temporarily offline Not met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Not met
15 Using analytics tools to collect and act on performance data Not met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Not met

Updates to this page

Published 30 July 2018