Heavy vehicle testing review: final report
Published 10 March 2021
Background and purpose of the review
Background to the review
Statutory testing requirements
The Road Traffic Act 1988 generally prohibits the use of a heavy vehicle on a road unless a test certificate has been issued for that vehicle within the past year. Section 47 of the Act covers Public Service Vehicles (PSVs) and Section 53 covers Heavy Goods Vehicles (HGVs). Annual roadworthiness tests are carried out by Driver and Vehicle Standards Agency (DVSA) staff, known as Vehicle Standards Assessors (VSAs). Details of how annual tests should be carried out are contained within the Goods Vehicles (Plating and Testing) Regulations 1988 for HGVs and the Motor Vehicles (Tests) Regulations 1981 for PSVs.
Most heavy vehicles tested are also regulated through operator licensing rules, supervised by traffic commissioners and subject to specific DVSA enforcement. Under the operator licensing rules, operators are required to conduct maintenance and inspections regularly (usually about every 6 weeks), which should ensure standards remain at or ideally above the standard of the annual test. The separate annual test then provides an external, independent check of the effectiveness of these systems.
In addition to the annual test, there are various other in-service tests to check particular safety aspects of vehicles when used for specific purposes. These are often collectively known as ‘specialist inspections’ and include, for example, European Agreement on the International Carriage of Dangerous Goods by Road (ADR) checks for dangerous goods vehicles.
Test provision
At present, there are about 436 full-time-equivalent VSAs able to conduct tests. The majority of heavy vehicle testing takes place at Authorised Testing Facilities (ATFs). DVSA has contractual arrangements with ATFs, which set out the obligations of the ATF (provision of the test facility, scheduling of vehicles for test) and of DVSA (scheduling of VSAs).
Vehicle operators book tests with ATFs, often via contractual arrangements for maintenance. Some larger operators may have their own ATF, which is not open to other operators.
DVSA charge a set fee per vehicle tested (as set out in regulation), while ATFs may also charge a ‘pit fee’ to operators for use of their facilities on top of the statutory fee. Pit fees are capped in the ATF contract. The statutory test fee funds DVSA’s delivery of heavy vehicle testing and also provides funding for GB heavy vehicle enforcement (also conducted by DVSA).
In 2018 to 2019, 432,778 HGV tests, 254,439 trailer tests and 77,766 PSV tests were carried out. There are also lower numbers of specialist tests, such as those for dangerous goods, which may be carried out at the same time as the annual test.
Test failure rates and influencing factors
Not surprisingly, test fail rates vary significantly depending on the type of fleet and vehicle. There is a strong positive correlation between vehicle age and test failure rates. The failure rate for HGVs is generally higher than that for PSVs.
Table 1: test fail rates in 2018 to 2019, by vehicle age and category
Vehicle age (years) | HGV fail rate | PSV fail rate | Trailer fail rate |
---|---|---|---|
<1 | 3.6% | 2.5% | 3.6% |
2 | 4.2% | 4.3% | 4.7% |
3 | 5.0% | 4.7% | 5.4% |
4 | 6.2% | 5.4% | 6.5% |
5 | 8.6% | 6.3% | 7.5% |
6 | 9.3% | 7.6% | 8.6% |
7 | 11.8% | 9.7% | 10.1% |
8 | 14.4% | 8.0% | 11.0% |
9 | 16.6% | 11.3% | 12.1% |
10 | 18.3% | 9.9% | 13.7% |
11 | 20.3% | 13.9% | 14.8% |
12+ | 31.6% | 20.5% | 18.8% |
Note: There is no trailer registration scheme, so the age of trailers tested has been estimated using each trailer’s identity (ID) number.
To help support effective enforcement of heavy vehicle roadworthiness, DVSA assesses the risk of operators and uses this as a targeting tool. DVSA calculates its Operator Compliance Risk Score (OCRS) on the basis of an operator’s historic compliance with roadworthiness and traffic rules, with test fail rates a significant contributor. Other factors that influence scores include roadside inspections, prosecutions and weighing checks. Operators are assigned to ‘traffic light’ bands based on their score. DVSA can use the score to target its enforcement activities toward the operators likely to pose a higher road safety risk.
Operators who can show that their maintenance systems are exemplary (and with an annual test fail rate that backs this up) can apply for earned recognition (ER) status. These operators are less likely to have their vehicles stopped for a roadside inspection. Because test fail rate feeds into OCRS and must be ‘good’ for ER to be granted, table 2 shows the expected relationship between fail rate, OCRS band and earned recognition. Given their high compliance rates, it could be considered whether earned recognition operators should have less frequent testing intervals. This could also have the effect of increasing flexibility in the system.
Table 2: initial test failure rates in 2018 to 2019, by OCRS band, for both HGVs and PSVs
Operator type | OCRS band | Initial fail rate |
---|---|---|
HGV | Earned recognition | 3.2% |
HGV | Green | 7.0% |
HGV | Amber | 18.2% |
HGV | Red | 29.2% |
HGV | Unassigned | 17.8% |
PSV | Earned recognition | 2.5% |
PSV | Green | 6.0% |
PSV | Amber | 21.1% |
PSV | Red | 30.3% |
PSV | Unassigned | 24.0% |
Table 3: summary of annual testing data for HGVs, PSVs and trailers
Year | Tests | Pass after rectification | Fails | Initial fail rate | Final fail rate | Retests | Retest fail rate |
---|---|---|---|---|---|---|---|
2018 to 2019 | 764,983 | 30,443 | 72,881 | 13.9% | 9.3% | 71,419 | 6.5% |
2017 to 2018 | 730,056 | 33,592 | 68,002 | 13.9% | 9.3% | 67,257 | 6.5% |
2016 to 2017 | 753,509 | 37,365 | 73,108 | 15.3% | 9.6% | 70,760 | 6.8% |
2015 to 2016 | 733,312 | 42,026 | 70,373 | 15.3% | 9.6% | 70,760 | 6.8% |
2014 to 2015 | 723,026 | 49,792 | 77,582 | 17.6% | 10.7% | 78,616 | 7.6% |
Note: the initial fail rate is the rate for vehicles as they were brought for the annual test. The final fail rate excludes vehicles that pass the test after rectification of minor defects at the time of the test.
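The note above implies a simple relationship between the columns in table 3. As a minimal illustration (an assumption based on the definitions given, not DVSA’s published methodology), the sketch below computes both rates from the counts for a single year:

```python
# Minimal sketch of the fail rate definitions implied by the note above.
# Assumption: the initial fail rate counts vehicles that either failed outright
# or passed only after rectification at the time of test, while the final fail
# rate counts outright failures only. Published figures may not reconcile
# exactly for every year, for example due to rounding.

def fail_rates(tests: int, pass_after_rectification: int, fails: int) -> tuple[float, float]:
    """Return (initial fail rate, final fail rate) as percentages."""
    initial = (fails + pass_after_rectification) / tests * 100
    final = fails / tests * 100
    return initial, final

# Example using the 2014 to 2015 row of table 3.
initial, final = fail_rates(tests=723_026, pass_after_rectification=49_792, fails=77_582)
print(f"initial fail rate: {initial:.1f}%")  # 17.6%
print(f"final fail rate: {final:.1f}%")      # 10.7%
```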
Results from roadside checks can also provide an interesting perspective on the roadworthiness condition of vehicles. But, because these checks are targeted, the data needs treating with care. The ‘defect rate’ of normal roadside checks, therefore, gives an indication of the effectiveness of targeting, as well as the state of the fleet – so you would expect the ‘real’ underlying defect rate to be lower. To deal with this, the Department for Transport (DfT) commissions DVSA to conduct a ‘fleet compliance survey’. This uses a controlled sample to check ‘random’ vehicles to reflect the make-up of the fleet. Table 4 shows the results of this survey.
Table 4: fleet compliance survey – prohibition rates, for HGVs and trailers
Measure | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 |
---|---|---|---|---|---|---|---|---|
GB HGV vehicles prohibited | 374 | 252 | 259 | 275 | 240 | 230 | 279 | 269 |
GB HGV vehicles checked | 3,609 | 2,445 | 2,621 | 2,694 | 2,575 | 2,446 | 2,530 | 2,525 |
Prohibition rate HGV (%) | 10 | 10 | 10 | 10 | 9 | 9 | 11 | 11 |
GB HGV trailers prohibited | 227 | 170 | 161 | 180 | 135 | 125 | 159 | 146 |
GB HGV trailers checked | 1,712 | 1,397 | 1,370 | 1,408 | 1,375 | 1,249 | 1,360 | 1,392 |
Prohibition rate trailer (%) | 13 | 12 | 12 | 13 | 10 | 10 | 12 | 10 |
It is noted that the absolute prohibition rate is slightly lower than the initial test failure rate. This is expected, because the check conducted during the fleet compliance survey does not allow as thorough an examination as the annual test. So, while the trends in fail rate and fleet compliance are comparable, care must be taken when comparing absolute values.
The differences between fleet compliance and annual test results are interesting. Over the past 10 years, DVSA has become more systematic in how it uses its data for targeting purposes, and has been open about doing so – fail rate is now seen by operators as an important measure to manage (and keep low).
Of note is that initial test failure rates across heavy vehicles have roughly halved over the past 10 years – from around 24% to around 12%. Over the same period, the fleet compliance survey shows that in-use vehicle condition has not improved – perhaps indicating that test preparation has improved rather than absolute in-service vehicle condition. It should be noted that the number of testable items has increased over time, or standards have tightened, so static figures may still represent an improvement in roadworthiness.
Effect of COVID-19 on the testing system
As a consequence of the COVID-19 pandemic and its effect on staff availability and the ability to safely work in ATFs (there was no ‘safe’ operating procedure for testing at that time), routine testing was suspended in March 2020.
To ensure vehicles could remain operational, vehicles due a test were issued with a Certificate of Temporary Exemption (CTE), which added (initially) 3 months to the test due date. While testing resumed in July 2020, a significant backlog of untested vehicles built up, which would have overwhelmed the capacity of the testing system, if mandatory testing was simply reintroduced.
It was, therefore, decided that the testing capacity available should be used to test the vehicles posing the highest road safety risk, while safer vehicles would be tested when capacity allows. This new approach has been enabled by the changes introduced by the Business and Planning Act 2020. This system has been in place for tests due from September 2020 onwards and the backlog of vehicles with extensions to test validity is gradually being reduced.
During the passage of the act, peers raised issues with the current testing system, in particular asking whether the introduction of delegated testing (whereby testing would be carried out by the private sector) would be considered. Although it was confirmed that delegated testing was not being considered for a variety of reasons, ministers agreed that a review of the testing system would take place, to be completed before the end of the year. The agreed purpose of the review is set out below.
Purpose of the review
The purpose of the review is outlined in the terms of reference (TOR), which were agreed by ministers and shared with the stakeholder panel at the first meeting. They state:
This review will focus on understanding whether current roadworthiness testing is fit for purpose and provide evidence on whether it supports or hinders the effective operation of the haulage and logistics industries. It may also identify areas for further investigation.
The review will consider if heavy vehicle roadworthiness testing can be improved to best meet the needs of customers and suppliers, while delivering the road safety and environmental benefits of vehicle testing (for example, via ensuring the effectiveness of operators’ vehicle maintenance systems).
The review will consider the service during business as usual and also the period from March 2020 to August 2020. It will develop recommendations to DfT ministers, to inform strategic operational and ministerial decisions.
The roles of the stakeholder panel and an outline of how the meetings of the panel were organised are both discussed in the TOR.
Several central considerations are laid out in the TOR, which guide the structure of this report. These are set out in full in Annex A, and the main elements are reproduced in the course of this report.
Besides these considerations, at the outset of the Heavy Vehicle Testing Review, the Chair asked the stakeholder panel to identify what they believed a heavy vehicle testing service should deliver. Their views can be summarised as:
- availability of test slots well in advance
- a long-term booking system, as the current system only allows bookings to be made 3 to 6 months in advance
- an accurate digital tool that shows availability by time and by location, particularly for short notice availability
- flexibility of tester availability and test slots offered
- the ability to adapt to developments in vehicle technology
- a competitive environment for ATF operators and affordable service for vehicle operators
- a commitment to improving road safety
Considerations
Performance up to February 2020
As set out in the TOR, the first consideration of this report relates to how heavy vehicle testing has performed and issues arising during the 2 years up to February 2020. This period was chosen to ensure an assessment of ‘normal’ testing performance – in other words, how well the testing system was functioning before the effects of the ongoing COVID-19 pandemic.
This consideration is focused on assessing end-to-end service delivery and identifying developments that could be made to maintain and improve road safety, as well as ensuring that customer needs are met. It includes consideration of the following aspects:
- how accessible bookings have been to customers
- how effective staff allocation to ATFs has been
- how effectively ATFs have been using DVSA staff
- the implications for heavy vehicle testing of the DVSA service stability plans completed since summer 2018, and of planned future improvements by DVSA
- the purpose, effect, sustainability and fairness of the current policy on new ATFs
- whether fees are sufficient and used effectively (this consideration should inform any future formal fee review)
- the main strengths, weaknesses, opportunities and threats of the current service model
These issues are considered in turn below. Before doing so, it is worth highlighting that DVSA commissions independent user satisfaction surveys of both ATFs and operators.
For ATFs, overall, more than 8 in 10 are satisfied with the service they receive from DVSA (84%), which is again a significant improvement compared with 2018, when just under half were satisfied (48%). Just 6% of ATFs express any degree of dissatisfaction with DVSA, compared with 25% in 2018. Among the few who are dissatisfied, reasons vary and include communication, flexibility and the current lack of service.
For operators, 83% are satisfied overall with the DVSA, with over a third very satisfied (37%). This is significantly higher than the level of satisfaction in 2019 when 65% were satisfied overall. Just 4% of operators expressed dissatisfaction towards the DVSA overall.
Three-quarters of operator respondents are aware of the booking process for annual tests, and 73% said they are usually able to get the exact test dates they want at their regular facility, which is lower than in previous years (86% in 2019). If desired dates are not available, the highest proportion of operators (43%) say they are usually offered an alternative date more than 20 working days from their desired date. Of all those aware of the earliest date usually offered, four-fifths (81%) feel the length of time is too long.
The highest proportion (45%) of operators mainly use an independent ATF, and of those which use ATFs, 91% say there are no regions in particular where testing availability is an issue.
To book annual tests, 38% of those who use ATFs usually contact the ATF within 1 month, and the same proportion usually books within 1 to 3 months. 85% said they have not had an annual test appointment cancelled by an ATF, while 13% have (versus 16% in 2019).
82% of operators are satisfied with the ease of booking tests with ATFs, while 87% are satisfied with being kept up to date during the booking process and the overall booking process. These proportions are more positive than in 2019. Reasons for dissatisfaction mainly relate to the availability of slots and long waiting times.
Accessibility of bookings
The current testing model functions by a contractual arrangement between DVSA and ATFs, whereby DVSA provides VSAs to the ATF. Vehicle operators then book tests with ATFs, so the operator has the immediate relationship with the ATF and not with DVSA.
Operators are required to have maintenance schedules planned well in advance, and so it is not unexpected they would wish to book vehicles in for their annual test well in advance as well.
Under current arrangements with ATFs, DVSA will confirm the schedule of testing staff to ATFs 3 to 6 months in advance – and some ATFs will therefore not confirm vehicle bookings beyond that point.
The panel members consider there would be advantages in DVSA confirming testing staff further ahead. It was suggested there would be logic in testing staff being scheduled on a rolling 12-month basis – which could allow a vehicle’s next test to be booked as soon as its current one has been done.
This was an area flagged by the panel as being an important area for improvement. It was noted that there would still need to be arrangements for scheduled resource to be adjusted nearer to the time – but 12 months was a good aspiration.
There have been previous changes to the way schedules of testing staff have been confirmed. Careful consideration will be needed to avoid negative consequences if changes are made. Allocating resources a long way in advance may, for example, reduce flexibility for those tests that cannot be planned a long time ahead.
The current heavy vehicle testing service model is focused on DVSA providing a service to ATFs. This is dependent upon vehicle operators being able to book tests directly with the ATFs when required. In turn, the ATFs bid for the testing staff resource in quarterly booking rounds. From there the ATFs (alongside DVSA) determine whether there is capacity to test operators’ vehicles, and this requires regional relationships between the ATF and the vehicle operators.
For many operators, this is a straightforward process. They may have their own ATF, they may have their vehicle maintained by the same organisation that runs the ATF or their maintenance provider may have a contractual relationship with an ATF.
But, for some operators (particularly those who are not professional hauliers or are less professional in the planning of their operation) who do not have fixed relationships with ATFs, bookings (particularly if needed at short notice for prohibition clearances, for example) can be less straightforward.
The system works well for ensuring that testing delivery is efficient, in that the DVSA staff are being utilised for the full working day and that DVSA receives the resulting fees. This is illustrated in table 5, where DVSA fulfilment of bookings with ATFs is high.
But this does not guarantee that the vehicle operators receive the service that they would like or need. This point can be illustrated by the fact that a small number of operators can find it challenging to book short-notice tests locally in certain areas and they are reliant on DVSA network business managers (NBMs) finding ‘spare’ appointments that the ATF may have.
So, while there is capacity in the overall service to accommodate the testing of all vehicles, in this small number of cases, and for a variety of reasons, the service was not readily accessible to all operators.
Table 5: DVSA fulfilment of test slots confirmed with ATFs
Financial year | Percentage of ATF reservations met by DVSA |
---|---|
2015 to 2016 | 99.9 |
2016 to 2017 | 99.6 |
2017 to 2018 | 98.6 |
2018 to 2019 | 99.9 |
2019 to February 2020 | 99.9 |
Note: testing was suspended in March 2020 due to COVID-19.
One area for consideration relating to booking availability is whether DVSA has the right level of involvement in the bookings process. As DVSA is responsible for delivering the testing service to vehicle operators but does not control specifically when or where testing is delivered, tensions can arise and, in some cases, this can lead to operators having a less than satisfactory experience with booking tests – for example if an ATF does not wish to deal with a particular customer.
If DVSA were to have a greater hand in bookings, some of these issues could be avoided, and richer data on the situation could be gathered. But there would be downsides to this if it were taken too far – it would be hard to maintain free and open competition between ATFs if DVSA controlled the work going to them.
The general conclusion from the panel was that transparency was important for operators in finding appointments – and DVSA should have a role in facilitating that, but it should not ‘take over’ the bookings process.
Additionally, the legislation underlying vehicle testing has not always kept pace with changes in the testing system, which has led to limited operational problems. In particular, the Goods Vehicles (Plating and Testing) Regulations 1988 are drafted on the assumption that DVSA will directly correspond with operators before a test and will set out requirements for test in that correspondence (see, for example, Regulation 8 (2)(k) relating to brake testing). Increasing DVSA involvement in the booking process could assist in resolving such issues.
Staff allocation to ATFs and utilisation of staff
Under the current system, DVSA’s customer service measures are focused on delivery of service to ATFs, with the assumption being that ATFs are taking on the responsibility for delivering a quality service to vehicle operators.
The primary measure used is the percentage of scheduled test slots that DVSA has committed to with ATFs. DVSA reports on tests cancelled by itself or by ATFs. So, the current measures only consider delivery against the resource DVSA has committed, rather than the resource requested by ATFs, which varies across regions and times of year but, in broad terms, is about 15% to 20% more than that provided (or ‘needed’ to test all the vehicles).
Resource allocations, however, are necessarily driven by efficiency, to ensure that the testing service is affordable within the existing fees structure. This highlights a tension between the aim for DVSA to deliver the service within the current fee structure and levels, and the desire for greater flexibility in how tester resource can be deployed by ATFs.
Reconciling this tension would need a careful balance. There has been feedback from across industry in this review that there is a general willingness among customers to pay more for more choice and flexibility. But this would need to be tested further, including with specific proposals.
The DVSA service recovery plans, which are discussed further in Service stability and improvement plans, have made the most efficient use of testers a priority. Testing staff are utilised at rates of about 93% by ATFs (that is, ATFs use 93% of testing time provided by DVSA). This figure is important to ensure that the current capacity of DVSA testing staff, who are financed via fees, can cover the demand.
To achieve this average rate there are significant disincentives for ATFs if they do not fully utilise testers’ time – with DVSA favouring providing testers to ATFs that are more efficient.
During the course of the review, Logistics UK (a trade body representing logistics businesses) provided officials with data, which sought to illustrate some of the difficulties their members have had when making bookings.
It is important to note that this was a small group of large companies, so their experiences are unlikely to be representative of the majority of operators. But they provide some interesting first-hand experiences.
Operators who had to take a vehicle out of service due to a delay getting a test to clear a prohibition reported having to pay more than £500 to hire an equivalent, or otherwise lose business. When looking to take vehicles for re-tests, some reported that they were able to conduct these at their usual ATF, while others were not, with a maximum reported journey of 55 miles and an average of 28 miles. One operator also reported having to make 384 trips to ATFs other than their usual one, to find open testing slots.
While a fair amount of administration and having to shift some vehicles around is likely to be inevitable for large fleets, there was evident frustration among this group of operators about the difficulties experienced when getting tests. Their experiences generally support the view put forward by the review panel, that a higher number of testers would be beneficial to industry due to the added flexibility this would provide.
During discussions with the stakeholder panel, it was suggested that options should be explored for increasing fees to provide enough testing capacity to meet short-term needs. The panel felt this could be acceptable to operators, as it would represent a relatively minor cost across a year when compared to the overall costs of running a fleet.
It was noted in the review that providing more tester resource would, at face value, result in some inefficiencies – and hence a need for fee increases to cover that. For example, if DVSA were to provide all the resource that ATFs currently seek, this would mean about a 15% uplift in numbers of testers (about 65 extra full-time-equivalent VSAs).
If vehicle volumes remain static (which, broadly speaking, they are expected to), then overall efficiency would drop from about 93% to about 80%. But it may be possible for ATFs to grow work to fill that gap with voluntary tests, so that efficiency could be better (and fee increases smaller). This would need to be understood better so that informed decisions could be made.
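The rough arithmetic behind these figures can be set out as follows (a sketch only, assuming flat test volumes; it is not DVSA’s own resourcing model):

```python
# Illustrative arithmetic for the uplift and utilisation figures quoted above.
# Assumptions: tester numbers rise in line with the resource ATFs currently
# request (about 15%) and the volume of vehicles tested stays flat, so
# utilisation falls roughly in proportion.

current_vsas = 436          # current full-time-equivalent VSAs
current_utilisation = 0.93  # proportion of provided tester time used by ATFs
uplift = 0.15               # assumed uplift in tester numbers

extra_vsas = current_vsas * uplift
new_utilisation = current_utilisation / (1 + uplift)

print(f"extra VSAs needed: about {extra_vsas:.0f}")              # about 65
print(f"utilisation after uplift: about {new_utilisation:.0%}")  # about 81%, close to the 'about 80%' quoted
```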
It was also noted that there may be other ways of building capacity in the system rather than recruiting more staff – which has a significant lead time and presents its own capacity issues. Most notably, attention is drawn to the confidence that there is in earned recognition operators’ quality systems – and the lesser importance that the annual test can be argued to have in their context, for ensuring the effectiveness of those systems for vehicle maintenance.
While it is out of the scope of this review to explore this point fully, it was noted that service improvement work may benefit from not being bound by the constraint that testing must be annual for earned recognition operators. There is more work to do to understand what road safety impacts there may be and the practicalities of how any such change could operate.
Service stability and improvement plans
In 2018, following situations where DVSA was cancelling more ATF booking slots than they should have been due to staff shortages, DVSA implemented a service stability plan to improve the service they offered. This plan included the following work:
- improved working relationships and communications with ATF customers, increasing the Network Business Manager (NBM) team, and regular newsletters to share important information
- a commitment to increase testing staff numbers in areas where there was higher test demand
- development of new methods for improving staff availability around times of peak demand
- active redeployment of staff from areas with less demand to those areas requiring extra testing resource
- efforts by the NBM and deployment teams to increase the ATF utilisation figure to 93%, as mentioned above
- reduction of cancellations from the DVSA side to between 0 and 3 per month
Some stakeholder panel members expressed the view that DVSA communications efforts had not improved significantly; a particular area of concern was the delay in DVSA responding to questions raised by those running ATFs. The typical response time experienced is considered by some of the panel to be unacceptable and to create problems for ATFs.
But the panel broadly agreed that the implementation of these service stability plans to date had resulted in positive change. This would appear to be reflected in the lower numbers of cancelled test slots by DVSA (see table 5) – through greater availability of DVSA testers.
Future improvements planned by DVSA
Besides the service stability plans, DVSA has several improvements to the heavy vehicle testing system planned, to be rolled out over the period to 2025. The main element of this will be delivered through the Commercial Vehicle Services (CVS) project.
This will create a digital service, which, in part, will replace outdated IT systems, but will also allow DVSA to improve the testing system, with benefits for VSAs, ATFs and operators. The main elements of CVS are:
- Creating an app to allow the capture of test results at the time of test. This will allow greater detail to be provided to operators on test results and for this information to be available more quickly. This will also ensure that the latest test results are available for DVSA enforcement officers at roadside compliance checks.
- Creating a new system to manage test data and store technical records. This will replace a legacy IT system, provide easier access to data for VSAs and remove the need for paper forms.
- Simplifying payment processes for ATFs. This should allow ATFs to manage their own payment systems more easily and minimise their administrative time.
- Producing a modern and flexible IT system. This will allow any changes to the testing system to be more easily and quickly incorporated in the future.
During the period of the review, to address risks arising from the high volumes of vehicles due for test, DVSA has taken steps towards giving vehicle operators greater transparency of ATF capacity. Essentially, this is the creation of a digital service that gives operators visibility of basic capacity information at ATFs (as supplied by the ATFs). The service also incorporates an easy route for operators to escalate difficulties in getting test appointments to DVSA. It is too early to assess its effectiveness, but it may prove useful in resolving issues for operators struggling to find available capacity.
It was broadly agreed by the stakeholder panel that the implementation of DVSA’s planned improvements would be beneficial to the testing system as a whole.
Moratorium on new ATFs
The number of ATFs that are currently in operation has been limited by a moratorium on the establishment of new ATFs (announced in 2017). This step was taken as an interim measure to stabilise and improve the existing service delivery.
DVSA’s position on this has been frequently challenged by trade associations and industry bodies. Currently, there are 575 ATFs in operation, with around 50 potential new sites waiting for authorisation should the moratorium be lifted.
The general, but not unanimous, stakeholder panel consensus was that the moratorium on new ATFs should be lifted. It was raised by the panel that there have been some significant investments made in the development of new sites and these investments cannot be realised while the moratorium remains in place.
Some of the panel also made the case for maintaining the current number of ATFs, at least for the short term, noting that existing ATFs would ‘lose out’ should more ATFs open. The suggestion was also made that should there be available DVSA resource then priority should be given to those existing ATFs who may want to operate for more days than they were currently able to before any new ATFs were authorised to open.
The particular challenge to DVSA in this is ensuring that there are enough resources available to all ATFs if the number of ATFs were to increase. Even though the number of vehicles tested does not increase with more ATFs, the number of DVSA testers required is likely to increase (because of the inefficiencies of travelling between ATFs and splitting work across them), with a consequent increase in fees likely to be necessary.
Fees
Fees for heavy vehicle testing are set in legislation and vary depending on the vehicle or trailer being tested. Broadly speaking, there is a relationship between test time and the fee level. The fees tables for HGVs and PSVs can be found in Annex B. DVSA fees are collected from operators by ATFs. Pit fees are also charged by ATFs; these are capped in the ATF contract.
The aim of government is to deliver the testing service within this fee structure. To do this, an emphasis is placed on staff utilisation rates, ensuring that demand can be met while avoiding having too many testing staff (and the associated costs).
DVSA uses the test fees to fund heavy vehicle testing itself and to partially fund associated GB fleet compliance activities. In 2019 to 2020, vehicle testing income was £57.9 million, with expenditure of £55.8 million. Compliance income (mainly derived from the enforcement element of test fees and the Single Enforcement Budget) was £47.5 million, with expenditure of £52.6 million.
Forecast figures for 2020 to 2021 are inevitably affected by the suspension of testing due to COVID-19 and consequent loss of test fee income. Any future changes to the testing system which increase costs will have to be funded by fees, given the aim for the testing service to generally be self-sufficient (despite the impact of COVID-19). It should also be noted that current DVSA forecasts expect that ‘Vehicle Services’ (which includes vehicle testing and compliance activities) will be in deficit over the next 5 financial years.
Performance during the COVID-19 pandemic
The second consideration referred to in the TOR relates to performance during the COVID-19 pandemic and the restart (August 2020 to November 2020), including whether the operation of the system under stress reveals any more systematic issues that need to be considered to ensure service resilience.
Background
DVSA stopped mainstream heavy vehicle testing on 21 March 2020, following the announcement of the UK-wide lockdown. At the time that decision was made, DVSA was finding it harder, operationally, to meet its commitments at all ATFs due to staff sickness, unavailability of vulnerable staff and unavailability of staff due to childcare responsibilities (because schools had closed). It is also noted that at that time there was no ‘safe’ standard operating procedure for testing – and many of the controls that we now take for granted to allow for COVID-safe operating (such as face coverings and social distancing) were not fully understood.
Following several discussions with ministers, the decision was made to pause heavy vehicle test services, and to exempt vehicles that would have required a test.
While all mainstream services had paused from this point, some low-volume testing services continued, to keep vehicles on the road. This was to cater for annual tests for vehicles returning to service (that is, vehicles that were substantially out of MOT and so could not be sensibly exempted from test).
While testing was suspended, vehicles were issued with Certificates of Temporary Exemption (CTEs), which added 3 months to the test due date. This meant that when testing was resumed, there could be double the usual number of vehicles requiring a test in some months (both the vehicles that had always had tests due in that month and vehicles that had their test date shifted into that month). If testing requirements were simply reintroduced then demand would therefore have exceeded supply.
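As a simple illustration of that doubling effect (with purely illustrative monthly volumes, not actual figures):

```python
# Minimal sketch of why a 3-month CTE can roughly double the number of
# vehicles due for test in later months. MONTHLY_DUE is an assumed,
# illustrative figure, not an actual DVSA statistic.

MONTHLY_DUE = 60_000   # assumed number of vehicles normally due for test each month
CTE_SHIFT = 3          # months added to the test due date by a CTE

# Vehicles due while testing was suspended (for example, in April) are pushed
# forward by the CTE period and land on top of the vehicles already due then.
july_demand = MONTHLY_DUE + MONTHLY_DUE  # July's own vehicles plus April's shifted vehicles
print(f"July demand: {july_demand:,} vehicles, about double the usual {MONTHLY_DUE:,}")
```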
So, powers were taken in the Business and Planning Act 2020, to permit the issuing of longer CTEs, to vehicles and operators with good road safety profiles. This system was in place for tests due from September 2020 onwards. The testing backlog will gradually be reduced over time, generally prioritising testing for vehicles with the poorest road safety profiles.
Discussion of performance during COVID-19
The review aimed to understand whether there are lessons to be learned from the period of COVID-19 disruption and the recovery from it, which should be incorporated into how the testing system works in the future. Clearly, the powers now available via the Business and Planning Act will help to ease problems more effectively in the future, but there may be other issues that were brought to light by the system being under stress.
A point was raised by members of the panel about the overall resilience of the testing system. It was noted that testing was halted while the overall logistics industry continued to operate, highlighting a general lack of resilience of the system in response to this adverse event. This caused disruption to normal testing schedules and maintenance processes, due to modified test dates – and before announcing that programme of exemptions, there was considerable uncertainty within the industry.
It was also highlighted that the halting of testing and introduction of CTEs caused disruption in work planning and had a financial impact for ATFs, as bookings that had been in place may have been cancelled after the vehicle received a CTE. Members of the panel felt that improved communications from government would have helped to reduce uncertainty.
It was noted, though, that some delays are inevitable because communications need to be signed off, and DVSA was often unable to communicate operationally as quickly as customers would have liked – but the aspiration for faster responses is noted.
It is important to note that DVSA has made changes to improve its communications – both directly to operators and through trade bodies (establishing a weekly forum that updated on COVID-19 issues). A COVID-19-safe standard operating procedure and a supply of personal protective equipment (PPE) are also now in place to conduct tests. This has meant that during later local and national lockdowns, vehicle testing has been able to continue safely.
Another issue raised by the panel was the remote clearance of prohibitions, by review of documentary evidence. It is not clear how robust this process was and attention may need to be paid to monitoring these cases. Finally, a suggestion was made to have a central point of contact for operators to raise issues with the testing system. Responsibility for the testing policy and operational matters is split between DVSA and DfT, so having a single place where issues could be reported would help ensure these are properly fed into decision-making.
Relationships, roles and responsibilities
The third consideration which the review aimed to understand was the relationships between DVSA, ATFs and operators and how effective these are overall.
The ultimate end-customers of the heavy vehicle testing system are the public, who benefit from having a robust system in place, which helps to maintain road safety. At a more operational level, the present organisational structure of testing means that ATFs are customers of DVSA (as they bid for testing resource) while operators (the end-user of the system) are customers of ATFs, although some operators have their own ATF and are therefore also customers of DVSA.
The system can often be further complicated as operators use third-party maintenance providers, who may arrange testing for them.
Panel views
While overall operational relationships across DVSA, ATFs and operators are generally good, the issues described can lead to tensions. So, resolving the issues of flexibility and transparency would be expected to improve how these relationships function. It was felt that codifying, at a fairly high level, the ideal behaviour of each of the 3 participants in the system would be beneficial, as an aspiration for how the relationship should work. A suggestion of how this could be described is in Annex C, which could be used as a basis for further consideration.
Stakeholders felt that the current restrictions on tester supply had resulted in stagnation of the service provided by ATFs, and that this would continue as long as utilisation was the principal measure of DVSA performance, rather than the service provided to the end-user. In an open market, there is typically competition, but the tester supply constraints limit growth at individual ATFs, which may, for example, be growing demand through high-quality service.
As has been discussed, the limit on tester supply has been driven by the desire for DVSA to operate efficiently but has also resulted in difficulties for some operators in booking testing slots.
As mentioned in Accessibility of bookings, there was general dissatisfaction among the stakeholders about the transparency of the booking system for tests. These issues were reported to be particularly acute for specialist tests, such as ADR, and for retests after a fault had been rectified.
But there was not a desire for DVSA to centralise a booking system (as the intention of the ATF system was to allow the private sector to deliver a better service than DVSA was doing). The panel, therefore, suggested that greater transparency in the system would be desirable so that operators could more easily find available slots.
This would involve actions (and possibly the requiring of actions) by ATFs. DVSA has recently introduced a simple ‘ATF Capacity’ service, aimed at providing further transparency, but it is not yet clear how effective this will be at improving the situation.
Some stakeholders noted that significant infrastructure investments had been made by their members into facilities that could serve as ATFs if they were given authorisation. New entrants are locked out of the market, while existing businesses who wish to diversify to get extra income cannot, despite having previously expected to be able to apply for ATF status.
It was also noted that having a clear set of agreed measures for the service as a whole, that could be actively monitored and managed by DVSA, operators and ATFs would help in ensuring transparency. It would also, with regard to the complexity of the customer-supplier relationships set out above, help to bring joint focus on continually improving the overall service. This is explored in more detail under Outcomes.
Trade union views
Prospect, the trade union representing those members delivering the testing of heavy vehicles, submitted a paper to the panel which stated that: “there needs to be a greater level of clarity and openness of the booking process, how ATF decisions on priority bookings are made and a greater level of access to that information to all end users to ensure consistency of approach and allocation of resources.”
The Prospect submission also emphasised that the role of testing staff is to ensure the roadworthiness of vehicles presented for test and, as such, testing is a matter of public safety. But they also agreed that the sometimes complex nature of the contractual arrangements can play a part in difficulties obtaining and completing a test.
In their view “the review would need to consider how this contractual interplay works, who is defined as the ‘end customer’, how a successful process may be defined and by whom as there will be different views on those metrics depending on where the individual or company sits in the process”.
In response to comments raised about the utilisation of testing staff, Prospect noted that timings of vehicle tests are pre-booked by ATF owners with the vehicle operators. DVSA and the ATF have prearranged timeslots based on the requirements of the ATF operator for the testing to take place, and so it’s the responsibility of the ATF to effectively manage the use of tester time.
Prospect were confident that there are very few examples of testing staff who are simply unable to attend the ATF at the arranged time (and this would appear to be reflected in the low cancellation rates shown in the data in Accessibility of bookings).
They argue that what is “more prevalent are examples of poor lane management at the ATF or the testing lane and facilities being presented in poor order (testing machinery not calibrated correctly, for example), which leads inevitably to delays and inconvenience to customers”. In their opinion, the review should make a recommendation that sets out “parameters about lane management and state of facilities at the ATF” to make these sorts of delays less frequent.
It is noted that issues such as this could be managed through joint focus across DVSA and ATFs (see Outcomes).
One final point raised in the Prospect submission was that of the condition and quality of some of the heavy vehicles presented for testing. Vehicle owners are responsible for presenting vehicles for test in good order, but, in their view, too often this is not the case. Prospect suggested this could be considered as it can cause delays in testing throughout the day.
Again, it is noted that issues such as this could be managed through joint focus across DVSA and ATFs (see Outcomes).
Outcomes
The final consideration set out in the TOR relates to the definition and measurement of outcomes and outputs for vehicle testing, including reviewing the DVSA Business Plan Key Performance Indicator.
Current and future metrics
At present, DVSA holds a variety of metrics related to the heavy vehicle testing system. Some of these are related to vehicles, such as fail rates (around 12%) and the causes of failures. Others relate to the satisfaction of ATFs with the service they receive from DVSA and the quality of service provided, such as cancellation rates.
DVSA record operator satisfaction with their testing experiences, such as their satisfaction with the availability of appointments, although some elements, more within ATF control, are not covered in detail. Operators are also surveyed for their opinions on DVSA overall, although this includes issues such as licensing and other DVSA services.
Of operators surveyed, just under three-quarters (73%) rated the service they received from DVSA as good, and less than 1 in 10 (6%) rated it as poor. These results have stayed similar compared to the previous year’s findings. Large and medium-sized operators are significantly more likely to hold positive perceptions towards the service, and smaller operators are less likely.
Ensuring that the areas measured by DVSA provide an accurate and useful picture of the testing system is important for confirming that it is working effectively and delivering on the intention of improved road safety outcomes. This survey, and the detail beneath the headline figures, provides a rich source of information, enabling areas for improvement to be identified and the effectiveness of service changes to be assessed.
DVSA also has a Key Performance Indicator in its business plan, focused on the number of testing slots that have been confirmed with ATFs that get cancelled by DVSA. Whether this is an appropriate indicator, or if it should be replaced or added to, was discussed extensively by the stakeholder group.
For the past 3 years, DVSA has worked on reducing the number of testing slots confirmed with ATFs that it cancels. This was in response to an unacceptably high number of confirmed slots being cancelled by DVSA. On this measure, DVSA has improved significantly in recent years. In 2019 to 2020 (up to the point where testing was suspended), DVSA attended more than 99% of confirmed slots with ATFs, and ATFs now cancel more frequently than DVSA does.
While noting this is a positive step, it is clear that basing success or otherwise heavily on this metric misses out other negative aspects of experiences of some ATFs and operators. In particular, this number does not reflect the number of testing slots requested by ATFs or the experiences of operators (noting that there are parts of the user experience not under the control of DVSA).
Also, the failure to use a DVSA tester slot may in some circumstances be the correct decision for the ATF and its customer. But DVSA is not recompensed for the wasted time of the tester (and hence the loss of testing resources). An issue to consider concerning fees could be charging for testing slots not used when DVSA testers had been made available.
Several suggestions were made during the course of the review of metrics that could be included as future targets for DVSA, to more accurately reflect how the testing system is performing overall. The number of testing slots cancelled by DVSA is clearly important and relevant for ensuring that DVSA utilises testing resource efficiently, and stakeholders did not argue that it should be discounted. Other ideas for extra metrics included:
- a measure of how accessible bookings are to operators, for example, what percentage of testing slots are available when required
- how far operators are required to travel for tests
- what percentage of slots requested by ATFs are agreed with DVSA
- whether the specific days requested by ATFs are met by DVSA
- whether specific test days requested by operators are met
- how long it takes for an operator to book a retest and/or prohibition clearance (short notice bookings)
- severity of test failures
- percentage of complaints resolved within a specific timeframe
It was generally agreed that these were useful metrics and should, where possible, be monitored jointly and openly. But some of these are not fully in control of DVSA, as ATFs operate the booking systems and provide the service.
Some of these may also be difficult to measure consistently and, where they are possible, will need a period of gathering baseline data to understand what a sensible target may be. Others may be essentially impossible for DVSA to change (for example, an operator based a long distance from an ATF may always need to travel some distance to get a test).
There was some disagreement among the stakeholder group about how reasonable it was to include some or all these issues among DVSA targets. But, with suitable caveats in place, pragmatism about what is possible to measure and reasonable levels of ambition, it was generally agreed by the panel that a wider variety of target metrics for DVSA is desirable and would help in identifying areas for improvement, be that with DVSA or other parts of the system.
In particular, including metrics that record the experiences of operators was viewed as important. Even if a problem was predominantly the responsibility of ATFs, knowing about it is still useful to improve the service as a whole. For example, DVSA could make ATFs aware of the problem and work together to solve it or share best practice methods to help alleviate it going forward. More collaborative work between all parties was highlighted as an aspiration.
DVSA has suggested several possible metrics which could be considered and developed further – some of these are already measured (or measured in part) while others are new. These have been grouped into 4 areas:
- service to operators (from both DVSA and ATFs)
- service to ATFs
- effectiveness and quality
- investment
While specific targets for each new metric have not been set yet, this could provide a useful framework for future work that focuses on defining how each metric would be measured and setting a reasonable initial baseline (which may need changing once data starts to be collected).
Proposed measures related to service to operators are:
- operators’ satisfaction measured via a satisfaction score or the percentage with satisfaction above a certain score. DVSA already measures operator satisfaction via a survey. It may also need to be considered whether the satisfaction of operators, or vehicle presenters (or both) is the main issue
- test accessibility, such as the percentage of test appointment requests satisfied (for both time and location) or DVSA capacity per month provided by area versus demand. This may be challenging to measure, as ATFs would need to provide data, or a survey could be used
- short notice test accessibility, for example, the percentage of short notice appointments available within a certain number of days and distance. Again, ATFs would need to provide data to allow this, or a survey would need to be used
Proposed measures related to service to ATFs are:
- ATF satisfaction measured via a satisfaction score, or the percentage with satisfaction above a certain score. DVSA already monitor this via a survey
- utilisation – the percentage of days DVSA can test at ATFs when planned. This is already measured, but could also include ATF cancellations
- cancellations by DVSA, via the number of cancellations without a certain number of days’ notice. This is already measured
- DVSA meeting ATF demand, via a measure of how far DVSA confirm ATF requests for testers and how well these are met (for example, whether this is the exact day requested or not)
Proposed effectiveness measures are:
- the effectiveness at improving the ‘state of the fleet’ via the results of the Fleet Compliance Survey (which DVSA already carries out)
- test failure rate, including both the number of failures and their severity (although it was noted that this should not be used as a ‘target’ – it is a relevant thing to measure and discuss as part of understanding the service’s effectiveness)
Proposed quality and investment measures are:
- test error rate
- investment in improved staffing, via days spent training on test quality, improved service and new vehicle technologies
Conclusions
This section sets out conclusions about the performance and potential improvements to the vehicle testing system, informed by discussions with the panel and by consideration of the issues set out in this report. These conclusions inform the Recommendations for potential future work.
Context of the review
The review has confirmed that annual heavy vehicle testing is an important element of the regulatory regime designed to support public safety. Its performance and role must be seen in the wider context of the whole regime. For most heavy vehicles, the context of annual testing is different from that of most light vehicles, where the test is often used to pick up defects and is associated with annual servicing and maintenance.
Most heavy vehicles tested are also regulated through operator licensing rules, supervised by traffic commissioners and associated with specific DVSA enforcement.
Under the operator licensing rules, operators are required to conduct maintenance and inspections regularly (usually about every 6 weeks) to the standard of the annual test. The separate annual test then provides an external, independent check on the effectiveness of those systems.
The development of the DVSA earned recognition scheme (a scheme to encourage high compliance), the strategy of DVSA and the Traffic Commissioners to target the seriously and serially non-compliant, and targeting of compliance work towards those with poorer maintenance records are important pieces of work, alongside this review.
They deserve continued attention outside the review, including in the context of acute pressures on operators and public authorities arising from COVID-19. There would be merit in examining whether the frequency of testing should be changed for certain groups of vehicles – in particular, whether longer test intervals for vehicles belonging to operators in the earned recognition scheme would be justified from a road safety perspective and whether this could be implemented practically.
Effects of the current testing system on road safety
There is a disconnect between improving trends in annual test pass rates and static underlying roadworthiness levels, as measured by the fleet compliance survey (random non-targeted checks to assess the ‘state of the fleet’) and picked up by enforcement activities. This indicates that vehicles are heavily prepared for test from their ‘normal’ operating condition. The very existence of the test does create a positive improvement, bringing all vehicles to the minimum standard at least once a year.
Annual test pass rates for certain categories of vehicle, including those run by operators with earned recognition and relatively new vehicles are very high. This, together with wider information on the positive state of those vehicles, enabled many vehicles to have test due dates extended by 12 months in the wake of the suspension of heavy vehicle testing during the period March 2020 to July 2020.
It is not yet possible to conclude whether the lack of heavy vehicle testing for 3 months was associated with a reduction in standards or public safety. Compliance rates were affected by changes in how enforcement was done (including by DVSA, which was one of very few European enforcement bodies to maintain substantial enforcement activity through spring 2020).
Short-term behaviour during a crisis period may, in any case, not be indicative of longer-term behaviour during a more normal period. Many vehicles are now going to have test intervals of 24 months as opposed to the usual 12, which may provide an opportunity to assess behaviour further.
Current and historic performance of the testing system
The introduction and predominant use of ATFs to replace most DVSA testing stations has increased choice and flexibility for many users. It has also enabled substantial business investment in facilities and wider maintenance practices. But the need to use independent and state-employed testers spread over many sites necessarily constrains flexibility. It is important to note that genuine independence of testers is fundamental to the objectives of (and indeed legal requirements for) annual vehicle testing.
Currently, the system generally delivers the most fundamental need of customers: enabling vehicles to be tested and kept in service. Indeed, it has done so since testing restarted after the COVID-19 stoppage, confounding significant concern that it would not, through concerted action by DVSA to meet that goal.
There were significant service delivery problems concerning heavy vehicle testing around 2017. DVSA implemented measures in response to that disruption and is continuing to implement service improvements. It was broadly agreed in the stakeholder group that implementing DVSA's planned service improvements would further improve the operation of the testing system.
Efficiency and constraints on examiner resource
The heavy vehicle testing system is not in crisis. Its operation, however, results in greater cost and inconvenience for customers and testing facilities than they believe is reasonable. The efficient use of testing staff, required to balance DVSA scheme accounts under managing public money obligations and to keep fees down, means that testing facilities have to operate in ways that may be less than optimally efficient for them or for their customers.
Satisfaction survey results show that the majority of operators can get test slots readily and at convenient times. The necessarily planned nature of vehicle operation means that maintenance and annual tests must be scheduled well in advance. But, in some cases, operators have to make significant efforts to keep vehicles in service, with consequent costs.
This includes administrative staff contacting multiple testing facilities to find slots (including urgently for vehicles requiring a retest or prohibition clearance to get back into operation), vehicles being tested at times of the week when fleet availability for service should be maximised, and some vehicles travelling significant distances to be tested.
There are also areas for improvement in how the ATF model operates. The current moratorium on new ATFs constrains the market and can limit the options available to operators. The process of only scheduling resource to ATFs quarterly can also give unwelcome operational uncertainty to ATFs and their operator customers.
Moving (back) to a scheduling approach more in line with the annual cycle of testing would appear to offer significant benefits, and would tie in with operators' need to plan well ahead. As part of the ongoing review of performance measures (see recommendation 5), the effect of improvements should continue to be reviewed to understand whether further steps are needed.
The operation of the testing system inevitably results in costs to industry. Where there are difficulties in obtaining bookings at reasonable times and locations, unnecessary additional costs fall on industry.
The utilisation of DVSA resources at ATFs is a useful measure of some aspects of the current system. The current 93% utilisation is high and reflects positive work between DVSA and ATFs to achieve it. Current fee levels reflect those levels of utilisation being achieved. But industry has indicated it may value more capacity being available. For vehicle operators, this could provide flexibility around unplanned vehicle tests (such as prohibition clearances or retests) or where a test location needs to be changed at short notice. For ATFs, extra capacity would better encourage competition. Industry has indicated that it would be willing to pay for such improvements.
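To illustrate how a utilisation measure of this kind could be calculated, a minimal sketch follows. The session records, field names and the definition of utilisation used here (hours spent testing as a share of VSA hours scheduled at ATFs) are assumptions made purely for illustration, not DVSA's published methodology, and the figures are invented.

```python
# Illustrative sketch only: the records, field names and the definition of
# "utilisation" below are assumptions, not DVSA's actual methodology.

from dataclasses import dataclass


@dataclass
class TestingSession:
    """A block of VSA time scheduled at an ATF (hypothetical record)."""
    scheduled_hours: float   # VSA hours DVSA committed to the ATF
    testing_hours: float     # hours actually spent testing vehicles


def utilisation(sessions: list[TestingSession]) -> float:
    """Share of scheduled VSA time spent testing, as a percentage."""
    scheduled = sum(s.scheduled_hours for s in sessions)
    used = sum(s.testing_hours for s in sessions)
    return 100 * used / scheduled if scheduled else 0.0


# Example: 4 scheduled sessions with a small amount of idle time.
sessions = [
    TestingSession(scheduled_hours=8, testing_hours=7.5),
    TestingSession(scheduled_hours=8, testing_hours=8.0),
    TestingSession(scheduled_hours=4, testing_hours=3.5),
    TestingSession(scheduled_hours=8, testing_hours=7.0),
]
print(f"Utilisation: {utilisation(sessions):.0f}%")  # Utilisation: 93%
```

On this illustrative definition, higher utilisation leaves less slack for unplanned tests, which is the trade-off described above.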
A further consequence of tight capacity is that the interactions with and delivery of other vehicle tests are difficult to manage in some cases. Other DVSA operated testing regimes, including those related to dangerous goods vehicles, must be considered in an integrated way in the context of the development of the annual heavy vehicle testing service.
Performance measurement
There are several types of outcome sought from the vehicle testing system – for public safety and for the various people and organisations using it. These are not all currently measured and indeed some are very difficult to quantify.
The current principal measure of DVSA performance in its business plan (the rate at which confirmed testing sessions are cancelled because DVSA examiners do not attend) and its supporting measures (related to DVSA efficiency) are important in themselves. If a DVSA examiner is not available, this can be highly problematic for the testing facility and the customer.
Cancellations of tests by testing facilities have different effects and are not a comparable metric to cancellations caused by DVSA. For example, they may arise because a customer no longer requires a test or because the testing facility could not be matched to a customer at a convenient time. But this is still an important measure in understanding the health of the overall delivery model.
A wider set of measures is needed. Some (but not all) of these could be associated with targets or standards. That more balanced set of measures could include: measures of DVSA fulfilment of, and commitment to, test facility scheduling; measures of how well operator needs are met; measures of user satisfaction (testing facility and operator), potentially derived from existing surveys; and quality measures and test outcomes (such as test results, plus intelligence for wider use).
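As a basis for discussion, the sketch below shows how some such measures could be derived from simple booking records. The record structure, field names and indicator definitions are assumptions made for illustration only; they are not agreed measures or an actual DVSA data model.

```python
# Illustrative sketch of how some of the wider measures might be derived from
# simple booking records. The record structure and indicator definitions are
# assumptions for discussion, not an agreed measurement methodology.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Booking:
    confirmed: bool              # session confirmed in the schedule
    cancelled_by: Optional[str]  # "dvsa", "facility", "operator" or None
    first_choice_date_met: bool  # operator got the date they first requested


def rate(bookings: list[Booking], predicate) -> float:
    """Percentage of confirmed bookings matching a condition."""
    confirmed = [b for b in bookings if b.confirmed]
    if not confirmed:
        return 0.0
    return 100 * sum(predicate(b) for b in confirmed) / len(confirmed)


bookings = [
    Booking(True, None, True),
    Booking(True, None, False),
    Booking(True, "dvsa", False),
    Booking(True, "facility", True),
    Booking(True, None, True),
]

print(f"DVSA-caused cancellations:     {rate(bookings, lambda b: b.cancelled_by == 'dvsa'):.0f}%")
print(f"Facility-caused cancellations: {rate(bookings, lambda b: b.cancelled_by == 'facility'):.0f}%")
print(f"First-choice date met:         {rate(bookings, lambda b: b.first_choice_date_met):.0f}%")
```

In practice, as noted below, much of the underlying data is not currently held centrally, so any real measurement regime would need agreed definitions and data sharing arrangements.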
The only information held centrally on operators' access to appointments comes from surveys. This is useful, but no data is held on actual performance and survey responses may reflect perception as well as experience. More work is needed on how such data could be collected, whether there would be value in doing this more robustly and whether that would be best done through more DVSA involvement in or control of bookings, or through requirements for more data sharing.
In having a wider set of measures, it is important also to establish a regime for managing them. In the past (before the introduction of ATFs), DVSA managed a service level agreement through regular review sessions with the trade, and a similar approach could be worth exploring. It is therefore recommended that measuring a wider set of performance indicators should be accompanied by regular (for example, quarterly) joint review of those measures by DVSA, ATFs and operator representatives, so that areas for improvement can be understood and improvement actions agreed by all parties.
Relationships
The heavy vehicle testing system has 3 types of participant – DVSA, the testing facilities and the end customer (usually a road transport operator). The relationship between DVSA and the end customer is necessarily indirect. But the testing facilities and operators are all customers of DVSA.
Better information about the performance of the system, and greater transparency, should help these working relationships. But it's clear that there is some dissatisfaction and ambiguity about how these relationships work in practice. The fundamental obligations and responsibilities of the 3 types of participant in the testing system should be codified afresh. The quarterly performance review sessions proposed in the section above may be a helpful way of doing this.
Heavy vehicle testing was suspended across Great Britain very quickly as the COVID-19 situation escalated during March 2020. Meanwhile, much of the logistics and bus industries continued to operate with COVID-secure procedures, including for maintenance inspections like the annual test. There were difficulties with communications at the time.
Stakeholder feedback indicates that there have previously been difficulties in having genuine dialogue with the leadership of DVSA about the development of this service. Operational issues have also arisen from the lack of direct contact between operator and DVSA at the time of test booking.
DVSA should consider a reset of the relationship, and of the way its leadership liaises with representatives of service users in the road freight and passenger industries, as well as with providers of testing facilities, in respect of the medium-term development of the testing service.
Other aspirations
It is also important to note that there is significant stakeholder appetite for further consideration of other testing models, notably delegated testing. But the scope of the review is limited to its terms of reference (TOR), and the review does not have a remit to consider proposals to change the structural model of testing.
Potential future work
As mentioned before, the heavy vehicle testing system is generally not in crisis. The current system delivers the most fundamental need of customers: enabling vehicles to be tested and kept in service. It should be noted that the independence of testers is critical to delivering the objectives of annual vehicle testing (and a legal requirement), so any changes emerging from the above workstreams must not compromise it.
It is also noted that the current regime is funded through the test fee, so the effects on fees must be understood when proposing change. Assuming the recommendations are accepted, further work will need to be done on each of them to determine the detail and potential implementation timescales. Given the dependency on fee changes (and potentially contract changes with ATFs), some of these may take significant time to implement fully.
The areas recommended for further attention can be found in the Executive summary.
Annex A: Terms of reference
Purpose of review
- This review will focus on understanding whether current roadworthiness testing is fit for purpose and provide evidence on whether it supports or hinders the effective operation of the haulage and logistics industries. It may also identify areas for further investigation.
- The review will consider if heavy vehicle roadworthiness testing can be improved to best meet the needs of customers and suppliers, while delivering the road safety and environmental benefits of vehicle testing (for example, via ensuring the effectiveness of operators’ vehicle maintenance systems).
- The review will consider the service during business as usual and also the period from March 2020 to August 2020. It will develop recommendations to DfT ministers, to inform strategic, operational and ministerial decisions.
Main considerations
Consideration of how heavy vehicle testing has performed and the main issues during the 2 years up to February 2020. This is with a view to assessing end-to-end service delivery and identifying developments to maintain and improve road safety, as well as ensuring that customer needs are met. This includes consideration of:
- how accessible bookings have been to customers
- how effective staff allocation to ATFs has been
- how effectively ATFs have been using DVSA staff
- the implications for heavy vehicle testing of the DVSA service stability plans completed since summer 2018
- the purpose, effect, sustainability and fairness of the current policy on new ATFs
- whether fees are sufficient and used effectively. This consideration should inform any future formal fee review
- the main strengths, weaknesses, opportunities and threats of the current service model
Performance during the COVID-19 pandemic and the restart (August 2020 to November 2020), and whether the operation of the system under stress reveals any additional systematic issues that need to be considered to ensure service resilience.
The relationships, and roles and responsibilities between DVSA, ATFs and operators including:
- how clear and well understood the roles and responsibilities are
- how effective the relationships are
Review the definition and measurement of key outcomes and outputs for vehicle testing for all customers and stakeholders, including reviewing the DVSA Business Plan Key Performance Indicator. Part of this will also be a review of whether the measurement of main outcomes translates to improved road safety outcomes.
Working method
The review will be led by DfT officials with the active and full involvement of DVSA officials.
All aspects of the review will be assisted by a stakeholder panel and outputs of the review will be available in summary form to this panel as the review develops.
It was agreed that an interim report would be completed in October 2020 and a final report in December 2020.
Stakeholder panel – Terms of reference
Role
The panel will assist in ensuring that the review can achieve its purpose and undertake the main considerations. In particular, it is anticipated that it will cover the following issues:
- resilience and responsiveness in the testing system
- expected lead times for test bookings, and local variations
- understanding and reconciling customer, testing facility provider and DVSA information, along with evidence and feedback about the current testing system
- establishing a single, clear evidence base with which to assess levels of testing performance, including against the stated outcome of improving road safety
It will act as a sounding board for DfT officials conducting the review, framing key issues, interrogating data and assisting in developing recommendations to DfT ministers.
It was agreed the panel would meet on a fortnightly basis from September 2020 until the scheduled end of the review period at the end of December 2020.
The panel is additional to established relationships between the logistics and other sectors and government.
Membership
The panel will be chaired by an official from the DfT with active DVSA involvement. There is the potential for ministerial involvement if deemed appropriate. The members of the panel include those who have expressed an interest in attending, following Baroness Vere's letter of 21 July 2020. The panel will include representatives of testing customers, testing and maintenance facilities and the traffic commissioners. Membership of the panel will continue to be at the discretion of ministers. The current confirmed membership is available.
The panel may invite others to attend its meetings and contribute to its work as appropriate.
Meetings and papers
Meetings will initially take place every 2 weeks and last 90 minutes. These timings will be reviewed at the first meeting. Frequency and duration may be varied at the discretion of the panel as a whole or of DfT.
A secretariat function will be provided by DfT. The secretariat will be responsible for scheduling meetings, agreeing on the agenda and circulating papers at least 2 working days in advance of the meeting.
The secretariat will prepare a note of the meetings, which will be considered draft until agreed by the panel at the next meeting. The secretariat will also maintain a log of agreed actions and will report to members on progress on these at each meeting.
Information and papers will be requested from panel members as required.
Approval, review and assurance of terms of reference
The TOR for the review have been agreed by ministers.
Initial list of stakeholder panel members:
- British Vehicle Rental and Leasing Association (BVRLA)
- Logistics UK
- Confederation of Passenger Transport (CPT)
- Association of British Insurers (ABI)
- Society of Motor Manufacturers and Traders (SMMT)
- Road Haulage Association (RHA)
- Authorised Testing Facility Operators Association (ATFOA)
- Retail Motor Industry Federation (RMIF)
- Institute of Road Transport Engineers (IRTE)
- The Parliamentary Advisory Council for Transport Safety (PACTS)
Annex B: heavy vehicle testing fees
Table 6: HGV testing fees for motor vehicles
Test type | Vehicle type | DVSA normal working hours | DVSA out of hours | ATF normal working hours | ATF out of hours |
---|---|---|---|---|---|
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 2 axles | £112 | £150 | £91 | £129 |
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 3 axles | £144 | £182 | £113 | £151 |
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 4 plus axles | £177 | £215 | £137 | £151 |
Retests within 14 days, prohibition clearances (partial inspections) | 2 axles | £49 | £69 | £35 | £55 |
Retests within 14 days, prohibition clearances (partial inspections) | 3 axles | £69 | £89 | £49 | £69 |
Retests within 14 days, prohibition clearances (partial inspections) | 4 plus axles | £91 | £111 | £65 | £85 |
Part paid retests | Any number of axles | £13 | £13 | £13 | £13 |
Notifiable alteration | Any number of axles | £27 | £40 | £27 | £40 |
Table 7: HGV testing fees for trailers
Test type | Vehicle type | DVSA normal working hours | DVSA out of hours | ATF normal working hours | ATF out of hours |
---|---|---|---|---|---|
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 1 axle | £51 | £75 | £41 | £65 |
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 2 axles | £70 | £94 | £54 | £78 |
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 3 plus axles | £84 | £108 | £64 | £88 |
Retests within 14 days, prohibition clearances (partial inspections) | 1 axle | £25 | £38 | £18 | £31 |
Retests within 14 days, prohibition clearances (partial inspections) | 2 axles | £35 | £48 | £25 | £38 |
Retests within 14 days, prohibition clearances (partial inspections) | 3 plus axles | £46 | £59 | £33 | £46 |
Part paid retests | Any number of axles | £7 | £7 | £7 | £7 |
Notifiable alteration | Any number of axles | £27 | £40 | £27 | £40 |
Table 8: PSV testing fees
Test type | Vehicle type | DVSA normal working hours | DVSA out of hours | ATF or DP normal working hours | ATF or DP out of hours |
---|---|---|---|---|---|
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | Up to 22 passengers | £127 | £165 | £103 | £141 |
First tests, annual tests, prohibition clearances (full inspections), retests after 14 days | 23 or more passengers | £163 | £215 | £128 | £180 |
Retests within 14 days, prohibition clearances (partial inspections) | Up to 22 passengers | £54 | £73 | £39 | £58 |
Retests within 14 days, prohibition clearances (partial inspections) | 23 or more passengers | £79 | £104 | £56 | £81 |
Part paid retests | Any number of seats | £12 | £12 | £12 | £12 |
Table 9: pit fees
Pit fee type | Fee |
---|---|
Where a test is being carried out on an HGV | £55 |
Where a test is being carried out on a trailer | £40 |
Where a test is being carried out on a PSV | £70 |
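By way of a worked example of how the statutory fee and a pit fee can combine for an operator, a minimal sketch follows. The fee values are taken from Tables 6 and 9 above; whether a pit fee is actually charged, and at what level, depends on the individual ATF's contract and pricing, so the calculation is illustrative only.

```python
# Minimal worked example combining the statutory test fee (Table 6) with the
# capped pit fee (Table 9). Whether a pit fee is charged, and at what level,
# depends on the individual ATF's contract and pricing.

# Statutory ATF fees (normal working hours) for HGV first/annual tests, Table 6.
HGV_ATF_FEE = {2: 91, 3: 113}   # £ by number of axles (subset shown)
PIT_FEE_HGV = 55                # £ cap for an HGV test, Table 9


def total_cost(axles: int, pit_fee_charged: bool = True) -> int:
    """Total cost to the operator of one HGV annual test at an ATF (£)."""
    return HGV_ATF_FEE[axles] + (PIT_FEE_HGV if pit_fee_charged else 0)


# A 3-axle HGV annual test in normal working hours where the full pit fee applies:
print(total_cost(3))   # 113 + 55 = 168
```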
Annex C: the operator, DVSA, ATF relationship
Ideal behaviour for each member of the system
During the review, it was agreed that there would be value in codifying the ideal behaviour of each member of the system. The description below is not intended to be a final version but may be used as a basis for further discussion and refinement.
The DVSA-ATF contract already stipulates several specific requirements for both DVSA and ATFs to adhere to. The intention is not to replicate that contract but to describe ideal behaviour at a more general level.
Operators
- maintain vehicles in line with the Guide to Maintaining Roadworthiness
- have in place a planned programme for tests, such that volume testing is booked well in advance
- work with ATFs to help anticipate peaks in demand
- ensure that vehicles are fit for test
- attend booked tests and cancel tests that will not be attended with as much notice as possible
- present vehicles in safe and testable condition
ATFs
- be open to all (generally)
- guide operators on how to present and provide them with other information to aid efficiency
- put in place procedures and ways of working to aid efficiency and safety
- provide information to the ATF capacity service (and use other means to make available capacity readily transparent)
- if possible, respond to DVSA requests to ‘squeeze in’ customers who are struggling to find a test elsewhere
- request the tester resource needed to meet demand (recognising that some ATFs are keen to grow and will therefore want to request more slots)
- anticipate peaks in demand by working with operators
DVSA
- test to published test standards
- provide testers in line with an agreed scheduling process
- be flexible in working with ATFs and operators to test what is presented
- be fair in the distribution of tester resource
- aim for consistency in allocation year-on-year
- put in place planned or published improvements
- help out operators who are struggling to find a test slot
- respond helpfully to queries about testing standards and other issues requiring an expert view