Consultation methodology: DIT consultations on trade negotiations with the US, Australia and New Zealand, and on the UK potentially seeking accession to the CPTPP (web/online version)
Updated 17 June 2020
Responding to the consultations: overview
This report summarises the methodology used to analyse all responses to 4 public consultations on the economic, social and environmental impacts of possible free trade agreements (FTAs) between the UK and the US, Australia and New Zealand after the UK leaves the EU, as well as the UK potentially seeking accession to the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP). The public consultations were run by the Department for International Trade (DIT) for 14 weeks, between 20 July and 26 October 2018.
Those who wished to take part in a consultation could do so via one of several response channels: an online questionnaire hosted on a Citizen Space website, email, or letter by post. All responses submitted via these channels within the consultation period were processed as part of the consultation. DIT forwarded all responses received to Ipsos MORI’s public consultation team, which processed them and included them in the consultation analysis.
The consultations included a number of free-text questions about the priorities and concerns participants may have had about possible future FTAs between the UK and the US, Australia or New Zealand, or about potential UK accession to the CPTPP. These questions were exploratory in nature and allowed participants to feed back their views in their own words. Not all participants chose to answer every question: many commented on the aspects of the consultation they held views on and left other questions blank. The figures in the reports are therefore based on all participants commenting on the issues relating to each question (that is, excluding those who did not answer), so the base size (the number of people the results for a question are based on) differs from question to question.
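The per-question base sizes described above can be sketched in a few lines of code. This is an illustrative sketch only: the question labels, answers and function name are hypothetical, not drawn from the consultation data.

```python
# Illustrative sketch: a base size is the count of non-blank answers per question.
# The question names and responses below are hypothetical examples.

def base_sizes(responses):
    """Count, for each question, how many participants gave a non-blank answer."""
    counts = {}
    for response in responses:
        for question, answer in response.items():
            if answer and answer.strip():  # blank answers are excluded from the base
                counts[question] = counts.get(question, 0) + 1
    return counts

responses = [
    {"Q1_priorities": "Lower tariffs on cars", "Q2_concerns": ""},
    {"Q1_priorities": "Food standards", "Q2_concerns": "Animal welfare"},
    {"Q1_priorities": "", "Q2_concerns": "Environmental impact"},
]

print(base_sizes(responses))
```

Because blank answers are excluded, each question reports its own base size rather than the total number of participants.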
As the analysis explores the themes that emerged from what participants wrote in response to the consultation, these numbers need to be considered in that context. Not all participants expressed views about priorities or raised concerns; such responses were coded as other comments.
Approach to coding and analysis
Analysis of the consultation responses required coding of the data. Coding is the process by which each individual response is matched against a series of themes compiled by DIT and Ipsos MORI, so that its content can be summarised, classified and tabulated. Each code represents a discrete issue or viewpoint raised by one or more participants in their verbatim responses. Coding is a tried and tested method for accurately analysing and summarising what is said in responses to public consultations.
The complete coding frame is comprehensive in representing the whole range of issues and viewpoints given across all the responses. The codes were developed continually throughout the consultation period as further responses were coded, to ensure that any new viewpoints that emerged were captured and no nuances lost. Any one response may have had several different codes applied to it if a participant made more than one point, or addressed a number of different themes or viewpoints. Comments were coded in the section of the code frame they related to, rather than on a question-by-question basis. For example, if a business raised priorities about rules of origin in response to a tariffs question, those comments were moved to the relevant rules of origin priorities question.
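As a rough illustration of how one response can carry several codes, each filed under the thematic section it belongs to rather than the question it was raised at, consider the following sketch. The codes, section names and function are hypothetical examples, not DIT’s actual code frame.

```python
# Illustrative sketch: one verbatim response may receive several codes, and each
# code belongs to a thematic section of the code frame rather than to the
# question it was raised at. All codes and sections here are hypothetical.

CODE_SECTIONS = {
    "reduce_tariffs": "tariffs_priorities",
    "simplify_rules_of_origin": "rules_of_origin_priorities",
    "protect_food_standards": "standards_concerns",
}

def code_response(verbatim, codes):
    """Attach codes to a response, grouping them by code-frame section."""
    by_section = {}
    for code in codes:
        section = CODE_SECTIONS[code]
        by_section.setdefault(section, []).append(code)
    return {"verbatim": verbatim, "codes_by_section": by_section}

# A comment about rules of origin made at a tariffs question is still filed
# under the rules-of-origin section of the code frame.
coded = code_response(
    "Tariffs should fall, and rules of origin paperwork must be simpler.",
    ["reduce_tariffs", "simplify_rules_of_origin"],
)
```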
The code frames were developed for analysing response forms submitted online, emails and letters from all those who took part in the consultations. The complete code frames have been published as a technical annex alongside this consultation methodology document.
Receipt and handling of responses
The handling of consultation responses was subject to a rigorous process of checking and logging to ensure a full audit trail. All original responses received remained securely stored, catalogued and serial-numbered for future reference. All stakeholder organisation responses to open questions in the response form, and unstructured responses received via email and post, were analysed and coded into the main coded data set.
Developing an initial code frame
Coding is the process by which free-text comments, answers and responses are matched against standard codes from an initial coding frame, compiled to allow systematic statistical and tabular analysis. The codes within the coding frame represent an amalgam of the views raised by participants and are comprehensive in representing the range of opinions and themes given.
The Ipsos MORI coding team drew up an initial code frame for each open-ended free-text question using the first 50 responses. An initial set of codes was created by drawing out the common themes and points raised across all response channels. Each code thus represents a discrete view raised. The draft code frame was then discussed with the DIT Consultation team before the coding process continued. The code frame was continually updated throughout the analysis period to ensure that newly emerging themes were captured, and an updated code frame was sent to the DIT Consultation team each week during the consultation period.
Coding software
Ipsos MORI used the web-based Ascribe coding system to code all open-ended free-text responses, both those within completed response forms and free-form responses (that is, emails). Ascribe is a proven system which has been used on numerous large-scale consultation projects. Responses were uploaded into Ascribe, where the coding team worked systematically through the verbatim comments and applied a code to each relevant part of each comment.
The Ascribe software has the following key features:
- accurate monitoring of coding progress across the whole process, from scanned image to the coding of consultation responses
- an ‘organic’ coding frame that can be continually updated and refreshed, so that coding and analysis are not restricted to the initial response issues or ‘themes’, which may change as the consultation progresses
- resource management features, allowing comparison across coders and question/issue areas. This is of importance in maintaining high quality coding across the whole coding team and allows early identification of areas where additional training may be required
- a full audit trail – from verbatim response to the codes applied to that response

Coders were provided with an electronic file of responses to code within Ascribe. Their screen was divided, with the left side showing the response along with its unique identifier and the right side showing the code frame. The coder attached the relevant code or codes as appropriate and, where necessary, alerted the supervisor if they believed an additional code might be required.
If there was other information that the coder wished to add they could do so in the ‘notes’ box on the screen. If a response was difficult to decipher, the coder would get a second opinion from their supervisor or a member of the project management team. As a last resort, any comment that was illegible was coded as such and reviewed by the coding manager.
Briefing coders and quality control
A team of coders worked on the project, all of whom were fully briefed and were conversant with the Ascribe coding software. This team also worked closely with the Ipsos MORI project management team during the set-up and early stages of code frame development.
The core coding team took a supervisory role throughout and undertook the quality checking of all coding. Using a reliable core team in this way minimises coding variability and thus retains data quality. To ensure consistent and informed coding of the verbatim comments, all coders were fully briefed on the proposals and the background to the consultation prior to working on this project. The coding manager undertook full briefings and training with each coding team member. All coding was carefully monitored to ensure data consistency and to ensure that all coders were sufficiently competent to work on the project.
The coder briefing included background information and presentations covering the questions, the consultation process and the issues involved, and discussion of the initial coding frames. The briefing was carried out by Ipsos MORI’s executive team along with representatives from DIT on Friday, 5 October 2018.
All those attending the briefings were instructed to read the consultation materials in advance and go through the response form. A dummy coding exercise was then run, using carefully selected examples from this consultation to provide a cross-section of comments across the wide range of issues that might emerge.
Coders worked in close teams, with a more senior coder working alongside the more junior members, which allowed open discussion to decide how to code any open-ended free-text comment. In this way, the coding management team could quickly identify if further training was required or raise any issues with the project management team.
The Ascribe package also provided an effective project management tool: the coding manager reviewed the work of each individual coder and discussed with them any variance between the codes entered and those the coding manager expected.
To check and ensure consistency of coding, a minimum of 10% of coded responses were validated by the coding supervisor team and the executive team, who checked that the correct codes had been applied and identified issues where necessary.
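A 10% validation check of this kind can be pictured as drawing a reproducible random sample of coded responses for supervisor review. The sketch below is an assumption about how such a sample might be selected; the function, the rounding rule and the fixed seed are illustrative, not a description of Ipsos MORI’s actual procedure.

```python
# Illustrative sketch: draw a reproducible random sample of at least 10% of
# coded responses for quality checking. The seed and rounding-up rule are
# assumptions made for this example.
import math
import random

def validation_sample(coded_ids, fraction=0.10, seed=2018):
    """Select at least `fraction` of the coded responses for review."""
    n = max(1, math.ceil(len(coded_ids) * fraction))
    rng = random.Random(seed)  # fixed seed so the check can be re-run identically
    return sorted(rng.sample(coded_ids, n))

sample = validation_sample(list(range(1, 501)))  # 500 coded responses
```

Rounding up and enforcing a minimum of one ensures the check never samples below the stated threshold, even for very small batches.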
Code frame development
An important feature of the Ascribe system is the ability to extend the code frame ‘organically’, directly from actual verbatim responses, throughout the coding period.
The coding teams raised any new codes during the coding process when it was felt that new issues were being registered. To ensure that no detail was lost, coders were briefed to raise codes that reflected the exact sentiment of a response, and these were then collapsed into a smaller number of key themes at the analysis stage. During the initial stages of the coding process, regular weekly meetings were held between the coding team and Ipsos MORI executive team to ensure that a consistent approach was taken to raising new codes and that all extra codes were appropriate and correctly assigned. In particular, the coding frame sought to capture precise nuances of participants’ comments in such a way as to be comprehensive.
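Collapsing granular codes into a smaller set of key themes, as described above, amounts to applying a many-to-one mapping at the analysis stage. The sketch below illustrates the idea; the code names, theme names and mapping are hypothetical. Codes with no mapped theme fall into an other-comments bucket, mirroring how responses without priorities or concerns were coded as other comments.

```python
# Illustrative sketch: granular codes raised during coding are collapsed into a
# smaller set of key themes at the analysis stage. The mapping is hypothetical.

THEME_MAP = {
    "worried_chlorinated_chicken": "food_standards",
    "worried_hormone_beef": "food_standards",
    "lower_car_tariffs": "tariff_reduction",
}

def collapse_codes(codes):
    """Map each granular code to its key theme, keeping first-seen order."""
    themes = []
    for code in codes:
        theme = THEME_MAP.get(code, "other_comments")
        if theme not in themes:
            themes.append(theme)
    return themes
```

Raising exact-sentiment codes first and collapsing later means no nuance is lost during coding, while the published analysis can still report a manageable number of themes.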
Data processing
Once coding was complete and all data streams had been combined, a series of checks was undertaken to ensure that the data set was comprehensive and complete. The initial check matched the log files of serial numbers against the resultant data files to ensure that no responses were missing.
A further check was run to ensure records existed for all logged serial numbers. During this process it was also possible to identify duplicate free-format responses (for example, where 2 cases appeared for the same serial number); any duplicates found were removed from the dataset. In addition, a few organisations completed the closed questions in the online form and also sent an emailed response with verbatim comments. In such cases, the responses were merged and each organisation was counted once, not twice.
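The completeness and de-duplication checks described above can be pictured as a reconciliation between the log of serial numbers and the combined data set. The record layout, field names and merge rule below are illustrative assumptions, not the actual processing scripts.

```python
# Illustrative sketch: every logged serial number should appear in the data set
# exactly once, and duplicate records for the same serial number are merged
# (for example an online form plus an emailed follow-up from the same
# organisation). Field names here are hypothetical.

def reconcile(logged_serials, records):
    """Return (missing serial numbers, merged records keyed by serial number)."""
    merged = {}
    for record in records:
        serial = record["serial"]
        if serial in merged:
            # Merge duplicates so each organisation is counted once, not twice.
            for key, value in record.items():
                merged[serial].setdefault(key, value)
        else:
            merged[serial] = dict(record)
    missing = sorted(set(logged_serials) - set(merged))
    return missing, merged

records = [
    {"serial": 101, "closed_answers": "online form"},
    {"serial": 101, "email_text": "verbatim comments sent by email"},
    {"serial": 102, "closed_answers": "online form"},
]
missing, merged = reconcile([101, 102, 103], records)
```

Running the check in both directions — data against log, then log against data — surfaces both missing responses and duplicates in a single pass over the records.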
Final code frames
The final code frames for the consultations are available in a technical annex published as a separate document.