Response to the Big Brother Watch report
The Surveillance Camera Commissioner, Mr Tony Porter, responds to the report by Big Brother Watch into facial recognition technology.
I welcome the publication of the Big Brother Watch report (PDF, 3.5 MB) as in my view it adds value to a much needed debate on a matter of growing public interest, a public interest which demands clear legislation, transparency in governance and approach, and a coherent and effective regulatory framework from which the public can derive confidence whenever and wherever their civil liberties are at risk from the state. I shall consider the report carefully.
The effective regulation of the use of face identification technology (commonly referred to as Automated Face Recognition or AFR) by the police is a priority of the National Surveillance Camera Strategy and a matter which I have been addressing for some time now, engaging with the National Police Chiefs’ Council, the Home Office, fellow regulators and Ministers alike.
The police have to abide by the Surveillance Camera Code of Practice, which I regulate under the terms of Section 33(1) of the Protection of Freedoms Act 2012. Those familiar with the content of the code will know that it is explicit that face identification technologies used by the police in England and Wales will be regulated by it. That is not to say that I consider existing or indeed anticipated legislation to be wholly sufficient in these matters. I do not. My fellow regulators, the Biometrics Commissioner and, more recently, the Information Commissioner, have added welcome contributions to the debate.
I do think that the police are genuinely doing their best with AFR and trying to work within the current and anticipated legal and regulatory framework governing overt surveillance. That framework is far less robust than the one which governs covert surveillance, yet arguably the evolving technological capabilities of overt surveillance are the equal, in terms of intrusion, of those conducted covertly. It is inescapable that AFR capabilities can be an aid to public safety, particularly against terrorist threats in crowded or highly populated places. Andrew Parker, the Director General of the Security Service, rather eloquently set out the threat context to our society only recently. It is understandable that there is an appetite within law enforcement agencies to exploit face identification capabilities, an appetite which is doubtless borne of a duty and determination to keep us safe. This technology already exists in society for our convenience, and it is therefore arguable that the public will have something of an expectation that such technologies are used by agents of the state to keep us safe from serious threats, but only in justifiable circumstances where their use is lawful, ethical, proportionate and transparent.
In the context of safety, the public also need to be safe from unlawful, disproportionate and illegitimate state intrusion, and they must have confidence that those technologies have integrity. In my view, the challenge is arriving at a balance, and for that to happen there needs to be a clear framework of legitimacy and transparency which guides the state, holds it to account and delivers confidence and security amongst the public. I have yet to have confidence that government has a satisfactory approach to delivering a framework upon which the police and others can rely and in which the public can have confidence, but I do believe that we are on a journey to that destination, and that journey is fuelled by constructive and challenging debate.
The Commissioner is available for media interviews and can be contacted at scc@sccommissioner.gsi.gov.uk