
Paul Schwartfeger, 36 Commercial, writing for NLJ, looks at R (Bridges) v CC South Wales [2020] EWCA Civ 1058, in which the Court of Appeal held that South Wales Police's use of automated facial recognition (AFR) technology was unlawful.
Schwartfeger notes there are many reasons why facial recognition systems may exhibit bias, such as ‘when unrepresentative data sets are used to “train” the software to recognise faces.
‘Improper representation of ethnicities and sexes, for example, can lead to significant error rates being programmed in.’
He highlights that in-built bias also affects other AI technologies: last year, for example, regulators in New York State announced an investigation into algorithms used by Apple to determine credit limits, after a pattern emerged of women being offered less credit than men.