The murky world of biometric engines
Individuals are unknowingly at risk of becoming victims of deep fake scams or online stalking through seemingly innocuous activities, according to biometric security firm Daltrey. Participating in public events, or merely being a spectator, leaves individuals open to their image being uploaded into a biometric engine, a technology that is vastly more complex and powerful than a simple photo library.
Daltrey's CEO and co-founder Blair Crawford uses the popular annual City2Surf road running event held in Sydney each August as an example. According to Crawford, when the event's 60,000-odd participants register and attend, photos taken of them are uploaded into the German-owned Sportograf facial recognition system.
“Each of the faces in the photos is subjected to a facial recognition system that maps their faces. This is the start of the issue, as people may not be aware that their images are being placed into such a system that is accessible by so many other people with so little protection. In addition, spectators’ images may unknowingly be captured in the background, uploaded and searchable, without the opportunity for them to consent,” Crawford said.
Anyone can then find a face, or one that strongly resembles it, by uploading a selfie or any other photo of the person they are looking for. Once a match is found, the uploaded query image is erased (as per Sportograf’s privacy policy), but the event images held in the biometric registration system are retained.
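For readers unfamiliar with how such engines work, the matching step typically reduces each detected face to a numeric embedding and compares embeddings by similarity. The sketch below is illustrative only and does not reflect Sportograf's actual implementation; the random 128-dimensional vectors stand in for embeddings that a real system would derive from a face-recognition model applied to the event photos.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in gallery: in a real engine these vectors would come from a
# face-recognition model run over every face found in the event photos.
gallery_embeddings = rng.normal(size=(60_000, 128))
gallery_photo_ids = [f"photo_{i:06d}" for i in range(60_000)]

def find_matches(query_embedding, gallery, threshold=0.6):
    """Return indices of gallery faces whose cosine similarity to the
    query exceeds the threshold (higher means more similar)."""
    q = query_embedding / np.linalg.norm(query_embedding)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    similarities = g @ q
    return np.where(similarities >= threshold)[0], similarities

# A "selfie" query: again a random stand-in for a real face embedding.
selfie_embedding = gallery_embeddings[12_345] + rng.normal(scale=0.05, size=128)

matches, sims = find_matches(selfie_embedding, gallery_embeddings)
for idx in matches:
    print(gallery_photo_ids[idx], round(float(sims[idx]), 3))

# As described above, the query image can be discarded after the search,
# but the gallery of enrolled faces and photos remains searchable.
```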
“It’s possible for a stalker to track someone, for instance a participant or a minor who is captured in the background as a spectator, by accessing the images as they are not secured behind any sort of authentication. The images could be used to create a deep fake of the person, to confirm they were in the location of the event, and furthermore they are accessible anywhere in the world,” Crawford said.
He highlighted that spectators, unlike participants, have neither registered for the event nor agreed to City2Surf's terms and conditions.
“Participants, who have registered and agreed to the terms and conditions, are unlikely to have read the details and fully understand the extent to which they have consented. This raises the key question of how biometric technology is outpacing the community’s understanding of its application, as we have seen recently with the Bunnings example,” he said.
Crawford argues that the responsible use of biometric technology is an imperative.
“Vendors of technology that can impact the security and privacy of people need to think through all potential consequences. Biometric programs must be built on a foundation of consent, where people must opt in based on a clear understanding of the scope and the value to the person opting in.
“In terms of a national framework, there are a lot of standards that already exist to guide the use and applications of biometric technology such as ISO/IEC 24745:2022, which defines the principles of confidentiality, integrity, and privacy protection of biometric information to make the use of biometrics safer. The focus should be on the adoption of these standards to safeguard the integrity of the users’ security and privacy,” he said.
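As one concrete illustration of the confidentiality principle the standard describes, stored biometric templates can be kept encrypted at rest and decrypted only when a comparison is actually required. The snippet below is a minimal sketch of that idea, not a mechanism prescribed by ISO/IEC 24745, and it assumes the third-party Python cryptography package is available.

```python
import numpy as np
from cryptography.fernet import Fernet  # pip install cryptography

# Key management is the hard part in practice; here the key is simply
# generated in-process for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

# A stand-in biometric template (a real one would come from a face model).
template = np.random.default_rng(1).normal(size=128).astype(np.float64)

# Encrypt the serialised template before it is written to storage.
stored_blob = cipher.encrypt(template.tobytes())

# Decrypt only at match time, keeping the plaintext template short-lived.
restored = np.frombuffer(cipher.decrypt(stored_blob), dtype=np.float64)

assert np.allclose(template, restored)
```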