Such biometric interfaces are only the outer pieces of the puzzle: Behind the scenes, passenger biometric data is passed through algorithms and existing databases to decide who’s in, who’s out and who gets a human once-over. However, the “garbage in, garbage out” rule of computer programming still applies: If a software team isn’t careful, bias can be written into the algorithm.
Data can run amok in all sorts of ways, but the stakes around biometric information are among the highest. As biometric data collection and CCTV security cameras become ever more common in modern airports, experts wonder whether these new technologies will open the door to automated discrimination. Speaking about the security overhaul at Australian airports, University of Wollongong biometrics expert Katina Michael says the technology could threaten individual privacy and raise ethical quandaries that have not been properly disclosed to the public. “We are steam-training right through all of these technological transitions and we’re not really thinking about the ramifications,” she told the Guardian. “I see the perceived benefit, but what I do know is that there will be real costs, human costs, not only through the loss of staff through automation, but also through discrimination of people who may appear different.”