An Australian woman with South Asian heritage has exposed a major flaw with Airbnb’s use of artificial intelligence after she was left unable to access the service.
Sydney woman Francesca Dias told her story to the panel on ABC’s Q+A, revealing she was left unable to make her own account due to an issue with the app’s AI.
She had to turn to her Caucasian partner, who made an account with ease.
“So recently, I found that I couldn’t activate an Airbnb account basically because the facial recognition software couldn’t match two photographs or photo ID of me and so I ended up having my white male partner make the booking for me,” she said.
The story was met with horror by the panel, with host Patricia Karvelas describing her treatment as “really shocking”.
Ms Dias’ story was not surprising to AI expert and founder of the Responsible Metaverse Alliance Catriona Wallace, who said a societal problem was behind the issue.
“Often society does not have good representation of the full population in its datasets, because that’s how biased we’ve been historically, and those historical sets are used to train the machines that are going to make decisions on the future, like what happened to Francesca,” she said.
“So it is staggering that this is still the case and it’s Airbnb. You would think a big, international, global company would get that s— sorted, but they still haven’t.”
Karvelas went on to question why large technology companies would not invest more in ensuring that all people can use their services, to which technology journalist Angharad Yeo said she had an “optimistic” view.
“So because the technology is still new, I think it’s very easy for them to get very excited that it’s being implemented at all,” she said.
“ … I think this is one of those areas that really puts a spotlight on these biases … when it’s a little bit more hidden, it’s easy to ignore, but when it’s ‘I literally cannot use this service because the AI isn’t working’, then that really makes you go, we have a real problem here.”
Bias in AI at large companies should be dealt with in a “regulatory” way, according to CSIRO chief executive Doug Hilton, who said he was not surprised “at all” by Francesca’s story.
“We have racial discrimination laws and we should be applying them forcefully, so that is in Airbnb’s interest to get it right,” he said.
“We know actually technically how to fix this, we know how to actually make the algorithm [work].”