A South Korean AI chatbot, Lee Luda, has been suspended from Facebook this week after being reported for making racist remarks and discriminatory comments about members of the LGBTQ+ community.
The popular chatbot was designed by Korean startup Scatter Lab and had the persona of a 20-year-old female university student. It had attracted more than 750,000 users since its launch in December last year.
The company said in a statement that they “deeply apologise over the discriminatory remarks against minorities” and that the AI “does not reflect the thoughts of our company and we are continuing the upgrades so that such words of discrimination or hate speech do not recur”.
Scatter Lab, which had earlier claimed that Luda was a work in progress and, like humans, would take time to “properly socialise”, said the chatbot would reappear after the firm had “fixed its weaknesses”.
The service conversed with users by drawing on old chat records obtained through the company’s mobile application Science of Love.
Some users took to social media to share the racist slurs used by the AI. The chatbot can be seen calling Black people “heukhyeong,” a racist slur in South Korea, and responding with “disgusting” when asked about lesbians.
“Luda is a childlike AI who has just started talking with people. There is still a lot to learn. Luda will learn to judge what is an appropriate and better answer,” the company said in the statement.
Scatter Lab is also facing questions over violations of privacy laws.
It is, however, not the first time that an AI bot has been embroiled in controversy over discrimination and bigotry.
In 2016, Microsoft was forced to shut down its chatbot Tay within 16 hours of its launch, after the bot was manipulated into posting Islamophobic and white supremacist slurs.
In 2018, Amazon’s AI recruitment tool was also suspended after the company found that it made recommendations that were biased against women. (Source: Independent UK)