Twitter to look into image-cropping function after users complain of racial bias


Social media giant Twitter said on Monday it is investigating its picture-cropping algorithm after users complained it favoured White faces over Black ones.

The issue came up when several users found that posts containing both a Black person’s face and a White person’s face would more often show the White person’s face in the photo preview.

A San Francisco-based programmer found Twitter’s system would crop out images of former President Barack Obama when posted alongside Republican Senate Majority Leader Mitch McConnell.

“Twitter is just one example of racism manifesting in machine learning algorithms,” the programmer, Tony Arcieri, wrote on Twitter.

Twitter is one of the world’s most popular social networks, with nearly 200 million daily users.

Other users shared similar experiments online that they said showed Twitter’s cropping system favouring White people.

Twitter said in a statement: “Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing.”

However, it said it would look further into the issue.

“It’s clear from these examples that we’ve got more analysis to do. We’ll continue to share what we learn, what actions we take, and will open source our analysis so others can review and replicate,” Twitter said in its statement.

In a 2018 blog post, Twitter had said the cropping system was based on a “neural network” that used artificial intelligence to predict what part of a photo would be interesting to a user and crop out the rest.
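In outline, saliency cropping works by scoring every region of an image and then cutting a fixed-size window around the highest-scoring point. The sketch below illustrates that general pipeline using a crude gradient-magnitude heuristic in place of Twitter’s trained neural network; the function name, crop dimensions and file name are illustrative assumptions, not Twitter’s actual code.

```python
import numpy as np
from PIL import Image


def crop_around_salient_point(img: Image.Image, crop_w: int, crop_h: int) -> Image.Image:
    """Crop a crop_w x crop_h window centred on the most 'salient' pixel.

    Saliency here is a crude proxy: local contrast (gradient magnitude)
    on the grayscale image. Twitter's production system used a trained
    neural network instead of this heuristic.
    """
    gray = np.asarray(img.convert("L"), dtype=np.float32)

    # Gradient magnitude as a stand-in saliency map: regions with lots of
    # edge energy (faces, text, high-contrast objects) score highest.
    gy, gx = np.gradient(gray)
    saliency = np.hypot(gx, gy)

    # Block-average the map so the argmax lands on a salient *region*
    # rather than a single noisy pixel.
    k = 16
    h, w = saliency.shape
    pooled = saliency[: h - h % k, : w - w % k].reshape(h // k, k, w // k, k).mean(axis=(1, 3))
    by, bx = np.unravel_index(np.argmax(pooled), pooled.shape)
    cy, cx = by * k + k // 2, bx * k + k // 2

    # Clamp the crop window to the image bounds.
    left = int(np.clip(cx - crop_w // 2, 0, max(w - crop_w, 0)))
    top = int(np.clip(cy - crop_h // 2, 0, max(h - crop_h, 0)))
    return img.crop((left, top, left + crop_w, top + crop_h))


# Usage with a hypothetical file and preview size:
# preview = crop_around_salient_point(Image.open("photo.jpg"), 600, 335)
```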

A representative of Twitter also pointed to an experiment by a Carnegie Mellon University scientist who analysed 92 images and found the algorithm favoured Black faces 52 times.

But Meredith Whittaker, co-founder of the AI Now Institute that studies the social implications of artificial intelligence, said she was not satisfied with Twitter’s response.

“Systems like Twitter’s image preview are everywhere, implemented in the name of standardization and convenience,” she told the Thomson Reuters Foundation.

“This is another in a long and weary litany of examples that show automated systems encoding racism, misogyny and histories of discrimination.” (Source: Thomson Reuters Foundation)
