Facebook owner Meta releases human rights report for the first time


Meta, owner of social media giant Facebook, released its first annual human rights report on Thursday, after being accused of turning a blind eye to online abuses that fuelled real-world violence in places such as India and Myanmar.

Covering due diligence performed in 2020 and 2021, the report includes a summary of a controversial human rights impact assessment of India that Meta commissioned from law firm Foley Hoag.

Human rights groups including Amnesty International and Human Rights Watch, accusing Meta of stalling, demanded the release of the India assessment in full in a joint letter sent in January.

In its summary, Meta said the law firm had noted the potential for “salient human rights risks” involving Meta’s platforms, including “advocacy of hatred that incites hostility, discrimination, or violence”.

The assessment, it added, did not probe “accusations of bias in content moderation”.

Ratik Asokan, a representative from India Civil Watch International who participated in the assessment and later organised the joint letter, told Reuters the summary struck him as an attempt by Meta to “whitewash” the firm’s findings.

“It’s as clear evidence as you can get that they’re very uncomfortable with the information that’s in that report,” he said. “At least show the courage to release the executive summary so we can see what the independent law firm has said.”

Human Rights Watch researcher Deborah Brown likewise called the summary selective and said it “brings us no closer” to understanding the company’s role in the spread of hate speech in India or commitments it will make to address the issue.

Rights groups for years have raised alarms about anti-Muslim hate speech stoking tensions in India, Meta’s largest market globally by number of users.

Meta’s top public policy executive in India stepped down in 2020 following a Wall Street Journal report that she opposed applying the company’s rules to Hindu nationalist figures flagged internally for promoting violence.

In its report, Meta said it was studying the India recommendations, but did not commit to implementing them as it did with other rights assessments.

Asked about the difference, Meta Human Rights Director Miranda Sissons pointed to United Nations guidelines cautioning against risks to “affected stakeholders, personnel or to legitimate requirements of commercial confidentiality”.

“The format of the reporting can be influenced by a variety of factors, including security reasons,” Ms. Sissons said.

Ms. Sissons, who joined Meta in 2019, said she now has eight people on her team, while about 100 others work on human rights across related teams.

In addition to country-level assessments, the report outlined her team’s work on Meta’s Covid-19 response and Ray-Ban Stories smart glasses, which involved flagging possible privacy risks and effects on vulnerable groups.

Ms. Sissons said analysis of augmented and virtual reality technologies, which Meta has prioritised with its bet on the metaverse, is largely taking place this year and would be discussed in subsequent reports. (Source: The Straits Times)