An investigation by two advocacy groups reveals that YouTube and Koo failed to act against reported misogynistic hate speech and other incendiary material in India and the US.
India will elect its prime minister later this year in a countrywide election, which is likely to be marred by misinformation and outright abuse that undermine election integrity.
A collaborative investigation by Global Witness and the Internet Freedom Foundation (IFF) found that YouTube and Koo continued to host content, in Hindi and English, that violates their own policies, raising concerns about their handling of divisive election-related content.
In November 2023, the two groups reported 79 videos on YouTube and 23 posts on Koo that violated the platforms' policies, using each platform's reporting tool.
"However, a full month after we reported the hate speech, the results showed an alarming lack of response from YouTube and inadequate action from Koo. Of the 79 videos reported, YouTube offered no responses beyond an acknowledgement of each report," said the report.
Koo, an Indian social media company, for its part took down six of the reported posts and left the rest on its site.
One of the YouTube videos the groups reported contained abuse resembling a stereotypical misogynistic rant.
"Realise this chick is over 30. This chick has had sex with over 100 guys, this chick is 100 percent worthless, useless. Your genes are - your genes are trash, absolute trash, your womb is trash, you add absolutely nothing to my life but you bend over on Instagram and a couple million likes from some thirsty Indian dudes so you think you still got it. You don’t. You’re old," says the male narrator in the video.
Another video the groups reported berated a female journalist as a "Hardcore Jihadan Terrorist." Yet another threatened a Muslim female journalist in a theatre, saying she was "lucky to walk out alive."
In response to Global Witness and Internet Freedom Foundation’s investigation, a Koo spokesperson said that Koo conducts "an initial screening of content using an automated process which identifies problematic content and reduces its visibility… (and) subsequently reported content is evaluated by a manual review team to determine if deletion is warranted, following several guiding principles."
Prateek Waghre, executive director at the IFF, said, "This investigation was to understand how their own reporting mechanisms would respond if you were able to pick out instances that, based on some analysis, were in violation of their policies."
He added that the goal of the research was to test how responsive the platforms would be to user reports, which can serve as a metric of how well platforms identify violating content.