Facebook flagged as banned content the old Hungarian tongue-twister "Az ibafai papnak fapipája" (in English, 'The priest of Ibafa has a wooden pipe'). The algorithm probably did so because it mistook the tongue-twister for tobacco advertising. The band performing the song in question wanted to file a complaint with the tech giant, but they found nobody to turn to.
According to hirado.hu, a famous Hungarian rock band, the Belvárosi Betyárok (Downtown Betyars), remade the famous Hungarian tongue-twister. They wanted to promote it on Facebook, but the algorithm banned it. According to the social media site's notification, the text of the tongue-twister was dangerous.
Frontman László Váray told the Hungarian public media that Facebook banned their content either because of the smoking priest or because of the text of the tongue-twister.
He added that they could not discuss the issue with the editors of the social media site. Even though they tried to upload the advertisement repeatedly, the algorithm rejected it each time.
Mr Váray wondered why tech giants could censor traditional content using a universal algorithm. He added that such an algorithm does not and cannot take national and cultural differences into consideration.
Here you can watch the banned video:
This is not the first time Facebook has wrongly rated content as offensive. In 2020, a British art gallery wanted to promote a small company with a photo of two grazing cows. However, Facebook deemed the image sexually offensive and banned it.
The tech giant even blocked the company's profile.
Facebook once even rated a skyscraper in Hong Kong sexually offensive. Another time, it flagged a disco sign for promoting alcohol consumption. On yet another occasion, it banned a photo of fireworks, claiming it advertised weapons. Meanwhile, experts say the algorithm sometimes lets dangerous and hateful content through.
According to the Wall Street Journal, Facebook previously declined to ban an Indian hate group because it wanted to keep its employees safe.
Even though the tech giant acknowledged the movement was violent and dangerous, it did not remove from the news feed a video in which group members beat a priest and destroyed a Pentecostal church.
The algorithm's questionable functioning also played a significant role during the so-called Arab Spring of 2010–2011. The protests, uprisings, and clashes claimed more than 200,000 lives, while Facebook provided a platform to organise violence and spread fake news – origo.hu wrote.
Source: hirado.hu, Wall Street Journal