Facebook removes pages ahead of Philippines vote
Meta Platforms says it removed a network of 400 Facebook accounts and pages in the Philippines as it guards against misinformation ahead of the May 9 elections.
The people behind this activity claimed to be hacktivists who used duplicate accounts to deface news websites in the Philippines, Facebook’s parent company said in an April 6 statement. A separate network of Facebook pages maintained by the communist armed group New People’s Army was also removed for violating policies against violence, it said.
“As with any major civic event, we’ve also seen inauthentic behaviour operators from various countries become active on the margins of the upcoming Philippines elections,” Meta said.
Also removed were pages and groups that switched their focus to the elections to increase their following. Meta cited as an example a page that previously shared dance videos and was renamed “Bongbong Marcos news”. The late dictator’s son, former senator Ferdinand “Bongbong” Marcos Jr, is running for the presidency.
In January, Twitter said it had suspended more than 300 accounts reportedly promoting Marcos for violating its policies against spam. The presidential candidate has denied employing online trolls to boost his social media pages.
Meta said it took down a dozen clusters of Facebook activity focused on fake engagement. It said it identified efforts to push out content at spam-like rates to drive people to particular pages or websites, including by an unnamed social media management agency that used hundreds of accounts for political and entertainment posts.
Activity clusters from Vietnam, Thailand and the US were also removed for posing as members of communities in the Philippines “to monetise people’s attention on the election”, Meta said. These include pages, operated by spammers in Vietnam, posing as supporters of Philippine politicians to attract clicks to websites filled with ads.
Facebook also has third-party fact-checking partners in the Philippines.
“When a fact-checker rates a piece of content as false, we reduce its distribution, notify people who share the content — or who have previously shared it — that the information is false or misleading, and we add a warning label that links to the fact-checker’s article disproving the claim,” Meta said.
Bloomberg News