‘Coordinated inauthentic behavior’

Last Sept. 22, Nathaniel Gleicher, Facebook's head of security policy, announced that, following a thorough investigation, Facebook had taken down two separate networks for violating its policy on “coordinated inauthentic behavior.” One was based in the Philippines, the other in Fujian, China.

The Filipino network, consisting of 57 FB accounts, 31 Pages, and 20 Instagram accounts, has been traced to the Philippine military and police. Facebook claims: “This network consisted of several clusters of connected activity that relied on fake accounts to evade enforcement, post content, comment and manage Pages. This operation appeared to have accelerated between 2019 and 2020… Although the people behind this activity attempted to conceal their identities, our investigation found links to Philippine military and Philippine police.”

The content posted on these accounts and pages was focused almost entirely on the operations of the Communist Party of the Philippines and its affiliated organizations. But it also included threatening comments against critics of the Duterte administration and political activists in general.

One post, for instance, lists the schools that supposedly serve as the “recruitment basin” of the CPP-NPA in Central Luzon: “University of the Philippines in Clark, Bulacan State University, and the Holy Angel Academy (sic) in Pampanga.”

Facebook is telling the military and the police that it will not allow any entity, whether private or governmental, to employ fake accounts, spread disinformation, or mislead people. One of its policy updates last year explicitly warned against “foreign-led efforts to manipulate public debate in another country, and operations run by a government to target its own citizens. These can be particularly concerning when they combine deceptive techniques with the real-world power of a state.”

It is interesting to hear the reactions of our police and military. Of course, both denied having authorized any of their offices to engage in “coordinated inauthentic behavior.” The PNP said it would investigate. AFP spokesperson Maj. Gen. Edgard Arevalo said that Chief of Staff Gen. Gilbert Gapay had asked Facebook to restore the private accounts of pro-military “advocacy” groups, in particular the “Hands Off Our Children” (HOOC) page.

The HOOC page, according to the US-based Atlantic Council’s Digital Forensic Research Lab (DFRLab), is being managed by Army Capt. Alexandre Cabales. A 2008 PMA graduate, Cabales is the chief of the Army Social Media Monitoring Center. According to Rappler, Cabales’ FB account is “among the administrators of a private Facebook group linked to HOOC’s page.”

The HOOC presents itself as an independent organization of concerned parents whose children joined militant groups. On its face, there seems to be nothing wrong with similarly situated parents sharing information, experiences, and advice about their “lost” children. What is objectionable is when the organization allows its page to be controlled and used by the military for the latter’s own propaganda. As the DFRLab puts it: “It may be more closely linked to the Civil-Military Operations Regiment than it publicly lets on.”

The military, in effect, is being told: You may put out your own propaganda using your own Facebook page. But you cannot use fake accounts, conceal your identity, artificially boost your popularity, or threaten your own citizens without running afoul of FB’s community standards.

Having offered the world a powerful platform for communication and networking, Facebook now finds itself besieged by the growing realization that its platform is also being used as a tool for mass deception and manipulation. To deflect charges of partisanship, Facebook insists it is primarily the behavior of users, not the information communicated, that it closely monitors. It raises red flags when its investigators encounter coordinated inauthentic behavior, or CIB.

CIB encompasses a broad range of online behavior. But central to the concept is the use of fake accounts. When multiple accounts act in concert on the same issues, as though prompted by one conductor, there is reason to dig deep into their online activity, ownership, and management. Mistakes are bound to be committed because deciding what counts as authentic or inauthentic is premised on a working knowledge of a given society, its culture, politics, and history.

Facebook admits it is continually refining its tools in its effort to weed out behavior that corrupts public discourse. It seems particularly wary of entities that spend considerable money and effort to boost their following, targeting vulnerable populations to shape how they behave in the real world. Yet Facebook itself, as a business, thrives on such expenditures.

The advent of the internet—of social media platforms in particular—paved the way for the decentralization and de-hierarchization of communication. Little did we expect that the same tool could be used with more insidious effect by those who control political and economic power.

This “social dilemma,” as the Netflix documentary describes it, is one that internet ethicists are trying to sort out. But, try as they may, ethical discernment cannot be built into algorithms, and artificial intelligence cannot replace human consciousness. We still need to fight these battles in the real world.

public.lives@gmail.com