Democracy c/o Facebook. Part 2.


December 9, 2018

The fact that a youth in Macedonia can write a spurious article and grab attention on a different scale from a publisher like the New York Times shows the nature, and the power, of the Facebook model. Soon enough, however, Facebook became a global platform not just of misinformation, but also of hate and polarisation. The Facebook algorithm figured out that polarised opinions, fear, hatred and anger are some of the most “engaging” human emotions. Several countries, such as the Philippines, Myanmar, Kenya and Sri Lanka, have been going through violent social strife that many have attributed to hate propaganda spread through social media platforms like WhatsApp and Facebook. But international uproar against Facebook reached a new high following a March 2018 story by The Guardian, titled “50 million Facebook profiles harvested for Cambridge Analytica in major data breach”. This is when the waves of this global controversy regarding Facebook hit India. Siddhartha Dasgupta writes. This is Part 2 of the series. You can read Part 1 here.

 

 

In 2016, Facebook attracted worldwide attention during the American elections. While its CEO made repeated liberal democratic public spiels, his platform seemed to play a key role in the campaign of Republican Party candidate Donald Trump. According to Trump’s Digital Media Director Brad Parscale, the Trump campaign paid Facebook a hundred million dollars to run its campaign. He claimed that as part of the deal, in addition to running campaign ads, Facebook also taught the Trump campaign team how to use such ads to harness the personal information of users and reach ever more precise target audiences with specific kinds of propaganda. The fancy term for this is “micro-targeting”, and with the enormous amounts of data that Facebook collects every moment from billions of users, such micro-targeting can be very precise. Ironically, this is a self-feeding loop: the more such targeted audiences are fed “engaging content” that they like, the more time they spend on Facebook, and the more they are exposed to such propaganda and advertisement. This, together with the absence of any editorial control or fact-checking of the “information” being shared, means nets of lies can be woven through Facebook at an exponential pace and scale.
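To make the shape of that self-feeding loop concrete, here is a deliberately simplified, hypothetical sketch in Python of a feed that ranks posts purely by predicted engagement and then folds every click back into the user’s interest profile. All of the post data, topic names and numbers below are illustrative assumptions made up for this sketch; Facebook’s actual ranking system is proprietary and vastly more complex.

```python
# Hypothetical sketch of an engagement-driven feed and its feedback loop.
# Nothing here is Facebook's real system; it only illustrates the dynamic
# described above: whatever already engages a user gets shown more, and
# each further click sharpens the targeting.

from collections import defaultdict

# Each post is tagged with the "topics" it plays to (illustrative values).
posts = [
    {"id": 1, "topics": {"outrage": 0.9, "politics": 0.8}},
    {"id": 2, "topics": {"local_news": 0.7}},
    {"id": 3, "topics": {"outrage": 0.6, "rumour": 0.9}},
]

def rank_feed(profile, posts):
    """Order posts by predicted engagement: overlap between post topics and profile."""
    def predicted_engagement(post):
        return sum(weight * profile[topic] for topic, weight in post["topics"].items())
    return sorted(posts, key=predicted_engagement, reverse=True)

def record_engagement(profile, post, clicked):
    """Feed the user's reaction back into the profile, sharpening future targeting."""
    if clicked:
        for topic, weight in post["topics"].items():
            profile[topic] += weight  # engaging content gets reinforced
    # topics that never get clicked simply stop surfacing; nothing pushes back

profile = defaultdict(float, {"outrage": 0.5, "local_news": 0.5})

# Simulate a few sessions: the user clicks whatever sits at the top of the feed.
for session in range(3):
    feed = rank_feed(profile, posts)
    top = feed[0]
    record_engagement(profile, top, clicked=True)
    print(f"session {session}: top post {top['id']}, profile {dict(profile)}")
```

Run it and the same post dominates every session while its topics swamp the profile: each pass through the loop makes the already-dominant content even more dominant, which is the mechanism the paragraph above describes.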

 

By this time, Facebook had established its dominance over the news business. According to some reports, by 2016, 62% of American adults were getting their news through social media sites like Twitter and Facebook. All major and minor news outlets were relying primarily on Facebook to distribute news; many of them, in fact, started publishing news directly on Facebook. Facebook had effectively become the biggest news distributor in the world.

 

 

However, unlike a typical news distributor, Facebook categorically denies any editorial responsibility, using the argument of free speech and the claim that it is, after all, just a tech company. Incidentally, Twitter CEO Jack Dorsey recently said that “the free-speech party was never a mission of [my] company. It was never a descriptor of the company that we gave ourselves. It was a joke.” The descriptor of a “tech company” merely means that “platforms” such as Facebook and Twitter do not share the editorial responsibilities that media outlets or publishers do. Unlike a newspaper, where the editor decides which news content deserves greater attention and which does not, Facebook cites “free speech” to steer clear of any such responsibilities of a news or media outlet. Ironically, in a recent lawsuit filed against Facebook by an independent app developer who was denied access to users’ personal data, Facebook defended itself in court by claiming that its decisions about “what not to publish” should be protected because it is a “publisher”. In the words of comedian and political satirist Hasan Minhaj, social media is “a platform in the streets and a publisher in the sheets”.

 

Meanwhile, in the absence of legal clarity between the dual characters of a “platform” and a “publisher”, the only parameters guiding the Facebook algorithm’s decisions about content continue to be popularity and money. In terms of the legitimacy of news, this also means that any insincere news outlet can now put up its own Facebook page and post content that looks as legitimate as anything else on Facebook. Reporter Craig Silverman was one of the first to draw attention to the Facebook newsfeed becoming a harbour of what he called “fake news” – a term that has caught on since. One of the key places to which Silverman traced the source of this misinformation campaign was Macedonia. In fact, the vast majority of these websites – more than a hundred – were traced to one particular small town called Veles. “I started the site for a easy way to make money. In Macedonia the economy is very weak and teenagers are not allowed to work, so we need to find creative ways to make some money. I’m a musician but I can’t afford music gear. Here in Macedonia the revenue from a small site is enough to afford many things,” reportedly said a 17-year-old in Veles who ran one such site. For each click on their content, a Macedonian youth would earn a fraction of a penny in advertising revenue. They were not interested in US politics, or in the outcome of the elections per se. “Several teens and young men who run these sites told BuzzFeed News that they learned the best way to generate traffic is to get their stories to spread on Facebook — and the best way to generate shares on Facebook is to publish sensationalist and often false content that caters to Trump supporters,” reported Silverman. Some in Veles were reported to have made hundreds of thousands of euros through the US election Facebook propaganda. Over the years, several of these Macedonian sites have been taken down, but nothing has changed in the Facebook model that would structurally prevent such possibilities.

“News” generated from Veles included headlines such as “Pope Endorses Trump”, “Obama was born in Kenya” and “Donald Trump Assassination Plot” – each of which went viral, crossing a million likes, shares and comments. This was around the time the New York Times published a serious scoop on how Trump may have avoided paying income taxes for years – but that story reached nowhere close to such numbers. The fact that a youth in Macedonia can write an article and grab attention on a different scale from a publisher like the New York Times shows the nature, and the current power, of the Facebook model.

 

 

But very soon Facebook became a global platform not just of misinformation, but also of polarisation. The Facebook algorithm figured out that polarised opinions, fear, hatred and anger happen to be some of the most “engaging” emotions everywhere. While this has been true of media in general, what was new was the scale. From the mid-90s, when the internet mainly meant email, chat rooms and a library of sorts, we have since seen an explosion in the reach of the internet, its power and the sheer volume of information it has generated and archived. According to one news report, 400 hours of video are uploaded to YouTube every minute – roughly 576,000 hours a day, or a century’s worth of video content every day and a half. No country in the world has internet laws or technical infrastructure evolved enough to monitor and control such volumes of content and their social impacts. Ten days before the US elections, Roger McNamee, one of Facebook’s investors and someone personally close to Zuckerberg, wrote a letter to Zuckerberg and Sandberg detailing what he saw as systemic flaws in the Facebook algorithm and its business model, which were driving worldwide polarisation to levels never seen before. In response, McNamee was reportedly told that what he was describing as systemic flaws were just isolated cases, and that each one had been or would be individually taken care of. Donald Trump went on to win the elections. Liberal sections of American media and civil society accused Facebook of tilting the elections in Trump’s favour. The Trump campaign claims that Donald Trump simply used a system built by the liberals themselves: “The Obama campaign used Facebook openly, and the Left and the media called them genius,” said Brad Parscale. Zuckerberg denied that news on Facebook could have played any significant role in the elections.

 

Soon, however, the allegations reached a different level when American lawmakers alleged Russian interference in the 2016 elections, and questions were asked about Facebook’s role in it. Investigations have linked the bulk of these fake news operations to the same Russia-based Internet Research Agency that had been part of Russia’s attack on Ukraine a few years earlier. In many cases, the agency would float Facebook groups ideologically opposed to each other, run both through different fake accounts, and use one to paint a caricature of the other – creating deep social and ideological faultlines through social media that would then play out in real politics on the streets. The details of the extent of these business deals, and the degree of Facebook’s conscious involvement in manipulating elections, is an unfolding story. The New York Times investigation this report began with is the latest development in it. Referring to the two top executives, Zuckerberg and Sandberg, the NYT report says, “… the pair ignored the warning signs and then sought to conceal them from public view.” Zuckerberg, who had till then consistently claimed that Facebook could not have had any influence on the elections, has finally admitted that around 126 million Americans might have seen Russia-sponsored political posts.

 

While the US is one of the oldest running democracies in the world, with all its safeguards and protections against subterfuge, most nations in the world are young, fragile democracies. If Facebook could arguably have played such a central role in swaying the American elections, it is important to look around at the rest of the world, which has also jumped on the social media bandwagon over the past decade. A key example is the Philippines. Award-winning journalist Maria Ressa, who founded and runs the Philippines-based news website Rappler, says she wrote detailed letters to the Facebook CEO and other top officials about how Philippine President Rodrigo Duterte was running a network of paid followers and fake accounts to spread lies about his policies and attack his critics. Duterte has been running what he calls a “drug war” – encounters of those he claims are “drug dealers”. Philippine society has been ripped apart by these relentless encounters. According to some estimates, over 12,000 people have been killed extra-judicially in the name of the “drug war”, almost all of them from the poorest sections of society. Human Rights Watch has called this a “government-sponsored butchering” of people. Duterte uses his social media network to track and attack anyone who raises questions about his drug war, the extra-judicial killings, or any of his other policies. The Philippine Constitution is relatively young, and so are the country’s democratic institutions. Already, President Duterte has stripped away basic democratic rights protected by the Constitution and is on the way to becoming the official dictator of the country. Ressa herself has been repeatedly persecuted for her constant coverage of Duterte’s crimes. Recently, an arrest warrant was issued in her name while she was out of the country to receive the prestigious Knight International Journalism Award. The latest news is that she has returned to her country amidst threats of arrest and has filed for bail.

Ressa’s letters to Facebook prompted no action, and, she alleges, no explanation for the inaction either. “Facebook walked into the Philippines and their focus was growth. They did not realise that a country like the Philippines does not have strong democratic institutions to protect themselves. In such a situation, bringing everyone onto such a platform without any pre-set rules of engagement, is only going to create more chaos,” Ressa said in an interview. More recently, under pressure from the ongoing investigations, Facebook did take down the fake accounts Ressa had identified. The company still, however, rejects the claim that the model itself is systemically flawed, maintaining that these are just isolated instances of misinformation campaigns.

 

A victim of Duterte’s “drug war”. This photo, taken on July 23, 2016, shows Jennilyn Olayres grieving beside the body of her partner Michael Siaron, who was shot by an unidentified gunman and left with a cardboard sign reading “I’m a pusher”, along a street in Manila. Image: Noel Celis, AFP

 

Another prominent example is Myanmar, where majority Buddhist communities have been organising violent attacks on the Muslim Rohingyas. Facebook is alleged to have played a central role in fanning the ethnic tensions in Myanmar that have led to a genocide and a subsequent human migration of catastrophic scale. Degrading, dehumanising, racist propaganda against the Rohingyas – rumours, fake news and misinformation – has been flooding the newsfeeds of users in Myanmar. In fact, activists have alleged that the current phase of extreme armed violence against the Rohingyas was sparked off in 2014 by a fake report that went viral on Facebook, claiming that a Rohingya man had raped a Buddhist woman. A rioting mob of 400 Buddhists gathered that night and attacked a Muslim area; one Muslim and one Buddhist were killed in the violence. Several activists brought the situation in Myanmar to the attention of the Facebook headquarters as early as 2015. They even pointed out the similarity of what was happening to the use of radio during the Rwandan genocide. But again, no steps were allegedly taken by the company to control the situation. Massive waves of violence swept over Myanmar in the years that followed, leading to the migration of hundreds of thousands of people over land to Bangladesh and India, and over the seas to Indonesia, Malaysia and Thailand. “Facebook has now turned into a beast,” said the UN Special Rapporteur for Myanmar. When asked what steps it was taking, given that the UN report described Facebook as a key instrument in the genocide in Myanmar, the company responded with “technical improvements” and “the need to set up connections with civil society organisations in such countries, to stay in tune with the real situation on the ground”. Still no comment about the model itself. In the case of Myanmar, the company claims to have taken down accounts, improved policies and hired more translators over the past couple of years. When asked about its legal culpability in all this, the response was: “our accountability ultimately lies with the users. They wouldn’t be using our platform if they did not feel safe with us.”

 

A Rohingya refugee child tries to climb onto a truck distributing aid for a local NGO. Courtesy: BBC

 

International uproar against Facebook reached a new high following a March 2018 story by The Guardian, titled “50 million Facebook profiles harvested for Cambridge Analytica in major data breach”. This is when the waves of this global controversy around Facebook hit India. Cambridge Analytica is a “data firm” that, according to some sources, got access to the personal data of 87 million people through Facebook, without their knowledge. Christopher Wylie, who worked as Director of Research for Cambridge Analytica (CA), blew the whistle on the company’s secret workings with Facebook, describing it as specialising in “rumour campaigns”. Wylie is reported to have submitted documents to The Guardian that, he claims, prove CA got access to users’ private information with the knowledge of Facebook. According to Wylie, CA acquired this data via a personality-profiling app called thisisyourdigitallife, built by Aleksandr Kogan, an academic at Cambridge University. “While the app was downloaded by just 2,70,000 Facebook users, it pulled in data from the ‘Facebook friends’ of these users, allowing CA to harvest the data of 50 million users, without their consent,” reports The Hindu. The data firm then created micro-targeted political campaigns for political outfits that were willing to pay. In the US, this was allegedly used by the Trump campaign. Here in India, according to allegations and counter-allegations, Cambridge Analytica had ties with several key political parties and their election campaign machinery, including the Congress and the BJP. “Both the parties received our presentation very enthusiastically. However, I don’t think a deal has been signed so far. As far as we know, the deal with the Congress looked more likely,” one source reportedly told The Print. Cambridge Analytica’s parent firm is the London-based Strategic Communication Laboratories (henceforth SCL UK), founded in 2005. It has worked in India through an Indian company called Strategic Communication Laboratories Private Limited (henceforth SCL India).

 


Company records show SCL India has four directors: Alexander Nix, Alexander Oakes, Amrish Tyagi, and Avneesh Rai. The first two are British citizens who were also among the co-founders of SCL UK. Amrish Tyagi is the son of Janata Dal (United) leader K.C. Tyagi; he also runs the firm Ovleno Business Intelligence, which now works with Cambridge Analytica in India. Avneesh Rai hails from Bihar and had been doing voter profiling for years for individual politicians across the country. SCL UK has reportedly been interfering in elections across the world, including in Kenya, Ghana and India. According to The Print, surveys of English-speaking people in Ghana would arrive at Rai’s office in Indirapuram, Ghaziabad. Around 2010-11, Rai, Tyagi and officials from SCL UK such as Nix and Oakes began planning a scale-up in India: building a behavioural database for 28 Lok Sabha seats that they could then sell to the highest bidder in the 2014 elections. Rai claimed in an interview to The Print that they subsequently met top politicians in both the Congress and the BJP with their proposal, though he refused to name them; both parties reportedly showed interest. In the course of time, Rai claims to have found out that SCL India was simultaneously trying to get a contract from the Congress while also working with an Indian businessman based in the USA who “wanted to see the Congress defeated”. Meanwhile, according to reports, SCL UK managed the polls for Uhuru Kenyatta in the Kenyan presidential elections of 2013 – a vote that Kenyatta won by a margin of only 0.03%. Four years later, SCL UK’s new electoral arm, Cambridge Analytica, helped him win again, in 2017. Large sections of Kenyan society have since taken to the streets in relentless protests, claiming that both the 2013 and 2017 presidential elections were rigged. At the time of writing this report, Kenya is still engulfed in political turmoil and violent conflict as people demand that Kenyatta step down. Meanwhile, the SCL official who was in charge of its Kenya work was found dead in his hotel room; many suspect a political murder connected to the kind of role such companies have been playing in electoral politics. Alexander Nix, meanwhile, had started working on the American elections through Cambridge Analytica.

 

A screenshot of the website of Ovleno Business Intelligence, the Indian partner of Cambridge Analytica. The website was taken down soon after the CA scandal broke. Courtesy: News18

 

Over the last few years, there have been regular complaints about platforms like Facebook and WhatsApp being used in India to spread false rumours about alleged “cow smugglers”, “child kidnappers”, and the like. Several people have been lynched by organised mobs after such fake news went viral on these platforms. Many reports have documented how organised this fake news distribution is, how pervasive and active these networks are, and how they primarily target communities that are socially and economically marginalised – specifically Muslims, Dalits and Adivasis. Many have also traced their connections to political and business interests. Fact-checking outlets such as AltNews have been regularly debunking such misinformation campaigns, but many fear the role social media is going to play in the country in the run-up to the General Elections of 2019.

 

To be concluded in Part 3…

 

[This report draws heavily on The Facebook Dilemma, a recent documentary by PBS/Frontline, recent investigations carried out by the New York Times, and interviews carried out by Democracy Now’s Amy Goodman and by Vox’s Strikethrough.]
