SOCIAL MEDIA: A Toolbox For Autocrats, Insurrectionists, Scammers And Misogynists

- Greg Waite

Mark Zuckerberg built his corporate culture on the myth that Facebook connects people. It doesn't. Facebook addicts people, wastes their time, monetises their personal information, encourages anger and feeds depression, all while providing a perfect toolset to help the world's most corrupt. The inside story of Facebook, revealed by whistleblower Frances Haugen and many others in the book "Broken Code", shows our world to be much darker and more dysfunctional than I'd realised.

The key point to understand is that Facebook, Instagram, etc take the world of information around us and reshape it by rules which prioritise what social media users see (the "algorithm"). A system which distorts the information we see will always give support to dishonest interests with the wealth to "game the rules" on a large scale. The world's most corrupt have seen this and built up or contracted in tech specialists who ensure their content dominates the posts sent to users. Social media has powered the creation of a vast ecosystem of corrupt online manipulators who dominate your "feed", an infrastructure which supports authoritarian political movements and human rights abusers.

Even if Facebook spent much more money on understanding its own impacts and mitigating the worst effects, there is no good way for a corporation to distort what information the whole population sees. When the book ends in 2022, Facebook is instead busy limiting the teams tasked with reducing manipulation, focusing on the crisis points but ignoring the steady growth of corrupt players, and tightening up internal security to prevent future whistleblowers.

Facebook's corporate home, Meta, doesn't rank in the top 50 global companies by revenue, but it is ranked seventh by market value (Wikipedia, 30/9/24). Why is Facebook so highly valued by the short-term shareholders of the world? Monopoly always translates into high profits. Why does the US Government allow Facebook to buy all its competitors to remain a monopoly? Try to imagine how much money these US social media giants extract from non-US nations. And in addition to the money made, social media manipulation is now a core tool to get elected in the US, while supporting the longstanding US preference for compliant corrupt authoritarians ruling the rest of the world. This modern virtual empire is just as effective as the old military/Central Intelligence Agency (CIA) empire, and self-funded by the corporations.

Next time you use Facebook or other social media, think beyond what you see on your screen. Facebook has multiple faces. The distorted, addictive view of the world it sends Western users is just the first. The second is its corporate push to get maximum usage in countries around the world where the Internet is new, where Facebook's distorting power is much greater and more dangerous. The third is the creation of an infrastructure of corrupt online manipulators who tune their tactics to each change of the algorithm, ensuring elite self-interest replaces honest journalism. The fourth is the rise of autocratic Governments relying on inciting social division and violence to dominate public debate.
Inside Facebook

The revelations in "Broken Code" all follow a pattern: Zuckerberg backs growth; the Growth team vetoes attempts by the Civic Integrity team to tackle Facebook's known support for political extremism, paedophiles, etc; Facebook doesn't pay much attention to the damage it does, especially outside the US, which creates a permanent infrastructure of social manipulators; US conservatives are more affected by Civic Integrity's trial counters to extremism (they're more corrupt, so manipulate social media more); Zuckerberg favours the conservatives, watering down changes; Facebook targets ever younger users to continue revenue growth.

How Duterte Got Elected, May 2016

In 2016 the Philippines had the highest concentration of Facebook users in the world. Facebook's Elections (sales) team offered their standard consulting to all major parties. Candidate Rodrigo Duterte's team made heavy use of Facebook, and he was a dangerous man, promising extrajudicial killing of drug users. Reports followed of mass fake accounts, lies on campaign Websites and coordinated threats against rivals. Duterte won, precisely because of his extremism and dishonesty. Independent media were banned from his inauguration, but he live-streamed it on Facebook. The promised killings began soon after.

How Britain Brexited, June 2016

The Brexit campaign followed the proven formula for success, heavy on anti-immigrant sentiment and outright lies. Supporters of the "Leave" camp obliterated "Remain" supporters on Facebook.

How Trump Got Elected, November 2016

Donald Trump made his name in Republican politics by using Twitter to question whether Obama was an American citizen; he insinuated rival candidate Ted Cruz's father might have helped Lee Harvey Oswald assassinate John F Kennedy. Here was an expert on lies and attack. James Barnes, Facebook's lead support for advertising sales to Trump's campaign, quit after Trump's grotesque public claim that his fame meant he could just "grab 'em by the pussy". The San Antonio campaign office temporarily suspended its ad buys and more or less shut down in preparation for defeat - but not everyone had given up. Trump continued to exploit his extremism and Facebook's aversion to limiting even blatant lies. He dominated the free advertising created when social media users shared his content. And days before the election, Bloomberg Businessweek confirmed a boast by Trump's digital team that it was running voter suppression operations on Facebook. Trump's digital lead Brad Parscale was absolutely clear: "Facebook and Twitter were the reason we won this thing". And analysis afterwards confirmed fake news had been the most viral content on Facebook during the election. Real example: "WikiLeaks CONFIRMS Hillary Sold Weapons to ISIS".

On the Russian link, it's clear a troll farm in St. Petersburg manufactured content to erode voter trust, mostly but not entirely anti-Clinton; a large-scale hack-and-leak campaign targeted Hillary Clinton; Russia built up partisan accounts and pages on Facebook (Blacktivist, Secured Borders) and bought $US100,000 worth of Facebook election ads, paying partly in roubles without anyone noticing; but Facebook looked for and found no evidence to confirm a Trump connection. These actions looked like generic anti-voting, pro-authoritarian espionage. Those allegations also tell you something about how easy it had become to manipulate Facebook. The false Russian accounts had been promoted by paying for "Page Likes", which turn out to be for sale as paid marketing.
These ads provide the initial boost to get controversial content visible, coupled with the creation of multiple copies of linked Web pages with stolen content from genuinely popular sites.

How Facebook Changed Its Algorithms, 2016

First, two relevant backstories. In 2014 Facebook implemented "Trending Topics", a regularly updated list of the hottest news stories across the site. Automated scanning created a list of surging subjects, then a small curating team selected a credible, representative news story to promote with a brief writeup. Then in 2016 a minor tech site, Gizmodo, ran a series about opaque Websites which included an unconvincing allegation titled "Former Facebook Workers: We Routinely Suppressed Conservative News". When checked, it wasn't true, but Zuckerberg personally met "conservative thought leaders" including Trump campaign aide Barry Bennett, then ordered that the company move away from anything that looked like human curation. The entire Trending Topics team was laid off.

And in 2015, Facebook introduced its first tool to combat hoaxes without doing fact-checking. Content which users disproportionately identified as false would be suppressed. Facebook's public announcement of the change noted their test finding that users reliably identified the truth, but omitted that users also reported as false stories they simply disagreed with. To stem those false positives, Facebook added a "whitelist" of trusted publishers whose stories would be assumed true. It was crude, disadvantaging obscure publishers, but it worked. Then, after the (false) accusations of bias in Trending Topics, Facebook turned it off - just months before the 2016 election. Also in 2016, the New York Times revealed Facebook, supporter of free speech, had been working on a censorship tool in its efforts to gain entry to the Chinese market.

How Facebook Changed Its Algorithms, 2017

Carlos Gomez-Uribe, a machine-learning specialist, joined Facebook as a statistician in 2017 after making big improvements to the movie-recommendation systems at Netflix. He found that a small number of hyperactive Facebook users liked, commented and reshared vastly more content than the average user. They liked edgier content, and had an outsized influence on News Feed based on their higher activity. Gomez-Uribe could not get support for a comprehensive fix, so trialled a simpler approach damping the effect of extremists. It worked, but "given how much bigger the American conservative publisher ecosystem was than the Left's, there was no question that views of content popular with Republicans would drop more". Facebook's Public Policy team didn't like it, and its head Joel Kaplan, a conservative lobbyist with eight years in the Bush Administration, insisted the decision go up to Zuckerberg. He heard out Gomez-Uribe and Kaplan for ten minutes then told them bluntly: "Do it, but cut the weighting by 80%, and don't bring me something like this again". Gomez-Uribe tried once more to test a change which increased the share of News Feed stories appealing to a wide range of users rather than narrow partisans. Designed as an experiment to gather data for evaluation, that change was again stopped by Communications and Public Policy. Gomez-Uribe quit.
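"Broken Code" describes the damping idea but not its implementation, so here is a minimal hypothetical Python sketch of what damping hyperactive users, and Zuckerberg's 80% cut, could look like. The weights, thresholds and numbers are my own illustrative assumptions, not Facebook's actual code.

```python
# Hypothetical sketch of "activity damping": engagement from hyperactive
# accounts counts for less when ranking a post. All names and numbers
# here are illustrative assumptions, not Facebook's real system.

def activity_weight(actions_per_day: float, typical: float = 20.0) -> float:
    """Weight an engagement event by how abnormally active its author is:
    a typical user's actions count fully; a user 100x more active than
    typical is damped roughly in proportion, not counted 100x over."""
    return min(1.0, typical / max(actions_per_day, 1.0))

def post_score(engagements: list[dict], damping_strength: float = 1.0) -> float:
    """Sum engagement signals, blending damped and undamped values.
    damping_strength=1.0 is the full fix; Zuckerberg's "cut the weighting
    by 80%" corresponds to damping_strength=0.2."""
    score = 0.0
    for e in engagements:
        damped = activity_weight(e["actions_per_day"]) * e["value"]
        score += damping_strength * damped + (1 - damping_strength) * e["value"]
    return score

# A post pushed by 50 hyperactive partisans vs one liked by 50 ordinary users:
hyperactive = [{"actions_per_day": 2000, "value": 1.0}] * 50
ordinary = [{"actions_per_day": 15, "value": 1.0}] * 50
print(post_score(hyperactive, 1.0))  # 0.5  - full damping
print(post_score(hyperactive, 0.2))  # 40.1 - after the 80% cut
print(post_score(ordinary, 1.0))     # 50.0 - ordinary users unaffected
```

Even in this toy version you can see why the 80% cut mattered: it left hyperactive accounts with most of their outsized influence intact.

Michael McNalley also joined Facebook in 2017, fresh from 13 years at Google developing systems to fight spam.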
With users annoyed by increasing levels of false and manipulative content but Facebook unwilling to adjudicate between true and false, McNalley worked on a set of smaller changes penalising characteristics of false Websites. Fake news sites attract more negative user comments, typically have popups and tons of ads, and are slow to load. By downgrading for each characteristic, some of the most manipulative hoax companies were driven to fold. Next, McNalley's team developed and tested a metric called "Broad Trust" based on user surveys. It successfully reduced the amount of misinformation and sensationalist content in News Feed. Though publicly backed by Zuckerberg, behind-the-scenes pressure from the much larger conservative ecosystem of partisan digital publishers and lobbying by Kaplan led him to water down the change. In late 2017 a new head of Integrity was appointed. He reported to the head of the Growth team.

How Facebook Changed Its Algorithms, 2018

Facebook copped a lot of very public criticism after Trump's election, and in 2018 Zuckerberg publicly announced what seemed like a big shift. Research was showing long hours on social media were bad for mental health, so Facebook was introducing a new target, Meaningful Social Interaction (MSI). The truth was the system had been pushing out so much crap that users, especially the young users critical for future growth, were not joining the platform. The reason was simple: the platform pushed what made money and what kept people online longer, so real interactions with family and friends, user likes and shares, had been declining since 2017. Without engagement and the free content provided by users, Facebook was less personal and less interesting.

The News Feed team was ordered into a 30-day lockdown to fix a problem they only half understood. The worth the system assigned to user actions was completely upended, changing the rules for what content was pushed to users. A reshare was now worth 30 times as much as a like; emoji responses five times a like. Engagement from "friends" a user interacted with most frequently got a small boost. Insiders felt intuitively these changes would encourage promoters of sensationalism and combative content. When a Growth team product manager asked if the change would favour more controversial content, the change team's manager acknowledged it could, adding: "The News Feed Integrity team is working very hard (and quickly!) to mitigate the potential Integrity impact of the launch". But as the former director who worked on the MSI change said: "When an engineer tells you 'we'll get to that', you have to understand they're lying". This was a 30-day sprint by staff in an at-work, no-home-life lockdown.

If you want to understand Facebook, here is where you have to take a step back. There are two types of interactions going on in Facebook - the many small-scale interactions of individuals with their circle of family and friends, and the large-scale, tightly focused actions of economic and political manipulators. Insider interviews make clear that senior technicians in Facebook understood how the latter worked. Renegade techies just like them worked for dodgy companies, earning good money to test how each change of the rules worked, then how to game the new rules. It wasn't hard to do, and the rewards were big. Oh, and let's not forget the long-standing Facebook rule that Governments and presidents were exempted from fact-checking.
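Horwitz reports the new MSI weights but not the code behind them, so here is a minimal hypothetical Python sketch of a scoring function of the shape described; the friend-boost value and the example numbers are my own assumptions.

```python
# Hypothetical sketch of the 2018 MSI reweighting as the book describes it:
# a reshare worth 30 likes, an emoji reaction worth 5, plus a small boost
# for engagement from close friends. The boost value is my own guess.

MSI_WEIGHTS = {"like": 1.0, "emoji_reaction": 5.0, "reshare": 30.0}

def msi_score(events: list[tuple[str, bool]], friend_boost: float = 1.2) -> float:
    """events: (event_type, from_close_friend) pairs for a single post."""
    score = 0.0
    for event_type, from_close_friend in events:
        value = MSI_WEIGHTS.get(event_type, 0.0)
        if from_close_friend:
            value *= friend_boost  # the "small boost" for frequent contacts
        score += value
    return score

# A warm family post vs a provocation engineered to be reshared in anger:
calm_post = [("like", True)] * 25
provocation = [("reshare", False), ("emoji_reaction", False)] * 5
print(msi_score(calm_post))    # 30.0
print(msi_score(provocation))  # 175.0
```

Under weights like these, ten angry engagements beat 25 likes from friends nearly six to one; provocation becomes the winning strategy by design. What about the effects?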
Around the world, political parties of all shades were finding themselves forced into inflammatory speech and posts to gain public attention. Poland's political parties described online political discourse as "a social-civil war", with one noting they had shifted the proportion of their posts from 50/50 positive/negative to 80/20 negative/positive, explicitly as a response to the change of the algorithm. Extremist parties actually boasted to a researcher they were running "provocation strategies" in which they would "create conflictual engagement on divisive issues, such as immigration and nationalism".

One important improvement for the non-US world was the extension overseas of some Civic tools rejected by Facebook's Public Policy team for use in the US. They were approved overseas to provide some level of balancing compensation for Facebook's low rate of problem detection in non-English-speaking nations. These were "break the glass" tools though - not crisis prevention but crisis responses, used to tamp down State-supported violence against Rohingya in Myanmar in 2018 and after ISIS-inspired suicide attacks in Sri Lanka in 2019.

How Facebook Changed, 2019

By 2019 misinformation dominated Facebook content. An internal scan for scammers identified 33,000 entities receiving a quarter of all Facebook views, virtually all of which operated independently without contact with the Partnership team, and provided just 0.14% of Facebook's revenue (p.96). Any legitimate business acting in its own self-interest would have shut them down. Really, what else can you conclude but that Facebook is a political project of the Right?

Outside Facebook, Brandon Silverman's company CrowdTangle had been designed to help nonprofits design and manage online activism campaigns. The product didn't make much money, but proved to be very good at tracking what was happening on Facebook and Twitter, specifically spotting when a topic began drawing unusual traffic or a post at the start of a viral growth curve - what Facebook, with internal access to all data, should have been doing but wasn't. Facebook bought CrowdTangle in 2016, after media mogul Rupert Murdoch publicly attacked Facebook and then privately warned Zuckerberg that if Facebook didn't work more collaboratively with the news industry, newspapers would publicly push back against Facebook. In 2019 Silverman made a presentation to the heads of News Feed, Video and Groups on the day's top content. It was 100% crap, so bad that the lucrative celebrity market had lost interest in Facebook.

In October 2019 Zuckerberg gave a rare public address at Georgetown's Gaston Hall on the subject of free speech. "I believe in giving people a voice because, at the end of the day, I believe in people. From all of our individual voices and perspectives, we can bring the world together". Facebook's Communications team monitored and filtered listener comments. Only positives got through, including the satirical "thanks for ruining the country", which fooled the filters.

Also in October 2019 the BBC ran an exposé on the sale of domestic servants in the Persian Gulf States. Passports were seized, and women were sold from household to household, forced to work seven days a week and confined to the house. Facebook took minimal action, deleting just one popular hashtag used in the sales. Apple, however, followed through: after referring continuing cases on Instagram and seeing no action, it threatened to remove Facebook's products from its App Store. Unlike human trafficking, this was a crisis Facebook responded to.
The company identified 133,000 posts, groups and accounts within days and promised enforcement, and Apple lifted its threat. Two years later the same abuse emerged in the Philippines.

In the 2019 Indian election the ruling BJP was backed by Indian information technology (IT) company Silver Touch, violating Facebook's rules by running networks of false pages, etc. The Pakistani military and Indian National Congress did the same. Civic pushed for takedowns but was rebuffed, finally being allowed to take down over 1,000 pages and groups just two weeks from the election, without mention of the BJP's role. The BJP demanded and got some of its pages restored. The issue was referred up to Zuckerberg, resulting in a near moratorium on removing domestically organised political spam. Plans to take down fraudulent networks in Indonesia before its pending election were dropped.

How Facebook Changed, 2020

In 2020 Sophie Zhang, a junior data analyst, quantified just how much dodgy publishers running multiple variants of ripped-off Webpages were benefiting from Facebook's algorithm. Management hadn't been paying attention, again. Zhang, who'd been largely moonlighting on Civic work, was later fired for not focusing on her day job.

In India, a whistleblower revealed Facebook was donating to Government-aligned charities, promoting BJP Ministers - and allowing Hindu politicians and advocates to incite violence. As a direct result of these posts, people were getting killed. A researcher set up a new dummy account for a 21-year-old Indian woman - for three weeks following Facebook's recommended content, the user was presented with a near constant barrage of polarising nationalism, misinformation, violence and gore. The content was particularly dark after a border clash with Pakistan. Facebook funnelled the user towards groups filled with content promoting full-scale war and mocking images of corpses with laughing emojis.

Heading into the 2020 US election, Civic prepared a presentation for senior management which warned that recent growth-focused changes to algorithms had created explosive growth in the reach of extremist content, finished with a clear warning that Facebook was ill-prepared, and asked for a lockdown to deal with company deficiencies. Zuckerberg said if Civic's head Chakrabarti thought a lockdown was necessary, the company would do it. Then he adjourned the meeting. As they exited, Zuckerberg pulled Integrity's Guy Rosen aside: "Why did you show me this in front of so many people?" Deniability was now the new norm, but this time the three-month lockdown to prepare for the election had scraped through.

New tools were created, identifying "narrowcast" propaganda efforts to manipulate specific ethnic minorities. Facebook started enforcing its anti-spam rules in groups and banned ads encouraging people not to vote. The angry emoji no longer amplified posts. Tools were built to detect the mass-scale voter suppression programmes which had become normal for autocrats. But there was still no green light to shut down domestically run political misinformation campaigns. Political ads were banned for one week before and after the election. Politicians' posts and campaign ads were still exempt from Facebook's fact-checking programme. In September 2020 Trump started his campaign to denigrate mail-in ballots, the sensible option with covid around, and a subtler form of voting disincentive.
Zuckerberg pledged to donate $US300 million via nonprofits to boost local and state election administration, doubling the federal Government allocation, while making the decision to close down the Civic team permanently after the election. Sixty-four separate break-the-glass measures were used to tone down metrics for violence and incitement to "normal" levels, then rolled back when the election was called for Biden in November.

How Facebook Changed, 2021

"Stop The Steal" went big on Facebook in January 2021 thanks to all the same conservative political funding, with the help of more, larger and uglier fringe groups created in the extremist promotions of earlier years. Facebook turned back on all its previous "break the glass" emergency tools and a lot more besides, suppressing the intended creation of a new "Patriot Party" backed by active insurrectionist communities. And after the new crisis they were all switched off again. I'm left thinking the big money hired enough smarts to win the US Presidency in 2024, but that's not the biggest issue. In between flashpoints like this, US conservative wealth will continue the slow buildup of its diverse, crazy movement until it has a vice grip on US politics, and therefore the world.

Retrospective analysis of 700,000 supporters of Stop The Steal helped Facebook map out the various roles in the campaign: ringleaders who created a strategy; amplifiers who used celebrity to spread messages; bridgers with a foot in multiple communities like anti-vax and QAnon; and finally, "susceptible users". These now well-developed vectors were capable of spreading even bizarre content on an unprecedented scale. Outside the US, in nations like India where Facebook oversight and crisis management is much less focused, life for political oppositions and minorities is getting much more dangerous.

Back in the USA, whistleblower Frances Haugen uncovered a researcher's presentation on their findings, summarised as: "We make body image issues worse for one in three teen girls", and revealed the role of Facebook's internal programme X-Check, which exempted seven million VIP members from fact-checking. Paid celebrity endorsements now played an increasing role in amplifying misinformation, while Facebook was found to grant protection to "abusive accounts" and "persistent violators". Facebook responded with tighter monitoring of internal document access and security. Haugen also provided strategy documents showing Facebook's response to Snapchat and TikTok winning over younger users. Presentations included "Exploring Playdates as a Growth Lever" and a new Instagram Kids launch. In a third-quarter internal review, fully 70% of the top 20 most-viewed posts on Facebook met the company definition of "regrettable", with the remaining 30% being softer engagement bait.

With past moves to reduce bad content rejected because in testing they reduced "user engagement", frustrated data scientists set up an experiment to track the cumulative effect of integrity work against a "minimum integrity" control group over a much longer time frame. Integrity changes did dent Facebook usage, but only for six months. After about a year, integrity changes produced a modest but significant gain, especially among new and younger users. Approval to share the research was held up for months and finally limited to Integrity staff.
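The book gives the experimental design and the headline result but no numbers, so here is a minimal hypothetical Python sketch of that long-horizon holdout comparison; the engagement curves are invented purely to show why a short test and a long test reach opposite conclusions.

```python
# Hypothetical sketch of the long-horizon holdout the data scientists ran:
# one cohort gets the integrity changes, a "minimum integrity" control
# cohort gets almost none, and engagement is compared month by month
# rather than in a short A/B test. All data below is invented.

def monthly_lift(treatment: list[float], control: list[float]) -> list[float]:
    """Relative engagement difference, treatment vs control, per month."""
    return [(t - c) / c for t, c in zip(treatment, control)]

# Invented engagement curves (average sessions per user per month):
control =   [10.0, 10.0, 10.1, 10.1, 10.2, 10.2, 10.2, 10.3, 10.3, 10.3, 10.4, 10.4]
treatment = [ 9.5,  9.6,  9.7,  9.8,  9.9, 10.1, 10.3, 10.5, 10.6, 10.7, 10.8, 10.9]

for month, lift in enumerate(monthly_lift(treatment, control), start=1):
    print(f"month {month:2d}: {lift:+.1%}")
# A one-month test reads as a ~5% engagement loss and kills the change;
# only past month ~6 does the treatment overtake the control - the
# "modest but significant gain" that surfaced only on a longer time frame.
```

How Facebook Changed, 2022

All Facebook staff still working on integrity and societal issues in 2022 were now reporting to Marketing!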
Social scientists needed approval before they could conduct research. And Zuckerberg made a big bet on the metaverse in 2022, a system which had about 200,000 daily users, compared to Facebook's 3.5 billion daily active visitors. I'd suggest the metaverse was never a big deal for him; the rebadging to Meta was just a clever diversion to deflect media scrutiny after the run of leaks.

I wondered where the CIA sat on Facebook's support for political extremism after Trump's near insurrection in January 2021, when Government agencies didn't take a stand either way on the day. You have to assume they're broadly supportive, given their lack of action. By 2022, there's a growing list of CIA appointments at Facebook you can read about. But this longer story of a senior CIA transfer who is sidelined by controlling Facebook management is very consistent with the problems outlined above, and worth a read. Even CIA appointments can't seem to limit Facebook's drive for growth and profits by encouraging social division (ibid.).

Facebook & The 2024 US Election

So now, thanks to the whistleblowers, you know how effectively Facebook works against democracy. Yet Facebook is controlled by Zuckerberg, with 61% of the votes. He claims to support democracy, but is totally wedded to endless growth for this corporate beast. And you know that in the last two elections Facebook had to use all its "break the glass" tools to dial down viral eruptions of electoral misinformation funded by Republicans. The logical conclusion is that the result of US elections now rests on how many of those tools they use, and for how long. Yet those decisions on how far to limit funded misinformation are increasingly constrained by Zuckerberg's self-interested view that they are limits on freedom. So, sadly, I correctly picked a Trump win.

Recall the internal research completed in 2022 but hidden, showing "integrity" changes (to reduce deliberate manipulation) could still increase usage by providing a product which helped users, not aggravated them. Why are Zuckerberg and Facebook's key managers not remotely interested in developing a product which truly connects people, if it could still generate high usage and therefore profits? We can't really know what goes on in their heads, but it's crystal clear that the super-rich are quite comfortable with social media which supports the growth of organisations encouraging anger and disengagement from democracy, as long as it maximises growth of online addiction and wealth extraction.

Nearly at the end of this article, I still feel I haven't found the words to come close to conveying how dysfunctional our world is, once you understand social media's support for building extremist movements in countries where the Internet is new. In the context of post-war covert US support for military-autocratic Governments, today's US domination of international forums, conservative economic ideology and the rise and rise of corporate power, social media misinformation will be tough to tackle. And then, look ahead and try to imagine how Facebook's support for unscrupulous vested interests will undermine our efforts to limit climate change. Meanwhile, we sit watching hours of meaningless click-bait but don't watch or understand the news. Citizens today have less understanding and provide less opposition. It's a perfect world for the rich - Rupert Murdoch's media manipulation on steroids, yet invisible.

What's Our Response?

Meaningful regulation of social media by the US isn't going to happen.
The US makes way too much money from it. The only way to fix Facebook is to make it irrelevant, which means building real groups which bring real people together in real places to discuss real improvements to the bullshit world these corporations have created.

A Word About The Book "Broken Code"

That word is "messy". The insider access to Facebook makes this a real tell-all story, but it's technical and complex. Also, you have to think beyond the views expressed by Facebook employees: everyone believes they're striving to make a better product and a better world, yet they're clearly building a tool which hands over power to the worst of the world to manipulate and con us. The book has 19 chapters, all unnamed, with overlapping time periods. The last chapters mix the backstory of key whistleblower Frances Haugen with Facebook's shift to the metaverse. In an honest Acknowledgements section, author Horwitz tells us the original editor "washed his hands of me for perfectly understandable reasons". But congratulations to Horwitz anyway - he did his best, and got this book out in public where it clearly needs to be. For all that's not clear here, how Facebook works to promote extremism is crystal clear, which makes this book very important indeed.

"Broken Code: Inside Facebook And The Fight To Expose Its Toxic Secrets" was written by Jeff Horwitz and published by Penguin UK in 2023.

Watchdog - 167 December 2024