After facing scrutiny about the role fake news and divisive content on Facebook played in the 2016 election, Facebook changed its News Feed algorithm to prioritize posts from friends and family while deprioritizing content from brands and publishers. That move drew ire from organizations with revenue models built around Facebook; some news organizations went on to see their referral traffic from Facebook drop by more than 80%.
At the World Economic Forum in Davos, Switzerland, billionaire investor and philanthropist George Soros said Facebook and Google are a “menace” to society and called for tighter governmental regulation around the “monopolistic behavior of the giant IT platform companies.” This caught the attention of Facebook Chief Operating Officer Sheryl Sandberg, who later asked her subordinates to investigate Soros.
Facebook disclosed in an earnings call with investors that people were actually spending less time on Facebook “by roughly 50 million hours per day.” Zuckerberg said it was part of the company’s effort to encourage “time well spent” on its platform and apps.
After special counsel Robert Mueller indicted 13 individuals linked to a Russian internet troll farm, Facebook ad executive Rob Goldman tweeted to dispute the notion that Russian ad spending had a meaningful effect on the election. Ads, however, were only a small part of the Russian influence operation, which largely relied on organic content to reach US users. President Donald Trump seized on Goldman’s tweet to claim the “Fake News Media” had supposedly been disproven by a Facebook executive. Goldman later apologized.
Sri Lanka blocked Facebook and WhatsApp for three days in response to posts calling for attacks on Muslims in the country. The move was a last resort after Facebook ignored calls from both the Sri Lankan government and NGOs to control ethno-nationalist accounts spreading hate speech that contributed to deadly anti-Muslim riots in the country.
Three news organizations simultaneously published stories about Cambridge Analytica, a political data analytics firm with ties to the Trump campaign, and its misappropriation of millions of users’ Facebook data. While the story initially attracted attention based on the notion that ill-gotten Facebook data helped sway the election toward Trump — which has yet to be definitively proven — the stories laid bare Facebook’s lax policies around the use and sharing of user information. The scandal damaged users’ trust in Facebook, especially as executives remained quiet on the matter in the days following the initial stories, and set the stage for Zuckerberg’s visit to Capitol Hill in April.
BuzzFeed News published an internal memo from Facebook executive Andrew Bosworth in which one of Zuckerberg’s most trusted lieutenants called any effort to connect the world a “de facto good.” “Maybe it costs a life by exposing someone to bullies,” he wrote in June 2016. “Maybe someone dies in a terrorist attack coordinated on our tools… And still we connect people.” The memo caused an external backlash in light of troubling events, including Facebook’s role in abetting genocide in Myanmar and its role in broadcasting a spate of suicides and murders on its livestreaming tool. In a statement, Zuckerberg disputed the idea that Bosworth was speaking for the company and said, “We’ve never believed the ends justify the means.” In response to the story, Bosworth tweeted (and later deleted), “I don’t agree with the post today and I didn’t agree with it even when I wrote it.”
Zuckerberg made two appearances on Capitol Hill, in front of committees for the Senate and House of Representatives, where he faced questions on the company’s business practices, protection of user data, and ethics. While the questioning exposed some lawmakers’ shaky grasp of technology — Sen. Orrin Hatch bafflingly asked Zuckerberg how the company makes money — it also exposed just how invasive Facebook’s technologies have become in our everyday lives. Zuckerberg was often awkward and evasive (“I’ll have my team follow up with you”), yet he said of Russian election interference: “It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”
WhatsApp founder Jan Koum announced he was leaving the company he founded, four years after selling it to Facebook for $19 billion. Later reports would point to internal disagreements over the business direction of WhatsApp.
Zuckerberg headed to Brussels to face questions from members of the European Parliament, one of whom asked Facebook’s CEO if he wanted to be remembered for creating “a digital monster.” The appearance came a few weeks after Facebook’s chief technology officer appeared in front of a UK parliamentary committee for questioning about the company’s role in the Cambridge Analytica scandal, where British lawmakers excoriated Zuckerberg for not showing up himself. Zuckerberg has since rebuffed their invitations to testify on multiple occasions.
Facebook confirmed, after several reports, that it had data-sharing agreements with Chinese device manufacturers Huawei, Lenovo, and Oppo. The findings angered US lawmakers, including Sen. Mark Warner, who criticized Facebook for providing special API access to companies with ties to the Chinese government. In a document submitted to Congress in July, Facebook also acknowledged that it had data agreements with 61 app developers that allowed them special access to user information as they came into compliance with new policies.
Following the brutal lynching of five men in a small Indian village, sparked in part by rumors spread on WhatsApp, India’s Information Technology Ministry released a statement noting that WhatsApp “cannot evade accountability and responsibility.” The beating was the low point in a series of mob attacks triggered by viral misinformation about child kidnappers circulating on WhatsApp in parts of the country. Since May, at least 16 lynchings related to WhatsApp rumors have led to 29 deaths in the country.
BuzzFeed News reported that Zuckerberg spoke with Trump by phone shortly after the election to congratulate the then–president-elect. Facebook and Zuckerberg had been largely mum on how Trump’s campaign effectively used the social network to reach voters, spending millions of dollars on the way to its 2016 victory. Internal documents obtained by BuzzFeed News also showed that Facebook employees lauded the Trump campaign as a success story for its ad tools, and considered bringing in members of the campaign to talk about their experiences using the platform.
Facebook suffered the largest single-day loss in market value of any public company, shedding more than $119 billion following an earnings call in which it warned investors that revenue growth would slow.
In a series of blog posts, Facebook revealed that it was still seeing coordinated activity with possible ties to Russia on its platform. The 32 pages and accounts removed by Facebook were aimed at sowing discord among American users. Facebook’s openness about the takedowns continued through the year, as the company shared details on similar disinformation campaigns with links to Iran and other foreign governments.
After a months-long debate, Facebook removed four pages associated with Infowars founder Alex Jones from its platform for violating the company’s hate speech policies. Although critics had been reporting questionable videos and posts to content reviewers throughout the summer, Facebook’s move came only hours after Apple decided to remove Infowars’ episodes from its podcast platform.
A report from the United Nations called Facebook a “useful instrument for those seeking to spread hate” in the genocide of Rohingya Muslims in Myanmar and said the company’s response to the crisis was “slow and ineffective.” While Zuckerberg had previously attempted to apologize to Myanmar NGOs, many believed that Facebook’s response was too little, too late, given that the company had been warned for years about the effects of dehumanizing speech and misinformation spread by government officials using its tools. The company has since committed to hiring Burmese-language moderators to deal directly with content from the country.
Following in the footsteps of WhatsApp’s founders, Instagram founders Kevin Systrom and Mike Krieger announced that they, too, were leaving the company they had sold to Facebook. Speaking at a conference the following month, Systrom would go on to say “no one ever leaves a job because everything’s awesome.”
Joel Kaplan, Facebook’s vice president for global public policy, was seated behind his friend Judge Brett Kavanaugh as he testified in front of the Senate Judiciary Committee about the sexual assault allegations of Christine Blasey Ford. Kaplan’s appearance, caught on television cameras, sparked outrage among some Facebook employees, who took to internal message boards to express their discontent. While Kaplan reportedly apologized to employees in a note, he would later throw a private party for Kavanaugh after the new Supreme Court justice was confirmed by the Senate.
The company disclosed that a security issue involving Facebook access tokens could have exposed millions of users’ personal information, including email addresses, phone numbers, genders, locations, birth dates, and recent search histories. While the social network initially said as many as 50 million accounts were affected, it later revised that number to 30 million and said that the FBI was investigating.
Facebook launched Portal, its long-in-development video chat device, in time for the holiday shopping season. The device, despite some decent early reviews, was met with staunch criticism from onlookers who believed that consumers had no reason to trust a company that had been exposed for so many privacy and user data mishaps over the previous 12 months. “It honestly blows my mind that anyone (who has not been living under a rock) would put a product of Facebook— of all companies— in their living room,” wrote one Amazon reviewer who gave the product one star.
Brazilian newspaper Folha released a report showing that backers of right-wing then–presidential candidate Jair Bolsonaro used databases of supporters’ phone numbers to spam WhatsApp users with disinformation about his political rival. The finding came a few weeks after Facebook brought journalists in to view its new “war room,” where it said teams would work together to combat election interference around the world. WhatsApp was a key vector for false news and disinformation during the Brazilian election, which Bolsonaro won.
Following reporting from BuzzFeed News, Facebook removed a spam network of 95 pages and 39 accounts in the Philippines for “encouraging people to visit low quality websites that contain little substantive content and are full of disruptive ads.” The move came after BuzzFeed News chronicled how an army of supporters of President Rodrigo Duterte used Facebook to cultivate support for his candidacy and subsequent drug war with inflammatory, hate-filled posts and memes.
On the night of US midterm elections, Facebook announced it had removed 100 Facebook and Instagram accounts for possible links to Russia’s Internet Research Agency. The removed accounts were identified following a tip from law enforcement agencies — the first time Facebook publicly acknowledged that it had taken action against a foreign influence campaign based on government intelligence, the New York Times reported.
A bombshell report in the New York Times laid bare poor decision-making around Russian election interference among Facebook’s top executives, who seemed more interested in protecting themselves than in anticipating and acknowledging the company’s problems. The story outlined how Zuckerberg and Sandberg were slow to respond to and disclose the nature of Russian interference on the platform, and how the company attempted to attack critics like George Soros with tactics that included opposition research. Zuckerberg and Sandberg would later say they were unaware of the work consulting firm Definers Public Affairs had done against Facebook’s critics, a claim that wasn’t entirely accurate. (See below.) The resulting coverage roiled employees, who engaged in an extraordinary amount of internal discussion and finger-pointing that has since spilled into public view.
Zuckerberg revealed Facebook’s new policies on content and moderation, saying in a post announcing the changes, “One of the most painful lessons I’ve learned is that when you connect two billion people, you will see all the beauty and ugliness of humanity.” In the note, Zuckerberg covered complex topics including the possibility of an independent body to oversee content appeals and future regulation. It was overshadowed, however, by the fallout from the Times’ story a day earlier.
BuzzFeed News reported that Sandberg had personally told staffers via email to investigate George Soros following the billionaire philanthropist’s remarks at Davos in January calling Facebook a “menace.” The company maintained that her request was separate from any work later done by Definers on Soros. BuzzFeed News also later published some of the opposition research the public affairs firm had conducted on Facebook’s behalf.
A UK parliamentary committee investigating Facebook released a cache of internal emails it had obtained from a plaintiff suing the company in California. The emails revealed a side of Facebook the public rarely gets to see and showed executives discussing the possibility of selling user data. One internal document revealed how Facebook came to the conclusion that it needed to purchase WhatsApp for $19 billion, while another featured engineers discussing how to hide prompts that would have notified users that the company was scraping phone and text data from Android devices.
Facebook disclosed that a bug may have exposed the photos of up to 6.8 million users to developers during a 12-day period in September. Affecting up to 1,500 apps from 876 developers, the security flaw gave developers the ability to access users’ photos, even those a person had uploaded to the service but not shared publicly. The disclosure came a day after the company built a pop-up store in New York City’s Bryant Park to teach users about privacy.
The New York Times reported on a number of special data-sharing relationships between Facebook and companies like Spotify, Netflix, and Microsoft that existed unbeknownst to many users. Using internal documents, the Times showed how Facebook shared user data, sometimes without people’s consent, allowing Microsoft to personalize search results on Bing and Amazon to obtain users’ names and contact information through their friends. The story raised questions as to whether the company violated a 2011 consent decree with the FTC.
The attorney general for the District of Columbia filed a lawsuit against Facebook for its role in allowing Cambridge Analytica to harvest the personal information of millions of people without their consent. The suit is the first by a US regulator seeking to punish Facebook for the scandal.