“Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism” is a memoir by Sarah Wynn-Williams, a former director of public policy at Facebook. In the book, she details how Facebook’s decisions caused real-world harm. Below are two examples of such harm: the company’s decisions in Myanmar, and its targeted advertising to young women and girls.
Myanmar
- Myanmar had virtually no internet until Facebook. Facebook “made deals with local telecoms to preload phones with Facebook, and in many plans, time spent on Facebook wasn’t counted towards your minutes.” Pg. 342-343
- Hate speech towards a Muslim minority in Myanmar, the Rohingya, runs rampant. The Facebook team overseeing Myanmar dismisses the concerns raised.
- As hate speech increases, so does mob violence driven by misinformation. The posts are said not to “go against [Facebook’s] policies” and are kept up.
- Community standards are not posted in Burmese, and several features (such as the Like and Report buttons) render as corrupted characters.
- Myanmar does not run on Unicode, so Burmese-language posts cannot be read outside the country.
- When told about the riots and about Myanmar not using Unicode, the author is told, “I would love to prioritize getting this done.… I just don’t think we can justify that given the other things in our pipeline.” Yet the company has no problem assembling teams to build censorship tools for China. Pg. 346-347
- Myanmar is not a priority country in Southeast Asia, meaning it receives no PR resources.
- The official Facebook app is unavailable for download in Myanmar, so people download unofficial versions. These versions have no Report button, making it impossible to report hate speech.
- The few posts that were reported were ignored, with no action taken.
- A Burmese content-moderation contractor is suspected of allowing an anti-Muslim slur to stand and of defending its use. He is accused of “removing more posts by civil society groups and peace activists than government and anti-Muslim accounts.” These concerns are dismissed by higher-ups. Pg. 348
- What’s more, Facebook refuses to ban the slur in question.
- The author’s team partners with groups outside Facebook to get the work done, and it is confirmed that Facebook’s moderators are not removing harmful content but are instead removing content they shouldn’t.
- There is no way to verify accounts because the platform is “dysfunctional in Burmese,” and as a result civil society groups are being mass-reported as fake and taken down.
- The people monitoring Myanmar are based in Dublin, the wrong time zone for monitoring posts as they appear.
- They hire someone who is “male, older, white, and a Harvard graduate” in the hope that these qualities will get the team recognition from higher-ups who fit the same description.
- The author is essentially told by the Republicans in the DC office that because she is not a Jewish person who went to Harvard, she will “never be like [her bosses]. And the sooner [she] grasp[s] this, the better.” Pg. 350-351
- “Casual nepotism [that] runs through most of the senior team” plays a big role in the internal workings of Facebook. Pg. 351
- The author says, “it would be impossible to convince Facebook’s leadership to hire someone just for Myanmar.” Pg. 351
- The author is told to drop the Myanmar issue because she is “only in charge of Asia temporarily,” ignoring the facts that she began the hiring process long before her role became temporary, that the team needed someone in-country, and that no one more qualified could be found. She is told “not to raise the issue again.” Pg. 352
- Civil society groups in Myanmar are quoted as saying: “I think it’s critical to recognize that Facebook is being used deliberately, systematically, and very effectively to target not just peace and interfaith activists, but also journalists and anyone who deviates from the dominant [junta] narrative.” Pg. 352
- Widespread hacking and account takeovers are reported in Myanmar, including a group of “571 members that seems to be a site for planning and sharing information for spearphishing attacks on verified users and pages,” just as the civil society groups warned. Pg. 352
- Facebook would elevate posts that spread hatred and fear because they received so much engagement.
- When the author’s team argues these posts should be taken down, the content operations team responds that the “posts don’t violate local laws, so there’s no reason to take them down.” Pg. 353
- When this activity culminated in slaughter on the scale of genocide, with inhumane violence against Muslim people, Facebook said, “It had found evidence that the messages were being intentionally spread by inauthentic accounts and took some down at the time. It did not investigate any link to the military at that point.” This could have been avoided. Pg. 354-355
- “The UN report on the human rights violations in Myanmar devotes over twenty pages to the critical role Facebook played in spreading hate.” Pg. 355
- Most of the issues listed in the UN report are ones the author’s team had tried to fix for years. “The truth here is inescapable. Myanmar would’ve been far better off if Facebook had never arrived there.” Pg. 356
- “I’ve spent a lot of time thinking about what unfolded next in Myanmar, and Facebook’s complicity. It wasn’t because of some grander vision or any malevolence toward Muslims in the country. Nor a lack of money. My conclusion: It was just that Joel, Elliot, Sheryl, and Mark didn’t give a fuck.” Pg. 356
- “They apparently didn’t care. These were sins of omission. It wasn’t the things they did; it was the things they didn’t do.” Pg. 356
The Effects on Young Girls
- “Facebook is offering advertisers the opportunity to target thirteen-to-seventeen-year-olds across its platforms, including Instagram, during moments of psychological vulnerability when they feel ‘worthless,’ ‘insecure,’ ‘stressed,’ ‘defeated,’ ‘anxious,’ ‘stupid,’ ‘useless,’ and ‘like a failure.’ Or to target them when they’re worried about their bodies and thinking of losing weight. Basically, when a teen is in a fragile emotional state.” Pg. 330
- Facebook and Instagram monitor “teenagers’ posts, photos, interactions, conversations with friends, visual communications, and internet activity on and off Facebook’s platforms and use this data to target young people when they’re vulnerable,” as well as when they have issues with their bodies. Pg. 330
- Facebook is evidently proud of this emotional manipulation/targeting.
- It’s known in the company that Facebook “commercialized and exploits [their] youngest users.” Pg. 332
- Facebook critics have warned of dystopian scenarios like this in the past.
- They prepare a statement but cannot say they will remedy the situation, as a staffer says that “[they] can’t say [they’re] taking efforts to remedy it if [they’re] not.” Pg. 332
- Facebook targets young mothers as well as racial and ethnic groups.
- Facebook even logs when teenage girls delete selfies so that beauty ads can be shown to them.
- The author is discouraged from commissioning an independent third-party audit that could help her stop the practice.
- A teenager takes her own life after following accounts tied to Instagram’s targeting terms, one of them being “Worthless.” The company is warned of the risk of similar incidents in the future.
- Facebook now uses “depressed” as a targeting word as well.
- Facebook is working on tools that would let advertisers target users’ behavioral posts directly, without Facebook’s help.
- Facebook releases a statement saying, “Facebook does not offer tools to target people based on their emotional state” (Pg. 334), even though internal emails confirm that it does. The company is lying to the public.
- “This is the business, Sarah. We’re proud of this. We shout this from the rooftops. This is what puts money in all our pockets. And these statements make it look like it’s something nefarious.” The executive is more worried about what advertisers think. Pg. 334
- Elliot, a higher-up, says, “If you and [the executive] both hate this—for opposite reasons—we must’ve gotten this exactly right.” They know what they are doing is immoral and bad for business, yet treat the criticism from both sides as proof they got it right. Pg. 335
Why This Matters For Civics, Media Literacy, or Democratic Accountability
What Facebook does here violates any normal moral code so thoroughly that it goes beyond civics and into, as the author puts it, “dystopian future nightmares.” Facebook’s users are failed at the most basic level: their emotions, personal data, and safety are traded away for money. In Myanmar, blood is shed in a genocide that rests on the hands of rich, white executives; in Australia, teenage girls take their own lives while targeted ads push them to feel worse.