By now, it feels like the day-to-day at Facebook is lurching from one dismaying shit show to the next. Mark and Sheryl seem completely removed. Focused on presidential runs or promoting new books or commencement speeches or whatever.
In April 2017, a confidential document is leaked that reveals Facebook is offering advertisers the opportunity to target thirteen- to seventeen-year-olds across its platforms, including Instagram, during moments of psychological vulnerability when they feel “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” “stupid,” “useless,” and “like a failure.” Or to target them when they’re worried about their bodies and thinking of losing weight. Basically, when a teen is in a fragile emotional state.
Facebook’s advertising team had made this presentation for an Australian client that explains that Instagram and Facebook monitor teenagers’ posts, photos, interactions, conversations with friends, visual communications, and internet activity on and off Facebook’s platforms and use this data to target young people when they’re vulnerable. In addition to the moments of vulnerability listed, Facebook finds moments when teenagers are concerned with “body confidence” and “working out & losing weight.”
At first blush it sounds pretty gross, sifting through teens’ private information to identify times when they might be feeling worthless and vulnerable to an advertiser flogging flat-tummy tea or whatever other rubbish.
But apparently Facebook’s proud of it. They’ve placed a story in Australia explaining how the company uses targeting based on emotions: “How Brands Can Tap into Aussie and Kiwis [sic] Emotions: Facebook Research,” which touts how Facebook and Instagram use the “emotional drivers of behavior” to allow advertisers to “form a connection.” The advertising industry understands that we buy more stuff when we are insecure, and it’s seen as an asset that Facebook knows when that is and can target ads when we’re in this state.
It’s a reporter for an Australian newspaper who’s got his hands on one of the internal documents about how Facebook actually does this, and he reaches out for a comment from Facebook before publishing. That’s when I hear about it. I didn’t know anything about this and neither did the policy team in Australia. It’s an advertising thing. I’m put on a response team of communications specialists, members from the privacy team and measurement team, and safety policy specialists that’s supposed to figure out what to say publicly.
No one in that group, other than me and my Australian team, seems surprised that Facebook made an advertising deck like this. One person messages the group, “I have a very strong feeling that she [the Australian staffer who prepared the deck] is not the only researcher doing this work. So do we want to open a giant can of worms or not?” And they’re right. At first, we think the leaked document is one Facebook made to pitch a gum manufacturer to target teenagers during vulnerable emotional states. Then eventually the team realize, no, the one that got leaked was for a bank. There are obviously many decks like this.
The privacy staffer explains that teams do this type of customized work targeting insecurities for other advertisers, and there are presentations for other clients specifically targeting teens. We discuss the possibility that this news might lead to investigations by state attorneys general or the Federal Trade Commission, because it might become public that Facebook commercializes and exploits Facebook’s youngest users.
To me, this type of surveillance and monetization of young teens’ sense of worthlessness feels like a concrete step towards the dystopian future Facebook’s critics had long warned of.
A statement is quickly drafted and the response team debates whether Facebook can include the line, “We take this very seriously and are taking every effort to remedy the situation,” since in fact this is apparently just normal business practice. A comms staffer points out what should be obvious: that “we can’t say we’re taking efforts to remedy it if we’re not.”
This prompts other team members to confirm his take, revealing other examples they know of. Facebook targets young mothers, based on their emotional states, and targets racial and ethnic groups— for example, “Hispanic and African American Feeling Fantastic Over-index.” Facebook does work for a beauty product company tracking when thirteen- to seventeen-year-old girls delete selfies, so it can serve a beauty ad to them at that moment.
We don’t know what happens to young teen girls when they’re targeted with beauty advertisements after deleting a selfie. Nothing good. There’s a reason why you erase something from existence. Why a teen girl feels that it can’t be shared. And surely Facebook shouldn’t then be using that moment to bombard them with extreme weight loss ads or beauty industry ads or whatever else they push on teens feeling vulnerable. The weird thing is that the rest of our Facebook coworkers seem unbothered about this.
My team and I are horrified; one of them messages me, “Also wondering about asking my apparently morally bankrupt colleagues if they are aware of any more. The Facebook advertising guy who is cited in the [Australian] article has three children— I talked him through his kid being bullied— what was he thinking?”
I’m still struggling to get a better picture of what we’re dealing with here. So I ask for an independent audit by a third party to understand everything that Facebook has done like this around the world, targeting vulnerable people, so I can try to stop it. Who has this information and how many advertisers has it been shared with? The team is not enthusiastic. Elliot nixes any audit and cautions against using the word “audit” at all, even as an ask like mine, saying that “lawyers have discouraged that description in similar contexts.” He doesn’t say why but I’m guessing he doesn’t want to create a paper trail, a report with damning details that could be leaked or subpoenaed. Years later I would learn that British teenager Molly Russell had saved Instagram posts including one from an account called Feeling Worthless before committing suicide. “Worthless” being one of the targeting fields. This only emerged due to a lawsuit that revealed internal documents acknowledging “palpable risk” of “similar incidents.”
The initial statement Facebook gives the Australian journalist who discovered the targeting and surveillance back in 2017 does not acknowledge that this sort of ad targeting is commonplace at Facebook. In fact, it pretends the opposite: “We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate.”
A junior researcher in Australia is fired. Even though that poor researcher was most likely just doing what her bosses wanted. She’s just another nameless young woman who was treated as cannon fodder by the company.
When that doesn’t stop media interest, Elliot says, “We need to push back hard on the idea that advertisers were enabled to target based on emotions. Can you share to group so Sheryl et al can i) see the article, ii) understand next steps.” Joel wants a new, stronger statement, one saying that we’ve never delivered ads targeted on emotion. He directs that “our comms should swat that down clearly,” but he’s told that it’s not possible. Joel’s response: “We can’t confirm that we don’t target on the basis of insecurity or how someone is feeling?” Facebook’s deputy chief privacy officer responds, “That’s correct, unfortunately.” Elliot asks whether it is possible to target on words like “depressed” and the deputy chief privacy officer confirms that, yes, Facebook could customize that for advertisers. He explains that not only does Facebook offer this type of customized behavioural targeting, there’s a product team working on a tool that would allow advertisers to do this themselves, without Facebook’s help.
Despite this, Elliot, Joel, and many of Facebook’s most senior executives devise a cover-up. Facebook issues a second statement that’s a flat-out lie: “Facebook does not offer tools to target people based on their emotional state.” The new statement is circulated to a large group of senior management who know it’s a lie, and approve it anyway. It reads,
On May 1, 2017, The Australian posted a story regarding research done by Facebook and subsequently shared with an advertiser. The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state.
The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.
I take a couple of days off for a family trip and to celebrate Xanthe’s birthday, and the response team continues on without me. I’m glad to miss it.
Excerpted with permission from Careless People: A Story of Where I Used to Work by Sarah Wynn-Williams, published by Flatiron Books/Pan Macmillan India
**************
Shocking and darkly funny, Careless People gives you a front-row seat to the decisions that are shaping our world and the people who make them. Welcome to Facebook. Sarah Wynn-Williams, a young diplomat from New Zealand, pitched for her dream job. She saw Facebook’s potential and knew it could change the world for the better. But, when she got there and rose to its top ranks, things turned out a little different. From wild schemes cooked up on private jets to risking prison abroad, Careless People exposes both the personal and political fallout when boundless power and a rotten culture take hold. In a gripping and often absurd narrative, Wynn-Williams rubs shoulders with Mark Zuckerberg, Sheryl Sandberg and world leaders, revealing what really goes on among the global elite – and the consequences this has for all of us. Candid and entertaining, this is an intimate memoir set amid powerful forces. As all our lives are upended by technology and those who control it, Careless People will change how you see the world.
The publication of this book was announced literally a few days before it was published. As soon as the announcement was made, Meta, the parent company of Facebook, moved for an injunction to prevent Sarah Wynn-Williams from promoting her book, and won an emergency arbitration ruling to temporarily stop promotion of the tell-all. The New York Times book review called it “an ugly, detailed portrait of one of the most powerful companies in the world” and its leading executives, including CEO Mark Zuckerberg, former chief operating officer Sheryl Sandberg and chief global affairs officer Joel Kaplan. Meta would suffer “immediate and irreparable loss” in the absence of emergency relief, the American Arbitration Association’s emergency arbitrator, Nicholas Gowen, said in a ruling after a hearing, which Wynn-Williams did not attend. Book publisher Macmillan attended and argued it was not bound by the arbitration agreement, which was part of a severance agreement between Wynn-Williams and the company. The ruling says that Wynn-Williams should stop promoting the book and, to the extent she could, stop further publication. It did not order any action by the publisher.
Despite these efforts, Meta has been unable to prevent Careless People from zooming to the top of bestseller charts across the world. The Streisand Effect has come into play. There has been an incredible upsurge in the number of social media posts across platforms, including Meta-owned Threads, Instagram, and Facebook, either batting for or against the book. It has been extraordinary reading these “testimonies”. They bring to mind the old phrase: there is no smoke without fire.
The extract that has been published here is just one of the many explosive revelations in Careless People. It is unnerving because this sort of behaviour was always talked about, but here is a former employee of the firm connecting the dots for everyone to read.
In fact, every MP across the UK will receive a copy of Careless People on behalf of the Molly Rose Foundation, in a partnership with Pan Macmillan. The Molly Rose Foundation was set up following the death of 14-year-old Molly Russell. A coroner concluded that the effects of online content viewed on social media “contributed to her death in a more than minimal way”. Molly was algorithmically exposed to a torrent of depressive, suicide and self-harm content on Instagram. The inquest heard she had seen more than 2,000 posts relating to suicide, self-harm and depression on the platform in the months before her death. The charity has sent the books to MPs to highlight the preventable harm caused by Meta’s platforms and the importance of being able to scrutinise the culture, practices and leadership of tech firms that expose users and society to harm.
The title of Sarah Wynn-Williams’s book, Careless People, is a reference to the well-known novel The Great Gatsby, published in 1925 by F. Scott Fitzgerald, about the mysterious millionaire Jay Gatsby and his former lover Daisy Buchanan. The relevant sentence in The Great Gatsby, yoking it to Careless People published a century later, is: “They were careless people, Tom and Daisy — they smashed up things and creatures and then retreated back into their money or their vast carelessness or whatever it was that kept them together, and let other people clean up the mess they had made.”
Sarah Wynn-Williams is a former New Zealand diplomat and international lawyer. She joined Facebook after pitching for a job and worked there for many years, ultimately becoming director of global public policy. Since leaving the company, she has continued to work on tech policy, including artificial intelligence.
Sarah Wynn-Williams, Careless People: A Story of Where I Used to Work, Flatiron Books, an imprint of Macmillan, Pan Macmillan, London, 2025. Pb. Pp. 400.