Life’s a breach: After a year from hell, can Facebook refresh?



The social network suffered the greatest one-day stock market loss of any US company in history

Laurence Dodds

Every December, Facebook shows each of its 2.4bn users a personalised video of their “year in review”. Its algorithms try to pick their happiest photos and fondest memories – birthdays, marriages, new family members – to remind them that it has been there for all their best moments.
But if Mark Zuckerberg got such a video this Christmas he would be forgiven for recoiling from it like Scrooge from the ghost of Marley: Facebook had a truly hellish year in 2018.
Last January, it seemed like the company had finally come to Jesus.
After a long series of escalating scandals over fake news, election interference, political bias and phone addiction, Zuckerberg announced that his new year’s challenge (in previous years he has promised to learn Mandarin and wear a tie every day) was simply to fix Facebook.
And lo, from every company blog post and official statement the same message was echoed: Facebook had been “too slow” to recognise its problems, but 2018 would be different.
However, the chains Zuckerberg had forged over the past decade were long indeed, and in March its lax policies on personal data came back to haunt it.
The loophole that allowed the Cambridge Analytica scandal had been closed long ago, but 87 million users’ personal data was in the wild and had been used in political campaigns.
Facebook had suffered scandals before, but this one tied it to the election of Donald Trump, and earned it all the fury of his numerous enemies. Zuckerberg made an apology tour, gulping down water before the US Congress and enduring a monstering by the European parliament.
Then Facebook suffered the greatest one-day stock market loss of any US company in history when it revealed that its user growth in the lucrative first world was slowing down more quickly than anticipated.
It was blamed for lynchings in India, attempted genocide in Myanmar and fascism in Brazil; two large data breaches put its users at risk; journalists exposed its giant data-sharing deals with other companies and the hardball tactics of its hired PR firms; while the British parliament published leaked documents that revealed its internal priorities.
In between came a drumbeat of smaller scandals that it would once have shrugged off but were now part of a deluge.
The chaos has blindsided Facebook’s employees. “It had a deep impact on morale while I was there, and I’m sure it’s gotten worse since then,” says Brian Amerige, a former employee who quit in October in protest at what he described as a left-wing “monoculture” within Facebook.
“The attitude from folks who were at the company for four-plus years was more cynical and despondent than I’ve ever seen it.”
How did it all go so wrong? And how, after such a bruising 2017 and with such big plans for 2018, did Facebook fail to contain things?
Part of the problem is that it is still trying to mitigate the effects of decisions it took more than a decade ago.
Cambridge Analytica, the British parliament leaks and the recent row over access to private messages all came from Facebook’s long history of sharing user data with other companies to make itself a universal hub for every imaginable service.
Its problems with fake news and their violent results in Myanmar all stem from its great success in keeping users hooked and “engaged”. Its data breaches happened because gathering vast archives of private information under one roof is the basis of its whole business. Facebook’s biggest problem, arguably, is Facebook.
“It’s sad, but not surprising to see the way things have gone,” says Chris Eberle, a former director at Facebook.
For years, he explains, executives minimised potential problems and allowed themselves to believe that what was good for Facebook would be good for the world.
A leaked internal memo written in 2016 by Andrew Bosworth, a member of Zuckerberg’s inner circle, illustrates the problem. “The ugly truth”, Bosworth wrote, was that “connecting people” was so important that almost any tactic used to make Facebook bigger, or almost any negative consequence – including “terrorist attacks” and deaths from bullying – was worthwhile (he later said he did not agree with his post and had merely meant to provoke debate).
The result was that ad sales, and therefore profits, were systematically prioritised over safety. Staff responsible for taking down harmful content faced pressure not to hurt Facebook’s engagement statistics by removing too much; staff responsible for the ads market did not look too closely at where ads were coming from.
Those teams were full of talented, principled people, says Eberle, but “no matter how principled these folks are, their variable compensation is based on the performance of the company”.
Take fake news. Back in 2015, Facebook’s operations staff were worried about benign but untrue stories – people emerging alive from the bellies of dead alligators – spreading on their network.
It became a “point of friction” with sales staff, who argued the stories’ authors were buying lots of ads and should not be curbed.
“Hindsight is 20/20,” says Eberle, “but had Facebook looked at fake news at that time ... they might have prevented or mitigated [political] fake news being a thing.” Today, both Eberle and Amerige say, there is a heartfelt engagement with these problems.
“I believe the commitment runs deep,” says Eberle, arguing that if Facebook is “humble” and clear about its goals then 2019 can be “a turnaround year”.
But that may be a big “if”. Even today, insiders have described a “culture of conformity” that encourages “sycophantic” praise and furious defence of the company on its internal social network.
That obstinacy is reflected at the top. Zuckerberg has told his employees that Facebook is at “war”, and that he must make quick decisions to protect it; he has attacked some negative news coverage as “bullshit” and allegedly dismissed the Cambridge Analytica scandal as “hysteria”.
The billion-dollar question remains: will this really hurt Facebook’s bottom line? For years it looked like the answer might always be “no”.
Within eight weeks of the Cambridge Analytica scandal, the company’s shares had fully recovered; apparently Wall Street wasn’t that perturbed.
But since the start of 2018 those shares have dropped by almost as much as they did after Facebook’s calamitous public float in 2012. As the American columnist Shira Ovide has pointed out, the company has lost $270bn in market value since July – bigger than the entire capitalisation of Walmart.
Part of that is from the general tech stock bloodbath, but Facebook has suffered far more than Apple and Amazon.
The problem for Facebook is that its business ultimately depends on selling adverts that users don’t want to see next to the content that they do want to see.
Hence, according to Matti Littunen of Enders Analysis, “the overall amount of ad inventory is determined by how much time people spend” multiplied by how many ads can be injected into their feed “before [they] go mad”.
Facebook has now hit that limit, to the point where Zuckerberg actually wants to decrease the amount of time people spend on it to prevent them from burning out. That leaves Facebook very vulnerable to anything that might lower its ad sales or the number of eyeballs it attracts each day.
Enter American politicians, who have reversed their stance on Facebook with astonishing speed since 2016, when Sheryl Sandberg, Facebook’s chief operating officer, had to dismiss rumours that she would serve as treasury secretary if Hillary Clinton won the election.
Efforts are under way to pass GDPR-style privacy legislation, boosted by Democratic gains in Congress. Britain has mooted a social media regulator that would impose stricter age verification and deadlines for removing hate speech, in addition to the “tech tax” announced by the chancellor of the exchequer, Philip Hammond.
GDPR has already helped shrink Facebook’s user base in Europe; similar laws elsewhere could have a similar effect.
And although the company races to monetise new services such as Instagram and WhatsApp and to pull in new users in the developing world, Facebook itself remains its main source of revenue.
That, in the end, is why Facebook’s year from hell will leave lasting scars.
Even Mike Masnick, a veteran tech journalist who is no friend of GDPR or the California privacy act, has called the looming regulatory backlash a disaster of the company’s own making. “Facebook’s continued inability to be open and transparent about its actions,” he warns, “is certainly going to lead to hamfisted regulations that will block useful innovations from other companies.”
If that happens, it could cause Facebook’s stock price to fall further, and that really would be dangerous for Zuckerberg.
An employee revolt is one of the few things that could topple him, and shares are a key part of the company’s compensation regime.
So perhaps in December 2019 Zuckerberg will find he can watch his year-in-review video without flinching. Or perhaps he will find himself wishing for the relative peace of 2018.
– © The Daily Telegraph
