‘We’re not editors,’ insists Facebook’s news feed chief. Yeah, right
The same old paradox keeps popping up when John Hegeman talks about supporting users rather than directing them
At only 34, John Hegeman is arguably the world’s most powerful editor. As the vice-president in charge of Facebook’s news feed, he runs a team and makes decisions that influence the reading habits of 2.3 billion people.
The algorithms he builds help determine which photos they see, what issues they are aware of and what news articles they read.
Not that he sees it that way.
“No, we don’t think of ourselves as editors,” he says. “The thing we’re not trying to do is show people the content that we want them to see, or that I want them to see. We’re trying to understand which things people find valuable and help them connect with that.
“The primary reason [for what people see] is the actions that person took – what they decided to like, who they decided to be friends with.”
In other words, Facebook doesn’t decide what users see. Users decide, and Facebook supports them in that decision. That is the paradox of the world’s biggest social network. It considers itself a pure, transparent vessel for its users’ desires. It has no ideology, no philosophy; all it does is serve their needs. Yet, to do so, it employs more than 37,000 people, pays them high salaries and constantly reshapes its technology in ways that can affect users’ behaviour and moods.
It takes a lot of work, apparently, to become transparent, and much of that happens in Facebook’s news feed.
Rise to power
If you were asked to imagine a software engineer, you might well imagine Hegeman. He is gangly, a little awkward, with wide eyes behind his glasses. He is, of course, young for a person wielding such power. But his boss, Mark Zuckerberg, is also 34.
“I joined Facebook in 2007, and I don’t think anybody at that time really imagined how large Facebook would become, and the role that it would play in the world,” he says.
Back then, he was studying for a PhD in economics at Stanford University, whose campus near Palo Alto, California, feeds so many graduates and dropouts into Silicon Valley. Hegeman was one of the latter, abandoning his studies to join Facebook. As such, he has spent almost his entire adult life at one company, rising from lowly engineer to his current position.
In 1995, the digital scholar Nicholas Negroponte suggested that people would one day get their news from a “Daily Me”, a virtual newspaper that customised itself to its readers’ tastes. Facebook’s news feed extends that principle. Users opt in to seeing posts, pictures and news articles by becoming friends with or following the people and pages that produce them. From this “inventory” of eligible content, the news feed’s algorithms pick the items they believe the user will be most likely to interact with, assigning a numerical rank that determines their prominence.
If its predictions are right, it learns to do the same in future; if they are wrong, it rethinks them. It is Facebook’s primary product, its front page and, for now, the source of most of its advertising revenue. In Britain, it is the third-most-used news source, behind only BBC One and ITV.
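The mechanics are easier to picture in code. The Python sketch below shows the shape of such a ranking step under stated assumptions: each eligible post arrives with model-predicted interaction probabilities, is collapsed into a single numerical score, and the “inventory” is sorted by that score. The Post type, the predictions and the weights are all hypothetical stand-ins; Facebook’s actual system is not public.

```python
# A minimal sketch of the ranking step described above, assuming a
# hypothetical Post type with pre-computed interaction predictions.
# Facebook's real models and signals are not public; everything here
# is an illustrative stand-in.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author: str
    predicted_click_prob: float  # hypothetical model output in [0, 1]
    predicted_like_prob: float   # hypothetical model output in [0, 1]

def relevance_score(post: Post) -> float:
    """Collapse the interaction predictions into one numerical rank."""
    # The 0.7/0.3 weighting is invented for illustration.
    return 0.7 * post.predicted_click_prob + 0.3 * post.predicted_like_prob

def rank_feed(inventory: list[Post]) -> list[Post]:
    """Order the eligible 'inventory' so the highest-scoring items lead."""
    return sorted(inventory, key=relevance_score, reverse=True)
```

The design matters more than the numbers: because the rank is just a sorted score, any change to the weights quietly reorders what billions of people see first.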
The news feed was always controversial because it involved Facebook choosing, on a scale far beyond any human editor, which news articles should receive prominence. Over time, the feed also learned that users reliably interact with the most infuriating, shocking and sensational content. By assigning such content a higher ranking, it created a feedback loop that is well attested in academic research. Nonsense spread from page to page and brain to brain like a parasite that drives its hosts to infect others. It has even been claimed that this process may have contributed to the alleged genocide in Burma, where most people use “Facebook” and “the internet” interchangeably.
The world’s biggest news source appeared to be spinning out of control. As such, much of Hegeman’s work has been about fixing the news feed. How to stop Facebook from becoming an angry place, full of furious debates and the mental junk food of viral videos? By tweaking the news feed’s algorithms to prioritise “friends and family” (as Zuckerberg announced in January 2018). How to limit the spread of fake news and conspiracy theories? By “down-ranking” articles flagged by fact-checkers. How to reassure people that Facebook is not a shadowy editor? By adding a button to every news feed item that tells users why it has appeared.
Hegeman is in his element as he explains how all this works. One of the predictions the news feed now makes is whether or not users will consider a certain article to be “clickbait”. The news feed notices when people click on an article and then close it seconds later, or when large numbers of people comment using words like “fake” and “false”.
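Those two signals, quick “bounces” and comments crying foul, are simple enough to express as a heuristic. The sketch below is purely illustrative: the function name, thresholds and cutoffs are assumptions, not anything Facebook has disclosed.

```python
# An illustrative heuristic for the two clickbait signals just described:
# quick "bounces" (click, then close within seconds) and comments using
# words like "fake" and "false". The function name, thresholds and
# cutoffs are all assumptions, not Facebook's.
def looks_like_clickbait(dwell_times_sec: list[float],
                         comments: list[str],
                         bounce_threshold_sec: float = 10.0,
                         bounce_rate_cutoff: float = 0.5,
                         flag_rate_cutoff: float = 0.2) -> bool:
    """Flag an article when readers bounce quickly or comments cry foul."""
    if not dwell_times_sec or not comments:
        return False
    # Share of readers who closed the article within a few seconds.
    bounce_rate = (sum(t < bounce_threshold_sec for t in dwell_times_sec)
                   / len(dwell_times_sec))
    # Share of comments containing the tell-tale words.
    flag_words = {"fake", "false"}
    flagged = sum(any(w in c.lower().split() for w in flag_words)
                  for c in comments)
    return (bounce_rate > bounce_rate_cutoff
            or flagged / len(comments) > flag_rate_cutoff)
```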
“Some of these problems are pretty adversarial in nature,” says Hegeman. “Bad actors who are trying to game the system to maximise distribution are always evolving their tactics, so even though we’ve made a lot of progress that doesn’t mean we can step back.” Sometimes, he admits, this involves trying to change the behaviour of news organisations – another example of Facebook’s editorial power. The news feed, he says, needs to make sure that publishers “write good content with good headlines”, and not “low-quality” articles with “deceptive” headlines.
He will not name specific measures that Facebook has taken, stressing only that his engineers are always adding new “signals” (data for the algorithms) and tweaking the extent to which different signals contribute to relevance scores.
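What that tuning might look like in practice can be sketched, if only as a guess. The Python below treats each named signal as a number multiplied by an adjustable weight, and folds in the “down-ranking” of fact-checked articles mentioned earlier. Every signal name, weight and factor here is an assumption.

```python
# A hedged sketch of how named "signals" might feed a relevance score
# through adjustable weights, with fact-checker flags triggering the
# "down-ranking" mentioned earlier. Signal names, weights and the
# down-ranking factor are invented for illustration.
SIGNAL_WEIGHTS = {
    "from_friend_or_family": 2.0,  # boosted after the January 2018 change
    "predicted_engagement": 1.0,
    "recency": 0.5,
}

def relevance(signals: dict[str, float],
              flagged_by_fact_checkers: bool) -> float:
    """Weighted sum of signals; flagged items are sharply down-ranked."""
    score = sum(SIGNAL_WEIGHTS.get(name, 0.0) * value
                for name, value in signals.items())
    if flagged_by_fact_checkers:
        score *= 0.2  # hypothetical down-ranking multiplier
    return score
```

On this reading, adding a new signal means adding one entry to the weight table and re-tuning; the shape, if not the scale, matches what Hegeman describes.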
There is some doubt over how well this is working. According to Newswhip, a social media analytics company, Facebook users’ engagement with content from the rest of the internet was higher than ever in 2018, despite Zuckerberg’s focus on “friends and family”. The top stories were divisive, shocking or untrue; one was a brief report from an American news site about a “child predator” prowling “in our area” without specifying where.
Hegeman simply says that Facebook’s evidence differs. “We have seen our internal tracking of this, as well as a few external studies, all saying the same thing: that there have been pretty dramatic reductions in the amount of traffic going to sites that spread misinformation,” he says.
But here that paradox comes up again. If Facebook is not an editor, why does it censor fake news and anti-vaccine conspiracies? Hegeman insists that Facebook does have limits. “There absolutely are things that we take a point of view on. It doesn’t matter if somebody wants to see terrorist content, they’re not going to get access to that.”
So, Facebook will give people what they want, unless it is something Facebook does not want to give. The way he describes it, Facebook’s responsibility is mostly technical. But then how does Facebook decide what “good” means? How does it decide the news feed’s values? It must decide somehow, because it is deciding.
For Hegeman the answer lies in being more transparent. “Our responsibility ... is to make sure we not just reflect our point of view but try to make those rules and values reflective of what society feels that we should be doing.”

There is some chance that Facebook will retreat from the news feed business. Zuckerberg has said internet users are moving away from public sharing and towards private messaging, and that Facebook wants to follow them. Hegeman, though, says it will survive. “There’s a lot of things where the news feed is [still] the best way to share ... that will continue indefinitely.”

In that case, Facebook’s grand “responsibility” to promote some posts and bury others will continue indefinitely. Perhaps one day it will admit that it has a philosophy too.

– © Telegraph Media Group Limited (2019)