On social media, as Sinan Aral makes abundantly clear, fake news travels farther and faster than true stories, often by an order of magnitude. Powered by emotionally arousing content and the “in the know” social status it confers on those who pass it on, fake news has three times the effect on stock prices that real news does, and is strongly suspected of having moved critical voting blocs in elections around the world. There is “a reality-distortion machine in the pipes of social media platforms,” writes Aral in The Hype Machine, “through which falsehood travels like lightning, while the truth drips along like molasses.”
Aral, an MIT management professor and one of the world’s leading experts on fake news, is uniquely positioned to rip the covers off the inner workings of social media. In crisp prose that explores numerous scientific studies while drawing on decades of experience inside the major tech firms, Aral sets out how social media’s “promise and peril” are tightly bound: the factors that allow it to provide vital health information during a pandemic are the same ones that allow it to spread potentially lethal pseudo-cures. The results are rigorously analytical and well argued, but what really makes Aral’s case for him is the emotional punch packed by his bad news about the fake news.
That’s just as well, because society really needs to focus its attention on false information. In spite of the stock manipulations and Russian electoral interference, we ain’t seen nothing yet. The age of “deepfakes” is almost upon us. A form of machine learning called generative adversarial networks (GANs) pits two networks against each other: a “generator,” which creates synthetic media, and a “discriminator,” which tries to determine whether the content is real or fake. The first learns from the second and, like social media in general, the process feeds on itself. Fake video is already astonishingly convincing. If seeing is believing (and studies show humans retain 10 per cent of what they read but 95 per cent of the “messages” they watch), fake news will soon be more alluring than ever.
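The adversarial loop behind GANs is easy to see in miniature. The sketch below is a hypothetical toy illustration (not from Aral's book): the "real media" is just a bell curve of numbers, the generator is a two-parameter formula, and the discriminator is a simple logistic classifier. Each side improves by exploiting the other's mistakes, which is exactly the self-feeding dynamic the paragraph describes.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mean(xs):
    return sum(xs) / len(xs)

# "Real" media: samples from a normal distribution centred on 4.0.
# Generator G(z) = a*z + b tries to produce samples that look real;
# discriminator D(x) = sigmoid(w*x + c) tries to tell them apart.
a, b = 1.0, 0.0   # generator parameters (starts far from the real data)
w, c = 0.1, 0.0   # discriminator parameters
lr, batch = 0.02, 64

for step in range(4000):
    # --- discriminator step: push D toward 1 on real, 0 on fake ---
    real = [random.gauss(4.0, 1.0) for _ in range(batch)]
    fake = [a * random.gauss(0.0, 1.0) + b for _ in range(batch)]
    p_real = [sigmoid(w * x + c) for x in real]
    p_fake = [sigmoid(w * x + c) for x in fake]
    gw = mean([-(1 - pr) * x for pr, x in zip(p_real, real)]) + \
         mean([pf * x for pf, x in zip(p_fake, fake)])
    gc = mean([-(1 - pr) for pr in p_real]) + mean(p_fake)
    w -= lr * gw
    c -= lr * gc

    # --- generator step: adjust (a, b) so fakes fool the discriminator ---
    zs = [random.gauss(0.0, 1.0) for _ in range(batch)]
    fake = [a * z + b for z in zs]
    p_fake = [sigmoid(w * x + c) for x in fake]
    grads = [-(1 - pf) * w for pf in p_fake]   # gradient of -log D(fake)
    a -= lr * mean([g * z for g, z in zip(grads, zs)])
    b -= lr * mean(grads)

samples = [a * random.gauss(0.0, 1.0) + b for _ in range(5000)]
print(round(mean(samples), 1))  # generator output drifts toward the real mean of 4.0
```

Even this toy version shows the dynamic the review warns about: every improvement in the detector immediately trains a better forger, with no outside referee in the loop.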
If contemporary society wants to corral the peril of social media while retaining its promise, writes Aral, it needs to act quickly. There are four levers that stakeholders—companies, governments and users—can pull, separately or in combination: the code that governs social platforms; the money (meaning the financial incentives generated by their business models); the norms followed by the industry; and the laws that regulate it.
The social media feedback loop fosters polarization through our catalogued preferences and via friend recommendations. Tests that inject what one researcher called “an equal share of pro- and counter-attitudinal news” onto Facebook feeds found that political polarization shrank almost as much as it’s grown over the past two decades. So consider, suggests Aral, having another look at the “like” button, and replacing it with a “truth” button (for content a user thinks is true) or a “reliability” button (when we respect the source). Perhaps a “veracity” score could be added to Twitter profiles, a grade based on fact-checkers’ assessment of previous posts. If high truth scores turn out to attract followers on those two platforms, such code changes will nudge the hype machine toward positivity in its feedback loop, since—just as the rich get richer—on social media the popular get more popular. Industry norms are already moving, if slowly, in that direction, Aral writes, arguing that users have “largely accepted that on Twitter retweets do not necessarily mean endorsements.”
If those developments sound too pie in the sky for pragmatic sorts, consider the tried and true: Follow the money. In the U.S., that leads immediately to antitrust considerations and the question of whether Facebook, in particular, should be broken up. Aral demurs. There are classic monopoly issues with Amazon, which sells its own products on its platform and can game the rules to promote them over competitors’ wares. But Facebook doesn’t use its monopoly power to exact more money from consumers—Facebook doesn’t charge them anything. What makes it a monopoly, in the sense that startup competitors wouldn’t have a chance against it, is its lack of “interoperability.” Ask Facebook for your “social graph” (a database of contacts that can easily operate with another network) and you won’t get anything “near as useful” for you, or as hazardous to Facebook’s profitability.
That brings Aral to his fourth lever, the law: national governments will have to mandate “that technology platforms make their data, and specifically their social networks, portable,” in the same way the American FCC mandated phone-number portability in the telecom business in 1996. Social media badly needs competition, to force its thinking away from the value it “extracts” from consumers and toward the value it “delivers.” The best way to do that, Aral judges, is to enforce portability and enable the rise of platforms that just might brandish truth and reliability buttons. That, along with a handful of other monetary tweaks, such as reducing the profitability of fake news (YouTube has already removed ads from anti-vaccination videos), is all we can get from government and the platforms themselves without damaging social media’s promise along with its peril.
For Aral, the rest is up to us, the third stakeholder. Once competition arrives, “consumers will need to define the values they want the platforms to deliver, and to enforce those values by doing business only with platforms that make good on them,” he writes. It will be a slow, incremental process, he concludes, but it will be our essential tool in taming the hype machine.