On Twitter, in a thread that went viral, François Chollet, an A.I. software engineer at Google, argued, “Facebook is, in effect, in control of your political beliefs and your worldview.” Photograph by Joe Penniston / Flickr

Mark Zuckerberg, the founder and C.E.O. of Facebook, recently admitted that his company had known since 2015 that the data firm Cambridge Analytica, which assisted with Donald Trump’s election campaign, had improperly acquired information on 50 million Facebook users. “This was a breach of trust,” Zuckerberg said in a Facebook post. “We need to fix that.”

But that’s not the only thing Facebook needs to fix. “The problem with Facebook is not just the loss of your privacy and the fact that it can be used as a totalitarian panopticon,” said François Chollet, an artificial intelligence and machine learning software engineer at Google, in a tweet yesterday. “The more worrying issue, in my opinion, is its use of digital information consumption as a psychological control vector.” He elaborated on this point in a thread that’s been shared thousands of times. I caught it when global-surveillance critic and The Intercept writer Glenn Greenwald quote-retweeted Chollet, calling it a “great thread” on “Facebook’s menacing use” of psychology-manipulating A.I. “But remember,” Greenwald added, “Google is also a major exploiter of artificial intelligence with very little transparency or public oversight.”

Chollet bristled at this. “This is the laziest kind of thinking—just because two things share some superficial similarity (they’re large tech [companies]) doesn’t mean they’re equivalent,” he said. Why? Look to the Newsfeed, Facebook’s signature feature, Chollet went on in the same thread. “If Facebook gets to decide, over the span of many years, which news you will see (real or fake), whose political status updates you’ll see, and who will see yours, then Facebook is, in effect, in control of your political beliefs and your worldview.”

Fil Menczer, a professor of informatics and computer science at Indiana University, told Nautilus, in a discussion about his work on fake news, that the algorithms social media platforms like Facebook use bias our decision-making in ways that exploit our social and cognitive biases—but search engines like Google aren’t entirely innocent.

“The algorithmic biases feed into social and cognitive biases, like confirmation bias, which in turn feed into the algorithmic biases. Before, people were looking at the evening news on TV, or reading the local paper, for example. But the fact that the medium has changed to online social networks, where you shape the sources of information to which you are exposed, now means that you become even more vulnerable,” he said. “Search engines and social media, for example, try to predict what content may be most engaging for someone. Ranking algorithms use popularity as one of the ingredients in their formulas. That means that the more people in your group interact or engage with a piece of fake news, the more likely you are to see it. The social network can act as an amplifier because the people near you have opinions similar to you, so they are more likely to be tricked by a certain kind of fake news, which means you are more likely to see it, too.”
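The loop Menczer describes can be caricatured in a few lines of code. Below is a minimal Python sketch, not any platform’s actual formula: a ranking score blends a story’s personal relevance with its popularity among your friends, and the group’s assumed receptivity to a fake story feeds back into its ranking. Every name, weight, and probability here is an illustrative assumption.

```python
import random

random.seed(42)  # reproducible toy run

def rank_score(relevance, engagement_share, popularity_weight=0.7):
    """Blend an item's personal relevance with its popularity in your network."""
    return (1 - popularity_weight) * relevance + popularity_weight * engagement_share

# Two competing stories: the fake one is less relevant on its own merits,
# but the (assumed) like-minded friend group is more receptive to it.
stories = {
    "fake": {"relevance": 0.3, "engagements": 1},  # seeded by one friend's share
    "real": {"relevance": 0.6, "engagements": 1},
}
receptivity = {"fake": 0.8, "real": 0.4}  # assumed engagement rates if shown

for _ in range(100):  # 100 friends scroll their feeds
    total = sum(s["engagements"] for s in stories.values())
    for name, s in stories.items():
        score = rank_score(s["relevance"], s["engagements"] / total)
        # Feedback loop: a higher score means the story is shown more often,
        # which yields more engagements, which raises the score next round.
        if random.random() < score and random.random() < receptivity[name]:
            s["engagements"] += 1

for name, s in stories.items():
    print(name, "engagements:", s["engagements"])
```

In most runs of this toy model, the less relevant fake story ends up with more engagements than the more relevant real one, because early popularity inside a receptive group compounds—the amplifier effect Menczer describes.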

François Chollet. Image from Google Developers / YouTube

For Chollet, though, the sort of danger Facebook poses is unique. “There’s only one company where the product is an opaque algorithmic newsfeed, that has been running large-scale mood/opinion manipulation experiments, that is neck-deep in an election manipulation scandal, that has shown time and time again to have morally bankrupt leadership. Essentially nothing about the threat described applies to Google. Nor Amazon. Nor Apple. It could apply to Twitter, in principle, but in practice it almost entirely doesn’t,” he said. What seems to clinch it for him is that Facebook is ambitiously pursuing advances in A.I. “What do you use AI…for, when your product is a newsfeed?” he wondered. “Personally, it really scares me. If you work in A.I., please don’t help them. Don’t play their game. Don’t participate in their research ecosystem. Please show some conscience.”

He takes his decision to work for his current employer, Google—whose parent company, Alphabet, replaced the famous motto “Don’t be evil” with “Do the right thing”—as a demonstration of scrupulousness. “For me, working at Google is a deliberate choice,” he said. “If I start feeling uncomfortable, I’ll definitely leave. If I had been working at [Facebook], I would have left in 2017.” For now, he’s glad to labor for a company whose products—like Gmail and Android—are “anti-Facebook,” he went on. These empower people, while Facebook’s Newsfeed “seeks to maximally waste your time.”

If Zuckerberg is morally bankrupt, he’s trying to hide it. On Wednesday, he told CNN that he’d be “happy to” testify to Congress and, on Facebook, announced several changes that will, supposedly, rectify what went wrong in the Cambridge Analytica data-breach scandal. “I started Facebook,” he wrote, “and at the end of the day I’m responsible for what happens on our platform.”

This is worth keeping in mind, though: Also at the end of the day, “The motivation for Facebook is not to make you a better person—to improve you morally or intellectually—and it’s not even designed to improve your social group,” Simon DeDeo, an assistant professor at Carnegie Mellon University, where he runs the Laboratory for Social Minds, and external faculty at the Santa Fe Institute, told Nautilus. “It’s designed to make money, to show you things you want to see to hopefully induce you to purchase things.”

Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @brianga11agher.

Victor Gomes, an editorial intern at Nautilus, contributed reporting to this post.
