Numlock Sunday: Max Fisher on The Chaos Machine
By Walt Hickey
Welcome to the Numlock Sunday edition.
This week in another special podcast edition of the newsletter, I spoke to Max Fisher, author of the new book The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World.
It’s a fascinating book that looks at the science — the neurology, the social science, the psychology — of what social media usage does to us. It’s riveting and provocative and will definitely change the way you view social media apps.
This interview has been condensed and edited.
You are the author of a new book called The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. I thought this was a really interesting topic for you, because normally you're a national security reporter. You cover a lot of different international events. I was really struck by why you were drawn to social media, but the more that I learned about it, the more it made sense. Do you want to talk about how you fell into this?
Like you said, my background for years has been international reporting, so reporting on global politics and conflicts and wars. I did not think of social media as a story that was for me, or something that I frankly paid a lot of attention to. I thought these are just websites, it's just apps on your phone. How significant can it really be other than as a tech or business story? And that started to change for me, the way that I think it did for a lot of us, after the 2016 election, where there was kind of this sense that social media had something to do with Trump's election, but nobody was really quite sure what it was. Something was going on with the platforms: they're very polarizing, and there's a lot of misinformation on them. There are all of these weird, crazy groups and online subcultures on them that all seem to be converging on Trumpism.
I, like most people, still thought, well, these platforms are just reflections of what's already happening in the world, or at most they're experienced like any website or any publication you would read, and maybe there's just a little more misinformation or garbage on them than in other places.
That really started to more significantly change for me about a year after Trump's election when I went to Myanmar to report on the genocide there, which of course was this horrible and very sudden explosion of just complete societal violence between the majority Buddhist group and the Muslim minority.
When I was there, I had the same experience that everyone who was reporting on the genocide there at the time had, which is that social media seemed to be just everywhere in the story. I don't just mean everywhere in that everyone you would talk to would be citing things back to social media, things they'd seen there, groups they organized there, social media being involved, although that was also a big part of it. But also in the sense that it was just very clear, although in this way that was really hard to define, that social media was playing a much more active role in what was happening.
A lot of the hate speech and a lot of the incitement and this general sense of a societal movement to destroy an entire minority population was something that had emerged on the platforms, and in the way that people were using the platforms. What they were experiencing on Facebook, especially, but also WhatsApp and Twitter, was pulling something out in them that had not been there just a few years before, when social media had been completely absent from the country. Due to sanctions before about 2016, you couldn't get a cell phone, you couldn't get social media. And then all of a sudden social media was everywhere and then society took this huge shift.
Shortly after I was there, even the United Nations had concluded that social media had played such an enormous role, that one of their officials said that Facebook had played a "determining role in causing the genocide," which was a crazy thing to hear, that just this website would be driving something so enormous and so wide-scale and something that felt like it was coming from up out of the ground, but maybe it was also coming from these platforms.
It seems, just again, so much of what we've been talking about in just a general sense is social media as a business story. But I love how this happened accidentally, almost, because you just realized, no, they're just actually a social accelerant in different countries.
Yeah, right. It's actually playing an active and really significant role in the way politics works and the way the society works. That was kind of the fuzzy sense that I, like a lot of people, was getting. Then I started to notice, because of my job, I would travel around to lots of different countries for different stories or just to report or get a feel for things. Everywhere I went, I would hear over and over again these stories that would link back to social media, that would be like a microcosm of the Trump phenomenon or the Myanmar phenomenon or usually kind of both.
It would be smaller scale because obviously things of that scale only happen a few times in a century, but it would be a village that would suddenly combust into this crazy violence over rumors that had spread on social media. Or it would be a town that would get overtaken by this mass hysteria that would link back to YouTube. Or it would be this far-right figure or this far-right group who had always been on the fringes of society and then social media came in and all of a sudden that far-right group or figure was the most popular thing and completely running the culture and then would get elected to some local office.
It started happening over and over again. That was when I thought, okay, there's enough of a pattern here that it's worth trying to understand, why does this keep happening? What is it about social media that seems to be at least potentially having this extreme of an effect on the way that societies and politics work? Why is it the same pattern over and over again?
And that started for me in early 2018, and it became a series of stories for the paper. Then it became the book of trying to answer this question of, what is social media? How is it changing us? How does it change our behavior? How does it change the way that our minds work? How does it change our politics?
I tried to pull in for that a lot of traditional on-the-ground reporting of finding a story of a place or people that had been affected by this and then retracing step by step how it had happened, what it had to do with social media.
But also, and this is the part of the book that I'm really proud of because I feel like it's the first time it's been done on this scale, is to try to pull in every relevant field of scientific inquiry that was looking into this, because I wasn't the only one who was having this realization. There was also this whole constellation of neuroscientists and social psychologists and social scientists who were all having the same sense and were trying to empirically answer these same questions, pulling together a lot of their research. There's some original research of theirs that appears in the book. But to try to get a sense for how is this happening? Why is it happening? And what does it mean for us as a species?
I've been reading your work a long time; one thing that I've always admired about it was you do really go to the mattresses when it comes to figuring out the research and actually what's being done in the academic world as well as the scientific world, as well as all that. And so I have been looking forward to that component of this. You want to talk a little bit about what the science is beginning to reveal about how social media gets its hooks in us?
Oh, man. It's a big question.
A book-length question, one might say.
Right. Let me give you a couple of examples to give you a sense for how people are starting to understand it. These are just teeny tiny tips of the iceberg of understanding, because to understand something as huge as the effect of social media on society, there are like 18 different steps in the chain of you interacting with a post and then that happening on a scale of billions of people and then society changes. There are so many things that happen in that system that you can't understand all of them, but I couldn't possibly relay all of them in one anecdote, so I'll give you two.
The first is, there was one study where these social scientists wanted to understand, okay, does social media actually change us? Is it just, we happen to be on the internet and other things are changing? Can we actually narrow this down?
They took these two really big groups of people, an experimental group and a control group, and they had the experimental group deactivate Facebook for four weeks. So just four weeks, which relative to the amount of time that we all spend on these platforms — 10 to 15 years into the social media era — is very small. And just one platform, not even all of social media. You would expect the effects to be very small. Then over those four weeks, they monitored any possible thing they could think to measure. What's changing with these people? How are they changing the way that they think about the world, the way that they interact with the world? They found two really significant changes in the people who deactivated Facebook for those four weeks. The first was that they became just much happier. They had an increase in happiness and life satisfaction equivalent to about a third the effect of going to therapy, which blew my mind.
I know. Because therapy has a huge effect on your happiness and it's also really expensive, but turning off Facebook is free. That was one of a lot of pieces of research that support this theory, that is now very widely accepted, which is that social media is addictive, physically addictive, and that it creates a chemical reaction in your brain that makes you feel compelled to go back to it. That is a piece of evidence that we don't use it because we like to use it or because it makes us happy, but rather we use it because we've been addicted, even if we hide that from ourselves and we tell ourselves that we just want to pull up Facebook or Twitter.
The second thing that they found was that those people who had deactivated Facebook, their level of political and social polarization changed pretty dramatically. It was political and social issue polarization, which means the degree to which they were polarized on issues that were salient in society rather than overall how they kind of viewed the world. They found that that reduction in their polarization was equivalent to 50 percent of the overall increase in polarization in American life over the past 25 years, which is the entire cycle of the polarization of American politics. The researchers, if they were here, they would be grabbing at my shirt to emphasize that that doesn't mean social media drove 50 percent of overall polarization in American life. But it does mean that its role in the way that we as individuals experience that polarization is extremely dramatic.
Of course, this is just a few thousand people for this study, but if you ask yourself, okay, what about if that's all of society, then the effect starts to become pretty dramatic. That's when it starts to change politics, overall. If everyone is 50 percent more polarized than they would be if they turned off just this one platform, God knows how much less polarized they would be if they turned off all the platforms and for a longer period of time. That was one piece of evidence that didn't measure how social media changes you, but was just really one of many very strong pieces of evidence that it does change you in these ways that we kind of have a fuzzy sense that they do, but okay, it really shows you that it does.
Another study is one in the book that I cite the most, not because I think it's the most consequential, but because it hit really close to home for me personally. Everybody I know who's like you or me, who's very engaged in media or in politics, who spends a lot of time online, it's like, whoa, okay, that's a little scary, that did show how it changes you. For this study, a group of researchers took a group of people, and before the experiment, they tested them all on their level of internal outrage. So how prone were they to outrage as people, and they gauged this.
Then they had the experimental group of the people in this study send a fake tweet that expressed outrage, on a fake platform that was built to look like Twitter so that they could control the experience. Even if these are people who didn't really want to send an outrage tweet, they would say, “You have to send it.” For the group of these people who sent the outrage-filled tweet, they would show it back to them later and they would show it with a lot of engagements on it, a lot of retweets and likes.
This is something that we know the platforms do, because there are these other experiments that show that if you have outrage words in your post its reach will be dramatically amplified. Sometimes you think, oh, outrage travels well because people respond to it. That's actually not why it travels well. The reason it travels well is because the platforms will deliberately pull it out and then shove it in front of a lot of people to engage them because it's this very charged emotion that gets a lot of participation.
They've got a thumb on the scale.
They've got their thumb way on the scale.
So if you send an outrage tweet, it will get engagement almost certainly because the platform has ensured that it will, because it's a great way to keep you on the platform and keep your friends on the platform. What they did in this experiment is they would show people that their outrage tweet had done well, and they found that it made those people more inclined to send more outrage tweets in the future. If they went through this cycle a few times with people, had them send a few outrage tweets, the really stunning thing that they found is that these subjects in the experiment, even if they had not been prone to outrage beforehand, even if they were not outrage-inclined people, would become that way. They would become not just more inclined to send tweets with outrage in them, but even when they were away from the computer, even when they were away from social media, their internal nature had become much more outrage-prone.
This training that they had received on the platform because they'd gotten this positive social reward, this is something that hits on this very deep school of social science and social psychology that says that our sense of morality, of right and wrong, is something that we derive heavily from social cues. If we think our community of people around us really want us to behave a certain way and will reward us if we do, we become internally more prone to chase and to seek out that behavior, not just because superficially we want the positive attention, but because our minds have tricked us into wanting to do that in order to get in good with our community because of the nature of the way that we evolved and just how we are as a species.
That was something that really blew my mind, because it shows you that the platforms are deliberately inculcating a type of activity that doesn't just change how you behave with your own social media, but that changes your internal nature. It changes the way that your emotions are. It changes the way that you behave.
And when you start to see these — because there are dozens of examples like this in the book, of these kinds of changes that it imbues in you — when you see all of these, and then you see that it's the overwhelming majority of Americans engaging with these systems dozens of times a day, American life today starts to make a little bit more sense. You start to see this kind of training effect and this change that really does feel like it's been society-wide, is something that is driven, I think, a lot more than we thought, or maybe wanted to admit to ourselves, by these incredibly powerful companies and their technology.
It's so interesting that you call it a training effect, because as you were describing that experiment, I was like, yeah, I've seen that experiment before! If the monkey presses the button and then they receive apple juice, then all of a sudden they're going to really love pressing the button. It's weird that it's that simple, man.
I know. I think this is one of the wild things about social media, is that it is that monkey with a lever. But these companies figured out, not necessarily because they were so insightful about social psychology, although if you go back about 10 or 15 years, there would be very open discussions in Silicon Valley within the industry about exploiting our cognitive weak points, about training us, about changing our nature in order to —
They hired people out of Vegas to do some of the engagement, is my understanding, too.
They modeled the platforms specifically and deliberately on slot machines, because slot machines are physically addictive. If you look at your phone, it looks like a slot machine. You've got the colors, you've got the flashing lights. You get that haptic feedback when your phone vibrates.
But even more than that, what they were trying to hook into was not just the kind of physically pleasing sense of pulling a slot machine, which is addictive and does make you want to go back to it. But they wanted to, and very successfully did, tap into social needs and social impulses, which is not something that we're used to having manipulated on a physical, chemical, personal level like that. I mean, we might be aware that it's happening with our politics, like, oh, politicians are appealing to our baser nature.
But the platforms have learned how to do it in this, like you said, this kind of monkey-in-an-experiment way, that is both extraordinarily powerful because it bypasses all of the normal social checks and the social norms that we use to mediate our own behavior, mediate one another's behavior, by delivering it through these kind of electric bolts to the brain of these reward systems and punishment systems, but also because its influence is hidden.
I think one of the most important things to understand about social media is that you log on and you think that you're having interactions with all your friends and all the people in your community, and that's where the feedback is coming from. That if you say something that they like and you get a lot of engagement, that means that your friends like it. And if you say something that gets no engagement, that means your friends didn't like it.
But that's actually not what you're experiencing. What you're experiencing are the preferences and choices and desires of these very powerful algorithms and many other systems that are built into the technology, that are deliberately designed to encourage and train certain behaviors in you because those are going to be good for boosting your engagement and for boosting the engagement of people you interact with.
It's interesting you mentioned politicians appealing to our baser natures, which has always been the case. For a long time I was wondering, what's the deal with social media? Is it additive or is it subtractive? Is it chipping away at the social mores that prevent us from being assholes to each other all the time (subtractive)? Or is it giving people new, fascinating ways to be cruel to one another and new ideas about how to do it (additive)?
Over the past couple years, I've come more in line to the latter idea, and over the course of this conversation, I've really come around to that. Putting it to you, what do you think more of it is: Is it social media giving folks new ways to engage and new ways to kind of self-polarize? Or is it just revealing an inherent polarization underneath the hood, just kind of removing some of the guardrails?
I mean, it's all of the above, I think. And when you're talking about something as complex, in the sense that there are many different inputs and outputs, as social polarization in American politics, there are going to be 30 different causes of that, and of course, none of them are going to be the sole cause and driver. The fact that there are 30 all at the same time means they're all kind of multiplying each other. You sometimes hear from the social media companies, they'll be like, "Well, how can you blame us for social polarization when there's a long history of racism in America and racism is playing a role in social polarization?" And it's like, sure, but if your product is worsening that by 10 percent, by 30 percent, by 70 percent, whatever the number is, then that's pretty significant even if there has to be something in there to multiply in the first place.
I heard this quote from this one politician in a country that I went to called Sri Lanka to report on the way that social media had basically blown up the entire country over the course of a couple of months, where he said, "The germs are ours, but Facebook is the wind."
What he meant by that was that there had been racial animus in this country, there had been distrust, there had been weaknesses in the social system before social media got there, but it was the social media systems that amped up and multiplied this, not just in the passive sense that social media multiplies everything, which is another defense that you hear from the companies, but in the sense that these systems have learned, even if the people designing them didn't deliberately design this in, they have learned to home in very specifically on very specific impulses — moral outrage, us-versus-them tribalism, more extreme forms of identity, narrower forms of identity, distrust of institutions — to really home in on these things and to dial those just way the hell up. And to not do that for other forms of sentiment, not with other forms of engagement, like bringing everyone together, or a kind of shared sense of unity and purpose, or just information that is spread because it's true rather than because it's emotionally engaging or negatively engaging.
They just mega amplify those because those are the things that keep us plugged into the platform. The people who run social media companies, they actually have more than enough data to know this by now, because they started running internal experiments over the last few years to try to understand what their systems are doing. They're doing the same version of what social scientists have been doing from the outside, except they're inside the company so they have a lot more data that they can work with.
And all of their own internal researchers reach the exact same conclusions, that these platforms drive people toward very specific kinds of conspiracy theories, that they create very specific kinds of identity, the most extreme of which is QAnon, but you see things like QAnon over and over on the platforms because that's what gets people to engage more.
Even like Harry Styles and Chris Pine, that got real QAnon really goddamn quick in the course of like a day and a half.
That's actually a great example, because it's something where you pick a side: you're team Harry Styles, you're team Chris Pine, you're team Florence Pugh. And then that becomes a group identity on social media, that like, "Hey, we all agree that Florence Pugh is the best. And we all agree that the people who support these other celebrities are the absolute worst. And I'm going to make posting all day about how mad I am at people in this social out group, which is Chris Pine fans, which I didn't know I hated until 10 minutes ago, but now absolutely hate, my whole deal and my whole identity."
You see, it's exactly the kind of thing that does really, really well at boosting engagement. Social media did not invent Chris Pine and Harry Styles getting in a fight, but it did invent turning the fandom wars over it into just a whole-ass identity for seemingly a really large number of people. I think that that's actually a useful way to separate out what's the difference between what social media does and what are the preexisting things that it pulls from.
My hope is that the last chapter of this book offers some solutions to this problem. I would like that a lot, because it seems like there's a lot of problem here.
I guess, let's do this two ways. One, what would you recommend people personally do in their own lives and with their loved ones regarding social media? And then two, what are the big solutions that you're kind of looking at as a way to address some of these problems?
Sure. So there's a stock list of tips that I give, but I think what's more important than the specific tips is what ties them all together. I'll tell you the tips and then I'll tell you why they're important and what they tie together.
Limit your time. Obviously, limit your time on social media. Give yourself specific times to go on it. It would be easy to say, just delete all the platforms, never go online, throw your smartphone away and live in a cabin in the woods as a poet. But most of us can't do that. We have to be online. We have to be on social media because these platforms have completely conquered the way that we relate to one another and relate to the news.
Turning your phone on grayscale is actually a really, really effective way to make social media less addictive.
Yeah. On iPhones it's really easy. You go into Settings, under Accessibility, and set the Accessibility Shortcut to Color Filters, which turns the screen grayscale; then tapping the side button three times toggles between color and grayscale. And because they're designed to be visually addictive, if you turn it on grayscale, you will just find that the emotional effect from being on the platforms goes way, way down and the ease of turning it off goes way, way up. It's amazing how much of a change you will see from that, which is again, proof that you're opening it because you've been addicted, not because you want to open it.
If you are using it, try — and this is a hard thing to sell to people, especially in a time of high stakes in politics and deep political polarization and where it feels like every election is maybe going to be the last election in American democracy — try not to outrage post. And if you do, quote tweeting, where you take someone else's post that you don't like, that you thought was dumb, and then quoting it, and then adding a comment about, "Look at this idiot, what they had to say," try to just never do that at all.
The reason not to do it isn't because that person doesn't deserve it. I'm not someone who worries about like, "Oh no, the coarsening of our discourse. Why can't we all get along?" Because there are good reasons we cannot all get along and there are some stakes for our politics right now. The reason not to do it is because, first of all, you're not actually adding anything. Probably this person you're quote tweeting doesn't matter. But also that is one of the most powerful ways that the system trains you to be prone to outrage, to stop reading, to shut down the intellectual and rigorous part of your mind and just engage the monkey brain, dopamine-chasing part of your mind.
If you just stop doing that for a month, I think you will find a really pronounced difference in social media. Try not to dogpile people.
Again, these are things where it's not just like, “Oh, it's not nice to do,” but it's because when you do that, you are complicit in the way that these mega companies are training you to use their product more and more, so you're taking more and more pulls of the cigarette.
You're the centrifuge that is really increasing the volatility of the environment.
If you're on YouTube, open it in incognito mode, because if you were logged in, or even if you're not logged in, but you are using it on non-incognito mode in your browser, it will track your views very carefully and it will serve you up related content that is going to be as likely to hook you in as possible, which is just a great way to be shown stuff incrementally over time, even if it's not immediately obvious, that is not healthy for you.
What all of these little tips have in common is they're all about learning to see social media as a drug, which it is. It's a drug in the sense that it changes your brain chemistry. It's a drug in the sense that it's addictive. And it's a drug in the sense that when you're using it, your behavior changes, parts of your brain shut down, your emotions change, which is true of any drug that we use.
It's also a piece of advice I'd like to give because, while there are a lot of drugs we can't do, there are quite a few that we have all kind of decided are worth it for us to do a little bit of occasionally. I had a cup of coffee this morning. I'll probably have, it's Friday, one to two glasses of wine tonight, maybe even three. But I know, because I understand that these are drugs, that they change my behavior, that they're not always good for me. I know to take them in certain ways that I have come to learn are healthy for me. I know not to drive a car after I have a certain number of glasses of wine. I know not to read certain things if I've had a drink because they might make me upset, or get into certain kinds of social situations. I also know that if I start to feel a certain way after I've had a drink or two, like I start to get annoyed with a friend who I'm out with, I know internally, okay, that's not me, that's the alcohol that's making me feel that way, and it becomes easier to mentally separate yourself from that.
I think when you use social media that way it becomes so much easier to use it responsibly because you come to see maybe there are certain kinds of activities that you try not to do on social media, just like you try not to do certain kinds of activities when you're on a drug or when you're on alcohol, because you know it's not healthy and it's not safe for you. And you also come to see the difference between, okay, I'm on Twitter, I'm on Facebook, and I'm feeling a certain way.
But now I understand because I know Twitter and Facebook are drugs that they are making me feel that way. It's not actually something real that's happening in the world. It's not something real that is happening in this conversation that I'm having that's making me feel that way, so I want to disengage from it. That's my number one tip for using it safely. It's just understanding what it does to you makes it much easier to kind of take a step back from it, I think.
That's great. The book is called The Chaos Machine. It is the inside story of how social media rewired our minds and our world. Why don't you tell folks a little bit about it, where they can find it, and where they can find you.
It is, as you would expect, everywhere books are sold. It got some good placements at Barnes and Noble and especially in independent bookstores, which has been great. It's on the major online shopping website. I know surprisingly a really large number of people who've gotten the audiobook. I'm not really sure why that is or why people love going to the audiobook for this one.
It's basically just a podcast that has a point.
Isn't that all books? Aren't all books podcasts with a point?
Listen, we don't want to crack this one wide open too quick. Where can folks find the book?
Everywhere you buy books. You can find me, unfortunately, on Twitter at @Max_Fisher. That's really the only public-facing platform I use. That's another piece of advice, limit yourself to one public-facing platform. And yeah, I hope that people read it and enjoy it. And if you do, I would love to hear from you.
Yes. And if they do, they should quote tweet you dunking on it, so that the algorithm rewards the tweet and accelerates it.
If you have anything you’d like to see in this Sunday special, shoot me an email. Comment below! Thanks for reading, and thanks so much for supporting Numlock.