The promise and peril of social media

Short summary

In The Hype Machine, Sinan Aral talks through the promise and peril of social media. Here are three lessons that will hopefully push you to be more intentional in how you use it:

1- Social media is designed to be habitual, to grab your attention and not let go.

2- It amplifies, scales and hypes choices we make ourselves. It does this through a “sense and suggest loop” in which machine learning watches how we act, changes what we see to maximize engagement, then we act again.

3- This loop is currently not optimized for wisdom, truth or veracity, just engagement. So instead of a wise crowd, we become mobs: segregated, homogeneous groups who see the same things and form the same beliefs.

Want to know more?

This blog post is part of a series I am making called Reading For The Aspirational Self. Don’t think of these as book summaries; that’s not what I’m doing. Instead, I’m drawing out specific lessons that I find particularly interesting, and which I think could work together to help people who share my aspirations. If you, too, want to be present, family-centric, intrinsically motivated and polymathic, I can help.

  • The most distilled version of what I’m offering is a free mailing list designed for learning, “Think On Thursday” – each e-mail will include a lesson designed around the content. Click here for some information on that.
  • The series is also on YouTube in the form of 7-12 minute videos. Here’s the channel link – the video and transcript are below.
  • I’m tweeting excerpts from the videos, as well as some of the story of this project, how we’re doing it, and where it is going, on Twitter. @DaveCBeck

If you want to know more about The Hype Machine, take a look at Aral’s website here.

Starboard reflections,

Dave.

The video

Transcript

The hype machine is what Sinan Aral calls our social media complex. For many people, this is the primary way of interacting with the world, finding out about the world, and living in general. And it’s all about engagement and hype.

In this video, I’ll talk through how social media is designed to addict you, to control and capture your attention. Secondly, we’ll see that most of the effects, both good and bad, that spill out of social networks come from our input; we control the machines as much as they control us. Finally, I’ll talk about the wisdom of crowds and why social media is making us stupider; that’s my wording, not Aral’s.

(section – the addictive nature of social media)  

Social media is designed to be habitual, to grab your attention and not let it go. If you don’t believe that that design is intentional, here’s the founding president of Facebook saying much the same thing: most human beings crave social rewards, and we can now get them at scale through social media.

So this is giving us something we have apparently desired throughout our evolutionary history, and it can now be delivered fairly easily and cheaply by machine intelligence, across the world.

This mainly works through dopamine loops, and it’s most obvious with likes. When you post your first thing on Facebook or send your first tweet, you’re more likely to get likes than you will be later on down the line. The idea is to give you an initial reward, hook you into the service, and get you using it more often. After that, the rewards become more variable; they vary over time, seemingly for no particular reason.

The rationale would be quite hard to decipher: it’s not just about the quality of your content, and it seems almost coincidental. But these rewards are designed to be variable. Those of you who’ve trained a dog, or many other pets, will know how this works. You give the dog a treat for behaving well the first time it does so, you keep doing that for a little while to reinforce the behavior, and then you slowly withdraw the treats and make them slightly more variable. That way the dog actually learns to crave the reward. The dopamine comes from the craving, not from the reward itself.

That’s a common misconception: it’s the craving that leads to the dopamine hits.

And that’s basically how you’re being trained when you use social media nowadays: they’re training you the same way you might train a dog.
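To make the mechanism concrete, here is a toy sketch of a variable reward schedule. This is my own illustration, not anything from Aral and certainly not any platform’s real code; the rates and cut-off are invented. The point is only the shape: near-certain rewards at first, then sparse, unpredictable ones.

```python
import random

def reward(post_number, ramp_down_after=5, early_rate=0.9, variable_rate=0.3):
    """Toy variable reward schedule (illustrative numbers only).

    Early posts are rewarded almost every time to hook the user; later
    posts are rewarded intermittently and unpredictably, which is the
    pattern that builds craving.
    """
    if post_number <= ramp_down_after:
        return random.random() < early_rate    # near-certain early reward
    return random.random() < variable_rate     # sparse, unpredictable reward

random.seed(42)
for post in range(1, 16):
    print(f"post {post:2d}: {'liked!' if reward(post) else 'silence'}")
```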

A question I’d ask: for what other product available to you as a consumer do you see so many people talking about detoxes and taking breaks?

What Aral discusses throughout the book is both the promise and the peril. The addiction doesn’t have to be a bad thing.

One interesting study Aral cites is a 2018 study in which Facebook users were paid to stop using the platform for a while, to work out what economists call the consumer surplus of Facebook. It comes out at about $50 a month.

People who used Facebook more had to be paid a little more to give it up, and people who used it less didn’t need to be paid as much; fairly obvious, you might think. Interestingly, after a period off, many of those users recognized that their previous use had been somewhat habitual and had come with harms.

In particular, a decrease in the attention they could give to what was in front of them in the moment, and a feeling of distraction. They substituted the time with other, more worthwhile activities.

I’m guessing most of those users came back once they stopped being paid, but it’s very interesting to think about Facebook use as a consumer surplus, as something you would have to be paid to give up. The same applies to the rest of the social networks, of course.

(section – feedback loops)

Mostly, what the hype machine, which includes all of the social media networks you might think of, does is amplify the kinds of things we do anyway as human beings in contemporary society. It does this through what Aral calls the “sense and suggest” loop: a machine intelligence senses your behavior, how you interact with the site, how you interact across different sites, and all of the other data that can be collected on you, and builds the platform around you to maximize your engagement.

Once that platform is built, it doesn’t stay static; it changes continuously. Using Facebook as the example again, since it’s the one Aral talks about the most: every time you open your news feed, around 2,000 different items are considered and only a few are displayed to you, with only a handful at the very top, where your attention is most often drawn.

Those are selected based on engagement potential: the likelihood that you’ll react to them emotionally, click on them, or do something else that indicates you’re engaging with the platform. The goal is engagement, and that doesn’t have to be bad or good. It’s just engagement. That’s the goal.
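As a caricature of what that selection step looks like, here’s a minimal sketch. The signals, weights and scoring function are all my invention; the real ranking models are proprietary and vastly more complicated. The one accurate thing about it is the objective: nothing in the score measures truth or usefulness, only likely engagement.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_click: float      # probability the user clicks
    predicted_reaction: float   # probability of an emotional reaction
    predicted_comment: float    # probability the user comments

def engagement_score(item: Item) -> float:
    # Weighted blend of engagement signals; weights are made up.
    return (1.0 * item.predicted_click
            + 2.0 * item.predicted_reaction
            + 3.0 * item.predicted_comment)

def build_feed(candidates: list[Item], slots: int = 5) -> list[Item]:
    # ~2,000 candidates come in; only a handful are shown, highest score first.
    return sorted(candidates, key=engagement_score, reverse=True)[:slots]

feed = build_feed([
    Item("Calm explainer", 0.10, 0.02, 0.01),
    Item("Outrage bait", 0.30, 0.40, 0.25),
])
print([item.title for item in feed])   # "Outrage bait" ranks first
```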

Aral likes the example of Strava, a social running app. He cites a finding that when it’s sunny in New York, people in Phoenix run more, which is a very interesting example of social contagion. The idea is that on apps like this (there are plenty of other fitness trackers out there, and Peloton is probably the most recent example of the kind), you are encouraged to do more healthy exercise by your peers, by other people doing the same thing. When you see a friend run an extra kilometer one day, you’re more likely to run an extra 300 meters that day; that was the rough ratio found in the Strava study.

Aral uses that to argue that our addiction to the hype machine doesn’t have to be a bad thing. It can actually be good for us; in this case it encourages us to take on healthy habits and become healthier individuals.

However, most of what he talks about through the book concerns the perils of the hype machine. These are still cases of our natural behaviors being amplified through the sense and suggest loop, but amplified towards homophily, towards homogeneous groups.

It takes our tendency to congregate with people like ourselves, for instance, and amplifies it through things like friend suggestions, follow suggestions, or “People You May Know” on LinkedIn, which is where this originated. The particular way these features work is by closing triads.

If you know somebody and your friend doesn’t know them, the chances are your friend would be interested in them; there’d be something alike there. The algorithms are proprietary, so I can’t tell you exactly what’s in them, but through closing triads and a variety of other signals, a clear tendency emerges.
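Triad closing itself is simple enough to sketch. Below is a minimal, assumed version of the idea, suggesting friends-of-friends you don’t already know, ranked by how many open triads they would close; it is not any platform’s actual recommender, and the names and network are made up.

```python
from collections import Counter

def suggest_friends(graph: dict[str, set[str]], user: str, top_n: int = 3) -> list[str]:
    # Count, for each friend-of-a-friend you don't already know,
    # how many open triads a new connection would close.
    mutual_counts = Counter()
    for friend in graph[user]:
        for candidate in graph[friend]:
            if candidate != user and candidate not in graph[user]:
                mutual_counts[candidate] += 1
    return [name for name, _ in mutual_counts.most_common(top_n)]

network = {
    "you":  {"ana", "ben"},
    "ana":  {"you", "ben", "cara"},
    "ben":  {"you", "ana", "cara", "dan"},
    "cara": {"ana", "ben"},
    "dan":  {"ben"},
}
print(suggest_friends(network, "you"))   # ['cara', 'dan']: cara closes two triads
```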

People congregate in groups of similar people, and that similarity can be measured in economics, in any of the demographic criteria you might choose to use, or in beliefs; pretty much anything you can measure people by. The tendency is that when we use social media, we congregate in large groups of people who are very similar.

The homogeneity of the groups we congregate in has knock-on implications, because the feed algorithms take the responses of your peers into account. The news and other things you see on Facebook, Twitter, and elsewhere are curated based on things you might find interesting.

And if your friend finds something interesting, especially interesting enough to engage with, or even just interesting enough to hover over, chances are you’ll find it interesting too. That’s one of the signals taken into account, and it’s what leads to what we might call filter bubbles. The effect is measurable.

There was a study done on a German news site in which the fourth spot on the list of stories, and just that spot, was human-curated across 150 million user sessions. The algorithm got more engagement, as you might expect; that’s what it’s optimized for. But the human curation had a very interesting impact. It’s not just that people were more likely to click on that fourth spot, which was more diverse for many users; they also became more diverse in their general content consumption. The other links they chose to click on were influenced by just that one spot on the list.

(section – collective stupidity)

The third point I’d like to draw out is that instead of moving towards a hive mind, or a collective intelligence, which was much of the early promise of social media, we’re becoming a stupid crowd, or a series of stupid mobs. Aral talks about why that promise hasn’t been realized.

There are three key prerequisites for a wise crowd. This is the idea that a group of people together can be more intelligent than any one person could be, even if they are individually less informed. That kind of collective intelligence, the hive mind, works if three prerequisites are met: independence, diversity and equality of views. The views of each individual need to be as independent from each other as possible, which is impossible if everyone in a group is reading the same news, because your sources of information are the same and your views are influenced by those sources.
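A toy simulation makes the independence point visible. In the sketch below (my numbers, not Aral’s), everyone estimates the same true value; one crowd guesses independently, the other mostly absorbs a single shared, wrong source, as in a filter bubble.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0
CROWD_SIZE = 1_000

# Independent crowd: each person's error is their own.
independent = [TRUE_VALUE + random.gauss(0, 20) for _ in range(CROWD_SIZE)]

# Filter-bubble crowd: everyone also absorbs the same wrong signal.
shared_error = 15.0   # suppose the shared source everyone reads overshoots by 15
bubbled = [TRUE_VALUE + 0.8 * shared_error + 0.2 * random.gauss(0, 20)
           for _ in range(CROWD_SIZE)]

print(f"true value:          {TRUE_VALUE}")
print(f"independent crowd:   {statistics.mean(independent):.1f}")   # lands near 100
print(f"filter-bubble crowd: {statistics.mean(bubbled):.1f}")       # stuck around 112
```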

It’s fairly clear that beliefs, as well as outward conformity, are shaped by these filter bubbles. Views also need to be diverse and different, and the filter bubbles undermine that in the same way.

And views need to be equal. On equality, social media has a particularly pervasive impact because of the way influencers work. Even though Facebook is now trying to reduce that, trying to encourage people to follow smaller networks, and there are various other code-based measures Aral discusses that try to reduce the influence of influencers, certain individuals still have an outsized impact compared to everyone else’s views.

So you’re much more likely to see something by Gary V than you are by me.  

But for a wise crowd to form, everybody’s opinion needs to matter equally, and that can’t happen the way social media is currently coded and the way we currently use it.

Aral has a very interesting way of looking at this and proposes many solutions to the problems I’ve listed. These are all problems he recognizes, and he thinks they can be solved by more social media, but different social media. What the solutions basically boil down to is encouraging the code and our norms to evolve together.

The way we use social media is just as important as the code, because of the sense and suggest loop I discussed, but both things need to change for progress. One way of doing this might be for platforms to rate content by confidence, veracity, truthfulness, or however you wish to measure it.

Even wholesomeness could be measured, instead of just engagement. If the platforms changed their code to prioritize things like wholesomeness and truthfulness over engagement, the truth tellers and the people who provide useful, wholesome content would gain greater followings.

And instead of a crowd led by influencers selected for engagement, you’d have a crowd led by truth tellers, by wise people. For Aral, that’s a better solution than what we have now and, as far as he’s concerned, actually a better solution than greater independence of views as well.
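To show the shape of that proposal, here is one last sketch: blend a veracity rating into the ranking score instead of optimizing for engagement alone. The weights, fields and example scores are all mine, purely illustrative of the kind of change Aral is describing.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_engagement: float   # 0..1, how likely users are to react or click
    veracity: float               # 0..1, e.g. from fact-checkers or crowd ratings

def rank(posts: list[Post], veracity_weight: float) -> list[Post]:
    def score(p: Post) -> float:
        return ((1 - veracity_weight) * p.predicted_engagement
                + veracity_weight * p.veracity)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("outrage merchant", predicted_engagement=0.9, veracity=0.20),
    Post("careful reporter", predicted_engagement=0.5, veracity=0.95),
]

print([p.author for p in rank(posts, veracity_weight=0.0)])  # engagement only: outrage wins
print([p.author for p in rank(posts, veracity_weight=0.6)])  # veracity-weighted: reporter wins
```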