What you will learn from reading Social Warming:
– Why the promise of connecting people isn’t always a good idea.
– How radicalisation starts online.
– Why the fake news and misinformation war is an uphill battle.
Social Warming Book Summary:
Social Warming by Charles Arthur is a fantastic book that explores the polarising effects of social media. It starts with why and how people become radicalised in their views on social media, then goes on to explain why it’s almost impossible to fight fake news and misinformation. If you want to understand the current world better, this is definitely the book for you.
Radicalisation can start with Frame Crystallisation:
When it came to the prevention of radicalisation, the problem was that its first steps didn’t look like radicalisation. Instead, it began with ‘frame crystallisation’: identifying and agreeing on who is to blame for a situation, and what should be done about it. But machine-learning systems find it very hard to distinguish between a discussion about a burst water main and one about overthrowing a political party – the latter being the sort of frame crystallisation that could lead, in time, to problems.
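To see why that distinction is so hard to automate, here is a deliberately naive sketch of a keyword-based classifier. The word list and example sentences are invented for illustration; real moderation models are far more sophisticated, but they face a version of the same ambiguity, because the blame-assigning frame looks identical in both cases.

```python
import re

# Invented 'frame crystallisation' vocabulary: words that assign blame
# and demand action. Purely illustrative, not from the book.
BLAME_WORDS = {"blame", "responsible", "they", "must", "act", "fix"}

def blame_score(text: str) -> float:
    """Fraction of the blame-assignment vocabulary present in the text."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & BLAME_WORDS) / len(BLAME_WORDS)

civic = "The council is responsible, they must act now and fix the burst main"
radical = "The party is responsible, they must act now and fix this country"

print(blame_score(civic))    # ~0.83
print(blame_score(radical))  # ~0.83 -- identical frame, very different stakes
```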
Key Concept — Scissor Statements (Statements that cut groups into two):
The error that people make is to think that they can ‘solve’ scissor statements, and that they might therefore be able to convince people on the other side of the logical blades to change their minds. But scissor statements aren’t intended to be bridged; they’re intended to divide.
Social Media and Purity Spirals:
If the outrage is strong enough, a second-order effect takes over: some people will decide that they have even more insight into the topic’s defects than the other people condemning it. This rapidly leads to what the journalist Gavin Haynes calls a ‘purity spiral’, where those who are insufficiently censorious are in turn censured, creating an ever-shrinking group who examine each other for sufficient purity. The logical conclusion of the process leaves just two people arguing over who can condemn something more strongly than the other, and is thus more worthy.
‘It’s a combination of the speed with which a discrete nugget of information or misinformation can spread and this idea of performing in front of an audience, and having these conversations take place in front of an audience,’ Singal says. That has a perverse effect. The purpose is no longer to evaluate truth, or to try to find it; instead, your visibility on the network means you have to signal to everyone how in tune you are with the values you’re apparently meant to hold.
Key Idea — Connecting people isn’t always a good idea:
Facebook is a platform designed for college kids in the US, with a grand mission of connecting people. But what about a country like Myanmar, where ethnic politics run deep, and connecting people who have never before had any way of communicating, from one side of the country to the other, can be a bad thing?
Part of the problem with Facebook is that the platform and community standards rely on certain legal, political and cultural assumptions (such as freedom of speech and rule of law) that do not hold true in the Myanmar context today.
Facebook’s mission to ‘give people the power to build community and bring the world closer together’ doesn’t sound so beneficial when the community is carrying machetes, and they’re being brought closer to people they liken to rats.
A clear picture emerges: the population was switching from getting its news from TV, papers and personal conversations to a disintermediated, unregulated free-for-all where it’s impossible to know what’s true, but where ethnic tensions are running high, and getting higher.
Tech Platforms avoid responsibility:
Tech companies avoid the rules because they don’t directly contribute the hate speech: they provide the platform, not the content. But some platforms enable that content to spread much more easily than others. A nice analogy is this: if the car you’re driving is spewing out more pollutants and ruining the atmosphere more rapidly because of how it was made, finding a fix isn’t the driver’s responsibility. It is the manufacturer’s.
The extreme feedback loop:
There’s a reward system in place: politicians who take extreme positions gain more followers, and those who shift towards extreme positions gain more followers too. The ones in the middle who seek compromise tend to be shot by both sides, and don’t win the popularity that their more extreme colleagues do. The same can be said of sales: those who make more outlandish claims make more sales.
For politicians in this day and age, ‘engagement’ is the name of the game. Those thick-skinned enough to ignore insults could build their following by being controversial and painting complex subjects as binary choices.
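As a toy illustration of that reward system (all parameters here are invented, not taken from the book): if follower growth compounds with how much engagement a position generates, moderates are steadily outgrown by their more extreme colleagues.

```python
# Invented toy model of the engagement feedback loop: follower growth
# scales with how much outrage/engagement a position generates.
# All numbers are illustrative only.

def grow(followers: int, extremity: float, rounds: int = 10) -> int:
    """Compound follower growth; extremity in [0, 1] drives engagement."""
    growth_rate = 0.01 + 0.10 * extremity  # extreme posts travel further
    for _ in range(rounds):
        followers = int(followers * (1 + growth_rate))
    return followers

print(grow(10_000, extremity=0.1))  # moderate: ~12,200 after 10 rounds
print(grow(10_000, extremity=0.9))  # extreme: ~25,900 after 10 rounds
```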
What is News?
After so many years trying to find a definition of news, here it is on a plate. Stuff I care about. Stuff I want to pass on. It’s so simple. Some news is both, some is just one.
Social networks have turbocharged the spread of news. ‘Stuff I want to pass on’ has become plentiful, tuned not to accuracy but to people’s eagerness to share it.
Fake News and an Asymmetric War:
So what are the effects on people of fake news? Generally, you’d expect a lowering of trust in institutions and information sources that don’t agree with what the person wants to hear.
The problem, Binkowski came to realise, was that this was an asymmetric war. Within the parameters Facebook had set, there was no way to stem the flood of misinformation, disinformation and untruths. The penalties for posting fake news were minimal, because the content wasn’t deleted from Facebook, only ‘downranked’. By contrast, the potential rewards from spreading it were substantial, and generating new versions of the same story, by writing a new headline and tweaking a few words and a picture, was cheap.
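A rough sketch of that asymmetry (the reach figures and penalty factor below are invented; Facebook’s actual ranking system is proprietary): a fact-checked post is merely shown less, while a cheaply tweaked copy starts with no penalty at all.

```python
# Invented sketch of the downranking asymmetry described above.
# The scores, penalty factor and reach figures are illustrative only.

DOWNRANK_FACTOR = 0.2  # flagged posts are shown less, but not deleted

def visibility(base_reach: int, flagged: bool) -> int:
    """Expected audience for a post under a downrank-only policy."""
    return int(base_reach * (DOWNRANK_FACTOR if flagged else 1.0))

original = visibility(base_reach=100_000, flagged=True)   # fact-checked, downranked
variant = visibility(base_reach=100_000, flagged=False)   # new headline, not yet flagged

print(original)  # 20000 -- still reaches people despite the flag
print(variant)   # 100000 -- a minutes-of-work copy resets the penalty
```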
And Zuckerberg was overlooking the advantages that the creators of ‘fake news’ have over purveyors of factual news: it’s easier to generate fake news than the real stuff, because you’re untrammelled by reality and you can thus make it both unique and eye-catching, both of which promote its spread.
Fake accounts and fake news:
Accounts that post ‘fake news’ often disappear, so there is no way to hold the author of a Facebook post to account. You don’t even know if it is a real person, so why trust some random account’s claim over the newspaper you’ve taken, in good faith, for years?
There is often no source on fake news posts apart from ‘a good friend’, which might make them sound human, but these words are chosen very carefully and cynically in order to mislead. So the lesson could be said to be this: ‘Whatever you do, do not believe a stranger on social media who disappears into the night.’
Conspiracy theories as stigmatised knowledge:
The trouble with encouraging conspiracy theories, though, is that once established, they’re impervious to facts, and especially rebuttal: ‘Rejection by authorities is [for believers] a sign that a belief must be true’, as Michael Barkun’s seminal 2016 paper on the topic pointed out.
Belief in conspiracy theories also tends to snowball: believing one makes it easier to believe another. After all, if you think there’s a huge conspiracy by evil overlords to cover up one thing, then why wouldn’t they also cover up or control two things, or three or four? In that context, the recommendation algorithms at work on the platforms begin to look like an engine to make some people believe things that will actively make them ill.
The double-edged sword of social networks:
All of the power of social media – bringing people together, elevating voices and enabling you to reach and target people more efficiently than before – can be used for good, benign activity. But because it is fundamentally open in a lot of ways, and because the platforms are too big to actually moderate themselves, it creates a massive opportunity for bad actors, and for manipulation.
Social networks keep promising to unite us, yet their design works in the opposite direction.
They have become a tool that does not do what we want.
In the past, our toolmaking ancestors knew what to do with something that didn’t work as required: redesign and reshape it to match your desired outcome. We can’t change our essential natures. But we can change our tools. The extent of social warming demonstrates that the time has come to reshape and redesign these broken tools we have come to rely on.