Social media acts like a drug on our social behavior, changing how we think about right and wrong and fomenting social division by design in order to boost the profits of a few tech giants, a new book argues.
“It is a drug that 80 percent of Americans take a dozen times a day,” says author Max Fisher. “And if you work in the media or you’re a young person, multiply that several dozen times. We are living in a world where the vast majority of the population is taking a mood-altering drug multiple times a day.”
The investigative journalist has covered the impact of social media extensively for The New York Times. For his book, ‘The Chaos Machine’, he interviewed researchers, psychologists, whistleblowers, and Silicon Valley executives to paint a scathing picture of the current state of social media.
“The result is the single most complete understanding of how social media has rewired our brains, our culture and our politics that I have ever read,” says podcaster Jon Favreau, who interviewed Fisher for his series Offline.
From the creation of the Facebook newsfeed to Gamergate and the election of Donald Trump, “he traces the origins of our current political shitshow to many of the internet’s most consequential moments.”
Fisher argues persuasively that it is not just social media algorithms that are the problem, but the fundamental design of the platforms themselves.
Extremism isn’t just amplified but created by social media, which Fisher concludes may be the most destructive force in society today.
In the book, he details how the polarizing effect of social
media is speeding up. Here is a key excerpt:
"Remember that the number of seconds in your day
never changes. The amount of social media content competing for those seconds,
however, doubles every year or so. Imagine, for instance, that your network produces
200 posts a day of which you have time to read about 100. Because of the
platform's tilt, you will see the most outraged half of your feed. Next year,
when 200 doubles to 400, you will see the most outraged quarter, the year after
that the most outraged eighth. Over time, your impression of your own community
becomes radically more moralizing, aggrandizing, and outraged, and so do you
see, at the same time, less innately engaging forms of content.”
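To make the excerpt’s arithmetic concrete, here is a minimal sketch (mine, not Fisher’s) that simulates the scenario under its stated assumptions: the network’s output doubles each year, the reader’s capacity stays fixed at 100 posts a day, and the platform is assumed to surface the most outrage-provoking posts first.

# Sketch of the excerpt's arithmetic: feed volume doubles yearly,
# reading capacity stays fixed, and the platform (by assumption)
# always surfaces the most outrage-provoking posts first.

CAPACITY = 100  # posts a reader has time for each day

posts_per_day = 200
for year in range(4):
    # Fraction of the network's output the reader actually sees.
    visible_fraction = CAPACITY / posts_per_day
    print(f"Year {year}: {posts_per_day} posts/day -> "
          f"reader sees only the most outraged {visible_fraction:.1%}")
    posts_per_day *= 2  # "doubles every year or so"

Running it reproduces the excerpt’s fractions: 50% in year 0, then 25%, then 12.5%, and so on, as the feed doubles while attention does not.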
Podcaster Rich Roll also speaks
with Fisher and covers the specific ways social media changes its users’
morality, and how algorithms can make users more prone to violence.
“This is an admittedly scary but crucial conversation about
how social media’s reach and impact run far deeper than we have previously
understood,” Roll says. “I’ve become increasingly convinced that the impact of
social media and technology on our lives and the lives of our children is one
of the great existential threats to social cohesion.”
Fisher writes that social media polarizes and radicalizes us because of the choices that algorithms make. But why, among the whole spectrum of things an algorithm could show us, are the things it chooses the outrageous, polarizing ones?
“Because those are the things that are most engaging to us
and speak to a sense of social compulsion, of a group identity that is under
threat,” he explained to NPR's
Ari Shapiro.
The enjoyment of
moral outrage is one of the key sentiments Fisher sees being exploited by
algorithms devised by Google (for
YouTube) and Meta (for Facebook, Instagram and WhatsApp), which “discovered
they could monetize this impulse by having their algorithms promote hyperpartisanship,” writes
The New York Times. “Divisiveness
drives engagement, which in turn drives advertising revenues.”
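As a toy illustration of that incentive (with hypothetical posts and invented scores, not any platform’s real model): if a feed is ranked purely by predicted engagement, and divisive material reliably scores higher, divisiveness rises to the top as a mechanical consequence of the objective.

# Hypothetical example: rank a feed purely by predicted engagement.
# Posts and scores are invented; the point is only that an
# engagement-maximizing objective surfaces whatever engages most,
# divisive or not.

posts = [
    {"text": "City council publishes meeting minutes", "engagement": 0.02},
    {"text": "Local bakery opens second location", "engagement": 0.05},
    {"text": "THEY are coming for YOUR way of life", "engagement": 0.31},
    {"text": "Outraged take on today's scandal", "engagement": 0.24},
]

feed = sorted(posts, key=lambda p: p["engagement"], reverse=True)
for post in feed:
    print(f'{post["engagement"]:.2f}  {post["text"]}')

Nothing in that objective asks for divisiveness by name; it simply rewards whatever keeps people engaged, which is the dynamic the Times describes.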
Fisher goes into depth about the impact of social media sowing discord in Sri Lanka. He links posts amplifying the division of Sri Lankan society directly to violence that then took place on the streets. He claims that Sri Lankan officials begged Facebook to act before violence broke out, and that these pleas were routinely ignored.
“What’s amazing is that if you go and look back at internal conversations within YouTube, they explicitly said our goal should not be to surface the best information,” Fisher told NPR. “[YouTube’s] goal [it said] should be to surface content that will get people to spend more time on the platform. And they were saying this right at the start of what would turn out to be arguably the most consequential election in American history.”
Fisher says that no one in Silicon Valley deliberately set out to write an algorithm to surface the most polarizing content, but that now that the system has taken hold, shareholders aren’t keen to change it since doing so would directly impact their profit.
“The thing is, these people do not ultimately have the authority and the power within these companies. The people who have the authority and the power are, just like in any major corporation, the profit drivers. And those are the people who get that traffic up so they can sell ads against it and continue to make billions and billions of dollars. And that is the thinking that prevails, even in a place like Sri Lanka, where they don’t even make that much money.”
What can be done?
So how do we thwart the algorithmic overlords from abusing
the infrastructure that is beginning to rule the world? Is there a way to
change the model so companies are not so incentivized to feed people outrageous
stuff that'll keep them glued to the platform for hours?
Whenever Fisher asked this of experts, their solution was always some version of turning it off: “not turning off the entire platform, not shuttering the website, but turning off the algorithm, turning off likes, the little counter at the bottom of the post that shows you how many people liked it or retweeted it,” Fisher told NPR. “That’s something that even Jack Dorsey [the former head of Twitter] floated as an idea because he came to see that as so harmful.”
A version of social media without these engagement-maximizing features could, Fisher thinks, potentially mitigate some of the harms.
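As a sketch of what “turning it off” could look like in practice (my reading of the proposal Fisher describes, not code from any platform): order posts chronologically instead of by predicted engagement, and render them without the like/retweet counters.

# Sketch of the "turn it off" idea described to NPR: a chronological
# feed with engagement counters hidden. The Post structure and field
# names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float
    likes: int  # still stored, but deliberately never displayed

def plain_feed(posts: list[Post]) -> list[str]:
    """Newest-first feed that omits like/retweet counts entirely."""
    ordered = sorted(posts, key=lambda p: p.timestamp, reverse=True)
    return [f"{p.author}: {p.text}" for p in ordered]

posts = [
    Post("ana", "morning run done", timestamp=1.0, likes=3),
    Post("ben", "hot take incoming", timestamp=2.0, likes=5000),
]
print("\n".join(plain_feed(posts)))

The ranking model and the counters are removed, not the platform itself, which is the distinction Fisher’s experts draw.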
“His answers may not make us much more hopeful that we can
actually regulate social media but they may help us understand how we can all
reclaim some agency back from these platforms and restore a little sanity to
our lives,” says Favreau.
Fisher went further in an interview with ABC News Prime. “There are a lot of people who work at the big social media companies whose job is to reduce misinformation, reduce extremism, reduce recruitment for far-right terrorist groups, but they are fighting a losing and, in many senses, unwinnable battle.
“Not because there's something about social media that means
that misinformation and hate are going to always be on there but because these
platforms are deliberately designed to ramp up engagement in the most ruthless
possible ways these companies can come up with.
“You can’t clean it up as long as the companies are doing that, but it’s also, at least in theory, relatively easy to fix, because all the companies have to do is turn off these engagement-maximizing features, and a lot of this problem goes away. But they’re not going to do that.”
A lot of the people who talked with Fisher are apparently
still true believers in the theoretical potential of a more neutral social
media that does not have these engagement-maximizing features.
“They believe [that social media can be a] major force for
good in the world. But the problem is just these engagement-maximizing features
are just overpowering that good and creating a lot of harm in the world.”
The New York Times (arguably biased because it employs Fisher) called the account “authoritative and devastating,” noting that Fisher repeatedly invokes Stanley Kubrick’s 2001: A Space Odyssey, in which a supercomputer coldly kills astronauts.
“As a story about trying to fix a wayward technology as it hurtles out of control, it is beautifully apt,” says reviewer Tamsin Shaw, who teaches philosophy and politics at New York University.
She adds that the way the book connects the dots between Facebook’s gleaming (spaceship-like) corporate HQ and the riots, radicalism and conspiracy “is utterly convincing and should obliterate any doubts about the significance of algorithmic intervention in human affairs.”
One of the tech industry’s biggest open secrets, Fisher writes, is that “no one quite knows how the algorithms that govern social media actually work.”
He quotes Mark Zuckerberg’s “astoundingly naïve” view that “there is a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about.”
But none of this absolves Meta or Google from blame for the role, the NY Times says, that “WhatsApp and YouTube play in fomenting genocidal hate.”
“There are of course facts on the ground that determine the algorithm’s effects, the local susceptibility to disinformation, the explosiveness of the divisions. And this highlights an important point: Millions of people use social media without succumbing to conspiracy theories or allowing moral outrage to escalate into violence.”
Human judgment and morality, in other words, aren’t reducible to instinctual drives that can be manipulated. So, Shaw insists, we need to ask not just what makes some people susceptible to manipulation, but also what in the mind’s “wiring” protects others, even in lives saturated with social media.
“The answer will presumably include education, and will span the range from individual critical thinking skills to the overall quality of the information environment.”
The lesson of Fisher’s book, Shaw concludes, is that we need to make individual members of societies resistant to such efforts at manipulation. “We have the means to do so if the political will is strong enough, and if our political system hasn’t yet been wrecked by the chaos machine.”