Before the last presidential election, Sue Gardner tried an experiment. Gardner created a simple false Facebook persona named Kaitlyn. Kaitlyn was 19 years old, lived in Hoover, Alabama, and worked in customer service for T-Mobile. She had “no friends, no comments, no interests. No baby pictures, dinner parties or whatever.”
Then Gardner, as Kaitlyn, Googled only one thing: “Alabama Trump.”
The first thing that came up was “Alabama for Trump.” She “liked” it.
Immediately Facebook started sharing things into her feed. Without making any judgments, Gardner systematically “liked” everything that came in.
Gardner said, “It was fascinating in many ways. It was 100 percent fake news. It was all fake news. And it was all anti-Hillary Clinton, not pro-Donald Trump.”
Gardner described what came up: “Hillary had killed 445 people since the 1970s. She was a robot. She had syphilis. She had various diseases and was very frail.”
Gardner said that the news feeds looked and felt like real news. They had headlines, bylines and photographs. There was nothing to distinguish them from authentic news.
Presumably, if Gardner had placed Kaitlyn in San Francisco, Googled “California Clinton” and liked “California for Clinton,” Facebook would have shared information detrimental to Trump.
Why did Gardner do this? Everyone Gardner knew was for Hillary Clinton and was sure Donald Trump couldn’t possibly win. Yet, she knew that Trump supporters were just as sure that Trump would win. So, Gardner decided to find out what Trump supporters were hearing that she wasn’t hearing.
How could this happen?
The podcast “Screened Off” pointed out that people generally like convenience and comfort. Internet companies are in the business of giving people what they want.
Facebook uses computer algorithms, step-by-step computational rules, designed to automatically show us what we have already “liked,” what we have spent time with, and what is currently most popular among people in the demographic group Facebook’s algorithms have assigned each of us to.
“Instead of acting as pit-stops in a more wide-ranging web experience, the goal of these apps is to keep users inside their [application] universe for as long as possible — exposing us to ads, and mining our private data to create an even more captivating experience. Increasingly, we don’t have to encounter things we’re not already interested in — be it consumer items, or points of view.” — taken from the Screened Off podcast website.
This segregation keeps us inside our own groups when it comes to politics, opinions, fashions and many other things. Web technicians call these comfortable virtual homes our “filter bubbles” or “walled gardens.”
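The feedback loop described above can be illustrated with a toy sketch. This is not Facebook’s actual code; the posts, topics and ranking function here are purely hypothetical, meant only to show how ranking by past “likes” narrows a feed.

```python
# Toy illustration (not Facebook's real algorithm) of a like-driven
# feed: posts sharing the most topics with the user's past "likes"
# rank highest, so each new "like" narrows the feed further.

def rank_feed(posts, liked_topics):
    """Sort posts by how many of their topics the user already likes."""
    def score(post):
        return len(set(post["topics"]) & liked_topics)
    return sorted(posts, key=score, reverse=True)

# Hypothetical data for illustration only.
posts = [
    {"title": "Anti-Clinton rumor", "topics": {"trump", "clinton-negative"}},
    {"title": "Local weather",      "topics": {"weather"}},
    {"title": "Pro-Trump rally",    "topics": {"trump"}},
]
liked = {"trump", "clinton-negative"}

feed = rank_feed(posts, liked)
# The post overlapping most with past likes comes first; content with
# no overlap sinks to the bottom and is effectively never seen.
```

Under these assumed data, the rumor post (two matching topics) outranks the rally (one) and the weather story (none), which is the “walled garden” effect in miniature.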
A second factor: What grabs people’s attention the most, and is “liked” the most, is the flashy stuff: videos, photographs, and outrageous headlines. The video of a person getting beheaded is flashy. The backstory explaining why a person thought beheading another human being was the righteous thing to do is not flashy.
Photographs and videos elicit emotional responses from people. Text, by its very nature, requires thinking and can elicit cognitive as well as emotional responses.
A third factor: Gardner has spent years observing Silicon Valley and its devotion to creating the ideal user experience. She has seen how deliberately engineers create what they call “sticky” experiences that are hard to resist. Sticky experiences cause us to loiter on a site where we can be monitored and sold to, even when we have no intention to stay there. Links to flashy material play a part in keeping us around.
Finally, most people are just browsers. We don’t read the articles. We go from one flashy article to the next. We pass articles on to others that we ourselves haven’t read. We let what is put into our news feed wash over us, leaving an emotional residue.
These remarks about Facebook reminded me of some of the statements our emergency responders made during recent flooding: Lieutenant Sarah Richards, of the California Highway Patrol, said, “We were really struggling with the social media. All the rumors were hard to address because we just didn’t know what they were talking about. It was amazing what they were coming up with.”
Matt Solano, an engineer for Caltrans, agreed, “Facebook was a public information officer’s nightmare.”
Facebook is in the news business
According to Brodie Fenlon, another person on the podcast, Facebook is the largest news organization in the world, although it is reluctant to see itself as such. The results of the last election, however, pointed out the role Facebook plays in our culture and around the world in radicalizing people by separating them into like-minded groups. This is the case whether a person is liberal or conservative.
The Web is no longer primarily the home for individuals putting information on their own sites. Large corporations, like Apple, Facebook and Google have changed the Internet. The podcast asked the question: “Their products and services are convenient and often fun, but at what cost?”
Facebook and other social media, by their very design, give us more of what we already like and know, confirming our emotional gut feelings. Facebook makes us comfortable in our little media cocoons, interacting only with tribes of like-minded people.
Yet, Facebook doesn’t believe it has a mandate to provide a wide diversity of news and opinion or to publish information necessary for individuals to play their proper role as citizens for the greater good of the community or the nation.
Newspapers have not been immune to presenting flashy stories and visual images. However, not to wave our own flag too much, most newspapers believe they do have an obligation, as our founding fathers envisioned, to provide citizens with truthful information that they need to do their job of overseeing our government. Newspapers also generally include a lot of text, which encourages the reader to be more rational in her or his response.
Perhaps Facebook and other social media need to adopt similar societal obligations? Perhaps citizens need to decide that getting just what you know, or think you know, is not the best way to interact positively with the rest of society?
The podcast is called “Screened Off: The Dangers of an Insular Web.” You can Google it online. Or you can hope that Facebook finds it for you!