Analysis by Kyle A. Lohmeier
Not long ago I posted and then tore to pieces a completely fake news story “published” by an outfit calling itself The Boston Tribune. It presented a compelling, professionally written, official-sounding report that 23 states were about to ban the sale and possession of hollowpoint ammunition; it was, of course, front-to-back bullshit. Where did I first notice this silly story? In my own Facebook newsfeed. And it turns out it wasn’t alone; Facebook is jammed full of newsy-sounding articles that are complete fabrications. It’s gotten so bad that Facebook CEO Mark Zuckerberg had to respond to allegations that fake news on his social media site actually tilted the election in Trump’s favor; he dismissed the notion as a “crazy idea.”
However, the idea was apparently sufficiently lucid to prompt Zuckerberg to take some largely ceremonial steps that do nothing to solve the “problem,” and that’s okay; it just would be nice if he were intellectually honest about it. As of now, Facebook won’t allow purveyors of fake news to use its advertising system, whereby idiots like me who run websites with affiliated Facebook pages pay as little as $2, or as much as they like, to put their posts in front of more Facebook users. These are the “sponsored” links that come up in users’ newsfeeds, and they already contain no links to the aforementioned fake news sites, because those sites are all about making money on click-through traffic, not about spending it on Facebook advertising. Facebook will do nothing about fake news sites posting links to fake news stories that then get caught up by trending search topics and other factors and pushed into people’s newsfeeds.
“It’s a step in the right direction. However, Facebook generates traffic and Google monetizes it,” said Filippo Menczer, a professor of computer science and an expert on fake news at Indiana University, as quoted by David Pierson in The Los Angeles Times. “For Facebook to do this with advertising, it’s not clear how that would help. You never really see sponsored posts from fake news sites on Facebook.”
Critics are claiming Facebook isn’t going far enough to purge fake news from its site altogether and thus protect the citizenry from being misled. Those critics are simply wrong.
Granted, Facebook belongs to Mark Zuckerberg and, as such, he can do with it whatever he likes. It is not, however, his responsibility to police the content submitted by literally a billion users for “news” stories that are fake, nor would doing so even be a good idea. Any effort to purge “fake news” from the site entirely would invariably squash stories that aren’t fake news at all. The line between news, news analysis, and satire can easily get blurry. Last week I ran a piece with the headline: “Here’s Proof Major-Party Voters Cost Johnson the Election.”
At first blush, the headline seems kinda fake, but the piece was clearly intended as satire; it was written to lampoon a truly moronic piece by Jason Easley at Politics USA-dot-com with a similar headline that fixed blame on Jill Stein and Gary Johnson for Hillary’s election loss. Would an aggressive ban on “fake news” have snared that story in its dragnet? Who knows? I’m glad we won’t be finding out.
Yes, fake news is a problem. It exists to capitalize on the way Google’s AdSense system and Facebook’s algorithms generate clicks – the clicks are where the money is, and fake news stories entice those clicks. It’s the natural evolution of small, online-only outfits that post misleading, hyperbolic headlines linking to stories only barely related to the bombastic, click-bait claim. And yes, while those are annoying, the ones that link to stories that are utterly made up and false are clearly worse. Still, Zuckerberg is taking the right course of action here – essentially none at all.
So then, upon whom does the responsibility of dealing with these purveyors of falsehood fall? Well friends, it falls upon each of us.
A huge part of the problem with the current state of American society, as I stated in my Theory of Nearly Everything Else, is that Americans, by and large, don’t want the responsibility of having to think for themselves. We fob off important decisions on government regulatory agencies in which we place our blind faith, only to act surprised when the FDA recalls a drug it initially approved as “safe and effective.” We don’t even bother double-checking the things our favorite cable news network reports to us as gospel. In short, we don’t invest nearly enough time as a society in understanding what the hell is going on at even the most basic level.
So, upon seeing a headline that is totally false, the average American lacks the background knowledge of how things work to recognize it as false; instead, he is inclined to believe it, particularly if it comports with a previously held view. This “confirmation bias” plays a huge part in why fake news stories are so “effective.”
“At the end of the day, the problem is one of confirmation bias, which is our natural human tendency to look for information that confirms what we believe and ignore information that goes against what we believe … we fall for fake news because something about it confirms our beliefs about the world and because we are in a news-grazing rather than news-reading culture,” said Jennifer Stromer-Galley, an information studies professor at Syracuse University, as quoted in Pierson’s article.
Despite some expert claims to the contrary, Facebook isn’t a “news site” and shouldn’t be expected to hold itself to a standard of journalistic integrity that most major networks and publications eschew anyway. Yes, some Americans misuse Facebook as a means of getting their news of the day, the same way people used to misuse the newspaper horoscope by not placing it under the pet bird in its cage. That isn’t the fault of Facebook or the astrologer, but of the people misusing their services. Sadly, there appears to be a lot of Facebook misuse going on. Pierson cited a Pew Research Center poll conducted over the summer that found 20 percent of social media users had changed their views on an issue based on something they saw on social media. Honestly, that number seems high given the aforementioned role confirmation bias plays. If it is accurate, then it is more an indictment of those 20 percent of social media users than of social media itself.