The Facebook Effect, or False Perceptions

I’ve been a Facebook user since 2005.  Back then it was a golden time: the site was only available to college students, and we used it to keep up with friends.  No annoying game requests, no worrying about what your grandparents or employer would think about that tired 3am post, and your news feed was limited to invites to meet up with friends for dinner or a night of RISK.  Whether you like it or not, the platform has changed a lot over the last eleven years.

Take a quick scroll through the news feed and I bet that within the first 20 posts you’ll see something pertaining to the election, something about abortion, something about immigration, or some other divisive topic.  These topics are not bad, and Facebook provides a great platform to express yourself and to engage in discussion with friends and family.  That being said, there is a problem: something I call the Facebook Effect.

Sometime between 2009 and 2011 I began to change.  I was a Democratic Socialist and I began the transformation into a Constitutional Libertarian.  My views on topics like abortion, immigration, and the economy began to evolve and naturally I wanted to share my thoughts about these topics and others with my friends.  That is when the firing squad assembled.

What is the Facebook Effect?

The Facebook Effect is a phenomenon that I have noticed developing as a result of Facebook’s algorithm.  When an article or image appears on your feed, you have a few choices: Like it, delete it, or do nothing.  This is essentially the same thing as voting content up or down.  Because Facebook makes its money through ad revenue, it wants as many members to visit the site as possible; if the algorithm filled your feed every time with things you don’t like or don’t care about, you wouldn’t use the service as much.  So deleting content removes unwanted articles, pictures, and posts from your feed.  Ignoring content doesn’t tell Facebook to remove it, but it does signal that you don’t really care about it, so Facebook leaves that content in place but lowers its priority.

Finally we get to the coveted “Like”.  I know there are now others, such as “Love”, “Haha”, “Sad”, and “Angry”, but they all serve the same essential purpose: they tell the algorithm that a post has evoked an emotional response strong enough to make you take the time (even just a second) to click a button of approval.  The algorithm reads this as, “This is the type of content that will keep me coming back.”
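The feedback loop described above can be sketched as a toy model.  To be clear, this is not Facebook’s actual ranking system, whose details are proprietary; the topic names, scoring weights, and functions below are all invented for illustration.  The point is just the mechanic: reactions boost a topic, deleting suppresses it, and ignoring slowly decays it, so the feed drifts toward whatever you engage with.

```python
from collections import defaultdict

# Illustrative weights -- not real Facebook values.
REACTION_BOOST = 1.0   # any reaction ("Like", "Love", "Haha", ...) counts as approval
DELETE_PENALTY = -2.0  # deleting/hiding a post strongly suppresses its topic
IGNORE_DECAY = -0.1    # doing nothing quietly lowers a topic's priority

def update_scores(scores, interactions):
    """Apply a batch of (topic, action) interactions to per-topic scores."""
    for topic, action in interactions:
        if action == "react":
            scores[topic] += REACTION_BOOST
        elif action == "delete":
            scores[topic] += DELETE_PENALTY
        else:  # "ignore"
            scores[topic] += IGNORE_DECAY
    return scores

def build_feed(scores, posts, n=20):
    """Rank candidate posts by their topic's score, highest first."""
    return sorted(posts, key=lambda p: scores[p["topic"]], reverse=True)[:n]

scores = defaultdict(float)
update_scores(scores, [
    ("politics", "react"),
    ("politics", "react"),
    ("memes", "ignore"),
    ("rival_views", "delete"),
])
posts = [{"id": i, "topic": t}
         for i, t in enumerate(["memes", "politics", "rival_views"])]
feed = build_feed(scores, posts)
# Topics you reacted to float to the top; topics you deleted sink.
```

Run a few hundred iterations of this and the effect compounds: the topics you agree with dominate the top of the feed, and everything else all but disappears.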

The Facebook Effect is the result of hundreds of these interactions.  It is the result of deleting content and Liking posts to the point where you are greeted at login by a stream of at least 20 articles, pictures, and other posts that most likely agree with you.  For some, that might be mostly food or silly memes; for others, Doctor Who and Sherlock clips and references; for some, friends’ religious posts of Bible verses, encouraging memes, or the newest Chris Tomlin song; and for yet others, politics and current events.  It is the result of your interests and the interests you share with friends and family.

Most people don’t want to be greeted by Pro-Life messages if they are Pro-Abortion.  Most people don’t want their feed full of Hillary Clinton support messages if they like Donald Trump or Bernie Sanders, and most people don’t want Doctor Who content if they don’t like or haven’t seen the show.  Therefore, this content is suppressed by ignoring or deleting it, and the Pro-Abortionist, the Trump or Sanders supporters, and the nerdy Whovian all get their preferred content.

The result is entire groups of people, many of whom are becoming more active in sharing their views on contentious issues, being surrounded by like-minded posts.  There is nothing wrong with this, but it does give the perception that your particular views are the prevailing ones, which most likely is not true.  Case in point: in 2011 I was excited.  Ron Paul was surging in the polls in several states, his ground game was picking up delegates all over the place, and my feed highlighted this to no end.  After six months of this, I cast my primary ballot for Dr. Paul and was shocked when he lost the primary by a staggering 1,308 delegates.  The Facebook Effect had me and many others believing that Paul was going to pull out a victory while reality was quite different.  And truth be told, it wasn’t just Facebook that caused this; many other sites did as well.
