A stunning story in The Wall Street Journal on May 26, 2020, reported that a 2018 internal study by Facebook’s own researchers had found that the social media company’s algorithms were not bringing people together; they were driving them apart.

“Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Many of Facebook’s own experts agreed. Their research showed that:

● extremist groups were growing on Facebook, and Facebook’s algorithms were responsible for that growth

● 64% of all extremist group joins were due to Facebook’s own “recommendation tools” pushing extremist connections and growth

● a disproportionate amount of the bad behavior (fake news, spam, clickbait, inauthentic users) came from a small pool of hyperpartisan users

● in the U.S., Facebook saw a larger infrastructure of accounts and publishers on the far right than on the far left.

That meant that if Facebook adjusted its algorithms to stop promoting “bad behavior,” the change would disproportionately limit right-wing actors. When that became apparent, Mr. Zuckerberg lost his enthusiasm for changing Facebook’s algorithms to curb extremist clicks, for two reasons: (1) he needed right-wing support in Washington and didn’t want to alienate the party in power, and (2) reducing clicks was tantamount to leaving money (a lot of money) on the table. Zuckerberg was loath to do either.

The bottom line: Facebook had effectively monetized nastiness, divisiveness, and rage. It paid, and it paid big. And Zuckerberg’s friend in the White House had just given big companies like Facebook a whopping tax break in 2017. Zuckerberg didn’t want to do anything to upset the status quo.

The result: Zuckerberg shelved the research. What’s a little divisiveness in the world when there is so much money to be made from it?