Below is an article from today’s New York Times by Shira Ovide, who writes the On Tech column for the paper.

 

The Wall Street Journal reported this week that Facebook had spent two years studying whether its social network makes people more polarized.

 

Researchers concluded that it does, and recommended changes to the company’s computerized systems to steer people away from vilifying one another. But the Journal reported that the company’s top executives declined to implement most of the proposed changes.

 

Fostering open dialogue among people with different viewpoints isn’t easy, and I don’t know whether Facebook was right to shelve ideas like creating separate online huddles for parents arguing about vaccinations. But I do want to talk over two nagging questions sparked by this article and others:

 

Are politics rather than principles driving Facebook’s decisions?

 

Facebook said it didn’t want to make important policy decisions on its own. Then why did it make these important policy decisions alone and in secret?

 

These questions are important because Facebook is not an ordinary company. Whether we are aware of it or not, the ways the company designs its online hangouts shape how we behave and what we believe.

 

In an extreme example, Facebook has acknowledged that it failed to prevent its social network from being used to incite genocide in Myanmar. That’s why it’s crucial that Facebook make good policy choices in fair-minded and transparent ways.

 

On the first question, The Wall Street Journal reported that Facebook decided not to make most of the suggested changes aimed at reducing the spread of divisive content, in part because more material from the right than from the left would have been affected and the company worried about triggering claims (again) that it was biased against conservatives.

 

Some people at Facebook also said they believed that similar fears were behind the company’s light touch in policing political speech in advertisements and posts.

 

If Facebook made these decisions on the merits, that would be one thing. But if Facebook picked its path based on which political actors would get angry, that should make people of all political beliefs cringe.

 

This is not a partisan point. Facebook is obsessed with staying in the good graces of people with power, period. This is natural, to a point. (President Trump’s anger at Twitter for adding fact-checking notices to two of his tweets this week shows there are consequences to such decisions.) But there should be a line between understanding the political reality and letting politics dictate what happens on your site.

 

People at Facebook say the company doesn’t bend to politics. And in a blog post on Wednesday, Facebook detailed investments and changes that it said were intended to reduce polarization.

 

It also unnerves me that multiyear research into Facebook’s impact on the world stayed entirely inside the company’s walls. What happens at Facebook is too important to stay secret.

 

Five years ago, Facebook’s chief executive, Mark Zuckerberg, touted research by the company’s data scientists that found that the social network doesn’t worsen the problem of “filter bubbles,” in which people see only information that confirms their beliefs. The public could evaluate the data and discuss an important question affecting much of our society.

 

Now, big-picture discussions about Facebook’s impact are confined to internal posts or company conference rooms.

 

This is the opposite of Facebook’s stance that it wants input from lawmakers and the public about important topics like what speech is harmful and how to prevent cyberattacks. It also runs counter to Facebook’s efforts to work with independent fact checkers and create a quasi-independent board to adjudicate disputes over posts that violate the company’s rules.

 

“People shouldn’t have to rely on individual companies addressing these issues by themselves,” Zuckerberg wrote in an opinion piece for The Washington Post last year. “We should have a broader debate about what we want as a society and how regulation can help.”

 

It’s a good principle — but not if Facebook believes it only when it’s convenient.