What responsibilities or duties does the platform have in such cases? Facebook maintains that it doesn’t choose what to show people—that its algorithm simply reflects the choices users make—but this seems more like a way to dodge responsibility than an accurate description of what’s happening, especially when the company is actively deleting newsworthy content.
As Facebook rolls out its “Instant Articles” initiative, in which news entities such as the New York Times and The Guardian are publishing directly to the social network, instead of just posting links to their own sites, media organizations and industry watchers are wrestling with the idea of Facebook as a platform for news. There’s the influence of the news-feed algorithm, for one thing, which is poorly understood—primarily because the company doesn’t really talk about how it works. But there’s also the fact that Facebook routinely deletes content, and it doesn’t talk much about that either.
In what appears to be one recent example, photojournalist Jim MacMillan happened to be walking through downtown Philadelphia shortly after a woman was run over by a Duck Boat (an amphibious vehicle that takes tourists around the harbor). Reverting to his journalistic training, he took a picture of the scene and posted it to his accounts on Instagram and Facebook, along with the caption “Police hang a tarp after a person was caught under #RideTheDucks boat at 11th and Arch just now. Looks very serious.”
MacMillan said in a post about the incident that he chose not to share a picture showing the woman’s body underneath the vehicle, and that he also didn’t mention her name or the fact that she was most likely dead.