Money. It’s “something that’ll make you do wrong… and make you do right.”1 In case you haven’t heard, some women’s organization gave Facebook a hard time because Facebook’s moderators didn’t remove some offensive user content quickly enough. Facebook returned a completely valid (though apparently unsatisfying) apology. It said, in so many words, “I know you want someone to be angry with, but we had nothing to do with that. We have a billion users and it’s kind of hard to keep offensive stuff under wraps. Okay? Give us a break. We’re just trying to maintain freedom of expression, and we have neither the resources nor the inclination to track down and destroy every little pesky thing that annoys a few users. So deal with it.”
I think this was a valid response. The content in question was extremely offensive. There is no doubt about that. (One meme had the following text superimposed over a woman’s duct-taped face: “Don’t wrap it and tap it… Tape her and rape her.”) But Facebook can hardly be held responsible for all the content its users post, and where would they draw the line? Should Facebook remove posts that contain curse words? What about “sexy” pictures (most of which are inherently exploitative)? Here’s a tip for all of you: if someone generally posts things you find offensive, remove them from your news feed or unfriend them. Done. No more offensive posts from that person or organization. But that wasn’t good enough for the women’s organization in question.
So they started the Twitter campaign “#fbrape [Facebook rape]” to add a little muscle to their complaints. They posted pictures of the user content side-by-side with the ads that ran with it. American Express, Nissan, and some others began to pull advertising budgets from Facebook. And guess what? Facebook executives magically decided maybe this issue was a whole lot more serious than they had previously thought. They realized that freedom of expression might not be as important as women’s rights. Facebook took drastic action and, so far, has managed to win back big advertisers without losing too much revenue.
It doesn’t matter what Facebook executives actually think about the user content. I thought it was extremely offensive myself, and individuals at Facebook probably felt that way personally as well. But does that mean I want Facebook to more closely monitor or more readily remove user posts that some people might consider offensive? Not necessarily.
At the same time, it isn’t a First Amendment issue, as Facebook is a privately owned forum. They can censor and delete at will. Generally, they haven’t done this, again for financial reasons. People use Facebook because people use Facebook. If it weren’t largely free (in price and in regulations) and ubiquitous, people wouldn’t care to use it. If people didn’t use it, advertisers wouldn’t pay Facebook to advertise. So, again, it comes down to money on both sides. Facebook now has to balance one money-making concern (freedom of expression) against another (loss of revenue due to offensive content). Ultimately, this is not a moral concern. It’s a financial one.
It’s interesting that so many organizations (even the Boy Scouts, for instance) are banking on a shift in societal norms. Making moral claims in this environment has less to do with convictions and more to do with an investment in future financial possibilities. Like any investment, it carries the potential for financial gain along with real risk. In a culture that worships money, it’s no surprise that financial concerns shape morals more than anything else. Money moves morals. Because most people don’t have any. So money can get them to do wrong. And it can get them to do “right.”
- Al Green was talking about love, I know, but I took some license. [↩]