I've recently realized that a lot of the crap on Facebook comes from this ambiguity.
They've basically created a system where it is very hard for anyone to take responsibility. Case in point: news articles posted on FB tend to have very low quality comments, almost as a rule. For the pages that post these articles, it is extremely difficult to "moderate" those comments and enforce high quality discussion. In fact, there is a lot of pressure on those pages not to enforce quality commenting. It isn't even clear that the organizations posting news articles are responsible for creating a positive community around their Facebook posts.
Contrast this with the "old way" that newspapers brought in reader voices: through letters to the editor. Whether or not you agreed with the opinions, these letters were (and are) of a completely different caliber than the typical FB comment. The newspapers willingly published them and, in a sense, took responsibility for them. Name-calling, slander, misinformation, etc. are mostly (though not always) filtered out of the letters newspapers publish.
Compare this, too, to a forum like Hacker News. Here we have a steady moderator who enforces the rules, which the community also helps enforce. The rules are clear enough. Low quality discourse is discouraged, and sometimes even removed. There is someone whose job it is to maintain the quality of discussions. In the comments on a Facebook page, that almost never happens, neither from FB nor from the company running the page.
As has happened throughout the history of online forums, without decent moderation things tend to devolve very quickly. This is exactly what's happened at Facebook. They've created "pages" that could be their responsibility, or could be the responsibility of the people running the page, but nobody really takes responsibility. Everything, instead, is focused on engagement. Publishers want to drive traffic to their websites. FB wants to keep people on the site and "engaged" with their advertisements. Moderation, in essence, has been almost completely ignored.