Hacker News
Facebook blocks sharing of critical Guardian article (twitter.com/profcarroll)
172 points by afandian on Sept 28, 2018 | hide | past | favorite | 78 comments


Whether malice or incompetence, the key thing is that it's blocking a legit post with no recourse. That's a serious problem given that it's such a core sharing platform for so many people.


> Whether malice or incompetence, the key thing is that it's blocking a legit post with no recourse.

Is The Guardian a legit source? What does legit source even mean? As far as I can tell, it means, "Has a strong brand." Given what we know about brand decay and bait and switch, are our criteria even remotely useful and valid? From what I've seen, I don't think so. It seems like many "legit" news sources with failing business models are cashing in their brand reputation, while it lasts, for some link-baiting and public manipulation.


It's one of the UK's leading broadsheets with nigh on 200 years behind it. Reasonable and valid complaints about mainstream media aside, this is about as legit as it gets!


It is deemed a legit source on the matter because the info in the article is correct.


Then why not take the evaluation down to that level? Why do we even need brand level proxies in 2018? It would probably be best if we evaluated down to the article level of granularity, at least.


> What does legit source even mean?

It used to mean something that was professionally run and trusted. Something that had earned people's trust.

Unfortunately, now it means "those who I agree with" because we live in a highly partisan environment - which the "trusted sources" had a hand in creating.


It seems like The Guardian is actually trying out new things that seem to be working. Interesting article about them here: https://digiday.com/media/guardian-improbably-put-way-path-p...


> Is The Guardian a legit source?

Its readership numbers have been declining and some of its writers churn out complete gibberish, but it has been around for a long time and is generally considered to be a serious newspaper in the U.K.

Disclaimer: worked on their website in the late '90s


On the other hand, Facebook is not a soapbox on a public place, it's a website owned and hosted by a private company which has precise criteria to decide which publications are more desirable than others.


But that is the problem; that's my point. They sell themselves as allowing us to 'connect' - bless their kind selfless hearts - then filter, prioritize, and suppress what we choose to connect with.

At this stage 'private company yada yada yada' is wilfully missing the point of just how much Facebook is the social media entity for a good chunk of the world.


But it's no secret that the content you see on Facebook is the content Facebook and advertisers want you to see, not the content you'd be interested in in the first place.

Maybe I'm stubborn or stupid but I fail to see the problem. What makes you think Facebook should have the interests of their users at heart to begin with, except to attract more users? The users are not their customers; the advertisers are. And anyway, nobody is forcing anyone to use Facebook or any social media at all.


You are 100% correct in theory, and everybody here knows this.

But the problem is that in practice, most people live on Facebook, and don't really care about any of this. This means that companies such as Facebook have a huge influence on what people see.

So the question really is: do we, as a democratic society, allow a single company to have so much power over the opinion of the masses?

I don't have a proper answer for this, but that question is definitely there, regardless of the fact that people are not forced to use it.


You forget perhaps that we HN folk are not typical. What you describe is not widely accepted, not by a long way.

To most users FB is absolutely a pure(ish) content exchange medium. They treat posting like sending SMS or what have you, an 'open' exchange, and not entirely unreasonably IMHO given 11+ years of precedent.

With that in mind today's event becomes significant. It rams home that FB is in control. And frankly that's new to most people!


> They treat posting like sending SMS or what have you, an 'open' exchange, and not entirely unreasonably IMHO given 11+ years of precedent.

Back when SMS was new, people treated it like it had timing and delivery guarantees, even though it had neither, and messages could sometimes be delayed by 20 minutes or an hour. I even had a girlfriend accuse me of lying because of that.


That's the problem with anything that's true 99% of the time. People start to believe it's true 100% of the time.

That's why your ex expected this 'SLA': most of the time, messages arrive practically immediately. And that's why most people don't expect Facebook to block non-spam, non-offensive content: most of the time, it doesn't.


So you’re saying it would be fine if they only showed content for one political ideology and one religious view, and suppressed everything else?


That's what a lot of websites do, and it's called "editorial choices".


I disagree with you, nevertheless I upvoted you.

It's so galling that legit opinions, which are well presented, are trashed just because they go against the direction of the politburo.


Thank you. Well, I'm playing the devil's advocate there. To me Facebook has always looked very untrustworthy. They have a very bad habit of mishandling users' data and manipulating public opinion. Nevertheless I believe in the editorial freedom of private websites.


If Facebook is a publishing company, then they should be treated as such.

They should lose their immunity to lawsuits for user created content on their publishing platform.


I've increasingly felt that fb, twitter etc with their filters and algorithms are more publisher than platform.

They indisputably go to great lengths to control what you see ergo they are not simply facilitators.


They should be liable for what is published on their platform but they certainly cannot be guilty of _not publishing_ something.


By refusing to publish something, they are no longer acting as a neutral platform.

Losing the immunity that they get by being a neutral platform means that content currently on FB, which they were previously not liable for, is content they would now become liable for.

For example, if a user posts something illegal on FB, then Facebook could be sued, even if they removed the content as soon as they became aware of it. (In the same way that the NYT is responsible for the content from its writers.)

Think about how phone companies have immunity when someone makes illegal phone calls; now imagine phone companies becoming liable for damages that they were previously not liable for.


I'm not a lawyer; what is the legal basis for that "immunity", and how could they lose it?


For telecomms, this is known as “common carrier” status; it means you are not liable for what goes through you, but to keep that status you must not filter anything either.

If you do filter, you’re supposed to lose that status - and then you become liable for what goes through your pipes.

I’m not sure there’s a similar legal concept for publishers.


Where is this precise criteria? Can anyone explain it for a layperson? While we all know it is the aggregation of many different algorithms at work, the results of those algorithms have consequences. Claiming that it's fine because the algorithm exists is begging the question.


If they have decided (consciously or not, because sometimes algorithms have unpredicted consequences) to not have something published then they are free not to publish it. It's not like it's a public service run by the government.


> which has precise criteria to decide which publications are more desirable than others.

The issue is that we don't know these criteria.


Hilarious, but I bet it's Hanlon's razor at work again:

> Never attribute to malice that which is adequately explained by stupidity.


How many times do they need to be caught out before the razor gets blunt?


Oh, there are no bounds for human stupidity!


As we're talking in hypotheticals why not pivot the question.

How 'stupid' can a company, with an arguable international monopoly on a major form of communication and dissemination, be before we acknowledge the serious danger it is putting society in?


> arguable international monopoly on a major form of communication and dissemination

Monopoly in what, precisely?

Facebook has a monopoly in online, blue-themed social networks owned by Harvard dropouts.

It doesn't have a monopoly on social networks, on communication, on online networks, even on blue-themed online social networks.

Coca-Cola has a monopoly on cola-flavored soft drinks in many countries, but that isn't relevant, since cola-flavored soft drinks compete with a thousand other types of drinks.


I did hedge the word 'monopoly', as we are into unprecedented territory (SE suggests 'near-monopoly') [0]. I don't know if there are appropriate analogies, but soft drinks don't represent the gravity of the situation IMHO.

For a massive number of people, FB is the only platform that they and the majority of their friends use to share and comment. It is treated as de facto neutral infrastructure. I don't know how it represents itself these days.

[0] https://english.stackexchange.com/questions/361600/need-a-wo...


> For a massive number of people, FB is the only [online] platform that they and the majority of their friends use to share and comment.

For a lot of people, Amazon is the only online marketplace they buy from.

And for a lot of people, Coca-Cola is the only soda they drink.

None of those grant those companies a monopoly on shopping, or on drinks.


- Facebook is not in a traditional 'monopoly'.

- The term 'monopoly' is conventionally used in connection with the sale of traditional goods, like soft drinks and train services.

- It represents a boolean state, and was created to describe situations that arise in established economic models.

- The new trade in information, such as propagated by Facebook, does not follow many of the established rules of the trade of goods and services.

- The network effects of buying soda are weak. Maybe if my friends "prefer the taste of X" I will buy that too. But their choosing a given brand of soda does not preclude me from using a different brand of soda.

- The stakes are lower too. Notwithstanding the actual literal crimes committed by Coca-Cola, if people consume more of one brand than another, or Coca-Cola tries a different formula, the worst harm that is done is some shareholders lose some money.

- If Facebook controls the flow of information, directly or indirectly, intentionally or unintentionally, big social outcomes change. For example the election of the current president of the United States, and countless other decisions round the world.

- Many things are changing. Business models, societal models, communication models, user behaviour. Even the use of the word 'user' sheds light on the situation -- what we might have called "citizens engaged in conversation" is now "users on a platform", and in some sectors Facebook is the dominant platform for discussion.

- We need new vocabulary to talk about the effects that technology has brought about. No extant word is a perfect fit. That is not a good reason to not try to talk about them.


> Facebook is not in a traditional 'monopoly'.

Just because you invented a new definition of the word monopoly, doesn't mean it is relevant. It isn't.

> The term 'monopoly' is conventially used in connection with the sale of traditional goods, like soft drinks and train services.

No. At least not when talking about economics. By people who don't understand economics, maybe.

> It represents a boolean state, and was created to describe situations that arise in established economic models.

Also no (to the boolean part).

> The new trade in information, such as propagated by Facebook, does not follow many of the established rules of the trade of goods and services.

You have to prove that it changed the definition of monopoly. You didn't. It didn't.

> The network effects of buying soda are weak. Maybe if my friends "prefer the taste of X" I will buy that too. But their choosing a given brand of soda does not preclude me from using a different brand of soda.

Network effects don't create a monopoly, just barriers to entry in a specific market. But that market might have competing markets, or replacements.

> If Facebook controls the flow of information, directly or indirectly, intentionally or unintentionally, big social outcomes change. For example the election of the current president of the United States, and countless other decisions round the world.

Doesn't imply a monopoly.

> We need new vocabulary to talk about the effects that technology has brought about. No extant word is a perfect fit. That is not a good reason to not try to talk about them.

"I'm wrong about the way I use the term monopoly, so we should change the term so I'm right".

Nice try. You were able to get almost every single point wrong.


PS: Quoting stack exchange for a definition of monopoly can only be a joke. You need to read better sources...


> Oh, there are no bounds for human stupidity!

Can we automate and get an order of magnitude advantage out of the artificial kind?


Yeah, if they actually wanted to censor something they would do it silently, by not showing it in people's news feeds. Proving such a practice exists would probably be near impossible without having access to the source code or core usage statistics.


Looking at some of the tweets it seems FB has a mechanism that autoflags article links as spam if "too many people are trying to share" it.

This just makes me wonder exactly how automated that feature is, because surely a significant number of people share current-event links all the time about something political, a sporting event, a natural disaster. I wonder if it happens then, or is this a matter of recency... a Streisand effect (there's got to be a better word for it than that) following the news of those 50 million accounts being breached.
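If the mechanism really is a share-rate trigger, the simplest version would be a sliding-window counter per URL. This is purely a hypothetical sketch (class name, limits, and API are all made up for illustration, not Facebook's actual system):

```python
# Hypothetical sketch of a share-rate spam flag: a URL gets flagged
# once shares within a sliding time window exceed a fixed threshold.
import time
from collections import deque

class ShareRateFlagger:
    def __init__(self, max_shares=1000, window_seconds=60):
        self.max_shares = max_shares
        self.window = window_seconds
        self.events = {}  # url -> deque of share timestamps

    def record_share(self, url, now=None):
        """Record one share; return True if the URL should be flagged."""
        now = time.time() if now is None else now
        q = self.events.setdefault(url, deque())
        q.append(now)
        # Drop events that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_shares
```

A rule this crude would indeed flag any genuinely viral story exactly the same way it flags spam, which matches the behaviour people were reporting.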


What I don't understand is why this is considered spam. Are lots of people not allowed to talk about the same thing?


I can think of a couple of possible explanations:

1. Viral content like "share this or Facebook will delete your account" - may as well shut that down straight away.

2. A security flaw in a Facebook site that lets malicious websites post to your wall automatically from a webpage. In that case, the malicious website will probably post a link to itself, so that it can collect more victims. In this case, this limits the impact of the flaw. Look at the Samy worm as an example.

3. News feed algorithm "breaks" in some way when a lot of people are posting the same thing, so they just prevent that from happening.


>Yeah, if they actually wanted to censor something they would do it silently, by not showing it in people's news feeds.

Not really. Reddit does exactly this. Posts to /r/news are only allowed from a whitelist of "real news" sources.


Huh? Well I meant if they want to censor something silently they could get away with that.


Exactly. Even if an experiment could show something reliably not appearing in other timelines, it could easily enough be waved away with the standard throw-your-hands-up-and-say-‘algorithms’ explanation.


> Hilarious, but I bet it's Hanlon's razor at work again

My corollary: If you want to instigate against adherents of Hanlon's Razor, it's best to use weaponized stupidity.


I'm surprised at how credulous everyone is being about this. Even assuming a maximally evil Facebook (which they make pretty easy to do...), they'd have to be incredibly stupid to be unaware of the Streisand effect and the fact that this would take off even more (and much more negatively) if they tried blocking it.

It seems a lot more plausible that this article is unusually widely shared, which is obvious since everyone on Facebook has some interest in Facebook...


Facebook is made up of individual people. Incompetent evil could be the protest of an employee who doesn't want to be evil but also doesn't want to get fired for obviously refusing an order.


Despite the complete moral vacuum there seems to be at Facebook, I do still believe that this isn't deliberate. But big name companies do far worse things, and Facebook themselves have done worse. And stupider, since you mention it. It is entirely plausible that they are intentionally suppressing the news.

I find it a fascinating side of the human psyche that it finds it hard to believe that "big brand name company" might be responsible for atrocities when history is littered with examples. Companies are only staffed by humans, after all.


> But big name companies do far worse things, and Facebook themselves have done worse.


> I find it a fascinating side of the human psyche that it finds it hard to believe that "big brand name company" might be responsible for atrocities when history is littered with examples. Companies are only staffed by humans, after all.

I'm not sure why the majority of your comment is focusing on how plausible it is that they would do something evil. It couldn't be less relevant to my comment.

Like I said, my point doesn't rely on any assumptions about Facebook's morality as an organization, just some baseline of competency. I know organizations have done stupider things, but I can't think of a similar org taking an action whose _first-order_ effect would so obviously be the opposite of what they intended. It's far more common and far more plausible that they didn't consider the second-order PR effects of an algorithmically-enforced broad policy they have.


Ok, well maybe we're at cross purposes. You did use the word plausible originally, and again in reply. It depends whether you bring any context to a given judgment or rely on intuition about statistical distributions. I think that when appraising the plausibility of bad faith you should take past behaviour into account.

I can totally see a situation where a well-meaning PR person says "what can we do to slow down this story?" and some engineer says "we have this mechanism but it wasn't designed for this purpose..."

Compare with yesterday's fun and games when the same metanarrative was reported: "We could shift more units if we had phone numbers." "We do, but they were given to us for an expressly different purpose." https://news.ycombinator.com/item?id=18082017

When you're in this position, ethics is a competence. They may be great at shipping code but past experience suggests that there are deeply inadequate decision making processes.

Apologies if this misses your point again.


This is sadly funny. It's probably a legitimate feature to block spam - the timing and subject matter just highlight the issues with large-scale automation and data analysis. Must be a tough day and upcoming weeks at Facebook...


Wow, FB keeps shooting itself in the foot. They'll likely blame an algorithm, not a deliberate human choice, but that's not a valid excuse.


Not an "excuse", but a valid explanation, don't you think?

When we offload decisions to algorithms, that's what you get.


They don't need any excuse.


    Some users are reporting that they are unable to post today’s
    big story about a security breach affecting 50 million
    Facebook users. The issue appears to only affect particular
    stories from certain outlets, at this time one story from The
    Guardian and one from the Associated Press, both reputable
    press outlets.
    ...
    The situation is another example of Facebook’s automated
    content flagging tools marking legitimate content as
    illegitimate, in this case calling it spam.
https://techcrunch.com/2018/09/28/facebook-blocks-guardian-s...


Does FB have a team dedicated to responding on HN yet?


What is amazing to me is that twitter still does not have a solution for sharing mobile links to desktop users. It is a really bad user experience.


I realized the other day that wikipedia doesn't either. You'd think it wouldn't be all that hard to redirect in the year 2018, but apparently it is.


That is pretty Orwellian of them. Are we at 'peak' facebook, or will they be able to continue growing from here?

Seems like trust and enthusiasm in their platform has been eroded to the point that they are much more vulnerable to competition than they were a year or two ago.


Could you explain what exactly is Orwellian here? Not meaning to pick on you, but you're currently the top comment that doesn't attempt to charitably engage with the topic.

I don't see much evidence of anything other than a crap algorithm here as yet.


Orwellian is often used to describe deliberate and malicious censorship of information, but I think perhaps this case is an even more accurate parallel: creating a convoluted automated system where all participants can believe they are acting ethically but the net effect is censorship of information which would damage the system and restriction of individual expression.


Aaaand it's back (just tried to share it and it worked).


Power and dominance cannot accept criticism. All big companies engage in communication warfare over their own activities. FB just reacts as Monsanto or Texaco/Chevron do, with all possible means.


So Facebook is trying to censor a story that they themselves made public just a few hours ago? Guardian thinks a bit too much of itself.


Guardian didn't say censor, they said block, which is factual.

Nice strawman though.


So Facebook is trying to block a story that they themselves made public just a few hours ago?


Wow. This is too much!


It's astounding that educated tech journalists think this is malicious. Clearly, this is an algorithm.

Just as a thought experiment: wouldn't it make more sense to allow the post to go through and then prevent everyone on Facebook from seeing it? Obviously, FB doesn't do that, but wouldn't that be a more sensible approach if they did do that type of thing?


A spam-detection algorithm that doesn't whitelist pages like The Guardian? I have to say I find it hard to believe that they would be so incompetent.


Facebook can't be seen to be prioritising one news source over another.

So I can't imagine they have lists of any news sources and rely entirely on user signals to determine the quality of a source.


I second that.

An example: a few weeks ago I shared on my FB profile the Mozilla Foundation petition link for the EU copyright reform[1] and it was removed with this explanation:

"We removed this post because it looks like spam and doesn't follow our Community Standards."

[1] https://blog.mozilla.org/netpolicy/2018/09/07/eu-copyright-r...


A whitelist? Huh? What is this, the '90s?

I'm pretty sure the algorithms use something more intelligent than a whitelist. A whitelist would be a major loophole in a spam algorithm.


The evidence at hand casts doubt on the idea that the algorithm is very intelligent.
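A middle ground between a binary whitelist (a loophole) and no source signal at all (what the evidence suggests) would be a continuous domain reputation score that attenuates, rather than bypasses, the spam signal. A toy sketch, with entirely made-up scores and thresholds, not anything Facebook is known to use:

```python
# Toy illustration: a domain reputation score in [0, 1] scales the raw
# spam signal down, so reputable outlets are harder, but not impossible,
# to flag. This avoids the all-or-nothing loophole of a binary whitelist.
from urllib.parse import urlparse

# Hypothetical reputation scores; higher means more trusted.
DOMAIN_REPUTATION = {
    "theguardian.com": 0.95,
    "example-spam.biz": 0.05,
}

def spam_score(url, raw_signal, default_reputation=0.5):
    """Combine a raw spam signal in [0, 1] with the URL's domain reputation."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    reputation = DOMAIN_REPUTATION.get(domain, default_reputation)
    # Trusted domains attenuate the signal instead of bypassing the check.
    return raw_signal * (1.0 - reputation)

def is_spam(url, raw_signal, threshold=0.4):
    return spam_score(url, raw_signal) >= threshold
```

Under this scheme, a strong raw spam signal (e.g. a sudden share spike) would still clear the threshold for an unknown or low-reputation domain, while the same signal on a high-reputation domain would not.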


> It's astounding that educated tech journalists think this is malicious. Clearly, this is an algorithm.

Who's claiming this is malicious? In the first couple of replies to the tweet, there's talk about how this is probably due to a poorly designed anti-spam algorithm that's badly in need of a whitelist.

That said, Facebook is responsible for the actions of their algorithm. If their algorithm starts blocking critical articles, it's still egg on their face.


> Who's claiming this is malicious?

brianmcc, afandian, mrob, shortexpresso, cwkoss, golemiprague in this discussion at least


>>> > It's astounding that educated tech journalists think this is malicious. Clearly, this is an algorithm.

>> Who's claiming this is malicious? In the first couple of replies to the tweet, there's talk about how this is probably due to a poorly designed anti-spam algorithm that's badly in need of a whitelist.

> brianmcc, afandian, mrob, shortexpresso, cwkoss, golemiprague in this discussion at least

That's nonsense; those are just HN commenters. It's clear from the context of my comment that the "who" I was referring to was "educated tech journalists" and the people in the twitter thread.

The great-grandparent comment was basically criticizing a straw-man, and I was only pointing that out.



