
True, but this wraps back to the publisher-vs-platform debate. If they don't want the government forcing them to host content and uphold free speech, then they have to take responsibility for the content in the eyes of the law. Currently they're having it both ways: free to ban ideas they don't like and control the content they deliver without being responsible for libel/slander.


> If they don't want the government forcing them to host content and uphold free speech, then they have to take responsibility for the content in the eyes of the law. Currently they're having it both ways: free to ban ideas they don't like and control the content they deliver without being responsible for libel/slander.

It sounds like you're saying that they either have to host absolutely any video anyone uploads, or else stand behind every video as if they made it themselves.

I don't see how a video platform could exist on those terms.


That isn't what I said, nor what I meant. They are overstepping on the censorship front by continually constricting their "acceptable" guidelines based on their very fluid internal rules: "In 2018 alone, we made more than 30 policy updates. One of the most complex and constantly evolving areas we deal with is hate speech." They even state it as a point of pride - they continually shift the lines on what counts as "hate speech", and because of their market dominance, those they target are effectively shut down entirely just because the execs at YouTube/Google are intent on pushing their political viewpoints into company policy. I'm saying that this behavior shouldn't be acceptable for a company that has such a strong grip on societal discourse. Either curate, publish, and be accountable by law, or be a platform that removes only the videos that break the law and lets users filter out content they deem inappropriate.


I share your concern about how YouTube influences society, and it's likely that content I consider good will be taken down by YouTube in the future. I don't like that.

But it still sounds to me like you're demanding that, after removing illegal videos, a site should either be 100% curated or 0% curated.

https://www.tubefilter.com/2019/05/07/number-hours-video-upl... says:

> The platform’s users upload more than 500 hours of fresh video per minute, YouTube revealed at recent press events.

100% curation is unrealistic in that case. They could crowdsource the work, but it's hard to see how they can be legally accountable for what users mark as OK.

But 0% curation would mean that they're forced to host videos they consider repugnant. Imagine starting a video sharing platform, having your business grow, and one day being told "well now you're big enough that you no longer get any say in what your site hosts."


I see what you're saying, but it's already a problem for them. 100% of videos coming onto the platform will be subject to some kind of review either way. Instead, the discussion we're having is where they should draw the line on what gets removed. Is it videos that break the law? Or videos that they deem inappropriate? We can vote to change the law; we can't vote to change their policies. Users can go to a new platform, but there is nothing close to an equivalent competitor. (Name me one popular, full-time content creator who did it without YouTube.)

I don't like the sound of government interference at all, trust me. However, when people are being silenced and demonetized because they hold political views that YouTube doesn't like, I feel that it's necessary in order to uphold users' constitutional rights. Many experts have speculated that YouTube operates at a loss - is it fair that they gain market dominance this way and then flex their power to remove ideas they don't like?


> However, when people are being silenced and demonetized because they hold political views that YouTube doesn't like, I feel that it's necessary in order to uphold users' constitutional rights.

Users have no constitutional right to political speech on a private platform. Their rights are not being violated.


> I'm saying that this behavior shouldn't be acceptable for a company that has such a strong grip on societal discourse

YouTube doesn't have "a strong grip on societal discourse"; it's a glorified video-sharing site that's widely regarded as a cesspool in terms of discourse.



