Before the Internet and social media, groups had a practical size cap because they had to meet up in person. Polarization was naturally limited.
I don't think the social media companies' algorithms are entirely to blame. The broader problem is centralized moderation of public online spaces.
Moderation of public behavior in physical spaces was only necessary because it wasn't possible to selectively filter people's influence on each other in public. If someone is doing something you don't want to see in public, covering your eyes isn't good enough, because you also block out the people you do want to see. Centralized moderation was a practical half-measure, not an ideal solution, for a democratic society that values free expression and self-determination.
That kind of moderation isn't necessary online because all filters can be implemented client-side. We just aren't doing it because people are so used to the old way. But the old way will naturally lead to more and worse conflict when we have infinite connectivity.
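To make that concrete, here is a minimal sketch of client-side filtering in TypeScript. The types and names are made up for illustration; the point is that the server sends everything, and each user's own rules decide what gets hidden, without removing anything from anyone else's view.

    // Hypothetical post type; in practice this is whatever the feed API returns.
    interface Post {
      author: string;
      text: string;
    }

    // A rule returns true to keep a post, false to hide it, for this user only.
    type FilterRule = (post: Post) => boolean;

    // Apply every rule locally; nothing is deleted upstream.
    function applyFilters(feed: Post[], rules: FilterRule[]): Post[] {
      return feed.filter(post => rules.every(rule => rule(post)));
    }

    // Example: I mute one author; your client, running different rules,
    // still sees their posts in the same shared feed.
    const feed: Post[] = [
      { author: "noisy_account", text: "something I don't want to see" },
      { author: "friend", text: "something I do want to see" },
    ];
    const muted = new Set(["noisy_account"]);
    const myRules: FilterRule[] = [post => !muted.has(post.author)];
    console.log(applyFilters(feed, myRules)); // only the friend's post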
Yes. Indulging the desire to boss other people around (and the hubris of thinking one knows better) was fine before social media, because it was hard to have a say over what others heard at scale. People congregated into groups that moderated speech internally, but those groups rarely got large enough to influence nation-states. Democracy worked because those small groups were independently-thinking entities despite being internally homogeneous, and individuals from different groups could still talk to each other intelligently because the inter-group rifts weren't that large. Many small groups existed and could debate each other to generate effective policies in a democratic society.
Now there are big social media mobs. The number of independently-thinking entities has gone down drastically over the past two decades. We're ranking, filtering, and moderating ourselves into authoritarian governments run by Internet echo chamber mobs.
Government-moderated echo chambers existed long before the Internet. In fact, I think they're the default throughout history. But business has refined that manipulation (through specific technologies, products, and services) into a science: big data and business analytics applied to the problem of manipulating public opinion.
Government propaganda is used to rile people up so that they're willing to kill each other in war. That is indeed the default throughout history, but I don't think it's a good default, and we could change it by making, and then keeping, the Internet a free place.
It doesn't have to be government propaganda, though. It can be everyone with enough money to broadcast, and in the Internet age that is basically everyone.
We've spent most of the 20th century honing propaganda techniques to the point where their potency is like a nuke compared to a stick of dynamite. And then we've spent most of the 21st century so far making that nuke cheap and easily available to everyone.
I'm very pro-free-speech in general, and I don't think censorship is a solution, but to argue for free speech in good faith we have to acknowledge the problem. The reason people want the government to censor is, in many cases, the same reason they want the government to crack down on someone building a nuke in their garage (while being okay with the government itself having that same nuke).
Speech is not comparable to nukes. Not even close.
I don't want everyone building nukes in their garage because nukes kill people en masse in an instant, and I cannot counteract the effects of someone else's nuke by setting off my own nuke.
People want the government to do their work for them. But you can't expect someone else to advocate for your interests automatically. The only relationship where that happens reliably on a regular basis is that of a baby and a mother. The government is not your mother. The government's policies are the result of the intellectual output of its citizens in a democratic society. You are supposed to control it as a citizen, to serve the interest of yourself and other citizens. It is not an entity that you can delegate decision making to and expect no consequences down the line.
If you think someone else's ideas are wrong, put forward your own. I don't think we should ban propaganda, as that would require a central authority to determine what information is and isn't propaganda, creating propaganda in the process. I think we should ban the mechanisms that encourage the formation and snowballing of intellectually/ideologically homogeneous groups on the Internet, making it a hostile environment for propaganda and similar heuristics in general, so that the best ideas can thrive in dynamic competition.
I don't understand what you mean by your last paragraphs, but I disagree that moderation is centralized. On all the social media I'm aware of, moderation is mostly distributed and weighted. Flagged Instagram posts go to "a team" for review, but that's weighed against the account and the content. Reddit moderation is per-subreddit. Facebook moderation is a combination of group moderators and site moderators.
In any case, I don't think moderation is the issue, because moderation is for commenters. Something like 90% of people just lurk. And that's where ads, influencers, comments, and everything else are targeted.
How can we get more people viewing TikToks or YouTube Shorts? How can we get more people to subscribe to our Patreon? How can we get more followers, or likes? This has nothing to do with moderation. It's the math of "what can we post that will make people engage with their eyeballs and their clicks?" That's what matters: partly because eyeballs equal ad dollars, but also because eyeballs equal influence. The science of manipulation is getting you to see what I want you to see, and getting you to come back for more. The more you come back, the more you're part of my in-group.
Another example is astroturfing. I can't remember whether it started for commercial or political gain, but the point is the same: post some fake shit to make people believe there's a genuine grassroots opinion, so they'll get behind it. Whether you're Vladimir Putin or DuPont, you benefit the same way: manipulation of public perception through the science of social media disinformation.
I don't mean there's a single centralized moderating authority overseeing everything, but rather that the tools and mechanisms used for moderation on the Internet are centralized and undemocratic. They produce groups with authoritarian power structures and norms. Those groups grow over time with no limit on their size, and when they get large enough, they fight over which one gets to run a country. This is how modern democratic countries can turn authoritarian very fast, and it's already happening.
When I say "group", I don't mean an actual Facebook group or subreddit (though it could be one). I mean a group of intellectually/ideologically homogeneous people, possibly distributed across many subreddits and comment sections. Forums/subreddits/servers can be separate entities in form but not in substance: two Discord servers that moderate content the same way are the same group in this context.
Moderation is not just for posters. It also affects lurking viewers because it changes what they will see. If a post is deleted by a moderator, then that moderator has decided for the viewers what they can and cannot see.
Up/down voting (aka likes/dislikes) is a hidden form of moderation as well. People's likes and dislikes decide what other people are likely to see, because upvoted posts rise to the top of the feed. Recommendation and ranking algorithms do the same thing.
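For anyone who doubts that votes are moderation, here is a rough TypeScript sketch of the kind of vote-weighted ranking these sites use. The exact formulas vary and are mostly unpublished, so this one is only illustrative; the key property is that one global score decides the ordering every viewer gets.

    interface RankedPost {
      title: string;
      upvotes: number;
      downvotes: number;
      ageHours: number;
    }

    // Illustrative "hot" score: net votes decayed by age. The constants are
    // arbitrary here; real sites tune their own variants.
    function hotScore(p: RankedPost): number {
      const net = p.upvotes - p.downvotes;
      return net / Math.pow(p.ageHours + 2, 1.5);
    }

    // Everyone who loads the page gets this same ordering, so aggregate
    // votes have effectively moderated what every lurker sees first.
    function rankFeed(posts: RankedPost[]): RankedPost[] {
      return [...posts].sort((a, b) => hotScore(b) - hotScore(a));
    }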
I'm not making a statement about who has nobler goals, be it the ad companies or Putin or the US government or the people here on HN. I'm saying that the concept of centralized moderation on the Internet is itself the problem. Regardless of whose goals these tools serve, they're bad because they coagulate people into intellectually and ideologically homogeneous groups, and there is no limit on group size given the practically infinite connectivity of the Internet. That will create nasty real-world consequences in the long run. But we can defuse it by moving all moderation and ranking to the client side.
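To sketch what that could look like (hypothetical, not any existing API): the same ranking runs in the client, with a scoring function the user picks.

    // Client-side ranking: the server just returns posts; the ordering is
    // computed locally with whatever score function I choose.
    function rankLocally<T>(posts: T[], score: (post: T) => number): T[] {
      return [...posts].sort((a, b) => score(b) - score(a));
    }

    // I could rank by recency, by authors I trust, or ignore votes entirely;
    // none of those choices changes what anyone else sees.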