
It's not happening in a meaningful way. The technology for a decentralized web is already here -- it's just the normal web. Our artificial legal barriers are what keep it centralized, and those will be an issue regardless of technical innovation.

Legalize scraping and fix copyright law so that users can truly assert ownership over the content they generate, and this will quickly become a non-issue.

P2P technology is cool and it has its uses; I'm even developing a decentralized distribution thing as a side project. But it is more work, which means that in the typical web-browsing use case it is slower and less convenient than a conventional 1:1 conversation with a stable endpoint.

Suggesting that everyone introduce four hops of latency or that we all participate in a multi-billion device DHT (both of which just translate to "much slower" for the typical user) is just not a practical solution to these problems.

The good and correct solution is to look to the root cause and fix it. That root cause is the incentive structure, including legal and financial arrangements, that heavily encourages the "AOLization" of the web and allows the AOLizers to use the courts to clobber the hackers that try to re-liberate it.



> Suggesting that everyone introduce four hops of latency or that we all participate in a multi-billion device DHT (both of which just translate to "much slower" for the typical user) is just not a practical solution to these problems.

DHTs can actually scale really well.

But systems don't have to be fully distributed to be less centralized. For example DNS stakes out a strong middle ground.

The problem with e.g. Facebook is that it's a closed system. You can't "apt-get install" a Facebook daemon and run your own Facebook server whose users can still talk to the people using facebook.com.

Part of the reason for that is the law but most of it is just network effects. Most people don't use GNU social because most people don't use GNU social.
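If the protocol were open, the missing piece would be mundane. Here's a rough sketch, purely hypothetical and only loosely in the spirit of federated protocols like ActivityPub, of what the server-to-server side of such a daemon could look like (the path and payload shape are invented for illustration, not anything facebook.com exposes):

    # A minimal sketch of the missing "Facebook daemon": a self-hosted inbox that
    # any other server could deliver posts to, the way mail servers deliver mail.
    # The path and payload fields are hypothetical, not any real service's API.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class InboxHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path != "/inbox":
                self.send_error(404)
                return
            length = int(self.headers.get("Content-Length", 0))
            post = json.loads(self.rfile.read(length))
            # A real federated server would verify the sender and show this in
            # the local user's feed, regardless of which operator the author uses.
            print(f"post from {post.get('author')}: {post.get('content')}")
            self.send_response(202)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), InboxHandler).serve_forever()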


>DHTs can actually scale really well.

Sure, DHTs are efficient, but they're not cost-free. You are still introducing a lot of unreliable, inconsistent, and slow hosts into the mix, and potentially having to traverse many of them to get access to the entirety of the content you're seeking. This is not a pleasant user experience, without even getting into the significant privacy and security tradeoffs, which can only be mitigated by making the system do more hops, more crypto, more obfuscation (which means, slower still). Octopus DHT is cool, but it is by no means a speed demon.
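To make the hop cost concrete, here is a toy Chord-style lookup sketch, not Octopus or any production DHT; the node count, ID size, and routing-table construction are invented for illustration. Each loop iteration below stands in for a sequential round-trip to some stranger's machine:

    # Toy Chord-style ring: each node keeps O(log n) "fingers", and a lookup
    # hops greedily toward the key. In a real network every hop is a round-trip
    # to an unreliable host you don't control.
    import bisect
    import random

    ID_BITS = 32
    RING = 2 ** ID_BITS
    NUM_NODES = 20_000

    node_ids = sorted(random.sample(range(RING), NUM_NODES))

    def successor(key):
        # The node responsible for `key`: first node at or after it on the ring.
        return node_ids[bisect.bisect_left(node_ids, key) % NUM_NODES]

    def fingers(n):
        # Finger table: the successor of n + 2^k for each k, i.e. ~log(n) peers.
        return [successor((n + 2 ** k) % RING) for k in range(ID_BITS)]

    finger_tables = {n: fingers(n) for n in node_ids}  # slow to build; it's a toy

    def lookup(start, key):
        # Greedy routing: at each hop, jump to the known peer that gets closest
        # to `key` without passing it. One loop iteration = one round-trip.
        current, hops = start, 0
        while True:
            best = min([current] + finger_tables[current],
                       key=lambda f: (key - f) % RING)
            if best == current:
                # current is the key's closest predecessor; its immediate
                # successor (finger 0) owns the key -- one final hop.
                return finger_tables[current][0], hops + 1
            current, hops = best, hops + 1

    owner, hops = lookup(node_ids[0], random.getrandbits(ID_BITS))
    print(f"{NUM_NODES} nodes: resolved one key in {hops} sequential hops")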

>But systems don't have to be fully distributed to be less centralized. For example DNS stakes out a strong middle ground.

There isn't an obviously-better "less-centralized" solution for DNS as far as I know. See Zooko's Triangle. [0]

>The problem with e.g. Facebook is that it's a closed system. You can't "apt-get install" a Facebook daemon and run your own Facebook server whose users can still talk to the people using facebook.com.

Saying "it's a closed system" is forfeiting the point. I can send packets to facebook.com and then turn around and send packets to any other destination. Why can I not send packets in such a sequence that the packets obtained from Facebook are then transmitted to some other place that makes it more convenient to use them? Because if I do that, Facebook will sue the crap out of me, as they've done to others. There is no real technical barrier preventing this, it's purely legal.

Facebook, Google, et al are not in their position just due to network effects. They've both sued small companies because they know that if people can get the same data through competing interfaces or clients, and it becomes simple and easy to multiplex the streams and move the content around, the consumer won't need their company specifically anymore. They'll be relegated to replaceable backend widgets. That's their nightmare!

Facebook and Google are middlemen, brokers between what the user really wants and the people who are providing it. They are terrified of a world where their brokerage is unneeded, and they work hard to make sure that you don't realize it.

Twitter had the same realization about multiplexed streams, leading to their infamous crippling of third-party clients. Craigslist had the same realization in their brutal about-face on PadMapper, once they noticed it posed a serious threat to their business. The entity that controls the user's attention controls the game.

It is at this point practically illegal to use a third-party exporter to read out and easily transfer the content from your Facebook page to another site. Even if it's 100% original content that you own completely from a copyright perspective, you can't run a program to read it out because the Copyright Act has been interpreted to mean that loading someone's HTML into your computer's memory could be an act of copyright infringement (this is called "the RAM Copy doctrine").

It's also usually illegal to download that page with an unapproved browsing device such as a crawler or a scraper; this is exceeding authorized access under the Computer Fraud and Abuse Act. You agree to all of this when you agree to the site's Terms of Service, but your agreement is not necessarily needed for these provisions to be effective.

Why is there no easy "Try NewFace.com Services, We'll Copy Your Friend List, Post History, and Photo Albums right over!"? Because you'll get sued and left owing Facebook $3 million if you try to do that. [1] :)

Once you throw something into the Google or Facebook black hole, they make it very difficult to pull it back out again. That's not an accident, and it's naive to just attribute it all to organic "network effects". The competition is dead not because no one else wants to compete for these users, but because they'll be sued to death if they do it in a way that's accessible to the mass market.

[Note: I know that both Google and Facebook have buried, deep in the innards of the user configuration, a mechanism that allows you to request the generation of a crudely-formatted, multi-volume zip archive representing some or all of your account data, and that you can receive an email some hours later delivering this data in chunks. This is not a practical way to move data for most people, because even _if_ you get someone to go through all this pain, the amount of time it takes to build, process, collect, and upload these archives ensures it is essentially a one-way thing. It can and should be a free-flowing exchange of information, which the internet can already easily facilitate. The only barriers are artificial, legal barriers.]

[0] https://en.wikipedia.org/wiki/Zooko%27s_triangle

[1] https://en.wikipedia.org/wiki/Facebook,_Inc._v._Power_Ventur....


> fix copyright law so that users can truly assert ownership over the content they generate

I agree copyright needs to be fixed, but the issue isn't user-generated content. Essentially, all that users do is collaborative filtering.

The real work needs to be done to break up content 'owning' middlemen that form monopolies and use them to create walled gardens. Maybe a solution would be to treat the walled garden itself as an anti-trust violation.


Anti-trust is a kludge that proves something went awry. We shouldn't have to go through a manual process of breaking up the power brokers, especially since government and big corporations frequently get into bed together. We should design systems and processes that are self-healing -- systems that allow enough natural competition that a monopoly virtually never emerges in the first place.

It's like a computer system. Yes, if you write corrupted data back to the database, you can manually go in there and do your best to rectify it, restoring pieces from backups, etc. But the strong emphasis should be on designing the system such that corruption is a near-impossibility.

Copyright and network protections are so excessive and far-reaching, it's no wonder we're seeing this re-centralization.


Anti-trust is needed. Power and influence breed more of themselves. You need regulation to keep free markets free; laissez-faire will lead to power concentrating into monopolies. Decent anti-trust regulation, combined with laws preventing anti-competitive behavior, is what's needed. Granted, the anti-competitive part is also important.



