Usenet, what have you become? (2012) (90percentofeverything.com)
63 points by pmoriarty on Nov 28, 2015 | hide | past | favorite | 57 comments


I am a former usenet user. I used it first for actual discussion threads (like HN!) but later for binary content. I tried it again a couple years ago and the commons are being abused. One example is password-protected binary content where getting the password requires going to a website, which of course requires additional actions like viewing an ad or signing up for a large number of mailing lists (with embedded referral codes). There are indexers that help find content that does not suffer from this problem, but they come and go.

I still would like to use usenet for that content I can't get on Netflix/Amazon/HBO/Redbox, but the effort required is too much -- I'll settle for watching something else. I also feel better about actually paying for the content I consume. I have my fingers crossed that we'll see more and more of the unstreamable content come to streaming providers.


There are Usenet indexers that filter out password-protected archives and provide well-organized listings of high-quality binaries.


> There are indexers that help find content that does not suffer from this problem but they come and go.


The fact is, it is now confusing for most end users. I tried explaining usenet the other day, and people just can't wrap their head around something other than "a web page" or "an app."

Not only is it a different protocol, but it is a different interface as well.

But I realize this post was meant more towards Usenet's intended audience. I know for me, I really started migrating away from Usenet around 2005-06, when the spam got so bad I could barely sift through my daily posts.


could you explain here?


Usenet (NNTP) is a protocol somewhat similar to email. Unlike email, the default communication style is in "groups" (sort of like a mailing list for a specific topic). One posts individual messages (like one email) to a group; replies are threaded, much like mailing list discussions. Historically the groups were organized somewhat hierarchically.

The end result, with a good news reader program, is somewhat like a large global collection of topic-specific forums.

Binary posts are just a bunch of posts to the same group with a message Subject matching some pattern (usually with ascending numbers counting the posts). The content is encoded to work around the limitations of the protocol and/or provider. Historically this was "uuencode", but you can also do base64 (like email attachments) and the new jazz (hah! New in 2003, anyway) is "yEnc." yEnc relies on 8-bit safe NNTP servers and only encodes the few characters not permissible in NNTP messages (periods at the beginning of lines, I forget the rest). So you get much better encode ratios (~101-102% of original size) than with something like base64 (133% of original size).
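The overhead numbers above can be made concrete. Here is a toy yEnc encoder/decoder sketch (it ignores the line wrapping and the =ybegin/=yend headers real posts carry; only the core byte transform is shown):

```python
def yenc_encode(data: bytes) -> bytes:
    """Shift each byte by 42 and escape the few NNTP-critical characters."""
    out = bytearray()
    for b in data:
        c = (b + 42) % 256
        if c in (0x00, 0x0A, 0x0D, 0x3D):  # NUL, LF, CR, '='
            out.append(0x3D)               # '=' escape marker
            c = (c + 64) % 256
        out.append(c)
    return bytes(out)

def yenc_decode(data: bytes) -> bytes:
    out = bytearray()
    it = iter(data)
    for c in it:
        if c == 0x3D:                      # escaped byte follows
            c = (next(it) - 64) % 256
        out.append((c - 42) % 256)
    return bytes(out)

payload = bytes(range(256)) * 16           # 4 KiB covering every byte value
encoded = yenc_encode(payload)
print(len(encoded) / len(payload))         # 1.015625: only escapes add overhead
```

Only 4 of the 256 byte values need escaping, which is where the ~101-102% encode ratio comes from, versus base64's fixed 4-bytes-per-3 (133%).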

NNTP providers, especially in binary groups, are subject to occasional message loss. So posters will typically generate Reed-Solomon forward-error-correction blocks using a program called "par2." This allows one to download binaries that are missing some messages, as well as some additional FEC (potentially on an as-needed basis), and rebuild the original content. Generating and recovering from par2 files is extremely computationally expensive.
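For reference, the typical workflow with the par2cmdline tool looks roughly like this (the -r redundancy flag is standard; exact output varies by version):

```shell
# Create recovery blocks with ~10% redundancy alongside the archive parts
par2 create -r10 backup.par2 backup.part*.rar

# After downloading, check whether any articles were lost
par2 verify backup.par2

# If parts are damaged or missing, rebuild them from the recovery blocks
par2 repair backup.par2
```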


Follow-up:

> One posts individual messages (like one email) to a group

Messages can be posted to more than one group; this is called "cross-posting", much like with email mailing lists.

> The end result, with a good news reader program, is somewhat like a large global collection of topic-specific forums.

You can also choose to "subscribe" to a subset of the available groups; your news program is able to pull metadata only for that subset you care about. You can also download individual messages on demand, much like IMAP email access. It's fairly bandwidth efficient for clients.
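The "metadata only" part refers to the NNTP overview (XOVER/OVER) command, which returns one compact tab-separated line per article. A small parser for that standard field layout (field order per RFC 2980/3977; the sample line and its values are made up):

```python
from typing import NamedTuple

class Overview(NamedTuple):
    number: int
    subject: str
    sender: str
    date: str
    message_id: str
    references: str
    byte_count: int
    line_count: int

def parse_xover_line(line: str) -> Overview:
    """Split one XOVER response line into its standard overview fields."""
    num, subj, sender, date, msgid, refs, size, lines = line.split("\t")[:8]
    return Overview(int(num), subj, sender, date, msgid, refs,
                    int(size), int(lines))

sample = ("1234\tRe: yEnc overhead\talice@example.org\t"
          "28 Nov 2015 12:00:00 GMT\t<abc@example>\t<parent@example>\t2048\t40")
ov = parse_xover_line(sample)
print(ov.subject, ov.byte_count)  # Re: yEnc overhead 2048
```

A client fetches these overview lines for subscribed groups to build its thread view, and only pulls full article bodies on demand, which is why it's so bandwidth efficient.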


Follow-up: unlike email, the groups are publicly viewable, whereas with email you need a username and password.


A surprising number of people aren't able to separate the concepts of "the internet" and "the world wide web". When you try to explain that USENET is a whole separate entity from the web, running alongside it on the Internet, their eyes tend to glaze over.


More than anything else, that's because most people don't care about the fussy details of the term "world wide web".

"It's online but not a web page" won't glaze over many eyes.


They understood napster just fine.


Distributed (reddit + stackoverflow), text-only, without tags, search, or any 'new'/'top' summaries.

Posts were (at least after https://en.m.wikipedia.org/wiki/Great_Renaming) organized hierarchically, with the hierarchy created by consensus/voting.


Usenet is also usable for backups. Normally you would upload a huge encrypted RAR.

Some time ago I built a tool that stores files/folders incrementally on Usenet with encryption, parity, etc. I put a lot of effort into it. You could restore a directory tree from a unique ID (short enough to write on a piece of paper) that securely resolves to a chain linking to meta/raw blocks via Message-IDs, and you could mount the whole thing with OSXFuse. The ID stayed reusable after updates to the tree, so incremental backups worked without issuing a new ID.
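The chaining idea described above can be sketched as a toy model, with an in-memory dict standing in for the NNTP server and all names hypothetical (the real tool also kept a stable user-facing ID resolving to the latest head; here each snapshot returns a new head and links back to the previous one):

```python
import hashlib
import json

articles = {}  # Message-ID -> body; stands in for the NNTP server

def post(body: bytes) -> str:
    """'Post' a block and return a Message-ID derived from its content."""
    msg_id = "<%s@backup.example>" % hashlib.sha256(body).hexdigest()[:16]
    articles[msg_id] = body
    return msg_id

def store_tree(files, prev_head=None) -> str:
    """Store file blocks, then a meta block linking them and the prior snapshot."""
    meta = {
        "files": {name: post(data) for name, data in files.items()},
        "prev": prev_head,  # chain link: enables incremental snapshots
    }
    return post(json.dumps(meta).encode())

def restore_tree(head: str) -> dict:
    """Resolve a head Message-ID back into a {name: bytes} tree."""
    meta = json.loads(articles[head])
    return {name: articles[mid] for name, mid in meta["files"].items()}

head1 = store_tree({"notes.txt": b"v1"})
head2 = store_tree({"notes.txt": b"v2"}, prev_head=head1)
print(restore_tree(head2)["notes.txt"])  # b'v2'
```

In the real setting, post() would upload a yEnc-encoded article and the server-assigned (or client-chosen) Message-ID would be the block address; encryption and parity would wrap the block bodies.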

I thought this would be a nice alternative use for binaries on Usenet instead of piracy stuff. But I never released it because I think that it would lead to pollution of the Usenet network.

Is anyone interested in this? Or maybe somebody has an idea on how to use this without polluting the network? Would love to hear some thoughts about this!


I wonder if you could convince cperciva (Tarsnap) to exploit Usenet as an encrypted block store (as a cheaper alternative to S3).

> I thought this would be a nice alternative use for binaries on Usenet instead of piracy stuff. But I never released it because I think that it would lead to pollution of the Usenet network.

> Is anyone interested in this? Or maybe somebody has an idea on how to use this without polluting the network?

No way around it, it's absolutely an abuse of the network. That being said, so is the piracy on the binary subgroups. I think the end use would be small enough to not materially affect the binary NNTP hosts anyway.


I wonder if you could convince cperciva (Tarsnap) to exploit Usenet as an encrypted block store (as a cheaper alternative to S3).

I suspect that if Tarsnap was over half of Usenet's traffic volume, it would start getting filtered.


Not sure about being the middleman yet. What if the big Usenet providers decide to delete data or we violate their non-business-usage policies? But on the other hand it's really cheap indeed! We can cancel our Usenet subscription and renew it when the data is needed. So I thought it's maybe better to cut out the middleman by giving the software away.

About the abuse: I imagined that the method could be used like some unlimited-disk-space-providers which had to get rid of that plan because the users used it as advertised.

Of course, apart from that, anyone can write such software. So it's maybe just a matter of time? Though I couldn't find any other solutions besides RAR archives.

The Usenet can be used as a key-value store with handicaps. And stuff can be built on top of that.


I don't understand how you could use newsgroups for storage, except in a "I don't care about data loss" kind of way. I haven't used USENET in decades, but NNTP is a messaging protocol; it says nothing about storage, AFAIK. I do recall that back in the day, messages in busy newsgroups would expire rather quickly (probably due to limited storage on my news server at the time). So if I post a message containing my backup to a binary newsgroup, what assurance do I have that I will be able to get it back?


Commercial binary NNTP providers have basically infinite retention at this point, if your content doesn't generate DMCA takedown requests. E.g. http://www.news.astraweb.com/ $10/mo gets you 2660 days (>7 years) of retention, growing at roughly one day per day. (Four years ago, 3 years of retention was common.) There are also pay-by-download-in-GB a la carte plans that would be good for backup-only use (because upload is free).

Even if it isn't infinite, you can download and repost every 7 years if you care about having backups for longer than that.


Well, one could run their own NNTP server.


> You get one thing – [release day availability]

I'd bet it's not zero days, but one-stop shopping. So the consumer saves the overhead of wondering, "wait, is this on Hulu? Netflix? Amazon? iPlayer?" The entire catalog is in one place.

I think Netflix and Amazon are getting to the point where their catalogs are deeper than usenet.

Of course, most film junkie's "essential 100" lists will still probably hit somewhere less than 10% representation on Netflix: http://blogs.indiewire.com/shadowandact/films-on-spike-lees-...

Still an availability problem though.


Netflix (and probably Amazon too -- I have Prime but the UI for the video service is so bad I normally don't bother with it) is falling into the cable model of hundreds of channels but nothing to watch. More often than not I will spend 15-20 minutes browsing Netflix to find something interesting that I haven't already watched, and end up just switching it off because there's not much there.


It's the same as opening the fridge, finding nothing you want, and going back a bit later with lower standards.

It's not that there is nothing there; your standards just need adjusting before you can appreciate what is there.

I guess the same can be applied to cable, but you have to really lower those standards to appreciate dozens of home shopping networks, reality TV and infomercials.


Too true.

They used to run competitions to improve their recommendation engine, because it was a big cost for a customer to get shipped a lemon.

They stopped that, I think they're running a different game now, where they just juice the expected rating to convince people to stream stuff that's really just ok.


> I think Netflix and Amazon are getting to the point where their catalogs are deeper than usenet.

In the US, maybe. Even in the UK, the offering is not even close, and in the rest of Europe, well, tough luck.

Then you have the choice of paying for a vpn and a netflix, and -maybe- you get lucky. Or you start wondering why would you send money to people that do not want you to see their stuff at any price.


One-stop shopping and the legal "protection" of not appearing on a bittorrent peer list. (It's still illegal, but less detectable.)

The media companies have largely caught on to Usenet and have been issuing automated DMCAs on Usenet content immediately since the early 2010s. Since the posts are stored on a centralized server, the providers can and do comply with DMCA takedowns with automated software. As a result, popular TV shows and movies posted under their correct titles are usually taken down quickly.


There's still "eternal-september.org", which offers the non-binary USENET groups. The "comp.lang" hierarchy is still widely used, and the main discussions of changes to the C and C++ standard are still on USENET.

There is almost no spam.


One of those web-server setup fails again. The site is only accessible via the www subdomain: http://www.eternal-september.org/


No, no, "news.eternal-september.org", using NNTP.


http://eternal-september.org works fine in my browser.


Not in Chrome. I just get "ERR_NAME_NOT_RESOLVED".


In the past 3 years on usenet, something has changed though. Perhaps in response to this article and/or the others talking about it.

Some studios are getting drastically faster at having their content pulled, within minutes of NZBs appearing on the index sites. It's not as if it's difficult to scan the groups and topics and issue takedowns to the NSPs.


I got 99 problems, but missing articles ain't one.

Get cheap block accounts from various providers, ideally one from each major reseller, configure them as backup/fill servers.

http://i.imgur.com/w8sklP2.png


I think what changed was it achieved enough popularity for rights owners to notice it. Due to being (relative to say bittorrent) complex to set up, it skated by on obscurity.


> within minutes of NZBs appearing on the index sites.

I was surprised nzb sites didn't get attacked more often. Posting an index before the post it indexed could have let an attacker grab the index and then upload a bunch of junk to poison it. If they're posting little 1 MB files and you're posting 20 MB split RARs, their posts are going to propagate faster.


This. Some shows seem impossible to get via usenet and I end up looking for torrents anyway. (Also some shows seem not popular enough, so again torrents to the rescue). Which still is a major PITA as I never looked into something like SickBeard but for torrents. Any recommendations?


Yeah, some networks (HBO) are extremely vigilant. If you don't have some kind of always-on device scanning for releases you'll be too late.


You might find there's a SickBeard fork for torrents called SickRage...


Just as with torrents, the public trackers will always have this issue. Private ones tend to have less of a problem.


What is the usenet equivalent of a private tracker? The problem with "private" indexers is the content they are indexing (the actual encoded articles) are still public.


There are private NZB sites where the NZBs themselves are not indicative of the content -- hashed titles, with password-protected files. I work for a Usenet provider - it's still very much alive both for discussions and for file sharing. You just have to be very savvy.


$20-35 US Dollars a month for Giganews usenet

Can be found cheaper though, like 1/4th of that price, if you're happy with less retention (1600 days instead of 4000 or so) and a download speed limit (never found this a problem, background downloading during the night or day still means you can watch shows same day or day after if you're into that).


I was a loyal giganews user for 10+ years. Then the content creators started getting content pulled very quickly, sometimes within days of upload, and the whole thing started becoming a huge hassle.

By this point I was making enough money that $30 a month split between a few streaming services and a few bucks occasionally for a movie on amazon prime wasn't a big deal. It feels good to pay for quality. But as a young kid, who wanted to experience so many types of media and no means to pay, usenet was the best.

Piracy is the perfect thing for kids interested in computers to use. It's complex, you need to figure stuff out, takes a lot of time (which kids have), and the reward for doing so is free content. But once you get old and are time-poor and have the money, it's not really needed anymore.


Piracy still has the better catalog; until not long ago, you couldn't even watch Star Wars online.


    telnet towel.blinkenlights.nl


That's very interesting. I wonder if, three years later, there would be a script to set this all up on a Raspberry Pi to run automatically.

(If not, I wonder if I should? Sounds at least like a fun weekend project.)

In any case I think Popcorn Time still works well, and if not, there's always regular torrenting.


A script like that would be completely feasible, but wouldn't it make the whole thing more popular? Isn't the difficulty of setting it up what keeps it under the radar?



FWIW, nzbget (http://nzbget.net) is considered better than SABnzbd nowadays.


SABnzbd is so slow I don't know why anyone would want to use it


Are there browser extensions?


So, today's desktops and high-speed internet connections are more than enough to run personal Usenet servers instead of subscribing to somebody else's (or having your ISP shut theirs down). Anybody know what it would take to do this for real?

(after all I remember having access to full Usenet servers being served off ancient computers hooked into a T-1)


Sure, if you have the bandwidth and hard drive space to mirror 25+ TB of new content a day...


Text only should be doable. All of Reddit is less than a terabyte.


Having the server is easy. Getting a peering feed is not, as that involves them sending you all the news.


I don't agree with the claims that Usenet is dead, or that it isn't used for actual discussion anymore. I am subscribed to many newsgroups and most of them are very active. They work in all regards like mailing lists: you will find support lists for programming languages or software, announcement lists, and discussion groups about humanities, politics and many other subjects you might be interested in. People who ask questions actually get answers.

There is spam, yes, but almost all of it is in newsgroups that receive no messages other than spam, and there is no point in subscribing to these newsgroups in the first place. You can filter the rest away with SpamAssassin or Bogofilter.

You can get free access from news.solani.org or www.eternal-september.org to the non-binary newsgroups, where all the interesting discussion happens. Any client like Gnus or Thunderbird can do.


There are now a lot of encrypted or hashed uploads. And to get the content you need to pay for an indexer.


Cost per month:

- $10 for usenet access with SSL support
- $3 for VPN anonymity
- $100 for 150 Mbps internet access
- $15 for electricity to run a 24/7 box

Total: $128 per month to have things literally on demand.

How much do you pay your cable or satellite company?

EDIT:

I should say that the box itself does cost money, but it can usually be pieced together from old equipment.



