Hacker News | zafiro17's comments

The nightmare is over? I've used Usenet for years. Most serious users filter out any post originating from G2 to do away with Google posts. It's been a huge source of spam and annoyance.

Usenet gets better as it gets smaller, honestly. Most groups are carcasses these days, but several have regular use. comp.misc, misc.news.internet.discuss, and sci.misc are active. Several groups in the comp.* hierarchy are active. Thunderbird, claws-mail, sylpheed, tin, slrn, alpine, neomutt all handle Usenet just fine. Visit http://newsgroupreviews.com for a list of providers, some free, some paid.


> Most groups are carcasses these days, but several have regular use. comp.misc, misc.news.internet.discuss, and sci.misc are active.

misc.news.internet.discuss and sci.misc are basically a single person posting links without any discussion.


Is there a broader tutorial for this? Newsgroups mostly missed me; I don't really grok how they fit into the greater internet, how their hosting works, or why you have to pay a separate provider.


These two wikipedia articles are a start:

https://en.wikipedia.org/wiki/Usenet

https://en.wikipedia.org/wiki/Nntp

As for "why pay a separate provider" -- that would be because circa 2000 most ISPs stopped offering Usenet access to their customers as part of the subscription, which created a market for the paid providers.

But, you don't /have/ to pay:

https://greycoder.com/best-free-usenet-servers/


In the dark ages when internet access was limited and/or via dialup for most of us, there were several tools and protocols to work with what we had: email, ftp and usenet were some of the main ones. Usenet was basically a primordial distributed forum platform (and quasi-social network) where copies of the data were stored locally. It was how a lot of information got disseminated prior to the mid-90's, when the web took off and then took over.


IMO an analogy to modern times would be a federated Reddit. I'd liken it to Lemmy but don't have enough experience with it to be sure it doesn't ruin my analogy in some way :)


It's probably better, on a technical level, than any of the fediverse stuff. I understand that Usenet works as a gossip protocol with nodes sharing everything they have (and haven't filtered) while Fediverse usually relies on directly contacting the origin server over IP. The integration between different instances on Mastodon and Lemmy is nothing like Usenet. The integration in the Fediverse is limited to following users, liking posts and replying to posts on other servers, using your federated identity on your server.


Update: I actually used Usenet a bit yesterday, mostly reading administrators explaining how the system works, and deleting spam. It's a cool system, I guess.

It should be possible to have other things built on top of the protocol. The protocol is a store-and-forward broadcast system, and is mostly agnostic to what goes in the packets, although if you want to add a group, each server operator has to agree to carry it, and operators can impose server-specific restrictions on things like message sizes, dropping messages outside the limits, so delivery isn't guaranteed. If you wanted to make a Usenet group simulating Reddit, you could easily pass votes through the system. Clients would have to interpret them, so you'd have to write your own client.
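To make the vote idea concrete: since servers relay articles without caring what's in them, a custom client could stuff votes into an invented header and tally them itself. A minimal sketch -- `X-Vote` is a made-up header, not any standard, and the addresses are hypothetical:

```python
import email
from collections import Counter

# Votes ride along as ordinary Usenet articles. A stock server would
# relay them untouched; only a vote-aware client interprets them.
raw_articles = [
    "From: alice@example.org\nReferences: <post1@example.org>\nX-Vote: +1\n\n",
    "From: bob@example.org\nReferences: <post1@example.org>\nX-Vote: +1\n\n",
    "From: carol@example.org\nReferences: <post1@example.org>\nX-Vote: -1\n\n",
]

def tally_votes(articles):
    """Sum X-Vote headers, keyed by the Message-ID being voted on."""
    scores = Counter()
    for raw in articles:
        msg = email.message_from_string(raw)
        target = msg.get("References", "").strip()
        vote = msg.get("X-Vote")
        if target and vote in ("+1", "-1"):
            scores[target] += int(vote)
    return dict(scores)

print(tally_votes(raw_articles))  # {'<post1@example.org>': 1}
```

A real client would also need to dedupe by voter and cope with forged headers, which is exactly where the "you'd have to write your own client" part gets hard.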

It's tolerant of spam as long as it isn't extremely excessive. There was talk of de-peering Google due to a large volume of spam, before Google made the decision itself, but most groups are about half spam besides that. The low volume of actual users contributes to this.

There's a group, I think news.admin.peering, for requesting your own server to be connected to the network. I see many successful requests from various people in the group's history. The network is fully decentralized, so being connected is reliant on at least one other person being willing to connect to you.


Also where the (outdated) idea that "the internet never forgets anything" came from.


There was a period of time where the notion that everything would be archived was unfathomable to the average user. Sure, in theory it could be. But storage was expensive and the value of storing that stuff was low.

Unfortunately, when DejaNews came out it became clear that at least one person out there recorded everything. Many an embarrassing post of mine was unearthed. When one has tens of thousands of posts and is young & stupid, that was bound to happen. Of course now all these years later I can barely find any of my posts on the internet, embarrassing or otherwise.


>I don't really grok how they fit into the greater internet or how their hosting works or why you have to pay a separate provider

In much the same way that you need a specialized email client to use IMAP/SMTP, you need a specialized client that speaks NNTP to access usenet.

The providers are heterogeneous but usually are basically beefy servers connected to fat pipes to handle the massive volume of traffic, which is why you typically need to pay for access.
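For a sense of what that specialized client actually does, the NNTP conversation is a simple line-based dialog, roughly like this (server responses paraphrased; exact numbers and article IDs are made up):

```
C: GROUP comp.misc
S: 211 1234 3000 4233 comp.misc      (1234 articles, numbered 3000-4233)
C: ARTICLE 4233
S: 220 4233 <some-id@example.net> article follows
S: ...headers, a blank line, the body, then a lone "." line...
C: QUIT
S: 205 goodbye
```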


One way to think of usenet/newsgroups is as a set of distributed peer-to-peer mailing lists.

You post a message to your (NNTP) server. Your server connects to other peer NNTP servers to transfer messages back and forth, so eventually your message will propagate throughout the usenet network. (40 years ago peering was through UUCP and the like, done at night when phone calls were cheaper, or only to other local numbers, so propagation might take a week; now it doesn't take that long. See also: bang paths.)
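The propagation mechanism can be sketched as a flood-fill where peers deduplicate by Message-ID (roughly what the NNTP IHAVE exchange accomplishes). A toy model, with invented server names:

```python
# Toy model of Usenet-style flooding: each server offers everything it
# holds to its peers, and a peer accepts only Message-IDs it hasn't seen.

class Server:
    def __init__(self, name):
        self.name = name
        self.articles = {}  # message-id -> body
        self.peers = []

    def post(self, msg_id, body):
        self.articles[msg_id] = body
        self.flood()

    def offer(self, msg_id, body):
        if msg_id not in self.articles:  # dedup by Message-ID
            self.articles[msg_id] = body
            self.flood()                 # pass it on to our own peers

    def flood(self):
        for peer in self.peers:
            for msg_id, body in list(self.articles.items()):
                peer.offer(msg_id, body)

# A small chain: a <-> b <-> c. A post on a reaches c via b, even
# though a and c never talk directly.
a, b, c = Server("a"), Server("b"), Server("c")
a.peers, b.peers, c.peers = [b], [a, c], [b]
a.post("<1@a>", "hello usenet")
print("<1@a>" in c.articles)  # True
```

The dedup check is what keeps the flood from looping forever, and it's also why every article needs a globally unique Message-ID.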


Is there a reason why you can't run your own Usenet server nowadays?

In the old days, the issues were that bandwidth and disk were expensive. That's no longer true.

Is there a technical reason we can't have individuals running NNTP nodes?


I think you can run Leafnode if you want to host your own, small groups. Otherwise (my understanding) you need to peer with other providers and be able to consume something like 30-100TB/day if you want all groups on the network (unless you can find providers to peer with that let you limit groups)

Some discussion here https://www.reddit.com/r/usenet/comments/ziqdh2/become_my_ow...


Note that 30-100TB/day is mostly pirated copies of movies, re-uploaded in encrypted form each time, so you have to pay someone for the decryption key, and to evade detection.

If you want text groups, it must be a lot lower.


Text content is under 100MB per day, including spam, according to some Usenet admins.


I imagine so but it sounds like you're still at the mercy of a provider to peer with you (which has a cost to the provider, even if low)

I think that's where the "build your own network using nntp" comes in


Well, at this point the mainstream usenet network is dedicated to piracy. That's what it is - 100% piracy. The only reason they use NNTP instead of something like BitTorrent and call them "articles" instead of "files" is to maintain plausible deniability that that isn't what they're doing.

The textual discussion network may as well be its own wholly separate network. I only know of Eternal September, but I'm sure there are other servers.


Correction: there are several other servers and the text part is less dead than I thought.


If Usenet's requirements haven't increased in 2-3 decades, and in its moribund state I expect they haven't, then it should be trivial to run Usenet servers on almost any device, at least in terms of resources.


Guess it's hard to get others to sync with you?


The principal use case for Usenet has always been small, controlled-access networks.

That's what ARPANET and the early Internet were, where "small" meant < 1 million active participants and "controlled access" meant meeting the requirement of belonging to one of a limited number of academic institutions, tech companies, government agencies, or select military units. Total Usenet participation in April 1988 was 140,000 readers, by Brian Reid's DEC surveys, as cited in John S. Quarterman's The Matrix (1990):

<https://archive.org/details/matrixcomputerne0000quar/page/24...>

Spinning up an NNTP group for internal company communications, for a family group, or for a (sufficiently tech-savvy) organisation or hobby group would be a viable option. My experience is that peak group experience tends to occur with somewhere between about 5 and maybe 1,000 participants on the high end, and that's being generous. In practice, 5--50 is probably a better sweet spot.

The technology isn't the stumbling point nearly so much as convincing people to use the service. In some industries, privacy and disclosure rules may preempt usage (e.g., healthcare, finance). And the fact that Usenet has no intrinsic authentication or attestation of identity (though these can be provided by additional mechanisms such as PGP/GPG signatures) means that publicly accessible newsgroups are likely to be nonviable given malicious actors.

What's underappreciated about much of the early Internet / online world is how small most of the "large" (as in influential) groups or movements were. Usenet, the WELL, Slashdot, and early blogging had core groups of perhaps a few hundred to a few thousand people, possibly fewer. Yes, there were additional participants in many cases, but the bulk of activity was concentrated in a small core.


Hmm and if you limit the use case to those small groups, there are a lot more options now. There's the whole Fediverse stack, of course, but also lots of stand alone forum options you can self host. And why self host when there are also lots of low-cost community platform options out there?


So, that really wasn't the original question, so noting that we've moved on from that ...

Basic protocols such as NNTP and IRC have the benefit of being stable, small (running a Usenet server on an OpenWRT router with additional writeable storage would be doable, running a Fediverse instance not so much), and being relatively client-independent and even protocol-flexible. NNTP <-> SMTP gateways are, for example, A Thing, and people can participate in such newsgroups from either Usenet or email clients. Similarly Web gateways to either NNTP or SMTP systems.

The flipside is that the underlying protocols are also simple and absent some additional tooling lack much of what it is that people have come to expect from more recent messaging systems:

- Mobile device access. That's probably the big one, though I'd argue that for useful discussion you're better off eliminating mobile use. I've commented that losing a keyboard and being forced to a small screen cuts my effective IQ by half, and that quality of discussion on public systems also seems strongly negatively affected by mobile use. Read-only access might be a compromise, with an "I have something to say but will wait until I can access a Real Computer" workflow. Android + Termux + Bluetooth keyboard meets my minimums as "real computer", FWIW.

- Formatted text, usually a flavour of Markdown or some "smart editor" for normies. Then again, we seem to survive with much less than that on HN.

- Images. That's the principal feature NNTP lacks natively, though they can be shoehorned in via uuencoded posts. Along with audio and video, the ability to create multimedia posts (popular, occasionally useful, terribly abused) might be an objection raised by many.

- Third-party clients. The great strength/weakness of NNTP and SMTP are that neither are locked to a single client, or Web client (though the latter is available). Having a specific tailored client program (tin, slrn, Emacs, pan, etc.) is useful because the features of the platform are strongly supported by those clients. Downside is Yet Another Piece of Software to install and train on. In the mobile world apps have been breaking the notion of everything-in-the-browser, though often bringing along their own set of issues, most notably surveillance, but also general issues of design, UI/UX, maintenance, security, and abandonment.

The initial question addressed "could", and the answer really is "trivially", at a technical level.

Yours is "should", and as I'd hinted in my initial response, that's far more social and up for debate (as we're doing here).

But at least you're demonstrating an exception to Ian Malcolm's could/should criticism.


It's actually pretty trivial. Eternal-september seems happy to peer with anyone who asks (just send an email), and there's a newsgroup specifically for asking for peers; everyone who posts there gets lots of offers pretty quick.


no. it's just a tcp client/server model like any other. your isp probably won't let you host services but that's been the case for decades


> your isp probably won't let you host services but that's been the case for decades

How would your ISP know?

NNTP was originally store-and-forward over dialup-like links, no?

So, if I contact a different server and upload/download the changed data, how would they even know?

I guess the big issue would be having some "rendezvous" server in order to help with NAT punching as well as authentication.


No, NNTP is a TCP protocol. Early USENET moved messages over systems like UUCP.

NNTP is to USENET what ActivityPub is to Mastodon. USENET is a confederation of cooperating peers, currently using NNTP, but it could be any mechanism to move the messages. You could wire it up using SCP, I imagine, if you were motivated to do so.

Specifically, the NNTP protocol does not speak directly to how USENET works. For example, you can find out how to exchange messages for particular topics using NNTP, but not how to actually create those topics. That's left to the actual software and administrators. USENET uses the concept of "Control Messages" to exchange information about things like newsgroup status, but the content and format of those messages are not specified in NNTP.
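To illustrate, a newgroup control message is just an ordinary article carrying a Control header. Roughly like this -- the group name and addresses are made up, and real hierarchies layer extra conventions on top (PGP-signed control messages, per-hierarchy policies) that servers may or may not honor:

```
From: admin@example.org
Newsgroups: example.test
Subject: cmsg newgroup example.test
Control: newgroup example.test
Message-ID: <ng-example.test@example.org>

example.test is a newsgroup for testing.
```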

As a USENET peer, pretty much all peers are "equal". You host your own news server, you tell it what groups you're interested in, it peers with another host and exchanges messages about those groups. At that level, they're equal.

But just because you create a newsgroup on your system, doesn't mean that group is instantly propagated across the planet. That's where the USENET governance kicks in as to who is going peer and distribute new groups, or not. Each individual relationship within the peer group can be different.

I can't say exactly how a news client differs from a news server. It may not have any of the peering logic, posting and reading groups directly from a server which then handles the peering. News servers "peer", clients are lighter weight.


The protocol is from the era where each protocol used a dedicated TCP port rather than "it's all JSON over HTTPS" like now.


Yeah. Amen to that

There were some bad ideas in the beginning, FTP as an example of several of those


FTP splitting the data and control ports was a smart implementation. It allowed for optimization of the ports (one focused on throughput) and meant that control could still occur even while long responses were happening on the data port.


I haven't tested it, but I imagine it allows you to trick an FTP server into making an HTTP request (without TLS).

It probably seemed like a good idea at the time, and there was no way to know the problems without trying it.

It also allows you to send a command to cancel an ongoing transfer.


Well... RFC 959 specifically mentions using PORT to send a file to a line printer, so that seems intentional. [1]

For streaming mode STORe/RETRieve the connection closes at the EOF so you could send the request but the response would be lost.

[1]

    It is possible for the user to specify an alternate data port by
    use of the PORT command.  The user may want a file dumped on a TAC
    line printer or retrieved from a third party host.
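The PORT argument itself packs an IPv4 address and a 16-bit port into six decimal bytes, h1,h2,h3,h4,p1,p2, with the port computed as p1*256 + p2 (per RFC 959). A minimal parser, sketching what a server does before connecting back to the named address:

```python
def parse_port_arg(arg):
    """Parse an FTP PORT argument like '192,168,1,2,7,138'
    into a (host, port) pair."""
    parts = [int(x) for x in arg.split(",")]
    if len(parts) != 6 or not all(0 <= p <= 255 for p in parts):
        raise ValueError("malformed PORT argument")
    host = ".".join(str(p) for p in parts[:4])
    port = parts[4] * 256 + parts[5]
    return host, port

print(parse_port_arg("192,168,1,2,7,138"))  # ('192.168.1.2', 1930)
```

Because the client names an arbitrary host, not just itself, this is also the mechanism behind both the third-party transfers (FXP) mentioned below in the thread and the classic FTP bounce abuse.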


it also allowed for FXP, which was a godsend in early pirating/warez days ;-)

...or so I've heard


This could have been better accomplished by multiple connections to the same port.


A lot harder to implement QoS on that.


QoS didn’t exist when FTP was invented. The protocol was designed badly.


You think it's worth getting into Usenet for new people?


It's worth it in the same way that ham radio is / is not worth it. You do it as a sort of lifestyle choice.


can you explain this a bit deeper?


Facebook and Reddit are better ways to talk to people, but you're already here on Hacker News so it's apparent they don't suit your tastes. If you're concerned about Internet centralization, learning other systems that may be less centralized is great, but you know you won't be using it to talk to people you'd talk to on Facebook. And if you're concerned about the Internet itself, you learn ham radio.


Properly threaded, low-bandwidth, decent signal:noise ratio, text-only discussion? Sounds good to me. Far too little of that around.


It's worth it in order to experience a more efficient discussion UI. The difficulty is in finding newsgroups where actual worthwhile discussion is going on.


The problem with ham radio is that most people are not that interested in the tech itself. In the past, even if you didn't care about tech, you could still buy some radio kit and join the hobby for the "chatting" part. Now, you can get 'chat' everywhere, a lot easier and cheaper to set up, and you also gain privacy (encryption, no real-world identifiers etc.), although I prefer the "open forum" type of ham radio communication, because well... it's meant to be open.

It's the same with usenet... you can get the "chat"/discussion part everywhere now (reddit, ...), usually with a somewhat better UI, and usenet doesn't really offer much more than that. Back in the day, it was the best (and, well, the only) thing to have; now, with alternatives, people move elsewhere, and unless you find a niche you're interested in and a community exists on usenet, you won't have much to do there.


> you can get the "chat"/discussion part everywhere now (reddit, ...),

There's no organic discussion happening on reddit. Even the TV show subreddits are tightly moderated these days. If you try and 'discuss' in a way that invites genuine criticism and isn't just mindless cheerleading of the product on offer you'll find yourself railroaded off the site in short order, and the political/real-world/location subs are an order of magnitude more controlled.


You can’t get the efficient discussion UI of usenet clients anywhere, except maybe on mailing lists with a specialized mail client like Mutt. Usenet clients were peak discussion UI. Reddit is a far, far cry from it.


Can you explain for us noobs why that is?


It relies on keyboard navigation, threading, and per-message tracking of read/unread status, and on subject lines. The threaded view with one line per message allows a quick overview over which parts of a discussion one has already read or which are new, or are still unread from the last time. You usually have a split screen where one part of the screen shows that thread tree and the other shows the currently focused message. Focusing a message marks it as read. Common key bindings are pressing Tab to jump to the next unread message, and Space to page-down through the contents of messages. This makes it very efficient to catch up on a discussion (or really, to catch up on all ongoing discussion over all subscribed newsgroups) even after a few days (or longer). And if you reply some time later, you can still expect other participants to pick up your reply and easily see its context.

Replies don’t get lost because the discussion has moved on or the thread is a day old, like they do on HN and Reddit. This enables long-running, structured discussions. There are more usability aspects related to the text editor used, such as editor commands for handling quotes when replying. There are of course the usual(?) forum features of being able to ignore (filter out) specific users, and other features like automatically highlighting/ranking/filtering messages based on other criteria. But the main usability benefits come from per-message read/unread tracking, keyboard navigation, and the compact thread representation.

Note how on HN and Reddit you have threading, but because each message is displayed inline, you see less of a thread at once. You also see only one topic (thread) at once, whereas in a Usenet client you see all recent or currently ongoing threads in the same view as the intra-thread structure. This causes a more shared experience of subscribers of a newsgroup being aware of the current ongoing threads (over days and weeks) than on Reddit and HN. And most importantly, when returning to a discussion thread, on Reddit and HN you don’t see which messages you have already read and which you haven’t, and thus threads die out after a short time, and longer-form discussions are not practical. The experience is profoundly different.
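The thread tree those clients show is reconstructed from each article's References header, which lists ancestor Message-IDs. A simplified sketch of the idea (real newsreaders use something closer to the classic jwz threading algorithm, which also handles missing ancestors and subject grouping; the articles here are invented):

```python
# Each article names its ancestors; link it under the most recent
# ancestor we actually have, else treat it as a thread root.
articles = [
    {"id": "<1>", "refs": [],             "subject": "Usenet UIs"},
    {"id": "<2>", "refs": ["<1>"],        "subject": "Re: Usenet UIs"},
    {"id": "<3>", "refs": ["<1>", "<2>"], "subject": "Re: Usenet UIs"},
    {"id": "<4>", "refs": ["<1>"],        "subject": "Re: Usenet UIs"},
]

def build_threads(articles):
    children = {a["id"]: [] for a in articles}
    known = set(children)
    roots = []
    for a in articles:
        parent = next((r for r in reversed(a["refs"]) if r in known), None)
        (children[parent] if parent else roots).append(a["id"])
    return roots, children

def render(node, children, depth=0, out=None):
    out = [] if out is None else out
    out.append("  " * depth + node)
    for child in children[node]:
        render(child, children, depth + 1, out)
    return out

roots, children = build_threads(articles)
for r in roots:
    print("\n".join(render(r, children)))
# <1>
#   <2>
#     <3>
#   <4>
```

Layer per-message read/unread flags onto this tree and you get the one-line-per-message overview described above, which is exactly what HN- and Reddit-style inline display gives up.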


Gopher is the next big thing.


I literally wrote a briefing paper advising my university to clamp down on the nascent web and continue with gopher, which was much more systematic and organized.

Clearly they ignored me as they should.


You're not wrong - the web is very disorganized. This makes it more valuable to shareholders, as they can design sites to present us with whatever extracts optimal value at the present moment. That's how Google gets to interpose itself between us and what we're looking for, and redirect us to SEO spam pages.


Interesting follow-up to your Autopoint story. I met Jonathan Veley at a pen show. He is a pencil aficionado too (a self-professed "Lead Head") and bought the remaining stock of the Panda lead factory, plus a lot of equipment. He now produces replica Autopoint mechanical pencils and sells lead. He's a really interesting guy and he's written several books about the history of manufacturing of mechanical pencils.

See https://www.legendaryleadcompany.com for his story. I bought the least expensive of his newly manufactured Autopoints (they're called Legendary now) and I really love it; it's my best pencil. I love 1.1mm leads and find narrower leads too thin for my preference.


This is tangential to the subject at hand, but I feel obliged to complain about how much Substack now requires you to install its app to read content. Very similar to Reddit, in fact. I've watched the progression in pushiness over the years. On my phone, I did not install the app and there was apparently no other way to read the article.

I block Reddit at the router for similar dark patterns.


I’ve never seen this. Can you screenshot what they’re showing you?


I'm kind of surprised by the comments here. I've used TDE for years (on the Q4OS Linux distro). It excels at being a low-resource virtual desktop for teams needing a modern distro/kernel/browser but wanting it to run as lightly and cheaply as possible. My company uses it on virtual server infra accessed over NoMachine. It runs perfectly well with <1GB RAM, and even in 500MB RAM it does well. The DE it descends from (KDE 3) worked great on a Pentium III with 128MB back in the day.

The folks installing TDE want, if I am any example: a modern distro, kernel, and up-to-date Chrome or Firefox, and just enough desktop (task bar, apps menu, virtual desktops) to make it usable. IceWM is another option here, but TDE is just a bit more comfortable, without all of the bloat that brings the UX improvements people seem to be asking for here, or the non-mainstream options (fluxbox, i3, etc.) that scare the non-technically inclined.

I'm a big fan and will keep using and installing it. "Not slick enough"? meh, I'm over here getting work done.


This is one of the apps that not only keeps me on a desktop/laptop computer (vs a phone/tablet) but also keeps me on Linux. I've got all my photos stored on a NAS device in my house, and manage backups myself (to external hard drives and via rsync). Digikam is my photo manager. I couldn't be happier and I'm sure I don't use even half of the advanced features. What a great app - easy to discover, straightforward, and very polished in my opinion. Many thanks, open source Digikam team.


> but also keeps me on Linux.

It works just fine on Windows.


I wish both the K9 and Thunderbird teams all the best. K9 needs some love, and so does Thunderbird, honestly.

I use Aquamail on Android and love it. Great feature set, works great with multiple accounts, lots of configuration options. It's not open source or free and I honestly do not care.

Back in the day it was something like USD 4. Now it's something like USD 25. Still worth it, and I don't mind paying for software if it keeps the developers in business.


comp.misc for computer stuff; misc.news.internet.discuss for oddball topics; sci.misc for science.

These are three decent groups for content similar to HN.

If you need a usenet provider, try solani.org (no cost) or individual.net (paid but excellent)

For clients, any *nix install will have slrn, tin, pan, or possibly knode. Thunderbird is on Mac and Windows as well as BSD/Linux.


I'm using Mageia with great success. KDE, frequent security and package updates, big repository, non-Ubuntu. Former Mandriva and Mandrake for those who remember Linux in the early 2000s.


I just tried out delomore and am very impressed. Bookmarking now!


I applaud the work that went into this, but unless I'm missing something, they've recreated a terminal server, whereas NoMachine has delivered desktops for ages and works over a connection as poor as dial-up.

