I had a list of...not features, exactly, but desirable elements, in the first post. I intended to update it from comments but didn’t.
I’m not sure I’m getting out of that comparison what you meant to put into it. I find the set of files in a directory a heck of a lot more convenient.
Your proposal requires a lot of work
Well, yes. That’s more or less why I expect it to never, ever happen. I did say I’m a crank with no serious hopes. ;-)
a new protocol, using JSON over HTTP, with an API representing CRUD operations over a simple schema of users, posts, comments, et cetera
While I don’t object in theory to a new protocol, JSON over HTTP specifically is a paradigm I would like to destroy.
(which is kind of hilarious given that my day job involves an app with exactly that design)
Some kind of NNTP2 would be nice. The trouble with taking that approach is that, if the only implementation is your own, you haven’t actually gained anything.
Admittedly every protocol has to start somewhere.
sites can delegate authentication to other sites via Google/Facebook login, OpenID
I had actually forgotten about OpenID until you and Lumifer mentioned it. Also, since you mention it, I’m a huge fan of pubkey-based auth and am bitterly disappointed that almost nothing I use supports it.
I’ll just repeat my core argument here. Extant NNTP software is far more terrible, if you penalize it for things like...
I think this is our core disagreement. I find web forum software worse even after penalizing NNTP for everything you mention. Well, partially penalizing it: I don't acknowledge the lack of editing (supersedes exist), and it turns out links to netnews posts also exist, which is something else I'd forgotten. Funny, since following such a link is how I discovered Usenet.
Having an RFC isn’t really that important.
Agreed. Any spec would do as long as it’s widely implemented and can’t be pulled out from under you. The RFC “requirement” is really trying to rule out cases where one party has de-facto control of the spec and an incentive to abuse it.
Upvoted for actually considering how it could be done. It does sort of answer the letter if not the spirit of what I had in mind.
Objection: I’m pretty sure Usenet had a colossal amount of porn, at least by the standards of the day. Maybe even still the case. I know its most common use today is for binaries, and I assume that most of that is porn.
I use RSS all the time, mostly via Firefox’s subscribe-to-page feature. I’ve considered looking for a native-client feed reader, but my understanding is that most sites don’t provide a full-text feed, which defeats the point.
I dislike that it’s based on XML, mostly because, even more so than JSON, XML is actively hostile to humans. It’s no less useful for that, though.
So far as I know it doesn’t handle reply chains at all, making it a sub-par fit for content that spawns discussion. I may be wrong about that. I still use it as the best available method for e.g. keeping up with LW.
I think that’s a terrible idea and it is awesome that it exists. :-P
At this point you can say that you’ll argue your case in a future post instead of replying to this comment.
I will, but I’ll answer you here anyway—sorry for taking so long to reply.
I strongly disagree that NNTP is a good choice for a backend standard
I feel I should clarify that I don’t think it’s “good”, so much as “less bad than the alternatives”.
But we don’t need to deal with the problems of distributed systems, because web forums aren’t distributed!
Well, yes and no. Part of what got me on this track in the first place is the distributed nature of the diaspora. We have a network of more-and-more-loosely connected subcommunities that we’d like to keep together, but the diaspora authors like owning their own gardens. Any unified system probably needs to at least be capable of supporting that, or it’s unlikely to get people to buy back in. It’s not sufficient, but it is necessary, to allow network members to run their own server if they want.
That being said, it’s of interest that NNTP doesn’t have to be run distributed. You can have a standalone server, which makes things like auth a lot easier. A closed distribution network makes it harder, but not that much harder—as long as every member trusts every other member to do auth honestly.
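To make that concrete, here's a minimal sketch of a client session against a standalone server. The host, credentials, and group name are hypothetical, and the exchange uses only basic commands from RFC 3977/4643; the point is just that nothing about it requires peering.

```python
# Minimal sketch of speaking NNTP to a standalone (non-peered) server.
# The host, credentials, and group name are illustrative assumptions.
import socket

HOST, PORT = "news.example.org", 119    # hypothetical standalone server
GROUP = "lesswrong.discussion"          # hypothetical group name


def send(sock, reader, cmd):
    """Send one NNTP command and print the single status line that comes back."""
    sock.sendall(cmd.encode("utf-8") + b"\r\n")
    print(reader.readline().decode("utf-8", errors="replace").rstrip())


with socket.create_connection((HOST, PORT)) as sock:
    reader = sock.makefile("rb")
    print(reader.readline().decode("utf-8", errors="replace").rstrip())  # greeting
    send(sock, reader, "AUTHINFO USER alice")   # local auth: no peers involved
    send(sock, reader, "AUTHINFO PASS secret")
    send(sock, reader, "GROUP " + GROUP)        # "211 <count> <first> <last> <group>"
    send(sock, reader, "QUIT")
```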
The auth problem as I see it boils down to “how can user X with an account on Less Wrong post to e.g. SSC without needing to create a separate account, while still giving SSC’s owner the capability to reliably moderate or ban them.” There are a few ways to attack the problem; I’m unsure of the best method but it’s on my list of things to cover.
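One of those ways (the pubkey-flavored one, unsurprisingly, and only a sketch of an idea rather than a worked design): the user's home site signs their posts, and the destination site verifies the signature but keeps moderation decisions local. The snippet below assumes the third-party cryptography package, and every site and user name in it is made up.

```python
# Sketch of one possible approach: the user's home site vouches for posts by
# signing them, and the destination site verifies the signature and applies
# its own moderation policy. Uses the third-party "cryptography" package;
# all site and user names are hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Home site (e.g. Less Wrong) holds the user's key pair.
home_key = Ed25519PrivateKey.generate()
public_key = home_key.public_key()      # published alongside the account

author = "user_x@lesswrong.example"
post = b"author: user_x@lesswrong.example\nbody: Hello, SSC.\n"
signature = home_key.sign(post)

# Destination site (e.g. SSC) checks the signature, then decides locally
# whether this author is banned or needs moderation. Identity and moderation
# stay separate concerns.
banned = {"spammer@elsewhere.example"}
try:
    public_key.verify(signature, post)
    accepted = author not in banned
except InvalidSignature:
    accepted = False

print("accepted" if accepted else "rejected")
```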
Given all of this, the only possible value of using NNTP is the existing software that already implements it.
This is a huge value, though, because most extant web forum, blogging, etc software is terrible for discussions of any nontrivial size.
There’s probably an existing standard or three like this somewhere in the dustbin of history.
Is there?
That’s a serious question, because I’d love to hear about alternative standards. My must-have list looks something like “has an RFC, has at least three currently-maintained, interoperable implementations from different authors, and treats discussion content as its payload, unmixed with UI chrome.” I’m only aware of NNTP meeting those conditions, but my map is not the territory.
You could perhaps nullify the upvotes after a few months to preserve the system in the long run. The idea is that the short-term effect of his actions should be net-negative from his own perspective.
Other ways of achieving the same effect may also work; that was just what I came up with after five minutes of thinking about it.
Out of curiosity, is he still serial downvoting? I thought of something that may convince him to stop: Instead of deleting his accounts, disable them and convert all their downvotes against known targets into upvotes (and make sure he knows that). If all his efforts end up benefiting the very people he’s trying to hurt, well...
The Web Browser is Not Your Client (But You Don’t Need To Know That)
This fits my own prior experience of the life cycle of a community—but when my previous community failed, a fragment of it broke off and rebuilt itself in a new form. That fragment still exists as a coherent tribe more than a decade later, and I still love it even if I disagree with certain, uh, technical decisions surrounding the splintering process.
So it’s not impossible.
That is an excellent and thought-provoking essay, and a novel approach.
...I actually don’t have more to say about it, but I thought you’d like to know that someone read it.
Note that there is now a Lamp2. Going by the quoted parts of this subthread, he appears to be reposting his own deleted comments verbatim.
I’m a sometime admin. Ban evasion irritates me.
If I remember right, the most recent survey asked those exact questions. So we may well find out.
One of the interesting things about NNTP’s structure is that the moderator and the host don’t need to be the same entity or even use the same software. The same goes for UX elements. It would be entirely possible to run something-that-looks-like-a-blog on your own site, have it use hypothetical-lesswrong-hosted NNTP for hosting its content (buying you native-client support for users who want it), and still have ultimate control over who can post what. I’ll be describing how that works at some point.
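Very roughly, and with every name below made up, the blog side of that is just another NNTP client: pull the group that backs the blog and render the articles however you like. (This uses Python's nntplib, which is deprecated in recent versions but fine for illustration.)

```python
# Rough sketch: a blog page that renders the latest posts from an NNTP group.
# Server and group names are hypothetical; uses Python's nntplib.
import nntplib

HOST = "news.lesswrong.example"     # hypothetical LW-hosted NNTP server
GROUP = "lw.blogs.yourblog"         # hypothetical group backing "your blog"

with nntplib.NNTP(HOST) as news:
    _, count, first, last, _ = news.group(GROUP)
    # Fetch overview data for the ten most recent articles.
    _, overviews = news.over((max(first, last - 9), last))
    for number, fields in reversed(overviews):
        subject = fields.get("subject", "(no subject)")
        author = fields.get("from", "(unknown)")
        # A real frontend would fetch full articles, thread by References:,
        # and wrap it all in whatever blog chrome it likes.
        print(f"{subject} -- {author}")
```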
It would rely on goodwill from the LW hosts, of course; but the worst they could do is stop hosting you—and they could not hold your content hostage as long as someone, somewhere, has kept a local cache of it. You could even self-host and still interoperate with the site, because the system was designed to be decentralized even though it doesn’t have to be used that way.
My opinion? Convenience. It’s more convenient for the user to not have to configure a reader, and it’s more convenient for the developer of the forum to not conform to a standard. (edit: I would add ‘mobility’, but that wasn’t an issue until long after the transition)
And it's more convenient for the owner's monetization that there's no easy way to clone their content. Or to view it without ads. What Dan said elsewhere about all the major IM players ditching XMPP applies.
[Edited to add: This isn’t even just an NNTP thing. Everything has been absorbed by HTTP these days. Users forgot that the web was not the net, and somewhere along the line developers did too.]
Solving such integration and interoperability problems is what standards are for. At some point the Internet decided it didn’t feel like using a standard protocol for discussion anymore, which is why it’s even a problem in the first place.
(http is not a discussion protocol. Not that I think you believe it is, just preempting the obvious objection)
It is a non-starter, but there are ways to get the equivalent of a client in a web browser without using javascript to do it.
If it helps, any compromises I make or don’t make are irrelevant to anything that will actually happen. I don’t think anyone in a position to define LW2.0 is even participating in the threads, though I do hope they’re reading them.
I figure the best I can hope for is to be understood. I appreciate your arguments against more than you may realize—because I can tell you’re arguing from the position of someone who does understand, even if you don’t agree.
YAML’s the least-bad structured format I’m aware of, though that may say more about what formats I’m aware of than anything else. It’s certainly easier to read and write than JSON; you could conceivably talk YAML over a telnet session without it being a major hassle.
I agree that non-textual formats are bad for most cases, including this one.
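To illustrate the readability point rather than argue it, here's the same toy post dumped both ways; this assumes the third-party PyYAML package and is nothing more than a demonstration.

```python
# Toy illustration of the readability difference: the same structure
# dumped as JSON and as YAML. Assumes the third-party PyYAML package.
import json
import yaml

post = {
    "author": "someuser",
    "subject": "Re: NNTP as a backend",
    "references": ["<abc123@example.org>"],
    "body": "Replying from a hypothetical client.",
}

print(json.dumps(post, indent=2))
print("---")
print(yaml.safe_dump(post))
```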
I wouldn’t object to that, as long as 1. the specs evolved in tandem, and 2. the gateway was from http/json to (NNTP2?), rather than the other way around.
That's intended to avoid the temptation for devs to respond to demands for ponies by kludging them into the http/json spec without considering whether the result can be meaningfully translated through the gateway without lossage.
This...might trip me up, actually. I was under the impression that requests for a previous message ID would return the superseding message instead. I appear to have gotten that from here but I can't find the corresponding reference in the RFCs. It's certainly the way it should work, but, well, should.
I need to spin up INN and test it.
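For anyone who wants to try it before I do, the experiment I have in mind looks roughly like this: post an article, post a second one with a Supersedes header naming the first, then ask for the original message ID and see what the server says. Everything here (a local INN with open posting, the message IDs, the group) is assumed, not tested.

```python
# Sketch of the experiment: does asking for a superseded message-id return
# the replacement? Assumes a local INN that accepts posts without auth;
# message-ids and group name are made up. Uses Python's nntplib.
import nntplib

ORIGINAL_ID = "<test.original@example.invalid>"

original = "\r\n".join([
    "From: tester <tester@example.invalid>",
    "Newsgroups: local.test",
    "Subject: supersedes experiment",
    "Message-ID: " + ORIGINAL_ID,
    "",
    "First version of the post.",
]).encode("utf-8")

replacement = "\r\n".join([
    "From: tester <tester@example.invalid>",
    "Newsgroups: local.test",
    "Subject: supersedes experiment (edited)",
    "Message-ID: <test.edited@example.invalid>",
    "Supersedes: " + ORIGINAL_ID,
    "",
    "Second version; should replace the first.",
]).encode("utf-8")

with nntplib.NNTP("localhost") as news:
    news.post(original)
    news.post(replacement)
    # The open question: 430 "no such article", the old text, or the new one?
    try:
        resp, info = news.article(ORIGINAL_ID)
        print(resp)
    except nntplib.NNTPTemporaryError as err:
        print("server says:", err)
```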
We either disagree on the desirable model or else on what the model actually is. I’m ambivalent about distributed architecture as long as interoperability is maintained. Mod powers not in the spec seems like a plus to me, not a minus. Today, as I understand it, posts to moderated groups get sent to an email address, which may have whatever moderation software you like behind it. Which is fine by me. Users not being tied to a particular server seems like a plus to me too. [edit: but I may misunderstand what you mean by that]
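For illustration, and very naively, the "whatever moderation software you like" above could be as small as this: read the emailed submission, decide, add an Approved header, and reinject the article. The addresses, server, and policy below are all hypothetical.

```python
# Deliberately naive sketch of moderation software sitting behind the
# submission address of a moderated group: read the emailed article,
# decide, add an Approved: header, and reinject it via NNTP.
# Addresses, server, and policy are all hypothetical.
import nntplib
from email import message_from_bytes


def moderate(raw_submission: bytes) -> None:
    article = message_from_bytes(raw_submission)
    if "ponies" in (article.get("Subject") or "").lower():
        return                                      # silently reject
    article["Approved"] = "moderator@example.org"   # the header that matters
    with nntplib.NNTP("news.example.org") as news:
        news.post(article.as_bytes())
```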
Karma’s a legitimately hard problem. I don’t feel like I need it, but I’m not terribly confident in that. To me its main benefit is to make it easier to sort through overly large threads for the stuff that’s worth reading; having a functioning ‘next unread post’ key serves me just as well or better. To others...well, others may get other things out of it, which is why I’m not confident it’s not needed.
I'll have to get back to you on immutability after experimenting with INN's response to supersedes.