Unix was probably a great IDE by the standards of 1975, because literally every design decision about Unix was made based on the criterion of convenience for a 1970s programmer. Those decisions haven't aged well: fork() sucks, lack of completion-based async I/O sucks, ill-thought-out file, socket, and process APIs riddled with TOCTOU bugs suck, two-letter YAFIYGI commands suck, untyped pipes suck, and don't get me started on C or its standard library.
It turns out, however, that we haven't really strayed far from the "Unix as IDE" approach, as in order to turn Vim or Emacs into an IDE the typical approach is to just use Unix primitives and make the editor into a sort of super-shell that orchestrates pipelines of processes consisting of Unix command line tools. And that's fine.
Time has marched on, however, and we have been the lucky recipients of not only better IDEs than Unix, but also things that are better than Unix at what Unix was thought to do well. PowerShell, for instance, is a vast improvement over Unix command-line environments with its ability to operate over typed pipelines of objects. We should be expanding our horizons beyond what 1970s programmers found convenient to implement and embrace new ways of thinking about, writing, testing, and debugging code.
I agree that PowerShell is theoretically better since it can deal with structured data.
But in practice, I could never get as efficient with it as I could with "the Unix way" (everything is a string).
If I was writing a "production" script, then it's better. But when I'm just trying to quickly do something on the command line, the "everything is a string" approach is good enough.
It's a very interesting topic, since the amount of brittle sed/grep/awk written by everybody using *nix on earth probably requires 128-bit numbers to count, yet structured interaction does slow you down due to idiosyncrasies. That said, I wonder how quickly things like PowerShell reward you. It's a bit like having a standard of units.
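Both sides of that trade-off are easy to simulate. A minimal Python sketch (the file listing is invented) contrasting awk-style whitespace splitting with a typed, PowerShell-style pipeline of records:

```python
# Simulated `ls -l`-style text output: the second file has a space in its name.
text_output = """\
-rw-r--r-- 1 alice staff 1024 report.txt
-rw-r--r-- 1 alice staff 2048 my notes.txt
"""

# "Everything is a string": awk-style field splitting silently mangles
# the second filename, because whitespace is both delimiter and data.
names_from_text = [line.split()[5] for line in text_output.splitlines()]
assert names_from_text == ["report.txt", "my"]  # wrong, and no error raised

# Typed pipeline: each stage passes records, not bytes, so there is
# nothing to re-parse and nothing to mangle.
records = [
    {"name": "report.txt", "size": 1024},
    {"name": "my notes.txt", "size": 2048},
]
names_from_records = [r["name"] for r in records]
assert names_from_records == ["report.txt", "my notes.txt"]
```

The text version doesn't just fail, it fails silently, which is exactly what makes ad-hoc sed/grep/awk pipelines brittle at scale.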
The thing that we'd ideally want to happen, but that will probably never happen unless a mad person does it, would be to have something that wraps around a POSIX shell and extends it in a sort of non-compatible way. Let's call this new tool $NEWSH.
Larry Wall basically tried to do this with Perl, but I guess his vision was too lax for such a pursuit; Perl tried to be everything, all at once.
What I mean by this new approach is that this new tool would implement everything in POSIX shells as a sort of v1 layer, and then everything new and modern (typed, etc.) as a sort of v2 layer. There would be simple ways to call back and forth between v1 and v2, but inside a layer you couldn't mix and match (maybe you could with a construct such as "unsafe").
Long term the idea would be that this tool becomes pervasive, installed and available everywhere and everyone can safely and reliably use just the v2 layer.
The v2 layer would be extensible by design, so it can keep up with modern practices.
You'd want to be able to go $NEWSH my-bash-script.sh and it should just work. $NEWSH my-newsh-script.nsh should also work, obviously.
The reason this would never happen, IMO, is that it's a thankless job that needs to be done for at least a decade. This new shell would need to:
- implement full POSIX shell compatibility - the v1 layer (done before, doable)
- implement the new, designed from the ground up, language - the v2 layer (sort of done before by alternative shells, doable)
- design a clever, simple-to-use, safe interop + unsafe construct for v2 (I imagine this would be hard)
- package and promote this new thing, once it's stabilized, so that it's picked up by Debian/Ubuntu/Mint, Arch, Fedora, FreeBSD, etc., and wait literal decades for distribution to happen; Debian Stable and RHEL are especially egregious, since for ubiquity you'd want the new tool to be available in the two latest LTSes, so DevOps people can rely on its availability... so, as I said, count 10 years for that (crazy hard and super boring)
I said only a mad person would do this, because with this kind of volume and intensity of work, you can probably start a successful company or an open source project with much higher impact.
Oh, and this project will be absolutely HATED (think "getting death threats" type of hate) by large parts of the community. See systemd.
Yeah, that would be the only thing that sort of resembles what I'm saying, but I'm afraid it's biting off more than it can chew: the long-term goal is to have some sort of distributed language, it's using its own custom Python runtime of sorts, and it's implemented in a convoluted way.
Also, right now the project is a one-man show; for something of this magnitude to move fast enough, it needs to become a full-blown community project.
Oh, and Oil also needs to mature. OSH IMHO needs to be fully usable as an interactive shell (I don't think it is), and it needs to be packaged for everything.
It could be that Oil gets there but it's going to take a looong time. Oil itself was started in 2016, that was 7 years ago.
I'm keeping an eye on it for sure and hoping it succeeds.
I think it would sort of be like GNU: Stallman created the tools before the kernel, because without the tools a kernel has no value.
You need the programs to make this hypothetical shell useful. That means structuring the output of pretty much every program as data and passing it along while conserving types. That's arguably a very large barrier to entry.
The project isn't a solo project anymore! There have been 6 people funded by two NLNet grants, and we just got a third one, mentioned in the post.
If you want it to succeed, then you should try it and report bugs.
One way to view it is that Oils has a 7-year head start on any other project that wants to upgrade shell. Like you say, it's a huge amount of work, and it's the ONLY project that's compatible with bash (which, BTW, is a much bigger job than being compatible with POSIX sh).
Shell became the #6 fastest-growing language on GitHub in 2020, so all those shell scripts people have written in the last few years make the Oils project more valuable. That is, people are writing scripts compatible with OSH at a greater rate than for any other new shell (where "new shells" means ones with precise error messages).
The new command line tools like fd and bat are ALSO compatible with OSH and YSH because they use the Unix process interface :) There have been several shells that want to bundle everything into the shell, but I disagree with that philosophy, because it limits growth to being "inside" the project.
(although YSH has a lot more functionality than bash built-in, like JSON support)
---
People who have contributed are acknowledged in the release announcements.
Testing is a good contribution. Writing weird HN comments isn't a good contribution :-P
There has been some misunderstanding of the project, but very little "hate" ... Mostly encouragement!
> Those decisions haven't aged well: fork() sucks, lack of completion-based async I/O sucks, ill-thought-out file, socket, and process APIs riddled with TOCTOU bugs suck, two-letter YAFIYGI commands suck, untyped pipes suck, and don't get me started on C or its standard library.
Those sound like problems with Unix as an OS, not Unix as an IDE; most of what the article discusses is unaffected by the underlying APIs.
I also feel like I should question whether you're attacking a strawman; modern Unix-likes are continuations of Unix, but they're not stuck in the '70s. For example, AIUI everyone agrees that fork() has shortcomings... which is why we now have posix_spawn (and I think others?).
"We should be expanding our horizons beyond what 1970s programmers found convenient …"
You're right; we should be going back to the Lisp and Smalltalk machines of the 1980s.
In that sense, I've found PowerShell to be something of a letdown. I was hoping for something that was an ergonomic abstraction over NT's primitives (or Linux's) and over the CLR's primitives at the same time, in much the same way that Lisp worked on Lisp machines, but PowerShell is only frustratingly alright at both.
With the deprecation of the ISE, the debugging story is also called into question. Is the VS Code extension the place to go now?
I wind up spending more time in F# because the whip-it-upitude story is a little better there, especially with the Jupyter notebook support, and the choice of using the same debugging and development tools I use for the rest of my tech stack is an obvious one.
I also think a lot about how some parts of PowerShell were inspired by CL on IBM i, and I find that I'm similarly frustrated when I can't just whack F4 in the middle of muddling through a command's parameters in order to be presented with a structured form interface for interactively filling out said parameters.
As much as I'd really like having Genera's hoverable and clickable command listener in PowerShell (like, imagine being able to do a `Get-ChildItem` and getting back an object display that allows you to left-click on an item to run `Get-Content` on it), I'd absolutely settle for the level of interactivity and hypertextuality offered by IBM i.
see, that's what i thought in 01998, but you may have noticed that google, facebook, aws, openai, github, android, ios, docker, instagram, slack, whatsapp, fastly, tesla, etc., all run on unix for some reason
so i guess fork and epoll and the racy unix file and socket and process apis are adequate, or anyway in some way not as fatally flawed as windows. a lot of those successful tech things of the last 25 years do involve some java or golang or erlang
stack overflow runs on windows tho
almost everything in my list runs on linux, mostly ubuntu. the exceptions are that ios runs on darwin and whatsapp runs on freebsd
Also, in an IDE, code completion is now table stakes, and we are moving toward LLM integration where the IDE will assist in coding.
The Unix pipeline model is not suitable for this, as you need persistent processes and low-latency two-way communication.
It is interesting to note that the Language Server Protocol, the biggest innovation in terms of scaling language completion across multiple languages and editors, came from Microsoft.
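For a sense of why this doesn't fit the pipeline model: LSP runs JSON-RPC over a single long-lived connection, with every message framed by a Content-Length header. A self-contained Python sketch of that framing (the request payload and file URI are invented, and an in-memory buffer stands in for a real language server's stdin/stdout):

```python
import io
import json

def frame(msg: dict) -> bytes:
    """Encode one LSP message: Content-Length header, blank line, JSON body."""
    body = json.dumps(msg).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body

def read_message(stream) -> dict:
    """Read one framed message back off a byte stream."""
    header = stream.readline()           # b"Content-Length: 123\r\n"
    length = int(header.split(b":")[1])
    stream.readline()                    # blank separator line
    return json.loads(stream.read(length))

# A completion request as an editor might send it -- over the *same*
# persistent channel used for every other request, not a fresh process.
request = {"jsonrpc": "2.0", "id": 1, "method": "textDocument/completion",
           "params": {"textDocument": {"uri": "file:///tmp/x.py"}}}

wire = io.BytesIO(frame(request))
assert read_message(wire) == request
```

The server keeps its parsed project state in memory between requests, which is exactly what a spawn-a-process-per-query pipeline can't do cheaply.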
We are moving into a world where, although the code is stored as text, our programming tools understand and manipulate it at a deeper level than just a sequence of bytes, which is all the Unix tools see. A rename that uses the AST is much safer and more powerful than sed. Context-aware search is better than grep.
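The rename point can be made concrete with Python's own ast module (the source snippet is invented for illustration; ast.unparse needs Python 3.9+):

```python
import ast

source = '''\
count = 1
print(count)
print("count me in")
discount = 2
'''

# sed-style rename: operates on raw text, so it corrupts both the
# string literal and the unrelated identifier `discount`.
sed_style = source.replace("count", "total")
assert 'print("total me in")' in sed_style   # string literal mangled
assert "distotal = 2" in sed_style           # wrong identifier mangled

# AST-aware rename: only touches Name nodes whose id is exactly `count`.
class Rename(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "count":
            node.id = "total"
        return node

tree = Rename().visit(ast.parse(source))
result = ast.unparse(tree)
assert "total = 1" in result
assert "print('count me in')" in result      # string left alone
assert "discount = 2" in result              # other name left alone
```

The text-level rename doesn't fail loudly; it produces plausible-looking, wrong code, which is the whole argument for tools that work at the syntax-tree level.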
Your argument might hold water if the standard for productivity were simply churning out code. The Linux/Unix world already has tools, like nano, for such users.
The software world might be better served, however, if programmers slowed down and engaged their brains before bashing keys. I've solved most of the difficult problems in my programming life in the bath, where I can let my mind wander and come up with the best solution.
Oh, I don't mean just around IDEs; I have a ton of new guys on my team who can understand how APIs work but not anything further down. Now, 90% of the time it doesn't matter, but any time the problem is one layer lower, it falls apart. There are so many layers, when we should be reducing layers and simplifying.