I've seen a lot of Animal Crossing screenshots recently; people will spend an incredible amount of time working on something to make it look nice. They'll spend hours making simulated life more productive, for a return so marginal they'll never make the hours back.
Meanwhile, I've recently spent an equally unreasonable amount of time unifying my bash environment across all machines so that I get a different color for each host name. So now my bash prompt is color coded by machine. I was lying to myself about it being more productive or whatever. I did it because I thought it would be neat, and it looks cool, and I felt pretty clever implementing it. One of my customers is a lawncare website and I get some satisfaction seeing the @host turn from blue to green when I ssh into their server. I think this is the same human satisfaction of organizing your toolshed, or customizing your town in Animal Crossing.
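A minimal sketch of the per-host coloring idea in bash (the `host_color` function and its hashing scheme are my own invention, not the commenter's actual setup):

```shell
# Derive a stable ANSI color from a hostname, so each machine's
# prompt gets its own color without per-host configuration.
host_color() {
  # Hash the name into one of the six readable ANSI foreground
  # colors (31 = red ... 36 = cyan) via its CRC checksum.
  local n
  n=$(printf '%s' "$1" | cksum | cut -d' ' -f1)
  echo $(( 31 + n % 6 ))
}

# Then, in ~/.bashrc, something like:
# PS1="\u@\[\e[$(host_color "$(hostname)")m\]\h\[\e[0m\]:\w\$ "
```

Because the color is derived from the name rather than configured per machine, the same `.bashrc` can be synced everywhere and each host still looks distinct.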
I genuinely get so much enjoyment out of tinkering with my tools, apps, shell, you name it. When everything is tweaked to look nice to me, works nicely, and scripts automate the tedious bits, when everything just clicks... it's amazing.
I used to re-arrange my bedroom every few months as a kid, and I feel like I do the equivalent with my dev machine: tweaking themes, switching out plugins/tools, updating things, changing things around a little bit. I always love it afterwards.
I also play Animal Crossing and I find it an interesting comparison. I'm playing with my GF and we made a plan of how we want the island to look; the timeline is 3-4 weeks for everything (because of how much they artificially slow things down). Each individual task sounds tedious and boring (dig up that whole hill, place all the paths, etc.), but it feels good in the end for some reason. It turns into what we imagined, what we wanted it to look like and how we wanted it to work, much like my dev setup right now.
I've spent so much time just tinkering and it gives me great joy!
As a newbie I was quite confused by the difference between a shell, a terminal, and a console; it was finally a Super User post (https://superuser.com/questions/144666/what-is-the-differenc...) that cleared this up for me. I find it really interesting that the author sheds some light on it.
Personally I'd recommend not customising your shell and just learning to live with the defaults on whatever operating systems you tend to use. Makes it much quicker to be comfortable on a new machine because everything is already same as you're used to.
I've been basically "using the defaults" for the last ten years, after spending twenty years customising everything. I've noticed a couple of other advantages:
- It's easier to understand what your customers will see: If they customise things, they'll need to figure those things out themselves anyway, but if you customise things, there's a chance you'll forget a detail and it'll leak out to your users.
- It nudges your automation to spread: If I have a deploy_to_prod alias, chances are only I can use it, but if my project has a deploy.sh script, then there's a path to getting other people on my team to use it. If you've built that discipline some other way, then that's great, but for me, having to eat the dog food meant I first had to make the dog food, and requiring that of myself helped me build the habit.
- It's one less thing to think about.
Of course, the downside is that one day I woke up and found myself using zsh.
This is addressed in the first paragraph of the article.
> "I feel that sharing files between different computers is now a solved issue, and the benefits I get from having personalized my work environments are so great that I gladly pay the small price of synchronizing that configuration between my computers."
The issue is not solved when you sh into a container for a quick debugging session.
Or when you work in an airtight environment, like in banking.
It is 'solved' when the set of computers you connect to is small enough to be described as 'my computers', which seems like a pretty narrow use case to me.
I would go with some middle ground. I have my bash and git aliases etc for fast local work (which speeds up my normal work), but I have nothing so complicated I couldn't live without them on a random remote server.
I would like to start using zsh, but that would introduce too much variation from the remote envs I might encounter.
I third this. And push your most convenient aliases and git hooks or whatever to github. It’s not much work to symlink, git clone, ansible them, or whatever works for you.
> I'd recommend not customising your shell and just learning to live with the defaults
Sometimes the defaults are not sane at all. Take urxvt, for example: it's just a blank window with a very small font size. I'd advise customizing, but not so heavily that you can't get the setup you're used to in a matter of seconds (the time it takes to copy your config file).
Or you could stick your dotfiles and other config in a repo and clone it onto the new machine. Some examples of approaches on this prior thread: https://news.ycombinator.com/item?id=11070797
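One way to sketch that bootstrap, assuming the usual layout where the repo holds your rc files at its top level (the `link_dotfiles` helper and its usage are my own illustration, not from the thread):

```shell
# Symlink every *rc file from a dotfiles checkout into a target
# directory (defaults to $HOME), overwriting stale links.
link_dotfiles() {
  # usage: link_dotfiles <checkout-dir> [target-dir]
  local src=$1 dest=${2:-$HOME}
  for f in "$src"/.*rc; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    ln -sf "$f" "$dest/$(basename "$f")"
  done
}

# On a new machine, roughly:
#   git clone <your-dotfiles-repo> ~/.dotfiles
#   link_dotfiles ~/.dotfiles
```

Symlinking (rather than copying) means a later `git pull` in the checkout updates the live config too.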
I went a step further, and automated as much as I can with Ansible. Had to reinstall everything after an SSD failure recently, this made it a fun project. I'm also planning on upgrading my machine soon, so that should make the process a lot faster.
I understand the reasoning behind this, but in practice I can't imagine working like that. I love watching Gary Bernhardt (https://www.destroyallsoftware.com/screencasts), who is an incredibly fast typist and knows Unix very well. But I just can't get over the sheer volume of typing he does, because he hasn't wrapped his most common commands into short aliases and abbreviations[0]. I would find it exhausting to do that much typing relative to the amount of output.
Here's a customization I would recommend everyone who spends a lot of time on the shell do: Be able to fuzzy find from the shell. The specifics aren't important, but here's how my bindings work (I use and recommend `fzf`[1] for this):
All of these work logically like this: If the binding is triggered at the start of a prompt, do what I mean (`cd` to directories, or edit a file), but if I've already typed something, insert it as a parameter, so for example I can hit `ls<space>⌥z` to fuzzy insert a recent directory (or just `⌥z` at a raw prompt to fuzzy select a recent directory to jump to).
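fzf itself ships ready-made bindings (CTRL-T to insert files, ALT-C to cd, CTRL-R for history search). A rough bash sketch of the "do what I mean" behavior described above; `fuzzy_dir` is a hypothetical helper of mine, not the commenter's actual config:

```shell
# Ready-made bindings, if you just want the defaults:
#   source /usr/share/fzf/key-bindings.bash   # path varies by distro

# Hypothetical helper: jump to the picked directory if the prompt is
# empty, otherwise insert it as an argument at the cursor.
fuzzy_dir() {
  local dir
  dir=$(find . -type d 2>/dev/null | fzf) || return
  if [ -z "$READLINE_LINE" ]; then
    cd "$dir" || return
  else
    READLINE_LINE="$READLINE_LINE$dir"
    READLINE_POINT=${#READLINE_LINE}
  fi
}
# bind -x '"\ez": fuzzy_dir'   # roughly the Alt-z binding described above
```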
I'd argue fuzzy finding is the major enhancement to computer productivity of the last twenty years. I recommend using it on the shell (assuming you use the shell often enough to justify it).
> But I just can't get over the sheer volume of typing he does, because he hasn't wrapped his most common commands into short aliases and abbreviations[0].
Interestingly, I go the other way: I type fast, so I don't need to abbreviate. Instead I make my aliases and abbreviations memorable. E.g. for git, instead of aliasing `checkout -b` to `cb` or something, I alias it to `newbranch`. More typing than the two-character alias, but also more memorable than the default.
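Using git's standard alias mechanism, that `newbranch` example is a one-liner:

```shell
# A memorable alias rather than a short one: "newbranch" for
# "checkout -b" (written to ~/.gitconfig).
git config --global alias.newbranch 'checkout -b'

# Afterwards: git newbranch my-feature
```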
I find if a command isn't memorable I'll forget to use it, so I never build the habit needed to make it useful. Single-letter commands are especially bad for this. I've had z for years, and remembered to use it when I needed it maybe once.
> Interestingly I go the other way: I type fast, so I don't need to abbreviate.
That's definitely an interesting perspective I hadn't considered, thanks for sharing!
I'll add one more note about automation and customization (this is only tangentially related to your post, it just made me think of it): Most people talk about automation and customization as doing things faster. That is definitely not how I think about it; the goal is to reduce effort. The goal of customization, automation, and software expertise in general is to make hard things easy.
Here's an example: let's say you have ten different `git` repos that all share a dependency you're updating across an API change. With entry-level tooling this is a big hassle: if you're lucky you can do a find and replace, but you're still probably going to do it manually in each repository separately, and then you've also got a bunch of manual git commands to enter. So what do you do? Most likely you just put off the problem. With automation, though (personally, I would use a combination of `vim` macros, fired on some CLI `rg`[0] matches, plus some `tmux` synchronization to do the `git` steps in all repos at once), it's essentially the same amount of work to do all the repos as one. That makes it easy enough that I just do it now and get it over with. My goal with automation, as an extension of software expertise, is to make what I consider "easy" cover as much output as possible.
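A plainer, editor-free sketch of the same "one repo costs the same as ten" idea, using just a shell loop (`bulk_edit`, the repo layout, and the `old_api_call`/`new_api_call` names are all made up for illustration; the commenter's actual workflow uses vim macros and tmux):

```shell
# Apply the same mechanical API rename across several repos at once.
bulk_edit() {
  # usage: bulk_edit repo1 repo2 ...
  for repo in "$@"; do
    (
      cd "$repo" || exit 1
      # rename the call everywhere it appears in this repo
      grep -rl 'old_api_call' . | xargs -r sed -i 's/old_api_call/new_api_call/g'
      # then commit uniformly, e.g.:
      # git commit -am "Update for dependency API change"
    )
  done
}
```

The subshell around each repo keeps the `cd` from leaking, so the loop body is identical no matter how many repos you pass.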
This is why I like fish shell so much. It has the same features as zsh, but requires no config. They have a guiding principle of "if users need to configure something, that means that the defaults are broken".
All I do is "sudo apt install fish" and I'm good to go with the 100s of out-of-the-box improvements it has over bash.
It's not that config is impossible, but it tries very hard to make the defaults work for the largest subset of users possible. And then provides a much easier config experience than most shells, since if things are broken for you they should damn well be easy to fix!
On the other hand I'd recommend customizing as much as reasonably possible on your home rig. You stand to learn so much more about how your system is constructed. Once you get to a certain level of understanding, most machines that you come into contact with that sport a terminal are easily usable.
Or customize, keep the relevant bits in git, make a shell script to install any utilities you add, and push the whole thing to whatever hosting provider you want. Then you can get comfortable with the new machine by checking out your remote and running the one script.
It's not necessarily exclusive, you can be used to default settings when connecting to containers or remote servers, but still have your own local customization.
I’d recommend against using ligature fonts in your shell. If there’s one place you don’t want any ambiguity about what glyph you’re using, it’s in the middle of sed, awk, or grep.
Programming code has special semantic considerations. Ligatures in programming fonts are likely to either misrepresent the meaning of the code, or cause miscues among readers. So in the end, even if they’re cute, the risk of error isn’t worth it.
I have been using ligatures in my shell and editor for a few months and haven't had any issues.
I do recommend it; the only issue is now I'm a bit lazier. For example, instead of making proper arrows like →, I just type ->, which looks good to me in monospace font, but looks different to everyone else.
Not trying to extract the bad parts here – this looks like a lot of effort went into it and it certainly is useful – but there are a few wrinkles:
[[ $? ]]
is always true, even if $? is non-zero; it only checks whether $? is non-empty, and since $? always expands to a number, it never is empty. A working check for successful exit status would be (for Bash)
if (($?)); then
    echo "not successful"
else
    echo "successful"
fi
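To see the difference concretely, both checks run here after a failing command; only the arithmetic form actually notices the failure:

```shell
false
if [[ $? ]]; then
  # taken even though false failed: "1" is a non-empty string
  echo "looks successful (but it was not)"
fi

false
if (($?)); then
  # taken because the arithmetic value 1 is true
  echo "correctly detected the failure"
fi
```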
For prompt configurations: the article hardcodes the "$" sign. A useful feature is to indicate when the user is root by switching to "#"; this can be achieved by escaping the "$".
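In bash that's the documented `\$` prompt escape: it renders as "$" for a normal user and "#" when the effective UID is 0. For example:

```shell
# "\$" (escaped) becomes "#" for root, "$" for everyone else;
# a hardcoded "$" would stay "$" even in a root shell.
PS1='\u@\h:\w\$ '
```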
While it's great to use other terminals and customize the terminal visuals to your liking, don't forget about the additional latency it adds; this is one of the main reasons I still use macOS' default Terminal.app
Perceived latency is why I stuck with Terminal over a decade ago. Since then, Terminal has gotten better. Maybe I should re-evaluate, but it's serving me well and I prefer using default applications where I can.
Shoutout to yakuake (Quake-style dropdown terminal with tabs, for KDE). I do all my console work in it; the convenience of having access to your console at any time with the press of a button is hard to overstate. guake is the gnome version, and I am told ConEmu supports something similar for Windows.
I use fish, but only interactively. For scripts I still use bash. The amount of fish-specific syntax I use interactively (`and`, string interpolation, etc.) is so small that it doesn't really bother me.
Addendum: I did write my entire prompt (which is 81 lines right now) in fish (my rationale was "maybe it's better performance-wise"), which was hell and I hated it but that's the only piece of fish code I've ever written.
Same, I used fish for about a month. I really liked all of its out-of-the-box features. I reverted back to zsh when everything I googled had to be fish-specific, and it felt like I was spending more time trying to make bash/zsh instructions work with fish.
I do miss the flag suggestions that would autofill.
Regarding the prompt part, if you switch between different shells, you may prefer to take a look at https://starship.rs/ so you only have to configure your prompt once.
I used to use oh-my-zsh but I found that it added too much latency to my shell startup. I removed oh-my-zsh and realized I don't need the majority of what it provides.
The only major modification left is my shell theme, which tracks the amount of time between commands, and is green/red like the OP for successful/failed commands.
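A bare-bones sketch of the success/failure coloring in bash (the commenter's theme is not described in detail; `prompt_status` and `LAST_STATUS` are names I made up). PROMPT_COMMAND runs first, so it can capture the exit status before anything else resets `$?`:

```shell
# Capture the last command's exit status before the prompt is drawn.
prompt_status() { LAST_STATUS=$?; }
PROMPT_COMMAND=prompt_status

# Bash expands $((...)) inside PS1 at render time, so the "$" turns
# green (32) after a success and red (31) after a failure.
PS1='\[\e[$((LAST_STATUS == 0 ? 32 : 31))m\]\$\[\e[0m\] '
```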
I also do not source things like nvm, rbenv, etc by default, instead just sourcing them as needed.
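One common way to do that lazy sourcing is a self-replacing stub: a shell function with the tool's name that, on first use, deletes itself, sources the real thing, and re-runs the command. A sketch for nvm (assuming the default `~/.nvm` install path):

```shell
# Stub that defers sourcing nvm.sh until nvm is actually invoked,
# keeping shell startup fast.
nvm() {
  unset -f nvm            # remove this stub
  . "$HOME/.nvm/nvm.sh"   # load the real nvm (defines a new nvm function)
  nvm "$@"                # replay the original invocation
}
```

The same pattern works for rbenv, pyenv, and similar tools that otherwise add noticeable startup time.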