Cool, seems like the unix `cat` command with the addition of storing the filepath (so maybe more like `find`?).
While this of course sounds like a cool project and my comment might be the most Hacker News-y thing [1], I think it's aimed at developers and so it fits.
I don't agree with the zip downsides listed in the repo: for one, `unzip -l` exists to see what's in a zip, and the re-uploading argument doesn't favor a different format either, since any other format also has to be updated in a remote when it changes.
Secondly, I am kind of amused by this format when `touch`, `mkdir -p` and `echo` are available on every POSIX-compliant system and can be combined into a nice, coherent shell script which, without any dependencies, would cover the functionality of this project as far as I understand it.
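To make that concrete, here's a minimal sketch of the kind of dependency-free script I have in mind (the layout and file names are invented for illustration, not taken from Stamp):

#!/bin/sh
# scaffold.sh: create a small project skeleton; $1 is the project name.
name="${1:?usage: scaffold.sh <name>}"
mkdir -p "$name/src" "$name/docs"
touch "$name/src/main.js" "$name/docs/.gitkeep"
printf '# %s\n' "$name" > "$name/README.md"
echo 'node_modules/' > "$name/.gitignore"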
I don't want to sound condescending, but I'm feeling a little left-pad [2] on this one.
I see this as a declarative vs. imperative question. The interface is a bit barebones and heavy right now, since you need to fire up node and run the functions. But as with declarative vs. imperative DevOps, the declarative nature of the resulting format seems like it can carry a lot of the same benefits.
POSIX scripting and other tools are a viable alternative as well. But despite knowing all of the commands you’ve mentioned, and regularly sharing and applying git patches, I think I would still have difficulty renaming FileA to FileB in a tarball or patch before unpacking or applying it.
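For reference, the imperative version I'd have to look up every single time is something along these lines (it relies on GNU tar's --transform option, so it isn't even portable to BSD tar):

# list the members to find the exact path, then rename on the fly while extracting
tar -tf archive.tar
tar -xf archive.tar --transform='s,^FileA$,FileB,'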
The declarative approach seems like a nice feature that greatly reduces the cognitive friction there. But as a single feature, it’s not clear whether it’s worth a tool change.
I use this pattern a lot along with a tool I built for doing server deployments and administration using plain old shell scripts and ssh (golem: https://github.com/robsheldon/golem/).
There are two caveats:
First, if there's any chance at all that the heredoc may contain a $, or a `, or possibly some other shell-magical characters, then you have to use a single-quoted heredoc:
cat <<'EOF'...
This means that if you want to do variable interpolation, like you're doing, then you need something that looks like:
cat <<'EOF' | sed -e "s/\\\$username/$username/g" -e "s/\\\$my_email/$my_email/g" ...
It looks yucky and unwieldy at first, but I've found that it's nice to be able to see at the top of the heredoc exactly what's getting replaced and what values it needs.
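Fleshed out into a complete (if contrived) example, the whole pattern ends up looking like this (the values and the output file name are made up):

#!/bin/sh
username="alice"
my_email="alice@example.com"
# Single-quoted 'EOF' means the shell leaves $ and ` inside the body alone;
# sed then substitutes only the placeholders listed explicitly on this line.
cat <<'EOF' | sed -e "s/\\\$username/$username/g" -e "s/\\\$my_email/$my_email/g" > gitconfig.example
[user]
    name = $username
    email = $my_email
EOF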
Second, if root privileges are required to write the file, then you need to use `tee`, because you can't sudo an output redirection:
cat <<'EOF' | sed -e "s/\\\$username/$username/g" | sudo tee /path/to/file >/dev/null
After using it for a while, I've found I really like this pattern for managing configuration file templates.
You're right though that something like Stamp could be built using standard shell tools if someone were so inclined.
shar is specifically mentioned in the footer, with the drawback that it's really an arbitrary shell script. ptar is also mentioned, and seems rather nice. It's also way better documented[0], though with glaring holes, e.g. how it deals with non-UTF8 file content (it's not clear whether the file size or the delimiters take precedence, and why you'd have a closing delimiter if the file size wins). It also specifies file names as UTF8 or ASCII, neither of which is sufficient to handle the full breadth of possible file names.
I guess that's true. I suspect support for non-UTF8 names in modern tooling is very, very spotty, given how many config files and file formats that refer to other files use UTF-8 themselves. E.g. can you refer to one of these names in an nginx config? (Just an example; I have no idea whether its config is UTF-8 or not.)
The main benefit, in my mind, over zip/tar is the built-in parameter substitution.
You could imagine using this as a development dependency, to standardize the creation of new [anything] that follows a predictable pattern. Check in your stamp file, and any junior dev who creates a new [anything] in your project gets all the custom boilerplate and best practices right away.
Yes, the only real advantage over zip files seems to be the parameter substitution. I’d rather build that on top of the zip format, which supports extension by custom fields, and provide a wrapper around zip/unzip that adds the substitution functionality. Users of regular zip could still use the zip files then, just without automatic substitution.
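As a rough sketch of the wrapper idea (this just post-processes the extracted files with sed rather than using zip's extra fields, and the {{placeholder}} syntax is invented):

#!/bin/sh
# unstamp-zip.sh: unpack a zip, then fill in {{placeholders}} in the extracted files.
# Usage: unstamp-zip.sh template.zip destdir key=value [key=value ...]
zipfile="$1"; dest="$2"; shift 2
unzip -q "$zipfile" -d "$dest"
for pair in "$@"; do
    key="${pair%%=*}"
    value="${pair#*=}"
    # GNU sed -i; BSD sed needs -i '' instead.
    find "$dest" -type f -exec sed -i "s/{{$key}}/$value/g" {} +
done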
Zip unpackers usually either spill files into an already populated directory or create doubly nested contents, depending on how you prefer to unpack them: <here> or <into zipname>. The only app that does the right thing is The Unarchiver from the App Store. All other unpackers on all platforms force you to look into the zip file before unpacking. This is annoying AF.
I can’t tell from the article, but if it allowed “unstamp -“ and then simply copy-pasting into stdin from a site, that would be great.
I think the main takeaway over zip is that with zip you need to actually run “zip” before committing whereas here you can declare it in “code”. I don’t see the benefit over shipping a skeleton directory though.
Why don’t you write it with POSIX-only tools then? It’s a great idea that didn’t exist yet. Who cares how the developer made it? I’m sure that if it takes off, people will rewrite it in Go and Rust as standalone binaries and the HN crowd will love it. I’m pretty sure people use the tools they’re most familiar with. Bashing people for their choice of tool is a popular kind of comment on HN, but it doesn’t accomplish anything.
It's not bashing a tool per se, sorry if you read it as such. I was pointing out that, with the abundance of tools we have at our disposal, this project sounds to me like a solution looking for a problem.
As for the solution not existing yet: as people have pointed out, there are tar, zip, git, and shar, which all seem to accomplish the same thing.
Yes, I read it, and I don't agree with most of the listed downsides, since the tool itself seems to share the very downsides its pros-and-cons list attributes to the other solutions, and it doesn't seem to solve any of the problems it promises to.
Is visibility into what's being created a problem?
The listed problems, like "every time the template changes, the author has to re-zip and re-upload the folder", don't seem to be solved by stamp either; the problem is just shifted to versioning this new format and still distributing it somehow (probably through git), at which point you have to ask why the git repo can't already contain the structure itself. If somebody wants to rename something, they can do that really easily, with no need for variable substitution.
If there were a migration mechanism that would move files from an old template to a new one, I would see added value.
[1]: https://news.ycombinator.com/item?id=9224
[2]: https://www.davidhaney.io/npm-left-pad-have-we-forgotten-how...