After a long time, I'm again in a situation where I sometimes work on a temporary system without my personal setup. Now, whenever I add a new custom (nushell) command that abstracts away a CLI tool, I think about the muscle memory/knowledge I lose for that tool and how much time I waste looking it up when my setup isn't available. No, that's not a huge amount of time, but just out of curiosity I'd like to know how to minimize this problem as much as possible.

Do you have tips or solutions for handling this dilemma? I try to shadow and wrap existing commands wherever possible, but often that isn't an option. Abbreviations in fish are ideal for this problem in some cases (see the example below), but I don't think going back to fish as my main shell for this single reason would be worth it.
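For anyone unfamiliar with them, a fish abbreviation expands in place as you type, so the full command still passes through your fingers and across your screen. A made-up example:

```fish
# Hypothetical shorthand: typing "gs" followed by a space or Enter
# expands it to "git status" right on the command line.
abbr -a gs git status
```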

[–] MajorHavoc

> I assume there can be similar situations in the future for other reasons.

You may be pleasantly surprised: we don't agree on much in technology, but bootstrapping with git works in places where nothing else does, and it's now popular even among Windows engineers.

I recall encountering two exceptional cases:

  • An 'almost never change anything' immutable distribution like Batocera.
  • A host with no Internet access.

In both cases, I still version the relevant scripts in the same git repository, but I end up getting scrappy about how I deploy them.

On an immutable distribution, I'll curl, wget, or Invoke-WebRequest a copy of each file I need, as I need it. I run into this often enough that it's worth keeping copies in a public S3 bucket with a friendly DNS name in front; it does wonders for remembering the correct path to each file.
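Roughly like this; the domain and file name below are made up for illustration:

```bash
# Hypothetical bucket domain and script name; substitute your own.
curl -fsSL https://scripts.example.com/bootstrap.sh -o bootstrap.sh
# or, where curl isn't installed:
wget https://scripts.example.com/bootstrap.sh
# PowerShell equivalent:
# Invoke-WebRequest https://scripts.example.com/bootstrap.sh -OutFile bootstrap.sh
```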

On a completely offline machine, I run git init --bare in a folder at the root of a thumb drive or network share, git push a shallow copy of my scripts repo to it, and then git clone from it on the machine I'm working on. I also drop a plain file copy alongside it, in case I can't get git bootstrapped on the offline machine.
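A minimal sketch, assuming the drive mounts at /mnt/usb and the scripts live in ~/scripts on a main branch (all example paths), with the shallow-copy detail skipped for brevity:

```bash
# On the connected machine: create a bare repo on the drive and push to it.
git init --bare /mnt/usb/scripts.git
git -C ~/scripts push /mnt/usb/scripts.git main

# Plain file copy alongside it, in case git isn't available offline.
cp -a ~/scripts /mnt/usb/scripts-copy

# On the offline machine: clone straight from the drive.
git clone /mnt/usb/scripts.git ~/scripts
```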

I still bother with the git version because I invariably need to make some tiny, nuanced correction to a script, and it's so much easier (given my work patterns) to sync it back later with git.