After a long time, I'm in a situation where I sometimes work on a temporary system without my personal setup. Now, whenever I consider adding a new custom (nushell) command that abstracts away the usage of CLI tools, I think about the muscle memory and knowledge of those tools I'll lose, and how much time I'll waste looking them up when I'm without my setup. No, that's not a huge amount of time, but just out of curiosity I'd like to know how I can minimize this problem as much as possible.

Do you have tips or solutions for handling this dilemma? I try to shadow and wrap existing commands whenever possible, but often that isn't an option. Abbreviations in fish are optimal for this problem in some cases, but I don't think going back to fish as my main shell for this single reason would be worth it.

[–] MajorHavoc 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I have a public git repository that I keep those kinds of recipes in.

So on a temporary system, I usually clone that repository first, so I can reuse past solutions.
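
A minimal sketch of that bootstrap (the repo URL and file layout here are made up):

```sh
# Hypothetical public recipes repo; substitute your own
git clone https://github.com/example/shell-recipes.git ~/recipes

# Reuse a past solution directly, or crib from it
sh ~/recipes/linux/setup-aliases.sh
```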

[–] TheV2 1 points 2 weeks ago (3 children)

Me too, and it works across other Linux distros, but in this case it's a Windows Sandbox. Unless it's just copy and paste, it wouldn't be worth it here, and I assume there can be similar situations in the future for other reasons.

I once started to work on auto-setup scripts for Windows, but the unpredictable nature of it made me give up on that :D

[–] MajorHavoc 2 points 2 weeks ago

> I once started to work on auto-setup scripts for Windows, but the unpredictable nature of it made me give up on that :D

Yeah. This still sucks, but it's getting substantially better every year. My lazy rule of thumb is: if I can find a solution inside WMI (Windows Management Instrumentation), then I'll script it. Otherwise, I figure I'm wasting my time, since it will change anyway.
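
As an illustration of what that rule of thumb looks like in practice, from PowerShell (the class names are just common examples, not a prescription):

```powershell
# Query a well-known WMI/CIM class
Get-CimInstance -ClassName Win32_OperatingSystem |
    Select-Object Caption, Version, BuildNumber

# Browse what else WMI exposes before committing to script against it
Get-CimClass -Namespace root/cimv2 |
    Select-Object -First 20 -ExpandProperty CimClassName
```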

[–] MajorHavoc 1 points 2 weeks ago* (last edited 2 weeks ago)

> but in this case it's a Windows Sandbox.

If it's a 'modern' Windows 10 or later image, winget is preinstalled (sort of / mostly; it may take a few minutes to bootstrap itself after first login) and has access to a release of Git.

So I'm able to bootstrap this pattern on Windows with something like:

```
winget install --id Git.Git -e --source winget
```

Syntax from Stack Overflow

I'm pretty sure I usually just use winget install Git.Git, but someone on SO recommends the longer version above. I'm guessing the exact-ID match and pinned source prevent an interactive prompt, since there's more than one package source, if I recall.
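
Both forms side by side, for reference:

```
# shorter form; winget searches its sources for a matching ID
winget install Git.Git

# exact ID match (-e), pinned to the winget source; avoids any ambiguity
winget install --id Git.Git -e --source winget
```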

[–] MajorHavoc 1 points 2 weeks ago* (last edited 2 weeks ago)

> I assume there can be similar situations in the future for other reasons.

You may be happily surprised. We don't agree on much in technology, but bootstrapping with git is supported in places where nothing else works, and it's finally popular even among Windows engineers.

I recall encountering two exceptional cases:

  • An 'almost never change anything' immutable distribution like Batocera.
  • A host with no Internet access.

In both cases, I still version the relevant scripts in the same git repository, but I end up getting scrappy about how I deploy them.

On an immutable distribution, I'll curl, wget, or Invoke-WebRequest a copy of each file I need, as I need it. I run into this often enough that I find it worth putting copies into a public S3 bucket with a touch of nice DNS in front. It does wonders for me remembering the correct path to each file.
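
For example, with a made-up domain and path in front of the bucket:

```sh
# Hypothetical bucket-fronting domain; substitute your own
curl -fsSL https://scripts.example.com/linux/setup-aliases.sh -o setup-aliases.sh

# or the PowerShell equivalent on Windows:
#   Invoke-WebRequest https://scripts.example.com/linux/setup-aliases.sh -OutFile setup-aliases.sh
```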

On a completely offline host, I run git init --bare in a folder at the root of a thumb drive or network share, git push a shallow copy of my scripts repo to it, and then git clone from it on the machine I'm working on. I also keep a plain file copy alongside, in case I can't get git bootstrapped on the offline machine.

I do still bother with the git version because I invariably need to make a tiny nuanced script correction, and it's so much easier (for my work patterns) to sync it back later with git.
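
Putting the offline flow together, a minimal sketch (mount points and the branch name are illustrative, and I've left out the shallow-push detail):

```sh
# On a connected machine: create a bare repo at the root of the thumb drive
git init --bare /mnt/usb/scripts.git

# Push the scripts repo to it
cd ~/scripts
git push /mnt/usb/scripts.git main

# Belt and braces: a plain file copy, in case git can't be bootstrapped offline
cp -r ~/scripts /mnt/usb/scripts-plain

# On the offline machine: clone from the mounted drive
git clone /media/usb/scripts.git ~/scripts

# ...make the inevitable tiny correction, commit, then push back to the
# drive copy, ready to sync into the main repo later
git push origin main
```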