this post was submitted on 10 Dec 2023
105 points (95.7% liked)

Programming

submitted 11 months ago* (last edited 11 months ago) by starman to c/programming
 

TL;DR: It would be cool if all CLI apps supported JSON output, but in the meantime we can use jc
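For illustration, the pipeline the post envisions looks roughly like the commented line below (jc does ship an ifconfig parser, but the jq field name is an assumption, not verified). After it is a dependency-free sketch of what a jc-style parser actually does, turning canned `ip -brief`-style columns into JSON using only printf and awk:

```shell
# Hypothetical end state (the ipv4_addr field name is an assumption):
#   ifconfig ens33 | jc --ifconfig | jq -r '.[0].ipv4_addr'

# Dependency-free sketch of what such a parser does: convert columnar
# command output into one JSON object per line. Sample input via printf.
printf 'ens33  UP  198.51.100.2/24\nlo  UNKNOWN  127.0.0.1/8\n' |
awk '{ split($3, a, "/")
       printf "{\"name\":\"%s\",\"state\":\"%s\",\"addr\":\"%s\"}\n", $1, $2, a[1] }'
```

Each input row becomes a line of JSON, ready for jq or any other JSON-aware tool.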

all 16 comments
[–] [email protected] 36 points 11 months ago (1 children)

Not to piss on anyone's parade here, but grepping something out of a JSON structure is one of the most frequently asked questions about jq as well. Of course JSON is nice, but if the goal is to simplify data extraction, I'm not sure much will be gained by this.

As far as reducing the toolchain necessary to extract the same data, this is a welcome addition.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago)

I've never been able to successfully write a jq command without having to Google it. The syntax is complex and I don't use it often, so I just forget everything.

I hope they figure out something with a more intuitive syntax, something SQL-like that people can write without having to look at a manual.

Anyways, AI is here... pretty soon we'll just translate natural text to whatever overly complex language there is.

I'm sure I'll get replies from people saying jq is easy. It isn't for me; right now I can't even remember how to filter and create associations between objects. I think I'll just start writing small Python apps to process JSON. A bit longer, but at least I can maintain them. The only issue is that Python is too heavy... I'll figure something out.
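For what it's worth, the small-Python-app route can stay a one-liner for simple extractions. A sketch, where the sample document merely mimics the shape of `ip -j addr` output:

```shell
# Filter JSON with python3 instead of jq; the printf'd sample stands in
# for whatever a command's JSON output would be.
printf '{"addr_info": [{"family": "inet", "local": "198.51.100.2"}]}\n' |
python3 -c 'import json, sys; print(json.load(sys.stdin)["addr_info"][0]["local"])'
```

The indexing syntax is ordinary Python, so there is nothing extra to memorize.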

I've been thinking for a while that what we actually need is a modern shell language, like a mix between Python and a shell. Imagine if you had native support for reading JSON in the shell.

Edit: oh shit. Said all of this and then saw the comment below talking about Nushell. Today is a good day.

[–] ishanpage 31 points 11 months ago* (last edited 11 months ago) (4 children)

While jc is a great tool, and I'm definitely a fan, I believe the real solution to the overarching problem lies in a paradigm shift: see nushell

[–] starman 7 points 11 months ago (1 children)

I actually use both! It's so nice to just jc git log and then work with the data using nushell :)

[–] dukk 1 points 11 months ago

Oh that’s smart! And then nushell just handles the data for you…I might try that!

[–] [email protected] 6 points 11 months ago* (last edited 11 months ago) (1 children)

I've always struggled with actually retaining knowledge on how to use the myriad tools you'd usually need to extract/parse data (awk, sed and friends) and this was a game changer. I don't quite daily drive it just yet but when I do need it, it's vastly more ergonomic.

[–] [email protected] 2 points 11 months ago

love to see fellow* people in the unix world discovering what made powershell great 10+ years ago as though it's a paradigm shift. and the top comment of the thread is still nitpicking the points in the post while missing the forest for the trees. I mean ifconfig as an example is lol but still.

like powershell's got such bizarre warts and design choices, and yet it's more ergonomic than unstructured text pipelines despite that

i know it's not that simple but still, lol

*linux has been my primary os for 9+ years, not throwing stones here

[–] [email protected] 5 points 11 months ago

Nushell is so great! I’ve been using it for a couple years. It has completely replaced my need for tools like grep, sed, awk, etc. and because it handles JSON and so many other data formats natively I rarely even need to think about parsing.

[–] [email protected] 2 points 11 months ago

Thank you for bringing up nushell, had never heard of it

[–] [email protected] 16 points 11 months ago* (last edited 11 months ago)

The author is trying to solve a non-existent problem with a tool that doesn't meet the requirements he himself presented.

$ ifconfig ens33 | grep inet | awk '{print $2}' | cut -d/ -f1 | head -n 1

Yeah, it's awful. But wait… could one achieve this in a simpler way? Assume we never heard that ifconfig was deprecated (how many years ago? 15 or so?). Let's look at the ifconfig output on my machine:

ens33: flags=4163  mtu 1500
        inet 198.51.100.2  netmask 255.255.255.0  broadcast 255.255.255.255
        inet6 fe80::12:3456  prefixlen 64  scopeid 0x20
        ether c8:60:00:12:34:56  txqueuelen 1000  (Ethernet)
        RX packets 29756  bytes 13261938 (12.6 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 5657  bytes 725489 (708.4 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

It seems the cut part of the pipeline is not needed, because the netmask is printed separately. The purpose of the head part is likely to avoid printing the IPv6 address, but that can be achieved by tightening the regular expression. So we get:

$ ifconfig ens33 | grep '^\s*inet\s' | awk '{print $2}'

If you know a bit more about awk than just the print command, you can change this to:

$ ifconfig ens33 | awk '/^\s*inet\s/{print $2}'
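To try this without a live interface, the same pattern-action program can be run against the sample output quoted above via a heredoc (using `[[:space:]]` here rather than `\s`, which is a GNU awk extension and not portable):

```shell
# Pattern-action awk against canned ifconfig-style output; the regex
# matches the inet line but not inet6, and $2 is the address.
awk '/^[[:space:]]*inet[[:space:]]/ { print $2 }' <<'EOF'
ens33: flags=4163  mtu 1500
        inet 198.51.100.2  netmask 255.255.255.0  broadcast 255.255.255.255
        inet6 fe80::12:3456  prefixlen 64  scopeid 0x20
EOF
```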

But now remember that ifconfig has been replaced by the ip command (the author knows this; he uses ip in the article, just not in this example that is supposed to show how weird "traditional" pipelines are). ip lets you request a format that is easier to parse and more predictable, and it is easy to ask it not to print information we don't need:

$ ip -brief -family inet address show dev ens33
ens33            UP             198.51.100.2/24

Not only does this mean we don't need to filter out any lines, but the output format of ip is unlikely to change in future versions, while ifconfig output is not so predictable. However, we still need to split off the netmask:

$ ip -brief -family inet address show dev ens33 | awk '{ split($3, ip, "/"); print ip[1] }'
198.51.100.2

The same without awk, in plain shell:

$ ip -brief -family inet address show dev ens33 | while read _ _ ip _; do echo "${ip%/*}"; done
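The `${ip%/*}` expansion removes the shortest suffix matching `/*`, i.e. the prefix length. Here is the same loop runnable against canned input (`read -r` added to keep backslashes literal):

```shell
# POSIX parameter expansion strips the /24 prefix length; the sample
# line is supplied by printf instead of a live `ip` invocation.
printf 'ens33            UP             198.51.100.2/24\n' |
while read -r _ _ ip _; do echo "${ip%/*}"; done
```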

Is it better than using JSON output and jq? It depends. If you need to obtain an IP address in an unpredictable environment (i.e. an end-user system you know nothing about), you cannot rely on jq because it is never installed by default. On your own system, or a system you administer, the choice is between learning awk and learning jq, because both are quite complex. If you already know one, just use it.

Where does the jc tool fit in here? It doesn't. You don't need to parse ifconfig output; ifconfig isn't even installed by default in most modern Linux distros. And jc has nothing in common with the UNIX philosophy, because it is not a simple general-purpose tool but an overcomplicated program with hardcoded parsers for text formats that may change and break those parsers. Before parsing the output of a command designed for readability, you should ask yourself: how can I get the same information in a parseable form? You almost always can.

[–] [email protected] 15 points 11 months ago (1 children)

I kinda love it in theory.

Will be trying this out.

I do find it funny, however, that awk is lumped together with small single-purpose tools like sed, grep, tr, cut, and rev, since awk can replace all of them and is its own language.

I don't think the emphasis should be on simplicity, but rather on understandability (which long awk commands don't offer either).

If you give someone a bash script, they should be able to know exactly what the code will do when they read the script without having to run it or cat out the source it might need to parse. Using ubiquitous tools that many people understand is a good step.

Sadly, while awk is installed by default in most distros, tools like jq and jc would require installation.

[–] [email protected] 4 points 11 months ago

AWK is fucking awesome!

[–] [email protected] 8 points 11 months ago

And there is also Nushell and similar projects. Nushell has a concept serving the same purpose as jc: you can install Nushell frontend functions for familiar commands, so that the frontends parse output into a structured format, and you get Nushell auto-completions as part of the package. Some of those frontends are included by default.

As an example if you run ps you get output as a Nushell table where you can select columns, filter rows, etc. Or you can run ^ps to bypass the Nushell frontend and get the old output format.

Of course the trade-off is that Nushell wants to be your whole shell while jc drops into an existing shell.

[–] [email protected] 1 points 11 months ago

I rather like this idea.

Basically, take all of the "let's all write parsers now" work of handling the plain-text output of *nix coreutils and bundle it into a single tool. JSON is then the structured output format, which should replace all the parsing work with querying work, which should be nicer and easier.

Backwards compatible, kinda unix-y, optional, and should play nice with existing tooling. I hope it works out!