
vidarh · last Wednesday at 6:10 PM

I'd replace the first part with the following (it isn't any shorter, but in general, if I want a list of files for a pipeline, find is usually more flexible than ls for anything but the most trivial cases):

    find -maxdepth 1 -type f -printf '%s %f\n' | sort -n | head -n 5
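For clarity, '%s %f' prints each file's size in bytes followed by its basename, so sort -n | head -n 5 leaves you with the five smallest files as lines like these (names and sizes made up for illustration):

    312 notes.txt
    4096 config.yml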
For the latter part, I'd tend to think that if you're going to use awk and jq, you might as well use Ruby.

    ruby -rjson -nae 'puts(JSON.pretty_generate({n: $F[1], s: "%.5f MB" % ($F[0].to_i / 1e6)}))'
("-nae" effectively takes an expression on the command line (-e), wraps it in "while gets; ... end" (-a), and adds the equivalent to "$F = $_.split" before the first line of your expression (-n))

It's still ugly, though, so still no competition for nushell.

I'd be inclined to drop a little wrapper in my bin with a few lines of helpers (see my other comment) and do it all in Ruby if I wanted to get closer without having to change shells...
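The other comment isn't quoted here, but such a wrapper might look roughly like this (the script name and helper names are made up for illustration):

    #!/usr/bin/env ruby
    # Hypothetical ~/bin/rb-pipe: eval a Ruby expression with a couple of helpers in scope.
    require 'json'

    # Run a shell command and return its output as whitespace-split rows.
    def rows(cmd)
      `#{cmd}`.lines.map(&:split)
    end

    # Pretty-print any Ruby object as JSON.
    def j(obj)
      puts JSON.pretty_generate(obj)
    end

    eval(ARGV.join(' '))

Which would let you do the whole thing in Ruby, without find/sort/head/jq at all:

    rb-pipe 'j(Dir.glob("*").select { |f| File.file?(f) }.sort_by { |f| File.size(f) }.first(5).map { |f| { n: f, s: "%.5f MB" % (File.size(f) / 1e6) } })'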


Replies

ectospheno · last Wednesday at 10:24 PM

Ruby is a pretty natural fit for shell scripting.

https://lucasoshiro.github.io/posts-en/2024-06-17-ruby-shell...
