Hacker News

The 'sh' command that it corresponds to is above it:

  find . -iname '*test*' -print0 | xargs -0 grep data
but possibly a bad assumption on my part that the mapping was clear. In any event, 'm' is for regex string matching, and 'f<' is for reading a file into a generator (basically an iterator over the lines in the file). It's a good point that more, simpler examples would help.
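The 'f&lt;' behaviour described here (a lazy iterator over a file's lines) maps fairly naturally onto Python generators. A rough sketch of the idea, with hypothetical helper names — this is an analogy, not cosh's implementation:

```python
import os
import re
import tempfile

def f_read(path):
    """Rough analogue of cosh's 'f<': lazily yield the file's lines."""
    with open(path) as fh:
        yield from fh

def grep(pattern, lines):
    """Rough analogue of grepping with an 'm' regex: keep matching lines."""
    rx = re.compile(pattern)
    return (line for line in lines if rx.search(line))

# Demo: write a small file, then filter its lines lazily.
with tempfile.NamedTemporaryFile("w", suffix="-test.txt", delete=False) as tf:
    tf.write("data line\nother line\nmore data\n")
    path = tf.name
matches = list(grep("data", f_read(path)))
os.unlink(path)
print(matches)  # ['data line\n', 'more data\n']
```

Because f_read is a generator, nothing is read until grep pulls lines from it, which is the point of the generator-based design.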


The examples in the README are really not the best.

Anyway, going by the examples, this reads to me as a language for writing hacky one-liners which are probably easy to write once but hard to read any time after. One moment... we had such a language in the past: Perl! :-D

Perl 5 one-liners were infamous for their expressive power but sometimes crazy to understand even if you thought you were fluent in Perl ;-)


> Anyway, going by the examples, this reads to me as a language for writing hacky one-liners which are probably easy to write once but hard to read any time after.

There is definitely a 'write-only' angle to concatenative (postfix) languages that rely on a stack. I think this type of language/approach is uniquely well-suited to this use case, though, where the focus is interactive use, plus short programs that are not generally intended for distribution, since you get the most out of the advantages around conciseness, without incurring the costs that come with larger programs.
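As a sketch of what "concatenative (postfix) languages that rely on a stack" means in practice, here is a toy postfix evaluator in Python (illustrative only; not cosh's actual semantics):

```python
def eval_postfix(tokens):
    """Evaluate postfix tokens against a stack: numbers are pushed,
    operators pop two values and push the result."""
    stack = []
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for tok in tokens:
        if tok in ops:
            b = stack.pop()
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    return stack

print(eval_postfix("1 2 + 3 *".split()))  # [9]
```

The conciseness (no parentheses, no variable names) and the "write-only" quality (you have to mentally track the stack to read a program) both fall out of the same design.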


We have two now: Perl and jq.


Even after perusing jq's manual multiple times and having written several complex incantations, I still have no idea how to properly combine `|` and `.[]` except by trial and error, or why `select()` needs to be used inside `map(select(...))`.

Recently I needed to extract some data, and after fighting with jq and its manual for half an hour, I solved the problem in 30 seconds with Node.js.

I appreciate the idea behind jq, but its language is horrible. Even XPath was easier and cleaner.


Some nice alternatives for querying JSON via CLI include jello, yamlpath, and dasel.


Which one do you think is best? And, if applicable, which one do you love but it’s not quite first place material yet?


Don't forget `gron`.


Hmm. I also think jq is more awkward than it needs to be, but I don’t think the points you mention are a problem. Maybe explaining them would help?

(Note: the following explains jq’s operation using the smallest possible subset of the language; it doesn’t aim to use the most natural programs possible.)

So jq’s data model (much like XPath’s, actually) is that everything is a (possibly empty) stream of (JSON) values. On input (unless you use -s), it accepts any number of concatenated JSON values (usually separated by newlines or ASCII RS, but as JSON is self-delimiting that isn’t strictly required) and makes a stream out of them.

That is then fed into the program, a pipeline of |-separated transforms, each of which can generate zero or more output elements from each input element. For example, .foo is a one-to-one transform that, when it accepts an object, emits the value of its foo property (and fails otherwise):

  $ echo '{"foo": null}{"bar": 1, "foo": 42}' | jq .foo
  null
  42
And .[] is a one-to-zero-or-more transform that, on accepting an array, emits each array element separately (and fails otherwise):

  $ echo '[false,1][][2]' | jq '.[]'
  false
  1
  2
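If it helps, these one-to-N transforms behave like Python generator functions. A loose analogy (not jq’s implementation):

```python
def dot_foo(stream):
    """Loose analogue of jq's .foo: one output per input object."""
    for value in stream:
        yield value["foo"]

def iterate(stream):
    """Loose analogue of jq's .[]: zero or more outputs per input array."""
    for value in stream:
        yield from value

print(list(dot_foo([{"foo": None}, {"bar": 1, "foo": 42}])))  # [None, 42]
print(list(iterate([[False, 1], [], [2]])))                   # [False, 1, 2]
```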
While select(F) is a one-to-zero-or-one transform that, on accepting an element, feeds it into F and lets it pass through if F produced a truthy value, or rejects it if falsy:

  $ echo '{"foo": null}{"bar": 1, "foo": 42}' | jq -c 'select(.foo)'
  {"bar":1,"foo":42}
OK, that last one was a bit of a lie. Because we don’t want to introduce functions into the language as a separate kind of thing from transforms, F is also a transform, so it might possibly emit more than one value in response to whatever select fed it. The full truth is that select(F) is a one-to-zero-or-more transform that emits each input value as many times as there are truthy values in F’s response to it:

  $ echo 'false 42' | jq 'select([true, "also truthy"] | .[])'
  false
  false
  42
  42
That might not have been terribly useful, but it illustrates two points. First, a JSON literal is a valid transform: one that emits itself every time it gets something. (That’s why you need to write .[] for flatten: plain [] is the empty array literal.) Second, while jq cannot do many-to-one transforms, on pain of losing its streaming nature, it can do something like nested contexts, where it launches a subordinate pipeline and does something with the results.
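In loose Python terms (again an analogy, not jq’s implementation), this fuller select(F) looks like:

```python
def select(cond, stream):
    """Loose analogue of jq's select(F): emit each input element once
    for every truthy value that F produces for it."""
    for value in stream:
        for result in cond(value):
            if result:
                yield value

# F = '[true, "also truthy"] | .[]' emits two truthy values per input,
# so each input element is emitted twice -- even the falsy input.
cond = lambda _value: iter([True, "also truthy"])
print(list(select(cond, [False, 42])))  # [False, False, 42, 42]
```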

And it is willing to collect those results instead of streaming them: if you have a pipeline P, then [P] is a one-to-one transform that, for each input element, runs P on it, collects all of the results, puts them into an array and emits that. For example:

  $ echo '[[0,1],[2]] [[]] [] [[3]]' | jq -c '[.[] | .[]]'
  [0,1,2]
  []
  []
  [3]
Or:

  $ echo '[false,1][][2]' | jq -c '[.[] | select(.)]'
  [1]
  []
  [2]
(here . is the one-to-one identity transform). Instead of [.[] | P] you can write map(P).

What this boils down to is: select(C) will go through the input stream and pare its elements down to those that satisfy C, while map(select(C)) will go through an input stream of arrays and pare each array’s elements down to those that satisfy C.
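That distinction can be sketched in Python (a loose analogy, using closures over a condition):

```python
def truthy(value):
    # jq truthiness: everything except false and null (note: 0 is truthy)
    return value is not None and value is not False

def select_stream(stream):
    """Like select(C) over a stream: keep the elements that satisfy C."""
    return (v for v in stream if truthy(v))

def map_select(stream):
    """Like map(select(C)) over a stream of arrays: filter inside each array."""
    return ([v for v in arr if truthy(v)] for arr in stream)

print(list(select_stream([False, 1, None, 2])))  # [1, 2]
print(list(map_select([[False, 1], [], [2]])))   # [[1], [], [2]]
```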

Final point: if you want to give up streaming, the -s / --slurp flag will slurp the input stream into an array, then feed it to your program as a single element. That is, jq -s '.[] | P' is a worse synonym of jq P.


Don't listen to most of the people here. Perl? What are they talking about? It is a concatenative shell. Think Forth.

That said, the examples could be better explained, just like you did here now. The examples are not bad, but things like the f iterator probably need some explaining.

I really like the idea. A simpler shell language is something I've wanted. Either Lisp or Forth would work, but the basic issue is that bash and the rest are so complicated, with so many special things you have to know. Doing anything other than the most basic shell scripts is horrible.


> That said, the examples could be better explained, just like you did here now. The examples are not bad, but things like the f iterator probably need some explaining.

Thanks, the readme has been expanded now so that it has better examples, and the full documentation (https://github.com/tomhrr/cosh/blob/main/doc/all.md) is more clearly marked for those who are looking for more detail.


I love the premise. I've got to be your target audience, but the examples are too hard for me: both sh & cosh.

I don't use `find` and I would have to look up `-0`.

Why does cosh use ';'? The syntax kinda makes it look like it's not needed.


> but the examples are too hard for me: both sh & cosh

Thanks, I'll look at adding some simpler examples.

> I would have to look up `-0`

The thing about '-0' is that it's not required in cosh, because you're dealing with proper values instead of text streams. The problem that '-0' (and '-print0') addresses doesn't arise.
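For anyone curious why '-0'/'-print0' exist at all in the sh version: filenames can legally contain newlines, so newline-delimited output is ambiguous, while NUL can never appear in a filename. A quick Python illustration:

```python
# Two filenames, one of which (legally) contains a newline.
names = ["plain_test.txt", "odd\nname_test.txt"]

newline_delimited = "\n".join(names)  # roughly what plain 'find' emits
nul_delimited = "\0".join(names)      # roughly what 'find -print0' emits

print(len(newline_delimited.split("\n")))  # 3 -- one name was split in two
print(len(nul_delimited.split("\0")))      # 2 -- correct
```

xargs -0 is the consumer side of the same contract: it splits its input on NUL instead of whitespace.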

> Why does cosh use ; the syntax kinda looks like they're not needed?

';' needs to be used to mark the previous string (token) as a function where that can't be determined from context. For example, if you type '1 2 +' and press enter, the shell will assume that because there's a function called '+', the intention is to run that function, but you could also enter '1 2 +;' (or '1 2 + ;') to get the same result. Whereas '1 2 + 1 2 + +' doesn't work, because the shell doesn't know whether the first two '+' symbols are meant to be interpreted as function calls or just plain strings. The other place where a function call is assumed is at the end of an anonymous function, so `[1 2 +]` and `[1 2 +;]` have the same effect.



