
> There is no operator precedence. A bare a * b + c is an invalid expression. You must explicitly write either (a * b) + c or a * (b + c).

Honestly I've often wished for this in mainstream languages. It seems like operator precedence should go the way of bracketless if and implicit int casts. (Though I wonder if they wind up making exceptions here for chains of method calls? I guess technically those rely on operator precedence sort of?)

Edit: Yeah I see the example code has "args.src.read_u8?()". So it looks like they figured out how to keep the good stuff.



Yes please! I'm always using parentheses for every compound expression, and I've heard so many times from coworkers or code reviewers smugly going "you know you can skip those, right?". At the same time I've seen the same people scratching their heads over precedence in some attempt to code-golf their way through a feature. Not to mention bugs caused by incorrect assumptions. Or pausing to figure out what some previously written expression actually does. Meanwhile, I'll gladly write `X + (Y / Z)`. You can thank me later.


Thank you!

In some sane industries where functional safety is required, this is strictly enforced. It leaves no ambiguity - what you write is what you get - and when the excrement hits the quickly revolving pressure modification device, you can just glance at the expressions and tell if they make sense or not.


LISP-like languages enforce explicit grouping via Polish (prefix) notation, e.g. (+ (* a b) (+ c d)), so operator precedence never comes into play.


In addition, the variadic prefix notation means the operators are not limited to being binary:

  3*x*y*z+w
becomes:

  (+ (* 3 x y z) w)


Also, pleasantly, the 0-ary invocation yields the operator's identity, so (+) is 0 and (*) is 1.
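For the curious, here's a rough analogue in Python (my sketch, nothing Lisp-specific: Python's built-ins happen to expose the same identity elements):

  import math

  print(sum([]))        # 0 -- the additive identity, like (+)
  print(math.prod([]))  # 1 -- the multiplicative identity, like (*)

  # (+ (* 3 x y z) w), i.e. 3*x*y*z + w, with no operator chaining:
  x, y, z, w = 2, 3, 4, 5
  print(sum([math.prod([3, x, y, z]), w]))  # 77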


In addition to requiring you to remember the precedence of + versus *, this requires you to remember the order of evaluation. Is it (ab)c or a(bc)? And no, for certain types those are not necessarily the same. Floats, for example.


1. It doesn't require you to remember precedence, since there is no ambiguity.

2. It doesn't require you to remember order of evaluation, because the order is unspecified: (* x y z) is defined to be "the product of x, y, and z" with no requirement on the order of evaluation. If you need a well-defined order of evaluation, you can request it explicitly: (* x (* y z))
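A quick illustration of why the grouping can matter for inexact types (Python here, but the behavior is IEEE-754 floating point, not anything language-specific):

  a = (0.1 + 0.2) + 0.3  # 0.6000000000000001
  b = 0.1 + (0.2 + 0.3)  # 0.6
  print(a == b)          # False: float addition is not associative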


At least you could write a simple macro to do left-to-right evaluation with the standard operators, to get the same effect if you wanted.
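In Python terms (a sketch with a hypothetical helper, not a real macro system), such a left-to-right fold is just a reduce:

  from functools import reduce
  import operator

  def add_ltr(*args):
      # Fold strictly left to right: ((a + b) + c) + ...
      return reduce(operator.add, args)

  print(add_ltr(0.1, 0.2, 0.3))  # 0.6000000000000001, same as ((0.1 + 0.2) + 0.3)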


Does Lisp have native bigints, or would something like (+ MAX_INT MAX_INT MIN_INT) still suffer from order-of-evaluation issues?


Common Lisp's integers are transparently multi-precision. There is no need to work with a separate type, or to use special syntax for writing bignum tokens in source code.

Bignum support first appeared in the MacLisp dialect in 1970 or 1971, one of the main predecessor dialects of Common Lisp.

According to Gabriel and Steele's Evolution of Lisp paper, "bignums—arbitrary precision integer arithmetic—were added [to MacLisp] in 1970 or 1971 to meet the needs of Macsyma users".


I don't know about other Lisps, but both Common Lisp and Scheme have native bignums, and fixnums are promoted to bignums automatically.
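Python's integers behave the same way, which makes the grandparent's question easy to illustrate (MAX_INT/MIN_INT here are just the 64-bit signed bounds, picked for the example):

  MAX_INT = 2**63 - 1
  MIN_INT = -(2**63)

  # No intermediate overflow: the first addition silently promotes
  # past the 64-bit range instead of wrapping or trapping.
  print(MAX_INT + MAX_INT + MIN_INT)  # 9223372036854775806, i.e. 2**63 - 2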


There's no operator precedence if you don't have (multiple) operators that could precede one another. In LISP-like languages these are simply functions (or, more correctly, forms) which take other expressions as arguments, like any other function or form. LISP works just fine without many of the things we take for granted in ALGOL-like languages.


Polish notation enforces binary operators. LISP doesn't, so you have to have the parentheses. (+ a b c) is + a + b c or + + a b c in Polish notation. These are the same, of course, until they are not, such as with floating-point arithmetic or when you trap on integer overflow.


That is a completely wrong-headed view. There is no precedence there because there is no ambiguity. The parentheses in your example are the function call parentheses, not the optional grouping parentheses. They are mandatory.

There are some issues of associativity in the semantics of some Lisp functions. For instance, we know that the syntax (+ 1.0 2 3.0 4) is a list object containing certain items in a known order.

But how are they added? This could depend on dialect. I think in Common Lisp, the result has to be as if they were added left to right. When inexact numbers are present, it matters.

This isn't a matter of syntax; it's a semantic matter, which depends on the operator.

For instance in (- 1 2 3 4), the 2 3 4 are treated as subtrahends which are subtracted from the 1. But (- 2) means subtract 2 from 0.
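Sketching those subtraction semantics in Python (a hypothetical helper, not how any Lisp actually implements it): a left fold over the subtrahends, with a special unary case:

  from functools import reduce
  import operator

  def minus(first, *rest):
      if not rest:
          return -first                         # (- 2)  => -2
      return reduce(operator.sub, rest, first)  # (- 1 2 3 4) => ((1-2)-3)-4

  print(minus(1, 2, 3, 4))  # -8
  print(minus(2))           # -2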

In TXR Lisp, I allowed the expt operator to be n-ary: you can write (expt 2 3 4). But this actually means (expt 2 (expt 3 4)): it is a right-to-left reduction! It makes more sense that way, because it corresponds to:

     4
    3
   2

The left-to-right interpretation would be

     3 + 4
   2
which is less useful: you can code that yourself using (expt 2 (+ 3 4)), for any number of additional terms after the 4.
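The same right-to-left reduction is easy to sketch in Python, whose own ** operator happens to associate the same way:

  from functools import reduce

  def expt(*args):
      # Fold from the right: expt(2, 3, 4) == 2 ** (3 ** 4)
      return reduce(lambda acc, base: base ** acc, reversed(args))

  print(expt(2, 3, 4) == 2 ** (3 ** 4))  # True
  print(2 ** 3 ** 4 == 2 ** (3 ** 4))    # True: Python's ** is right-associative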


APL (and its derivatives, I think) evaluates strictly right to left, so

  a * b + c

is a * (b + c). It might be jarring at first, but I really came to enjoy the consistency: I never had to remember operator precedence, which helps in a language like APL where most functions are infix.
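A toy version of that rule in Python (my sketch; real APL parsing handles far more): every operator gets equal precedence and the rightmost binds first:

  import operator

  OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

  def eval_rtl(tokens):
      # tokens like [2, '*', 3, '+', 4]; fold from the right.
      result = tokens[-1]
      for i in range(len(tokens) - 2, 0, -2):
          result = OPS[tokens[i]](tokens[i - 1], result)
      return result

  print(eval_rtl([2, '*', 3, '+', 4]))  # 2 * (3 + 4) = 14, APL-style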


Conversely, Smalltalk is left-to-right, so

a + b * c is (a + b) * c

which is simply a result of every operation being a message send - muddling the rules with precedence would be likewise confusing, and would ruin the simplicity of the grammar.


And in Forth:

  a b + c *

I believe languages with "default precedence" were meant to help us write fewer (parentheses), but in the long run we ended up abusing (it).
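For flavor, a minimal postfix evaluator in Python (a sketch in the spirit of Forth's stack, not Forth itself):

  import operator

  OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

  def eval_rpn(tokens):
      stack = []
      for tok in tokens:
          if tok in OPS:
              b, a = stack.pop(), stack.pop()  # note the operand order
              stack.append(OPS[tok](a, b))
          else:
              stack.append(tok)
      return stack.pop()

  print(eval_rpn([2, 3, '+', 4, '*']))  # a b + c * with a=2, b=3, c=4 -> 20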


I tend to think it's fine for the most common and obvious operators (MDAS, etc.), but as soon as you get outside of those I agree. In particular, I've been bitten by the precedence of JavaScript's ?? operator:

  function foo(a) {
    // + binds tighter than ??, so this parses as a ?? (10 + " is the num"),
    // not the intended (a ?? 10) + " is the num".
    return a ?? 10 + " is the num";
  }

  foo(12)        // 12, not "12 is the num"
  foo(undefined) // "10 is the num"


Me too! So far I have seen four actual bugs in large numerical code bases that were caused by overlooking operator precedence. I expect to see more in the years to come.

I think that precedence of '*' over '+' is acceptable (as everyone knows it instinctively), but I would love a way to require parentheses for everything else.


APL wasn't about to bet that * obviously takes precedence over +. There, everything has equal precedence, right to left, until you add parentheses. And that's for verbs (monadic and dyadic operators over nouns), not for other parts...


Everyone doesn't know it instinctively, although I used to think so.

A while ago I taught an introductory spreadsheet class for adults. I got them to try "=2*3+4" and about half the class were surprised that the result wasn't 20. It's a lesson that has stayed in my mind.


I don't know if it was a joke (good one if so), but you seem to have mixed the operators up in your example.


I don't think ditching the "basic" operator precedence (MDAS etc.) is a good idea, but I strongly agree that operator precedence should be a partial order, not a total order. See also [1].

[1] https://foonathan.net/2017/07/operator-precedence/
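One possible reading of that idea, sketched in Python (my assumption of how such a check could work, not the linked article's actual design): comparable operators may mix freely, while incomparable ones force parentheses:

  # Declared pairs: the first operator binds tighter than the second.
  TIGHTER = {('*', '+'), ('/', '+'), ('*', '-'), ('/', '-')}

  def comparable(a, b):
      return a == b or (a, b) in TIGHTER or (b, a) in TIGHTER

  def check_mix(ops):
      # ops: the operators appearing at one unparenthesized level
      for i, a in enumerate(ops):
          for b in ops[i + 1:]:
              if not comparable(a, b):
                  raise SyntaxError(f"parenthesize: no relative precedence for '{a}' and '{b}'")

  check_mix(['*', '+'])  # fine: * vs + is ordered
  try:
      check_mix(['<<', '+'])  # '<<' and '+' are incomparable
  except SyntaxError as e:
      print(e)  # parenthesize: no relative precedence for '<<' and '+'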


Have there been attempts at creating languages that use a postfix (RPN) notation?


Forth is (I think?) the oldest and most well-known. PostScript, the page description language, is possibly more widely deployed. And Factor is a modern take on Forth.


Don't forget UNIX's dc (in the old days, bc was a wrapper around dc).


There's also Bitcoin Script, which is a Forth-like language.


To add to the list: HPL, the programming language on Hewlett-Packard's RPN calculators.


Like Forth?


That's just brilliant, and now that I think about it, I wonder why no other newer languages have adopted this. I wish this to become the new norm of the 2020s.


Only the good ones do it like that. Pony, e.g., enforces it, and it came well before Wuffs. Rust, on the other hand, lives with precedence rules which you have to remember by heart.


That isn't a bad idea, but keep in mind, there is always a usability aspect (I'll just call it the programmer computer interface problem) of "what makes a programming language popular". For example, consider PL/I: https://en.wikipedia.org/wiki/PL/I#Implementation_issues

When people see, for example, * or + (or a, b, c), they may have some preconceptions about implied associativity from arithmetic (depending on what they were taught and what level of math they are at) that may be hard to break. If you have learned some college (abstract) algebra, those symbols may mean something quite different. How about the = sign? Of course, a, b, c may be meaningless to someone who is not a native Latin-1 speaker either. My point, I guess, is that these are just matters of convention: there is usually some implied commutativity or associativity, but it is all arbitrary.

Now, one interesting "quirk" with PL/I was that certain things looked similar "to what people were used to" (relative to, say, other PL/I code, or FORTRAN or COBOL) but worked differently, even within a small area of the screen (two blocks of nearby code in an editor). If a programmer's eye saw a block of code, then reflexively, depending on their experience, they might be able to predict what the computation would do. PL/I was an interesting experiment because of its lack of reserved keywords. This made it very expressive but very hard to understand in context. For example, in pseudo PL/I: foo = 1; = = 2; bar 2 + foo. You are basically changing the grammatical syntax of the language in 3 lines.

But on the other hand, everything is just a symbol, and this may not be completely unusual. Consider the diversity of the world's languages, how they are written, and how meaning is derived. Natural language grammars may connote very different representations and transformations, but people learn because they see enough examples. Consider the differences between Han, Brahmic scripts, Arabic BiDi, various African scripts, Cuneiform, Emoji, whatever. Perhaps all computer languages are "overfit" due to, for example, Chomsky's ideas and BNF (keep in mind Chomsky's ideas about morphology were quite different).

Now, let's consider mathematical notation. Depending on how much pure math (or, say, mathematical physics or other sciences) you've seen, there may be more and more semantic overhead in the conventions of mathematical notation, and historically people often just "Cartesianized" and "Euclideanized" things for convenience, for lack of tooling (think of the sheet-of-paper metaphor: we've simply moved it over to a computer; it's a skeuomorph). Clearly we have better computer graphics now, so why haven't developer tools and languages changed along with them? Maybe with more immersive manipulation they will.



