
> Why do we write 5+5 and not five plus five?

Because tradition is a very hard thing to get rid of. Just like clocks with hands and 12 sections.

If you treat addition as an implicit juxtaposition operator, ⁙⁙ or ⬠⬠ is also damn efficient at representing five plus five (/faɪv.plʌs.faɪv/ in IPA).

In Ruby:

  class Fixnum; alias_method :plus, :+; end   # Integer on Ruby >= 2.4
  five = 5
  five.plus five   # => 10
The multi-letter "five" is a single symbol, just as the multi-pixel 5 glyph is.

Now, there is some convenience in having short written symbols when you do your math with paper and pencil, for example when writing out additions in columns. But that’s all there is to it: pencil convenience.

A concept is a single semantic idea whatever the length of the symbol used to refer to that concept.



> Now, there is some convenience in having short written symbols when you do your math with paper and pencil, for example when writing out additions in columns. But that’s all there is to it: pencil convenience.

You say that as if saying it makes it so, but you've got no evidence that all symbols are equal, even if they are equal in some ways. Iverson delivered an utterly convincing argument to the contrary, showing example after example of the exact opposite. You should read it a few times:

https://dl.acm.org/doi/pdf/10.1145/1283920.1283935

> A concept is a single semantic idea whatever the length of the symbol used to refer to that concept.

No: symbols take up real space on the page, on the screen, and in our minds. There are only so many symbols you're going to be able to fit in your mind in your career, or in your life, so if there's a way to say some symbols are better than others, then by knowing the better symbols you will, among other things, be able to solve bigger problems faster and with fewer bugs. That may not be important to you, but it's important to me!

"+" is a really good symbol. That's why the tradition has been so hard to shake, and that "pencil convenience" has been with us for thousands of years, but "+" isn't that old! For a long time addition was performed with juxtaposition! Just a series of stroke marks like |||| but seriously 5+5 is better than ||||||||||| and nobody can convince me otherwise! I think though, if you aren't convinced by now, I'm not sure what else I can do.


+ was invented by Nicole Oresme in 1360. He was tired of writing "et" (Latin for "and") over and over.

While some standardized operators have made it into programming languages, we are doing a very poor job of using symbols even for very useful functions. Iverson was, and still is, right about terse notation. However, APL/J/BQN are not flexible enough in this regard. You can't introduce new symbols. For example, you CAN do “et = +”, but you CAN’T do “• = *”. User-level definition of custom operators would enable the APL family to break out of the diamond world they live in.
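
For what it's worth, Ruby (used earlier in the thread) has the same restriction, so here is a rough sketch of the CAN/CAN'T distinction there: an existing operator can be aliased to a name, but a new glyph can never become an infix operator, because the set of overloadable operators is fixed.

  class Integer
    alias_method :et, :+     # "et = +" works: a named alias for an operator
  end
  5.et 5                     # => 10
  # 5 • 5                    # "• = *" does not: • can never be parsed as infix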


I don't think user-level definitions of custom symbols are that useful, but for what it's worth, you can do it in ngn/k...

    (Π):!/:  / make table
    
    Π[`a   `b
     ( 1    2
       3    4)]


TIL that ngn/apl and ngn/k support user-defined symbols. Thank you!

Creating custom notation (aka a DSL) can be very liberating. As Iverson noted, having a good notation enables you to think previously unthinkable thoughts. Einstein’s theory of relativity was enabled by the use of his summation notation. [0]
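
For reference, the convention in question is simply that a repeated index implies a sum over it, so the sigma can be dropped; e.g. for a matrix product:

  % Einstein summation convention: the repeated index j implies a sum over it.
  c_{ik} = \sum_{j} a_{ij}\, b_{jk}
  \quad\text{is written}\quad
  c_{ik} = a_{ij}\, b_{jk}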

Of course, it should be paired with the right user interface. Inputting non-ASCII chars on keyboards is still pretty cumbersome. But APL with its weird symbols is perfect for paper and pencil, where it has its origins. [1]

Some might see a pencil-and-paper interface for an array-oriented REPL as going backwards, but I consider the pencil to be mightier than the keyboard. Finally you could sketch your algorithm visually beside the runnable code.

[0]: https://en.m.wikipedia.org/wiki/Einstein_notation

[1]: https://mlajtos.mu/posts/new-kind-of-paper


Much of the progress in mathematics over the past 500 years can be attributed to the development of a good, concise notation.


Yet there is at least one famous person in CS and EE (and physics?) who says that mathematical notation is not all that great compared to how precise computer languages are or can be. I remember often struggling to decide the order of operations in mathematical notation when a lecturer or teacher used only the minimal number of parentheses.

Mathematical notation is very concise, but I would not call it great in terms of making it intuitively clear which operations to apply in which order, or how a symbol groups its argument(s). I often found myself writing an extra pair of parentheses just to make things clearer for myself.

There is also the mix of operators and functions. Some computer languages avoid that mix and employ only functions, which makes things clearer as well.
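
A small Ruby sketch of that difference (illustrative only): infix operators lean on precedence rules you have to remember, while a functions-only style spells the grouping out in the call structure.

  # Infix form: you must know that * binds tighter than +.
  2 + 3 * 4           # => 14, not 20

  # Functions-only form: the nesting of the calls is the grouping.
  def add(a, b); a + b; end
  def mul(a, b); a * b; end
  add(2, mul(3, 4))   # => 14, no precedence rules involved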


IDK. For instance, I'm familiar with Structure and Interpretation of Classical Mechanics, and even though I have had my share of fun with it, the (functional) presentation there still looks terrible, to the point of becoming completely opaque in places, compared to the traditional mathematical notation of the Lagrangian etc. I have also seen books on calculus that focus on function composition (resembling the style of category theory) rather than on sets, and I felt that this was completely unnecessary, as it only served to complicate both the exposition and the understanding of a subject that is already difficult enough.


I'm not the person you replied to, but symbols are easier to distinguish visually, though they need careful balance: too many are worse than too few.


> ⁙⁙ or ⬠⬠ is also damn efficient at representing five plus five

Ok, now do a million plus a million.


MM? Using Roman numerals... Seriously, I do value good notation, but in this particular case, juxtaposition, we already use it for multiplication - in algebra, where it's easy to find the boundary of a number, and variables are always one symbol long, so 'xyz' is a product of three values, not the name of one.

Coming back - notation is important, but sometimes it's hard to find "the best" or even "good enough". Lagrange's primes as symbols of differentiation are used alongside Leibniz's dy/dx, and there is Newton's shortcut for "differentiated with respect to time", a dot on top of the function symbol; for various cases a different notation is more convenient.
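
For reference, the same derivative in the three notations mentioned (Newton's dot being conventionally reserved for derivatives with respect to time):

  % Lagrange's prime, Leibniz's ratio, Newton's dot:
  f'(t) \;=\; \frac{df}{dt} \;=\; \dot{f}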


MM in Roman numerals is 2,000. There are some Unicode symbols representing one million, but I dunno if they're in the range HN accepts.


oo


Now do ten million plus ten million


I guess that the "Combining Cyrillic Ten Millions Sign" could do the job: ꙰ ꙰

Though your point, it seems, is more about using only a small set of symbols. Obviously, to render large numbers you need to introduce new syntactic apparatus if terseness is a desired constraint.

Just like you would write 1×10⁷ or 1e7, you can have, say, a single point on one line and a heptagon or seven points on another, still juxtaposing these two columned digits. In Unicode, it seems these symbols are not present, but you can use cards from different games instead.

  🀟 🀟
  · ·
How fun! :D



