Building a new functional programming language (onebigfluke.com)
137 points by signa11 on March 5, 2023 | hide | past | favorite | 234 comments


> Why is the gap between these two perspectives so large? My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

Now that (nearly) every mainstream language has lambda expressions, there are very few patterns in functional programming that cannot be replicated in mainstream languages, which means that moving to a pure functional language now provides strictly fewer choices when problem solving. In order for a programmer to accept a language with strictly less expressive power, those new restrictions must solve problems that the programmer actually cares about.

Rust reached adoption because mainstream systems programmers recognized that memory errors are a huge problem and dealing with the borrow checker is worth it to solve them. Static typing reached mainstream adoption because it made coordinating work in a large team dramatically easier and solved refactoring.

Functional-style programming has hit the mainstream—embedded within imperative and OO code—but purely functional languages have yet to make a good case for their additional restrictions, and I'm not convinced that multiprocessing is that case. Golang mainstreamed a very effective model for multiprocessing in an imperative environment that solves most of the problems without asking programmers to completely change the way they write code, and that model is increasingly being adopted in other languages (Java's project loom, for example).


> Now that (nearly) every mainstream language has lambda expressions, there are very few patterns in functional programming that cannot be replicated in mainstream languages

I don't think this is true. Algebraic data types and exhaustive pattern matching, which implies a type system, are the things I use most in my adventures in functional programming. Mainstream languages (for some definition of mainstream) don't yet have these features.

The original post doesn't mention pure FP. I don't think purity is that interesting when you can have a type system that can reason about mutation (i.e. linear or affine types, similar to Rust.)


It's also a matter of performance. Functional non-mutating patterns in JavaScript and Python can be substantially, sometimes catastrophically, less performant than the procedural/imperative mutating equivalents. All functional programming languages that I know of which have decent performance rely on an optimizing compiler to some extent, and/or core data structures that have been optimized for immutability.
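One concrete way this shows up in Python: building a list "immutably" by concatenation copies the accumulator on every step, turning a linear loop into quadratic work, while the mutating `append` stays linear (function names are made up for illustration):

```python
from functools import reduce

def squares_functional(n):
    # Non-mutating: acc + [x * x] allocates a fresh list each step -> O(n^2).
    return reduce(lambda acc, x: acc + [x * x], range(n), [])

def squares_imperative(n):
    # Mutating: append is amortized O(1) -> O(n) overall.
    out = []
    for x in range(n):
        out.append(x * x)
    return out

assert squares_functional(1000) == squares_imperative(1000)
```

Languages like Clojure and Erlang avoid this trap with persistent data structures that share structure instead of copying, which is exactly the "core data structures optimized for immutability" point above.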


> Functional non-mutating patterns in JavaScript and Python can be substantially, sometimes catastrophically, less performant than the procedural/imperative mutating equivalents

What portion of this is due to the fact that you're using a functional pattern on an imperative language? For example, none of those attempts can take advantage of any underlying purely functional data structures (see: Erlang/OTP/Elixir, or Clojure)


Case in point: Tail Call Optimization has been part of the JS spec since ES6, but remains completely unimplemented in all mainstream browsers/engines besides Safari[1]. For all but the most predictable inputs, you're pretty much forced to use loops due to potential overflows in situations where recursion might otherwise be preferable.

More fuel for the pyre: async iterables cannot be consumed as a piped stream. You must use the for await construct, which is a shame considering the FP niceties that the Array type already provides for more traditional lists. Once again, one is forced by the API to use an imperative construct unless one specifically wants to invite trouble by converting the iterable into an Array (... and potentially choking in the process).

[1]: https://kangax.github.io/compat-table/es6/
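Python is in the same boat as most JS engines here (CPython performs no tail-call elimination either), which makes the trade-off easy to demonstrate; the recursion-depth multiplier below is arbitrary:

```python
import sys

def count_rec(n, acc=0):
    # Genuinely tail-recursive in shape, but CPython performs no
    # tail-call elimination, so every level still consumes a stack frame.
    return acc if n == 0 else count_rec(n - 1, acc + 1)

def count_loop(n):
    total = 0
    while n > 0:
        total, n = total + 1, n - 1
    return total

big = 10 * sys.getrecursionlimit()  # comfortably past the stack limit

overflowed = False
try:
    count_rec(big)
except RecursionError:
    overflowed = True

assert overflowed              # the recursive version blows the stack...
assert count_loop(big) == big  # ...while the loop handles it trivially
```

With proper tail calls, `count_rec` would run in constant stack space; without them, the loop is the only safe option for unpredictable input sizes.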


In practice I feel like recursion isn't actually used that much in functional languages. Most things can be expressed as maps, flatmaps, folds, etc


It depends on the domain. For example, recursion can be extremely useful when processing a tree of data.
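A small Python sketch of the tree case, where the recursion mirrors the shape of the data far more directly than an explicit stack would (the tuple encoding of nodes is made up for illustration):

```python
def tree_sum(node):
    # A node is (value, [children]); the recursion follows the
    # structure of the data itself.
    value, children = node
    return value + sum(tree_sum(child) for child in children)

tree = (1, [(2, []), (3, [(4, []), (5, [])])])
print(tree_sum(tree))  # 15
```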


Python bogs down where there is excess use of the dynamic aspects, e.g., a large list comprehension.

Go with a generator, and watch that list be more kind to the resources.


> Python bogs down where there is excess use of the dynamic aspects, e.g., a large list comprehension. Go with a generator, and watch that list be more kind to the resources.

This is a micro-optimization and it's not always true. Generators usually use less memory, but list comprehensions can actually be faster because the memory gets allocated all at once and the loop is performed internally inside the interpreter, instead of the generator possibly being consumed as a slow top-level loop.

I was talking about examples like making a copy of a list/dict/object instead of replacing elements in-place, or using a stack of map/filter operations instead of a single-pass for-loop with dynamic programming.
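The memory half of that trade-off is easy to see in Python; the speed half depends on the workload, so no timing claims here:

```python
import sys

n = 1_000_000
as_list = [x * x for x in range(n)]   # all n results materialized up front
as_gen  = (x * x for x in range(n))   # a small stateful object; work deferred

# The generator's footprint is roughly constant regardless of n;
# the list's grows linearly with n.
assert sys.getsizeof(as_gen) < 1_000 < sys.getsizeof(as_list)

# In exchange, the list can be indexed and re-iterated,
# while the generator is one-shot.
assert as_list[10] == 100
assert next(as_gen) == 0
```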


Fair point, especially in the case of network-bound or disk-IO intensive work.


The thing that really irks me is that the generator pattern doesn't have to be an OO-first feature. Observable streams[1] work with the same basic foundation and those are awesome for FP. It's really frustrating that standard libraries are so eager to adopt generators without taking that last step into supporting functional stream consumption.

[1]: https://reactivex.io/


Python supports generator comprehensions too!
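For instance: same comprehension syntax, parentheses instead of brackets, and nothing is evaluated until consumed:

```python
# range(10**12) is never materialized; no work happens at this line.
lazy = (x * x for x in range(10**12))

# Values are produced only on demand.
first_three = [next(lazy) for _ in range(3)]
assert first_three == [0, 1, 4]
```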


Kotlin and Java now have sealed classes/interfaces, which can provide exhaustive pattern matching and as far as I can tell can be used to model any ADT.

EDIT: In fact, both languages are now listed as having ADTs on the Wikipedia page: https://en.m.wikipedia.org/wiki/Algebraic_data_type

Also, TypeScript's union types can express ADTs, as can Rust's enums. I don't think it will be very long before the rest of the mainstream languages catch up.


Agreed, but that's why I hedged with "some definition of mainstream". I don't think Kotlin or Rust are mainstream yet, most Java places are still on Java 8 or 11, and ADTs in Typescript are a PITA.


ADT's in TypeScript aren't as nice as in a language that supports them natively in many respects, but "PITA" is really overselling it. We use them pervasively at my shop. In fact, the ability to synthesize them without naming them (because TypeScript is structurally typed rather than nominally typed) sometimes makes them even more convenient than in most FP languages (OCaml excepted) - the ability to, say, return an ad-hoc ADT from a lambda without giving it some meaningless name like "HelperFunctionResult" can occasionally be really handy.


I wouldn’t be surprised if there were more people on Java 19 than on Haskell, for what it’s worth.


ADT's and exhaustive pattern matching are in Rust. The way you wrote your comment makes it clear that you know that, but might as well make it explicit.

IMO, people come to Rust because of memory safety, but they stay because of ADT's and similar features.


Algebraic data types and exhaustive pattern matching are a core part of Rust.


Rust, but practically no other mainstream or OO/imperative style language


Swift, Kotlin, Java


There is no proper pattern matching / destructuring in either Java or Kotlin.

There is a discussion about it in Kotlin, see https://youtrack.jetbrains.com/issue/KT-186/Support-pattern-...


Ada might be getting pattern matching soon too:

https://github.com/AdaCore/ada-spark-rfcs/blob/master/protot...


also Scala, which I'd take over Java any day


Rust has ADTs and it has exhaustive pattern matching AND it's not functional. It's arguably mainstream now.

Haskell doesn't have exhaustive pattern matching; it just gives you a warning.


> Haskell doesn't have exhaustive pattern matching; it just gives you a warning.

It can have exhaustive pattern matching if you use `-Werror`.

That's frequently the default in an industrial Haskell setting.

I wish that non-exhaustive patterns would be an error by default though.
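For what it's worth, recent GHC versions (8.4+, if I recall correctly) let you promote just this one warning to an error, rather than reaching for a blanket `-Werror`:

```haskell
{-# OPTIONS_GHC -Wincomplete-patterns -Werror=incomplete-patterns #-}
```

The same pair of flags can of course go on the `ghc` command line or in the build configuration instead of a per-module pragma.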


> In order for a programmer to accept a language with strictly less expressive power

Sort of hard to reconcile this view with just how much more expressive the Haskells and Lisps of the world are compared to Go and Python! I've used all of these extensively and functional languages have consistently let me define more general and flexible abstractions, do a better job of reflecting my conceptual model of the domain in my code... all in substantially less code and less boilerplate (excessive imports aside :P) than primarily imperative languages.

Functional languages don't stop you from writing imperative code, they allow you to write code that isn't—and, either way, that's just one small aspect of what makes a language adaptable and flexible.


> Functional languages don't stop you from writing imperative code, they allow you to write code that isn't

Functional languages are definitely about removing things or stopping things. Nothing stops you from naming a new variable every time in C, but Erlang stops me from mutating variables. Only stack depth/lack of tail call optimization stops me from using recursion to loop in C, but Erlang can only loop through recursion.

And Erlang is pretty pragmatic; some other functional languages are a lot more opinionated.


LISP isn't a pure functional language, it's a multi-paradigm language that is as dynamic as they come. Its problem isn't lack of expressivity, it's probably that it's too flexible (and of course the syntax throws people off).

As for Haskell, I guess that gets into the question of how you define expressivity, which is admittedly a very fuzzy term. Here I meant it to mean capable of fluently expressing many different kinds of algorithms without having to resort to circumlocution.

Functional programming is natively supported in almost all modern languages. You call a function which calls a function which calls a function which calls a function, return the result. There's no syntactic sugar, just function calls.

In Haskell, while I've seen people argue that do notation may as well be imperative code, it isn't. It's an approximation of imperative code built on Church-Turing equivalence and a liberal dose of syntactic sugar, and using it effectively still requires you to understand the underlying monadic model.


>LISP isn't a pure functional language

In the modern, history-rewritten sense of the term that equates FP with Haskell and purity.

In the past LISP was the 100% canonical idea of what a functional language is, including on HN before 2010 or so.


Lisp was never "pure", but yes, there's definitely been a shift in the meaning of the term "functional language". Which makes sense: the things that made Lisp more "functional" or better suited to the functional style have been almost universally adopted by other languages. There's nothing particularly functional about it anymore.

Lisp has always been deeply multiparadigm, even if it was the harbinger of FP. It's had support for mutation and OOP since the 70s. If you wanted to (although most Lispers wouldn't), you could view that support as an escape hatch for when the functional paradigm, as expressed in Lisp, was inappropriate or insufficient to solve a problem. It's natural, then, that languages that have taken the functional paradigm and insisted on it, run further with it, are now the ones that are particularly functional. It's how language naturally evolves in cases like this; similar things can be seen with the term "artificial intelligence".


You might be confusing Lisp with ML.


Nope. ML, albeit ancient, was not even on the map on HN until after Haskell picked up mindshare. I mean you could find a post here and a post there, but nothing like Lisp's posts and HN-mindshare.

And, sure, HN was from the start a Lisp-built forum, made by a Lisp-advocate, and thus attracting Lispers, but it wasn't that different on the wider internet.


But who said what on Hacker News, and in what years, bears no relevance to what a purely functional language is. Lisp has had mutable everything from the very beginning, and that has been maintained in the descendant dialects.

We literally cannot go back to a time in which Lisp was the model of what is purely functional; though it may have been used as an instrument (in restricted ways) for conveying or practically making use of the ideas.

There is something called "pure Lisp". Even such backwaters as the Webster Dictionary of Computing seem to know what this is:

http://www.webster-dictionary.org/definition/Pure+Lisp

"A purely functional language derived from Lisp by excluding any feature which causes side-effects."

I think you will rarely see an entire program written in nothing but pure Lisp. Functions or modules can be designated as being in pure Lisp. If the documentation says that a certain module is pure Lisp, that instantly sets certain expectations about what it will be like to interact with.


If you truly think "functional programming is natively supported in almost all modern languages" then you really don't know what modern FP is about.

For instance implementations of monadic control flow, effect systems, higher kinded types, algebraic data types, exhaustive pattern matching etc. are all laughably unergonomic or entirely missing in most "popular"/"enterprise" languages. Just having a lambda doesn't FP make.

To me, FP in modern functional languages provides abstractions that enable much more straightforward problem solving and understandable code with fearless refactoring. Haskell and OCaml are the pragmatic leaders here I think, with honorable mentions for things like Scala 3, F#, and (kind of?) Clojure.

(edit spelling)


I've spent more time studying programming languages than anyone else I know, and I understand FP very well. The argument that I'm making is not that FP has nothing to offer (I dearly hope that many more features from FP languages make it into the mainstream!), it's that FP comes with a set of restrictions on what can and cannot be done, and those restrictions do not seem justified to the majority of programmers. It is as uncomfortable for them to work with Haskell as it is to work with Rust, but unlike with Rust they can't see why they should bother.

The author of the article hopes to build a new functional programming language that will finally get traction by making a few different choices. I think that this is a mistake because a single-paradigm language will never reach mainstream adoption. Pragmatically multi-paradigm languages have historically been the only ones to get traction, and I believe the reason is that they explicitly endorse multiple possible approaches, even if their support for any given approach isn't as strong as a specialized tool's. They may not support FP as well as Haskell or OO as well as Smalltalk, but by sacrificing purity they buy pragmatism, and allow programmers more flexibility in problem solving.


> They may not support FP as well as Haskell or OO as well as Smalltalk…

The Smalltalk Collection enumeration protocol was probably the first time I'd seen anything like

    (1 to: 100) select: [ :n | n isPrime ]

https://cuis-smalltalk.github.io/TheCuisBook/Fun-with-collec...

(Now I recall, I had been required to write some Tower of Hanoi snippet in either Lisp or Prolog, and I'd used Prolog.)


> To me, FP in modern functional languages provides abstractions that enable much more straightforward problem solving and understandable code with fearless refactoring.

Maybe, but they remove other abstractions. The lack of Haskell's popularity is a clear indication that the abstractions added are not as valuable to programmers as the abstractions removed.


> The lack of Haskell's popularity is a clear indication that the abstractions added are not as valuable to programmers as the abstractions removed.

No it's not. At most it's an indication that, among programmers familiar with Haskell's abstractions, those abstractions weren't compelling enough to make them explore the language further.

For it to be a clear indication those programmers would actually have to learn to effectively use Haskell's abstractions and compare it to a version in their former language that doesn't utilize the abstractions.

Almost no one does this, however... I can recall one instance that comes close:

http://roscidus.com/blog/blog/2013/06/09/choosing-a-python-r...

However, that's not a comparison of abstractions.


> For it to be a clear indication those programmers would actually have to learn to effectively use Haskell's abstractions and compare it to a version in their former language that doesn't utilize the abstractions.

No one tries everything, there's just no time to try all the possible alternatives.

Lack of popularity may not mean that the unpopular thing is bad, but it ain't an indication that further study would reveal it to be good.

In this particular case, Haskell isn't selling painkillers, it's selling vitamins. Few who have tried it go on to adopt it - it clearly isn't solving any immediate problem.


> For instance implementations of monadic control flow, effect systems, higher kinded types, algebraic data types, exhaustive pattern matching etc. are all laughably unergonomic or entirely missing in most "popular"/"enterprise" languages.

You don't need all that garbage in most languages because they allow mutation. In Haskell you can't have a class instance with a few variables so you end up with all these endofunctors, monomorphisms, and applicatives. And I would argue that, say, implementing Chaitin's algorithm, which involves filling and emptying a stack and destructuring a graph is not at all straightforward in a functional language.


This seems like an easy fix: re-release LISP with cleaner syntax. Lexers and parsers have come a long way, so this should be tractable.

I can't say I've ever read the opinion that LISP's syntax allows more expressive programming; rather that LISP itself allows more expressive programming, with the syntax as the tradeoff.


IMO the thing that makes lisp so powerful is the ability to extend the syntax, and this is only possible because the syntax is represented as a list.

When your syntax can be manipulated like data (in this case lists), macros become regular functions operating on regular data, the data just happens to be syntax in this case. I can take a struct definition for example and use regular, built-in list manipulation functions to walk the definition and transform it into whatever I want.

In something like Rust we can also run macros on struct definitions, but in that case you operate on a token stream, and need specialized libraries and functions to do meaningful transformations. It ends up being much more difficult but still pretty powerful.

Super-powered macros are what makes lisp "lispy" to me, and the "abnormal" syntax is required for that, so I do think the syntax is what makes it uniquely expressive.


Adding different syntax to Lisp has been tried many times. No attempt has ever succeeded. Lisp users basically rejected all those systems, and they didn't gain any traction in bringing new users into Lisp.

The first attempt was Lisp 2. This doesn't refer to Lisp-2 (two namespaces for variables and functions) but to a follow-up to Lisp 1.5, which was tried in the 1960's. Lisp 2, IIRC, featured an Algol-like syntax, with types, for speeding up numeric processing. It went nowhere. In the early 1970's, Vaughan Pratt (of "Pratt parser" fame, I think) created CGOL. That also went nowhere. After that we can identify numerous other things in that vein.

Here we are today and most people using Lisp still have it (looking (like) this).


>I can't say i've ever read the opinion that LISP's syntax allows more expressive programming; rather LISP itself allows more expressive programming at the tradeoff of the syntax.

The conventional idea from LISPers is the other way around: LISP's syntax allows for more expressive programming, because code and data share the same representation (so, macros and rewriting, etc.), and any function you make looks (and is, and behaves) as "first class" as any native one.

LISP's syntax is also said to be a better and more concise way than JSON (and of course XML) to describe arbitrary hierarchical structures.


> re-release LISP with cleaner syntax.

A lot of the flexibility of lisp relies on there being no difference between a list of symbols and code (for, example, this is how the entire macro system works).

Function calls being inside the parentheses (and, really, all language constructs just being lists of symbols) is generally what people mean when they say the syntax throws people.

These two things are at odds.

You could go with Racket's method of letting people use alternate delimiters, but that's a fairly trivial reader macro in common lisp and probably not worth releasing a language over.


The macro system can just work with an AST instead of a list.
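Python's `ast` module gives a feel for what AST-based (rather than list-based) transformation looks like; note how much machinery a toy "macro" needs compared with walking a plain list:

```python
import ast

class AddToMult(ast.NodeTransformer):
    # Rewrite every `a + b` into `a * b` -- a toy macro-style transform.
    def visit_BinOp(self, node):
        self.generic_visit(node)  # transform children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

tree = ast.parse("2 + 3", mode="eval")
tree = ast.fix_missing_locations(AddToMult().visit(tree))
result = eval(compile(tree, "<transformed>", "eval"))
assert result == 6  # 2 + 3 became 2 * 3
```

In Lisp the equivalent is ordinary list surgery on `(+ 2 3)`; here we need a visitor class, node types, and location fixups, which is roughly the "gunk" the parent comments are debating.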


Plenty of successors to lisp have tried that. They gunk up the whole experience.


Let me know if you'd like to give it a shot to see how gunky it is!


I mean, I'd be down to try.

Best I've seen so far (outside of lisp) is elixir, but it really bogs down the entire thing by requiring a bunch of extra information and metadata that common lisp can infer by position and an extra layer of indirection.


Contact form here: https://quil.la/FVZ3PMZC Thanks!


There is no cleaner syntax than Lisp's. It is essential to Lisp's expressivity.


O-expressions seem pretty good.


Imperative languages allow you to write functional code, too, but it's clumsier. It's not the strength of the language. In the same way, sure, you can write imperative code in a functional language. But ergonomically, it's not as nice as writing imperative code in an imperative language.

If your code is going to be 90% imperative, don't write it in Haskell. If it's going to be 90% functional, don't write it in C++.


Any programming language represents limitations placed on the underlying expressiveness of the hardware. By this argument raw x86 asm is "more expressive" than any higher level language.

The whole point of a programming language is to provide you with common patterns/designs to simplify reasoning about your program. Even base concepts like a "function" or a "struct" are really limitations on the underlying expressive power.

Immutability/purity does solve real problems for programmers: it means that you only have to reason about mutability at the margins of your application. I think people who don't have experience with functional style don't realise how helpful this is. I choose to use functional languages because it makes me a better developer - I am able to write code that I am not diligent/smart/disciplined enough to write in an imperative style.


That's not really how I - and I think most engineers - understand the term "expressiveness". In Haskell I can express "a side-effect free function that consumes a String and produces an Int", for example; it's impossible to express this in x86 assembly language. You can certainly write a block of code that has the same operational qualities, but the code itself doesn't in any way express any of that denotational information - it's all just implied or inferred, not something encoded in the language itself; you can't even express the idea of a "function", in fact!


Right, what parent called "expressiveness" is probably more accurately described as "flexibility". And flexibility is not always a good thing.


I agree completely about immutability and purity—for my own code I strongly lean that way and try to outsource as much of my thinking to the compiler as possible.

However, there are times where impurity and mutability are beneficial—whether within a single function or at the margins. Functional programming languages do not tend to have good support for these cases, because their whole thing is representing everything in a functional style.

I think this is why functional languages don't tend to get traction while the multi-paradigm languages do. Given a choice between purity and pragmatism, most engineers will pick pragmatism. The alternative is to intentionally remove tools from your toolbox, and unless those tools are demonstrably more dangerous than they are helpful that's a tough sell.

My ideal language is a multi paradigm language which has robust, statically analyzable support for objects, imperative code, and functional code.


> I think this is why functional languages don't tend to get traction while the multi-paradigm languages do. Given a choice between purity and pragmatism, most engineers will pick pragmatism.

Well, it's obvious to the working corporate-employed programmer, not so obvious to the ivory-tower theorist.

The short argument is that the entire point of a program is to take inputs and produce outputs. Making the consumption of inputs and the generation of outputs a second-class citizen in the language with the goal of "purity" just makes it that much harder for a program to perform its primary task.


> Making the consumption of inputs and the generation of outputs a second-class citizen in the language with the goal of "purity" just makes it that much harder for a program to perform its primary task.

You have this backwards. I/O is second class in every language except Haskell and languages with type and effect systems. Those are the only languages where it's a first-class citizen, which is to say that I/O computations are values that can be passed around, interpreted and otherwise recognized or operated on in various ways.

The problem is that people are used to the second class status and don't understand the additional power first class status gives you, or the patterns needed to make it extensible and maintainable.


I mean, you can write mutable code in Haskell, too. Parts of it can even look quite imperative. I think there are other cultural problems with the language, though.


>it means that you only have to reason about mutability at the margins of your application

Functional programming threads state throughout the execution of the program; instead of just putting the state into variables, you now have to track what data gets passed where.


Functional programming boils down to 2 important ideas.

1) Referential Transparency - Basically, you can take an expression in your program and replace it with the result of that expression, and your program will run the same, every time.

2) Immutability - the idea that data cannot be deleted or changed.

Having a language/runtime that truly embodies these two principles completely eliminates entire classes of problems in regards to parallelism and concurrency.
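A minimal Python illustration of the first idea (Python does not enforce purity, so the pure/impure split below is purely by convention; the function names are made up):

```python
def pure_double(x):
    # Referentially transparent: any call site can be replaced by its value.
    return x * 2

counter = 0

def impure_double(x):
    # Not referentially transparent: the result depends on hidden state.
    global counter
    counter += 1
    return x * 2 + counter

# Replacing pure_double(3) with 6 never changes program behavior...
assert pure_double(3) == pure_double(3) == 6

# ...but two identical impure calls already disagree with each other.
assert impure_double(3) != impure_double(3)
```

A runtime that guarantees this property for every expression can freely reorder, cache, or parallelize evaluations, which is the "entire classes of problems eliminated" claim above.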

Some languages and their implementations are very ergonomic for programming in an FP style, some are not. (I'm looking at every imperative lang that has lambda semantics and now says it's FP.)

I think people get hemmed up talking about what FP is. There's nothing in there, though, about Category Theory, Algebraic Data Types, Static or Dynamic Typing Disciplines, etc.


> I think people get hemmed up talking about what FP is

You've defined purely functional programming, which seems to be the increasingly common meaning of FP. The original definitions only explicitly refer to constructing programs by combining functions:

A History of Lisp http://jmc.stanford.edu/articles/lisp/lisp.pdf

> LISP is characterized by [...] composition of functions as a tool for forming more complex functions.

Can programming be liberated from the von Neumann style? https://dl.acm.org/doi/pdf/10.1145/359576.359579

> informal description of a class of simple applicative programming systems called functional programming (FP) systems, in which "programs" are simply functions without variables.

However, Backus defines his functional programming system mathematically, which could reasonably be construed as referentially transparent.


https://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf The first mention of "functional programming" is in Landin's paper "The Next 700 Programming Languages" (1966).

It's heavily implied that FP is programming with things that resemble mathematical objects and have the above properties, not merely just programming with "functions" so to speak.

https://youtu.be/1_Eg8KYq2iQ?t=1230 <- Great talk on what FP programming style is.


> Now that (nearly) every mainstream language has lambda expressions, there are very few patterns in functional programming that cannot be replicated in mainstream languages, which means that moving to a pure functional language now provides strictly fewer choices when problem solving

Those patterns in functional programming compose cleanly. They do not compose cleanly once you introduce imperative constructs, which is what you get if all you've done is import lambdas into your OO language, so you've presented a bit of a false choice. All you have is some surface semantics of FP but none of the deep reuse and top-to-bottom clean, understandable semantics.


> strictly less expressive power

We have a difference in what we consider expressive. How, in this more expressive language, would you represent pure functions or deeply immutable data structures if it only offered flexible/mutable ones? More flexible does not mean more expressive.

If by expressive you mean conciseness, "Comparing the Same Project in Rust, Haskell, C++, Python, Scala and OCaml"[0] is of interest.

[0] https://news.ycombinator.com/item?id=20192645


> How, in this more expressive language, would you represent pure functions or deeply immutable data structures if it only offered flexible/mutable ones

With a function that doesn't cause any mutations and a set of classes/interfaces that offer no mutator methods.

My point is that you don't need to be able to have the compiler enforce constraints in order to express an algorithm—you just write the code. Having the compiler enforce constraints is often helpful but comes with decreased ability to express other types of algorithms that rely on breaking those constraints, so the language as a whole is less expressive. In order for a programmer to accept those limitations there need to be clear advantages, and the advantages that functional programming proponents have put forward so far haven't been persuasive to most developers.

By the definition I'm working with, the most expressive languages are the ones that offer the fewest constraints to what the code can do—LISP would be a top contender. Expressivity isn't the highest virtue, but it is valuable and programmers don't give it up unless they can be persuaded it will be worth it.


> you don't need to be able to have the compiler enforce constraints

Expressing such constraints and having the language/compiler enforce them is part of what I want from an expressive language. That way once I've expressed them, I don't have to ensure those things repeatedly for the entire lifecycle of the program. We each have our definitions and values.


I should be clear that I personally agree with you 100%—I prefer to outsource as much as I can to the compiler. It's just that I also recognize that that means being unable to express some things that are otherwise possible, and that most developers don't like that trade-off.


I don't know what you mean. C++ has "const", but you can still write a method that doesn't mutate anything without using the keyword. Seems like a "No True Scotsman" definition of expressiveness.


I feel like this doesn't make sense because mainstream functional languages all provide a way to express imperative code.

In Haskell you can write an entire application using IO/do blocks and it will be completely imperative. In F# you can declare every variable mutable.

Functional languages don't prevent you from writing imperative code. They give you the opportunity to choose not to, and enforce it at compile time. The opposite is not true of imperative languages, there the type system cannot help you to enforce that some code is purely functional.


https://en.wikipedia.org/wiki/Rule_of_least_power is why we don't do everything with "if/goto". Being able to write "switch" or "while" or "throw" names the behavior intended (and especially what's not intended) more clearly than one construct that could do anything and appears everywhere.


Here is a completely pure function in any imperative language:

  int add(int a, int b) {
    return a + b;
  }


But the purity is implicit. Nothing will stop you from refactoring add into an impure function. As a programmer you have to maintain this mentally by remembering which types are pass by value and which are pass by reference.
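To make the hazard concrete, here is a sketch in Rust (the same point holds in C, Java, etc.): the signature stays exactly the same while the body quietly becomes impure, and no caller can tell from the type.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// A global call counter -- completely invisible in the function's signature.
static CALLS: AtomicUsize = AtomicUsize::new(0);

// Same signature as the "pure" add, but now impure: every call
// mutates global state, and no caller can tell from the type.
fn add(a: i32, b: i32) -> i32 {
    CALLS.fetch_add(1, Ordering::Relaxed);
    a + b
}

fn main() {
    assert_eq!(add(2, 3), 5);
    assert_eq!(add(2, 3), 5); // same result, but the hidden state changed
    println!("add was called {} times", CALLS.load(Ordering::Relaxed));
}
```

In Haskell, by contrast, that side effect would force the type to change from `Int -> Int -> Int` to something involving `IO`, so the refactoring could not happen silently.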


Yes, this is a great example of a pillar of functional programming, pure functions. What is concerning is that your code example, like other pure function examples out there, implements a math operation a first grader can perform.

It's exceedingly difficult to find a pure functional example that operates on tangible domain-specific data of objects full of state.


It's not at all difficult (https://pandoc.org/). However, I'd hazard a guess that pandoc written in Rust or C (with idioms appropriate to those languages) would be much more performant.


I’m not so sure; several optimizations are only possible if we know that functions can’t/won’t have side effects, and while the compiler can see it after inlining, having it as part of the signature does help. Haskell is quite impressive on the performance front.


The fundamental difference between FP and imperative/OO is that in FP, equality is easy, state is hard. In imperative and OO, the opposite is true.

The restrictions that are in FP are there because it comes from the standpoint that equality should be easy.

In a parallel paradigm, state is hard, period. You look at how people use Spark, for example, and it's kind of funny that they are effectively doing FP using an imperative language.


Go solves no correctness problem regarding parallelism, it is as chock full of foot guns as any other multithreaded largely imperative language.


I would credit this model for multiprocessing to the LISP family (Scheme/Racket/Clojure) before Golang. (Probably someone else will mention an even earlier reference :)


Yep, there's a reason why I credited Golang with mainstreaming the concept, not inventing it. The model goes back quite a ways, but Golang brought it into the mainstream.


> provides strictly fewer choices when problem solving.

If you view programming as trying to avoid writing the "wrong" program from the state space of all programs, this is a good thing.

I guess it's a matter of priorities/viewpoints: whether your goal is to make writing the correct program easier, or make writing incorrect programs harder.


I'm a bit confused, because I don't see the commonality between Go and Project Loom other than lightweight threading. Go's whole concurrency model is still fundamentally different from Java as it went all in on CSP, and Project Loom isn't.

Beyond that, nothing about Project Loom fundamentally changes the design of all but the most performance sensitive concurrent programs in Java.

For most of us plain programmers that Go was targeted at anyways, a bucket of OS threads is just fine. Now they will be a bucket of virtual threads, which will be nicer, but not exactly game changing.


Go didn’t really go all in on CSP. It provides primitives for shared memory concurrency as well, in addition to message passing. And if you have messages that are not fully immutable, the producer and consumer have to use mutexes to avoid inconsistency. No different from Loom.
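As an aside, ownership transfer is one way around that mutex requirement. A minimal sketch in Rust (not Go, but the same message-passing idea): sending a message *moves* it into the channel, so producer and consumer can never touch the same mutable data at the same time, mutex or no mutex.

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();

    let producer = thread::spawn(move || {
        for i in 0..3 {
            // The Vec is moved into the channel: the producer can no longer
            // touch it, so no mutex is needed even though it's mutable data.
            tx.send(vec![i; 4]).unwrap();
        }
    });

    let mut total = 0;
    for mut msg in rx {
        msg.push(99); // the consumer now owns the data and may mutate it freely
        total += msg.len();
    }
    producer.join().unwrap();
    println!("received {} elements in total", total); // 3 messages * 5 = 15
}
```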

What it does better than Loom is that all system APIs are savvy about goroutine yielding.


> there are very few patterns in functional programming that cannot be replicated in mainstream languages

Well, off the top of my head, the only language that can do something like Phoenix Framework's "LiveView" is... Elixir (and by association, Erlang/OTP underneath), and... that's it, to my knowledge, because of these reasons: https://news.ycombinator.com/item?id=35026508


> He spoke of the potential for functional languages to provide a significant, intrinsic advantage when it comes to parallel computing.

> (...)

> If that were true, you'd expect that the many existing functional programming languages would have already satisfied this need. But in my opinion, they haven't

Well there is https://futhark-lang.org/ - it runs on the GPU, and is awesome.

On the CPU side, I think that Rust plus https://github.com/rayon-rs/rayon was a huge breakthrough on writing parallel programs using both functional and imperative programming, and future languages should learn from its successes. The ownership system & the borrow checker, plus other type-level features like the Send and Sync traits, were essential to enable sharing read-only data between threads without synchronization, or sharing read-write data with synchronization, all checked at compile time for data races (which is a huge problem to solve, and is something that neither Go nor Java protects against at compile time)
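A minimal sketch of what this looks like in practice, using only the Rust standard library (no rayon): read-only data is shared between threads with no synchronization at all, read-write data has to go through a Mutex, and forgetting either is a compile error rather than a data race.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Read-only data: shared across threads with no locking whatsoever.
    let data = Arc::new(vec![1u64, 2, 3, 4, 5, 6, 7, 8]);
    // Read-write data: the Mutex is the only way the threads below can
    // touch it -- skipping the lock would be rejected at compile time.
    let total = Arc::new(Mutex::new(0u64));

    let mut handles = Vec::new();
    for chunk in 0..2 {
        let data = Arc::clone(&data);
        let total = Arc::clone(&total);
        handles.push(thread::spawn(move || {
            // Each thread sums its own half of the shared read-only data.
            let part: u64 = data[chunk * 4..(chunk + 1) * 4].iter().sum();
            *total.lock().unwrap() += part;
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    println!("sum = {}", total.lock().unwrap()); // sum = 36
}
```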

Indeed Futhark shares a key feature with Rust: it uses uniqueness types to enable in-place updates, which is kind of like a limited form of Rust ownership: if you are the sole user of some memory, you can update it and other code will be none the wiser. This kind of thing is very important to build functional programs that are performant in practice.
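The "sole user" idea can be sketched in a few lines of Rust: a function with a functional-looking signature (value in, value out) that nevertheless updates in place, because ownership guarantees nobody else is watching.

```rust
// A functional-looking API: takes a vector, returns a "new" one.
// Because ownership guarantees the caller kept no other handle to it,
// the update can safely happen in place -- no copy is ever made.
fn scale(mut samples: Vec<f64>, factor: f64) -> Vec<f64> {
    for s in samples.iter_mut() {
        *s *= factor;
    }
    samples
}

fn main() {
    let v = vec![1.0, 2.0, 3.0];
    let addr_before = v.as_ptr();
    let v = scale(v, 10.0);
    assert_eq!(v, vec![10.0, 20.0, 30.0]);
    // Same allocation: the "new" vector reuses the old one's memory.
    assert_eq!(addr_before, v.as_ptr());
    println!("{:?}", v);
}
```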


See also the Higher-Order Virtual Machine for a functional, non-garbage-collected, and parallel runtime: https://github.com/HigherOrderCO/HVM


Hey, that's pretty cool! The lang that targets this runtime is https://github.com/HigherOrderCO/Kind


Fair point

It may be better to see Rust as sitting between the functional and imperative paradigms.


I've been programming for 35 years or more, and I've yet to come across situations where the hard problem was the language.

Sure, there are situations where I've been grateful for the ability to use Python (because everything is so easy), or Perl (because sometimes it's the right tool for the job) or Lisp (because its slippery fluidity just feels right) rather than the C/C++ that I've generally used during that time.

But the hard problems I've faced writing code would (almost?) never be made easier by using functional programming, or Rust, or Go or Swift or Brainfuck. The problems are hard because the problems are hard, and typically for me over the last 23+ years, it has been the combination of generically tough programming problems with performance requirements that ultimately make the choice of language mostly irrelevant.

Sure, someone could offer me a thread-safe sparse integer-to-integer mapping container in another language, but then I have to wonder about what design assumptions were made and how performant it is, and if the language was clearly created to prevent me from ever doing such things myself, then I'm going to be deeply suspicious from the outset that it could possibly have my (coding) interests at heart.

I understand that I'm not the typical programmer these days, and I don't work on entirely typical problems, but I can't help but feel that sometimes the focus on "languages to help programmers" comes from programmers who just don't enjoy their work enough and/or don't have enough to do.


> I've been programming for 35 years or more, and I've yet to come across situations where the hard problem was the language.

The hardest problem is usually something else, but IME some languages make it much easier to solve a given hard problem than others. Some languages are qualitatively safer. Some languages are much more expressive, in the sense that I can implement my solution with far less code/time/cost without compromising anywhere else.

There is a difference between whether I can solve a problem and how efficiently and pleasantly I can solve a problem. I find choice of language rarely affects the former but often profoundly affects the latter.


100%. But that's why I mentioned performance. I work with realtime audio, and this puts most of the "pleasantly solve the problem" options off the table.


It’s true that for genuinely performance-sensitive applications the options are more limited, but there are worlds of difference between assembly language, C and Rust, for example. There’s still a place for each of them, but other things being equal, Rust wins by a wide margin on both safety and expressiveness.


Might be true, but since the hard problems rarely involve safety or expressiveness, it's not clear what the benefit is.


I’m very curious about what kind of hard problems you have in mind where an implementation wouldn’t benefit from better safety and expressiveness. I’ve done my share of both mathematical and systems programming. In both cases I’d say eliminating types of bug that typically come with low-level/high-performance code was highly relevant. Also, quite often my biggest practical frustration was trying to iterate on a solution while language barriers made implementing each new attempt tedious in some way.


I gave 3 examples in one of my replies in this subthread: https://news.ycombinator.com/item?id=35036176


databases, haha. That's where modern programming throws most of the hard stuff nowadays.

in fact all of the hard parts of modern programming can be characterized by a single problem that all of these languages like Go and Rust are attempting to solve:

   Shared mutating state.
Really the parent and everyone here is just complaining about this one singular problem and how no language has completely solved it yet with a method that's easy and safe.

In fact all these new languages have a sort of similar primary purpose of attempting to abstract this problem away.


That is not the "single problem" I am complaining about.

We already have good methods for dealing with this. In fact, we have several, and often the hardest coding question is when to use which one (e.g. RCU versus mutexes).

There are plenty of other "hard" problems in modern programming that have nothing to do with shared mutating state (unless you broaden the definition of that so much that it becomes essentially meaningless).

Here's one example: resolving the tension between cache locality and time-varying data access patterns. Here's another: zero-copy data pathways between user space and hardware. Here's another: resolving the relative costs of fundamental GUI implementation choices (e.g. retained or immediate mode drawing models; pre-rendering components or using vector graphics; absolute positioning vs constraint packing)


> Shared mutating state.

> Really the parent and everyone here is just complaining about this one singular problem and how no language has completely solved it yet with a method that's easy and safe.

If my comment was the parent you’re referring to, that isn’t really my position at all.

Certainly there are challenges that come with shared mutable state, but we also have some good tools for dealing with those already. There is a more general problem of managing side effects and external resources, where I think there’s room for better language support, so that might be a better example.

The biggest single frustration I find with many popular programming languages is that once I’ve figured out how to solve the real world problem we’re dealing with, I want to implement that solution as fast as my mind can think and my fingers can type. I don’t want to get bogged down with wrangling general purpose data structures and algorithms, or wondering how to implement an idea with whatever networking or database or mathematical library we use, or writing a screenful of infrastructure plumbing to talk to a cloud service, or trying to figure out all the possible ways that a function I’m calling might tell me it failed and hand-crafting a few lines of boilerplate for every single one.

I haven’t found a language and ecosystem yet that I feel give me that, even though I’ve found many features in reasonably well-known languages and libraries and tools that individually make good progress in that direction.


I love to swing a hammer as much as the next programmer. But if you offer me a nail gun, even with slightly lower precision, I will happily use it the majority of the time and revert to my hand tools when it's most appropriate. This is about developing force multipliers and producing leverage, not avoiding the craft.


I would say that a better analogy is that you've already got an experienced crew, some swinging hammers, some lugging nail guns, and everyone's really quite good at what they do. But the plans from the architect are just really hard to build. You and the crew can do it, for sure, but the issues are not going to get better or worse if the guys use the table saw to crosscut some of the framing elements or someone gets one of those new German tool widgets. I mean, sure, the tools might make a small difference to the process, but the overall experience will be dominated by the fact that the thing is just hard to build.


Let's keep going with the building analogy. You're running a cabinet making company. You've got all kinds of hand tools and power tools. You build jigs to make certain repetitive tasks faster. Then a ridiculously difficult design comes in for you to build. You and your crew are flummoxed by its complexity. Suddenly, someone offers you a CNC machine that you've never used before. What seemed hard is now easy. The nature of solving problems with a CNC is different. Using a CNC presents other challenges. But you have entered a new realm of what's possible. Analogies are a lot of BS but hopefully that gets across the flavor of what I'm talking about.


It's a fine analogy. The only problem is that I don't recall ever coming across the equivalent of a CNC machine for the class of problems I face in my work.

Thread-safe lock-free sparse integer-to-integer map? No CNC for that.

Translating time between two domains, one of which is linear and monotonic, and the other is non-linear and non-monotonic. No CNC for that.

Generating and caching the right versions of different segments of audio waveforms at different zoom scales, in multiple threads? No CNC for that.

I could go on, but you get the point.

What tends to be more like a CNC machine are libraries. For example, realizing that you need some sort of reference-counting system for lifetime management, preferably combined with pointer-like behavior ... and then discovering boost::shared_ptr (later to be std::shared_ptr) ... now that's like getting a new CNC machine. But it doesn't require a new language (and realistically, it didn't even require the library - the library just made it possible to not implement it locally).

I think what I'm really trying to say is that I rarely come across problems where I think that the kind of help offered by the putative "new CNC machine aka new language" is anywhere nearly as substantive as the help offered by an actual CNC machine to a cabinet making company. Put differently, the new tool (language) still leaves the problem essentially as hard as it was before.

p.s. a good friend runs a high end wood shop, and I'm fairly aware of the impact their first CNC machine made to what they could do.


When the web became popular, there was the problem of writing really fast concurrent servers that could handle 10k connections without the overhead of 10k threads.

This problem is arguably harder than the examples you gave, and it was solved by language primitives that now exist in basically every popular language. These primitives, when used, basically change the nature of the language they are used in.

These primitives (async/await) are more than libraries. They intrinsically change the nature of your code. (Though technically they could be made into libraries for languages without async/await; it's just that the syntax would be extremely busy.)

This only occurred because the web was popular and the problem of servers and IO changed from a specific one to a general one. So when someone wants to create a new language, it's to attack a general problem.

Your issues in your example look to be somewhat domain specific, so new languages won't really help you in these specific areas you need to handle.

>Generating and caching the right versions of different segments of audio waveforms at different zoom scales, in multiple threads? No CNC for that.

I would say that for this example there are enough general issues that modern languages CAN help you with. For example, do you want to program in a language that can guarantee, with helpful static error messages, that your code will never have a data race or a seg fault or a buffer overflow or a dangling pointer?

Well, there's a language that can help you here. In the same vein, I've seen languages go even further than this and guarantee that the compiler will never ever let you write code that will make your program crash.

I think we can both agree that these general features that improve safety WILL make the issues you face easier.


Based on my experience (and I was doing web stuff starting in '92), this is a mistelling of the tale you're trying to tell.

The problem was called "the thundering herd": if you had N threads all sleeping/waiting on a condition, and then that condition was raised/signalled, there were no OS primitives available that would wake only a single thread. Instead they all woke up, tried to get whatever work was available, only one succeeded, and the rest went back to sleep. An incredible waste of cycles. These days, you can signal a condition in a way that will only wake a single thread that is waiting on it. Problem solved, for every language, without language modifications.
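For the curious, here is what the single-waker primitive looks like from Rust's standard library (a thin wrapper over the OS condition variable, so this is a library/OS facility, not a language feature): `notify_one` wakes exactly one waiter instead of the whole herd.

```rust
use std::sync::{Arc, Condvar, Mutex};
use std::thread;
use std::time::Duration;

fn main() {
    // A job counter guarded by a mutex, plus a condition variable.
    let shared = Arc::new((Mutex::new(0u32), Condvar::new()));

    let mut workers = Vec::new();
    for id in 0..3 {
        let shared = Arc::clone(&shared);
        workers.push(thread::spawn(move || {
            let (lock, cvar) = &*shared;
            let mut jobs = lock.lock().unwrap();
            // Sleep until there is work; wait() releases the lock while asleep.
            while *jobs == 0 {
                jobs = cvar.wait(jobs).unwrap();
            }
            *jobs -= 1;
            println!("worker {} took a job", id);
        }));
    }

    thread::sleep(Duration::from_millis(50)); // let the workers block
    let (lock, cvar) = &*shared;
    for _ in 0..3 {
        *lock.lock().unwrap() += 1;
        cvar.notify_one(); // wake exactly one waiter, not the whole herd
    }
    for w in workers {
        w.join().unwrap();
    }
}
```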

This was NOT fixed at the language level. It was fixed by adding new OS-level primitives that did the right thing.

Async-wait is another wrinkle in this, but for those of us old enough to remember life before pthreads, that was already effectively taken care of using threads (whatever the API) and existing OS-level sleep/wait primitives.

Doing this without threads is popular among the cool kids these days, but that's even harder than doing it with threads. Consequently, various languages have wrapped this sort of code into builtins in the language. Yes, that makes thread-less async wait easier to code, but it doesn't actually address the design problems where you might be using thread-less async wait to accomplish something.

The problem with waveform caching is not data races etc. (though of course, those issues are hard enough). It's figuring out what you should cache and when. The best answers vary depending on user behavior, so you need an adaptive approach that isn't particularly linear, and you also need a way to recognize when user behavior means you should clear out everything in the cache and start over.


Your recollection is spot on.

Re. CnC, how about something like:

    @thread-safe @lock-free @sparse
    map map[K,V] 
(Let's not get hung up on syntax.)

> What tends to be more like a CNC machine are libraries.

If a language or library gives you composable semantics, you may have a programming CnC on your hands. CnC requires minimally one degree of disconnect between the artefact and the artisan: a language compiler/runtime (or library) that applies composable semantics to logical & computational abstractions.


> (Let's not get hung up on syntax.)

But the syntax is precisely the only thing a language can give you that a library cannot.


> But the syntax is precisely the only thing a language can give you that a library cannot.

This might be true on a similar level to all Turing-complete languages being equivalent, but I’m not sure how useful it is beyond that level. For example, the features that a language provides as directly supported building blocks and the guarantees a language makes about how certain entities will or won’t behave and relate to each other profoundly affects the developer’s experience. That remains true even if some of those features could eventually have been recreated with a library modulo syntax and even if a perfect programmer would always use them properly and never rely on the language to prevent a mistake.


I said don't get hung up on it precisely for that reason. We're not ignoring it, just simply noting that the magic is not in syntax.


Right, but that means it has to be implemented by someone. It could be a language builtin, but almost nobody is going to switch languages for such a feature (if it could even exist as a language feature anyway). Or it could be a library, in which case the question of better languages is again moot.

The idea that the compiler could somehow pick "the right" implementation of a sparse unordered map based on a list of constraints that combine to create potentially dozens of versions strikes me as far-fetched. Even specifying "lock-free" for example is very, very far from providing enough detail about what is actually required. Wait-free? Readers-not-blocked-by-writers? Writers-not-blocked-by-readers? etc. etc.

I don't dream of a future when compilers (or something) can somehow do all this, and I'm not convinced it will arrive. But I've been (very) wrong before.


Fair points. It would need to be a meta language of sorts, with compilation stages, stuff like that. Maybe a compromise can be reached with customizable compilers. PaulDavisThe1st's meta-compiler may know which sparse lock-free doodad you want. Even a CnC machine can't just spit out anything under the sun. The key thought here is the industrialization of software production. It will (has to) happen (not that I'm pining for it :o) And our little chat here is about whether programming languages will have a role to play.


I'm referring to a similar overlapping problem called C10k. http://www.kegel.com/c10k.html

This was fixed at the OS level. But the usage of new system calls involved fundamental shifts in basically every language to account for the new paradigm.


> I've been programming for 35 years or more, and I've yet to come across situations where the hard problem was the language

The language can cause hard problems by itself. Arguably dynamic typing has been partly responsible for driving microservices beyond what's reasonable and has thus introduced accidental complexity, largely because dynamically typed languages don't have a typing discipline that enforces proper module boundaries.

That's the most obvious example to me, but far from the only one.


As a developer I would agree. As a maintainer I would disagree.

Some problems require paradigms to solve, and developing bespoke implementations makes maintenance extremely painful.


I've also been doing this for over 30 years and agree with you 100%. Never has the language been the impediment, or the thing keeping me awake at night.

We don't need more programming languages. All languages are either hated or not used. The language isn't the problem, and it's a poor craftsman that always blames the tool.

I've never been on a retrospective of a failed project and the reason being the language.


> The problems are hard because the problems are hard

I would respectfully submit that this is exactly where functional programming shines.


I would respectfully submit that you completely missed Paul's point.

He's doing real-time audio processing. So, first, he's got performance issues. In that environment, it can be very helpful to re-use a buffer without having to allocate one (and without a garbage collector). For most functional languages, that's not a place where they shine.

But, second, he's (probably) got problems like building echo cancellers. The hard problem is, how do you determine whether something is an echo? Writing it in an FP language doesn't help much with that, because the hard part is the definition (and tuning) of the algorithm, not the implementation.


> He's doing real-time audio processing. So, first, he's got performance issues.

FFTW, long the fastest FFT software available, is a program written in OCaml that generates C. You need to think more outside the box!

Another example: Copilot is a Haskell EDSL that generates a hard real-time C program for avionics.

You can write programs functionally and get your imperative performance too.


>He's doing real-time audio processing. So, first, he's got performance issues. In that environment, it can be very helpful to re-use a buffer without having to allocate one (and without a garbage collector). For most functional languages, that's not a place where they shine.

It can be, via streaming. The functional portion of the program does all the map and reduce operations on streams. Buffer allocation (and reuse) is automatically handled by the compiler. It's very similar to the way NodeJS hides the event loop. I'm not familiar with audio stuff but I'm sure there are libraries out there that allow you to do extremely efficient audio processing with high-level language primitives.
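To be fair to the narrow technical point, a map/reduce pass over a sample buffer can compile down to a single in-place loop with no intermediate allocation. A hedged sketch in Rust (the function and its role are made up for illustration; this is not a real audio library):

```rust
// A "functional" gain + peak-detect pass over an audio buffer.
// The iterator adapters compile down to one loop: no intermediate
// buffers are allocated, and the writes happen in place.
fn process(buf: &mut [f32], gain: f32) -> f32 {
    buf.iter_mut()
        .map(|s| {
            *s *= gain; // in-place map over the samples
            s.abs()
        })
        .fold(0.0f32, f32::max) // reduce: running peak level
}

fn main() {
    let mut buf = vec![0.1f32, -0.4, 0.25, -0.2];
    let peak = process(&mut buf, 2.0);
    println!("peak = {}", peak);
}
```

Whether this style scales to a full DAW engine is a separate question, but the map/reduce-on-streams part itself need not cost anything at runtime.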

>But, second, he's (probably) got problems like building echo cancellers. The hard problem is, how do you determine whether something is an echo? Writing it in an FP language doesn't help much with that, because the hard part is the definition (and tuning) of the algorithm, not the implementation.

Neural networks can be employed here to "recognize" echoes. NNs are basically functional primitives with deterministic inputs and outputs. FP is remarkably appropriate here, given that audio is basically a function.


> I'm not familiar with audio stuff but I'm sure there's libraries out there that allow you to do extremely efficient audio processing with high level language primitives.

I am extremely familiar with audio stuff, and I can assure you that there are no libraries out there that solve the problems you face when writing a digital audio workstation.

Doing non-realtime processing of file-based data? Of course! Oodles of stuff for that, and you can write it in whatever language you want.

Realtime processing of 240 tracks of audio feeding 64 channels of physical output, with hundreds of plugins? Sorry, but no, you can't do this "with a library" or in "a high level language". This, however, is what DAWs do.


Well, there are certainly use cases where purely functional languages (at least the ones that currently exist) aren’t the tool of choice… But in the grand scheme of things, like you yourself pointed out, these types of problems are not what the vast majority of engineers are working on.

I had assumed that initially you were referring to the difficulty of the “domain problems” when I wrote my comment.


I'm a Haskell fanatic, even wrote it professionally recently. I have also done a fair amount of small-scale audio programming, including trying to write a basic "soft" real-time-keeping app in Haskell, and it's not a great fit. I would probably choose Rust for anything that had to interface with hardware or tell time at this point, and that's largely because it would probably be easier for me than C/C++, which seems like the obvious choice based on ecosystem and prior art.

I think these kinds of comments made by FP advocates hurt FP advocacy more than they help, by implying that FP advocates can't see the forest for the trees. We are not in a world where the answer to every programming problem is "throw FP at it!" We will never be in that world, no matter how awesome FP is, and I'm a huge fanboy.

Go take a look at the Ardour (http://ardour.org/credits) codebase and ask yourself in all seriousness how you could approach a project like that, what the tradeoffs would be, etc. and then ask yourself how serious any of your suggestions sound to someone like who you're responding to.


I've written Haskell that interfaces with hardware. I've also done real-time video processing with C++, which is magnitudes more data than audio.

This by the way is 15 color cameras of extremely high resolution video so it's not trivial. At this level of data you can't expect real time performance without a GPU, so I go straight into CUDA for massive parallel processing performance.

You can think of each pixel in a video as 3 channels of sound, multiply that with HD resolutions and you have thousands of channels of sound flowing through the system at the same time in REAL TIME. 15 * 1080p * 3. Our pipeline also involves real time ML model processing and object tracking all written in C++/cuda.

This of course is across hundreds of similar edge devices, all of which feed relevant data into the cloud for further processing.

I am not an FP extremist. I work in systems programming, which is pretty much the opposite of FP. I am simply making the specific point that I think FP can work for audio programming. I respect the author's comment and I left it at that, even though I'm positive it's possible to write an FP language/framework that handles this specific use case of writing DAWs. It's basically a massively parallel streaming framework, not so dissimilar to what I work on.

The actual processing is easily a functional issue, and the low-level memory details can definitely be abstracted away. In fact, it's LARGELY what a DAW is when you think about how people who write plugins see it.

What really pissed me off about your comment is the superior attitude. Like the "learn how to respect the experts NEWBIE". Seriously I hate shit like that. Who the hell are you?


Yeah, I'm not into expertism either. Even if I'm the expert (or maybe especially if I'm the expert).

Anyway ..

> In fact, it's LARGELY what a DAW is when you think about how people who write plugins see it.

This is the problem though. Yes, that's how people who write plugins think about the host. But it's nothing really like the actual internals.

Comparing video and audio requirements is rarely fair and balanced. I tend to only be interested in realtime processing, not offline. Video involves far more data processing, but also missing deadlines is rarely noticeable by a human (unless you really miss a bunch of them). A single missed audio deadline (even 1 sample) is audible to any human. That's just a reflection of the different ways our eyes and ears work. So it's a bit of complex thing to try to compare the loads.

It's relatively easy to design and implement a parallel streaming framework, especially one without GUI interaction and realtime requirements. Most of the hard stuff comes from the user interaction side and the requirements it imposes on the realtime side. An engine that just takes a set of processing instructions and applies them as fast as possible to as much data as possible is significantly easier in a number of important ways.


I definitely agree. I think somewhere along the way we went from solving problems to bickering about _how_ to solve problems. I feel there are now plenty of languages that exist purely for aesthetic reasons (C++, Swift, etc) rather than having some deep technical reason for existing (C, Rust).

I guess I get it though, designing your own language is great fun and I’ve definitely made a couple myself in my spare time.


>I understand that I'm not the typical programmer these days, and I don't work on entirely typical problems, but I can't help but feel that sometimes the focus on "languages to help programmers" comes from programmers who just don't enjoy their work enough and/or don't have enough to do.

Indeed. There's a subset of programmers who like to solve "problems" and there's another subset of programmers who like to actualize their intent with minimal exposure to problems.

The latter subset is the overwhelming majority, both for what programmers themselves want and for what users want out of their programs. Solving "hard" problems is simply a niche desire.


I came across a quote the other day that is a far more accurate summary of the experience I've had:

"We don't do hard things because we like doing hard things; we do hard things because we thought they would be easy"

Certainly I naively believed that writing a digital audio workstation would be relatively easy. That was 23 years ago and I'm still writing it.


> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

I hope he is also questioning the premise that functional programming is the reason why functional languages are unpopular. Every time I wanted to dive into functional programming I was very quickly frustrated with the abysmal tooling around them and while that didn't really prevent me from trying, I simply found a better use for my time.

Most recent example: Clojure wanting me to choose one of multiple package managers before starting my first project. Before that it was package management nonsense in Haskell, and before that some Scheme interpreter not being able to even give a stack trace on errors.

IMHO a really underrated feature of Rust is how friendly its tooling is towards its users. I'd really like to see something like that for functional programming.


> I'd really like to see something like that for functional programming.

Elixir has awesome tooling and developer experience, its over 10 years old built on a vm thats 35 years old.

Has communities dedicated to domains around embedded systems, distributed/fault-tolerant systems, web development, machine learning, data pipelines, ETL, etc.

It's just not in the ML or LISP families.


Yeah, if you're looking for a functional programming language and can live without static types, you need a pretty strong reason to look further than Elixir.


...and if you want the static type checking before compile time with a C like syntax on the Erlang Virtual Machine, Gleam is pretty compelling.

https://gleam.run/cheatsheets/gleam-for-python-users/

https://gleam.run/cheatsheets/gleam-for-php-users/

https://gleam.run/cheatsheets/gleam-for-rust-users/


The question "functional programming is great, why isn't it more popular?" predates package managers by decades. That question is older than Linux, older than Haskell.

Best of luck with improving tooling, but it's not the answer.


> IMHO a really underrated feature of Rust is how friendly its tooling is towards its users. I'd really like to see something like that for functional programming.

> abysmal tooling around them

F# has amazing tooling.

Debugging works out of the box in Visual Studio and Rider, and it's impure enough I was able to get comfortable quickly and .NET core has a crazy amount built in (e.g. no fuss JSON serialization). I ported a project from C# to F# and the old bash/powershell scripts to publish a standalone executable are now just F# scripts you run with `dotnet fsi`.


As someone who's dabbled in Clojure: the project manager scene (lein vs boot vs deps CLI) is not nearly as bad as it may seem. They are different project workflow runners, not completely different ways to write code and use dependencies. They are not radically different in their core job of taking your code, downloading dependencies and running it.

If you're a beginner, you can choose any one and there's very little you're missing out on or getting locked into. You only need to worry about the choice when you start having complex development workflows.

In comparison, the JavaScript ecosystem is much, much more divided. There are different ways to write modules, different ways to build them, different package repositories, etc..


> If you're a beginner, you can choose any one

I found this to be false in practice: I did choose any one (lein actually, because my IDE suggested that it's the one you'll want to use when you don't know better). The consequence was that the examples from the Clojure page didn't work and/or printed totally different output, because they made a different choice.

It's the kind of thing where a Clojure expert would say: they are different tools, of course their output is different, but it doesn't matter because that's only the tool output and the net effect is the same. But in practice, a newbie will struggle solving very simple problems because the tutorial doesn't match reality.

Contrast that with the "rust book" (more or less the official tutorial) in which every single example behaves 100% exactly as described in the tutorial, from installing the toolchain to very complex, language-specific issues (yes, of course I'm referring to the borrow checker).


While I do think an opinionated template would be good, I also hate how some languages pretend you don't need a package manager.

Back when code was shared more in ideas than implementations, this was truer. All too often today, first thing you need to do is import a package.


Great examples. Another question that leads to frustration is "can I use my existing code editor for this"? People are turned off when the answer is "no"


What examples force not using an editor? Many have a "natives use this" choice, but few lock you to it.


Some parentheses languages are much easier to use if you're familiar with a structural editor, for example.


All languages are easier that way, though? My kids learning python are best with a ton of hand holding. Affordances that show the structure are huge.


Agreed that seeing structure is important. I mean operating on it at a higher level like https://calva.io/paredit/


This makes sense, but I don't think that is at all required for working in lisp and friends. Just as many refactoring tools are not required to get started. Certainly useful if you get running with them.


Just choose Clojure deps

I kind of agree some more clarity would be helpful but we don't have a lot of enthusiastic young programmers in Clojure land who like making super accessible stuff

It's just kind of assumed you'll put the work in because Clojure is different enough that you probably have a good reason to be here

Btw, if you want anything Clojure, Google "Clojure Slack" and they'll help you there


You’re 100% correct, I think. But it’s notable that in that case, an extra functional programming language would make things worse by dividing effort, not better.


> once asked Guido van Rossum (the creator of Python [...]) [...]

Uh-oh...

You know, I was just programming in Python yesterday, and I was once again reminded of how ANNOYING a language it is... how annoying it is that it's so popular that I'm essentially forced to code in it.

Especially now that a bunch of (probably needed) changes have been made in half-assed ways that interact clunkily with its original rather, um, idiosyncratic structure, syntax, and philosophy, which for a long time were determined almost entirely by one person's idea of what was intuitive.

I mean, it's not JavaScript-level annoying or Perl-level chaotic, but it's a pretty damned annoying language.

> Although our tools have continued to improve (such as golang),

Uh-oh...

Another language that grabbed some popularity by fixing some things, but still fails to systematically Do Things Right, even though we're now like 50 or 75 years into this whole computer thing, and are at a point where we ought to have a mature understanding of what Right is and the time to act with deliberation.

One of the ways it grabbed that popularity was, of course, by looking familiar and unintimidating to people who don't like "weird languages".

> By making different tradeoffs, these issues can be mitigated and a more broadly-appealing functional language can be built.

Uh-oh...


I'd rather use JS than python, since at least JS doesn't actively try making programming in a functional style worse


Can you explain how python is annoying?


> The hypothesis I aim to test with a new programming language is: By making different tradeoffs, these issues can be mitigated and a more broadly-appealing functional language can be built. With such a language, I hope that the true promise of functional programming for parallel computing can be widely realized.

Proceeds to give zero indication how to achieve this.


It basically comes down to nondeterminism. Pure functional languages supposedly make it easier to exploit associativity, commutativity, and distributivity to allow many expressions to have nondeterministic semantics that are amenable to concurrent execution. Argument evaluation also can be specified nondeterministic so the runtime is free to execute something like the following with two or more CPU threads:

  f(g(), h())
Of course a well designed imperative language could also avoid overspecifying deterministic semantics and open itself up to similar optimizations, but for various reasons that hasn’t caught on. One example would be specifying the conditional to select a branch with a true guard nondeterministically rather than deterministically in the order of appearance in the program text. Dijkstra’s EWDs give many practical examples of why one might want such a thing.

Another example would be deducing when statement composition is symmetric and allowing nondeterminism when it is. For example if f and g don’t have access to each other’s data then the following can be executed in two threads even with mutation:

  f(); g()
Side effects will appear nondeterministically, though, which may or may not be a problem. As always, an algorithm needs to be designed for the semantics the language has, not those the programmer wishes it had.
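
As a concrete sketch of that idea in Rust (the functions `f` and `g` are hypothetical stand-ins for independent computations, not from the article): since neither touches the other's data, the composition "f(); g()" can legally run on two threads.

```rust
use std::thread;

// Hypothetical independent computations: they share no data,
// so running them in either order (or concurrently) is equivalent.
fn f() -> i64 {
    (1..=100).sum()
}

fn g() -> i64 {
    (1..=10).product()
}

fn main() {
    // Run g on a second thread while f runs on the current one.
    // Any side effects would interleave nondeterministically.
    let handle = thread::spawn(g);
    let a = f();
    let b = handle.join().unwrap();
    println!("{} {}", a, b); // 5050 3628800
}
```

The point is that the language (or runtime) would need to deduce or be told that `f` and `g` are independent before it could apply this transformation automatically.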


This makes a good case for effect systems in conjunction with FP, so they can be reasoned about as well. Something like https://koka-lang.github.io/koka/doc/index.html

Having said that, having something like Daedelus in there as well would give a handle on temporal-based determinism, and then some allowance for delimited continuations would top things off nicely.


Very disappointed by the non-announcement. If it were a corp, it would be FUD.

I'm curious as to what these different trade-offs are.

> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

My conclusion is because most everyone is taught to think imperatively and 'play computer', so functional doesn't fit in their preconceived brains. I once had someone say SQL didn't make sense because it's SELECTing fields (i.e. vars) before they're defined/assigned later in the (binding) expressions.

People having trouble with FP is like how people who learn databases using ORM's don't deal well with SQL as expressions of set operations.

There are varying limits to how much people like and handle abstract constructs before applying the construction to the data at hand but you can do a whole lot of effective and unsurprising development before getting to such esoteric levels.

Edit: I was super-excited to open the link, and now on re-reading expect this new FP language to be as much an advancement as Go was to 'systems' programming. I hope I'm mistaken. Also odd that only Haskell and Lisp were mentioned when there are so many others. On the plus side, if Google were to take F# and back it as well as they have Go, that would be a win-win.


After reading their other post "The case for dynamic, functional programming", I would classify this as FUD, or rather NIH. Go got support by being touted as "Not Java". I can't say that this would gain much by being declared as "Not Elixir".


> it would be FUD

I thought FUD stood for Fear Uncertainty Doubt—it's usually used to indicate that something is designed to sow fear for clicks or for some ulterior motive, not for insubstantial non-announcements.

Did something in this post evoke fear?


The point of FUD is usually to carve out a space where existing products/solutions can't compete with idealized vapourware. I updated my opinion to NIH (though could be both)--and still hope to be mistaken.


In his defense the title is "why" not "how". I guess you'll have to spend some sleepless nights waiting for the follow-up.


I'm sorry to make you wait.


> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

I'm sure there are a multitude of different reason and likely require different conflicting tradeoffs to address


It's true and that is one of the largest challenges. Producing something that feels familiar (but not error prone) has been the way I've solved such conflicts so far.


It's a four paragraph blog post answering the question "why?". I'm not sure why you expect him to delve into the how at this stage: he probably doesn't have anything fully formed yet.


Thanks! Right now the language's codebase is about 50KLOC. There's an interpreter for fast iteration and a compiler built on LLVM for producing native binaries. I'd like to make more progress on the parallel computing functionality before sharing it more widely.


He's going for apple/iphone style of drip feed product announcements to build excitement

Look at us salivating for his big unveil -- we've probably already written more than he did


https://twitter.com/haxor/status/1582878447134572544?s=20

Looks like the syntax is taking shape at least.


No language syntax needs return in 2023.


How should an early return if statement work instead?


When if is an expression early returns aren't really a problem.
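
For illustration (a sketch in Rust, where `if` is an expression; the `sum` function here is made up for the example): the base-case guard simply becomes the first branch, and the function body evaluates to a value with no `return` keyword.

```rust
// With `if` as an expression, the "early return" guard is just the
// first branch; the whole body evaluates to a value.
fn sum(items: &[i64]) -> i64 {
    if items.is_empty() {
        0
    } else {
        items[0] + sum(&items[1..])
    }
}

fn main() {
    println!("{}", sum(&[1, 2, 3, 4])); // 10
}
```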


I understand what you mean. But I expect most potential users want explicit returns because they're familiar and more approachable for beginners.


I disagree. You make two claims: familiarity, and beginner friendliness.

For the first, I think it's a mistake to perpetuate the mistakes of the past. I don't think JS developers have had problems adjusting to the lack of return in arrow functions.

Beginners have no preconceived notions of how a programming language should operate, and return makes the language model more complex.

1 + 1 evaluates to 2, but (in a language that requires return)

def foo = 1 + 1

foo

doesn't, for no good reason. This breaks the simple substitution model of evaluation.

These claims could be addressed empirically, but I guess neither of us are going to do the research. :-)


For a very simple, single-expression lambda function I agree you don't need an explicit return. Even Python skips the "return" for lambdas. But for anything more complex, I find explicit returns, especially early returns, makes the code much more readable for people who are used to imperative languages.

For example, which of these is more clear to people who don't know Lisp? I'd argue the second one because of the early return if guard.

    ---- THIS ----

    (defun sum-helper (items total)
        (cond
            (items
                (sum-helper
                    (cdr items)
                    (+ total (car items))))
            (t total)))

    (defun sum (&rest items)
        (sum-helper items 0))

    (print (sum 1 2 3 4))

    ----- OR -----

    (defun sum-helper2 (items total)
        (if (not items)
            (return-from sum-helper2 total))

        (sum-helper2
            (cdr items)
            (+ total (car items))))

    (defun sum2 (&rest items)
        (sum-helper2 items 0))

    (print (sum2 5 6 7 8))


I think I'll write a blog post in response, as I believe this illustrates a much deeper and more important issue.


Great! Please send it my way so I don't miss it. Thank you


I have a draft up here: https://noelwelsh.com/posts/fp-is-based/

(It should be readable but hasn't had a final editing pass.)


The biggest thing I find myself missing from functional languages isn’t the purity or functions - it’s the clean expression of immutable data types combined with exhaustive pattern matching. All I want is algebraic data types and the ability to use the compiler to check my work.
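
As a sketch of what that combination buys you (Rust here; the `Shape` type and `area` function are illustrative, not from any particular codebase):

```rust
// An algebraic data type: a Shape is exactly one of these variants.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// `match` must be exhaustive: adding a third variant to Shape turns
// every match that doesn't handle it into a compile error, so the
// compiler checks your work across the whole codebase.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    println!("{}", area(&Shape::Rect { w: 3.0, h: 4.0 })); // 12
}
```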


Most new systems languages have this now.


Yes and I’m a huge fan of the movement towards those kind of type systems. I was more commenting on this because I feel like those kind of types frequently get lumped in FP vs OO when there’s more nuance to it


This other "The case for dynamic, functional programming"[0] post may offer hints. So I suppose it will be more like Elixir/BEAM.

> Theoretically, functional programs should: have fewer bugs, be easier to optimize for performance, allow you to add features more quickly, achieve the same outcome with less effort, require less time to get familiarized with a new codebase, etc. With a dynamic, functional language you could enjoy all of this simplicity.

In my experience use of static typing has fewer bugs and better performance than my usages of dynamic languages on all but small or short-lived projects.

I suppose this will round-out Google languages: Go, Dart, and _, until they make their own scripting or an actual systems language.

[0] https://www.onebigfluke.com/2022/11/the-case-for-dynamic-fun...


> In my experience use of static typing has fewer bugs and better performance than my usages of dynamic languages on all but small or short-lived projects.

In so far as I am aware, research doesn't seem to provide compelling evidence one way or the other here.


Without the author articulating what particular tradeoffs they are balancing, and given that they sound like they only recently discovered the parallelism advantages of functional languages and paradigms, it is hard to know for sure, but I might venture a guess that Julia will be superior to what they are building: https://julialang.org


Parallelism is concurrency (+maybe a little SIMD) done for speed's sake.

Haskell is by far the best programming experience I've ever had (all aspects included), and it's damn fast. It's just not C/C++/Rust fast in nearly all cases.

Having said that, going functional for 'parallelism advantage' (speed) is a fool's errand (for now).

The same thing is said about how incredibly fast JIT has become. (But AOT is faster!)

Want speed? C/C++/Rust.


The conversation happened in ~2008.

Julia has a lot of great ideas in it for sure. Why hasn't it gotten more popular?


When it gets out of its data-science pedigree and can be used for standard apps. When it gets good tooling. When its stack-traces stop being arcane gobbledygook. When it can be compiled to a single binary at the command line without going through arcane hoops. When it gets interfaces/protocols/traits.

At the moment, Julia is a nice language at v1alpha1 for scripts and data exploration.


I agree it lacks all of these things, but in scientific computing - at the current state - it is already ahead of fortran, python, and C++ in terms of convenience. Precompilation doesn't matter as much here, and the packaging and JIT compilation as well as relatively simple FFI are making one's life a lot easier.

And NB, python also doesn't offer many of these features, such as dependency management and simple single-binary builds. Yet it's popular.


"And NB, python also doesn't offer many of these features, such as dependency management and simple single-binary builds"

Python has tools for both of these - including virtualenv in standard Python. Nuitka if you want native compilation to a single binary. Julia has none.


> Julia has none.

Weird, since we ship natively compiled single binaries with Julia to customers all of the time using the standard open source tooling...


I did mention simple single-binary builds using the CLI (i.e. it just works with one command). Not fiddling around for hours with PackageCompiler.jl. Fiddling around with snoop files, then being forced to ask questions on the forums to do what other languages do out of the box, is not the way to go for developer ergonomics.


Uhh, you don't have to do anything with snoop files, and it's just a one line CLI call.

    julia -e 'using PackageCompiler; create_sysimage(["MyPackage"], sysimage_path="MyPackage.so"; precompile_execution_file = "MyScriptOfWhatToCompile.jl")'
and now you have a binary. How are people "forced to ask questions on the forums" if the only thing to do is change file location names? Are you talking about PackageCompiler from 2019 or PackageCompiler from 2023?


The problem is that those are f**king difficult problems to solve, regardless of how you approach them. No tool or paradigm will make them easy; it may make a part of them easier (at some other cost somewhere else), but by no means will it make them easy.

Building large scale, low latency systems is really hard. Handling concurrency and shared state is hard.

Heck, even functional programming is hard to grok for most people. The evidence is in its popularity (or lack of it).

There's no large mystery why we all are not using functional programming exclusively: because it's hard. Imperative, with all its drawbacks, is easier to grasp and closer to how hardware works. Most people find it more natural; there's no big conspiracy.

But (there's always a "but"), the functional paradigm is essential to solve all those hard problems. It provides many tools with which many common pitfalls are essentially avoided, but gaining a comfortable grasp of those tools takes work and it's not as easy as many claim.

They seem easy after you've had the "aha!" moment, but it's like advanced math, getting there takes time and a certain proclivity to it. Once you develop some sort of intuition, you tend to forget the effort it took to gain it.

In any case, I love languages and new ideas, so I hope he succeeds (or fails in an interesting way where we learn something new).


This must be a troll post because F# already exists and is used extensively in production in Azure.


From the article: "My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages. The hypothesis I aim to test with a new programming language is: By making different tradeoffs, these issues can be mitigated and a more broadly-appealing functional language can be built. With such a language, I hope that the true promise of functional programming for parallel computing can be widely realized."

It's a laudable aim, and I wish you all the best with it. A new language with its own 'zen', similar to the 'zen of python' but with a functional flavour, is something I'd love to see, and I think it'd do well.

Is there something I can read that's a little more specific about the "different tradeoffs" you have in mind?


Thank you. I think you've basically summarized the approach! If you have any specific guidance to that end please send me an email or DM with details :)


> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

Personally, I hate type systems. I mean, I tolerate them, but >90% of the benefit I get is via better auto-complete (and possibly better performance).

Functional languages (that aren't lisp derived*) all seem to love complicated type systems. And in these systems, at some point I experience the same abstraction collapse as I do in a java code-base when I have to sift through 87 layers of interfaces and sub-classes.

I'm also, self-admittedly, a devotee of the New Jersey style of design, and consider usefulness to be much more important than correctness.

*The lisp ones are kinda fun, but I always end up back in dirty-old common lisp where I can do crazy do* shenanigans instead of functional lisp land.


I mean, if you want to vomit up buggy software, then not having a type system is a pretty good accelerator for that.


Correctness in a program is a bit like Partition-tolerance in distributed systems, isn't it?

It's not an option to write (too) incorrect software, even if doing so is a lot faster. At the end of the day, the software needs to do its work.


Sure, but that's an impact in usefulness. I value correctness as far as it makes the program more useful, but relatively simple type systems are more than enough to get there.


A program can't be too incorrect (in certain directions) without it affecting usefulness. The type correctness of the language it's written in isn't one of those directions, though.


In my idle brain cycles I've been thinking about a hypothetical implicitly-parallel computing architecture that could be built upon the concept of processing events, instead of the current mostly-linear execution that happens to get interrupted every now and then.

Everything would be an event: The computer powers on, user presses a key, an app icon is clicked, OS launches a background task, or an app wants to call its internal routine.. and a central scheduler or something sends that event to all available CPUs/cores, that could work on it simultaneously preferably without needing to worry about getting in each other's way.

I'm sure this idea has been around for a while and already has a name. Does anybody know?


So why would Clojure or Erlang or Elixir fail to satisfy the need for a functional language capable of taking advantage of the available cores?


I don't think he's arguing that the existing functional languages can't take advantage of all cores, it sounds like he believes that the existing functional languages cannot reach mainstream adoption because of fundamental flaws that he hopes to address:

> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages. The hypothesis I aim to test with a new programming language is: By making different tradeoffs, these issues can be mitigated and a more broadly-appealing functional language can be built. With such a language, I hope that the true promise of functional programming for parallel computing can be widely realized.


Exactly.


So what advantages to comprehension or usability would you like to offer over Elixir? (as it doesn't really focus on type/category theory like most other FP languages)


I'm not an Elixir expert at all, but one important feature I believe is missing: explicit returns.


um, what?

Pretty common in FP languages to have implicit returns. A function HAS to return something. Ideally, a function is just a mapping between one value and another value. Wouldn't make much sense to have explicit returns.

For everything else there are guards, or just a simple if/else block. If everything is an expression (And if you are making an FP lang, almost everything should be.), the last expression will always be returned.


> My conclusion is that there are major, well-justified reasons for why people strongly dislike functional languages.

What are those reasons, in your view?


Seems like he's talking about parallelism rather than concurrency? Erlang does the latter pretty well, but I'm not sure it's that great for "I have a lot of processing to do and I want the language to help parallelize it", especially since it's also not the fastest thing out there.


Have you used Broadway? It's pretty fast.

I think people forget how expensive it is to serialize/deserialize data between machines/nodes.


There are a lot of people that have no interest in FP and hate reading functional code which involves reading and understanding recursion, fix-point style, partial application, currying, lazy evaluation, immutable data structures, lambdas, and combinators.

Then you have to think of performance characteristics, and debugging, onboarding new engineers, and code reviews.

I think it's easy to see why the industry doesn't use FP languages, and because of this people see no reason to even bother with it.

I myself would rather use ML family languages or Clojure or BEAM but it will never happen. The majority want to use Python or Go since they are slim on abstraction and easy to pick up. There's no bullshit involved.


The article is light on details, but as a polyglot programmer, I encourage him.

I am an enthusiastic, but far from expert, Haskell programmer. I wonder if Brett is thinking of a more Python-like interpreted functional language?


Thank you! Imagine everything we like about Python, remove some warts, make it functional. It should feel familiar and appealing to a Python programmer.


Based on above it seems you want to test the hypothesis that 'there is a python-like functional language in the space of FP'.

Is there anything ~technical you can share beyond desires and disappointments that leads you to believe this is a reasonable hypothesis?


I've got a working interpreter and compiler for the initial language. I've had a few people take it for a spin and they were able to rapidly learn it and start contributing, despite having expressed frustration in trying FP languages in the past. All anecdotes for now!


Ah, you already have it. Now we're all intrigued! Looking forward to the public release.


I'm pretty deep in on Elixir, but I'd be quite interested to take a look.


Functional is a good way to 'specify' how a certain calculation/change has to be performed. When working with distributed systems with multiple actors making changes, you have to treat the whole system (distributed database) as a single value on which your functions operate. Then you can use the commutative properties of functions to implement transactions. (If two operations on the system commute, they can be applied in either order on the system, to result in the same state.)

But it's a long way from such a specifying approach to a working system.
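
A minimal sketch of the commutativity idea in Rust (the counter-map "state" and `inc` operation are made-up examples): two operations that touch independent keys commute, so a replica can apply them in either order and reach the same state.

```rust
use std::collections::HashMap;

// An operation on the (single-value) system state: bump a counter.
fn inc(state: &mut HashMap<String, i64>, key: &str, by: i64) {
    *state.entry(key.to_string()).or_insert(0) += by;
}

fn main() {
    // Replica A applies the two operations in one order...
    let mut a = HashMap::new();
    inc(&mut a, "x", 1);
    inc(&mut a, "y", 2);

    // ...replica B applies them in the other order.
    let mut b = HashMap::new();
    inc(&mut b, "y", 2);
    inc(&mut b, "x", 1);

    // Because the operations commute, both replicas converge.
    assert_eq!(a, b);
    println!("converged");
}
```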


Virding's First Rule of Programming:

Any sufficiently complicated concurrent program in another language contains an ad hoc informally-specified bug-ridden slow implementation of half of Erlang.


There are 172 comments so far, but Scala has been mentioned only 3 times, while it has, I think, all the features said to be needed in a functional programming language. So what's wrong with Scala?

I'd like Scala to be more successful although I'm using Java and a bit of Kotlin now.


Scala has a lot of interesting concepts in it for sure! The problem is that the JVM is a deal-breaker, either because of its complexity or because of the associated licensing risk. Clojure has the same problem, for what it's worth.


I think intrinsic to that deal-breaker is the underlying problem. Memory management in a parallel environment is a tough nut to crack. Leaning on the JVM punts on the issue.

There's no reason JavaScript couldn't transparently implement parallel map/filter/etc. Other than the fact it is really hard to do. On the other hand a lot of the things JavaScript engines do today are also really hard, so maybe one day it'll happen.


how did this person possibly write this entire paragraph without once mentioning Erlang or Elixir? It sounds like exactly what he's looking for


Right?


Wait till he discovers it and then...

"...Fuuuuck." lol


It’s not just a “functional programming language”, it’s a lisp!

https://mobile.twitter.com/haxor/status/1580959518984241158

It will never be popular.


emacs-lisp is not popular? autolisp is not popular?

think again


I'll drop this here: https://github.com/manuel/wat-js

If you have delimited continuations then you can construct coroutines/threads/await/async, promises etc.

I guess that this might be suitable for many scenarios thanks to nodejs, but the runtimes it relies on are not exactly small.


What language is he working on though? Can we see the initial ideas in the syntax?


As always for those niche languages, if the answer is not personal entertainment, then the reality is that it's just poor judgement.


Short answer, because I am bored.


"I want to be paid a fortune for making a programming language, so I need a reason."

Of course, I would do the same in their position.


It's an unpaid personal open source project.


I’m surprised he doesn’t mention Scala and Spark.


I think Spark is a good example of where the functional paradigm has done well in a niche.


He has set, not functional. See his Twitter


It's SSA. If you use (define) in Racket is it no longer functional?


If the setter is at compile-time it might still be functional. If it's at run-time, not. A run-time setter must be a binding (as in lambda args), not an assignment.


Because 14 wasn’t enough. https://xkcd.com/927/


While it is certainly true that functional programming provides an intrinsic advantage for programming parallelism, parallelism is limited by hardware constraints. To best take advantage of CPU architectures, "function" execution needs to optimize usage of caching layers and execution threads. This naturally lends itself to the development of event loops/schedulers in functional runtimes like node and erlang.



