Hacker News

> I felt similar with lenses. The problem lenses solve is horrible. You don’t even want that problem.

Lenses abstract properties in a composable manner. How is this problem horrible?
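For readers who haven't met them: a lens is just a first-class getter/setter pair, and composing two lenses focuses through nested structure. A minimal sketch in Python (the `Person`/address shapes are made up for illustration):

```python
from typing import Callable, Generic, TypeVar

S = TypeVar("S")
A = TypeVar("A")

class Lens(Generic[S, A]):
    """A lens is a first-class getter/setter pair for one 'slot' in S."""
    def __init__(self, get: Callable[[S], A], set: Callable[[S, A], S]):
        self.get = get
        self.set = set

    def compose(self, inner: "Lens") -> "Lens":
        # Composing lenses focuses through nested structure:
        # get goes inward, set rebuilds each level on the way out.
        return Lens(
            get=lambda s: inner.get(self.get(s)),
            set=lambda s, a: self.set(s, inner.set(self.get(s), a)),
        )

# Dict-based records; each setter returns a new copy, never mutates.
address = Lens(lambda p: p["address"], lambda p, a: {**p, "address": a})
city = Lens(lambda a: a["city"], lambda a, c: {**a, "city": c})
address_city = address.compose(city)

person = {"name": "Ada", "address": {"city": "London"}}
print(address_city.get(person))           # London
moved = address_city.set(person, "Paris") # new nested dict; person untouched
print(moved["address"]["city"])           # Paris
```

The point of contention in this thread is exactly the `set` path: every composed level allocates a fresh copy.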

> FP can be pragmatic as well. You’re going to glue up monad transformers, use lenses like there’s no runtime cost, and compute whatever you need in days, but at least you know it works. Maybe there’s accidentally quadratic behavior in lifting or lenses, but that’s by design. The goal is to just throw software at things as fast as possible, as correctly as possible.

Any abstraction can be used inappropriately. Slavish adherence to an approach in spite of empirical evidence is a statement about those making decisions, not the approach itself.

In other words:

  A poor craftsman blames his tools.


Lenses: solve a problem elegantly (if you can hide the boilerplate) but inefficiently. It's a self-caused problem, created by having extremely nested records. How did you get to a point where you have a structure that's hard to work with?

Lenses are exactly glue for throwing software at things as fast as possible as correctly as possible. A poor tool.

The very need for lenses often indicates that the data model has been designed in a way that's hostile to direct, ergonomic manipulation. A glued-up steampunk contraption, a side effect of throwing software at everything as fast as possible. And they were invented in a language environment where they can't be efficient.

Monad transformers: lifting has quadratic cost, since every base operation must be lifted through every layer above it, on top of the per-layer runtime overhead. Not really composable either. As with effect systems in other languages, control flow becomes very unclear depending on the order in which the transformers are applied.
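The quadratic-lift complaint can be made concrete with a toy cost model (this is not a real transformer stack, just a count of the wrapping work): each of n base operations must be re-wrapped once per layer of an n-deep stack, so the total wrapping is O(n²).

```python
wrap_count = 0

def lift(value, depth):
    """Lift a base-monad value through `depth` transformer layers."""
    global wrap_count
    for _ in range(depth):
        value = ("layer", value)   # one wrapper per transformer layer
        wrap_count += 1
    return value

DEPTH = 5
for op in range(DEPTH):            # n base-monad operations...
    lift(op, DEPTH)                # ...each lifted through n layers

print(wrap_count)                  # 25: n operations * n layers = O(n^2)
```

In a real stack the wrapping is done by `lift` instances rather than tuples, but the traversal count is the same.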

Lenses and monad transformers are just a nice trick that you shouldn't ever learn.

But I agree with your last statement: many of these libraries are just poor craftsmen giving us new tools that they made for problems we never want to have.

It's similar to dependency injection, why would anyone need a topological sort over dependencies and an automatic construction of these dependencies? Is it so hard to invoke functions in the right sequence? Sounds like you've made a program with too many functions and too many arguments. (or in oop, too many classes with too much nesting and too many constructor args)
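The "just invoke functions in the right sequence" alternative the comment is pointing at looks like this; the class names and config value are made up for illustration. With a handful of components, the construction order you write by hand *is* the topological sort a DI container would compute:

```python
# Manual wiring: declaring constructors in dependency order.
class Config:
    def __init__(self):
        self.dsn = "postgres://localhost/app"   # illustrative value

class Database:
    def __init__(self, config: Config):
        self.dsn = config.dsn

class UserService:
    def __init__(self, db: Database):
        self.db = db

# A DI container would discover this order by sorting the dependency
# graph; here we simply write it down.
config = Config()
db = Database(config)
users = UserService(db)
print(users.db.dsn)
```

The container only starts to pay for itself when the graph is large enough that hand-ordering becomes error-prone, which is the comment's point: maybe don't build a graph that large.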

These tools are "pragmatic". Given the mess that will naturally arise due to poor craftsmanship, you'll have these nice tools to swim well in an ocean overflowing with your own poop.


> Lenses: solve a problem elegantly (if you can hide the boilerplate) but inefficiently. It's a self-caused problem, created by having extremely nested records. How did you get to a point where you have a structure that's hard to work with?

I see the value of lenses from a different perspective, in that they can generalize algorithms by abstracting property position within an AST such that manipulation does not require ad hoc polymorphism. For example, if there exists an algorithm which calculates the subtotal of a collection of product line items, lenses can be used to enable its use with both a "wish list" and a "purchase order."
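A sketch of that idea in Python, with a plain getter standing in for the lens (all field names and shapes here are invented for illustration): one subtotal algorithm, reused across two differently shaped documents.

```python
from typing import Callable, TypeVar

S = TypeVar("S")

# One subtotal algorithm, parameterized by a getter that says where
# the line items live inside each document shape.
def subtotal(doc: S, items_of: Callable[[S], list]) -> float:
    return sum(item["price"] * item["qty"] for item in items_of(doc))

wish_list = {"owner": "Ada",
             "wishes": [{"price": 10.0, "qty": 2}]}
purchase_order = {"po_number": 7,
                  "lines": [{"price": 10.0, "qty": 2},
                            {"price": 3.0, "qty": 1}]}

print(subtotal(wish_list, lambda w: w["wishes"]))       # 20.0
print(subtotal(purchase_order, lambda p: p["lines"]))   # 23.0
```

A full lens would also let `subtotal`'s caller write items back; the getter half is all this particular algorithm needs.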

Another thing they cleanly solve is properly representing a property value change with copy-on-write types. This can get really ugly without lenses in some languages.
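The ugliness being referred to is that with immutable records, changing one nested field means rebuilding every ancestor by hand. A sketch with frozen dataclasses (field names are illustrative); a lens's `set` is what hides this plumbing:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Address:
    city: str

@dataclass(frozen=True)
class Person:
    name: str
    address: Address

# Without a lens: every level from the change up to the root must be
# rebuilt explicitly.
def set_city(p: Person, city: str) -> Person:
    return replace(p, address=replace(p.address, city=city))

ada = Person("Ada", Address("London"))
moved = set_city(ada, "Paris")
print(moved.address.city)   # Paris
print(ada.address.city)     # London -- original untouched
```

At two levels this is tolerable; each additional level of nesting adds another `replace` call, which is the boilerplate a composed lens eliminates.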

I respect your take on them, though, and agree their definitions can be cumbersome if they have to be written manually.


I understand your examples, but I'd say ASTs are rare and they're already a convenience, not a performance or efficiency choice. You're using ASTs because you'll be able to write other code quickly.

For the wish list and purchase order, just think of the boilerplate you have to write to get one computation over variable data shapes, compared to simply writing what you want twice.

Copy-on-write types are easy if they are shallow; I'd question why they're so deep that you need inefficient lens composition to modify a deep value. We've already invented relational structures to deal with this. I'm assuming you care about history, so copy-on-write is important and not purely an exercise in wasteful immutability.
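The relational alternative being alluded to: normalize the nesting into id-keyed tables with references between them, so a "deep" update becomes a shallow write to one table and no lens composition is needed. A sketch with made-up data:

```python
# Flat, relational shape: entities keyed by id, references instead of nesting.
people = {1: {"name": "Ada", "address_id": 10}}
addresses = {10: {"city": "London"}}

# Updating Ada's city is now a shallow copy-on-write of one row in one
# table; the people table is untouched and still points at id 10.
addresses = {**addresses, 10: {**addresses[10], "city": "Paris"}}

print(addresses[people[1]["address_id"]]["city"])   # Paris
```

The trade-off is that reads must follow the id references, which is the usual normalization bargain.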


> For the wish list and purchase order, just think of the boilerplate you have to write to get one computation over variable data shapes, compared to simply writing what you want twice.

This example is simple enough to not have to use lenses for sure. Another example which may better exemplify appropriate lens usage is having properties within REST endpoint payloads used to enforce system-specific security concerns. Things like verifying an `AccountId` is allowed to perform the operation or that domain entities under consideration belong to the requestor.
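A sketch of that pattern, again with a plain getter standing in for the lens (the payload shapes, field names, and allow-list are all invented for illustration): one authorization check, reused across endpoints whose payloads keep the account id in different places.

```python
from typing import Callable, TypeVar

P = TypeVar("P")

# One security check, parameterized by a getter that extracts the
# requesting account id from whatever shape this endpoint receives.
def authorized(payload: P, account_id_of: Callable[[P], str],
               allowed: set) -> bool:
    return account_id_of(payload) in allowed

transfer = {"from": {"account_id": "acct-1"}, "amount": 50}
profile_update = {"account": {"id": "acct-2"}, "bio": "hi"}

allowed = {"acct-1"}
print(authorized(transfer, lambda p: p["from"]["account_id"], allowed))   # True
print(authorized(profile_update, lambda p: p["account"]["id"], allowed))  # False
```

The check's logic is written once; each endpoint supplies only the focus into its own payload shape.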

> Copy-on-write types are easy if they are shallow; I'd question why they're so deep that you need inefficient lens composition to modify a deep value. We've already invented relational structures to deal with this. I'm assuming you care about history, so copy-on-write is important and not purely an exercise in wasteful immutability.

While being able to track historical changes can be quite valuable, using immutable types in a multi-threaded system eliminates having to synchronize mutations (thus eliminating the possibility of deadlocks) and removes the potential for race conditions. This greatly simplifies implementation logic (and its verification) while also increasing system performance.

The implication of using immutable types that must still reflect change over time is most easily addressed with copy-on-write semantics. Lenses generalize this functionality in a composable manner. They also propagate nested property changes, so that the result of a desired property change is a new immutable root instance reflecting it. Add to this the ability to generalize common functionality as described above, and robust logic can be achieved with minimal duplication.

It is for these and other reasons I often find making solutions with immutable types and lenses very useful.




