
I don't know much about how it got started. I'm curious how much of Rust's capabilities depend upon recent CS breakthroughs. Could we have made Rust in 1990?

The compiler is also relatively slow. Would Rust have been worth working with on 30+ year old hardware?





> Could we have made Rust in 1990?

No. Massively oversimplifying, Rust could be described as a bunch of ideas pioneered among functional languages coming back to C++, the same way Java was a bunch of ideas from Lisp coming back to C. There is very little that's truly new in Rust; it's just a mix of features that had rarely been put together before.

> The compiler is also relatively slow. Would Rust have been worth working with on 30+ year old hardware?

What makes Rust slow to compile is largely independent of what makes it unique. A lot of text has been written about this, but the (again massively oversimplified) version is that if the designers had cared about compile times while the language was being designed and the compiler written, you could have something that's very similar to Rust but also very fast to compile.


> The compiler is also relatively slow. Would Rust have been worth working with on 30+ year old hardware?

As I understand it, a lot of the slowness of the Rust compiler comes from LLVM, and from how rustc and LLVM interoperate. Rustc creates and sends gigabytes of stuff to LLVM, which passes all of that to its optimizer. If you skip that work, for example by running cargo check, the compiler is an order of magnitude faster.

If Rust had been invented in the '90s, it wouldn't have used LLVM. Rust could still have been implemented, and we'd probably have a much faster compiler as a result. But it would have missed out on all the benefits of LLVM too. It would have needed its own backend, which would have been more work. The compiler probably wouldn't have been as good at low-level optimisations, and it probably wouldn't have had out-of-the-box support for so many target platforms. At least, not from day one.


The obvious(?) question is why it sends gigabytes of stuff to LLVM and if that can't be reduced somehow.

> The obvious(?) question is why it sends gigabytes of stuff to LLVM

IIRC it's a combination of technical debt from earlier in Rust's life (it's easier to generate naive LLVM IR and let LLVM's optimizer do the heavy lifting of chewing through it) and of how Rust implements generics via monomorphization.
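
To make the monomorphization point concrete, here's a minimal sketch (the function name is made up for the example). Every distinct instantiation of a generic function becomes its own function body in the IR handed to LLVM:

    // Generic function: rustc emits one compiled copy per concrete type used.
    fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
        let mut max = items[0];
        for &item in &items[1..] {
            if item > max {
                max = item;
            }
        }
        max
    }

    fn main() {
        // Two instantiations -> two separate bodies in the LLVM IR:
        // largest::<i32> and largest::<f64>.
        let _ = largest(&[1, 2, 3]);
        let _ = largest(&[1.0, 2.0, 3.0]);
    }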

> and if that can't be reduced somehow.

I believe the technical debt bit can be (and is being!) reduced by implementing optimizations and better IR generation in rustc itself. As for the monomorphization strategy, I thought I remembered reading something about how Rust technically allows generics to be implemented via non-monomorphization strategies like type erasure/dynamic dispatch, but I can't seem to find that post/article/whatever now, so I'm not sure I'm not making it up. That being said, there are patterns to reduce the amount of code generated (e.g., a generic facade that forwards to a non-generic implementation, sketched below), but those have to be written by hand at the moment, and I don't think there's significant work towards automating them.
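
The facade pattern looks roughly like this (a sketch; read_config is a made-up name, though the standard library uses the same trick internally for some path-taking functions):

    use std::fs;
    use std::io;
    use std::path::Path;

    // Thin generic facade: monomorphized once per path type P, but it's tiny.
    pub fn read_config<P: AsRef<Path>>(path: P) -> io::Result<String> {
        // All the real work lives in one non-generic function, so it is
        // compiled exactly once no matter how many path types callers use.
        fn inner(path: &Path) -> io::Result<String> {
            fs::read_to_string(path)
        }
        inner(path.as_ref())
    }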


For the latter, the term you want to search for is polymorphization. It is about making the compiler do the work that those manual façades do, automatically.

Yeah, that looks familiar. Thanks for the pointer!

I think the answer is probably that Rust was possible in the 1980s and 1990s, but such a thing just wasn't practical.

Rust is notoriously compiler-intensive. That wouldn't have been tolerated in the early PC era, when you needed a fast compiler, and "worked on my machine" meant it should work on yours. Ship it.


It wasn't really possible. We had neither the PL techniques nor the computational power to make something like Rust work at the time. All the answers people are throwing around to show it would have been possible rely on a garbage collector and require a runtime, or make other unacceptable compromises (e.g. no use-after-free because you aren't allowed to free).

> Could we have made Rust in 1990?

We did; it was called OCaml. If we'd had any sense we'd've rewritten all our systems code in it. But since C had bigger numbers on microbenchmarks, no-one cared.


One of Rust’s biggest and best features is its trait system inspired by Haskell’s type classes. It is the right abstraction for most use cases that in 1990 were implemented by OOP and inheritance. Now the basics of type classes were invented by Wadler in 1988, but certain more advanced features (type families) were only invented in 2005. You mention OCaml but that’s only a small part of Rust’s type system.

So the answer is no, because humanity's collective expertise in programming language theory simply wasn't enough in 1990, unless the Rust developers had independently invented such features instead of copying them from GHC Haskell.
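
To give a flavour of why traits displace inheritance here: a trait, like a type class, can be implemented for a type you don't own, after the fact, which classic inheritance can't express. A minimal sketch (the Describe trait is invented for the example):

    // A trait plays the role a base class or interface played in 1990s OOP.
    trait Describe {
        fn describe(&self) -> String;
    }

    // Unlike inheritance, we can implement it retroactively for an existing
    // type we didn't define, such as i32 from the standard library.
    impl Describe for i32 {
        fn describe(&self) -> String {
            format!("the integer {}", self)
        }
    }

    fn main() {
        println!("{}", 42_i32.describe());
    }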


> One of Rust’s biggest and best features is its trait system inspired by Haskell’s type classes. It is the right abstraction for most use cases that in 1990 were implemented by OOP and inheritance. Now the basics of type classes were invented by Wadler in 1988, but certain more advanced features (type families) were only invented in 2005. You mention OCaml but that’s only a small part of Rust’s type system.

I submit that those advanced features are at most a tiny fraction of why projects like the OP's are seeing benefits from moving to Rust. E.g. I wouldn't be at all surprised if this Rust-on-Android project isn't using type families at all, or is using them only in an incidental way that could be replaced without significantly compromising the benefits.


Any Rust code longer than ~20 lines uses Rust iterators, which use type families. The Iterator trait in Rust has an associated type called Item; that is the innovation here. Classic type classes can only contain functions, not types. It was in 2005 that a paper was written showing how having a type inside a type class makes sense, including how it can be type checked (via entailment of type class predicates with type equality) and type inferred (by changing the standard HM system to return partial type equality constraints in addition to substitutions).

Now, if Rust did not have such language features, maybe it would have implemented iterators very differently. Current Rust iterators are similar to Java iterators, and in Java, iterators themselves have a type parameter, rather than an associated type inside the iterator trait.
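
Concretely, the two designs look roughly like this (the first matches std's actual declaration; the second is a Java-style sketch with a made-up name, for contrast):

    // std's design: Item is an associated type, an output determined
    // entirely by the implementing type.
    trait Iterator {
        type Item;
        fn next(&mut self) -> Option<Self::Item>;
    }

    // Java-style design: the element type is an input parameter, so one
    // type could implement JavaStyleIterator<u8> and JavaStyleIterator<char>
    // at the same time.
    trait JavaStyleIterator<T> {
        fn next(&mut self) -> Option<T>;
    }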


Are associated types type families? I'm not sure. They seem more similar to functional dependencies to me. But I'm only familiar with type families from Haskell so maybe I'm missing some broader context.

Yes, they are type synonym families. And yes, they are quite similar to functional dependencies: associated types were conceived as an alternative to functional dependencies, but they express the programmer's intent more clearly.

> Classic type classes can only contain functions not types...

> Now if Rust did not have such language features maybe it would have implemented iterators very differently. Current Rust iterators are similar to Java iterators, and in Java, iterators themselves have a type parameter, rather than having an associated type inside the iterator trait.

True, although I'm not sure how much difference it makes in a language with first-class modules. But more importantly, how much difference does it make at the point of use? As far as I can see, the overwhelming majority of Rust iterator code looks pretty much the same as what one would write in OCaml or Java.


All of the memory safety stuff is independent of the trait system, to the best of my knowledge, but the data race protection is implemented through the Send and Sync traits. I'm unsure if there is an obvious alternative approach to this same feature, but I think it may be one innovation that is still novel to Rust and would not have existed in earlier decades.
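
To illustrate (a minimal sketch, not the full story): Send and Sync are marker traits the compiler checks at thread boundaries, so sharing a non-thread-safe type across threads becomes a compile error instead of a data race.

    use std::rc::Rc;
    use std::sync::Arc;
    use std::thread;

    fn main() {
        let local = Rc::new(0);

        // This would not compile: Rc<i32> is not Send, because its
        // non-atomic reference count could race across threads.
        // thread::spawn(move || println!("{}", local));

        // Arc is the thread-safe counterpart and is Send + Sync.
        let shared = Arc::new(0);
        thread::spawn(move || println!("{}", shared))
            .join()
            .unwrap();

        let _ = local; // silence the unused-variable warning
    }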

Rust also has the bigger numbers on microbenchmarks; that's why people care about it.

Yes, exactly. It's tragic that the only way programming culture ever improves is when a language comes out that's better for writing software in and happens to also have bigger numbers on microbenchmarks.

I think you're generalizing a bit too much. Rust targeted the audience that wanted big numbers on microbenchmarks, but not all languages do. TypeScript, for example, has no performance advantage over JavaScript, but it became very popular due to the better dev experience. Kotlin is another example where that mattered a lot.


