
>Bergmann agreed with declaring the experiment over, worrying only that Rust still "doesn't work on architectures that nobody uses".

I love you, Arnd. More seriously, this will become an issue when someone starts the process of integrating Rust code into a core subsystem. I wonder whether this will lead to the kernel dropping support for some architectures, or to Rust doing the necessary work. Probably a bit of both.



There are two separate ongoing projects to compile Rust through GCC: gccrs, a new Rust frontend (written in C++) being added to GCC itself, and rustc_codegen_gcc, which keeps rustc's frontend but hands code generation to GCC via libgccjit.

The long-term solution is for either of those to mature to the point where there is Rust support on every architecture GCC supports.
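
For anyone who wants to poke at them, the two approaches are invoked quite differently today. A rough sketch (the -Z flag is nightly-only; the library path is illustrative):

    # rustc_codegen_gcc: rustc keeps its own frontend and hands
    # code generation to GCC through libgccjit
    rustc -Zcodegen-backend=/path/to/librustc_codegen_gcc.so hello.rs

    # gccrs: a from-scratch Rust frontend inside GCC itself
    gccrs hello.rs -o hello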


I wonder how good an LLVM backend for these rare architectures would have to be to count as “good enough” for the kernel team. Obviously correctness should be non-negotiable, but how important is it that the generated code for, e.g., Alpha is performant when it only serves somebody’s hobby?
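
You can already get a feel for the gap from the toolchain side. For example (illustrative; LLVM dropped its Alpha backend years ago, so the grep comes back empty, while the kernel tree still carries the port):

    rustc --print target-list | grep -i alpha   # nothing
    ls linux/arch/alpha                         # still there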


I suspect more the latter than anything. It could be that by the time Rust gets used in the kernel core, one or both of the GCC implementations would be functional enough to compile the kernel.

I'm curious though, if someone has an ancient/niche architecture, what's the benefit of wanting newer kernels to the point where it'd be a concern for development?

I presume that outside of devices and drivers, there's little to no new development on those architectures. In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?


No one is doing any kind of serious computing on 30-year-old CPUs. But the point of the hobby isn’t turning on the computer and doing nothing with it. The hobby is putting together all the pieces you need to turn it on, turning it on, and then doing nothing with it.

There’s an asymmetry between what the retro computing enthusiasts are asking for and the amount of effort they’re willing to put in. This niche hobby benefits from the free labour of open source maintainers supporting their old architectures. If the maintainers propose dropping support because of the cost of maintenance, the hobbyists rarely step up. Instead they make it seem like the maintainers are the bad guys doing a reprehensible thing.

You propose they get their hands dirty and cherry-pick changes from newer kernels. But they don’t want to put in effort like that. And they might just feel happier knowing they’re using the “real” latest kernel.


> I'm curious though, if someone has an ancient/niche architecture, what's the benefit of wanting newer kernels to the point where it'd be a concern for development?

Wanting bug fixes (including security fixes, because old machines can still be networked) and feature improvements, just like anyone else?

> I presume that outside of devices and drivers, there's little to no new development on those architectures.

There are also core/shared features. I could very easily imagine somebody wanting, e.g., eBPF features to get more performance out of ancient hardware.
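
Even a one-liner shows the appeal. For example (purely illustrative), counting read() syscalls per process with bpftrace, no kernel rebuild needed:

    bpftrace -e 'tracepoint:syscalls:sys_enter_read { @[comm] = count(); }'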

> In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?

Because backporting bits and pieces is hard, and especially hard to do reliably without creating more problems.
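
Even the mechanical part of it is less easy than it looks. A sketch of what's involved (tag and commit are placeholders):

    # start from a pre-6.1 LTS base
    git checkout -b alpha-backports v5.15.100
    # carry one upstream fix back; conflicts are likely whenever
    # the surrounding code has been refactored since
    git cherry-pick <upstream-commit-sha>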


The kernel must adapt to Rust, not the other way around. Rust is the way!



