Hacker News | new | past | comments | ask | show | jobs | submit | mistercow's comments

> and don’t think that the programmer more than the languages contribute to those problems

This sounds a lot like how I used to think about unit testing and type checking when I was younger and more naive. It also echoes the sentiments of countless craftspeople talking about safety protocols and features before they lost a body part.

Safety features can’t protect you from a bad programmer. But they can go a long way to protect you from the inevitable fallibility of a good programmer.


I never said anything about unit testing or type checking. Last time I checked, C/C++ are strongly typed, but I guess I'm just too naïve to understand.


Multiple severe attacks on browsers over the years have targeted image decoders. Requiring an implementation in a memory safe language seems very reasonable to me, and makes me feel better about using FF.


Seems like the normal usage to me. The post above lists other criteria that have to be satisfied, beyond just being a Rust implementation. That would be the consideration.


Mozilla indicates that they are willing to consider it given various prerequisites. GP translates that to being “more than willing to adopt it”. That is very much not a normal interpretation.


From the link

> To address this concern, the team at Google has agreed to apply their subject matter expertise to build a safe, performant, compact, and compatible JPEG-XL decoder in Rust, and integrate this decoder into Firefox. If they successfully contribute an implementation that satisfies these properties and meets our normal production requirements, we would ship it.

That is a perfectly clear position.


How far along is Google's Rust version of JPEG-XL, if Chrome is not interested in it?


You can review it here: https://github.com/libjxl/jxl-rs

Seems to be under very active development.


1. “High resolution” in this kind of context is generally relative to previous work.

2. “Postage stamp sized” is not a resolution. Zoom in on them and you’ll see that they’re quite crisp.


I tried to do that, and now I’m just confused. The included glyph for the lower case n doesn’t actually fit the grid, so you can’t seem to replicate it. But also that grid doesn’t have enough resolution to do the tilde. Maybe I’m missing something?


Yeah, there are some sort of shenanigans going on in the editor. The premade letters use a finer grid than what the editor lets you work with.

It's most obvious with O, {, & and # which are impossible to draw with the grid that's presented to you.


I noticed that on desktop, the real grid, including the "half points", is shown and you can actually work with it. So it might just be a problem with the mobile version.


Enter an N or n into the Editing box and you'll see the two grids that make up the glyph, along with a blank third grid on the bottom; add a small tilde in the top two rows. Or copy and paste the actual Ñ or ñ characters into the Editing box to create it new, and you can use it immediately with the alphabet textbox on the left.


The editor doesn’t understand ñ and you can’t make a tilde because it requires a minimum of four points of width.


Yes, and those glyphs don’t fit the grid. Try to redraw the n from the original font yourself. You can’t, because you can’t add points between the grid dots.


WebGPU is the only one of those I’ve really followed, but hasn’t that had a huge amount of input and changes due to other voices in the working group? That seems to contradict the simplistic picture painted above of Google just dictating standards to the industry.


Would WebGPU exist at all if Chrome hadn't just pushed through with an implementation?

Who knows.

Not us, we’ll never know.


To add insult to injury, we probably would have gotten WebGL 2.0 Compute, which was initially done by Intel, if Chrome had not refused to ship it, arguing that WebGPU was right around the corner and that it would take too much space. This was about 5 years ago.

And to those rushing to point out the excuse that OpenGL on Mac lacked compute support: even back then, WebGL wasn't backed by OpenGL on all platforms; see Windows (DirectX) and PlayStation (LibGNM).

Also, Safari eventually moved its WebGL implementation from OpenGL to Metal, and Chrome did as well, replacing its WebGL backend to run on top of Metal on Mac.

So the state of OpenGL on Mac wasn't really that much of a problem as the "required" implementation layer for WebGL.


If you use base64 with the intention of hiding the encoded information, surely it’s as much a cipher as rot13 is, right?
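For what it's worth, both transforms are keyless and trivially reversible, which is the sense in which they're comparable. A quick Python illustration (the message string is just an example):

```python
import base64
import codecs

msg = "attack at dawn"

# Base64 is an encoding, but it obscures the text exactly as
# (in)effectively as rot13: anyone can undo it without a key.
b64 = base64.b64encode(msg.encode()).decode()
rot = codecs.encode(msg, "rot13")

# Both round-trip with no secret involved.
assert base64.b64decode(b64).decode() == msg
assert codecs.decode(rot, "rot13") == msg
```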


> How to calculate π for n-metrics numerically. The general idea of "divide the circle into segments and calculate the length by the metric" is explained, but the exact algorithm or formulas are not shown.

I feel like that would have been a bit in the weeds for the general pacing of this post, but you just convert each angle to a slope, then solve for y/x = that slope, and the metric from (0,0) to (x,y) equal to 1, right? Now you have a bunch of points and you just add up the distances.
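A minimal sketch of that idea in Python, assuming the metrics in question are the p-norms (function names are mine): for each sampled angle, scale the direction vector so its p-metric length is 1, then sum the segment lengths measured in that same metric and divide by the diameter (2).

```python
import math

def unit_circle_point(theta, p):
    # Point on the p-norm unit circle in the direction of angle theta:
    # take the Euclidean direction vector and rescale it so that the
    # p-metric distance from (0, 0) to (x, y) equals 1.
    x, y = math.cos(theta), math.sin(theta)
    r = (abs(x) ** p + abs(y) ** p) ** (1 / p)
    return x / r, y / r

def p_dist(a, b, p):
    # Distance between two points under the p-metric.
    return (abs(a[0] - b[0]) ** p + abs(a[1] - b[1]) ** p) ** (1 / p)

def pi_p(p, n=100_000):
    # Approximate the circumference as a polygon through n+1 sampled
    # points, then divide by the diameter (2) to get "pi" in this metric.
    pts = [unit_circle_point(2 * math.pi * i / n, p) for i in range(n + 1)]
    circumference = sum(p_dist(pts[i], pts[i + 1], p) for i in range(n))
    return circumference / 2

print(pi_p(2))  # ≈ 3.14159 (the Euclidean case recovers ordinary pi)
print(pi_p(1))  # ≈ 4 (taxicab metric)
```

For p = 2 this converges to the usual π, and for p = 1 the unit "circle" is a diamond whose taxicab perimeter is 8, giving π₁ = 4.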


Thanks


There’s an almost mystical belief among certain tech and science journalists that computers are bad at randomness, and it’s really bizarre, and in my opinion, pretty harmful.


> ... all believe that the universe is not destined to grow more disorganized forever, but more complex and rich with information.

Maybe it's just a problem of being loose with terminology, but this seems to be contrasting entropy and information content, which is backwards?


The usual handwaving for entropy v. information/complexity is to observe that our universe goes from simple low entropy state (Big Bang) to a simple maximum entropy state (Heat Death).

Both have low information. The complexity rises and falls, peaking somewhere in the middle, as energy from the gravitational field is turned into structure.

https://en.wikipedia.org/wiki/Weyl_curvature_hypothesis

Penrose likes zero initial Weyl curvature because it provides low entropy, but also conformal flatness, thus enabling his CCC theories.

Another consequence is that the Big Bang is not a reversed black hole (white hole). Black holes have high Weyl curvature. The Big Bang is the lowest entropy configuration, but a Black Hole is the maximum entropy configuration (just mass, spin and charges).


Entropy also has a unit of measure: joules per kelvin (J/K).

