> and don’t think that the programmer more than the languages contribute to those problems
This sounds a lot like how I used to think about unit testing and type checking when I was younger and more naive. It also echoes the sentiments of countless craftspeople talking about safety protocols and features before they lost a body part.
Safety features can’t protect you from a bad programmer. But they can go a long way to protect you from the inevitable fallibility of a good programmer.
Multiple severe attacks on browsers over the years have targeted image decoders. Requiring an implementation in a memory safe language seems very reasonable to me, and makes me feel better about using FF.
Seems like the normal usage to me. The post above lists other criteria that have to be satisfied, beyond just being a Rust implementation. That would be the consideration.
Mozilla indicates that they are willing to consider it given various prerequisites. GP translates that to being “more than willing to adopt it”. That is very much not a normal interpretation.
> To address this concern, the team at Google has agreed to apply their subject matter expertise to build a safe, performant, compact, and compatible JPEG-XL decoder in Rust, and integrate this decoder into Firefox. If they successfully contribute an implementation that satisfies these properties and meets our normal production requirements, we would ship it.
I tried to do that, and now I’m just confused. The included glyph for the lower case n doesn’t actually fit the grid, so you can’t seem to replicate it. But also that grid doesn’t have enough resolution to do the tilde. Maybe I’m missing something?
Noticed that on desktop, the real grid, including the "half points", is shown and you can actually work with it. So it might just be a problem with the mobile version.
Enter an N or n into the Editing box and you'll see the two grids that make up the glyph, along with a blank third grid on the bottom; add a small tilde in the top two rows. Or copy and paste the actual Ñ or ñ characters into the Editing box to create it from scratch, and you can use it immediately with the alphabet textbox on the left.
Yes, and those glyphs don’t fit the grid. Try to redraw the n from the original font yourself. You can’t, because you can’t add points between the grid dots.
WebGPU is the only one of those I’ve really followed, but hasn’t that had a huge amount of input and changes due to other voices in the working group? That seems to contradict the simplistic picture painted above of Google just dictating standards to the industry.
To add insult to injury, we probably would have gotten WebGL 2.0 Compute, which was initially implemented by Intel, if Chrome had not refused to ship it, arguing that WebGPU was right around the corner and that it would take too much space. This was about 5 years ago.
And to those rushing to point out the excuse that OpenGL on Mac had no compute support: even back then, WebGL wasn't backed by OpenGL on all platforms; see Windows (DirectX) and PlayStation (LibGNM).
Eventually Safari moved its WebGL implementation from OpenGL to Metal, and Chrome did as well, replacing its WebGL backend with one running on top of Metal on Mac.
So the state of OpenGL on Mac as the "required" implementation layer for WebGL was never really that much of a problem.
> How to calculate π for n-metrics numerically. The general idea of "divide the circle into segments and calculate the length by the metric" is explained, but the exact algorithm or formulas are not shown.
I feel like that would have been a bit in the weeds for the general pacing of this post, but you just convert each angle to a slope, then solve for y/x = that slope, and the metric from (0,0) to (x,y) equal to 1, right? Now you have a bunch of points and you just add up the distances.
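For the curious, here's a minimal sketch of that procedure in Python, under the assumption that the "n-metrics" in question are the usual p-norms (the function name and segment count are mine):

    import math

    def pi_for_p_norm(p, segments=100000):
        # Unit "circle" for the p-norm: |x|^p + |y|^p = 1.
        # pi is half its circumference, measured with the same norm.
        def dist(dx, dy):
            return (abs(dx) ** p + abs(dy) ** p) ** (1.0 / p)

        # One point per angle: rescale each Euclidean direction so the
        # metric from (0,0) to (x,y) is exactly 1, as described above.
        pts = []
        for i in range(segments + 1):
            theta = math.pi * i / segments  # upper half suffices by symmetry
            x, y = math.cos(theta), math.sin(theta)
            r = dist(x, y)
            pts.append((x / r, y / r))

        # Add up the p-norm lengths of the chords between the points.
        return sum(dist(b[0] - a[0], b[1] - a[1])
                   for a, b in zip(pts, pts[1:]))

    print(pi_for_p_norm(1))  # ~4.0 (taxicab, up to discretization error)
    print(pi_for_p_norm(2))  # ~3.14159 (Euclidean)

Sanity check: for p=1 the unit circle is a diamond whose taxicab perimeter is 8, so π comes out as 4, and p=2 recovers the familiar value.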
There’s an almost mystical belief among certain tech and science journalists that computers are bad at randomness. It’s really bizarre and, in my opinion, pretty harmful.
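For what it's worth, every mainstream OS has shipped a cryptographically secure RNG for years, and high-level languages expose it in a couple of lines. A Python illustration (purely as an example, not anything from the article):

    import secrets

    # The kernel maintains a CSPRNG (getrandom(2) / /dev/urandom on
    # Linux), seeded from hardware events; Python's secrets module
    # simply exposes it.
    print(secrets.token_hex(16))      # 128 bits of randomness, hex-encoded
    print(secrets.randbelow(6) + 1)   # an unbiased die roll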
The usual handwaving for entropy vs. information/complexity is to observe that our universe goes from a simple low-entropy state (the Big Bang) to a simple maximum-entropy state (the Heat Death).
Both have low information. The complexity rises and falls, peaking somewhere in the middle, as energy from the gravitational field is turned into structure.
Penrose likes zero initial Weyl curvature because it provides low entropy, but also conformal flatness, thus enabling his CCC (Conformal Cyclic Cosmology) theories.
Another consequence is that the Big Bang is not a reversed black hole (a white hole). Black holes have high Weyl curvature. The Big Bang is the lowest-entropy configuration, but a black hole is the maximum-entropy configuration (characterized by just mass, spin, and charge).
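For reference, the two textbook statements behind this contrast (my addition; standard formulas rather than anything from the comment):

    % Weyl curvature hypothesis (Penrose): the Weyl tensor vanishes at
    % the initial singularity -- conformally flat, low gravitational entropy.
    \[ C_{abcd} \to 0 \quad \text{as } t \to 0^{+} \]

    % Bekenstein--Hawking entropy: a black hole with horizon area A is
    % the maximum-entropy configuration for its size (no-hair theorem:
    % only mass, spin, and charge remain).
    \[ S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4 G \hbar} \]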