Hacker News | Maxious's comments

Ability to virtualize on Apple devices and Linux with GPUs: https://github.com/scipy/scipy/issues/24990

Some devs did get the email and follow the process and still got kicked out

> Don’t let anyone tell you it’s because we didn’t read our emails or submit the right verification paperwork. Cuz we did all that back in October.

> And this month, we were suddenly and without any warning locked out.

https://x.com/OSRDrivers/status/2042286973461709183


It's got a little Zig mystery blob that does the hashing. Messing with that would run afoul of the DMCA's anticircumvention provisions, right?

Things can be worse - this decision means 2 out of the 3 principals of the Vienna School of Agentic Coding have not sold out to OpenAI


> Effective immediately, all S3 GET, PUT, and LIST operations, as well as operations that change object tags, ACLs, or metadata, are now strongly consistent.

https://aws.amazon.com/blogs/aws/amazon-s3-update-strong-rea...


We're winning so hard we're sick of winning. So satellite imagery is now banned because of how hard we're winning https://www.reuters.com/business/media-telecom/satellite-fir...

One of the issues being explored is that although US radar is aged, surface vehicles can be equipped with ASDE-X transponders to be more visible to ATC systems. https://www.faa.gov/air_traffic/technology/asde-x

The vehicle that crashed into the plane did not have one and thus no automated alert was triggered.



I like the visualization, but I don’t understand the grid quantization. If every point is on the unit circle, aren’t all the center grid coordinates unused?


Yeah that's odd. It seems like you'd want an n-1 dimensional grid on the surface of the unit sphere rather than an n dimensional grid within which the sphere resides.

Looking at the paper (https://arxiv.org/abs/2504.19874) they cite earlier work that does exactly that. They object that grid projection and binary search perform exceptionally poorly on the GPU.

I don't think they're using a regular grid as depicted on the linked page. Equation 4 from the paper is how they compute centroids for the MSE optimal quantizer.

Why specify MSE optimal, you ask? It turns out there are actually two quantization steps, a detail also omitted from the linked page: they apply QJL quantization to the residual of the grid-quantized data.

My description is almost certainly missing key details; I'm not great at math and this is sufficiently dense to be a slog.
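To make that two-step structure concrete, here's a toy NumPy sketch (my own simplification, not the paper's method — the real residual step uses QJL and the grid uses MSE-optimal centroids; this just shows the coarse-quantize-then-quantize-the-residual shape):

```python
import numpy as np

rng = np.random.default_rng(0)

def coarse_quantize(x, levels):
    # Step 1: snap each coordinate to the nearest centroid in `levels`.
    idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]

def residual_quantize(r):
    # Step 2 (crude stand-in for QJL): 1-bit sign quantization of the
    # residual, rescaled by its mean magnitude.
    return np.sign(r) * np.abs(r).mean()

levels = np.linspace(-1, 1, 8)      # toy uniform grid, 3 bits/coordinate
x = rng.standard_normal(16)
x /= np.linalg.norm(x)              # unit vector

coarse = coarse_quantize(x, levels)
fine = coarse + residual_quantize(x - coarse)

# The two-step reconstruction is never worse than the coarse grid alone.
assert np.linalg.norm(x - fine) <= np.linalg.norm(x - coarse)
```

The sign-and-scale residual step always helps here because subtracting the mean residual magnitude from each |r_i| strictly reduces the summed squared error whenever the residual is nonzero.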


Yes, great catch. I simplified the grid just for visualization purposes.

I've updated the visualization. The grid is actually not uniformly spaced. Each coordinate is quantized independently using optimal centroids for the known coordinate distribution. In 2D, unit-circle coordinates follow the arcsine distribution (concentrating near ±1), so the centroids cluster at the edges, not the center.
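For anyone curious, a minimal sketch of that per-coordinate step (a generic Lloyd-Max iteration of my own, not the exact procedure from the post): the MSE-optimal centroids for arcsine-distributed coordinates really do crowd toward ±1 rather than spacing uniformly.

```python
import numpy as np

rng = np.random.default_rng(42)

# Coordinates of random points on the unit circle: x = cos(theta) is
# arcsine-distributed on [-1, 1], with mass piling up near the endpoints.
samples = np.cos(rng.uniform(0, 2 * np.pi, 100_000))

# Lloyd-Max iteration for (approximately) MSE-optimal 1-D centroids.
centroids = np.linspace(-0.9, 0.9, 8)
for _ in range(50):
    idx = np.abs(samples[:, None] - centroids[None, :]).argmin(axis=1)
    centroids = np.array([samples[idx == k].mean() for k in range(len(centroids))])

# Optimal cells are narrow where density is high: the gap between the two
# outermost centroids is smaller than the gap between the two nearest zero.
gaps = np.diff(np.sort(centroids))
assert gaps[0] < gaps[len(gaps) // 2]
```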


Cool! Thank you


I think the grid could be on the surface of the unit sphere.


Is there an error in the visualization? It shows that every vector is rotated the same amount. My understanding was that they are randomized with different values, which results in a predictable distribution, which is easier to quantize.


That's actually correct and intentional. TurboQuant applies the same rotation matrix to every vector. The key insight is that any unit vector, when multiplied by a random orthogonal matrix, produces coordinates with a known distribution (Beta/arcsine in 2D, near-Gaussian in high-d). The randomness is in the matrix itself (generated once from a seed), not per-vector. Since the distribution is the same regardless of the input vector, a single precomputed quantization grid works for everything. I've updated the description to make this clearer.
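A toy NumPy sketch of that idea (my own illustration, not TurboQuant code): one seeded orthogonal matrix spreads the energy of even a maximally structured unit vector across all coordinates while preserving its norm.

```python
import numpy as np

d = 256
rng = np.random.default_rng(1234)   # the seed fixes the matrix once, for all vectors

# Random orthogonal matrix via QR of a Gaussian matrix (sign-correcting
# R's diagonal would make it exactly Haar-uniform; close enough here).
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

# A highly structured unit vector: all of its energy in one coordinate.
e0 = np.zeros(d)
e0[0] = 1.0

rotated = Q @ e0
# Norm is preserved, but no single coordinate dominates anymore: each one
# is approximately Gaussian with variance 1/d.
assert np.isclose(np.linalg.norm(rotated), 1.0)
assert np.abs(rotated).max() < 0.5
```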


Thanks. However, from this visualization it's not clear how the random rotation is beneficial. I guess it makes more sense for higher-dimensional vectors.


Yes, this is important in high dimensions. But sadly, it's very hard to visualize. In 2D it looks unnecessary.


I believe they are all rotated by the same random matrix, the purpose being (IIUC) to distribute the signal evenly across all dimensions. So effectively it drowns any structure that might be present in noise. That's essential for data efficiency in addition to avoiding bias related issues during the initial quantization step. However there are still some other issues due to bias that are addressed by a second quantization step involving the residual.

That said, I don't believe the visualization is correct. The grid for one doesn't seem to match what's described in the paper.

Also it's entirely possible I've misunderstood or neglected to notice key details.


Good post, but the link at the end is broken.

> For the full technical explanation with equations, proofs, and PyTorch pseudocode, see the companion post: TurboQuant: Near-Optimal Vector Quantization Without Looking at Your Data.


Author here. Sorry, still working on refining the post. Will share it once it's ready.


Awesome! So it nudges the vectors into stepped polar rays... it's effectively angle snapping, plus a sort of magnitude clustering?


> we did our best to convince anthropic to support developer choice but they sent lawyers

https://x.com/i/status/2034730036759339100


Anthropic has zero problems with API billing, there's no chance they told him to rip that out.

Reading through his X and GitHub comments, he is behaving immaturely. I don't trust what he's saying here; ripping out Claude API support was just throwing a tantrum. Weird given his age: he's old enough to know better.


cio.gov and trumpcard.gov are also Cloudflare.

Cloudflare has proudly protected Trump campaign websites for the last decade https://www.businessinsider.com/cloudflare-ceo-anonymous-ddo...


Okay or maybe they won the contract for `.gov` in 2023: https://www.cloudflare.com/press/press-releases/2023/cloudfl...

