It's the same as "bird dogging" or wholesaling in real estate, or really any of a number of other middleman or wholesale businesses that do the hard work of finding the deal but don't necessarily have the cash, or the desire, to run the business end to end.
>but not compared to similar businesses of their capital value
So? It's not like the billions hyperscalers are spending on GPUs would otherwise be spent on 10 car factories or whatever. The only reason the billions were being invested in the first place is that there's a craze for AI datacenters.
>Traders following the investments disclosed by Scion’s over the last 3 years (between May of 2020 and May 2023) would have made annualized returns of 56% according to an analysis by Sure Dividend
Seems like Scion Capital could have just disclosed winning trades that they may or may not have actually made?
They've used their previous motors in production Ferraris and Koenigseggs, and also in aircraft. They have the capability to make 100,000 motors a year, so this is definitely not just lab stuff!
Yup, if you're using OpenCV, for instance, compiling from source instead of using pre-built binaries can result in 10x or more speed-ups once you take into account AVX, threading, math/BLAS libraries, etc.
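If you want to check what your current build actually enabled before bothering to recompile, OpenCV exposes its compile-time configuration at runtime. A minimal sketch (the exact section names in the report vary by OpenCV version, so the keyword filtering below is just a heuristic):

```python
import cv2

# Dump OpenCV's compile-time configuration: compiler flags, CPU baseline/dispatch,
# parallel framework, BLAS/LAPACK, IPP, etc.
info = cv2.getBuildInformation()
print(info)

# Quick scan for a few performance-relevant entries; wording differs between
# versions, so treat missing matches as "check the full report manually".
for keyword in ("CPU/HW features", "Parallel framework", "LAPACK", "IPP"):
    hits = [line.strip() for line in info.splitlines() if keyword in line]
    print(keyword, "->", hits or "not found in report")
```

If the report shows no SIMD dispatch, a sequential parallel framework, or no BLAS/LAPACK backend, that's usually where the easy wins from a source build come from.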
For 768 dimensions, you'd still expect to hit a 1/N event with a few billion samples, though. That's a 1/N of about 0.13%, which frankly isn't that rare at all?
Of course our vectors are not just points on a single coordinate axis, but 1/768 still isn't that small compared to billions of samples.
Bear in mind that these are not basis vectors at this stage (which would indeed give you 1/768). They are arbitrary linear combinations. There are exponentially many nearly orthogonal vectors of this kind for small epsilon, and epsilon is chosen pretty small in the paper.
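A quick simulation illustrates the point (the dimension 768 matches the discussion above, but the sample count and epsilon below are illustrative placeholders, not the paper's setup): cosine similarities between random directions in high dimensions concentrate around 0 with spread roughly 1/sqrt(d), so vast numbers of them are pairwise "nearly orthogonal" for a modest epsilon.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 768, 2000  # dimension and number of random directions (illustrative values)

# Random unit vectors: arbitrary directions, not basis vectors.
v = rng.standard_normal((n, d))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Pairwise cosine similarities, off-diagonal entries only.
cos = v @ v.T
off_diag = cos[~np.eye(n, dtype=bool)]

# These concentrate around 0 with std ~ 1/sqrt(768) ~ 0.036, so for a
# tolerance like eps = 0.1 almost every pair counts as "nearly orthogonal".
eps = 0.1
print("mean |cos|:", np.abs(off_diag).mean())
print("std  cos  :", off_diag.std())
print("fraction with |cos| < eps:", (np.abs(off_diag) < eps).mean())
```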