
Instead of sums of multiplications, you could for example use the sum of squared differences.

Mean squared error instead of dot product; it's not cheaper, but it's close.

If you want to go cheaper, you could use the sum of absolute differences.
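A minimal sketch of the three measures mentioned above, in plain Python with hypothetical toy vectors (the distance-based ones are negated so that larger always means more similar):

```python
# Toy vectors, purely for illustration.
a = [0.1, 0.8, 0.3]
b = [0.2, 0.7, 0.4]

def dot(u, v):
    # Dot product: sum of multiplications.
    return sum(x * y for x, y in zip(u, v))

def neg_sq_diff(u, v):
    # Sum of squared differences, negated so larger = more similar.
    return -sum((x - y) ** 2 for x, y in zip(u, v))

def neg_abs_diff(u, v):
    # Sum of absolute differences: no multiplications, so cheaper
    # on hardware where multiply costs more than add/abs. Negated.
    return -sum(abs(x - y) for x, y in zip(u, v))
```

All three produce a ranking over candidate vectors; whether they produce the *same* ranking depends on the vectors' norms, which the reply below gets at.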



This is effectively "the same" as the dot product.

For a lot of the embeddings we have today, every embedding vector has roughly the same norm. In that case the squared length of the difference you describe is a monotonic function of the dot product: after scaling to unit norm, ||a − b||² = 2 − 2(a·b), so it can be expressed in terms of 1 − dot product.
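The identity is easy to check numerically. A small sketch, using stdlib Python and randomly generated vectors normalized to unit length:

```python
import math
import random

def normalize(v):
    # Scale v to unit Euclidean norm.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

random.seed(0)
a = normalize([random.gauss(0, 1) for _ in range(8)])
b = normalize([random.gauss(0, 1) for _ in range(8)])

sq_dist = sum((x - y) ** 2 for x, y in zip(a, b))
dot_ab = sum(x * y for x, y in zip(a, b))

# For unit vectors: ||a - b||^2 = 2 - 2 (a . b),
# since ||a - b||^2 = ||a||^2 + ||b||^2 - 2 (a . b) = 1 + 1 - 2 (a . b).
assert abs(sq_dist - (2 - 2 * dot_ab)) < 1e-12
```

So ranking by smallest squared distance and ranking by largest dot product give the same order when norms are equal.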



