
I think he is talking about the 2024 arXiv paper by some Netflix (?) researchers which argues that it's often better not to normalize the embeddings (so instead of cosine similarity you just use the raw dot product).

For most commercial embeddings (OpenAI etc.) this is not a problem, as the embeddings are already normalized to unit length, so the dot product and cosine similarity coincide anyway.
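A minimal sketch of that point, using NumPy and made-up toy vectors (not from any real embedding API):

    import numpy as np

    # Hypothetical embedding vectors, just to illustrate.
    a = np.array([0.3, 0.7, 0.2])
    b = np.array([0.1, 0.9, 0.4])

    # Raw dot product: sensitive to the vectors' magnitudes.
    dot = a @ b

    # Cosine similarity: dot product of the L2-normalized vectors.
    cos = (a / np.linalg.norm(a)) @ (b / np.linalg.norm(b))

    print(dot, cos)  # different in general

    # If the embeddings are already unit-norm, normalizing is a no-op,
    # so cosine similarity and the dot product give the same number.
    a_hat = a / np.linalg.norm(a)
    b_hat = b / np.linalg.norm(b)
    cos_hat = (a_hat / np.linalg.norm(a_hat)) @ (b_hat / np.linalg.norm(b_hat))
    assert np.isclose(a_hat @ b_hat, cos_hat)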


