Tag: vector embeddings

  • Score Normalization

    Currently, I have two different scoring functions: BM25 and the semantic score derived from our sentence embedding. These scores have very different ranges, yet they need to be combined into a final score. It’s not simply a matter of assigning different weights to these scores. We need to stretch them out to make…
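    One common way to stretch scores with different ranges onto a common scale is min-max normalization. The sketch below uses made-up BM25 and cosine-similarity scores and a hypothetical weight `alpha`; it is one plausible approach, not necessarily the one the post settles on.

```python
def min_max(scores):
    """Rescale a list of raw scores onto [0, 1]."""
    lo, hi = min(scores), max(scores)
    if hi == lo:                      # all scores equal: treat as neutral
        return [0.5] * len(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def combine(bm25_scores, semantic_scores, alpha=0.5):
    """Weighted sum of normalized scores; alpha weights the BM25 side."""
    b = min_max(bm25_scores)
    s = min_max(semantic_scores)
    return [alpha * bi + (1 - alpha) * si for bi, si in zip(b, s)]

# Hypothetical raw scores for three documents.
bm25 = [12.4, 3.1, 7.8]     # BM25 is unbounded above
sem  = [0.82, 0.79, 0.91]   # cosine similarity lies in [-1, 1]

final = combine(bm25, sem, alpha=0.6)
```

    After normalization both score lists occupy [0, 1], so a single weight can trade them off meaningfully.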

  • Vector Retrieval

    Now that every document has been assigned a vector encoding its semantics, this opens the door to a new kind of retrieval. Rather than finding documents that might be relevant to the query by searching the index for keywords, we can instead take the query’s vector and find nearby documents in the embedding space.…
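    In its simplest form, this kind of retrieval is a nearest-neighbor search: rank documents by the similarity of their vectors to the query vector. A minimal sketch with made-up 4-dimensional vectors, using cosine similarity (real systems use approximate indexes for scale):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query_vec, doc_vecs, k=2):
    """Return indices of the k documents nearest to the query."""
    sims = [(cosine(query_vec, d), i) for i, d in enumerate(doc_vecs)]
    sims.sort(reverse=True)
    return [i for _, i in sims[:k]]

docs = [
    [0.90, 0.10, 0.00, 0.20],   # doc 0: points roughly along the query
    [0.10, 0.80, 0.30, 0.00],   # doc 1: points elsewhere
    [0.85, 0.15, 0.05, 0.10],   # doc 2: also near the query direction
]
query = [1.0, 0.0, 0.0, 0.1]
top = retrieve(query, docs, k=2)   # docs 0 and 2 outrank doc 1
```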

  • Vector Embeddings

    Search engines make use of AI to improve their search results. There are AI models that can understand the meaning of a sentence or document. They often present their results as embeddings of the document space into a vector space: D → ℝⁿ. These are called vector embeddings. Once you’ve found the embeddings for your…
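    Concretely, an embedding is just a function mapping every document to a fixed-length vector. The toy stand-in below uses a hypothetical five-word vocabulary and mere term counts; a real model (e.g. a sentence transformer) would instead produce dense vectors that capture meaning, but the shape of the interface is the same.

```python
# Toy vocabulary; n = len(VOCAB) is the dimension of the target space.
VOCAB = ["search", "engine", "vector", "embedding", "cat"]

def embed(doc: str) -> list[float]:
    """Map a document into R^n by counting vocabulary terms."""
    words = doc.lower().split()
    return [float(words.count(term)) for term in VOCAB]

v = embed("A search engine maps each search query to a vector")
# Every document, whatever its length, lands in the same R^5.
```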