Euclidean Distance


Euclidean distance measures the straight-line distance between two points in vector space, calculated as the length of the line connecting them. In text analysis, smaller distances indicate greater semantic similarity between documents.

However, most SEO applications use cosine similarity instead of Euclidean distance because Euclidean distance is sensitive to vector magnitude. Two vectors pointing in the same semantic direction can still be far apart if they differ in magnitude. For example, the vectors [3,4] and [6,8] point in exactly the same direction, yet the Euclidean distance between them is √((6−3)² + (8−4)²) = √25 = 5. Cosine similarity measures the angle between vectors while ignoring their lengths, making it more suitable for most text similarity tasks.
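The contrast above can be checked directly. This is a minimal sketch in plain Python (the function names are illustrative, not from any particular library): the two vectors are 5 units apart in Euclidean terms, yet their cosine similarity is a perfect 1.0 because they point in the same direction.

```python
import math

def euclidean_distance(a, b):
    # Straight-line distance: square root of summed squared differences
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine_similarity(a, b):
    # Angle-based similarity: dot product divided by the product of lengths
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

v1, v2 = [3, 4], [6, 8]
print(euclidean_distance(v1, v2))  # 5.0 -- far apart by distance
print(cosine_similarity(v1, v2))   # 1.0 -- identical direction
```

Here cosine similarity captures that the vectors are semantically identical, while Euclidean distance is dominated by the magnitude difference.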

Euclidean distance works better in low-dimensional spaces, such as when visualizing document clusters on 2D or 3D plots. When vectors are normalized to unit length, the two measures become interchangeable for ranking: the squared Euclidean distance equals 2 × (1 − cosine similarity), so sorting by either produces the same order. Even so, cosine similarity remains the convention in most SEO applications.
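The unit-length relationship can be verified numerically. This sketch normalizes two example vectors (the second vector, [1,7], is an arbitrary illustration) and confirms the identity d² = 2 × (1 − cosine similarity):

```python
import math

def normalize(v):
    # Scale a vector to unit length
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

a, b = normalize([3, 4]), normalize([1, 7])
# For unit vectors: squared distance == 2 * (1 - cosine similarity)
lhs = euclidean(a, b) ** 2
rhs = 2 * (1 - cosine(a, b))
print(abs(lhs - rhs) < 1e-9)  # True
```

This is why many vector databases normalize embeddings on ingestion: it lets them use whichever metric is faster while preserving cosine-based rankings.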

Source: AI Semantic SEO Expert, Robert Niechciał (sensai.io)