# Distance Measures

In this short blog post I am going to summarize content from Mining of Massive Datasets, more precisely the part of Chapter 3 that describes the concept of distance.

Beyond all the formal definitions of measure/metric and distance, we might pick Wikipedia's very first "informal definition" as a good starting point:

> A distance is a numerical measurement of how far apart [two] objects are.

A "more abstract" mathematical definition follows:

> In mathematics, a metric or distance function is a function that defines a distance between each pair of elements of a set.

## A "Formal" Definition of Distance Measure

Given this short conceptual introduction, we should provide an overview of the properties a generic function must satisfy in order to be called a distance function. Suppose we have a set of points, called a space. A distance function on this space is a function $d(x,y)$ that, given two points $x$ and $y$, produces a real number and satisfies the following axioms:

- $d(x,y) \geq 0$: a distance measure cannot be negative.
- $d(x,y) = 0$ if and only if $x=y$: a distance measure is always positive, except that the distance from a point to itself is $0$.
- $d(x,y) = d(y,x)$: a distance function is symmetric.
- $d(x,y) \leq d(x,z) + d(z,y)$: a distance function always satisfies the triangle inequality.

The last axiom can be intuitively explained using basic geometry. If we consider two points $x$ and $y$ and define our distance function as the length of the shortest possible line (path) drawn from $x$ to $y$, we clearly get no benefit from passing through any other point $z$ on the way from $x$ to $y$. The best-case scenario is when $z$ lies on the shortest line between $x$ and $y$, adding neither a benefit nor a penalty to our distance path.
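As a quick numerical sanity check, the four axioms can be verified on a handful of sample points. The sketch below uses `math.dist`, Python's built-in Euclidean distance, as the distance function $d$; the points are arbitrary examples:

```python
import itertools
import math

# A few arbitrary sample points in the plane
points = [(0, 0), (3, 4), (-1, 2), (5, -2)]

for x, y in itertools.product(points, repeat=2):
    d = math.dist(x, y)
    assert d >= 0                            # non-negativity
    assert (d == 0) == (x == y)              # d(x,y) = 0 iff x = y
    assert math.isclose(d, math.dist(y, x))  # symmetry

# Triangle inequality over all triples (with a tiny tolerance
# for floating-point rounding)
for x, y, z in itertools.product(points, repeat=3):
    assert math.dist(x, y) <= math.dist(x, z) + math.dist(z, y) + 1e-12
```

If any axiom were violated, one of the assertions would fail; for the Euclidean distance they all hold.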

## Euclidean Distances

The Euclidean distance represents the most common concept of distance; it is the one most familiar from daily life.
An $n$-dimensional Euclidean space is one where points are represented as vectors of $n$ real numbers.

### $L_2$-Norm Distance AKA The Euclidean Distance

The most conventional distance, the Euclidean distance, also referred to as the $L_2$-Norm Distance, is defined as follows:
$$d(x,y)= d([x_1, x_2, \cdots,x_n], [y_1, y_2,\cdots,y_n]) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$$
This distance satisfies all four axioms mentioned above. In Euclidean spaces the concept of distance can be generalized such that for any constant $r \geq 1$ we can define the $L_r$-Norm Distance $d$ as:
$$d(x,y)= d([x_1, x_2, \cdots,x_n], [y_1, y_2,\cdots,y_n]) = \left(\sum_{i=1}^{n} |x_i - y_i|^r\right)^{1/r}$$

### $L_1$-Norm Distance AKA Manhattan Distance

Another common measure is the $L_1$-Norm Distance, more often referred to as the Manhattan Distance. It is the sum of the magnitudes of the differences in each dimension:
$$d(x,y)= d([x_1, x_2, \cdots,x_n], [y_1, y_2,\cdots,y_n]) = \sum_{i=1}^{n} |x_i - y_i|$$

### $L_\infty$-Norm Distance

It is the limit as $r$ tends to $\infty$ of the $L_r$-Norm Distance. As $r$ gets larger and larger, only the dimension with the largest difference really matters (it becomes dominant); thus, formally, the $L_\infty$-Norm Distance is defined as the maximum of $|x_i-y_i|$ over all dimensions $i$:
$$d(x,y)= d([x_1, x_2, \cdots,x_n], [y_1, y_2,\cdots,y_n]) = \max_{i}{(|x_i-y_i|)}$$

A pure Python 3 implementation of the $L_r$-Norm Distances follows:

```python
import numbers


def euclidean_distance(x, y, r=2):

    # Case wrong r value
    if r != 'inf' and r < 1:
        raise ValueError('r must be greater than or equal to 1 (r >= 1)')

    # Case two vectors have different lengths
    if len(x) != len(y):
        raise ValueError('Input Vector Length Mismatch')

    # Case both vectors have length 0
    if len(x) == 0:
        return 0.0

    # Case values are not numeric
    if not all(isinstance(v, numbers.Number) for v in x) or \
       not all(isinstance(v, numbers.Number) for v in y):
        raise TypeError('Euclidean Distance is computed'
                        ' over numerical data only')

    # Case L_infty-Norm Distance
    if r == 'inf':
        return max(abs(x_i - y_i) for x_i, y_i in zip(x, y))

    # Compute the L_r-Norm Distance
    return sum(abs(x_i - y_i)**r for x_i, y_i in zip(x, y))**(1 / r)
```

```python
>>> euclidean_distance([1,2,3], [1,2.5,3.5], 1)
1.0
>>> euclidean_distance([1,2,3], [1,2.5,3.5], 2)
0.7071067811865476
>>> euclidean_distance([1,2,3], [1,2.5,3.5], 3)
0.6299605249474366
>>> euclidean_distance([1,2,3], [1,2.5,3.5], 'inf')
0.5
```


It is interesting to note that, as $r$ increases, the value of the distance slowly approaches the $L_\infty$-Norm Distance (in this case $0.5$):

```python
>>> [euclidean_distance([1,2,3], [1,2.5,3.5], r) for r in range(1, 1000)]
[1.0,
 0.7071067811865476,
 0.6299605249474366,
 0.5946035575013605,
 0.5743491774985174,
 ...
 0.5003480865601369,
 0.5003477373047961,
 0.5003473887496088,
 0.5003470408924718]
```


These implementations are to be considered for illustration purposes only; refer to numpy.linalg or other libraries for production-ready code.

## Jaccard Distance

The Jaccard distance is a metric to compute the distance between two sets. It builds on the Jaccard similarity, defined for two sets $X$ and $Y$ as the ratio of the cardinality (size) of their intersection over the cardinality of their union:
$$SIM(X,Y) = \frac{|X\cap Y|}{|X\cup Y|}$$
The Jaccard distance is then defined as one minus the Jaccard similarity, and its value ranges between $0$ and $1$:
$$d(X,Y) = 1 - \frac{|X\cap Y|}{|X\cup Y|}$$

A pure Python 3 implementation of the Jaccard Distance follows:

```python
def jaccard_distance(x, y):
    # Accept any iterable (e.g. a list) by converting the inputs to sets
    x, y = set(x), set(y)

    # Avoid a division by 0: two empty sets are considered identical
    if not x and not y:
        return 0.0
    return 1 - len(x & y) / len(x | y)
```

```python
>>> jaccard_distance([1,2,4], [2,3,4])
0.5
>>> jaccard_distance([1,2,4], [1,2,4])
0.0
```


## Cosine Distance

In the context of the Cosine Distance we interpret points as directions from the origin of the plane to the points themselves. In this scenario we measure the distance between two vectors as $1 - \cos(\theta)$, where $\cos(\theta)$ is the cosine of the angle $\theta$ between the two directions.

Cosine distance is generally used as a metric when the magnitude of the vectors does not really matter; this often happens when working with NLP (Natural Language Processing) text data represented by word counts. The length of the directions does not alter the distance measure; thus, multiplying a vector by a constant does not impact the distance at all. The cosine distance (based on the cosine similarity) can be defined as:
$$d(x,y) = 1 - \frac {x \cdot y}{||x||_2 ||y||_2}$$

A pure Python 3 implementation of the Cosine Distance follows:

```python
import numbers


def cosine_distance(x, y):

    # Case two vectors have different lengths
    if len(x) != len(y):
        raise ValueError('Input Vector Length Mismatch')

    # Case both vectors have length 0
    if len(x) == 0:
        return 0.0

    # Case values are not numeric
    if not all(isinstance(v, numbers.Number) for v in x) or \
       not all(isinstance(v, numbers.Number) for v in y):
        raise TypeError('Cosine Distance is computed'
                        ' over numerical data only')

    # If one of the vectors has 0 magnitude then d is not defined
    if all(v == 0 for v in x) or all(v == 0 for v in y):
        return None

    # Numerator: Dot Product
    num = sum(x_i * y_i for x_i, y_i in zip(x, y))
    # Denominator: Product of the L_2 Norms
    den = sum(x_i * x_i for x_i in x)**.5 * sum(y_i * y_i for y_i in y)**.5

    return 1 - num / den
```

The following example shows that scaling a vector does not impact the cosine distance measure:

```python
>>> cosine_distance([1,2,3], [1,2,0])
0.4023856953328032
>>> cosine_distance([1,2,3], [4,8,0])
0.4023856953328032
```


## Edit Distance

This kind of distance comes into play when the data points are strings. Indeed, it is not as trivial as with real numbers to compute a distance between strings. What is the magnitude of a string? Maybe its length? The edit distance between two strings $x=x_1x_2\cdots x_n$ and $y=y_1y_2\cdots y_m$ can be defined as the smallest number of insertions and deletions of single characters that will convert $x$ into $y$.

Another way to define the edit distance $d(x,y)$ between two strings is via their longest common subsequence (LCS). Given the LCS of the two strings $x$ and $y$, the edit distance can be computed as the length of $x$ plus the length of $y$ minus twice the length of their LCS. Denoting by $|x|$ the length of a string, we can express the edit distance as:
$$d(x,y) = |x| + |y| - 2\cdot|LCS(x,y)|$$

A similar distance measure which takes into account not only insertions and deletions but also substitutions is the Levenshtein Distance.
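The LCS-based formula above translates directly into code. The sketch below computes the LCS length with the classic dynamic-programming table and derives the edit distance from it (`lcs_length` and `edit_distance` are illustrative names, not from the book):

```python
def lcs_length(x, y):
    """Length of the Longest Common Subsequence of two strings."""
    m, n = len(x), len(y)
    # table[i][j] = LCS length of the prefixes x[:i] and y[:j]
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                # Matching characters extend the LCS of the shorter prefixes
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                # Otherwise keep the best LCS found so far
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]


def edit_distance(x, y):
    # d(x, y) = |x| + |y| - 2 * |LCS(x, y)|
    return len(x) + len(y) - 2 * lcs_length(x, y)
```

For example, `edit_distance('abcde', 'acfdeg')` returns 3 (delete `b`, insert `f` and `g`), since the LCS `acde` has length 4 and $5 + 6 - 2\cdot4 = 3$.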

## Hamming Distance

Given a space of vectors, it is possible to define the Hamming Distance between two vectors as the number of components in which they differ:
e.g., if $x = [1,2,3,4]$ and $y=[0,2,2,4]$ then $d(x,y) = 2$.
It is most often used when handling boolean vectors; nevertheless, it is possible to apply this metric to vectors with components from any set.

A pure Python 3 implementation of the Hamming Distance follows:

```python
def hamming_distance(x, y):
    # Count the components in which the two vectors differ
    return sum(x_i != y_i for x_i, y_i in zip(x, y))
```

```python
>>> hamming_distance([1,2,3], [1,2,3])
0
>>> hamming_distance([1,3,4], [1,2,3])
2
```


## Non-Euclidean Distances

It is important to note that some of the distance measures mentioned above are not defined on Euclidean spaces. An important property of Euclidean spaces is that the average of points always exists; this does not hold for sets: it is not possible to compute the average of two sets, and the same limitation applies to strings. Vector spaces where we define the cosine distance may or may not be Euclidean depending on the components of the vectors: for example, if we restrict the components to be integers, then the space is not Euclidean; with real numbers it forms a Euclidean space.
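To make the averaging point concrete, here is a minimal sketch (with made-up sample points) showing that the centroid of points is well defined in a Euclidean space, component by component, while no analogous operation exists for sets or strings:

```python
# Centroid of a list of points in an n-dimensional Euclidean space:
# average each component independently.
points = [(1.0, 2.0), (3.0, 4.0), (5.0, 0.0)]
dim = len(points[0])

centroid = tuple(sum(p[i] for p in points) / len(points) for i in range(dim))
# centroid -> (3.0, 2.0)

# There is no analogous "average" of two sets such as {1, 2} and {2, 3},
# nor of two strings such as "abc" and "abd".
```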
