Clustering consists of two steps. Agglomerative hierarchical clustering initially assumes that every time series forms its own cluster. It then grows a clustering dendrogram from two inputs:

",
+ "First, a **dissimilarity matrix** between all pairs ",
"of time series is calculated with one of the metrics, such as ",
"Euclidean (L2 norm) ",
"or Manhattan (L1 norm) distance. ",
"Dynamic Time Warping (DTW) ",
- "also quantifies similarity between two time series but ",
- "contrary to other distance measures it accounts for the order of time points.

In the second step, distances are arranged hierarchicaly and visualised as a dendrogram ", + "is another distance metric that does not only compare series point by point but also tries to align them such that shapes between the 2 series are matched. ", + "This makes DTW a good quantification of similarity when signals are similar but shifted in time.
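The alignment idea behind DTW can be illustrated with a minimal sketch (a textbook-style dynamic-programming implementation, not taken from the original text); the function name `dtw_distance` and the toy series are illustrative assumptions:

```python
import numpy as np

def dtw_distance(x, y):
    """Dynamic Time Warping distance between two 1-D series.

    Unlike the Euclidean distance, DTW may match one point in x
    to several points in y (and vice versa), aligning shapes that
    are similar but shifted in time.
    """
    n, m = len(x), len(y)
    # cost[i, j] = minimal cumulative cost of aligning x[:i] with y[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # stretch x
                                 cost[i, j - 1],      # stretch y
                                 cost[i - 1, j - 1])  # step both
    return cost[n, m]

# Two identical peaks, one shifted in time by one step:
a = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 1.0, 2.0, 1.0, 0.0, 0.0])

print(np.linalg.norm(a - b))  # Euclidean distance penalises the shift: 2.0
print(dtw_distance(a, b))     # DTW aligns the peaks: 0.0
```

Because DTW warps the time axis before comparing points, the shifted peak contributes nothing to the distance, whereas the point-by-point Euclidean comparison treats the shift as a real difference.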

", + "In the second step, clusters are successively built and merged together. The distance between the newly formed clusters is determined by the **linkage criterion** ",
"using one of linkage methods.