A clustering algorithm well suited to accurate clustering on small samples: hierarchical clustering. "Accurate" here means two things: on one hand, the clustering process is fully transparent at every step; on the other, for the data volumes typical of real application scenarios (on the order of thousands of rows), hierarchical clustering produces excellent clustering results.
First, the algorithm finds the two nearest data points and merges them into a cluster; that cluster's centroid then represents it when the next nearest pair is selected. In short, it keeps merging the closest pair of clusters until everything has been merged into a single cluster (or the desired number of clusters is reached).
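The merge loop described above can be sketched in plain NumPy. This is a simplified illustration assuming Euclidean distance and centroid representatives, not sklearn's exact implementation (which supports several linkage criteria):

```python
import numpy as np

def agglomerate(points, n_clusters):
    # Start with every point as its own cluster: (centroid, size) pairs.
    clusters = [(p.astype(float), 1) for p in points]
    while len(clusters) > n_clusters:
        # Find the pair of clusters whose centroids are closest.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = np.linalg.norm(clusters[i][0] - clusters[j][0])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        (ci, ni), (cj, nj) = clusters[i], clusters[j]
        # Merge: the new centroid is the size-weighted mean of the two.
        merged = ((ci * ni + cj * nj) / (ni + nj), ni + nj)
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters

# Two tight groups near 0 and near 5 merge as expected.
centroids = agglomerate(np.array([[0.0], [0.1], [5.0], [5.2]]), 2)
print(sorted(round(c[0][0], 2) for c in centroids))
```

The pairwise search makes this O(n³) overall, which is why it is practical for the small-to-medium data sets mentioned above but not for very large ones.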
In [2]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline
In [4]:
from sklearn.datasets import load_iris
iris = load_iris()
In [6]:
iris.data
In [7]:
iris.target
Out[7]:
In [8]:
from sklearn.cluster import AgglomerativeClustering
agClustering = AgglomerativeClustering(n_clusters=3)
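Besides `n_clusters`, `AgglomerativeClustering` accepts a `linkage` parameter that controls how inter-cluster distance is measured when choosing which pair to merge; `'ward'` (the default) minimizes within-cluster variance. A quick comparison on the same iris data, for illustration:

```python
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import load_iris

X = load_iris().data
for linkage in ("ward", "complete", "average", "single"):
    # fit_predict runs the merge process and returns one label per row.
    labels = AgglomerativeClustering(n_clusters=3, linkage=linkage).fit_predict(X)
    print(linkage, labels[:10])
```

Different linkage criteria can produce noticeably different clusterings on the same data, so it is worth trying more than one.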
In [9]:
agClustering.fit(iris.data)
Out[9]:
In [12]:
agClustering.labels_
Out[12]:
In [15]:
from sklearn.metrics import accuracy_score
In [16]:
accuracy_score(iris.target, agClustering.labels_)
Out[16]:
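One caveat about this evaluation: cluster IDs are arbitrary (cluster 0 need not correspond to class 0), so plain `accuracy_score` can badly under-report a good clustering when the labels happen to be permuted. The adjusted Rand index is invariant to label permutations; a small sketch on toy labels:

```python
from sklearn.metrics import accuracy_score, adjusted_rand_score

true_labels = [0, 0, 1, 1, 2, 2]
# Exactly the same grouping as true_labels, but with the cluster IDs permuted.
pred_labels = [1, 1, 2, 2, 0, 0]

print(accuracy_score(true_labels, pred_labels))       # 0.0 despite a perfect grouping
print(adjusted_rand_score(true_labels, pred_labels))  # 1.0
```

So `adjusted_rand_score(iris.target, agClustering.labels_)` is the safer way to score the clustering above.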