Implementing SVM with the sklearn Library (Part 1)

**SVM (Support Vector Machine)**
1. It solves binary classification problems.
2. The central question: what kind of decision boundary is best?
3. Find a line (or hyperplane) such that the points closest to it are as far away from it as possible. Those closest points are called support vectors, and the line is the decision boundary (formalized below).

In the figure, the circled points are the support vectors and the middle line is the decision boundary.
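Concretely, "make the nearest points as far away as possible" is the hard-margin objective. A minimal statement, assuming linearly separable data with labels $y_i \in \{-1, +1\}$:

$$
\min_{w,\,b}\ \frac{1}{2}\lVert w \rVert^2
\quad \text{s.t.} \quad y_i\,(w^\top x_i + b) \ge 1,\ \ i = 1, \dots, n
$$

Maximizing the margin $2/\lVert w \rVert$ is equivalent to minimizing $\tfrac{1}{2}\lVert w \rVert^2$, which is why the problem is usually written this way.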

Code
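The snippets below assume a simple two-class toy dataset plus the usual imports. A minimal setup; the exact make_blobs parameters are an assumption, chosen to be consistent with the support vectors printed in step 3:

```
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs

# a linearly separable two-class toy dataset
X, y = make_blobs(n_samples=50, centers=2, random_state=0, cluster_std=0.60)
```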

1. Train a basic linear SVM:

```
from sklearn.svm import SVC  # "Support vector classifier"
model = SVC(kernel='linear')
model.fit(X, y)
```

This prints the fitted model:

```
SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0,
    decision_function_shape=None, degree=3, gamma='auto', kernel='linear',
    max_iter=-1, probability=False, random_state=None, shrinking=True,
    tol=0.001, verbose=False)
```

You can read off the important hyperparameters here, such as C, gamma, and kernel.
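If you only need a few of these hyperparameters, `get_params()` (part of the standard scikit-learn estimator API) returns them as a dict; a quick sketch:

```
# read selected hyperparameters back from the fitted model
params = model.get_params()
print(params['C'], params['gamma'], params['kernel'])
```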

2. Define the plotting function:

```
import numpy as np
import matplotlib.pyplot as plt

def plot_svc_decision_function(model, ax=None, plot_support=True):
    """Plot the decision boundary and margins of a fitted 2-D SVC."""
    if ax is None:
        ax = plt.gca()  # get the current Axes
    xlim = ax.get_xlim()
    ylim = ax.get_ylim()

    # create a grid on which to evaluate the model
    x = np.linspace(xlim[0], xlim[1], 30)
    y = np.linspace(ylim[0], ylim[1], 30)
    Y, X = np.meshgrid(y, x)
    xy = np.vstack([X.ravel(), Y.ravel()]).T
    P = model.decision_function(xy).reshape(X.shape)

    # plot the decision boundary (level 0) and the margins (levels -1 and +1)
    ax.contour(X, Y, P, colors='k',
               levels=[-1, 0, 1], alpha=0.5,
               linestyles=['--', '-', '--'])

    # circle the support vectors
    if plot_support:
        ax.scatter(model.support_vectors_[:, 0],
                   model.support_vectors_[:, 1],
                   s=300, linewidth=1, facecolors='none')
    ax.set_xlim(xlim)
    ax.set_ylim(ylim)
```

```
plt.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap='autumn')
plot_svc_decision_function(model)
```

This draws Figure 1.
3. Inspect the support vectors:

```
model.support_vectors_
```

```
array([[0.44359863, 3.11530945],
       [2.33812285, 3.43116792],
       [2.06156753, 1.96918596]])
```
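Besides the coordinates, a fitted SVC also tells you which training rows became support vectors; a small sketch using the standard `support_` and `n_support_` attributes:

```
print(model.support_)    # indices of the support vectors in X
print(model.n_support_)  # number of support vectors per class
```

Only these points determine the decision boundary; moving any other point (without crossing the margin) leaves the model unchanged.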

4. Introduce a kernel function:

```
# switch to the radial basis function (RBF) kernel
clf = SVC(kernel='rbf', C=1E6)
clf.fit(X, y)
```
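To see why the kernel matters, try data a linear kernel cannot separate. A sketch, assuming make_circles as the illustrative dataset (not part of the original example):

```
from sklearn.datasets import make_circles

# two concentric rings: not linearly separable
X2, y2 = make_circles(n_samples=100, factor=0.1, noise=0.1, random_state=0)

linear_acc = SVC(kernel='linear').fit(X2, y2).score(X2, y2)
rbf_acc = SVC(kernel='rbf', C=1E6).fit(X2, y2).score(X2, y2)
print(linear_acc, rbf_acc)  # the RBF kernel should fit the rings far better
```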

5. Tune the SVM parameters: the soft-margin problem.
Adjusting the C parameter:
As C tends to infinity, classification is strict: no misclassified points are tolerated.
As C becomes very small, the model tolerates larger errors (a softer margin).

```
X, y = make_blobs(n_samples=100, centers=2,
                  random_state=0, cluster_std=0.8)

fig, ax = plt.subplots(1, 2, figsize=(16, 6))
fig.subplots_adjust(left=0.0625, right=0.95, wspace=0.1)

for axi, C in zip(ax, [10.0, 0.1]):
    model = SVC(kernel='linear', C=C).fit(X, y)
    axi.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap='autumn')
    plot_svc_decision_function(model, axi)
    axi.scatter(model.support_vectors_[:, 0],
                model.support_vectors_[:, 1],
                s=300, lw=1, facecolors='none')
    axi.set_title('C = {0:.1f}'.format(C), size=14)
```

The larger C is, the stricter the margin.

6. Tune the gamma value. The larger gamma is, the higher-dimensional the implicit mapping and the more complex the model:

```
X, y = make_blobs(n_samples=100, centers=2,
                  random_state=0, cluster_std=1.1)

fig, ax = plt.subplots(1, 2, figsize=(16, 6))
fig.subplots_adjust(left=0.0625, right=0.95, wspace=0.1)
# wspace sets the horizontal spacing between the subplots

for axi, gamma in zip(ax, [10.0, 0.1]):
    model = SVC(kernel='rbf', gamma=gamma).fit(X, y)
    axi.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap='autumn')
    plot_svc_decision_function(model, axi)
    axi.scatter(model.support_vectors_[:, 0],
                model.support_vectors_[:, 1],
                s=300, lw=1, facecolors='none')
    axi.set_title('gamma = {0:.1f}'.format(gamma), size=14)
```

(Figure: decision boundaries for gamma = 10.0 and gamma = 0.1.)
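Steps 5 and 6 tune C and gamma one at a time; in practice they interact, so searching them jointly with cross-validation is common. A minimal sketch using scikit-learn's GridSearchCV (the grid values are illustrative placeholders, not recommendations):

```
from sklearn.model_selection import GridSearchCV

param_grid = {'C': [0.1, 1, 10, 100], 'gamma': [0.01, 0.1, 1, 10]}
grid = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
grid.fit(X, y)
print(grid.best_params_)
```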

PS: Apologies for the rough formatting. The content here is fairly basic; I hope we can discuss and learn together.


Reposted from blog.csdn.net/weixin_41376658/article/details/79415192