Installation and configuration of the Python machine learning package mlxtend

I came across the mlxtend package today. Its examples are very simple to follow, and another thing that attracted me is that it ships with some datasets that can be used directly, which saves me the trouble of creating or hunting for data myself, so I decided to install it and try it out.

dependency environment

First, run sudo pip install mlxtend to get the base package installed.
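
To confirm the install actually worked, a quick version check is enough (mlxtend exposes a __version__ attribute):

python -c "import mlxtend; print(mlxtend.__version__)"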

Then comes sorting out the system dependencies. These are roughly the classic packages used in Python scientific computing: mainly numpy, scipy, matplotlib, and sklearn.
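
If you want to stay on the pip route, the whole set can be pulled in at once (note that sklearn is published on PyPI as scikit-learn):

sudo pip install numpy scipy matplotlib scikit-learn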

On Linux these are generally painless, and pip can handle most of them.
The one worth mentioning is matplotlib: when installing it with pip, it complained that libpng and a package called FreeType were required, and installing those ran into problems of their own. So for matplotlib I finally chose to use

sudo apt-get install python-matplotlib

which solves the dependency problem directly.
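
Alternatively, the pip build of matplotlib usually goes through once the FreeType and libpng development headers are present; on Debian/Ubuntu those are typically installed with something like this (package names can vary by release):

sudo apt-get install libfreetype6-dev libpng-dev pkg-config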

The same goes for scipy, where

sudo apt-get install python-scipy

takes care of it.
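
For reference, building scipy with pip tends to fail because it needs BLAS/LAPACK and a Fortran compiler. If you prefer the pip route anyway, the usual Debian/Ubuntu prerequisites look roughly like this (again, package names may differ by release):

sudo apt-get install libblas-dev liblapack-dev gfortran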

sample code

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import itertools
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from mlxtend.classifier import EnsembleVoteClassifier
from mlxtend.data import iris_data
from mlxtend.evaluate import plot_decision_regions

# Initializing Classifiers
clf1 = LogisticRegression(random_state=0)
clf2 = RandomForestClassifier(random_state=0)
clf3 = SVC(random_state=0, probability=True)
eclf = EnsembleVoteClassifier(clfs=[clf1, clf2, clf3], weights=[2, 1, 1], voting='soft')

# Loading some example data
X, y = iris_data()
X = X[:,[0, 2]]

# Plotting Decision Regions
gs = gridspec.GridSpec(2, 2)
fig = plt.figure(figsize=(10, 8))

for clf, lab, grd in zip([clf1, clf2, clf3, eclf],
                         ['Logistic Regression', 'Random Forest', 'SVM', 'Ensemble'],
                         itertools.product([0, 1], repeat=2)):
    clf.fit(X, y)
    ax = plt.subplot(gs[grd[0], grd[1]])
    plot_decision_regions(X=X, y=y, clf=clf, legend=2)
    plt.title(lab)
plt.show()
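
As a side note, 'soft' voting means the ensemble averages the three classifiers' predicted class probabilities with the weights 2, 1, 1 and picks the class with the highest weighted average. A minimal check after fitting, reusing the objects defined above (predict_proba is available because voting='soft'):

eclf.fit(X, y)
print(eclf.predict(X[:5]))        # class with the highest weighted-average probability
print(eclf.predict_proba(X[:5]))  # the averaged class probabilities themselves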

Then you can run this sample code.
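
One caveat: in newer mlxtend releases, plot_decision_regions has moved out of mlxtend.evaluate and into the plotting module, so if the import at the top fails, swap it for

from mlxtend.plotting import plot_decision_regions

and the rest of the code stays the same.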

The matplotlib result is shown in the figure:
[figure: 2x2 grid of decision regions for Logistic Regression, Random Forest, SVM, and the Ensemble on the two iris features]

Then you can start playing~!

Appendix: a one-shot command to install the classic Python scientific-computing packages on Linux:

sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose
