Support Vector Machine (Pine Creek) Example 4

This example creates a classification map with the help of a Support Vector Machine. It uses the classic Pine Creek hyperspectral cube. The regions of interest used in the example are those described in the article: Landgrebe, David, 1997, Multispectral Data Analysis: A Signal Theory Perspective, School of Electrical Engineering, Purdue University.

In [1]:
%matplotlib inline

import os.path as osp
import pysptools.classification as cls
import pysptools.util as util

def remove_bands(M):
    """
    Remove the bands affected by atmospheric
    scattering.
    Remove:
        [0..4]
        [102..110]
        [148..169]
        [211..end]
    """
    p1 = list(range(5, 102))
    p2 = list(range(111, 148))
    p3 = list(range(170, 211))
    # Keep only the good bands along the spectral axis
    Mp = M[:, :, p1 + p2 + p3]
    return Mp

data_path = '../data1'
sample = '92AV3C.hdr'

data_file = osp.join(data_path, sample)
# load Pine Creek
data, info = util.load_ENVI_file(data_file)
# and clean up bad bands
data_clean = remove_bands(data)
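As a sanity check, the cleanup above should leave 97 + 37 + 41 = 175 bands. A minimal sketch on a synthetic cube (the band ranges come from remove_bands above; the zero-filled array and its spatial size are only a stand-in for the real data file):

```python
import numpy as np

def remove_bands(M):
    # Same band ranges as above: keep [5..101], [111..147], [170..210]
    keep = list(range(5, 102)) + list(range(111, 148)) + list(range(170, 211))
    return M[:, :, keep]

# Stand-in cube with 220 bands; the spatial size is arbitrary here
fake = np.zeros((145, 145, 220))
cleaned = remove_bands(fake)
print(cleaned.shape)  # (145, 145, 175)
```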

First, we define the ROIs.

An instance of the ROIs class keeps all the rectangular and polygonal ROIs. A cluster can be described by one or more ROIs. The name of a cluster can be used as a label by the SVC class.

In [2]:
r = cls.ROIs()
# A roi can be a polygon
r.add('Alfalfa', {'poly': ((67,98),(73,98),(75,101),(70,101))})
# or a rectangle
r.add('Corn-notill', {'rec': (33,31,41,56)})
# 'Corn-min' is the cluster name
r.add('Corn-min', {'rec': (63,6,71,21)}, {'rec': (128,20,134,46)})
# The poly can be closed or not (see next)
r.add('Corn', {'poly': ((35,7),(35,5),(48,10),(48,23),(45,22),(44,16),(35,10),(35,5))})
r.add('Grass/Pasture', {'rec': (75,4,85,21)})
r.add('Grass/Trees', {'rec': (48,28,70,35)})
r.add('Grass/pasture-mowed', {'rec': (73,109,78,112)})
r.add('Hay-windrowed', {'rec': (39,124,59,138)})
r.add('Soybeans-notill', {'rec': (42,78,63,92)})
# You can define more than one roi for the same cluster
r.add('Soybean-min-till', {'rec': (78,34,111,45)}, {'rec': (3,112,17,117)}, {'rec': (80,51,95,71)})
r.add('Soybean-clean', {'rec': (52,5,58,24)})
r.add('Wheat', {'rec': (119,26,124,46)})
r.add('Woods', {'rec': (121,91,137,121)})
# A closed poly
r.add('Stone-steel towers', {'poly': ((14,47),(23,44),(24,49),(16,52),(14,47))})
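To make the rectangle convention concrete, here is a sketch of how a rectangular ROI could be rasterized into a boolean mask. The (x1, y1, x2, y2) reading with inclusive bounds is an assumption for illustration; pysptools' own rasterization may differ:

```python
import numpy as np

def rec_to_mask(shape, rec):
    # Hypothetical helper: mark the pixels covered by a rectangle
    # given as (x1, y1, x2, y2); inclusive bounds are an assumption.
    x1, y1, x2, y2 = rec
    mask = np.zeros(shape, dtype=bool)
    mask[y1:y2 + 1, x1:x2 + 1] = True
    return mask

# The 'Corn-notill' rectangle from the cell above
m = rec_to_mask((145, 145), (33, 31, 41, 56))
print(m.sum())  # number of pixels in the ROI
```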

Next, we run the Support Vector Classification.

By default, the kernel used is rbf. You have to give the data cube, an ROIs instance containing the defined ROIs, and the class_weight. The class_weight is defined as follows: '0:1' for the background, '1:10' for Alfalfa, '2:10' for Corn-notill, '3:10' for Corn-min, and so on.
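The class_weight dictionary in the next cell follows directly from that mapping: weight 1 for background label 0 and weight 10 for each of the 14 clusters. As a sketch, it can also be built with a comprehension instead of being written out by hand (the result is the same literal dict):

```python
# Label 0 is the background; labels 1..14 are the 14 clusters,
# in the order the ROIs were added.
class_weight = {0: 1}
class_weight.update({i: 10 for i in range(1, 15)})
print(class_weight)
```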

In [3]:
svm = cls.SVC()
# We fit and
svm.fit(data_clean, r, class_weight={0:1,1:10,2:10,3:10,4:10,5:10,6:10,7:10,8:10,9:10,10:10,11:10,12:10,13:10,14:10})
# classify
cmap = svm.classify(data_clean)
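Once you have the classification map, a quick way to inspect it is to count how many pixels fell into each class. A sketch on a tiny synthetic map (np.bincount is standard NumPy; that the real cmap returned by classify is a 2D array of integer labels is an assumption):

```python
import numpy as np

# Stand-in for the cmap returned by classify: 0 is background,
# 1..14 are the cluster labels.
cmap_demo = np.array([[0, 1, 1],
                      [2, 2, 0],
                      [14, 1, 0]])

# Per-label pixel counts over the whole map
counts = np.bincount(cmap_demo.ravel(), minlength=15)
for label, n in enumerate(counts):
    if n:
        print(label, n)
```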

You can display the ROIs with the display_ROIs method. The ROIs instance 'r' keeps the labels; you can pass them to the method.

In [4]:
svm.display_ROIs(labels=r.get_labels(), colorMap='Paired', suffix='Pine Creek')

And, finally, the display method shows the classification map.

In [5]:
svm.display(labels=r.get_labels(), colorMap='Paired', suffix='Pine Creek')