III. Support Vector Machines

 

Idea: Use a nonlinear kernel mapping to project the data into a higher-dimensional space, where a separating hyperplane with the largest margin between the classes can be found.
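The effect of the kernel mapping can be sketched on toy data (not the paper's features): the four 1-D points below cannot be split by any single threshold on x, but after the quadratic feature map phi(x) = (x, x^2) a linear rule in the mapped space separates them. The map, weights, and threshold here are illustrative assumptions, not the classifier used in this work.

```python
def phi(x):
    """Explicit quadratic feature map: 1-D input -> 2-D feature space."""
    return (x, x * x)

# Toy data: outer points (+1) vs inner points (-1); not linearly
# separable in the original 1-D space.
X = [-2.0, -1.0, 1.0, 2.0]
y = [1, -1, -1, 1]

def classify(x):
    # Linear decision in the mapped space: w = (0, 1), b = -2.5,
    # i.e. a simple threshold on the x^2 coordinate.
    _, x2 = phi(x)
    return 1 if x2 - 2.5 > 0 else -1

print([classify(x) for x in X])  # [1, -1, -1, 1]
```

In practice the mapping is never computed explicitly; the SVM only needs inner products in the mapped space, which the kernel function supplies directly.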

Steps:

 

 

.Results.

 

                 Upright  Leaning  Leaning  Leaning  Leaning  Slouching  % Recognition
                          Forward   Left     Right    Back
Upright             84       8        0        0        8         0          84%
Leaning Forward      1      99        0        0        0         0          99%
Leaning Left         0       0      100        0        0         0         100%
Leaning Right        0       0        0      100        0         0         100%
Leaning Back         2       0        0        0       95         3          95%
Slouching            0       0        0        0        1        99          99%
                                                  Total Recognition       96.17%

Table 1: Confusion matrix for the SVM using leave-one-out cross-validation (rows: actual posture, columns: predicted posture)
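Leave-one-out cross-validation, as used for Table 1, holds out each sample once and classifies it with a model trained on all remaining samples. A minimal sketch of the procedure, with a 1-nearest-neighbour classifier standing in for the SVM and made-up 1-D data:

```python
def loo_accuracy(X, y):
    """Leave-one-out cross-validation accuracy for a 1-NN classifier."""
    correct = 0
    for i in range(len(X)):
        # Train on everything except sample i.
        train = [(x, lab) for j, (x, lab) in enumerate(zip(X, y)) if j != i]
        # Classify the held-out point by its nearest training neighbour.
        pred = min(train, key=lambda t: abs(t[0] - X[i]))[1]
        correct += (pred == y[i])
    return correct / len(X)

# Hypothetical toy data: two well-separated posture clusters.
X = [0.0, 0.2, 0.4, 5.0, 5.2, 5.4]
y = ["upright", "upright", "upright", "slouch", "slouch", "slouch"]
print(loo_accuracy(X, y))  # 1.0
```

With n samples this trains n models, which is expensive but uses the data as efficiently as possible; the per-posture counts from all n held-out predictions populate the confusion matrix.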

                 Upright  Leaning  Leaning  Leaning  Leaning  Slouching  % Recognition
                          Forward   Left     Right    Back
Upright             82      12        0        2        4         0          82%
Leaning Forward      0     100        0        0        0         0         100%
Leaning Left         6       6       88        0        0         0          88%
Leaning Right       24      12        0       64        0         0          64%
Leaning Back        10       0        0        0       90         0          90%
Slouching            0       0        0        0       38        62          62%
                                                  Total Recognition       81.00%

Table 2: Confusion matrix for the SVM evaluated on 10 new subjects (rows: actual posture, columns: predicted posture)
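The recognition figures follow directly from the confusion matrix: each per-class rate is the diagonal entry divided by its row sum, and the total recognition is the unweighted mean of the per-class rates. A short check against Table 2's numbers:

```python
# Table 2's confusion matrix: rows are actual postures, columns are
# predictions, in the order Upright, Leaning Forward, Leaning Left,
# Leaning Right, Leaning Back, Slouching.
cm = [
    [82, 12,  0,  2,  4,  0],
    [ 0, 100, 0,  0,  0,  0],
    [ 6,  6, 88,  0,  0,  0],
    [24, 12,  0, 64,  0,  0],
    [10,  0,  0,  0, 90,  0],
    [ 0,  0,  0,  0, 38, 62],
]

# Per-class recognition = diagonal / row sum, as a percentage.
per_class = [row[i] / sum(row) * 100 for i, row in enumerate(cm)]
# Total recognition = unweighted mean over the six postures.
total = sum(per_class) / len(per_class)
print(total)  # 81.0
```

The same computation on Table 1's matrix reproduces its 96.17% total, confirming that "total recognition" is the macro-average over postures rather than the overall fraction of correct samples.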

.Other Explored Areas.

Results using a linear classifier:

cross validation = 94%

10 new subjects = 80.33%

 

Results using mixture-model parameters as input:

cross validation = 47.50%

10 new subjects = 41.33%