I. Principal Component Analysis

 

The starting point is to represent a set of n d-dimensional samples x1, ..., xn by a single vector x0, where x0 minimizes the sum of squared distances between x0 and the samples xk. PCA generalizes this idea: it finds the best lower-dimensional representation of the data by projecting the samples onto the directions in which they vary the most, reducing the dimensionality of the feature space. Since I am trying to model the different pressure patterns for each posture, this seemed like a good method to try.
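Written out, this is the standard zero-dimensional PCA criterion (a textbook statement of the idea above, not anything specific to this project):

J_0(x_0) = \sum_{k=1}^{n} \| x_0 - x_k \|^2

which is minimized by the sample mean x_0 = m = \frac{1}{n} \sum_{k=1}^{n} x_k. PCA extends this by projecting the centered samples onto the leading eigenvectors of the scatter matrix S = \sum_{k=1}^{n} (x_k - m)(x_k - m)^T, which gives the minimum squared-error representation for any chosen number of dimensions.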

Steps:

1. Flatten each pressure map into a d-dimensional vector and compute the sample mean of the training set.
2. Subtract the mean from every sample and form the scatter (covariance) matrix of the centered data.
3. Compute the eigenvectors of the scatter matrix and sort them by decreasing eigenvalue.
4. Keep the leading eigenvectors and project each sample onto them to obtain a reduced feature vector.
5. Classify new pressure maps in this reduced space.
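A minimal sketch of these steps in Python/NumPy, assuming each pressure map has been flattened into a d-dimensional row vector (the array name `pressure_maps` and the other identifiers are illustrative, not from the original project):

```python
import numpy as np

def fit_pca(samples, n_components):
    """Fit PCA on an (n, d) array of flattened pressure maps.

    Returns the sample mean and the top `n_components` eigenvectors
    (as columns), sorted by decreasing eigenvalue.
    """
    mean = samples.mean(axis=0)                  # sample mean of the training data
    centered = samples - mean                    # subtract the mean from every sample
    scatter = centered.T @ centered              # (d, d) scatter matrix
    eigvals, eigvecs = np.linalg.eigh(scatter)   # eigendecomposition (symmetric matrix)
    order = np.argsort(eigvals)[::-1]            # sort eigenvalues in decreasing order
    basis = eigvecs[:, order[:n_components]]     # keep the leading eigenvectors
    return mean, basis

def project(samples, mean, basis):
    """Project (n, d) samples onto the PCA subspace -> (n, n_components)."""
    return (samples - mean) @ basis

# Illustrative usage, assuming `pressure_maps` is an (n, d) array in which
# each row is one flattened pressure-sensor frame:
#   mean, basis = fit_pca(pressure_maps, n_components=40)
#   features = project(pressure_maps, mean, basis)
```

Working directly with the d x d scatter matrix is fine for moderate d; for very high-resolution maps, the usual eigenface-style trick of decomposing the smaller n x n Gram matrix instead would save memory and time.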

But how many eigenvectors should we use?

 

Number of Eigenvectors    % Recognition
10                        55.33%
20                        65.67%
30                        65.00%
40                        73.33%
50                        69.67%
60                        67.00%

Table 1: Comparison of Different Numbers of Eigenvectors Used for Classification
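Recognition peaks at 40 eigenvectors. The write-up does not say which classifier is applied to the projected features; assuming, purely for illustration, a nearest-neighbor classifier, a sweep like the one behind Table 1 could look as follows (reusing `fit_pca` and `project` from the sketch above):

```python
def nearest_neighbor_classify(train_feats, train_labels, test_feats):
    """Label each test feature with the label of its closest training feature.

    Illustrative choice only; the original classifier is not specified.
    `train_labels` is a NumPy array so it can be fancy-indexed.
    """
    # Pairwise squared Euclidean distances, shape (n_test, n_train).
    dists = ((test_feats[:, None, :] - train_feats[None, :, :]) ** 2).sum(axis=2)
    return train_labels[dists.argmin(axis=1)]

def sweep_eigenvectors(train_maps, train_labels, test_maps, test_labels,
                       ks=(10, 20, 30, 40, 50, 60)):
    """Print the recognition rate for each number of retained eigenvectors."""
    for k in ks:
        mean, basis = fit_pca(train_maps, n_components=k)
        train_feats = project(train_maps, mean, basis)
        test_feats = project(test_maps, mean, basis)
        preds = nearest_neighbor_classify(train_feats, train_labels, test_feats)
        print(f"{k:2d} eigenvectors: {np.mean(preds == test_labels):.2%} recognition")
```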

 

II. Results

 

Total % Recognition: 86.45%

Table 2: Using k-fold Cross-Validation, k = 10
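A sketch of the 10-fold cross-validation behind Table 2, again assuming the illustrative PCA + nearest-neighbor pipeline from the sketches above (the fold assignment and the classifier are my assumptions, not details from the original project):

```python
def cross_validate(maps, labels, n_folds=10, n_components=40):
    """k-fold cross-validation of the PCA + nearest-neighbor pipeline."""
    rng = np.random.default_rng(0)
    # Shuffle the sample indices and split them into n_folds test folds.
    folds = np.array_split(rng.permutation(len(maps)), n_folds)
    correct = 0
    for fold in folds:
        test_mask = np.zeros(len(maps), dtype=bool)
        test_mask[fold] = True
        # Fit PCA on the training folds only, then project both splits.
        mean, basis = fit_pca(maps[~test_mask], n_components)
        train_feats = project(maps[~test_mask], mean, basis)
        test_feats = project(maps[test_mask], mean, basis)
        preds = nearest_neighbor_classify(train_feats, labels[~test_mask], test_feats)
        correct += np.sum(preds == labels[test_mask])
    return correct / len(maps)  # overall recognition rate across all folds
```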

 

Posture            Number Correct (of 50)    % Recognition
Upright            47                        94%
Leaning Forward    32                        64%
Leaning Left       44                        88%
Leaning Right      35                        70%
Leaning Back       41                        82%
Slouching          21                        42%
Total Recognition                            73.33%

Table 3: Using 40 Eigenvectors on 10 New Test Subjects
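The per-posture rows in Table 3 are per-class recognition rates. Assuming the true and predicted labels are NumPy arrays of posture-name strings (an assumption for illustration), they can be tallied like this:

```python
def per_class_recognition(true_labels, predicted_labels):
    """Print the number correct and recognition rate for each posture class."""
    for posture in ("Upright", "Leaning Forward", "Leaning Left",
                    "Leaning Right", "Leaning Back", "Slouching"):
        mask = true_labels == posture                      # samples of this posture
        n_correct = int(np.sum(predicted_labels[mask] == posture))
        print(f"{posture}: {n_correct} correct, {n_correct / mask.sum():.0%} recognition")
    total = np.mean(predicted_labels == true_labels)       # overall recognition rate
    print(f"Total recognition: {total:.2%}")
```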