2nd Order Markov Model
Believing more to be better, I implemented a 2nd order Markov Model.
In this improvement on 1st order, the previous two states are considered.
I believed this would more accurately represent the data for the
simple reason that a drum sound may be more likely to follow one
pair of sounds than another. For example, in many rhythms, kick
drums and snares alternate: a snare might be more likely to be
followed by a kick drum when it was preceded by silence than when
it was preceded by all three drums being hit at once.
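A minimal sketch of how transitions conditioned on the previous two states might be tallied; the numeric state encoding (e.g. 0 = silence, 3 = all drums) is hypothetical, not taken from the original write-up:

```python
from collections import defaultdict

def second_order_counts(sequence):
    """Tally how often each state follows each ordered pair of states."""
    counts = defaultdict(lambda: defaultdict(int))
    # Slide a window of three over the sequence: (prev2, prev1) -> next
    for a, b, c in zip(sequence, sequence[1:], sequence[2:]):
        counts[(a, b)][c] += 1
    return counts

# Hypothetical encoding: 0 = silence, 1 = kick, 2 = snare, 3 = all drums
counts = second_order_counts([0, 2, 1, 2, 1, 2, 3, 2])
```

Normalizing each inner dictionary then gives the conditional probabilities P(next | previous pair).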
This model is still subject to some of the deficiencies
of the 1st order MM, but they are reduced. For example,
the mutation which turned
1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4
into:
1 2 2 3 4 1 2 3 4 1 2 3 4 1 2 3
now has a more commensurate effect on the likelihood: instead
of two terms changing, four terms must change.
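The claim above can be checked directly: each symbol from the third onward contributes one conditional term keyed by the pair of states before it, so a single insertion perturbs several (pair, next) triples at once. A sketch (the uniform stand-in probability is an assumption for illustration):

```python
import math
from collections import Counter

def triples(states):
    """The (prev2, prev1, next) terms that enter a 2nd-order likelihood."""
    return Counter(zip(states, states[1:], states[2:]))

def log_likelihood(states, prob):
    """Sum of log P(s_t | s_{t-2}, s_{t-1}) over the sequence."""
    return sum(math.log(prob((a, b), c))
               for a, b, c in zip(states, states[1:], states[2:]))

original = [1, 2, 3, 4] * 4
mutated  = [1, 2, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3]

# Terms present in one rhythm but not the other (symmetric difference)
changed = ((triples(original) - triples(mutated))
           + (triples(mutated) - triples(original)))
# sum(changed.values()) == 4: four conditional terms differ
```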
Classification Results
The 2nd Order MM classified the testing dataset
correctly 59.7% of the time, an improvement on the
1st Order MM.
Just out of curiosity, I also tested against
the training set:
Dataset  | Classification Rate
Training | 79.9%
Testing  | 56.1%
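The write-up doesn't restate the classification rule, but a standard approach (and presumably what the rates above measure) is to train one model per rhythm class and pick the class whose model assigns the test sequence the highest likelihood. A sketch under that assumption, with add-one smoothing for transitions never seen in training:

```python
import math
from collections import defaultdict

def train(sequences, n_states, alpha=1.0):
    """Fit a smoothed 2nd-order model; returns P(next | previous pair)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b, c in zip(seq, seq[1:], seq[2:]):
            counts[(a, b)][c] += 1

    def prob(pair, nxt):
        total = sum(counts[pair].values())
        # Add-one (alpha) smoothing so unseen transitions aren't impossible
        return (counts[pair][nxt] + alpha) / (total + alpha * n_states)

    return prob

def classify(seq, models):
    """Pick the class whose model gives the highest log-likelihood."""
    def ll(prob):
        return sum(math.log(prob((a, b), c))
                   for a, b, c in zip(seq, seq[1:], seq[2:]))
    return max(models, key=lambda label: ll(models[label]))
```

For example, a model trained on alternating states should win on an alternating test sequence over a model trained on a constant one.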
Feature State Transitions
Also out of curiosity, I experimented with
the same magnitude remapping tricks I used in the first
order model.
Unfortunately, the technique performed worse than
the raw data on both the testing data and the training
data. Perhaps transitions over another, more intelligent
feature would work better:
Dataset  | Classification Rate
Training | 73.2%
Testing  | 55.81%
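The remapping itself isn't restated here; assuming it means collapsing raw onset magnitudes into a few coarse bins before counting transitions (the thresholds below are hypothetical), the preprocessing might look like:

```python
def remap_magnitude(value, thresholds=(0.1, 0.5)):
    """Quantize a raw magnitude into a coarse state (0 = quietest bin).

    The thresholds are illustrative placeholders, not values from the
    original experiment.
    """
    return sum(value >= t for t in thresholds)

# A quiet hit, a medium hit, and a loud hit land in bins 0, 1, 2
states = [remap_magnitude(v) for v in (0.03, 0.3, 0.9)]
```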