Hidden Markov Model Example with hmmlearn

Published August 7, 2023

  1. Read the example here first: https://hmmlearn.readthedocs.io/en/latest/auto_examples/plot_casino.html#sphx-glr-auto-examples-plot-casino-py
  2. A good tutorial on HMMs: https://www.cs.sjsu.edu/~stamp/RUA/HMM.pdf
import numpy as np
import matplotlib.pyplot as plt
from hmmlearn import hmm

Generating Model

Three parameters define an HMM:

  • Start probability: \(\pi\)
  • Transition probability: \(A\)
  • Emission probability: \(B\)
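Concretely, \(\pi_i = P(x_1 = i)\), \(A_{ij} = P(x_{t+1} = j \mid x_t = i)\), and \(B_{ik} = P(o_t = k \mid x_t = i)\), where \(x_t\) is the hidden state and \(o_t\) the observation at time \(t\). In the casino example below, the two hidden states are a fair and a loaded die, and the observations are the six faces.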
# simulate some data assuming the true model is a 2-state HMM

gen_model = hmm.CategoricalHMM(n_components=2, random_state=99)
gen_model.startprob_ = np.array([1.0, 0.0])
gen_model.transmat_ = np.array([[0.95, 0.05],
                                [0.1, 0.9]])
gen_model.emissionprob_ = \
    np.array([[1 / 6, 1 / 6, 1 / 6, 1 / 6, 1 / 6, 1 / 6],
              [1 / 10, 1 / 10, 1 / 10, 1 / 10, 1 / 10, 1 / 2]])
rolls, gen_states = gen_model.sample(30000)
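As a quick aside (our own observation, not in the original example): the number of consecutive steps spent in a state with self-transition probability \(p\) is geometric with mean \(1/(1-p)\), so runs in the fair state last about 20 rolls on average and runs in the loaded state about 10, which matches the long stretches visible in the state plot below.

# expected run lengths implied by the self-transition probabilities
print(1 / (1 - np.diag(gen_model.transmat_)))   # [20. 10.]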

# rolls are the observed values, shape (n_samples, 1)
# gen_states are the hidden states, shape (n_samples,)
# plot states over time; just look at the first 500 rolls for clarity
fig, ax = plt.subplots()
ax.plot(gen_states[:500])
ax.set_title('Hidden States over time')
ax.set_xlabel('Time (# of rolls)')
ax.set_ylabel('State')
fig.show()

# plot rolls for the fair and loaded states
fig, ax = plt.subplots()
ax.hist(rolls[gen_states == 0], label='fair', alpha=0.5,
        bins=np.arange(7) - 0.5, density=True)
ax.hist(rolls[gen_states == 1], label='loaded', alpha=0.5,
        bins=np.arange(7) - 0.5, density=True)
ax.set_title('Roll probabilities by state')
ax.set_xlabel('Roll')
ax.set_ylabel('Probability')
ax.legend()
fig.show()

Fitting the HMM

A few notes:

  1. The score function returns the log-likelihood of the data given the model parameters, computed with the forward algorithm (a minimal sketch appears below, after the validation score).
  2. The model accepts params and init_params as arguments. params specifies which parameters are updated during training; init_params specifies which parameters are initialized before training. Both are strings whose characters s, t, and e stand for the starting probabilities, the transition matrix, and the emission probabilities, respectively. Here we want to supply our own transition matrix, so we set init_params to 'se', leaving the starting and emission probabilities to be initialized by the model.
  3. n_fits lets us pick the best model among multiple random initializations.
  4. The predict function returns the most likely state sequence given the data, using the Viterbi algorithm.

# split our data into training and validation sets (50/50 split)
X_train = rolls[:rolls.shape[0] // 2]
X_validate = rolls[rolls.shape[0] // 2:]

# score the data-generating model on the validation set as a reference
gen_score = gen_model.score(X_validate)
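To make the note about the forward algorithm concrete, here is a minimal NumPy sketch of the scaled forward recursion that score implements; forward_loglik is our own helper (not part of hmmlearn), with pi, A, B the start, transition, and emission parameters.

def forward_loglik(pi, A, B, obs):
    # scaled forward pass: returns log P(obs | pi, A, B)
    alpha = pi * B[:, obs[0]]           # initialization
    c = alpha.sum()
    log_lik = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # induction step
        c = alpha.sum()                 # rescale to avoid underflow
        log_lik += np.log(c)
        alpha = alpha / c
    return log_lik

# agrees with gen_model.score(X_validate) up to floating-point error
forward_loglik(gen_model.startprob_, gen_model.transmat_,
               gen_model.emissionprob_, X_validate.ravel())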

best_score = best_model = None
n_fits = 50
np.random.seed(13)
for idx in range(n_fits):
    model = hmm.CategoricalHMM(
        n_components=2, random_state=idx,
        init_params='se')
    model.transmat_ = np.array([np.random.dirichlet([0.9, 0.1]),
                                np.random.dirichlet([0.1, 0.9])])
    model.fit(X_train)
    score = model.score(X_validate)
    # print(f'Model #{idx}\tScore: {score}')
    if best_score is None or score > best_score:
        best_model = model
        best_score = score

print(f'Generated score: {gen_score}\nBest score:      {best_score}')
Generated score: -26230.575868403906
Best score:      -26229.220050326167
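As a sanity check (our own addition), hmmlearn records EM progress on each fitted model in its monitor_ attribute, so we can confirm the winning fit actually converged rather than stopping at the iteration limit.

# True if EM stopped because the log-likelihood gain fell below tol
print(best_model.monitor_.converged)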

# use the Viterbi algorithm to predict the most likely sequence of states
# given the model
states = best_model.predict(rolls)

# plot our recovered states compared to the generated states
fig, ax = plt.subplots()
ax.plot(gen_states[:500], label='generated')
ax.plot(states[:500] + 1.5, label='recovered')
ax.set_yticks([])
ax.set_title('States compared to generated')
ax.set_xlabel('Time (# rolls)')
ax.set_ylabel('State')
ax.legend()
fig.show()

For example, the sequence below looks like a fair die at first, but then appears loaded once a run of 5s shows up. Note that the recovered model's state labels happen to be swapped relative to the generating model (state 1 is the fair die, state 0 the loaded one), which is why the prediction starts with 1s.

best_model.predict(np.array([1, 2, 3, 3, 2, 1, 1, 3, 5, 5, 5, 5]).reshape(-1, 1))
array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0])
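predict runs the Viterbi algorithm under the hood. For reference, a minimal NumPy sketch (our own code, same pi, A, B convention as the forward sketch above) looks like this; applied to best_model's parameters and the toy sequence, it reproduces the output above.

def viterbi(pi, A, B, obs):
    # most likely state sequence via dynamic programming in the log domain
    T, N = len(obs), len(pi)
    with np.errstate(divide='ignore'):   # log(0) -> -inf is fine here
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    log_delta = log_pi + log_B[:, obs[0]]
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        trans = log_delta[:, None] + log_A    # scores for (previous, next)
        psi[t] = trans.argmax(axis=0)         # best predecessor per state
        log_delta = trans.max(axis=0) + log_B[:, obs[t]]
    # backtrack from the best final state
    states = np.empty(T, dtype=int)
    states[-1] = log_delta.argmax()
    for t in range(T - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return states

viterbi(best_model.startprob_, best_model.transmat_, best_model.emissionprob_,
        np.array([1, 2, 3, 3, 2, 1, 1, 3, 5, 5, 5, 5]))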

Comparison

print(f'Transition Matrix Generated:\n{gen_model.transmat_.round(3)}\n\n'
      f'Transition Matrix Recovered:\n{best_model.transmat_.round(3)}\n\n')
Transition Matrix Generated:
[[0.95 0.05]
 [0.1  0.9 ]]

Transition Matrix Recovered:
[[0.922 0.078]
 [0.059 0.941]]

print(f'Emission Matrix Generated:\n{gen_model.emissionprob_.round(3)}\n\n'
      f'Emission Matrix Recovered:\n{best_model.emissionprob_.round(3)}\n\n')
Emission Matrix Generated:
[[0.167 0.167 0.167 0.167 0.167 0.167]
 [0.1   0.1   0.1   0.1   0.1   0.5  ]]

Emission Matrix Recovered:
[[0.106 0.108 0.114 0.105 0.113 0.454]
 [0.168 0.168 0.167 0.171 0.172 0.153]]
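The emission rows confirm the label switching noted above: recovered state 0 matches the generated loaded die and recovered state 1 the fair one. This is expected, since the likelihood of an HMM is invariant to relabeling its states. If aligned labels are needed downstream, a quick heuristic (our own, matching each recovered emission row to its most similar generated row) is:

# perm[k] = index of the generated state most similar to recovered state k
perm = np.argmax(best_model.emissionprob_ @ gen_model.emissionprob_.T, axis=1)
print(perm)                     # [1 0]: the labels are swapped
aligned_states = perm[states]   # recovered sequence relabeled to match gen_states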