# Hidden Markov models

## Introduction

Markov chains are a useful tool for analyzing categorical longitudinal data. However, sometimes the Markov process cannot be directly observed, though some output, dependent on the (hidden) state, is visible. More precisely, we assume that the distribution of this observable output depends on the underlying hidden state. Such models are called hidden Markov models (HMMs). HMMs can be applied in many contexts and have turned out to be particularly pertinent in several biological contexts. For example, they are useful when characterizing diseases for which the existence of several discrete stages of illness is a realistic assumption, e.g., epilepsy and migraines.

Here, we will consider a parametric framework with Markov chains in a discrete and finite state space $\mathbf{K} = \{1,\ldots,K\}$.

## Mixed hidden Markov models

HMMs have been developed to describe how a given system moves from one state to another over time, in situations where the successive visited states are unknown and a set of observations is the only available information to describe the dynamics of the system. HMMs can be seen as a variant of mixture models that allow for possible memory in the sequence of hidden states. An HMM is thus defined as a pair of processes $(z_j,y_j, j=1,2,\ldots)$, where the latent sequence $(z_j)$ is a Markov chain and where the distribution of the observation $y_j$ at time $t_j$ depends on the state $z_j$.

*Figure: dynamics of a hidden Markov model.*

In a population approach, HMMs from several individuals can be described simultaneously by considering mixed HMMs. Let $y_i=\left(y_{i,1},\ldots,y_{i,n_i}\right)$ and $z_i= \left(z_{i,1}, \ldots,z_{i,n_i}\right)$ denote respectively the sequences of observations and hidden states for individual $i$.

We suppose that the joint distribution of $(z_i,y_i)$ is a parametric distribution that depends on a vector of parameters $\psi_i$ and can be decomposed as

$$p(z_i,y_i \mid \psi_i) = p(z_i \mid \psi_i) \, p(y_i \mid z_i,\psi_i) . \quad (1)$$

For each individual $i$, $z_i$ is a Markov chain whose probability distribution is defined by

• the distribution $\pi_{i,1} = (\pi_{i,1}^{k},\ k=1,2,\ldots,K)$ of the first state $z_{i,1}$:

$$\pi_{i,1}^{k} = \mathbb{P}(z_{i,1} = k \mid \psi_i) .$$

• the sequence of transition matrices $(Q_{i,j} \ ; \, j=2,3,\ldots)$, where for each $j$, $Q_{i,j} = (q_{i,j}^{\ell,k} \ ; \, 1\leq \ell,k \leq K)$ is a matrix of size $K \times K$ such that $q_{i,j}^{\ell,k} = \mathbb{P}(z_{i,j} = k \mid z_{i,j-1}=\ell , \psi_i)$.
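These two ingredients are all that is needed to simulate a hidden state sequence. The following sketch (a hypothetical illustration, not part of the original material) draws a chain from an initial distribution and a fixed transition matrix, i.e., it assumes $Q_{i,j} = Q$ for all $j$:

```python
import numpy as np

def simulate_markov_chain(pi1, Q, n, rng=None):
    """Simulate n states of a Markov chain with initial distribution pi1
    and a fixed K x K transition matrix Q (each row sums to 1)."""
    rng = np.random.default_rng(rng)
    K = len(pi1)
    z = np.empty(n, dtype=int)
    z[0] = rng.choice(K, p=pi1)              # draw the first state from pi1
    for j in range(1, n):
        z[j] = rng.choice(K, p=Q[z[j - 1]])  # transition from the previous state
    return z

# 3-state example: the chain tends to stay in its current state
pi1 = np.array([1.0, 0.0, 0.0])
Q = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
z = simulate_markov_chain(pi1, Q, n=100, rng=0)
```

With $\pi_{i,1} = (1,0,0)$ the chain always starts in state 1 (index 0 here); the diagonal dominance of $Q$ produces long runs within each state.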

*Figure: transitions of a Markov chain with 3 states.*

The conditional distribution $p(y_i \mid z_i, \psi_i)$ depends on the model for the observations: in each state, the observation $y_{ij}$ has a given distribution. Let us look at some examples.

### Examples

1. In a continuous data model, one possibility is to let the residual error model switch randomly between $K$ possible error models, with the switching governed by a hidden Markov chain.

**Example 1**

In this example, we consider a 2-state Markov chain. A constant error model is assumed in each state:

$$\begin{eqnarray} y_{ij} &=& \sin(\alpha \, t_{ij}) + a_{i,1} \tilde{\varepsilon}_{ij} \quad \text{if } z_{ij}=1 \\ y_{ij} &=& \sin(\alpha \, t_{ij}) + a_{i,2} \tilde{\varepsilon}_{ij} \quad \text{if } z_{ij}=2. \end{eqnarray}$$

The figure below displays simulated data from this model for 4 individuals. Observations drawn from state 1 (resp. state 2) are displayed in magenta (resp. black). Of course, the states are unknown in the case of hidden Markov models, i.e., only the values are observed in practice, not the colors.
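Data such as those in the figure can be generated directly from the two equations above. Here is a minimal sketch for one individual, assuming standard normal residuals $\tilde{\varepsilon}_{ij}$, equal initial state probabilities, and illustrative values for $\alpha$, $a_{i,1}$, $a_{i,2}$ and the transition matrix:

```python
import numpy as np

def simulate_example1(t, alpha, a1, a2, Q, rng=None):
    """Simulate y_j = sin(alpha * t_j) + a_{z_j} * eps_j, eps_j ~ N(0, 1),
    where (z_j) is a 2-state Markov chain with transition matrix Q."""
    rng = np.random.default_rng(rng)
    n = len(t)
    z = np.empty(n, dtype=int)
    z[0] = rng.integers(2)                   # state 0 or 1, equally likely
    for j in range(1, n):
        z[j] = rng.choice(2, p=Q[z[j - 1]])  # transition from previous state
    a = np.where(z == 0, a1, a2)             # state-dependent noise level
    y = np.sin(alpha * t) + a * rng.standard_normal(n)
    return z, y

t = np.linspace(0, 10, 50)
Q = np.array([[0.9, 0.1],
              [0.1, 0.9]])
z, y = simulate_example1(t, alpha=1.0, a1=0.05, a2=0.5, Q=Q, rng=1)
```

Choosing $a_{i,1} \ll a_{i,2}$ makes the two regimes visually distinct: the sine curve is traced sharply in one state and buried in noise in the other, while $z$ itself would be hidden in practice.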

2. In a Poisson model for count data, the Poisson parameter might randomly switch between $K$ intensities. Such models have been used for describing the evolution of seizures in epileptic patients:

**Example 2**

Instead of assuming a single Poisson distribution for the observed numbers of seizures, this model assumes that patients go through alternating periods of low and high epileptic susceptibility. Therefore we consider what is called a 2-state Poisson mixed-HMM:

$$\begin{eqnarray} y_{ij} &\sim& {\rm Poisson}(\lambda_{i,1}) \quad \text{if } z_{ij}=1 \\ y_{ij} &\sim& {\rm Poisson}(\lambda_{i,2}) \quad \text{if } z_{ij}=2. \end{eqnarray}$$
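A simulation sketch of this 2-state Poisson HMM for one patient follows; the intensities, initial distribution, and transition probabilities below are hypothetical values chosen for illustration:

```python
import numpy as np

def simulate_poisson_hmm(n, pi1, Q, lam, rng=None):
    """Simulate counts y_j ~ Poisson(lam[z_j]), where (z_j) is a
    Markov chain with initial distribution pi1 and transition matrix Q."""
    rng = np.random.default_rng(rng)
    K = len(pi1)
    z = np.empty(n, dtype=int)
    z[0] = rng.choice(K, p=pi1)
    for j in range(1, n):
        z[j] = rng.choice(K, p=Q[z[j - 1]])
    y = rng.poisson(lam[z])            # state-dependent Poisson intensity
    return z, y

pi1 = np.array([0.5, 0.5])
Q = np.array([[0.95, 0.05],            # low-susceptibility periods persist
              [0.10, 0.90]])
lam = np.array([1.0, 8.0])             # hypothetical low/high intensities
z, y = simulate_poisson_hmm(200, pi1, Q, lam, rng=0)
```

The near-diagonal $Q$ encodes the alternating periods of low and high epileptic susceptibility: once a patient enters a period, they tend to remain in it for several observation times.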

## Distributions of observations

Assuming that the $N$ individuals are independent, the joint pdf is given by:

$$p(y_1,\ldots,y_N \mid \psi_1,\ldots,\psi_N ) = \prod_{i=1}^{N} p(y_i \mid \psi_i). \quad (2)$$

Then, computing the conditional distribution of the observations $p(y_i \mid \psi_i)$ for any individual $i$ requires summing the joint conditional distribution $p(y_i, z_i \mid \psi_i)$ over all possible state sequences:

$$\begin{eqnarray} p(y_i \mid \psi_i) &=& \sum_{z_i \in \mathbf{K}^{n_i} } p(z_i, y_i \mid \psi_i) \\ &=& \sum_{z_i \in \mathbf{K}^{n_i} } p(z_i \mid \psi_i) \, p(y_i \mid z_i,\psi_i) \\ &=& \sum_{z_i \in \mathbf{K}^{n_i} } \left\{ \pi_{i,1}^{z_{i,1} } \, p(y_{i,1} \mid z_{i,1},\psi_i) \prod_{j=2}^{n_i} \left( q_{i,j}^{z_{i,j-1},z_{i,j} } \, p(y_{i,j} \mid z_{i,j},\psi_i) \right) \right\} . \end{eqnarray}$$

Though this sum has $K^{n_i}$ terms, the forward recursion of the Baum-Welch (forward-backward) algorithm provides an efficient way to compute it numerically.

## Bibliography

- Albert, P. S. (1991). A two-state Markov mixture model for a time series of epileptic seizure counts. *Biometrics* 47:1371-1381.
- Altman, R. M. (2007). Mixed hidden Markov models: an extension of the hidden Markov model to the longitudinal data setting. *Journal of the American Statistical Association* 102:201-210.
- Anisimov, V. V., Maas, H. J., Danhof, M., Della Pasqua, O. (2007). Analysis of responses in migraine modelling using hidden Markov models. *Statistics in Medicine* 26:4163-4178.
- Cappé, O., Moulines, E., Rydén, T. (2005). *Inference in Hidden Markov Models*. Springer Series in Statistics.
- Chaubert-Pereira, F., Guédon, Y., Lavergne, C., Trottier, C. (2011). Markov and semi-Markov switching linear mixed models used to identify forest tree growth components. *Biometrics* 66:753-762.
- Delattre, M., Lavielle, M. (2012). Maximum likelihood estimation in discrete mixed hidden Markov models using the SAEM algorithm. *Computational Statistics & Data Analysis*.
- Delattre, M., Savic, R. M., Miller, R., Karlsson, M. O., Lavielle, M. (2012). Analysis of exposure-response of CI-945 in patients with epilepsy: application of novel mixed hidden Markov modeling methodology. *Journal of Pharmacokinetics and Pharmacodynamics*, pp. 1-9.
- Maruotti, A., Rydén, T. (2009). A semiparametric approach to hidden Markov models under longitudinal observations. *Statistics and Computing* 19:381-393.
- Rabiner, L. R. (1989). A tutorial on hidden Markov models and selected applications in speech recognition. *Proceedings of the IEEE* 77:257-286.
- Rijmen, F., Ip, E. H., Rapp, S., Shaw, E. G. (2008). Qualitative longitudinal analysis of symptoms in patients with primary and metastatic brain tumours. *Journal of the Royal Statistical Society, Series A* 171(3):739-753.