Modeling the individual parameters
In the The individual approach section, we introduced the modeling approach for a single individual whose response variable depends on the parameter $\psi$. In the population approach, we now suppose that each individual $i$ has its own individual parameter $\psi_i$ and, more importantly, that this $\psi_i$ comes from some probability distribution $\qpsii$.
In this chapter, we are interested in the description, representation and implementation of these individual parameter distributions $\qpsii$. Generally speaking, we assume that individuals are independent, so in what follows it suffices to take a closer look at the distribution $\qpsii$ for a single individual $i$.
If $\qpsii$ is a parametric distribution that depends on a vector $\theta$ of population parameters and a set of individual covariates $c_i=(c_{i,1} , c_{i,2},\ldots, c_{i,L})$, this dependence can be stated explicitly:
\(
\psi_i \sim \qpsii(\, \cdot \, ; c_i, \theta) . \qquad (1)
\)
The distribution $\qpsii$ plays a fundamental role since it describes the interindividual variability of the individual parameter $\psi_i$. It achieves two things:
- A definition of a predicted value $\hpsi_i$ of $\psi_i$ for a given vector of covariates $c_i$ and a given population parameter $\theta$, i.e., a "typical" value of the individual parameter $\psi_i$ for individuals who share the same covariates in a given population.
- A description of how the individual parameter $\psi_i$ fluctuates around its predicted value $\hpsi_i$. In other words, it describes the distribution of the individual parameters for individuals who share the same covariates $c_i$.
This means that modeling the individual parameters reduces to describing these two properties of the distribution $\qpsii$. We can imagine all sorts of discrete or continuous distributions and linear or nonlinear covariate models to define $\hpsi_i$. Nevertheless, we must remember that in the modeling context the parameters $\psi_i$ are not themselves observed. We will therefore prefer models whose structure makes them both identifiable and interpretable.
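As an illustration of these two roles of $\qpsii$, consider one common concrete choice: a log-normal distribution whose log-scale location depends on a single covariate. Everything below (the distribution, the covariate, the parameter values) is a hypothetical sketch, not a specification from the text:

```python
import numpy as np

# Hypothetical population parameters theta for a log-normal model whose
# log-scale location depends linearly on one covariate (e.g. weight).
theta = {"beta0": 1.0, "beta1": 0.01, "omega": 0.3}

def predicted(c_i, theta):
    """Predicted ("typical") value hpsi_i for covariate c_i: the median here."""
    return np.exp(theta["beta0"] + theta["beta1"] * c_i)

def sample(c_i, theta, n, rng):
    """Draw n individual parameters psi_i for individuals sharing covariate c_i."""
    mu = np.log(predicted(c_i, theta))
    return np.exp(rng.normal(mu, theta["omega"], size=n))

rng = np.random.default_rng(0)
psi = sample(70.0, theta, n=100_000, rng=rng)
# The samples fluctuate around the predicted value: np.median(psi)
# is close to predicted(70.0, theta).
```

The first function plays the role of the "typical" value for a given $c_i$ and $\theta$; the second describes the fluctuation of $\psi_i$ around it.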
Example distributions built from the normal distribution are proposed in the Gaussian models section, and continuous and categorical covariate models are presented in the The covariate model section.
Rather than defining $\psi_i$ using a probability distribution as in (1), we can instead use equations:
\(
\psi_i = \model(\bbeta, c_i, \eta_i) , \qquad (2)
\)
where $\bbeta$ is a vector of fixed effects and $\eta_i$ a vector of random effects, i.e., a vector of zero-mean random variables: $\esp{\eta_i}=0$. The predicted value $\hpsi_i$ is then seen as the value of $\psi_i$ obtained when the random effects are set to zero:
\(
\hpsi_i = \model(\bbeta, c_i, \eta_i \equiv 0) . \qquad (3)
\)
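The representation (2)–(3) can be sketched directly. The structural transformation $\model$ below (a log-normal with one covariate) and all numerical values are illustrative assumptions, not the text's specification:

```python
import numpy as np

def model(beta, c_i, eta_i):
    """Structural transformation psi_i = M(beta, c_i, eta_i), eq. (2).

    Illustrative choice: the fixed effects beta and covariate c_i set the
    log-scale location, and the zero-mean random effect eta_i perturbs it.
    """
    return np.exp(beta[0] + beta[1] * c_i + eta_i)

beta = np.array([1.0, 0.01])        # hypothetical fixed effects
c_i = 70.0                          # hypothetical covariate value

rng = np.random.default_rng(1)
eta_i = rng.normal(0.0, 0.3)        # zero-mean random effect
psi_i = model(beta, c_i, eta_i)     # individual value, eq. (2)
psi_hat = model(beta, c_i, 0.0)     # predicted value, eq. (3): eta_i = 0
```

Setting `eta_i` to zero in the same transformation recovers the predicted value $\hpsi_i$, exactly as in (3).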
The pros and cons of the two approaches are discussed in the Description, representation and implementation of a model section. We will show that both representations can be used with the various models presented in the Gaussian models and The covariate model sections.
A multivariate representation of the distribution of $\psi_i$ is given in the Extension to multivariate distributions section for the case where the random effects vector $\eta_i$ is Gaussian. In this case, under fairly general hypotheses, we can explicitly calculate the likelihood function.
Here, the distribution of the vector of random effects is completely defined by its variance-covariance matrix $\Omega$. The vector of population parameters $\theta$ then contains the vector $\bbeta$ of fixed effects and the variance-covariance matrix $\Omega$.
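A sketch of this Gaussian setting, where $\theta=(\bbeta,\Omega)$; the dimension of $\psi_i$ and the values of $\bbeta$ and $\Omega$ below are hypothetical:

```python
import numpy as np

# Hypothetical theta = (beta, Omega) for a two-dimensional individual
# parameter vector psi_i, log-normally distributed.
beta = np.array([0.5, 3.0])                  # fixed effects (log scale)
Omega = np.array([[0.09, 0.03],
                  [0.03, 0.25]])             # variance-covariance of eta_i

rng = np.random.default_rng(2)
eta = rng.multivariate_normal(np.zeros(2), Omega, size=1000)  # eta_i ~ N(0, Omega)
psi = np.exp(beta + eta)                     # individual parameter vectors

emp_cov = np.cov(np.log(psi), rowvar=False)  # close to Omega for large samples
```

The empirical covariance of $\log\psi_i$ recovers $\Omega$, since here the random effects act additively on the log scale.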
Several extensions are possible:
- We can suppose that the individual parameters of a given individual fluctuate over time. In this case, the model needs to describe the intraindividual variability of the individual parameters.
- We can also suppose that the individuals are not in fact independent. The model then requires us to describe the interindividual dependencies of the individual parameters.
Some of these models that incorporate differing types of variability are presented in the Additional levels of variability section.