# Rothenberg (1971)

Identification in Parametric Models

These notes are based on Rothenberg's 1971 article, "Identification in Parametric Models."

Given an observed sample, we want to perform inference about an underlying structure. Identification refers to the question of whether such inference is even possible and, if so, under what conditions.

Rothenberg is interested in conditions for identification in general parametric models, with rank conditions for linear models presented as a special case. The structural identification problem is broader still than the parametric one; see Koopmans and Reiersøl (1950) for a general formulation.

The conditions laid out are related to the information matrix, which measures the information available from the sample about the parameters. Lack of identification is simply lack of sufficient information to distinguish between different structures.

The results are for unconditional models $f\left(y\right)$, but extensions to conditional models $f\left(y\mid x\right)$ are straightforward.

## Definitions

Let $Y\in {ℝ}^{n}$ and suppose that the distribution of $Y$ is a member of some family of distribution functions $ℱ$. A structure $S$ is a set of hypotheses which imply a unique distribution $F\left(S\right)\in ℱ$. Let $𝒮$ be the collection of all structures.

Two structures $S$ and $S\prime$ in $𝒮$ are observationally equivalent if they imply the same distribution of $Y$: $F\left(S\right)=F\left(S\prime \right)$. A structure $S\in 𝒮$ is identifiable if there is no other structure $S\prime$ in $𝒮$ that is observationally equivalent to it.

These definitions apply to general sets $𝒮$ and $ℱ$, but can be specialized to the case of parametric models. Suppose that every structure in $𝒮$ can be described by a real vector $\alpha \in {ℝ}^{m}$ and let $A\subset {ℝ}^{m}$ be the set of all possible values of $\alpha$. Now, suppose that the distribution of $Y$ is known to be a continuous density function of the form $f\left(y,\alpha \right)$ for some unknown value of $\alpha$. Let $ℱ=\left\{f\left(y,\alpha \right):\alpha \in A\right\}$ be the parametric family of all such functions. We have thus reduced the problem of identifying a general structure to that of identifying a single point in the parameter space $A$.

We can now restate these definitions for a generic parametric model. Two parameters (structures) $\alpha$ and $\alpha \prime$ in $A$ are observationally equivalent if they imply the same distribution of $Y$: $f\left(y,\alpha \right)=f\left(y,\alpha \prime \right)$ for all $y\in {ℝ}^{n}$. Similarly, a parameter ${\alpha }^{0}\in A$ is identifiable if there is no other parameter $\alpha$ in $A$ that is observationally equivalent to it.

Rothenberg considers two notions of identification—local and global. Global identification is simply a more precise way of labeling the definition above, that no other $\alpha$ in the entire parameter space is observationally equivalent to ${\alpha }^{0}$. Local identification relaxes this requirement, and only requires there to be some open set containing ${\alpha }^{0}$ over which no other $\alpha$ is observationally equivalent. Clearly, global identification implies local identification.
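As a small numerical illustration (the model here is my own, not from the paper), take $Y\sim N\left({\alpha }^{2},1\right)$: $\alpha$ and $-\alpha$ imply the same distribution, so ${\alpha }^{0}=1$ is locally identifiable (no equivalent point in a small neighborhood) but not globally identifiable.

```python
import numpy as np

def density(y, alpha):
    """Density of the illustrative model Y ~ N(alpha**2, 1)."""
    return np.exp(-0.5 * (y - alpha**2) ** 2) / np.sqrt(2 * np.pi)

y_grid = np.linspace(-5.0, 5.0, 201)

# alpha = 1 and alpha = -1 imply identical distributions of Y,
# so alpha0 = 1 is NOT globally identifiable ...
assert np.allclose(density(y_grid, 1.0), density(y_grid, -1.0))

# ... but on the open set (0.5, 1.5) no other alpha matches alpha0 = 1,
# so alpha0 = 1 IS locally identifiable.
for a in np.linspace(0.51, 1.49, 50):
    if not np.isclose(a, 1.0):
        assert not np.allclose(density(y_grid, a), density(y_grid, 1.0))
```

The sign flip is the whole obstruction here: restricting $A$ to the positive half-line would restore global identification.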

So, to summarize, the (parametric) identification problem is to find conditions on $f\left(y,\alpha \right)$ and $A$ which guarantee that ${\alpha }^{0}$ is identifiable, at least locally, but perhaps globally.

A parametric model is regular if the following conditions hold:

• $A$ is an open set,
• $f\left(y,\alpha \right)$ is a proper density for all $\alpha$,
• the support of $y$ under $f\left(y,\alpha \right)$ is the same for all $\alpha$,
• $f$ is smooth in $\alpha$,
• the elements ${r}_{\mathrm{ij}}\left(\alpha \right)$ of the information matrix $R\left(\alpha \right)$ are continuous functions of $\alpha$.

The results that follow are for regular parametric models.

The information matrix is the usual matrix $R\left(\alpha \right)=\left[{r}_{\mathrm{ij}}\left(\alpha \right)\right]=\mathrm{E}\left[\frac{\partial \mathrm{ln}f}{\partial {\alpha }_{i}}\frac{\partial \mathrm{ln}f}{\partial {\alpha }_{j}}\right].$

A point ${\alpha }^{0}$ is a regular point of a matrix $M\left(\alpha \right)$ if there exists an open set containing ${\alpha }^{0}$ over which $M\left(\alpha \right)$ has constant rank.

## Local Identification

Rothenberg’s main result on local identification in regular models is the following.

Theorem: If ${\alpha }^{0}$ is a regular point of the information matrix $R\left(\alpha \right)$, then ${\alpha }^{0}$ is locally identifiable if and only if $R\left({\alpha }^{0}\right)$ is nonsingular.
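A hedged sketch of the theorem at work (the model is my own choice): in $Y\sim N\left({\alpha }_{1}+{\alpha }_{2},1\right)$ only the sum enters the density, so the model is not locally identified, and the theorem says the information matrix must be singular. Estimating $R\left({\alpha }^{0}\right)$ by the Monte Carlo average of score outer products makes the rank deficiency visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model: Y ~ N(alpha1 + alpha2, 1). Only the sum
# alpha1 + alpha2 is pinned down by the distribution of Y.
alpha0 = np.array([0.3, 0.7])
y = rng.normal(loc=alpha0.sum(), scale=1.0, size=100_000)

# ln f(y, alpha) = -0.5 * (y - a1 - a2)**2 + const, so the score is
# d ln f / d alpha_i = (y - a1 - a2) for BOTH i = 1, 2.
score = y - alpha0.sum()
scores = np.column_stack([score, score])

# Monte Carlo estimate of R(alpha0) = E[score score'].
R = scores.T @ scores / len(y)

# rank 1 < m = 2: R is singular, so alpha0 is not locally identifiable.
print(np.linalg.matrix_rank(R))
```

Reparameterizing to $\left({\alpha }_{1}+{\alpha }_{2},{\alpha }_{1}-{\alpha }_{2}\right)$ would isolate the identified direction (the sum) from the unidentified one.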

Following this is a theorem for the case where $\alpha$ is known to satisfy a set of constraints of the form $\varphi \left(\alpha \right)=0$: there, local identifiability of ${\alpha }^{0}$ is equivalent to a rank condition on the matrix formed by stacking $R\left({\alpha }^{0}\right)$ on the Jacobian of $\varphi$ at ${\alpha }^{0}$.

## Global Identification

Proving global identification is harder. Two cases are considered.

A density $g\left(y,\alpha \right)$ is a member of the exponential family of densities if its logarithm can be written $\mathrm{ln}g\left(y,\alpha \right)=A\left(y\right)+B\left(\alpha \right)+\sum _{i=1}^{m}{\alpha }_{i}{D}_{i}\left(y\right)$ where $B\left(\alpha \right)$ is differentiable in $\alpha$.
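For concreteness (my own example, not from the paper), the normal density with unit variance and mean $\alpha$ has this form: $\mathrm{ln}f\left(y,\alpha \right)=\left(-\tfrac{1}{2}\mathrm{ln}\left(2\pi \right)-\tfrac{{y}^{2}}{2}\right)-\tfrac{{\alpha }^{2}}{2}+\alpha y,$ so $A\left(y\right)=-\tfrac{1}{2}\mathrm{ln}\left(2\pi \right)-\tfrac{{y}^{2}}{2}$, $B\left(\alpha \right)=-\tfrac{{\alpha }^{2}}{2}$, and ${D}_{1}\left(y\right)=y$.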

Theorem: If $f\left(y,\alpha \right)$ is a member of the exponential family and $R\left(\alpha \right)$ is nonsingular in a convex set containing $A$, then every $\alpha \in A$ is globally identifiable.

Theorem: Suppose there exist $m$ known functions ${\varphi }_{1}\left(Y\right),\dots ,{\varphi }_{m}\left(Y\right)$ such that ${\alpha }_{i}=\mathrm{E}\left[{\varphi }_{i}\left(Y\right)\right]$ for all $i=1,\dots ,m$ and all $\alpha \in A$, where the expectation is taken under $f\left(y,\alpha \right)$. Then every $\alpha \in A$ is globally identifiable.
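A hedged illustration (my own reparameterization, not the paper's): write $Y\sim N\left(\mu ,{\sigma }^{2}\right)$ in terms of $\alpha =\left(\mathrm{E}\left[Y\right],\mathrm{E}\left[{Y}^{2}\right]\right)$, so the known functions ${\varphi }_{1}\left(y\right)=y$ and ${\varphi }_{2}\left(y\right)={y}^{2}$ satisfy the theorem's condition, and sample moments consistently recover $\alpha$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative model: Y ~ N(mu, sigma^2), reparameterized as
# alpha = (E[Y], E[Y^2]) = (mu, mu^2 + sigma^2), so that the theorem's
# condition alpha_i = E[phi_i(Y)] holds with phi_1(y) = y, phi_2(y) = y^2.
mu, sigma = 1.5, 2.0
alpha = np.array([mu, mu**2 + sigma**2])

y = rng.normal(mu, sigma, size=200_000)

# Sample analogues of E[phi_i(Y)] recover alpha (law of large numbers),
# which is the sense in which alpha is globally identified here.
alpha_hat = np.array([y.mean(), (y**2).mean()])

print(np.abs(alpha_hat - alpha).max())  # small sampling error
```

The map back to the original parameters, $\mu ={\alpha }_{1}$ and ${\sigma }^{2}={\alpha }_{2}-{\alpha }_{1}^{2}$, is one-to-one, so $\left(\mu ,{\sigma }^{2}\right)$ inherits global identification.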