# Rothenberg (1971)

These notes are based on the following article:

- Rothenberg, Thomas J. (1971).
Identification in Parametric Models.
*Econometrica*, 39, 577–591.

Given an observed sample, we want to perform inference about an
underlying structure.
*Identification* refers to the question of whether such inference is
even possible, and under what conditions.

Rothenberg is interested in conditions for identification in general parametric models, with rank conditions for linear models presented as a special case. The structural identification problem is more general than the parametric one; see Koopmans and Reiersøl (1950) for a general formulation.

The conditions laid out are related to the information matrix, which measures the information available from the sample about the parameters. Lack of identification is simply lack of sufficient information to distinguish between different structures.

The results are for unconditional models $f(y)$, but extensions to conditional models $f(y\mid x)$ are straightforward.

## Definitions

Let $Y\in {\mathbb{R}}^{n}$ and suppose that the distribution of $Y$ is a member of some family of distribution functions $\mathcal{F}$. A structure $S$ is a set of hypotheses which imply a unique distribution $F(S)\in \mathcal{F}$. Let $\mathcal{S}$ be the collection of all structures.

Two structures $S$ and $S'$ in $\mathcal{S}$ are *observationally
equivalent* if they imply the same distribution of $Y$: $F(S)=F(S')$.
A structure $S\in \mathcal{S}$ is *identifiable* if there is no other
structure $S'$ in $\mathcal{S}$ which is observationally equivalent to it.

These definitions apply to general sets $\mathcal{S}$ and $\mathcal{F}$, but can be specialized to the case of parametric models. Suppose that every structure in $\mathcal{S}$ can be described by a real vector $\alpha \in {\mathbb{R}}^{m}$ and let $A\subset {\mathbb{R}}^{m}$ be the set of all possible values of $\alpha $. Now, suppose that the distribution of $Y$ is known to be a continuous density function of the form $f(y,\alpha )$ for some unknown value of $\alpha $. Let $\mathcal{F}=\{f(y,\alpha ):\alpha \in A\}$ be the parametric family of all such functions. We have thus reduced the problem of identifying a general structure to that of identifying a single point in the parameter space $A$.

We can now restate the definitions above in terms of a generic
parametric model.
Two parameters (structures) $\alpha $ and $\alpha'$ in $A$ are
*observationally equivalent* if they imply the same distribution of
$Y$: $f(y,\alpha )=f(y,\alpha')$ for all $y\in {\mathbb{R}}^{n}$.
Similarly, a parameter ${\alpha}^{0}\in A$ is *identifiable* if there is
no other parameter $\alpha $ in $A$ which is observationally
equivalent.
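As a concrete illustration (a toy model, not from the paper), suppose $Y\sim N({\alpha}_{1}+{\alpha}_{2},1)$. Only the sum of the two parameters enters the density, so any two parameter vectors with the same sum are observationally equivalent, and no point in the parameter space is identifiable. A minimal symbolic check, assuming sympy:

```python
import sympy as sp

y, a1, a2 = sp.symbols("y a1 a2", real=True)

# Density of Y ~ N(a1 + a2, 1): only the sum a1 + a2 enters f.
f = sp.exp(-(y - (a1 + a2)) ** 2 / 2) / sp.sqrt(2 * sp.pi)

# Two distinct structures with the same sum...
f_A = f.subs({a1: 1, a2: 2})
f_B = f.subs({a1: 0, a2: 3})

# ...imply the same distribution of Y: they are observationally
# equivalent, so neither parameter vector is identifiable.
assert sp.simplify(f_A - f_B) == 0
```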

Rothenberg considers two notions of identification—local and global.
*Global identification* is simply a more precise label for the
definition above: no other $\alpha $ *in the entire parameter
space* is observationally equivalent to ${\alpha}^{0}$.
*Local identification* relaxes this requirement, and only requires
there to be some open set containing ${\alpha}^{0}$ over which no other
$\alpha $ is observationally equivalent.
Clearly, global identification implies local identification.

So, to summarize, the (parametric) identification problem is to find conditions on $f(y,\alpha )$ and $A$ which guarantee that ${\alpha}^{0}$ is identifiable, at least locally and ideally globally.

A parametric model is *regular* if the following conditions hold:

- $A$ is an open set,
- $f(y,\alpha )$ is a proper density for all $\alpha $,
- the support of $y$ under $f(y,\alpha )$ is the same for all $\alpha $,
- $f$ is smooth in $\alpha $,
- the elements ${r}_{ij}(\alpha )$ of the information matrix $R(\alpha )$ are continuous functions of $\alpha $.

The results that follow are for regular parametric models.

The *information matrix* is the usual matrix
$$R(\alpha )=[{r}_{ij}(\alpha )]=\mathrm{E}\left[\frac{\partial \ln f}{\partial {\alpha}_{i}}\frac{\partial \ln f}{\partial {\alpha}_{j}}\right].$$

A point ${\alpha}^{0}$ is a *regular point* of a matrix $M(\alpha )$ if
there exists an open set containing ${\alpha}^{0}$ over which $M(\alpha )$
has constant rank.

## Local Identification

Rothenberg’s main result on local identification in regular models is the following.

**Theorem:** If ${\alpha}^{0}$ is a regular point of the information
matrix $R(\alpha )$, then ${\alpha}^{0}$ is locally identifiable if and
only if $R({\alpha}^{0})$ is nonsingular.
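A sketch of how this rank condition can be checked symbolically, assuming sympy and reusing the toy model $Y\sim N({\alpha}_{1}+{\alpha}_{2},1)$ in which only the sum of the parameters enters the density. Both scores coincide, so the information matrix is singular everywhere and no point is locally identifiable:

```python
import sympy as sp

y, a1, a2 = sp.symbols("y a1 a2", real=True)

# Toy model: Y ~ N(a1 + a2, 1); only the sum of the parameters enters.
logf = -(y - (a1 + a2)) ** 2 / 2 - sp.log(2 * sp.pi) / 2
f = sp.exp(logf)

# Scores: both partial derivatives equal y - (a1 + a2).
scores = [sp.diff(logf, a) for a in (a1, a2)]

# Evaluate R(alpha) = E[score_i * score_j] at a hypothetical point (1, 2).
a0 = {a1: 1, a2: 2}
R = sp.simplify(sp.Matrix(2, 2, lambda i, j: sp.integrate(
    (scores[i] * scores[j] * f).subs(a0), (y, -sp.oo, sp.oo))))

print(R)         # Matrix([[1, 1], [1, 1]])
print(R.rank())  # rank 1 < 2: R is singular, alpha not locally identified
```

By contrast, reparametrizing the same family by its mean alone would give $R=1>0$, recovering local identification, in line with the theorem.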

Following this is a theorem considering the case where $\alpha $ is known to satisfy a set of constraints of the form $\varphi (\alpha )=0$.

## Global Identification

Proving global identification is harder. Two cases are considered.

A density $g(y,\alpha )$ is a member of the *exponential family of
densities* if its logarithm can be written
$$\ln g(y,\alpha )=A(y)+B(\alpha )+\sum _{i=1}^{m}{\alpha}_{i}{D}_{i}(y)$$
where $B(\alpha )$ is differentiable in $\alpha $.
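For example (a standard check, not from the paper), the $N(\alpha ,1)$ density belongs to the exponential family: its logarithm decomposes additively as
$$\ln f(y,\alpha )=-\frac{(y-\alpha )^{2}}{2}-\frac{1}{2}\ln 2\pi =\underbrace{-\frac{y^{2}}{2}-\frac{1}{2}\ln 2\pi}_{A(y)}+\underbrace{\left(-\frac{{\alpha}^{2}}{2}\right)}_{B(\alpha )}+{\alpha}\,\underbrace{y}_{{D}_{1}(y)},$$
with $m=1$ and ${D}_{1}(y)=y$.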

**Theorem:** If $f(y,\alpha )$ is a member of the exponential family
and $R(\alpha )$ is nonsingular in a convex set containing $A$, then
every $\alpha \in A$ is globally identifiable.

**Theorem:** Suppose there exist $m$ known functions
${\varphi}_{1}(Y),\dots ,{\varphi}_{m}(Y)$ such that for all $\alpha \in A$,
${\alpha}_{i}=\mathrm{E}[{\varphi}_{i}(Y)]$ for all $i=1,\dots ,m$.
Then every $\alpha \in A$ is globally identifiable.
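As an illustration of this moment condition (a hypothetical reparametrization, assuming numpy; not from the paper), index the normal family by its first two raw moments, ${\alpha}_{1}=\mathrm{E}[Y]$ and ${\alpha}_{2}=\mathrm{E}[{Y}^{2}]$, so that ${\varphi}_{1}(y)=y$ and ${\varphi}_{2}(y)={y}^{2}$. Each ${\alpha}_{i}$ is then pinned down directly by the distribution of $Y$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reparametrization of N(mu, sigma^2) by its first two raw
# moments: alpha1 = E[Y] = mu, alpha2 = E[Y^2] = mu^2 + sigma^2.
mu, sigma = 1.5, 2.0
alpha = np.array([mu, mu**2 + sigma**2])

y = rng.normal(mu, sigma, size=200_000)

# Known functions phi_1(y) = y, phi_2(y) = y^2 with alpha_i = E[phi_i(Y)]:
# the sample moments recover alpha, so alpha is globally identified.
alpha_hat = np.array([y.mean(), (y**2).mean()])

print(alpha, alpha_hat)  # the moment estimates converge to alpha
```

Since the map from $\alpha $ to these two moments is the identity, two distinct values of $\alpha $ cannot imply the same distribution of $Y$, which is exactly global identification.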

## References

- Koopmans, T.C. and O. Reiersøl (1950).
The Identification of Structural Characteristics.
*Annals of Mathematical Statistics*, 21, 165–181.