# Heckman and Honoré (1989)

The Identifiability of the Competing Risks Model

These notes are based on the following article:

Heckman, James J. and Bo E. Honoré (1989). The identifiability of the competing risks model. Biometrika, 76: 325–330.

## The Classical Competing Risks Model

• Suppose there are $J$ competing causes of death $\left\{1,2,\dots ,J\right\}$.

• Associated with each cause of death is a stochastic failure time ${T}_{j}$.

• We observe only the distribution of the identified minimum:

• The time of death $T={\mathrm{min}}_{j}{T}_{j}$.

• The cause of death $I=\mathrm{arg}{\mathrm{min}}_{j}{T}_{j}$.

• Goal: Identify the joint distribution of the latent failure times given that we only observe the distribution of the identified minimum.

• Note that we aren’t considering regressors yet.

## Cox and Tsiatis Nonidentification Theorem

• For any joint distribution of latent failure times, there exists another such distribution with independent failure times that yields the same distribution of the minimum (Cox, 1959, 1962; Tsiatis, 1975).

• That is, given r.v.’s $\left({T}_{1},{T}_{2},\dots ,{T}_{J}\right)$ there exist $\left({S}_{1},{S}_{2},\dots ,{S}_{J}\right)$ with ${S}_{i}⫫{S}_{j}$ for all $i\ne j$ such that $\left(T,{I}_{T}\right)$ and $\left(S,{I}_{S}\right)$ are observationally equivalent.

• In light of this result, empirical work has had to proceed by placing structure on the form of dependence across risks, for example by assuming independence.
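
The construction behind this theorem can be illustrated by simulation. The sketch below (not from the paper) pairs a shared-gamma-frailty model, in which the two risks are dependent, with the independent model built from its cause-specific hazards; the rates `lam1`, `lam2` and frailty shape `k` are hypothetical choices. Both models produce the same distribution of the identified minimum $(T, I)$.

```python
import numpy as np

rng = np.random.default_rng(5)
n, k, lam1, lam2 = 400_000, 2.0, 1.0, 2.0
c = lam1 + lam2

# Dependent model: shared gamma frailty V, with T_j | V ~ Exp(rate lam_j * V),
# so the joint survivor is S(t1, t2) = (1 + lam1*t1 + lam2*t2)**(-k).
V = rng.gamma(k, 1.0, size=n)
T1 = rng.exponential(1.0 / (lam1 * V))
T2 = rng.exponential(1.0 / (lam2 * V))
T_dep = np.minimum(T1, T2)
I_dep = np.where(T1 <= T2, 1, 2)

# Tsiatis's independent counterpart: marginals S_j(t) = (1 + c*t)**(-k*lam_j/c),
# built from the cause-specific hazards of the dependent model.
def draw_indep(a):
    # Invert S(t) = (1 + c*t)**(-a):  t = (U**(-1/a) - 1) / c.
    U = rng.uniform(size=n)
    return (U ** (-1.0 / a) - 1.0) / c

S1 = draw_indep(k * lam1 / c)
S2 = draw_indep(k * lam2 / c)
T_ind = np.minimum(S1, S2)
I_ind = np.where(S1 <= S2, 1, 2)

# Observationally equivalent: same distribution of (T, I) in both models.
t0 = 0.4
print((T_dep > t0).mean(), (T_ind > t0).mean())   # both near (1 + c*t0)**(-k)
print((I_dep == 1).mean(), (I_ind == 1).mean())   # both near lam1 / c
```

The latent joint distributions differ (one is dependent, one independent), yet no amount of data on the identified minimum can distinguish them.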

## Importance of Dependence

• We are concerned with conditional independence—independence of the risks ${T}_{1},\dots ,{T}_{J}$ conditional on $X$.

• Even conditional independence may not hold if, for example, we are studying an individual whose behavior may affect all of the risks.

• Yashin, Manton, and Stallard (1986): How do smoking, blood pressure, and body weight (regressors) affect time of death from cancer, heart disease, etc. (risks).

## Overview

• Establish an identification theorem for a general class of competing risks models with regressors.

• This class includes models with marginal distributions that follow:

• Proportional hazards.

• Mixed proportional hazards.

• Accelerated hazards.

• Results are presented for only two competing risks but generalize to an arbitrary finite number of risks.

## Proportional Hazards Model

• We want to model the time of death $T$ from a single risk conditional on some covariates $X$.

• Conditional on $X$, $T$ has cdf $F\left(t|x\right)$ and pdf $f\left(t|x\right)$.

• Hazard function: $\lambda \left(t|x\right)=\frac{f\left(t|x\right)}{1-F\left(t|x\right)}$.

• Integrated hazard: $\Lambda \left(t|x\right)={\int }_{0}^{t}\lambda \left(s|x\right)\phantom{\rule{thickmathspace}{0ex}}\mathrm{ds}$.

• If $\lambda \left(t|x\right)=z\left(t\right)\varphi \left(x\right)$ then $\Lambda \left(t|x\right)=Z\left(t\right)\varphi \left(x\right)$ with $Z\left(t\right)={\int }_{0}^{t}z\left(s\right)\phantom{\rule{thickmathspace}{0ex}}\mathrm{ds}$.

• Equivalently, we can work with the survivor function: $S\left(t|x\right)=\mathrm{Pr}\left(T>t|x\right)=\mathrm{exp}\left[-Z\left(t\right)\varphi \left(x\right)\right]$

• It is common in practice to use $\varphi \left(x\right)={e}^{x\beta }$.

• Suppose $F\left(t|x\right)=1-{e}^{-Z\left(t\right)\varphi \left(x\right)}$ where $Z\left(t\right)$ is the baseline integrated hazard and $\varphi \left(x\right)$ is a scaling term.

• If $Z$ is differentiable, then $Z\prime \left(t\right)$ is the baseline hazard.
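
Under this specification, failure times can be drawn by inverse-transform sampling of the survivor function. A minimal sketch, assuming a hypothetical baseline $Z(t) = t^2$ and the common scaling $\varphi(x) = e^{x\beta}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical baseline integrated hazard Z(t) = t**2, so Z^{-1}(v) = sqrt(v),
# with the common scaling phi(x) = exp(x * beta).
beta = 0.5

def phi(x):
    return np.exp(x * beta)

def draw_T(x, n, rng):
    # Set S(T|x) = exp(-Z(T) phi(x)) = U and solve: T = Z^{-1}(-log(U) / phi(x)).
    U = rng.uniform(size=n)
    return np.sqrt(-np.log(U) / phi(x))

draws = draw_T(x=1.0, n=100_000, rng=rng)
t0 = 0.5
# Empirical survivor at t0 vs. the model value exp(-Z(t0) phi(x)):
print((draws > t0).mean(), np.exp(-t0**2 * phi(1.0)))
```

The two printed numbers should agree up to simulation noise, confirming the inverse-transform construction.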

## Proportional Hazards and Competing Risks

• Assuming for the moment that failure times are independent, we can easily generalize this to model competing risks.

• The distribution of each failure time has a proportional hazard specification.

• $Z\left(t\right)$ and $\varphi$ may differ across risks.

• Under independence, the joint survivor function is the product of the marginal survivor functions: $S\left({t}_{1},{t}_{2}|x\right)=\mathrm{exp}\left[-{Z}_{1}\left({t}_{1}\right){\varphi }_{1}\left(x\right)\right]\mathrm{exp}\left[-{Z}_{2}\left({t}_{2}\right){\varphi }_{2}\left(x\right)\right].$

## Introducing Dependence

• We could draw two independent failure times ${T}_{1}$ and ${T}_{2}$ by drawing independent ${U}_{j}\sim U\left(0,1\right)$ and solving ${S}_{j}\left({T}_{j}|x\right)={U}_{j}$ for ${T}_{j}$, where

${S}_{j}\left(t|x\right)=\mathrm{exp}\left\{-{Z}_{j}\left(t\right){\varphi }_{j}\left(x\right)\right\}.$

• If $K\left({u}_{1},{u}_{2}\right)={u}_{1}{u}_{2}$ is the joint CDF of $\left({U}_{1},{U}_{2}\right)$ (the independence case), the joint survivor function is

$S\left({t}_{1},{t}_{2}|x\right)=K\left[\mathrm{exp}\left\{-{Z}_{1}\left({t}_{1}\right){\varphi }_{1}\left(x\right)\right\},\mathrm{exp}\left\{-{Z}_{2}\left({t}_{2}\right){\varphi }_{2}\left(x\right)\right\}\right].$

• We can introduce dependence in ${T}_{1}$ and ${T}_{2}$ by introducing dependence in ${U}_{1}$ and ${U}_{2}$ via $K$.

• Suppose $\left({U}_{1},{U}_{2}\right)\sim K\left(\cdot ,\cdot \right)$ on $\left[0,1{\right]}^{2}$ and assume that ${Z}_{1}\left(0\right)={Z}_{2}\left(0\right)=0$.

• Then the survivor function for $\left({T}_{1},{T}_{2}\right)$ is

(1)$S\left({t}_{1},{t}_{2}|x\right)=K\left(\mathrm{exp}\left[-{Z}_{1}\left({t}_{1}\right){\varphi }_{1}\left(x\right)\right],\mathrm{exp}\left[-{Z}_{2}\left({t}_{2}\right){\varphi }_{2}\left(x\right)\right]\right).$
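
Equation (1) can be simulated directly: draw $\left({U}_{1},{U}_{2}\right)$ from $K$ and invert the marginal survivor functions. The sketch below uses a Clayton copula for $K$ (an illustrative choice, not from the paper) via the standard Marshall-Olkin algorithm, with hypothetical baselines $Z_1(t)=t$, $Z_2(t)=t^2$ and $\varphi_1=\varphi_2=1$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0  # Clayton dependence parameter (illustrative choice)
n = 100_000

# Marshall-Olkin sampler for the Clayton copula: with W ~ Gamma(1/theta, 1) and
# E_i ~ Exp(1), the pair U_i = (1 + E_i / W)**(-1/theta) has copula K.
W = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
E = rng.exponential(size=(2, n))
U = (1.0 + E / W) ** (-1.0 / theta)

# Marginals as in eq. (1): S_j(T_j|x) = U_j with hypothetical Z_1(t) = t,
# Z_2(t) = t**2, and phi_1 = phi_2 = 1.
T1 = -np.log(U[0])            # Z_1^{-1}(v) = v
T2 = np.sqrt(-np.log(U[1]))   # Z_2^{-1}(v) = sqrt(v)

T = np.minimum(T1, T2)        # observed time of death
I = np.where(T1 <= T2, 1, 2)  # observed cause of death
print(T[:3], I[:3])
```

By construction, $\mathrm{Pr}\left({T}_{1}>{t}_{1},{T}_{2}>{t}_{2}\right)=K\left({S}_{1}\left({t}_{1}\right),{S}_{2}\left({t}_{2}\right)\right)$, so the dependence in $\left({U}_{1},{U}_{2}\right)$ carries over to the failure times.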

## Generalization: Mixed Proportional Hazards

• Suppose that the competing risks are independent, ${\varphi }_{j}\left(x\right)={e}^{x{\beta }_{j}}$, and that one of the covariates, $\omega$, is not observed:

$S\left({t}_{1},{t}_{2}|x\right)={\int }_{\Omega }\mathrm{exp}\left[-{Z}_{1}\left({t}_{1}\right){e}^{x{\beta }_{1}+{c}_{1}\omega }\right]\mathrm{exp}\left[-{Z}_{2}\left({t}_{2}\right){e}^{x{\beta }_{2}+{c}_{2}\omega }\right]\phantom{\rule{thickmathspace}{0ex}}\mathrm{dG}\left(\omega \right).$

• We can arrive at this model by choosing $K$ such that:

$K\left({\eta }_{1},{\eta }_{2}\right)={\int }_{\Omega }{\eta }_{1}^{\mathrm{exp}\left({c}_{1}\omega \right)}{\eta }_{2}^{\mathrm{exp}\left({c}_{2}\omega \right)}\phantom{\rule{thickmathspace}{0ex}}\mathrm{dG}\left(\omega \right).$
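
In the shared-frailty special case (take ${c}_{1}={c}_{2}$ so that the same factor $V={e}^{c\omega }$ enters both risks) with $V$ gamma distributed, this $K$ reduces to the gamma Laplace transform, a closed form that a quick Monte Carlo check confirms. The shape parameter `k` and the evaluation points below are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Shared gamma frailty: assume V = exp(c * omega) ~ Gamma(k, 1) directly
# (a hypothetical choice of the mixing distribution G).
k = 2.0
V = rng.gamma(shape=k, scale=1.0, size=500_000)

def K_mc(eta1, eta2):
    # K(eta1, eta2) = E[ eta1**V * eta2**V ], evaluated by Monte Carlo.
    return np.mean(eta1**V * eta2**V)

def K_exact(eta1, eta2):
    # Gamma Laplace transform E[exp(-s*V)] = (1 + s)**(-k) at s = -log(eta1*eta2).
    return (1.0 - np.log(eta1 * eta2)) ** (-k)

print(K_mc(0.6, 0.8), K_exact(0.6, 0.8))
```

The agreement of the two values illustrates how a mixing distribution $G$ induces a particular dependence function $K$.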

## Generalization: Accelerated Hazards

• Univariate accelerated hazard model:

$S\left(t|x\right)=\mathrm{exp}\left[-Z\left\{t\varphi \left(x\right)\right\}\right]$

• Joint survivor with dependent competing risks:

$S\left({t}_{1},{t}_{2}|x\right)=K\left(\mathrm{exp}\left[-{Z}_{1}\left\{{t}_{1}{\varphi }_{1}\left(x\right)\right\}\right],\mathrm{exp}\left[-{Z}_{2}\left\{{t}_{2}{\varphi }_{2}\left(x\right)\right\}\right]\right).$

• For any $K$, the marginal distributions give rise to univariate accelerated hazard models.
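
In the accelerated hazards model, the covariates rescale time itself. The sketch below illustrates this with a hypothetical unit-exponential baseline ($Z(t)=t$) and $\varphi(x)={e}^{0.5x}$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Accelerated hazards: S(t|x) = exp(-Z(t * phi(x))), so inverse-transform
# sampling gives T = Z^{-1}(-log U) / phi(x). Hypothetical choices:
# Z(t) = t (unit exponential baseline) and phi(x) = exp(0.5 * x) at x = 1.
phi = np.exp(0.5 * 1.0)
U = rng.uniform(size=200_000)
T = -np.log(U) / phi  # Z^{-1} is the identity here

# Covariates rescale time: every quantile of T is the baseline quantile / phi(x).
print(np.median(T), np.log(2) / phi)
```

The empirical median matches the baseline median $\mathrm{ln}2$ divided by $\varphi(x)$, the time-rescaling signature of the accelerated hazards model.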

## Identification Theorem

Assume that $\left({T}_{1},{T}_{2}\right)$ has joint distribution (1). Then ${Z}_{1}$, ${Z}_{2}$, ${\varphi }_{1}$, ${\varphi }_{2}$, and $K$ are identified from the minimum of $\left({T}_{1},{T}_{2}\right)$ under the following assumptions:

1. $K$ is continuously differentiable with partial derivatives ${K}_{1}$ and ${K}_{2}$, and for $i=1,2$, ${\mathrm{lim}}_{n\to \infty }{K}_{i}\left({\eta }_{1n},{\eta }_{2n}\right)$ is finite for all sequences ${\eta }_{1n}\to 1$ and ${\eta }_{2n}\to 1$ as $n\to \infty$. We also assume that $K$ is strictly increasing in each of its arguments.

2. ${Z}_{1}\left(1\right)={Z}_{2}\left(1\right)=1$ and ${\varphi }_{1}\left({x}_{0}\right)={\varphi }_{2}\left({x}_{0}\right)=1$ for some ${x}_{0}$.

3. The support of $\left\{{\varphi }_{1}\left(x\right),{\varphi }_{2}\left(x\right)\right\}$ is $\left(0,\infty \right)×\left(0,\infty \right)$.

4. ${Z}_{1}$ and ${Z}_{2}$ are nonnegative, differentiable, strictly increasing functions, except that we allow them to be infinite for finite $t$.

Remarks on the assumptions:

1. Since $K$ is a joint distribution function, it is already weakly increasing in each argument; strict monotonicity is the additional requirement.

2. This is an innocuous normalization since ${\varphi }_{j}\left(x\right)$ and ${Z}_{j}\left(t\right)$ are only jointly identified up to scale.

3. This is satisfied, for example, when ${\varphi }_{j}\left(x\right)=\mathrm{exp}\left(x{\beta }_{j}\right)$ and there is a common covariate with support $ℝ$ and different coefficients across risks.

## Mapping Observables to Unobservables

Observed distributions:

${Q}_{1}\left(t|x\right)=\mathrm{Pr}\left({T}_{1}\ge t,{T}_{2}\ge {T}_{1}|x\right)\phantom{\rule{1em}{0ex}}{Q}_{2}\left(t|x\right)=\mathrm{Pr}\left({T}_{2}\ge t,{T}_{1}\ge {T}_{2}|x\right).$

Tsiatis (1975) establishes the following mappings:

$\frac{\partial {Q}_{1}}{\partial t}\left(t|x\right)={\left[\frac{\partial S}{\partial {t}_{1}}\right]}_{{t}_{1}={t}_{2}=t}\phantom{\rule{1em}{0ex}}\frac{\partial {Q}_{2}}{\partial t}\left(t|x\right)={\left[\frac{\partial S}{\partial {t}_{2}}\right]}_{{t}_{1}={t}_{2}=t}.$

We have $\frac{\partial {Q}_{1}}{\partial t}\left(t|x\right)=-{K}_{1}\left[\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left(x\right)\right\},\mathrm{exp}\left\{-{Z}_{2}\left(t\right){\varphi }_{2}\left(x\right)\right\}\right]\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left(x\right)\right\}{Z}_{1}^{\prime}\left(t\right){\varphi }_{1}\left(x\right).$
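
The sub-survivor functions ${Q}_{j}$ are directly estimable from data on $\left(T,I\right)$. A sketch with independent exponential risks (hypothetical rates), for which ${Q}_{1}$ has a closed form to check against:

```python
import numpy as np

rng = np.random.default_rng(4)

# Independent exponential risks (hypothetical rates) to illustrate the observed
# sub-survivor functions Q_j(t) = Pr(T_j >= t, T_{-j} >= T_j).
lam1, lam2, n = 1.0, 2.0, 500_000
T1 = rng.exponential(1.0 / lam1, size=n)
T2 = rng.exponential(1.0 / lam2, size=n)

def Q1_hat(t):
    # Empirical analogue of Q_1(t) = Pr(T_1 >= t, T_2 >= T_1).
    return np.mean((T1 >= t) & (T2 >= T1))

# Closed form for this case: Q_1(t) = lam1/(lam1+lam2) * exp(-(lam1+lam2)*t).
t = 0.3
print(Q1_hat(t), lam1 / (lam1 + lam2) * np.exp(-(lam1 + lam2) * t))
```

Everything on the left-hand side of Tsiatis's mappings is therefore observable, while the right-hand side involves the latent structure $\left(K,{Z}_{j},{\varphi }_{j}\right)$.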

## Identification of ${\varphi }_{j}$

Taking the ratio of $\frac{\partial {Q}_{1}\left(t|x\right)}{\partial t}$ at $x$ and ${x}_{0}$ yields

$\frac{{K}_{1}\left[\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left(x\right)\right\},\mathrm{exp}\left\{-{Z}_{2}\left(t\right){\varphi }_{2}\left(x\right)\right\}\right]\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left(x\right)\right\}{Z}_{1}^{\prime}\left(t\right){\varphi }_{1}\left(x\right)}{{K}_{1}\left[\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left({x}_{0}\right)\right\},\mathrm{exp}\left\{-{Z}_{2}\left(t\right){\varphi }_{2}\left({x}_{0}\right)\right\}\right]\mathrm{exp}\left\{-{Z}_{1}\left(t\right){\varphi }_{1}\left({x}_{0}\right)\right\}{Z}_{1}^{\prime}\left(t\right){\varphi }_{1}\left({x}_{0}\right)}.$

Taking $t\to 0$, the exponential terms tend to one, the ${K}_{1}$ terms tend to the same finite limit ${K}_{1}\left(1,1\right)$ (Assumption 1), and ${Z}_{1}^{\prime}\left(t\right)$ cancels, so the ratio converges to ${\varphi }_{1}\left(x\right)/{\varphi }_{1}\left({x}_{0}\right)={\varphi }_{1}\left(x\right)$ by the normalization ${\varphi }_{1}\left({x}_{0}\right)=1$. Our choice of $x$ was arbitrary, so ${\varphi }_{1}\left(x\right)$ is identified on the entire support of $X$; the argument for ${\varphi }_{2}\left(x\right)$ is symmetric.
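
This limiting argument can be checked numerically. The sketch below takes the independence case $K\left({u}_{1},{u}_{2}\right)={u}_{1}{u}_{2}$ with hypothetical ${Z}_{1}\left(t\right)=t$, ${Z}_{2}\left(t\right)={t}^{2}$, and ${\varphi }_{j}\left(x\right)=\mathrm{exp}\left(x{b}_{j}\right)$, and evaluates the ratio at a small $t$:

```python
import numpy as np

# Independence case K(u1, u2) = u1*u2 (so K_1(u1, u2) = u2) with hypothetical
# Z_1(t) = t, Z_2(t) = t**2 and phi_j(x) = exp(x * b_j); normalization at x0 = 0.
b1, b2 = 0.7, -0.3

def phi1(x):
    return np.exp(b1 * x)

def phi2(x):
    return np.exp(b2 * x)

def dQ1_dt(t, x):
    # In the independence case, dQ1/dt = -Z_1'(t) * phi_1(x) * S(t,t|x),
    # with Z_1'(t) = 1 here.
    S = np.exp(-t * phi1(x) - t**2 * phi2(x))
    return -phi1(x) * S

x, x0, t = 1.5, 0.0, 1e-6
ratio = dQ1_dt(t, x) / dQ1_dt(t, x0)
print(ratio, phi1(x))  # as t -> 0 the ratio converges to phi_1(x)
```

At $t={10}^{-6}$ the ratio already agrees with ${\varphi }_{1}\left(x\right)$ to several digits, as the argument predicts.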

## Identification of $K$

We know $S\left(t,t|x\right)$ since $S\left(t,t|x\right)={Q}_{1}\left(t|x\right)+{Q}_{2}\left(t|x\right)$. Furthermore, $S\left(t,t|x\right)=K\left(\mathrm{exp}\left[-{Z}_{1}\left(t\right){\varphi }_{1}\left(x\right)\right],\mathrm{exp}\left[-{Z}_{2}\left(t\right){\varphi }_{2}\left(x\right)\right]\right).$

Setting $t=1$ and using the normalization ${Z}_{1}\left(1\right)={Z}_{2}\left(1\right)=1$ gives $S\left(1,1|x\right)=K\left(\mathrm{exp}\left[-{\varphi }_{1}\left(x\right)\right],\mathrm{exp}\left[-{\varphi }_{2}\left(x\right)\right]\right)$. Letting ${\varphi }_{1}\left(x\right)$ and ${\varphi }_{2}\left(x\right)$ vary over $\left(0,\infty {\right)}^{2}$ (Assumption 3) then traces out $K$ on $\left(0,1{\right)}^{2}$.
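
A numerical sketch of this step, assuming an illustrative Clayton $K$ and two hypothetical covariates entering as ${\varphi }_{j}\left(x\right)=\mathrm{exp}\left({x}_{j}\right)$, so that the pair $\left({\varphi }_{1},{\varphi }_{2}\right)$ can be steered to any point of $\left(0,\infty {\right)}^{2}$:

```python
import numpy as np

# With phi_1, phi_2 spanning (0, inf)^2 (here via hypothetical phi_j(x) = exp(x_j)
# with x = (x1, x2)) and Z_j(1) = 1, the observed S(1,1|x) traces out K.
theta = 2.0

def K(u1, u2):
    # Clayton copula, an illustrative dependence structure.
    return (u1 ** (-theta) + u2 ** (-theta) - 1.0) ** (-1.0 / theta)

def S_obs(t1, t2, x1, x2):
    # "Observed" joint survivor of form (1), with Z_j(t) = t.
    return K(np.exp(-t1 * np.exp(x1)), np.exp(-t2 * np.exp(x2)))

# To read off K(a, b), pick x so that exp(-phi_1(x)) = a and exp(-phi_2(x)) = b.
a, b = 0.4, 0.7
x1 = np.log(-np.log(a))
x2 = np.log(-np.log(b))
print(S_obs(1.0, 1.0, x1, x2), K(a, b))
```

Every value of $K$ on $\left(0,1{\right)}^{2}$ is reached this way, which is exactly what Assumption 3 buys.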

## Identification of ${Z}_{j}$

Consider covariate values along a sequence $\left\{{x}_{n}\right\}$:

$S\left(t,t|{x}_{n}\right)=K\left(\mathrm{exp}\left[-{Z}_{1}\left(t\right){\varphi }_{1}\left({x}_{n}\right)\right],\mathrm{exp}\left[-{Z}_{2}\left(t\right){\varphi }_{2}\left({x}_{n}\right)\right]\right)$

• Let ${\varphi }_{2}\left({x}_{n}\right)\to 0$ while holding ${\varphi }_{1}\left({x}_{n}\right)$ fixed at some value ${\varphi }_{1}$ (possible by Assumption 3).

• Then $S\left(t,t|{x}_{n}\right)\to K\left(\mathrm{exp}\left[-{Z}_{1}\left(t\right){\varphi }_{1}\right],1\right).$

• Since $K$ and ${\varphi }_{1}$ are known and $K$ is strictly increasing in its first argument, we can invert to recover ${Z}_{1}\left(t\right)$ for any $t$.

• Similarly for ${Z}_{2}\left(t\right)$.
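
The same inversion can be sketched numerically, again with an illustrative Clayton $K$ and a hypothetical ${Z}_{1}\left(t\right)={t}^{2}$:

```python
import numpy as np

# Sketch of the Z-recovery step: drive phi_2 -> 0 with phi_1 fixed, then invert
# the (already identified) K in its first argument. Illustrative Clayton K,
# hypothetical Z_1(t) = t**2 and phi_1 = 1.5.
theta = 2.0

def K(u1, u2):
    return (u1 ** (-theta) + u2 ** (-theta) - 1.0) ** (-1.0 / theta)

def Z1(t):
    return t**2

phi1 = 1.5

def S_diag(t, phi2):
    # S(t,t|x) of form (1), with Z_2(t) = t.
    return K(np.exp(-Z1(t) * phi1), np.exp(-t * phi2))

t = 0.8
limit = S_diag(t, phi2=1e-10)   # phi_2 -> 0  =>  K(exp(-Z_1(t) phi_1), 1)
# For the Clayton copula K(u, 1) = u, so invert the exponential directly:
Z1_recovered = -np.log(limit) / phi1
print(Z1_recovered, Z1(t))
```

For a general $K$ the last step would invert $K\left(\cdot ,1\right)$ numerically, using its strict monotonicity; here $K\left(u,1\right)=u$ makes the inversion explicit.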

## Conclusion

Identification argument:

• Exploiting the multiplicative separability of the integrated hazard, the distribution of $\left(T,I\right)$ gives us ${\varphi }_{j}\left(x\right)$ for $j=1,2$.

• Letting $\left({\varphi }_{1}\left(x\right),{\varphi }_{2}\left(x\right)\right)$ range over the full support $\left(0,\infty {\right)}^{2}$ yields $K$.

• Inverting $K$, which is known and strictly increasing, with ${\varphi }_{j}$ known, gives us ${Z}_{j}\left(t\right)$.

Implications of Nonparametric Identification:

• Identification does not depend on parametric functional forms or assumed forms of risk dependence (modulo separability of the hazard).

• Highlights the role of regressors in identification in contrast to the Cox-Tsiatis nonidentification result.

• Suggests the possibility of a nonparametric estimator.