Background: An extensive literature has demonstrated that self-ratings of health predict mortality, even after controlling for more objective measures of health, health habits, and sociodemographic characteristics. We examine the role of a related concept, self-rated life expectancy, in predicting mortality.

Objective: To assess whether self-rated life expectancy predicts mortality after controlling for measures of health, self-rated health, and sociodemographic characteristics.

Methods: Using data from the 1992 Health and Retirement Survey (HRS), the 1993 Asset and Health Dynamics Among the Oldest Old (AHEAD) survey, and the second Tracker file (2.0), Cox proportional hazards models were estimated to assess whether self-rated life expectancy predicts mortality after adjusting for self-rated health and several potential confounders that might otherwise explain this relationship. The AHEAD sample included 2,102 men and 3,160 women; during the 2 years of follow-up, 9% (n = 185) of the men and 5% (n = 166) of the women died. The HRS sample comprised 4,090 men and 4,885 women; 4% (n = 164) of the men and 2% (n = 99) of the women died during the 3 years of follow-up.

Results: In the older AHEAD sample, both self-rated life expectancy (p < 0.01) and self-rated health (p < 0.05) predicted mortality for both men and women, even when the two measures were included in the model together. In the younger HRS sample, self-rated life expectancy was not significantly associated with mortality once self-rated health was included in the model.

Conclusion: Our findings suggest that, although self-rated life expectancy and self-rated health may be conceptually related, they have independent empirical effects on mortality.