Trends over time
The second perspective is the trend in the prevalence of acute protein-energy malnutrition, either over the long term or by season. A relatively small but rapid rise from an otherwise low prevalence may indicate substantial nutritional deterioration that could continue into the future. Conversely, at the time of an emergency, high levels of malnutrition do not necessarily indicate a major worsening of nutritional status; they may have been present before the emergency began. For this reason, unacceptable levels of malnutrition are sometimes dismissed as "baseline," even though they may be producing substantially elevated mortality.
This question arose during the reported famine in Niger in mid-2005. Background levels of acute protein-energy malnutrition already exceeded 15% before the world's attention focused on southern Niger, and the prevalence failed to rise even after the famine was declared. Was this a famine? If so, how can a famine not produce an increase in the prevalence of malnutrition?
Seasonality is very important in many societies practicing subsistence agriculture on marginal land. The prevalence of acute protein-energy malnutrition may rise substantially just before the harvest, when food is scarcest. The graph below shows the prevalence of acute protein-energy malnutrition derived from several surveys in Darfur. Note that the lowest levels of malnutrition are seen around the harvest, from November through January; in 1991, however, the prevalence did not drop as expected in January. The next hungry season, in July, shows a sharp rise in prevalence.
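To make the seasonal comparison concrete, the sketch below computes the prevalence of acute malnutrition (children with weight-for-height Z-score below -2) from a series of hypothetical survey rounds. The survey counts are invented for illustration only and do not reproduce the Darfur data; the point is simply that prevalence is the proportion wasted among children measured, and that comparing rounds taken in the same season is what reveals a failure of the expected post-harvest decline.

```python
# Hypothetical survey rounds: (label, children measured, children with WHZ < -2).
# All figures are illustrative, not the actual Darfur survey results.
surveys = [
    ("Jul 1990", 412, 91),   # hungry season, just before harvest
    ("Nov 1990", 398, 35),   # post-harvest: expected seasonal low
    ("Jan 1991", 405, 74),   # prevalence fails to drop as expected
    ("Jul 1991", 420, 110),  # next hungry season: sharp rise
]

def prevalence(wasted, measured):
    """Prevalence of acute malnutrition as a percentage."""
    return 100.0 * wasted / measured

for label, n, wasted in surveys:
    print(f"{label}: {prevalence(wasted, n):.1f}% acute malnutrition")
```

In this illustration, comparing January 1991 against the previous November (rather than against the preceding July) is what flags the anomaly: the post-harvest low never materialized.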