1. Introduction
Nutrition and health surveys play an important role in assessing and monitoring dietary patterns and trends, assist in the analysis of dietary factors related to disease risk, and provide background data for informing nutrition policy. The National Health and Nutrition Examination Survey (NHANES) is designed to monitor the health and nutritional status of the U.S. population. NHANES data are used to determine the prevalence of diseases and their risk factors, assess nutritional status and its association with health promotion and disease prevention, and assist with the formulation of national standards and public health policy [1]. The Centers for Disease Control and Prevention's National Center for Health Statistics (NCHS) conducts NHANES every year using a stratified, multistage probability design to obtain a nationally representative sample. Demographic and basic health information data are collected via an in-home interview, and participants complete a comprehensive diet and health examination at a Mobile Examination Center. Dietary intake information is collected via the Automated Multiple-Pass Method [2] using two 24 h dietary recalls, the first conducted in person during the Mobile Examination Center visit (day 1 recall) and the second via telephone interview 3–10 days later (day 2 recall).
In addition to the 24 h dietary recall (single or multiple days), food frequency questionnaires (FFQs) and dietary records are other commonly used methods to assess dietary intake in nutritional surveys [3,4,5,6]. Long-term or habitual intake, typically referred to as usual intake (UI), is estimated from short-term measurements using modeling techniques that take into account within-person variation. Various statistical methodologies sharing a similar general approach have been developed for estimating UI distributions [7,8,9,10,11,12,13]. Nevertheless, the methods differ in their details and underlying assumptions. To date, the different methods of UI estimation, especially the newly developed ones, have been compared in only a few reports, all of which relied on simulation studies [14,15,16]. Therefore, the objective of the present study was to compare dietary intake estimation by four approaches: 1 day and 2 day average 24 h dietary recall and UI determined with the National Cancer Institute (NCI) and Markov Chain Monte Carlo (MCMC) methods. The MCMC method allows for simulation of changes in more than one variable (e.g., nutrient intake and calories). Intake of added sugars and its effect on nutrient adequacy/inadequacy was used as an example, as authoritative public health organizations around the world recommend limiting intake of added sugars [17,18] due to its potential association with decreased diet quality and increased chronic disease risk [19,20,21,22,23].
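The role of within-person variation in separating a single day's recall from usual intake can be illustrated with a small simulation. This is a sketch with hypothetical parameter values (gamma-distributed usual intake, normally distributed day-to-day noise), not survey data or any published model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical population (illustration only, not NHANES data):
# true usual intake of added sugars (% kcal), right-skewed,
# plus large day-to-day (within-person) variation on each recall day.
usual = rng.gamma(shape=6.0, scale=2.0, size=n)            # mean ~12% kcal
day1 = np.clip(usual + rng.normal(0.0, 6.0, size=n), 0, None)
day2 = np.clip(usual + rng.normal(0.0, 6.0, size=n), 0, None)
two_day_mean = (day1 + day2) / 2.0

for label, x in [("usual", usual), ("1 day", day1), ("2 day mean", two_day_mean)]:
    print(f"{label:10s} SD = {x.std():.2f}")
# The 1 day distribution is widest; averaging two days narrows it only
# partially, which is why statistical UI modeling is needed.
```

Under these assumed variances, the single-day distribution is substantially wider than the usual-intake distribution it samples, and a two-day average removes only part of the excess spread.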
4. Discussion
Micronutrient dilution of the diet at higher added sugars intakes has been cited as a concern by public health organizations recommending limits on added sugars intake [17,18]. However, the evidence is not consistent and may be limited by methodological constraints [25,26,27,28,29,30,31]. In this report, we analyzed the association of added sugars intake with the percentage of the population below the EAR for calcium and vitamin D intake and above the AI for potassium and dietary fiber intake. We chose these nutrients because their current intakes are very low and they have been characterized as “nutrients of public health concern” [17]. The sufficiency or insufficiency of micronutrient intakes along the continuum of added sugars intake was examined using deciles of added sugars rather than arbitrarily selected cut points. We used four different methods of estimating deciles of added sugars intake (1 day 24 h dietary recall data, 2 day average 24 h dietary recall data, the UI/NCI method, and the UI/MCMC method) and compared the resulting associations of added sugars deciles with micronutrient intake, expressed as the percentage of the population below the EAR/above the AI. Notably, all methods generally yielded a significant direct association between deciles of added sugars and inadequacy of calcium and vitamin D intakes, and an inverse association between deciles of added sugars and the percentage above the AI for potassium and dietary fiber intakes, in children and adults. However, except for calcium in adults, there were no significant differences between the four methods. With all four methods, those with the lowest added sugars intake generally still had a large percentage of the population below the EAR for calcium and vitamin D, as reported previously [30,31]. These findings indicate that achieving the recommended intake of these nutrients is not solely related to added sugars intake but rather to consumption of specific foods high in these nutrients.
The results show that both the range and the median of added sugars deciles in children and adults depended on the method used to estimate them. As expected, the range across deciles of added sugars (as % kcal) in both children and adults was widest for the 1 day intakes (~20 percentage units), somewhat narrower for the average of 2 day intakes (~15 percentage units), and narrower still with the UI methods (NCI: ~8 percentage units; MCMC: ~3 percentage units). Additionally, the first decile of added sugars intake in children was ≤5.2, ≤6.4, ≤10.8, and ≤12.1% of calories for the 1 day, 2 day, UI/NCI, and UI/MCMC methods, respectively; comparable data for the highest decile were >25.8, >22.6, >18.2, and >15.0% of calories. A similar pattern was seen in adults (first decile: ≤2.6, ≤3.8, ≤6.8, and ≤11.1% of calories for the 1 day, 2 day, UI/NCI, and UI/MCMC methods, respectively; highest decile: >25.3, >22.9, >19.4, and >13.7% of calories). There were significant associations between the percentage of the population below the EAR/above the AI and added sugars deciles for some nutrients and some methods. However, there were very few significant between-method differences in these associations. The only difference in regression coefficients across methods was seen in the relationship with calcium adequacy in adults for the UI/MCMC method (the percentage of the population below the EAR increased 3.4 percentage units per decile of added sugars intake, whereas the other methods indicated a regression coefficient of 1 percentage unit or less). While this difference may be due to chance, it is also possible that, because calcium intake was related, at least in part, to total calories, the UI/MCMC method has greater sensitivity to detect associations when nutrient intake is strongly correlated with calorie intake.
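The decile-based analysis described above can be sketched in a few lines. This is a simplified illustration with simulated, negatively correlated intakes and a hypothetical EAR cut point, not the survey-weighted analysis of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical data (illustration only): added sugars (% kcal) and
# calcium intake (mg/day), negatively correlated by construction.
added_sugars = rng.gamma(6.0, 2.0, size=n)
calcium = 1000 - 15 * added_sugars + rng.normal(0, 250, size=n)
EAR = 800  # hypothetical cut point, mg/day

# Assign deciles of added sugars intake (1..10) from empirical quantiles.
edges = np.quantile(added_sugars, np.linspace(0, 1, 11))
decile = np.clip(np.searchsorted(edges, added_sugars, side="right"), 1, 10)

# Percentage of the population below the EAR within each decile.
pct_below = np.array(
    [(calcium[decile == d] < EAR).mean() * 100 for d in range(1, 11)]
)

# Regression coefficient: percentage-unit change in %<EAR per decile.
slope, intercept = np.polyfit(np.arange(1, 11), pct_below, 1)
print(f"%<EAR rises ~{slope:.1f} percentage units per decile")
```

The slope plays the role of the regression coefficients compared across methods in the text: a positive coefficient means inadequacy worsens with each successive decile of added sugars intake.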
A self-reported 24 h dietary recall assesses an individual’s consumption over the past 24 h and can be used to characterize the average consumption of a group of individuals for that day. Any self-reported dietary assessment instrument carries inherent measurement error that leads to biased estimates; therefore, for reliable interpretation, the validity of such self-reported instruments is needed [32]. Additionally, since there are day-to-day variations in an individual’s diet due to various biological and environmental influences, a single 24 h recall is not representative of an individual’s long-term intake [33,34]. Intraindividual variability can be reduced to some extent by using repeated 24 h dietary recalls, especially if the data are collected randomly on non-consecutive days. However, there may be methodological inconsistencies in collecting data on multiple days (e.g., data collection via an in-person interview on day 1 and via telephone interview on day 2). For dietary assessments in national dietary surveys, 24 h dietary recall methods are most commonly used and are preferred over dietary record/history and FFQ methods [4,35]. However, the long-term habitual dietary intake, or UI, of the population is of interest to scientists and policy makers because dietary recommendations are intended to be achieved over time. Habitual intakes are also useful in identifying population groups at risk of inadequate/insufficient dietary intake or excessive consumption. The distribution of 24 h (single or mean of multiple) dietary intakes of a representative sample is used to estimate UI and the proportion of the population with intakes above or below particular dietary standards [36]. Statistical modeling of the short-term measurement data is needed to address intra- and interindividual variation, and various statistical methodologies have been proposed for estimating UI distributions [7,8,9,10,11,12,13]. The UI/NCI method can be used to estimate the distribution of UI of both occasionally and regularly consumed nutrients and foods. It provides good estimates of UI distributions; however, the estimates are less accurate for small sample sizes [14,15]. The UI/MCMC method is an alternative to the UI/NCI method that incorporates the estimation of energy intake into the model [13]. This method is more complex and uses MCMC simulation instead of maximum likelihood estimation. Although the UI/MCMC method is fairly reliable, it can be very difficult to assess accuracy and obtain convergence. It is interesting to note that, in general, all four methods of short- and long-term intake estimation yielded a similar association of deciles of added sugars intake with the percentage of the population below the EAR/above the AI. While there were large differences across methods in the range of added sugars intakes as a percentage of calories, the relationship of added sugars to adequacy of the nutrients evaluated was relatively similar across methods. The median intakes measured with each method were also relatively similar (11.1, 10.8, 11.9, and 12.5% kcal for the 1 day intake, 2 day average intake, UI/NCI method, and UI/MCMC method, respectively). Nevertheless, given the intra- and interindividual variation discussed above, one of the UI methods is recommended for estimating the percentage of the population above added sugars recommendations.
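The shared idea behind these UI methods, partitioning observed variance into within- and between-person components and pulling individual means toward the group mean, can be sketched with simulated two-recall data. This is a minimal variance-components illustration, not the NCI or MCMC model itself; note that this naive shrinkage over-narrows the distribution, one reason the full methods add transformation and simulation steps:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# Hypothetical two-recall data: true usual intake plus day-to-day noise.
usual = rng.gamma(6.0, 2.0, size=n)
recalls = usual[:, None] + rng.normal(0.0, 6.0, size=(n, 2))

person_mean = recalls.mean(axis=1)
grand_mean = person_mean.mean()

# Variance components: within-person from the day 1 vs day 2 differences
# (Var(d1 - d2) = 2 * var_within), between-person from the person means
# after removing the within-person contribution.
var_within = np.var(recalls[:, 0] - recalls[:, 1]) / 2.0
var_between = max(np.var(person_mean) - var_within / 2.0, 0.0)

# Shrink each person's mean toward the grand mean (BLUP-style weight),
# narrowing the distribution toward the usual-intake distribution.
w = var_between / (var_between + var_within / 2.0)
usual_hat = grand_mean + w * (person_mean - grand_mean)

print(f"SD of person means  : {person_mean.std():.2f}")
print(f"SD of shrunken UI   : {usual_hat.std():.2f}")
print(f"SD of true usual    : {usual.std():.2f}")
```

The shrunken distribution is narrower than the distribution of two-day means, mirroring the progressive narrowing of added sugars decile ranges from 1 day to 2 day to UI estimates reported above.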
There are some limitations to our research that should be considered when drawing conclusions: (1) all of our intake measures are subject to the self-reporting issues associated with 24 h recalls; (2) we used a conservative p value (i.e., <0.001) given our large sample size and the resulting propensity to detect very small numerical differences, which may have led to concluding non-significance where another p value would have shown a significant association; (3) we examined only four nutrients, and two of them have either a very high (>90%) percentage of the population below the EAR or a very low (<5%) percentage of the population above the AI; and (4) as our focus was on comparing methods of measuring added sugars intake, no attempt was made to adjust for total caloric intake or diet quality, both of which are known to affect nutrient adequacy.
While we would expect analyses using deciles of intake to help mitigate issues such as skewness/non-normality and a non-linear relationship of a nutrient along the range of added sugars intakes, it is possible that when a large number of subjects are sorted into numerous groups (e.g., the deciles of intake used in this study), some of the weaknesses of shorter-term measures of intake are minimized such that similar relationships across intake levels are obtained. That said, as each of the methods offers different features and complexities, preference for one method over the others may depend more on practical considerations such as the availability of data and computing capacity. However, more examples need to be investigated to confirm these findings.