
Total and specific potato intake and risk of type 2 diabetes: results from three US cohort studies and a substitution meta-analysis of prospective cohorts

Seyed Mohammad Mousavi, Xiao Gu, Fumiaki Imamura, Hala B AlEssa, Orrin Devinsky, Qi Sun, Frank B Hu, JoAnn E Manson, Eric B Rimm, Nita G Forouhi, Walter C Willett, 2025-08-06 22:31:00

Cohort analyses

Study population and design

This analysis involved participants from three ongoing longitudinal cohort studies: NHS, NHSII, and HPFS. NHS was initiated in 1976 and comprised a sample of 121 700 female registered nurses aged 30-55 years from 11 US states. NHSII, which started in 1989, included 116 429 female nurses aged 25-42 years from 14 states. HPFS was established in 1986 and included 51 529 male US health professionals aged 40-75 years. Participants in these cohorts were surveyed biennially using questionnaires that collected information on disease diagnoses, risk factors, medication use, and lifestyle factors. Dietary data were collected through a validated food frequency questionnaire every 2-4 years. The study’s baseline was set in 1984 for NHS, 1986 for HPFS, and 1991 for NHSII. Detailed documentation on the design and methods of these cohorts is provided elsewhere.20 21 Participants who completed the baseline food frequency questionnaire (NHS 1984, n=81 702; NHSII 1991, n=95 221; HPFS 1986, n=51 530) and had no previous diagnosis of cancer, myocardial infarction, angina, stroke, coronary artery bypass grafting, or T2D were included. We excluded participants with incomplete baseline information on age or potato intake, with implausible reported energy intake (<500 or >3500 kcal/day (1 kcal=4.18 kJ=0.00418 MJ) for women, and <800 or >4200 kcal/day for men), who died at or before baseline, and who only completed the baseline survey. After exclusions, a total of 205 107 participants (72 712 women from NHS, 90 232 women from NHSII, and 42 163 men from HPFS) were included in the final analysis (see supplementary figure 1 for participant flow chart).
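
For illustration only, the baseline exclusion rules could be applied to a participant table as in the following Python sketch; the table and column names are hypothetical and do not correspond to the cohort datasets.

```python
import pandas as pd

# Hypothetical participant table; all column names are assumptions for illustration.
participants = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "sex": ["F", "F", "M", "M"],
    "age": [46, 52, None, 61],
    "potato_servings_wk": [2.0, None, 3.5, 7.0],
    "energy_kcal": [1800, 3900, 2600, 4500],
})

def plausible_energy(row):
    # Women: 500-3500 kcal/day; men: 800-4200 kcal/day.
    lo, hi = (500, 3500) if row["sex"] == "F" else (800, 4200)
    return lo <= row["energy_kcal"] <= hi

eligible = participants[
    participants["age"].notna()
    & participants["potato_servings_wk"].notna()
    & participants.apply(plausible_energy, axis=1)
]
print(eligible["id"].tolist())  # only participant 1 meets all baseline criteria
```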

Assessment of dietary intake

Diet was assessed using validated semiquantitative food frequency questionnaires on usual dietary intake over the previous year. Dietary intakes in NHS were first collected in 1980 using a short questionnaire that was expanded and used in 1984, 1986, and then every four years to 2010. In both NHSII and HPFS, dietary intakes were evaluated every four years, starting in 1991 and 1986, respectively. In all food frequency questionnaires, participants were asked how often they consumed each food item, with a standard portion size provided. Nine response options were available, ranging from “never, or <1 time/month” to “≥6 times/day.” Three questions on potato consumption were asked: one about baked, boiled, or mashed potatoes (one medium or one cup), another about French fries (4-6 oz or one serving), and one about potato or corn chips (small bag or 1 oz). We adjusted the grams per serving over time by monitoring weights in NHANES (National Health and Nutrition Examination Survey), allowing us to account for temporal variations in portion sizes for specific foods (see supplementary table 1). Total potato intake was calculated by summing the servings of baked, boiled, or mashed potatoes and French fries. We did not include consumption of chips (referred to as crisps in the UK) in the total potato intake as the food frequency questionnaire combined potato and corn chips in a single question, and therefore we treated chips as a separate item. Supplementary table 2 provides information on other food groups, such as total red meat, fish, dairy, nuts and legumes, poultry, eggs, fruits, vegetables, and sugar sweetened beverages. We calculated intakes of total energy and alcohol using data from the Harvard University Food Composition Database. The validity and reproducibility of the food frequency questionnaire have been evaluated through dietary records in a subset of 649 male participants in HPFS and 736 female participants in NHS and NHSII.22 The deattenuated Pearson correlations between total potato intake reported on the food frequency questionnaires and seven day dietary records (as reference) were 0.61 for women and 0.63 for men.
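
To make the derivation of total potato intake concrete, the sketch below converts frequency categories to servings per week and sums the two qualifying items, excluding chips; the category-to-frequency mapping is an illustrative assumption, not the cohort conversion table.

```python
# Map FFQ response categories to approximate servings per week
# (midpoint-style conversions; the values are assumptions for illustration).
FREQ_TO_PER_WEEK = {
    "never or <1/month": 0.0,
    "1-3/month": 0.5,
    "1/week": 1.0,
    "2-4/week": 3.0,
    "5-6/week": 5.5,
    "1/day": 7.0,
    ">=6/day": 42.0,
}

def total_potato_servings(baked_boiled_mashed, french_fries):
    """Total potato intake excludes potato/corn chips, which are treated as a separate item."""
    return FREQ_TO_PER_WEEK[baked_boiled_mashed] + FREQ_TO_PER_WEEK[french_fries]

print(total_potato_servings("2-4/week", "1-3/month"))  # 3.5 servings/week
```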

Assessment of diabetes

Participants from each cohort self-reported newly diagnosed T2D through biennial questionnaires. These participants were mailed a detailed follow-up questionnaire to gather more information on symptoms, diagnostic tests, and use of hypoglycemic drugs. A T2D diagnosis was considered confirmed if participants fulfilled one or more of the American Diabetes Association’s criteria listed on the supplementary questionnaire: the presence of one or more classic symptoms (eg, excessive thirst, polyuria, weight loss, hunger, pruritus, or coma) along with a fasting plasma glucose level of ≥7.0 mmol/L or a random plasma glucose level of ≥11.1 mmol/L; in the absence of symptoms, at least two separate instances of raised plasma glucose levels, including fasting levels of ≥7.8 mmol/L, random plasma glucose levels of ≥11.1 mmol/L, or a plasma glucose level of ≥11.1 mmol/L during an oral glucose tolerance test; or using hypoglycemic medications such as insulin or oral diabetes drugs. Before 1998, people with T2D were identified based on the National Diabetes Data Group criteria,23 which defined diabetes as a fasting plasma glucose level of ≥7.8 mmol/L.
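
As a schematic of the confirmation logic only (the variable names and simplifications are ours, not the supplementary questionnaire's), the three confirmation routes could be expressed as:

```python
def t2d_confirmed(symptomatic, fasting_glucose, random_glucose,
                  n_raised_glucose_tests, on_hypoglycemic_drugs):
    """Return True if any one of the three confirmation routes is met (glucose in mmol/L).

    n_raised_glucose_tests counts separate occasions with fasting >=7.8, random >=11.1,
    or 2 hour oral glucose tolerance test >=11.1 mmol/L.
    """
    # Route 1: classic symptoms plus fasting >=7.0 or random >=11.1 mmol/L.
    if symptomatic and (fasting_glucose >= 7.0 or random_glucose >= 11.1):
        return True
    # Route 2: no symptoms, but raised glucose on at least two separate occasions.
    if not symptomatic and n_raised_glucose_tests >= 2:
        return True
    # Route 3: use of insulin or oral hypoglycemic drugs.
    return on_hypoglycemic_drugs

print(t2d_confirmed(True, 7.4, 0.0, 0, False))   # True, via route 1
print(t2d_confirmed(False, 6.2, 0.0, 1, False))  # False, criteria not met
```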

Assessment of covariates

We collected and updated data on a variety of risk factors and potential confounders for the association between potato consumption and risk of T2D, including age, race/ethnicity, family history of T2D, body weight, smoking status, physical activity, use of multivitamins, use of antihypertensives, use of cholesterol lowering drugs, and history of hypertension. This information was obtained through the primary biennial questionnaires. Additionally, we collected data on menopausal status and postmenopausal hormone use for women. Alcohol intake was assessed through food frequency questionnaires. The reliability and reproducibility of self-reported information on body weight, physical activity, and alcohol consumption have been documented elsewhere.24 25 Body mass index (BMI) was calculated by dividing weight in kilograms by height in meters squared. Physical activity was quantified by assigning a metabolic equivalent of task (MET) value to each activity and multiplying it by the time spent on that activity weekly (MET-h/week). From the food frequency questionnaires, we calculated the modified Alternate Healthy Eating Index (AHEI) score after excluding trans fatty acid and polyunsaturated fat components owing to their presence in French fries and potato chips. Additionally, we used a geo-coded composite score, encompassing educational background, income, property value, and marital status, to represent the area level neighborhood socioeconomic status of each participant.26
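
The two derived covariates have simple closed forms; the following sketch shows the calculations with illustrative numbers.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def met_hours_per_week(activities):
    """Sum of (MET value x hours/week) across reported activities."""
    return sum(met * hours for met, hours in activities)

print(round(bmi(70.0, 1.65), 1))                     # 25.7
print(met_hours_per_week([(7.0, 2.0), (3.3, 5.0)]))  # 30.5 MET-h/week
```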

Statistical analysis

Person years of follow-up were determined by calculating the period from the date of returning the baseline food frequency questionnaire (1984 for NHS, 1991 for NHSII, and 1986 for HPFS) until the date of T2D diagnosis, death, loss to follow-up, or the end of follow-up (30 June 2020 for NHS, 30 June 2021 for NHSII, and 31 January 2018 for HPFS), whichever came first.
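
As a minimal sketch of this censoring rule, follow-up ends at the earliest of the candidate dates; the example dates below are illustrative.

```python
from datetime import date

# Cohort specific administrative end of follow-up, as described above.
COHORT_END = {"NHS": date(2020, 6, 30), "NHSII": date(2021, 6, 30), "HPFS": date(2018, 1, 31)}

def person_years(cohort, baseline, diagnosis=None, death=None, lost=None):
    """Follow-up time in years from return of the baseline FFQ to the first censoring event."""
    candidates = [d for d in (diagnosis, death, lost, COHORT_END[cohort]) if d is not None]
    end = min(candidates)
    return (end - baseline).days / 365.25

# Example: an NHS participant followed from 1984 until a T2D diagnosis in 2001.
print(round(person_years("NHS", date(1984, 6, 1), diagnosis=date(2001, 3, 15)), 1))  # 16.8
```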

To assess the associations between various forms of potato consumption (total; baked, boiled, or mashed; French fries; and chips) and incidence of T2D, we used cohort specific Cox proportional hazards models to compute hazard ratios and corresponding 95% confidence intervals (CIs). In our main analysis, we calculated cumulative averages of dietary intakes from the baseline food frequency questionnaire to the beginning of each subsequent four year follow-up interval.27 This approach was taken to minimize the impact of random measurement errors resulting from within person variations and to reflect any changes in diet over time. For example, in NHS, we used the average potato consumption from 1984, 1986, 1990, and 1994 to estimate T2D risk from 1994 to 1998. Potato consumption was categorized by serving frequency: for total potato intake, <1/week (reference), 1/week, 2-4/week, 5-6/week, and ≥7/week; for specific potato types, almost never (reference), 1-3/month, 1/week, 2-4/week, and ≥5/week. To maintain an adequate sample size, we combined the two lowest consumption categories for baked, boiled, or mashed potatoes into a single reference category. Additionally, potato intake was assessed on a continuous scale, with increments of three servings weekly. We determined the model covariates using the modified disjunctive cause criterion,28 which was informed by a thorough review of the relevant literature. All analyses were stratified by age (in months, as the time scale) and calendar time (two year intervals) to control for confounding by age and account for secular trends. In the first multivariable model, we adjusted for total energy intake. The second model included further adjustments for race/ethnicity, smoking status, alcohol consumption, physical activity, multivitamin use, menopausal status and hormone use (NHS and NHSII only), family history of diabetes, use of antihypertensives, use of cholesterol lowering drugs, baseline hypertension history, BMI, and socioeconomic status. The third model included further adjustments for various food groups, including total red meat, poultry, fish, eggs, dairy products, nuts and legumes, fruits, vegetables, whole and refined grains, and sugar sweetened beverages, and was mutually adjusted for the various types of potatoes. Most covariates except for race, family history of diabetes, and baseline hypertension were updated biennially, whereas dietary variables and physical activity were updated every four years. The overall proportion of missing data on covariates was low across the three cohorts (eg, 10% on average for dietary variables). We handled missing values after baseline using a last observation carried forward approach, which is appropriate given the stability of covariates over short intervals in these cohorts.29 For residual missingness (see supplementary table 3), continuous variables with <0.5% missing data were imputed using cohort specific medians, whereas categorical variables (eg, race, smoking status, menopausal status, and hormone use) were handled using the missing indicator method. In previous analyses conducted within our cohorts, the missing indicator method generated results that were largely consistent with those obtained using the multiple imputation method.30
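
The cumulative averaging of exposure can be illustrated as follows; the intake values are placeholders, and only the averaging rule reflects the approach described above.

```python
def cumulative_average(ffq_reports, interval_start_year):
    """Mean intake across all FFQs returned up to the start of the follow-up interval."""
    values = [v for year, v in ffq_reports if year <= interval_start_year]
    return sum(values) / len(values)

# Example (NHS): the 1984, 1986, 1990, and 1994 reports are averaged to give the
# exposure for the 1994-1998 follow-up interval (intake values are illustrative).
reports = [(1984, 4.0), (1986, 3.0), (1990, 2.0), (1994, 3.0)]
print(cumulative_average(reports, 1994))  # 3.0 servings/week
```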

In the categorical analysis, we evaluated linear trends across categories by treating the median intake of each category as a continuous variable, with significance tested using the Wald test. To assess the potential dose-response relation between potato intake and risk of T2D, we pooled individual level data from the three cohorts and harmonized covariates to ensure consistent definitions and measurement across studies. We then applied a restricted cubic spline regression with three knots placed at the 10th, 50th, and 90th centiles of potato intake, using the SAS macro %LGTPHCURV9.31 The model was stratified by cohort to allow for cohort specific baseline hazard functions, whereas covariate effects were assumed to be common across cohorts given the harmonized data collection methods and standardized definitions for covariates. To evaluate the robustness of this approach, we conducted sensitivity analyses including cohort-covariate interaction terms and cohort specific spline models. The presence of non-linearity was evaluated using a likelihood ratio test, comparing a model with a linear term to one including both linear and cubic spline terms. To assess the proportional hazards assumption in our Cox regression model, we introduced interaction terms between age (years) and potato intake (three servings weekly) or each of the confounding variables (see supplementary tables 4-6).
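
For intuition, a restricted cubic spline with three knots adds a single non-linear term to the linear exposure term, and the non-linearity test compares models with and without that term. The numpy sketch below uses the standard Harrell-type basis construction and simulated intakes; it is illustrative and is not the %LGTPHCURV9 macro, and the log-likelihood values in the test are placeholders.

```python
import numpy as np
from scipy.stats import chi2

def rcs_basis(x, knots):
    """Restricted cubic spline basis: x plus k-2 non-linear terms (one term for 3 knots)."""
    x = np.asarray(x, dtype=float)
    k = np.asarray(knots, dtype=float)
    cube = lambda z: np.clip(z, 0.0, None) ** 3  # truncated cubic (z)+^3
    span = k[-1] - k[-2]
    terms = [
        cube(x - k[j])
        - cube(x - k[-2]) * (k[-1] - k[j]) / span
        + cube(x - k[-1]) * (k[-2] - k[j]) / span
        for j in range(len(k) - 2)
    ]
    return np.column_stack([x] + terms)

# Simulated potato intake (servings/week); knots at the 10th, 50th, and 90th centiles.
intake = np.random.default_rng(1).gamma(2.0, 1.5, size=1000)
knots = np.percentile(intake, [10, 50, 90])
X = rcs_basis(intake, knots)  # these columns would enter the Cox model as exposure terms

# Likelihood ratio test for non-linearity: the two log-likelihoods are placeholders for
# values from fitted models with the linear term only versus linear plus spline terms.
loglik_linear, loglik_spline = -5230.4, -5228.1
p_nonlinear = chi2.sf(2 * (loglik_spline - loglik_linear), df=1)
print(X.shape, round(p_nonlinear, 3))
```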

We also conducted several sensitivity analyses to determine the robustness of our findings. First, we explored whether the association between potato intake and T2D risk differed across various subgroups such as age, diet quality (assessed by modified AHEI score), physical activity levels, BMI, sex, history of hypertension at baseline, smoking status, and race/ethnicity. The interaction between potato consumption and dichotomous stratification factors was examined using the Wald test with one degree of freedom. We used likelihood ratio tests to evaluate the interaction of strata with multiple levels, such as smoking status and racial/ethnic groups, by comparing models with and without their product terms for potato consumption. Additionally, for variables treated as continuous (eg, BMI, physical activity, modified AHEI scores), we included interaction terms between potato consumption and the continuous form of these variables in the models and reported the corresponding P values for interaction. Second, we modeled dietary data in four additional ways: using only the baseline food frequency questionnaire, cumulative average intake while excluding the last three food frequency questionnaires before T2D diagnosis, the average of the most recent three food frequency questionnaires at the start of a two year follow-up period, and simple updated potato consumption over the follow-up period. Third, we repeated the main analyses adjusting for the modified AHEI32 instead of individual food groups. Fourth, the main analysis was repeated, focusing on people with T2D symptoms, ascertained through reports of at least one diabetes related symptom in the supplementary questionnaire. This approach was taken because individuals at higher risk for diabetes tend to undergo more frequent screenings and may receive a diagnosis earlier, which can lead to surveillance bias. Fifth, to assess the potential for confounding by the diagnosis of disease endpoints during the follow-up period, we stopped updating the cumulative average dietary intakes of participants once they self-reported angina, myocardial infarction, or a coronary artery bypass graft procedure. Sixth, the analysis was adjusted for baseline BMI rather than time varying BMI. For intake of French fries, we further adjusted for BMI and BMI squared instead of using categorical BMI to account for both linear and non-linear effects. We also adjusted for confectionery intake, trans fatty acid intake, and the polygenic risk score for T2D.33 Seventh, to assess the extent to which time varying BMI mediates the associations, we excluded BMI from the main analysis and estimated the percentage of the associations that was mediated by BMI. Eighth, we examined the relation between each increment of three servings weekly of potato intake and T2D risk over intervals of 0-4, 5-8, 9-12, 13-16, 17-20, and 21-24 years to identify any potential bias from reverse causation through latency analysis. For instance, in NHSII, for a presumed latency period of 8-12 years, potato intake in 1991 was used to predict risk of T2D from 1999 to 2003, intake in 1995 was used to predict risk of T2D from 2003 to 2007, and so on. We additionally excluded individuals with incident T2D diagnosed within the first 10 years of follow-up to assess potential bias from latent undetected diabetes. Finally, for predictors that showed non-proportional hazards, we modified our primary cohort risk models by including interaction terms between these covariates and a log-transformed age variable. In making this adjustment we aimed to account for potential time dependent effects and to examine whether these altered the primary estimates of the association between potato intake and T2D risk.
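
The BMI mediation step can be approximated on the log hazard ratio scale by the difference method, comparing coefficients from models with and without time varying BMI; the hazard ratios below are placeholders, not study results.

```python
import math

def percent_mediated(beta_total, beta_direct):
    """Difference-method approximation: share of the log hazard ratio explained by the mediator."""
    return 100.0 * (beta_total - beta_direct) / beta_total

# Hypothetical log hazard ratios per 3 servings/week, without and with BMI in the model.
beta_without_bmi = math.log(1.10)   # total association (BMI excluded)
beta_with_bmi = math.log(1.06)      # direct association (BMI adjusted)
print(round(percent_mediated(beta_without_bmi, beta_with_bmi), 1))  # ~38.9% mediated
```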

In modeled substitution analyses, for estimating risk of T2D associated with replacing three servings weekly of potatoes with common alternatives (whole grains, refined grains, starchy and non-starchy vegetables, legumes, white and brown rice), we applied the serving focused strategy instead of an energy focused one. We included both consumption of potatoes and replacement foods as continuous variables in the Cox proportional hazards models34 35 and calculated hazard ratios and 95% CIs for the incidence of T2D, based on differences in the estimated coefficients along with their pooled variances. This approach compares the associations of specific foods with T2D risk, rather than assuming a direct causal effect of substitution. In a sensitivity analysis, we adjusted for energy from macronutrients instead of total energy to evaluate the robustness of the observed associations. We employed a two stage individual participant data (IPD) meta-analysis approach, conducting all analyses, except those stratified by race/ethnicity, individually within each cohort. The resulting coefficients were subsequently combined using variance weighted fixed effect meta-analysis. Alternatively, as a secondary approach, we combined coefficients using random effects models. Because of limited numbers of people with T2D within race/ethnicity subgroups, we analyzed the association using combined data from all cohorts. All statistical analyses were performed using SAS for UNIX version 9.4 (SAS Institute, Cary, NC), with all P values calculated as two sided and a significance level set at α=0.05.
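
A sketch of the two computational steps described above, with placeholder numbers: the substitution hazard ratio from the difference between the replacement food and potato coefficients (with its variance), and inverse variance weighted fixed effect pooling of cohort specific coefficients.

```python
import numpy as np
from scipy.stats import norm

def substitution_hr(beta_repl, beta_potato, var_repl, var_potato, cov=0.0, alpha=0.05):
    """HR for replacing potatoes with the alternative food: exp(beta_repl - beta_potato)."""
    diff = beta_repl - beta_potato
    se = np.sqrt(var_repl + var_potato - 2 * cov)  # cov comes from the jointly fitted model
    z = norm.ppf(1 - alpha / 2)
    return np.exp(diff), (np.exp(diff - z * se), np.exp(diff + z * se))

def fixed_effect_pool(betas, variances):
    """Inverse variance weighted fixed effect meta-analysis of cohort specific coefficients."""
    w = 1.0 / np.asarray(variances)
    beta = np.sum(w * np.asarray(betas)) / np.sum(w)
    return beta, 1.0 / np.sum(w)

# Hypothetical per-cohort log hazard ratios per 3 servings/week (NHS, NHSII, HPFS order).
beta_potato, var_potato = fixed_effect_pool([0.08, 0.05, 0.06], [0.0009, 0.0012, 0.0016])
beta_grain, var_grain = fixed_effect_pool([-0.07, -0.05, -0.04], [0.0010, 0.0011, 0.0015])
hr, ci = substitution_hr(beta_grain, beta_potato, var_grain, var_potato)
print(round(hr, 2), tuple(round(x, 2) for x in ci))
```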

Dose-response and substitution meta-analysis of published cohorts

The methods section in the supplementary file provides a comprehensive overview of the linear and non-linear dose-response meta-analyses, as well as the subgroup and sensitivity analyses and the assessment of evidence quality. Briefly, our findings were reported following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines,36 and the study protocol was pre-registered on PROSPERO (CRD42023448736). We carried out comprehensive searches in PubMed/Medline, ISI Web of Science, and Embase up to July 2024, using predetermined search terms, without imposing any restrictions (see supplementary table 7). We included prospective cohort studies that investigated the association between potato consumption, total and specific types according to cooking method, and risk of T2D in populations without pre-existing cardiovascular diseases, cancer, and T2D at baseline (see supplementary table 8). First, we conducted a linear dose-response meta-analysis,37 38 which examined the risk of T2D in relation to an increase of three servings weekly of potatoes using an inverse variance weighted model. Second, we performed a one stage mixed effects meta-analysis to model a potential non-linear association between potato intake and T2D risk using restricted cubic splines with three knots at the 10th, 50th, and 90th centiles of the intake distribution.39 Finally, using meta-analyzed results, we estimated the effect of substituting three servings weekly of potatoes, both total and specific types, with whole grains. The estimate for potatoes was derived from the current meta-analysis, whereas the estimate for whole grains was obtained by updating our previously published dose-response meta-analysis.40 We first calculated β coefficients (log hazard ratios) for every three servings weekly of whole grains and potatoes (ie, total, fried, and non-fried). We then calculated the difference between these coefficients and, using their variances and covariance, estimated the 95% CI for this difference. The difference was then exponentiated to determine the hazard ratio for each substitution.34 The covariance between the two coefficients was estimated from the multivariable meta-analysis of NHS, NHSII, and HPFS. This approach was assumed to be valid, given the substantial weight of our cohorts in both meta-analyses. Additionally, we conducted sensitivity analyses using two alternative assumptions about covariance: assuming independence between parameters (r=0) and then adjusting our analysis by accounting for a modest correlation between parameter estimates (r=±0.20). We also performed several sensitivity analyses to evaluate the robustness of our findings against potential sources of heterogeneity.
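
The covariance sensitivity analysis can be illustrated directly: with meta-analyzed coefficients for whole grains and potatoes, the covariance is reconstructed from an assumed correlation r and the substitution hazard ratio recomputed for r=0 and r=±0.20. The coefficients below are placeholders, not the published estimates.

```python
import math

def substitution_hr_with_corr(beta_grain, se_grain, beta_potato, se_potato, r):
    """Substitution HR and 95% CI under an assumed correlation r between the two estimates."""
    cov = r * se_grain * se_potato
    diff = beta_grain - beta_potato
    se = math.sqrt(se_grain**2 + se_potato**2 - 2 * cov)
    return tuple(round(math.exp(v), 2) for v in (diff, diff - 1.96 * se, diff + 1.96 * se))

# Placeholder meta-analyzed log hazard ratios per 3 servings/week.
beta_grain, se_grain = -0.06, 0.02
beta_potato, se_potato = 0.05, 0.02
for r in (0.0, 0.20, -0.20):
    print(r, substitution_hr_with_corr(beta_grain, se_grain, beta_potato, se_potato, r))
```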
