We propose that highly processed foods share pharmacokinetic properties (e.g. concentrated dose, rapid rate of absorption) with drugs of abuse, owing to the addition of fat and/or refined carbohydrates and to the rapid rate at which refined carbohydrates are absorbed into the system, as indicated by glycemic load (GL). The current study provides preliminary evidence for the foods and food attributes implicated in addictive-like eating.
Background: The age at which allergenic foods should be introduced into the diet of breast-fed infants is uncertain. We evaluated whether the early introduction of allergenic foods into the diet of breast-fed infants would protect against the development of food allergy. Methods: We recruited, from the general population, 1303 exclusively breast-fed infants who were 3 months of age and randomly assigned them to the early introduction of six allergenic foods (peanut, cooked egg, cow’s milk, sesame, whitefish, and wheat; early-introduction group) or to the current practice recommended in the United Kingdom of exclusive breast-feeding to approximately 6 months of age (standard-introduction group). The primary outcome was food allergy to one or more of the six foods between 1 year and 3 years of age. Results: In the intention-to-treat analysis, food allergy to one or more of the six intervention foods developed in 7.1% of the participants in the standard-introduction group (42 of 595 participants) and in 5.6% of those in the early-introduction group (32 of 567) (P=0.32). In the per-protocol analysis, the prevalence of any food allergy was significantly lower in the early-introduction group than in the standard-introduction group (2.4% vs. 7.3%, P=0.01), as was the prevalence of peanut allergy (0% vs. 2.5%, P=0.003) and egg allergy (1.4% vs. 5.5%, P=0.009); there were no significant effects with respect to milk, sesame, fish, or wheat. The consumption of 2 g per week of peanut or egg-white protein was associated with a significantly lower prevalence of these respective allergies than was less consumption. The early introduction of all six foods was not easily achieved but was safe. Conclusions: The trial did not show the efficacy of early introduction of allergenic foods in an intention-to-treat analysis. Further analysis raised the question of whether the prevention of food allergy by means of early introduction of multiple allergenic foods was dose-dependent. (Funded by the Food Standards Agency and others; EAT Current Controlled Trials number, ISRCTN14254740.)
To investigate the contribution of ultra-processed foods to the intake of added sugars in the USA. Ultra-processed foods were defined as industrial formulations which, besides salt, sugar, oils and fats, include substances not used in culinary preparations, in particular additives used to imitate sensorial qualities of minimally processed foods and their culinary preparations.
Accurate monitoring of changes in dietary patterns in response to food policy implementation is challenging. Metabolic profiling allows simultaneous measurement of hundreds of metabolites in urine, the concentrations of which can be affected by food intake. We hypothesised that metabolic profiles of urine samples developed under controlled feeding conditions reflect dietary intake and can be used to model and classify dietary patterns of free-living populations.
To assess the prospective associations between consumption of ultra-processed food and risk of cancer.
To facilitate foodborne outbreak investigations, there is a need for better methods of identifying which food products should be sampled for laboratory analysis. The aim of this study was to examine the applicability, to real outbreak data, of a likelihood ratio approach previously developed on simulated data. We used human case and food product distribution data from the Norwegian enterohaemorrhagic Escherichia coli outbreak in 2006. The approach was adjusted to include time, spatial smoothing, and the handling of missing or misclassified information. The performance of the adjusted likelihood ratio approach on the data from the HUS outbreak and on control data indicates that the adjusted approach is promising and could be a useful tool for assisting and facilitating foodborne outbreak investigations in the future, provided that good traceability is implemented in the distribution chain. However, the approach needs to be further validated on other outbreak data, and on food products other than meat products, before more general conclusions about its applicability can be drawn.
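The core idea of a likelihood ratio approach to product ranking can be sketched as follows: score each candidate food product by how well its geographic distribution explains the observed pattern of cases, relative to a baseline (e.g. population) distribution. All data, region names, and the simple log-likelihood-ratio scoring below are illustrative assumptions, not the published method, which additionally models time, spatial smoothing, and missing or misclassified information.

```python
import math

def log_likelihood_ratio(case_counts, product_share, baseline_share):
    """Log-likelihood ratio that a product's regional distribution explains
    the observed case pattern, versus a baseline distribution.
    Illustrative sketch only; data shapes are hypothetical."""
    llr = 0.0
    for region, n_cases in case_counts.items():
        p1 = product_share.get(region, 1e-9)   # small floor avoids log(0)
        p0 = baseline_share.get(region, 1e-9)
        llr += n_cases * (math.log(p1) - math.log(p0))
    return llr

# Toy example: cases cluster in the east, where product A is mostly sold
cases = {"east": 8, "west": 2}
share_A = {"east": 0.8, "west": 0.2}   # distribution of product A
share_B = {"east": 0.5, "west": 0.5}   # distribution of product B
baseline = {"east": 0.5, "west": 0.5}  # population baseline

scores = {"A": log_likelihood_ratio(cases, share_A, baseline),
          "B": log_likelihood_ratio(cases, share_B, baseline)}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])  # product A ranks highest for sampling
```

Products with the highest scores would be prioritised for laboratory sampling; the published approach extends this ranking with temporal and spatial structure.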
- Proceedings of the National Academy of Sciences of the United States of America
Food loss is widely recognized as undermining food security and environmental sustainability. However, consumption of resource-intensive food items instead of more efficient, equally nutritious alternatives can also be considered an effective food loss. Here we define and quantify these opportunity food losses as the food loss associated with consuming resource-intensive animal-based items instead of plant-based alternatives that are nutritionally comparable, e.g., in terms of protein content. We consider replacements that minimize cropland use for each of the main US animal-based food categories. We find that although the characteristic conventional retail-to-consumer food losses are ≈30% for plant and animal products, the opportunity food losses of beef, pork, dairy, poultry, and eggs are 96%, 90%, 75%, 50%, and 40%, respectively. This arises because plant-based replacement diets can produce 20-fold and twofold more nutritionally similar food per unit of cropland than beef and eggs, the most and least resource-intensive animal categories, respectively. Although conventional and opportunity food losses are both targets for improvement, the high opportunity food losses highlight the large potential savings beyond conventionally defined food losses. Concurrently replacing all animal-based items in the US diet with plant-based alternatives would add enough food to feed, in full, 350 million additional people, well above the expected benefits of eliminating all supply chain food waste. These results highlight the importance of dietary shifts to improving food availability and security.
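The opportunity-food-loss percentages follow directly from the fold-difference in food produced per unit of cropland. As a back-of-the-envelope check (a simplification of the paper's full analysis): if a plant-based replacement yields k-fold more nutritionally comparable food from the same cropland, choosing the animal item forgoes a fraction 1 − 1/k of that potential food.

```python
def opportunity_loss(fold_advantage):
    """Fraction of potential food forgone when the plant-based replacement
    yields `fold_advantage` times more food from the same cropland.
    Simplified illustration; the published figures come from a fuller
    analysis and differ slightly."""
    return 1 - 1 / fold_advantage

# Beef: plant replacements yield ~20-fold more food per unit cropland
print(f"beef: {opportunity_loss(20):.0%}")  # ~95%, close to the reported 96%
# Eggs: ~twofold advantage
print(f"eggs: {opportunity_loss(2):.0%}")   # 50%; the reported 40% reflects the full analysis
```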
Here we present evidence of phytoliths preserved in carbonised food deposits on prehistoric pottery from the western Baltic dating from 6,100 to 5,750 cal BP. Based on comparisons to over 120 European and Asian species, our observations are consistent with phytolith morphologies observed in modern garlic mustard seed (Alliaria petiolata (M. Bieb) Cavara & Grande). As this seed has a strong flavour, little nutritional value, and the phytoliths are found in pots along with terrestrial and marine animal residues, these findings are the first direct evidence for the spicing of food in European prehistoric cuisine. Our evidence suggests a much greater antiquity to the spicing of foods than is evident from the macrofossil record, and challenges the view that plants were exploited by hunter-gatherers and early agriculturalists solely for energy requirements, rather than taste.
Food consumption is thought to induce sleepiness. However, little is known about how postprandial sleep is regulated. Here, we simultaneously measured sleep and food intake of individual flies and found a transient rise in sleep following meals. Depending on the amount consumed, the effect ranged from slightly arousing to strongly sleep inducing. Postprandial sleep was positively correlated with ingested volume, protein, and salt, but not sucrose, revealing meal-property-specific regulation. Silencing of leucokinin receptor (Lkr) neurons specifically reduced sleep induced by protein consumption. Thermogenetic stimulation of leucokinin (Lk) neurons decreased, whereas Lk downregulation by RNAi increased, postprandial sleep, suggestive of an inhibitory connection in the Lk-Lkr circuit. We further identified a subset of non-leucokininergic cells proximal to Lkr neurons that rhythmically increased postprandial sleep when silenced, suggesting that these cells are cyclically gated inhibitory inputs to Lkr neurons. Together, these findings reveal the dynamic nature of postprandial sleep.
Psychological and neurobiological evidence implicates hippocampal-dependent memory processes in the control of hunger and food intake. In humans, these have been revealed in the hyperphagia that is associated with amnesia. However, it remains unclear whether ‘memory for recent eating’ plays a significant role in neurologically intact humans. In this study we isolated the extent to which memory for a recently consumed meal influences hunger and fullness over a three-hour period. Before lunch, half of our volunteers were shown 300 ml of soup and half were shown 500 ml. Orthogonal to this, half consumed 300 ml and half consumed 500 ml. This process yielded four separate groups (25 volunteers in each). Independent manipulation of the ‘actual’ and ‘perceived’ soup portion was achieved using a computer-controlled peristaltic pump, designed to either refill or draw soup from a soup bowl in a covert manner. Immediately after lunch, self-reported hunger was influenced by the actual and not the perceived amount of soup consumed. However, two and three hours after meal termination this pattern was reversed: hunger was predicted by the perceived amount and not the actual amount. Participants who thought they had consumed the larger 500-ml portion reported significantly less hunger. This was also associated with an increase in the ‘expected satiation’ of the soup 24 hours later. For the first time, this manipulation exposes the independent and important contribution of memory processes to satiety. Opportunities exist to capitalise on this finding to reduce energy intake in humans.