03 Sep 2024 — Nutrition-tracking smartphone applications powered by artificial intelligence (AI) require improvements in accuracy, finds a new study conducted at the University of Sydney, Australia. Persistent concerns include AI’s poor ability to recognize dishes and to estimate the caloric value of meals. However, these shortcomings could be addressed through further training of the AI models.
“Inaccuracies in AI-enabled food tracking apps may cause undesirable outcomes and may potentially be unsafe,” Dr Juliana Chen, lead author of the study and accredited practicing dietitian, lecturer and researcher in Nutrition and Dietetics at the University of Sydney, tells Nutrition Insight.
“Errors in macronutrient and micronutrient values in apps can impact patients in multiple ways including unnecessary avoidance of foods or inaccurate administration of medicine.”
On the question of whether and how AI’s food recognition abilities can be improved, Chen points to “data diversification of training datasets, such as including cultural foods and complex meal compositions, to train convolutional neural network (CNN) models.”
The study, published in Nutrients, screened 800 nutrition-tracking apps before evaluating 18 of them in detail. These 18 included both AI-integrated and manual food-logging nutrition apps, which were compared on their ability to recognize ingredients and estimate energy content.
Human dietitians still relevant
Chen recognizes that AI-enabled apps may be more convenient for users than those requiring manual food logging, but she warns of common mistakes AI has been shown to make.
“When patients or the public use apps to track food intake or manage weight, the process can often feel burdensome. Adding AI features like food image recognition could make the process much easier for everyone,” she argues.
“However, it is important to always double-check that the portion size detected matches what you ate. Some apps only identify the food, while others also estimate portion size and energy intake. So, for those undergoing weight loss, it is crucial to verify that the app’s estimates align with what you have eaten.”
To tackle these inaccuracies, Chen recommends involving dietitians in the development of AI food-tracking apps.
“Our paper found that 30 apps stated dietitian or healthcare professional involvement. This included involvement in the development of the app and authorship of blog posts, and 11 apps included a portal to connect with healthcare professionals, including dietitians. However, the exact role of dietitians in the development of the apps is unclear,” she explains.
“Dietitians have the ability to ensure the information and recommendations provided on the apps are current and evidence-based, follow systematic approaches such as the nutrition care process and incorporate behavioral change techniques to facilitate sustained behavioral changes to improve patient outcomes.”
Chen states that app nutritional data and advice should ideally be used as a tool to streamline dietitian consultations, by reducing the time spent on nutritional assessments, or as a convenient tool for self-monitoring, freeing more time for counseling to address individual enablers, barriers and interventions.
“Especially with AI-enabled food tracking, our paper found there are large discrepancies in energy estimation in AI-enabled food image recognition compared to the reference method. Dietitians can support clients in using the apps efficiently and safely.”
Training AI
Nonetheless, Chen argues that it is possible for AI models to be trained to recognize portion sizes and food combinations more accurately.
“Recognizing portion sizes is a challenging task for AI models. However, multi-task frameworks combining food classification and portion size estimation of real-life food images showed a reduction in error percentage when compared to portion sizes provided by dietitians,” she points out.
“Another study assessed the relative validity of a mobile AI app for dietary assessment. For better accuracy of portion size, they designed an AI algorithm using a standardized visual prop for scale next to the food. However, the feasibility of this in a real-world setting is limited by cost, accessibility and imposed burden on consumers.”
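The multi-task framing Chen describes can be pictured as a single CNN that shares one image encoder between two outputs: a food classifier and a portion-size estimator. The sketch below is illustrative only; the backbone, class count and loss weighting are assumptions, not details of the models evaluated in the study.

```python
# Illustrative sketch (not the study's model): a shared CNN backbone with two
# heads, one classifying the food and one regressing portion size in grams.
import torch
import torch.nn as nn
from torchvision import models

class FoodMultiTaskNet(nn.Module):
    def __init__(self, num_food_classes: int = 500):  # class count is a placeholder
        super().__init__()
        backbone = models.resnet18(weights=None)       # any CNN backbone works
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()                    # keep only the feature extractor
        self.backbone = backbone
        self.classifier = nn.Linear(feat_dim, num_food_classes)  # "what food is it?"
        self.portion_head = nn.Linear(feat_dim, 1)                # "how many grams?"

    def forward(self, images: torch.Tensor):
        features = self.backbone(images)
        return self.classifier(features), self.portion_head(features)

# Joint loss: classification error plus portion-size regression error.
def multitask_loss(class_logits, portion_pred, class_labels, portion_grams, alpha=1.0):
    ce = nn.functional.cross_entropy(class_logits, class_labels)
    mae = nn.functional.l1_loss(portion_pred.squeeze(-1), portion_grams)
    return ce + alpha * mae
```

Because both heads are trained on the same shared features, the portion-size estimate can benefit from whatever the network learns about recognizing the dish itself, which is the intuition behind the error reductions Chen cites.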
Chen explains that data diversification of training datasets, such as including cultural foods and complex meal compositions, is necessary to train CNN models.
Another approach is data augmentation, “which can include modifications to existing data such as transformations (e.g., angle adjustments), modifying the color space (e.g., brightness, contrast, saturation), using incomplete or partially erased images or adding noise.”
These methods are particularly useful in identifying food combinations, especially from new or unseen data, such as photos taken by consumers in varying compositions, angles, distances and lighting conditions.
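As a rough illustration of the augmentations Chen lists, the snippet below uses standard torchvision transforms for angle adjustments, color-space changes, partial erasure and added noise; the specific parameter values are placeholders rather than settings taken from the study.

```python
# Illustrative sketch of the augmentations described above, using torchvision.
import torch
from torchvision import transforms

train_augmentations = transforms.Compose([
    transforms.RandomRotation(degrees=30),           # angle adjustments
    transforms.ColorJitter(brightness=0.3,           # color-space changes:
                           contrast=0.3,             # brightness, contrast,
                           saturation=0.3),          # saturation
    transforms.ToTensor(),                           # PIL image -> tensor in [0, 1]
    transforms.RandomErasing(p=0.5),                 # incomplete / partially erased images
    transforms.Lambda(                               # added noise, clamped to valid range
        lambda t: (t + 0.02 * torch.randn_like(t)).clamp(0.0, 1.0)
    ),
])

# Usage: augmented = train_augmentations(pil_food_photo)
```

Applied on the fly during training, such transforms expose a CNN to the same dish under many simulated compositions, angles and lighting conditions, mirroring the variability of real consumer photos.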
The future of AI nutrition
The researcher asserts that as AI training advances, there is an “exciting potential” for AI integration in nutrition apps, particularly in streamlining the logging of food.
“However, while logging via nutrition apps is more acceptable than paper records, manual logging via looking up foods in the app food databases can still contribute to the burden of tracking food and self-monitoring longer term,” she continues.
“As dietitians, we do want to encourage ongoing self-monitoring among our patients as the scientific literature indicates that this can improve outcomes for health conditions (e.g., diabetes, obesity) and also improve users’ dietary intake.”
Chen argues that AI in nutrition apps also has the potential to support nutrition care with dietitians by offering remote and automated monitoring and more efficient and timely assessments of diet.
“However, improvements in AI-enabled image recognition of foods, drinks and whole meals, especially for cultural foods and mixed dishes, are still needed. Further training of the AI models within nutrition apps is required, alongside dietitian support in enhancing these features,” the dietitian asserts.
“Adding in a function where app users can modify the wrongly identified food or update the quantity if it is under or overestimated may also support greater accuracy in the logging of foods. Also, add a fiducial marker and an in-depth tutorial to apps that have AI-enabled image recognition for food.”
She explains that recent trends indicate an increase in interest in AI. “Considering the increase in AI integration in mobile apps for nutrition care (a six-fold increase in the last two years), we expect a further increase in AI integration as well as an improvement in their capabilities and accuracy.”
“However, a collaborative approach from dietitians, app developers and regulatory bodies is needed for the development of practical health apps by app developers and their endorsement by dietitians and regulatory bodies as a favorable tool in nutrition care,” Chen concludes.
By Milana Nikolova