A Data Driven End-to-end Approach for In-the-wild Monitoring of Eating
Behavior Using Smartwatches
- URL: http://arxiv.org/abs/2010.07051v1
- Date: Mon, 12 Oct 2020 12:35:56 GMT
- Authors: Konstantinos Kyritsis, Christos Diou and Anastasios Delopoulos
- Abstract summary: This paper presents a complete framework towards the automated i) modeling of in-meal eating behavior and ii) temporal localization of meals.
We present an end-to-end Neural Network which detects food intake events (i.e., bites).
We show how the distribution of the detected bites throughout the day can be used to estimate the start and end points of meals, using signal processing algorithms.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The increased worldwide prevalence of obesity has sparked the interest of the
scientific community towards tools that objectively and automatically monitor
eating behavior. Despite the study of obesity being in the spotlight, such
tools can also be used to study eating disorders (e.g. anorexia nervosa) or
provide a personalized monitoring platform for patients or athletes. This paper
presents a complete framework towards the automated i) modeling of in-meal
eating behavior and ii) temporal localization of meals, from raw inertial data
collected in-the-wild using commercially available smartwatches. Initially, we
present an end-to-end Neural Network which detects food intake events (i.e.
bites). The proposed network uses both convolutional and recurrent layers that
are trained simultaneously. Subsequently, we show how the distribution of the
detected bites throughout the day can be used to estimate the start and end
points of meals, using signal processing algorithms. We perform extensive
evaluation on each framework part individually. Leave-one-subject-out (LOSO)
evaluation shows that our bite detection approach outperforms four
state-of-the-art algorithms towards the detection of bites during the course of
a meal (0.923 F1 score). Furthermore, LOSO and held-out set experiments
regarding the estimation of meal start/end points reveal that the proposed
approach outperforms a relevant approach found in the literature (Jaccard Index
of 0.820 and 0.821 for the LOSO and held-out experiments, respectively).
Experiments are performed using our publicly available FIC and the newly
introduced FreeFIC datasets.
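The abstract describes an end-to-end network combining convolutional and recurrent layers, trained jointly on raw smartwatch inertial data. A minimal sketch of such an architecture is below; the layer sizes, kernel widths, and six-channel IMU input (accelerometer + gyroscope) are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class BiteDetector(nn.Module):
    """Hedged sketch of a conv + recurrent bite detector.

    The convolutional front-end extracts short-term motion features from
    the raw IMU stream; the LSTM models the longer temporal structure of
    in-meal gestures. All sizes here are illustrative assumptions.
    """
    def __init__(self, channels=6, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # per-time-step bite logit

    def forward(self, x):
        # x: (batch, time, channels) raw inertial samples
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.lstm(h)
        return self.head(h).squeeze(-1)   # (batch, time) logits
```

Because both stages sit in one module, the convolutional and recurrent weights are optimized simultaneously by a single backward pass, matching the joint-training idea stated in the abstract.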
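The second stage estimates meal start/end points from the daily distribution of detected bites via signal processing. One plausible realization of that idea is to smooth the bite timestamps into a density over the day and threshold it; the Gaussian-kernel smoothing, grid step, and threshold below are assumptions for illustration, not the paper's exact pipeline or parameters.

```python
import numpy as np

def locate_meals(bite_times, day_seconds=86400, step=60, sigma=300, thresh=1e-4):
    """Estimate (start, end) meal intervals from bite timestamps in seconds.

    Sketch only: smooths the daily bite distribution with a Gaussian
    kernel and extracts contiguous above-threshold runs as meals.
    """
    grid = np.arange(0, day_seconds, step)          # 1-minute time grid
    density = np.zeros_like(grid, dtype=float)
    for t in bite_times:                            # kernel density estimate
        density += np.exp(-0.5 * ((grid - t) / sigma) ** 2)
    density /= sigma * np.sqrt(2 * np.pi)
    active = density > thresh                       # candidate meal regions
    edges = np.flatnonzero(np.diff(active.astype(int)))
    bounds = np.concatenate(([0], edges + 1, [len(grid)]))
    meals = []
    for a, b in zip(bounds[:-1], bounds[1:]):       # contiguous runs
        if active[a]:
            meals.append((grid[a], grid[b - 1]))
    return meals
```

For example, bites clustered around 12:00 and 19:00 would yield two intervals, each bracketing its cluster; isolated spurious detections fall below the density threshold and are ignored.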
Related papers
- Automatic Recognition of Food Ingestion Environment from the AIM-2 Wearable Sensor [3.9956522522260447]
We propose a neural network-based method with a two-stage training framework that tactfully combines fine-tuning and transfer learning techniques.
Our method is evaluated on a newly collected dataset called "UA Free Living Study", which uses an egocentric wearable camera, the AIM-2 sensor, to simulate food consumption in free-living conditions.
Experimental results on the collected dataset show that our proposed method for automatic ingestion environment recognition successfully addresses the challenging data imbalance problem in the dataset and achieves a promising overall classification accuracy of 96.63%.
arXiv Detail & Related papers (2024-05-13T15:12:21Z) - NutritionVerse-Direct: Exploring Deep Neural Networks for Multitask Nutrition Prediction from Food Images [63.314702537010355]
Self-reporting methods are often inaccurate and suffer from substantial bias.
Recent work has explored using computer vision prediction systems to predict nutritional information from food images.
This paper aims to enhance the efficacy of dietary intake estimation by leveraging various neural network architectures.
arXiv Detail & Related papers (2024-05-13T14:56:55Z) - NutritionVerse: Empirical Study of Various Dietary Intake Estimation Approaches [59.38343165508926]
Accurate dietary intake estimation is critical for informing policies and programs to support healthy eating.
Recent work has focused on using computer vision and machine learning to automatically estimate dietary intake from food images.
We introduce NutritionVerse-Synth, the first large-scale dataset of 84,984 synthetic 2D food images with associated dietary information.
We also collect a real image dataset, NutritionVerse-Real, containing 889 images of 251 dishes to evaluate realism.
arXiv Detail & Related papers (2023-09-14T13:29:41Z) - Food Image Classification and Segmentation with Attention-based Multiple
Instance Learning [51.279800092581844]
The paper presents a weakly supervised methodology for training food image classification and semantic segmentation models.
The proposed methodology is based on a multiple instance learning approach in combination with an attention-based mechanism.
We conduct experiments on two meta-classes within the FoodSeg103 data set to verify the feasibility of the proposed approach.
arXiv Detail & Related papers (2023-08-22T13:59:47Z) - Intake Monitoring in Free-Living Conditions: Overview and Lessons we
Have Learned [5.118928796825531]
We present a high-level overview of our recent work on intake monitoring using a smartwatch.
We also present evaluation results of these methods in challenging, real-world datasets.
arXiv Detail & Related papers (2022-06-04T08:38:23Z) - Enhancing Food Intake Tracking in Long-Term Care with Automated Food
Imaging and Nutrient Intake Tracking (AFINI-T) Technology [71.37011431958805]
Half of long-term care (LTC) residents are malnourished, increasing hospitalization, mortality, and morbidity, and lowering quality of life.
This paper presents the automated food imaging and nutrient intake tracking (AFINI-T) technology designed for LTC.
arXiv Detail & Related papers (2021-12-08T22:25:52Z) - Vision-Based Food Analysis for Automatic Dietary Assessment [49.32348549508578]
This review presents one unified Vision-Based Dietary Assessment (VBDA) framework, which generally consists of three stages: food image analysis, volume estimation and nutrient derivation.
Deep learning makes VBDA gradually move to an end-to-end implementation, which applies food images to a single network to directly estimate the nutrition.
arXiv Detail & Related papers (2021-08-06T05:46:01Z) - MyFood: A Food Segmentation and Classification System to Aid Nutritional
Monitoring [1.5469452301122173]
The absence of food monitoring has contributed significantly to the increase in the population's weight.
Some solutions have been proposed in computer vision to recognize food images, but few are specialized in nutritional monitoring.
This work presents the development of an intelligent system that classifies and segments food presented in images to help the automatic monitoring of user diet and nutritional intake.
arXiv Detail & Related papers (2020-12-05T17:40:05Z) - MFED: A System for Monitoring Family Eating Dynamics [7.390103907991721]
Family eating dynamics (FED) have high potential to impact child and parent dietary intake, and ultimately the risk of obesity.
To date, there exists no system for real-time monitoring of FED.
This paper presents MFED, the first of its kind of system for monitoring FED in the wild in real-time.
arXiv Detail & Related papers (2020-07-11T19:00:53Z) - Detecting Parkinsonian Tremor from IMU Data Collected In-The-Wild using
Deep Multiple-Instance Learning [59.74684475991192]
Parkinson's Disease (PD) is a slowly evolving neurological disease that affects about 1% of the population above 60 years old.
PD symptoms include tremor, rigidity and bradykinesia.
We present a method for automatically identifying tremorous episodes related to PD, based on IMU signals captured via a smartphone device.
arXiv Detail & Related papers (2020-05-06T09:02:30Z) - Proximity-Based Active Learning on Streaming Data: A Personalized Eating
Moment Recognition [17.961752949636306]
We propose Proximity-based Active Learning on Streaming data, a novel proximity-based model for recognizing eating gestures.
Our analysis on data collected in both controlled and uncontrolled settings indicates that the F-score of PLAS ranges from 22% to 39% for a budget that varies from 10 to 60 queries.
arXiv Detail & Related papers (2020-03-29T18:17:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.