However, the methods used in these surveys are often based on memory (e.g. 24-hour recall methods), and, being human, we may simply forget what exactly we had for lunch. Another issue is that when people are asked how much they consumed, they tend to under- or overestimate.
The Holy Grail would be to measure an individual's consumption accurately in a way that does not interrupt their daily routine. So how could we monitor what an individual consumes without interfering? The answer might lie in smart technology.
Smart fridges already exist with sensors that can scan their contents to alert you to what you need to buy on your next shopping trip, or simply order the food for you. Moreover, if you remove an item, say a carton of milk, and put it back in again, the fridge might sense the change in weight and work out whether you are running low. We could easily adapt this technology to log which foods, and in what quantities, went from the fridge into our stomachs. The only challenge is that the food you remove might not be consumed by you. In this case, wearable technology that can monitor us and our surroundings might be the answer.
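The weight-sensing idea above can be sketched in a few lines. This is a hypothetical illustration, not a real smart-fridge API: the `FridgeShelf` class and its methods are invented names, and the assumption is simply that any weight lost between removing an item and putting it back was consumed.

```python
class FridgeShelf:
    """Hypothetical sketch: tracks one item's weight so that removals
    and returns can be compared to infer consumption."""

    def __init__(self, item, weight_g):
        self.item = item
        self.weight_g = weight_g
        self.consumed_g = 0  # running total inferred as eaten/drunk

    def remove(self):
        """Item taken off the shelf; remember its weight at removal."""
        self._weight_at_removal = self.weight_g

    def put_back(self, new_weight_g):
        """Item returned; the missing weight is assumed consumed."""
        delta = self._weight_at_removal - new_weight_g
        if delta > 0:
            self.consumed_g += delta
        self.weight_g = new_weight_g
        return delta

# A 1 kg carton of milk comes back 250 g lighter:
milk = FridgeShelf("milk", 1000)
milk.remove()
used = milk.put_back(750)  # infers 250 g consumed
```

As the paragraph notes, this only tells us what left the fridge, not who consumed it, which is exactly where wearables come in.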
The future of smart glasses is not looking bright at present, especially since Google have seemingly discontinued their Google Glass project. Nevertheless, smart watches with inbuilt cameras are becoming gadgets we are more likely to wear, especially with the upcoming release of the Apple Watch. If we were to wear a smartwatch as a fob watch, pinned to our shirt in the way nurses wear their watches, then it could act as a smart monitor, watching us and what we eat. Then, with the aid of computer vision, a computational method of analysing and “understanding” images, it may be possible to estimate our food portions by comparing the size of our plate with the surface area of the food that it covers. Our smart monitor may also be able to communicate with our smart fridge to infer the likely amount that is on our plate. The smart monitor could even identify what we are eating when we are out and about, as it could be programmed to know what a hamburger looks like. It could even recognise brand logos to make an informed decision about the burger’s recipe. Our smart monitor may even tell us to put down that bar of chocolate!
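The plate-comparison idea can be made concrete with a little geometry. In this sketch we assume a separate image-segmentation step (not shown) has already counted the pixels belonging to the plate and to the food; the function name and the plate diameter are illustrative assumptions, not part of any real product.

```python
import math

def estimate_food_area_cm2(plate_diameter_cm, plate_pixel_area, food_pixel_area):
    """Hypothetical sketch: estimate the real surface area covered by food,
    using the plate of known diameter as a built-in ruler."""
    # Real area of the (assumed circular) plate from its known diameter.
    plate_area_cm2 = math.pi * (plate_diameter_cm / 2) ** 2
    # One plate pixel therefore corresponds to this many square centimetres.
    cm2_per_pixel = plate_area_cm2 / plate_pixel_area
    return food_pixel_area * cm2_per_pixel

# A 26 cm dinner plate occupying 100,000 pixels in the image,
# with food covering 40,000 of them:
area = estimate_food_area_cm2(26, 100_000, 40_000)
```

Surface area alone does not give portion weight, of course, which is why combining this with fridge data (or known products, via logo recognition) could help pin down the likely amount.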
The technology may not be ready just yet, but when it is, we will be able to automatically build food ‘diaries’ with a range of likely amounts of the foods and beverages we consume, so that our average intake of a product can be calculated. Such valuable data could greatly enhance our knowledge of consumer dietary habits, which could influence decision makers in Food Safety, Nutrition, Health and Wellbeing at both government and industry levels.
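To illustrate the last step, here is a minimal sketch of how such a diary might be reduced to an average intake per product. The diary entries and function name are hypothetical; a real system would also carry the uncertainty range around each logged amount.

```python
# Hypothetical automatically built food diary: (product, grams) per event.
diary = [
    ("milk", 250),
    ("hamburger", 180),
    ("milk", 200),
    ("milk", 240),
]

def average_intake(diary, product):
    """Mean amount consumed per eating event for one product."""
    amounts = [grams for name, grams in diary if name == product]
    return sum(amounts) / len(amounts) if amounts else 0.0

avg_milk = average_intake(diary, "milk")  # (250 + 200 + 240) / 3 = 230 g
```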
Here at Creme Global we analyse vast amounts of data on food and cosmetic product consumption, coupled with data on the amount of each product consumed at each event. We then combine these with other important variables to provide a picture of total aggregate consumption, enabling the analysis of ingredients and constituents from all sources of intake or exposure.
For more information on how Creme Global can assist you with your aggregate exposure and data science needs, visit our website where you can contact us.