Nutritional Info, Presented in Augmented Reality


Foodalyzer is an app concept developed by our wonderful partners at Chromaplex. It provides a quick and easy way to get nutritional information about your meal: just point your phone’s camera at your food, and nutrition labels appear in an augmented reality overlay.

Foodalyzer was created as a proof of concept that food items on store shelves could be recognized using AI and computer vision. The larger project is still ongoing, and Iteration Group is working with the Chromaplex team to improve the app’s usability and interface while also exploring cutting-edge design methodologies within augmented reality.


The Build

For the object-recognition functionality, Chromaplex first experimented with Apple’s built-in ARKit computer vision features, but believed that supplementing them with TensorFlow would improve accuracy. Using the TensorFlow machine learning framework, the team trained a model on tens of thousands of Flickr photographs tagged with various food-related hashtags and labels. A web-based microservice (API) was built around the resulting model and deployed to AWS, and ARKit was wired up to query this custom, food-focused computer vision service. The finishing touch was integrating a third-party API for nutritional information.
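The overall flow, a camera frame goes to the vision service, the predicted label goes to a nutrition lookup, and the result is rendered as an AR label, could be sketched in Python roughly as follows. This is an illustrative sketch only: the function names, the stubbed classifier, and the small nutrition table are hypothetical stand-ins for the real TensorFlow microservice and the third-party nutrition API.

```python
import json

# Hypothetical nutrition data standing in for the third-party nutrition API.
NUTRITION_DB = {
    "apple": {"calories": 95, "protein_g": 0.5, "carbs_g": 25.0},
    "banana": {"calories": 105, "protein_g": 1.3, "carbs_g": 27.0},
}


def recognize_food(image_bytes: bytes) -> str:
    """Stand-in for the TensorFlow-backed vision microservice.

    In the real build this step would POST the camera frame to the
    AWS-hosted API and parse the predicted label out of the JSON
    response; here we fake a prediction so the flow runs end to end.
    """
    return "apple"  # pretend the classifier recognized an apple


def lookup_nutrition(label: str) -> dict:
    """Stand-in for the third-party nutritional-information API call."""
    return NUTRITION_DB.get(label, {})


def annotate_frame(image_bytes: bytes) -> str:
    """Full pipeline: recognize the food, fetch its nutrition facts,
    and return the JSON payload an AR layer would render as a label."""
    label = recognize_food(image_bytes)
    return json.dumps({"label": label, "nutrition": lookup_nutrition(label)})
```

The point of the sketch is the separation of concerns: the AR client never talks to the model directly, only to a small service boundary, which is what let the team swap a custom TensorFlow model in behind ARKit without changing the app’s rendering code.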

The Design

Iteration Group is currently working closely with the Chromaplex team not only to improve the user experience, but also to explore new areas where we can push design and development within augmented reality. We found inspiration in popular apps like Snapchat and Pokémon Go, and are adapting those ideas into business-oriented solutions.