NutriLens

“Lens that looks after your nutrition.”

About Project

A proposition for AI-powered AR glasses that analyze physiological health data, recognize food, scan surroundings, and suggest personalized healthy meals in real time.

Theme: Data Mirrors

Brief

Research, conceptualise, and design a product, digital ecosystem, or interactive material intervention that responds to the theme Data Mirrors. The project can span the diametrically opposed extremes, from a commercially focused product to a speculative social intervention.

Role and Duration

UX/UI Designer


3 months

NutriLens

Digital Companion

A pair of AR smart glasses that provide real-time health insights and food guidance by analyzing physiological data, identifying food, and interpreting the user’s physical environment.
It continuously analyzes the user’s physiological signals (like heart rate, hydration, and glucose trends) and meal history, and provides personalized healthy food suggestions via a minimal AR display, helping users make better dietary choices.

Challenge

  • Most people are unaware of how their internal health metrics relate to the food they consume.
  • People struggle to find healthier food alternatives and to keep track of their activities and meals.
  • There’s a gap between biometric feedback and real-time, actionable dietary decisions, especially for those with conditions like hypertension, diabetes, or dehydration.

Solution

  • NutriLens bridges the gap between biometric data and mindful eating.
  • Passively collects real-time physiological data.
  • Recognizes food visually and contextualizes it against the user’s current health.
  • Delivers timely food recommendations through a subtle, non-intrusive AR HUD (see the sketch below).
  • Enables better awareness, prevention, and smart food choices—seamlessly integrated into daily life.
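
A minimal sketch of this sense → recognize → suggest loop, written in Python. Every name, threshold, and rule here is an illustrative assumption for the concept, not a specification of the final system:

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate_bpm: int
    systolic_bp: int
    hydration_pct: float  # assumed 0-100 share of the daily hydration target

def suggest(vitals: Vitals, food_label: str | None, context: str) -> str | None:
    """Fuse biometrics, the recognized food, and the surroundings
    into one non-intrusive HUD suggestion (rules are hypothetical)."""
    if vitals.hydration_pct < 60:
        return "Hydration low — consider a smoothie or fruit-based snack."
    if vitals.systolic_bp > 130 and food_label in {"pizza", "fries"}:
        return "High sodium detected: a lighter option may suit your blood pressure."
    if context == "bakery":
        return "High-sugar environment: check your glucose trend first."
    return None  # stay silent when nothing is actionable

print(suggest(Vitals(72, 135, 80.0), "pizza", "kitchen"))
```

The deliberate design choice is the final `return None`: the HUD stays silent unless a suggestion is genuinely actionable.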

Chapter 1

Research and Analysis

What is "Data Mirror"?

  • With a significant amount of our life experience spent online, data has also become the primary means by which we characterise our sense of self and personal identity.

  • Products and cultural movements that have resulted from this deluge of data, like the Quantified Self movement, have often taken a positivist approach by capitalising on the potentially transformative power of personal information.

  • Conversely, its commodification is precisely why many designers and technologists have responded with products and digital interventions to ensure that personal agency is maintained in society.

  • Data has become one of the most important monetary and social currencies of the 21st century.

- Prof. Dr. Peter Crnokrak
Programme Director for the Visual & Experience Design and Generative Design & AI master’s programmes at UE Innovation Hub

Objective

To understand how users make food decisions and how physiological signals (like hydration or blood pressure) can inform real-time, health-conscious eating behavior.

Competitive Analysis

How do users currently track health and food?
Where are the gaps in real-time guidance?
How effective are their UX/UI solutions?

Pain Points

Too Many Touchpoints

  • Users are required to log every meal and drink manually.
  • Food tracking apps often depend on barcode scanning or search menus.
  • Switching between devices (watch, phone, app) to check vitals or track food breaks the flow.
  • Logging requires visual focus, typing, and confirmation—especially inconvenient while eating or cooking.

High friction leads to low user engagement over time.

No Environmental Awareness

  • Apps can’t detect where the user is or what food is physically present.
  • They don't know if you're in a kitchen, store, or near a food source.

Context-free advice feels generic and often irrelevant.

Lack of Real-Time Suggestions

  • Most apps only provide retrospective analysis (after meals or at day’s end).
  • Health devices show vitals but rarely link them to actionable food advice.

Users don’t get help at the moment they need it: during food decisions.

Insights

Users want automation, not input.
Systems must work in the background, detecting and suggesting without manual effort. Switching apps or typing disrupts flow—wearables should remove these points of friction.
AR interface ➝ frictionless, ambient guidance

Context is everything.
Food guidance must be aware of location, timing, and surroundings to be meaningful.
Object/environment recognition ➝ contextual nudges

Data should lead to decisions.
Raw stats (like 120/80 BP) mean little to most users unless paired with smart suggestions (a minimal example follows below).
Real-time biometric data ➝ smarter food suggestions
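
To make that last insight concrete, a tiny hypothetical mapping from a raw reading to a food-facing nudge (the cutoffs are assumptions, not medical guidance):

```python
def interpret_bp(systolic: int, diastolic: int) -> str:
    """Translate a raw '120/80'-style reading into an actionable nudge."""
    if systolic >= 130 or diastolic >= 85:
        return "Elevated BP: prefer low-sodium options today."
    return "BP in range: no dietary restriction suggested."

print(interpret_bp(120, 80))  # BP in range: no dietary restriction suggested.
```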

Chapter 2

Define and Ideate

Visual Form

AR Heads-Up Display (HUD) integrated into AI-powered AR glasses

This HUD overlays real-time, personalized visual feedback onto the user’s field of view without requiring manual input or disrupting their environment.

  • Physiological signals: vital signs like Heart Rate (color-coded, as sketched below), Blood Pressure, Hydration Status, Glucose Level.
  • Food recognition: displays nutritional value and records meals passively.
  • Surrounding context: scans surroundings (e.g., kitchen, bakery) to offer timely suggestions.
  • Activity tracking: tracks user activity and prompts them toward healthier food options.
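
As an example, the color-coded heart-rate element could be driven by a simple zone lookup. A sketch with assumed zone boundaries:

```python
def heart_rate_color(bpm: int) -> str:
    """Map heart rate to the HUD color code (zone boundaries are assumptions)."""
    if bpm < 60:
        return "blue"   # resting / low
    if bpm <= 100:
        return "green"  # normal range
    if bpm <= 140:
        return "amber"  # elevated
    return "red"        # high exertion or stress

assert heart_rate_color(72) == "green"
```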

Why Smart Glasses?

AR Smart Glasses

  • Users receive health data and food suggestions directly in their field of vision—no touching, typing, or swiping.
  • Automatically detect surroundings (e.g., kitchen, bakery), and respond with contextual prompts.
  • Identify food in real time using computer vision, and display nutrition info.
  • Show live vitals (e.g., heart rate, BP, hydration, glucose) in a subtle AR overlay, connected to body sensors.
  • No need to pause and switch context between eating and using a phone.

Mobile Apps

  • Require constant interaction (opening the app, searching food, entering values).
  • Lack real-world spatial awareness unless manually prompted.
  • Require barcode scanning, manual search, or image upload for proper food scanning.
  • Often display vitals after syncing from another device or lack real-time insight.
  • Break immersion, requiring effort and focus to operate.

"Data Mirrors"Aspects

Physiological Health Data
(via built-in biosensors)

  • These metrics help determine what the body needs at any given time.
  • Heart Rate (BPM): Indicates exertion, rest state, or stress levels.
  • Blood Pressure: Helps avoid high-sodium food if elevated.
  • Blood Glucose: Critical for managing sugar intake; especially for diabetic users.
  • Hydration Level: Suggests water-rich foods when levels are low (thresholds sketched below).
    (Optional future extension: Body temperature, oxygen saturation)
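
A hedged sketch of how these four signals might be reduced to "what the body needs right now"; all cutoffs are placeholders rather than clinical values:

```python
def body_needs(heart_rate_bpm: int, systolic_bp: int,
               glucose_mg_dl: int, hydration_pct: float) -> list[str]:
    """Derive dietary needs from the four core biosensor metrics.
    All cutoffs below are illustrative assumptions."""
    needs = []
    if hydration_pct < 60:
        needs.append("water-rich foods")
    if systolic_bp > 130:
        needs.append("low-sodium options")
    if glucose_mg_dl > 140:
        needs.append("low-sugar options")
    if heart_rate_bpm > 100:
        needs.append("light, easily digestible meals")
    return needs

print(body_needs(88, 135, 150, 55.0))
# ['water-rich foods', 'low-sodium options', 'low-sugar options']
```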

Food Recognition Data
(via AR lens and computer vision)

  • This ensures the system knows what the user is about to eat.
  • Food Identification: Real-time visual detection of items (e.g., pizza, apple).
  • Portion Size Estimation: Helps calculate nutritional value more accurately.
  • Nutritional Breakdown: Calories, protein, carbs, fats, sugar, sodium, fiber (warning limits sketched below).
  • Meal Logging: Automatically logs recognized meals for dietary tracking.
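
Once an item is recognized, the warning step could look like the following sketch, where FoodItem and the per-meal limits are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class FoodItem:
    name: str
    portion_g: float   # estimated portion size
    sodium_mg: float
    sugar_g: float

def nutrient_warnings(item: FoodItem,
                      sodium_limit_mg: float = 600,
                      sugar_limit_g: float = 25) -> list[str]:
    """Flag nutrients that exceed assumed per-meal safe limits."""
    flags = []
    if item.sodium_mg > sodium_limit_mg:
        flags.append(f"{item.name}: high sodium ({item.sodium_mg:.0f} mg)")
    if item.sugar_g > sugar_limit_g:
        flags.append(f"{item.name}: high sugar ({item.sugar_g:.0f} g)")
    return flags

print(nutrient_warnings(FoodItem("pizza slice", 150, 720, 4)))
# ['pizza slice: high sodium (720 mg)']
```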

Environmental Context Data
(via object and location detection)

  • The glasses adapt based on where the user is and what’s around them (mapped in the sketch below).
  • Physical Location Recognition:
    Kitchen → triggers meal suggestions
    Bakery/store → alerts about high-sugar environments
  • Object Detection: Fridge, stove, counter → determines food preparation context
  • Time of Day Awareness: Helps align food suggestions (e.g., breakfast vs. dinner)
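
The location-to-trigger mapping above could be expressed as a plain lookup plus a time-of-day check; the labels and meal windows below are assumptions:

```python
from datetime import datetime

# Hypothetical mapping from a detected location to a HUD trigger.
CONTEXT_TRIGGERS = {
    "kitchen": "show meal suggestions",
    "bakery": "alert: high-sugar environment",
    "store": "alert: high-sugar environment",
}

def contextual_prompt(location: str, now: datetime) -> str:
    """Combine detected location with time of day (meal windows assumed)."""
    trigger = CONTEXT_TRIGGERS.get(location, "no prompt")
    meal = "breakfast" if now.hour < 11 else "lunch" if now.hour < 17 else "dinner"
    return f"{trigger} ({meal} window)"

print(contextual_prompt("kitchen", datetime(2024, 1, 1, 8, 30)))
# show meal suggestions (breakfast window)
```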

Historical Meal Data
(via recorded food data)

  • This ensures food recommendations consider what the user already consumed.
  • Last Meal Composition & Time: Avoids repeating high-sodium/sugar meals and adjusts meal portion or type based on recent intake.
  • Daily Caloric and Nutritional Summary: Guides toward balance (e.g., more fiber if low; sketched below).
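
A small sketch of the daily-summary balancing logic, assuming hypothetical nutrient targets and a 50% cutoff:

```python
def balance_hints(daily_totals: dict[str, float],
                  daily_targets: dict[str, float]) -> list[str]:
    """Suggest nutrients to favor when today's intake is under half the target.
    Targets and the 50% cutoff are illustrative assumptions."""
    return [f"more {nutrient}"
            for nutrient, target in daily_targets.items()
            if daily_totals.get(nutrient, 0.0) < 0.5 * target]

print(balance_hints({"fiber": 8.0, "protein": 60.0},
                    {"fiber": 30.0, "protein": 50.0}))
# ['more fiber'] -> nudge toward fiber-rich options
```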

Monitor Health

  • Continuously track key physiological vitals via built-in biosensors: heart rate, blood pressure, hydration level, blood glucose, and body temperature.

Recognize Food

  • Recognize food visually when the user looks at it.
  • The glasses display: Food item name, Calories and nutritional breakdown, Warnings if any nutrient exceeds safe limits (e.g., high sodium)

Detect Surroundings

  • With environmental awareness and spatial mapping, the glasses detect:
    Location (e.g., kitchen, restaurant, bakery)
    Objects (e.g., fridge, stove, food containers)
    User actions (e.g., ordering food online, prepping a meal)

Meal Suggestion

  • By combining health vitals, last meal data, and surroundings, the glasses deliver real-time food recommendations (fused as in the sketch below), such as:
    “Try a fiber-rich meal to balance your last intake.”
    “Hydration low — consider a smoothie or fruit-based snack.”
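
A hedged sketch of that fusion step, rule-based for clarity (the tags, needs, and rule order are assumptions; a production system might rank suggestions with a learned model):

```python
def recommend(needs: list[str], last_meal_tags: set[str],
              location: str) -> str | None:
    """Fuse current needs, last-meal history, and location into one prompt."""
    # Balance against what the last meal over-delivered.
    if "high-sodium" in last_meal_tags or "high-sugar" in last_meal_tags:
        return "Try a fiber-rich meal to balance your last intake."
    # Address the most pressing physiological need next.
    if "water-rich foods" in needs:
        return "Hydration low — consider a smoothie or fruit-based snack."
    # Fall back to a location-aware nudge while cooking.
    if location == "kitchen" and needs:
        return f"While you cook, aim for {needs[0]}."
    return None  # no nudge needed right now

print(recommend([], {"high-sodium"}, "kitchen"))
# Try a fiber-rich meal to balance your last intake.
```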

Chapter 3

Design

Interactive Mockup

PROTOTYPE

See Smart, Eat Smarter!