Helping people with anorexia understand hunger again.

We designed a recovery companion app and wearable that translates invisible biosignals into a living 3D garden, giving people with anorexia nervosa a way to see what their body is doing when their internal sense of hunger and fullness can no longer be trusted.

Role

Product Designer








Timeline

4 days








Tools

Figma / Figma Make

Claude

ChatGPT





Team

2 Product Designers



Contribution

Research - Led the interoception research, helped source and synthesize the three papers that grounded the entire design direction. Researched the landscape of biosignal monitoring technology to determine which signals were most relevant and feasible.

Personas - Co-developed both Celine and Dr. Reyes with my partner, contributing to the background narratives, pain points, and goals.

Concept & ideation - Worked with my partner to finalize the core concept: the biosignal-to-flower system and the design principles that held the whole system together.

Design & build - Designed and built my assigned screens in Figma Make, contributing to the overall visual system and working with my partner to maintain design coherence across all screens.

Presentation - Wrote the presentation script, use case narratives, elevator pitch, and Devpost write-up. Co-delivered the final presentation with my partner.

00 OVERVIEW

About

FigBuild is Figma’s annual student design-a-thon, where university students from around the world come together over a short, intensive period to design and build original product concepts using Figma.

Unlike traditional hackathons that prioritize speed or technical output, FigBuild emphasizes thoughtful problem framing, strong user experience design, and the ability to clearly communicate a product’s concept, interaction, and impact through storytelling and prototyping.

The Challenge

The FigBuild prompt challenged teams to explore and design for human senses beyond the traditional five (sight, sound, touch, taste, and smell), focusing on how technology might better support or interpret internal bodily experiences.
Participants were asked to identify an underexplored sense, understand its role in everyday life, and imagine a technological intervention that could make that sense more perceivable, interpretable, or actionable.

The Solution

Designed for people recovering from anorexia nervosa, Attune makes hidden body signals visible so patients can recognize hunger when their cues feel absent or confusing. It translates clinical biosignals into an intuitive visual system, helping users eat with more confidence and rebuild trust in their body during recovery.

Visualizing hunger as a feeling, not a number

Gradual access to clinical data over recovery

Sharing progress with your support system

01 RESEARCH

Exploring the Types of Senses

Early research explored multiple forms of bodily awareness and internal sensing. The team looked across different underrepresented senses and asked where technology had been most limited, where misunderstanding caused real harm, and where a speculative intervention could still feel grounded.


Interoception stood out because it deals with how people perceive internal bodily states. Within that space, hunger became the clearest and most compelling signal to design around, especially in the context of eating disorder recovery where bodily communication itself can become unreliable.

Understanding the Premise

Given the time constraints, we relied on secondary research and academic literature to build a foundational understanding of our users and identify key problem spaces. Among these, one area of research stood out as particularly influential in shaping our direction:

Gap in the Treatment

For someone in treatment for an eating disorder, a clinician appointment happens roughly once a week. That leaves 167 hours every week where the patient is alone with a body they do not trust, reading signals they cannot accurately interpret, and making decisions based on perceptions that are neurologically compromised.

The neuroscience is unambiguous: eating disorders cause structural changes to the brain regions responsible for body image, interoception, and reward processing. Gray matter loss is proportional to malnutrition. The parietal cortex, which constructs the body map, is distorted. The reward circuit is blunted; the punishment circuit is hyperactive. And crucially, the very capacity to perceive internal signals like hunger, fullness, temperature, and heart rhythm is severely impaired.

How do you measure and display an interoceptive sense in a way that feels safe, not clinical, for someone whose interoceptive system is unreliable?

Exploring the interoception landscape

We mapped the full landscape of what could theoretically be measured, from CGMs tracking blood glucose to SmartPill capsules measuring gastric motility, high-density electrogastrograms, flotation REST, and emerging wearable patches. We asked a pointed question: what signals most directly tell the story of hunger, refeeding, and recovery?

The answer shaped our final eight signals. Notably, we identified a major technological gap: no wearable currently measures hunger hormones like ghrelin and leptin passively.
Inner Garden's IBM patch is speculative but grounded in the direction the technology is heading. CGMs like Dexcom G7 and Abbott FreeStyle Libre were referenced as real precedents for passive biosignal monitoring; the SenseSupport system, which uses CGM meal-detection to deliver just-in-time adaptive interventions, directly validated our approach.
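To make the biosignal-to-flower idea concrete, here is a minimal sketch of how a sensed signal could be translated into a gentle garden cue rather than a raw number. The `Reading` fields, thresholds, and state names are hypothetical illustrations, not clinical values or part of the actual design:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """A hypothetical passive biosignal sample (e.g. from a CGM)."""
    glucose_mg_dl: float
    minutes_since_meal: int

def flower_state(r: Reading) -> str:
    """Translate a reading into a non-numeric garden cue.

    Thresholds below are invented for demonstration only.
    """
    if r.glucose_mg_dl < 80 and r.minutes_since_meal > 240:
        return "wilting"   # body likely needs energy soon
    if r.minutes_since_meal < 60:
        return "blooming"  # recently nourished
    return "steady"

print(flower_state(Reading(glucose_mg_dl=75, minutes_since_meal=300)))  # prints: wilting
```

The point of the indirection is the design principle itself: the user never sees "75 mg/dL", only the flower's state, which keeps the feedback felt rather than clinical.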

The eight signals we chose to display

Who are we building for?

02 USER JOURNEY

A day in Celine's Life

The most powerful way to understand what Inner Garden does is to follow one person through three real moments. Not dramatic moments, but ordinary ones. The kind where the difficulty is invisible, and the system makes it visible.


Sohaya

© 2026 Sohayainder Kaur · Product Designer

Made with patience