My role
Product Designer
Timeline
1 week
At the intersection of science and self-care, Future of Me helps you decode your external health story.
Through cutting-edge in-store scans that analyse your skin, hair, and body composition, the Future of Me Token gives you a personalised snapshot of where your health stands, and where it could go next.
It's a system built to turn complex metrics into clarity, insight, and actionable next steps, helping you:
Understand what your skin, hair, and body are really trying to tell you
See how your biological health compares to others in your age group
Get matched with products, routines, and habits tailored to your unique blueprint
In a world of guesswork and generic advice, Future of Me puts science, precision, and you at the centre.
Pain points
Opportunities
The Future of Me logo consists of two elements:
The wordmark: "Future of Me"
The endorsement lock-up: "powered by Deep Holistics" with the DH monogram
Our typography consists of two font families, Satoshi and Playfair Display.
Regular
Medium
Italic
Medium Italic
Our primary font - modern, geometric, and effortlessly clear. It keeps things sharp, accessible, and always in focus.
Medium Italic
Our secondary font - classic with flair, used sparingly to spotlight titles or key phrases with elegance and punch.
Fallback font: Whenever it's not possible to use our font family, Satoshi can be replaced with Inter. Playfair Display is a Google font.
This is where it all begins: let's go!

Before any design could begin, we were handed a report: dense, technical, and packed with scientific terms. To make sense of it, we took a deep-dive approach:
Collaborated with the team behind the analysis machine
Researched documentation and tutorials
Consulted our in-house doctors to understand the clinical significance of each parameter
Our goal was to decode the data into something usable, for both us and the end user. We broke down the terms, restructured the flow, and created a simplified map of what each parameter meant, so it could later translate into design, content, and personalised recommendations.
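The "simplified map" idea above can be sketched as a small lookup structure. This is only an illustration of the approach: the parameter name, ranges, and labels below are invented, not the actual clinical mapping.

```python
# Illustrative sketch: each raw report parameter gets a plain-language
# label and status bands, so content and recommendations can key off a
# readable band instead of a raw number. All values here are invented.

PARAMETER_MAP = {
    "sebum_level": {
        "label": "Skin oiliness",
        # (inclusive lower bound, exclusive upper bound, band name)
        "bands": [(0, 30, "Low"), (30, 70, "Balanced"), (70, 101, "High")],
    },
}

def classify(parameter: str, value: float) -> str:
    """Return the plain-language band for a raw parameter value."""
    for low, high, band in PARAMETER_MAP[parameter]["bands"]:
        if low <= value < high:
            return band
    return "Out of range"
```

Design copy, visuals, and product suggestions can then hang off the returned band ("Low", "Balanced", "High") rather than the underlying metric.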

Once we understood what the data was actually saying, we shifted our focus to how it should feel to the user.
We knew we were dealing with complex scientific insights, so the challenge became: how do we make this information personal, visual, and actionable?
We began by sketching possible layouts, experimenting with ways to highlight what's good, bad, or "needs attention" without creating fear, confusion, or overwhelm.
Alongside visual design, we also laid the foundation for how the report would be generated, treating it not as a static file but as a scalable, semi-automated system.

The original plan, a seamless dashboard, had to be dropped early on. Instead, reports were manually exported to PDFs and passed to the sales team. This added steps, slowed delivery, and widened the gap between our promise and what was technically feasible.
An overwhelmed system. The promise was to deliver personalised reports within ten minutes, but the system had to process dense biometric scans, fetch individual insights, match them to the right report format, and then send them out. This created a high-stakes bottleneck.
We aimed to deliver detailed reports within minutes of each scan, but the volume made manual work unsustainable.
Data bugs crept in. Backend glitches, design coordination, and dev dependencies all had to align in real time.

The goal was to minimise human errors and reduce back-and-forth.
Manual execution wasn't scalable; we needed automation from day one.
Worked closely with developers to build logic for bulk processing skin, hair, and body reports.
Marked exact coordinates (X, Y, width, height) for every image block in the scan reports.
Created a blueprint for developers to extract visuals via an image extractor tool.
Standardised the image pull process so design never needed to intervene manually.
Built a plugin-based flow where extracted images would auto-populate into the Figma report.
Enabled real-time batch updates for all visuals, overlays, graphs, and photos.
Drastically reduced design time, turning hours of work into minutes.
Data was populated using similar logic, but under NDA constraints.
Faced bugs during initial testing where data didnโt populate correctly.
Collaborated with backend to debug and refine data flow into the report templates.
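The coordinate blueprint and extractor hand-off described above can be sketched roughly like this. It is a minimal illustration under assumptions: the block names and pixel values are placeholders, not the real report spec.

```python
# Hypothetical sketch of the image-extraction blueprint: every image
# block in the scan report is marked with exact (x, y, width, height)
# coordinates, which the extractor tool turns into crop boxes.

BLOCKS = {
    "skin_overlay": (40, 120, 300, 300),       # placeholder coordinates
    "hair_graph": (360, 120, 300, 200),
    "body_composition": (40, 460, 620, 240),
}

def to_crop_boxes(blocks):
    """Convert (x, y, w, h) specs into (left, top, right, bottom) boxes,
    the form most image libraries (e.g. Pillow's Image.crop) expect."""
    return {name: (x, y, x + w, y + h) for name, (x, y, w, h) in blocks.items()}

crop_boxes = to_crop_boxes(BLOCKS)
```

With the boxes standardised like this, the extracted crops can be batch-fed to a Figma plugin and auto-populated into the report template, which is what removed the manual design step.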

One fix at a time.

~30% of users lacked a skincare routine, showing clear need and a strong opportunity for first-time engagement.
Reports were seen as accurate and helpful, reflecting real conditions.
~15% of users adopted recommended products, an encouraging start that validated user trust.
Manual execution wasn't scalable; we needed automation and a dashboard from day one.
Users asked for more product options than what we suggested, pointing to a need for more flexible suggestion logic.
Retail staff offered conflicting advice, leading to confusion and rechecks at our kiosk.
