
Research Projects

My research portfolio spans cognitive science, digital therapeutics, and human-centered design—unified by a commitment to translating scientific insight into real-world impact. From investigating contemplative photography as a mindfulness intervention to leading clinical VR trials with aging populations, my work bridges experimental rigor with practical application.

Each project represents a different lens on the same core question: How can we design technologies and interventions that enhance human cognition, well-being, and behavior in rigorous, evidence-based ways?

Cognitive Science

1 Project
Current Thesis and Extension Project
In Progress
2024 - Present
Cognitive Science

Research Overview

Investigating contemplative photography as a mindfulness-based intervention for anxiety, with extension into immersive VR environments. This work bridges cognitive neuroscience, clinical psychology, and human-centered AI design.

Core Investigation

My undergraduate thesis investigates how contemplative photography can function as a mindfulness-based art therapy by enhancing attentional control in individuals with elevated trait anxiety. This work bridges cognitive science, experimental psychology, and human-centered design principles, using contemplative photography as a sensory, embodied medium to reduce rumination and improve performance on the Emotional Stroop Task.

Extension: VR Translation

Building on this, my extension project explores how the same cognitive principles–attentional modulation, embodied engagement, and sensory cueing–can be translated into immersive AI settings, specifically VR, to enhance therapeutic outcomes.

Research Directions

Adaptive Sensory Cueing for Attentional Control

Investigating how color, visual salience, and mindful perception can shift cognitive load and reduce rumination.

Design of Real-Time AI Feedback Systems for Therapeutic VR

Translating contemplative photography principles into VR-based tasks that adapt difficulty and sensory stimulation to user needs.

User-Centered Design for Embodied Digital Therapeutics

Integrating intermodal expressive arts principles into digital interfaces to build mind-body connection.

Long-Term Efficacy, Well-Being Outcomes, and Scalability

Extending the project into clinical and real-world contexts, including assisted-living facilities and AI-enhanced VR therapy.

Methodology

• Experimental Design: Pre-post intervention with control group
• Primary Measure: Emotional Stroop Task performance
• Population: Adults with elevated trait anxiety
• Intervention: Contemplative photography protocol
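As a rough illustration of how the primary measure is typically scored (a hypothetical sketch, not the study's actual analysis code), the Emotional Stroop effect can be computed as the mean reaction-time difference between emotional and neutral word trials:

```python
from statistics import mean

def stroop_interference(trials):
    """Mean RT difference (ms) between emotional and neutral word trials.

    `trials` is a list of (word_type, reaction_time_ms, correct) tuples;
    only correct responses are scored, per common practice. Data shown
    here are invented for illustration.
    """
    emotional = [rt for kind, rt, ok in trials if ok and kind == "emotional"]
    neutral = [rt for kind, rt, ok in trials if ok and kind == "neutral"]
    return mean(emotional) - mean(neutral)

# Larger positive values indicate more attentional capture by emotional
# words; in a pre-post design, interference should shrink post-intervention.
pre = stroop_interference([
    ("emotional", 720, True), ("neutral", 610, True),
    ("emotional", 740, True), ("neutral", 630, True),
])
post = stroop_interference([
    ("emotional", 650, True), ("neutral", 615, True),
    ("emotional", 660, True), ("neutral", 625, True),
])
print(pre, post)
```

In the pre-post design above, the quantity of interest is the change in this interference score between intervention and control groups.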
Cognitive Science · Attentional Control · VR Therapy · Mind-Body Connection · Human-Centered AI · Mindfulness · Anxiety Research

Applied Cognition in Therapeutics

4 Projects
VR Therapy Clinical Research
Completed
2024
Neo Auvra × Vivante Senior Living
Clinical Trial

Project Context

Led clinical trial evaluation of FDA-registered VR cognitive therapy for older adults, assessing real-world effectiveness of immersive interventions targeting attention, memory, and executive function.

Working as part of the research team, I led a clinical trial assessing the effectiveness of Neo Auvra's FDA-registered immersive cognitive assessment and intervention modules for enhancing cognitive abilities in older adults residing at Vivante, an assisted living facility in Newport, CA. These modules, grounded in Neo Auvra's Brain Health Optimization framework, target core domains such as visual attention, working memory, information processing speed, psychomotor coordination, and executive function through real-time VR-based tasks.

I evaluated user comfort, engagement, and task performance of the participants within VR environments that fuse immersive interaction with multi-domain cognitive demands. Responsibilities included securing informed consent, guiding participants through each VR session, documenting behavioral and emotional responses, mitigating motion sickness, collecting structured performance data, and conducting semi-structured interviews to extract qualitative insights into usability and perceived cognitive effort.

This work aligned with Neo Auvra's convergence-science approach, bridging neuroscience, psychology, and human-centered design, while providing real-world evidence on how older adults interact with multimodal VR systems. Findings directly informed iterative improvements to Neo Auvra's platform, particularly around task difficulty calibration, sensory cue clarity, and motion-interaction usability, advancing its application for aging populations and early cognitive decline.

Research Contributions & Outcomes

Ethical Research Practice

Ensured ethical, safe, and comfortable participation through informed consent, continuous monitoring, and adherence to VR safety protocols

User Experience Documentation

Documented detailed behavioral, emotional, and usability responses, providing actionable user-experience insights for platform refinement

Cognitive Assessment Validation

Supported the evaluation of VR-based tasks targeting attention, working memory, processing speed, and motor/psychomotor coordination

Data Quality Assurance

Strengthened data validity by resolving technical issues, calibrating VR systems, and guiding participants through immersive multi-domain cognitive tasks

Clinical Translation

Generated clinically relevant observations contributing to Neo Auvra's development of personalized, multi-domain VR cognitive assessments

Study Design

• Duration: 16-week longitudinal trial
• Population: Older adults in assisted living
• Technology: FDA-registered VR cognitive modules
• Methods: Mixed-methods (quantitative + qualitative)
VR Therapy · Clinical Research · Cognitive Aging · Immersive Technologies · Human-Centered Research · Gerontology
The Utility of Virtual Reality and Artificial Intelligence in ADHD: A Systematic Review from Diagnosis to Intervention
Completed
2024
Systematic Review

Research Overview

Systematic investigation of VR and AI technologies in ADHD assessment and intervention, evaluating diagnostic accuracy, ecological validity, and therapeutic potential through analysis of 17 peer-reviewed studies.

Background

Attention-Deficit/Hyperactivity Disorder (ADHD) is traditionally diagnosed through subjective clinical interviews and rating scales, which often lack ecological validity and objective biomarkers. Emerging technologies like Virtual Reality (VR) and Artificial Intelligence (AI) have transformative potential to overcome these limitations.

Objective

The objective of this systematic review is to assess the usefulness of combining VR and AI technologies in the diagnosis, evaluation, and therapeutic intervention of children with ADHD.

Methods

A systematic search was performed across the Web of Science, PubMed, and Google Scholar databases. Studies using VR environments and AI algorithms (e.g., machine learning, deep learning) for populations with ADHD were selected according to PRISMA 2020 guidelines. A total of 17 studies were included in the review.

Results

Synthesis of the findings shows that AI models trained on multi-modal VR data (eye-tracking, EEG, and behavioral logs) achieve high diagnostic accuracy, ranging from 81% to 98.3%. VR environments offer ecologically valid assessment by simulating real-world scenarios such as virtual classrooms and daily-life tasks. While diagnostic applications are relatively mature, therapeutic intervention is an emerging field, with pilot studies showing improvements in attention efficiency using closed-loop biofeedback systems.

Conclusion

The intersection of VR and AI enables a paradigm shift from subjective symptom checklists to objective "digital phenotyping." These technologies offer the precision needed for diagnosis, opening the door to personalized, AI-driven digital therapeutics. Future work should focus on large-scale longitudinal studies to standardize these tools for routine clinical practice.

ADHD · Virtual Reality · Artificial Intelligence · Machine Learning · Diagnosis · Digital Phenotyping · Ecological Validity · Systematic Review
Emteq Labs Preclinical Research
Completed
Led a preclinical research study, collecting biosensing data through the world's first eyewear for behaviour analytics and wearable expression and emotion sensing
2024
Emteq Labs
Preclinical Research

Project Context

Preclinical investigation of biosensing wearables for real-time emotional and behavioral state tracking, exploring closed-loop feedback systems for therapeutic applications in eating behavior and emotional wellbeing.

In this preclinical research with Emteq Labs, I investigated how real-time biosensing can capture and interpret users' emotional and behavioral states in everyday contexts. Using OCOsense™ smart-glasses, which measure facial muscle micro-movements through optomyography, I collected, organized, and analysed biosensing datasets tracking eating behavior.

Working directly with Emteq Labs' founder, surgeon Charles Nduka, I initiated the ideation and early prototyping of next-generation smart-glasses that not only measure biosignals but also deliver therapeutic outcomes. Drawing on precision medicine and feedback-loop principles, I explored integrating biosignals into AI-based feedback systems that dynamically adjust cognitive challenges or sensory cues through haptic or visual feedback, scaffolding attention, modulating behavior (e.g., mindful eating), and harnessing neuroplasticity by keeping users in an optimal engagement zone.
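The closed-loop idea can be sketched as a simple control rule (a hypothetical illustration, not Emteq's implementation): the system reads an engagement estimate derived from the biosignal stream and nudges task difficulty to keep the user inside a target zone. The 0-1 engagement scale, zone bounds, and step size below are all assumed values.

```python
def update_difficulty(difficulty, engagement, zone=(0.4, 0.7), step=0.1):
    """One control-loop step: nudge difficulty so engagement stays in `zone`.

    `engagement` is a 0-1 estimate derived from biosignals (e.g., facial
    muscle activity); below the zone the task is made more stimulating,
    above it the cognitive/sensory load is eased to avoid overload.
    """
    low, high = zone
    if engagement < low:
        difficulty = min(1.0, difficulty + step)   # user disengaged: raise challenge
    elif engagement > high:
        difficulty = max(0.0, difficulty - step)   # user strained: ease off
    return difficulty                              # in zone: hold steady

# Simulated session: each biosignal reading triggers one feedback update.
level = 0.5
for reading in [0.3, 0.35, 0.8, 0.55]:
    level = update_difficulty(level, reading)
print(round(level, 2))
```

A real system would replace the threshold rule with a learned policy and smooth the biosignal stream, but the loop structure (sense → estimate → adjust → repeat) is the same.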

Key Contributions & Outcomes

Biosensing Protocol Mastery

Operated facial-activity tracking smart glasses, mastering complex biosensing protocols for preclinical dietary behavior studies

Data Collection Excellence

Collected real-time high-quality eating behavior data from human participants, maintaining scientific rigor and strong interpersonal engagement

Closed-Loop Interface Design

Explored closed-loop interface design, integrating real-time physiological data into adaptive feedback models

Science Communication

Wrote a blog post for their website titled "Revolutionizing Eating Behaviour Tracking with Smart Glasses and Deep Learning" to inform the public about Emteq's work

Biosensing · Affective Computing · Physiological Computing · Preclinical Research · Digital Health
VRforHealth Research
Completed
Research contributions to VRforHealth
2024
VRforHealth
Science Communication

Project Context

Contributed to translating VR therapy research into accessible resources for patients, caregivers, and health providers, bridging the gap between scientific knowledge and real-world therapeutic adoption.

VRforHealth aimed to bridge the gap between scientific research and real-world therapeutic adoption by making high-quality information on VR therapy understandable, accessible, and clinically meaningful. The goal was to help patients, caregivers, and health providers navigate the rapidly expanding landscape of VR-based mental health and pain-management interventions. I worked under the mentorship of Denise Silber and received significant guidance in navigating the digital health field.

Key Contributions

I contributed to shaping the organization's knowledge framework, reviewing and synthesizing emerging research on VR therapy, neuropsychological rehabilitation, cognitive enhancement, pain desensitization, and anxiety reduction. I curated materials that translated neuroscience and psychology findings into clear, actionable resources for non-experts. I also produced blog posts published on the VRforHealth website and LinkedIn.

Translation as Innovation

VRforHealth taught me that translation is its own form of innovation. What made this work compelling was the challenge of turning dense research into something that accelerates adoption, improves understanding among the lay public, patients, and clinicians, and empowers them to use VR ethically and effectively.

This required:

  • understanding cognitive psychology (how people learn)
  • understanding emotion regulation (how people relate to therapeutic tools)
  • understanding behavior change (why clinicians adopt new technologies)
VR · Health · Science Communication

Human-Centered Design

1 Project
Edulis Labs - Human-Centered Strategy & Design Thinking Project
Completed
2023 - 2024
Edulis Labs
Design Thinking Initiative

Classes Taken: ENGR 180: Human-Centered Design, ENGR 186: Advanced Human-Centered Design

Role: Human-Centered Design Researcher & Strategist
Collaboration: 4-person team

Project Summary

Partnered with Edulis Labs, a startup developing a non-toxic, damage-free hair-lightening/coloring technology, to conduct a full design-thinking research initiative (empathize → define → ideate → prototype → test) aimed at identifying real barriers to product adoption and creating a go-to-market strategy for hairstylists and salon owners.

Key Contributions

User Research & Empathy Work
  • Conducted interviews, social-media outreach, and on-site conversations with professional stylists.
  • Identified behavioral patterns around trust, risk perception, learning styles, and technology adoption.
  • Synthesized insights showing stylists require education, evidence, and relational trust before adopting new products.
Problem Definition
  • Collaboratively articulated a central design problem: stylists will not adopt a new hair system until they can believe in it, experience it firsthand, and see long-term value.
Ideation & Concept Development
  • Co-created five educational engagement models: event-based, salon-based, in-salon visits, online learning, and micro-education via social media.
  • Integrated team members’ research, strategic, and design strengths to build concept prototypes.
Prototyping & Iteration
  • Tested each prototype with stylists, gathering feedback on feasibility, credibility, cost, time constraints, and perceived value.
  • Iterated solutions based on real-world constraints and user insights.
Strategic Proposal
  • Designed the final "Come to You" educational model: short, free, high-trust in-salon demonstrations by trained Edulis reps.
  • Developed a stylist-rep system, incentive structures, and messaging guidelines reinforcing reps as a relational bridge between Edulis and salons.
Team Collaboration
  • Worked within a multidisciplinary team, combining research, behavioral insight, communication design, and strategy.
  • Led synthesis discussions and supported alignment across different expertise areas.

Key Outcomes

  • Human-Centered Go-To-Market Strategy: Delivered an evidence-based education strategy that directly reflects stylist behavior, motivations, and barriers.
  • Behaviorally Informed Product Messaging: Produced clear scripts and messaging guidelines grounded in cognitive and psychological principles of trust-building.
  • Validated Engagement Model: Identified the preferred adoption pathway among stylists: relational, hands-on in-salon demonstrations.
  • Organizational Impact: Introduced a scalable model for early-stage product adoption that can expand with Edulis’s growth.

Connection to Background

Translational Thinking (Science → Product)

Although the context wasn’t digital, the underlying question was the same as in digital therapeutics: How do humans learn, build trust, change behavior, and adopt new tools? This project deepened my passion for bridging scientific insight with real-world implementation, reinforcing my interest in designing systems that translate cognitive and behavioral science into effective, adoptable products.

Rooted in Cognitive Science & Human-Centered Design

Applied principles of perception, decision-making, behavioral change, and user trust–core areas within my cognitive science training. Strengthened my commitment to designing tools–digital or otherwise–that are intuitive, accessible, and grounded in human behavior.

Interdisciplinary Teamwork

Mirrored the collaborative environments I aspire to work in at the MIT Media Lab: engineers × designers × behavioral scientists × strategists combining distinct strengths to solve human problems.

Alignment With My Long-Term Goals

Reinforced my belief that meaningful innovation–whether digital therapeutics, VR systems, biosensing wearables, or intelligent interfaces–requires: deep empathy for the end user and a willingness to iterate based on feedback.

Design Thinking · Behavioral Science · Strategy · User Research