
PROJECTS

My projects explore how intelligent systems can move beyond passive tools to become embodied, perceptual, and affective "collaborators". Through research-driven design, system building, and empirical evaluation, I investigate how AI can sense human states, adapt over time, and intervene through multimodal and physical interaction.

My design projects from 2019 to 2023 are available HERE.

Scent Intervention for Affective Regulation in High-Stress Driving

MIT Media Lab · Sep 2025–Present

A human-in-the-loop AI system that adapts scent sequences to regulate sustained physiological arousal in high-stress driving scenarios. The project combines reinforcement learning from human feedback (RLHF), physiological sensing, and interpretable scent–emotion representations to study non-intrusive, long-horizon affective intervention.


DreamTales: AI-Facilitated Pre- and Post-Reflection for Parent–Child Storytelling

MIT Media Lab · Oct 2025–Present

An AI-supported storytelling system that shifts intervention away from in-the-moment mediation toward pre- and post-session reflection. By visualizing emotional synchrony between parents and children, the project explores how generative AI can support parental agency, reflection, and emotional bonding.


NOEMA: Reconstructing Perception Through Spatialized Audio

Design Intelligence Lab@MIT · Mar–May 2025

An eyewear-based embodied AI system that translates visual scenes into spatialized audio narratives. NOEMA reframes perception from vision to sound, exploring how large language models and multisensory feedback can reshape attention, situational awareness, and everyday embodied experience.


PainMouse: Multimodal Violence Detection with Embodied Haptic Feedback

How to AI (Almost) Anything@MIT Media Lab · Mar–May 2025

A multimodal AI system that detects violent gameplay behavior through facial expression, audio, video, and interaction signals, triggering calibrated pain-based haptic feedback via a custom hardware device. The project investigates real-time behavioral regulation and ethical intervention through embodied feedback.


The Anemoia Device: A Tangible AI System for the Co-creation of Synthetic Memories through Scent

Tangible Interfaces@MIT Media Lab · Sep–Dec 2024

The Anemoia Device is a synthetic memory generator that uses generative AI to provoke nostalgia for a time you have never experienced. The work is an inquiry into the malleability of memory in an age of AI, proposing an alternative to conventional screen-based interaction: an intentional, embodied ritual that positions the user as an active co-author rather than a passive consumer.


Creative 3D Modeling Through Human-AI Interaction: A Non-linear, Incremental, and Iterative Workflow

Enactive Design@Harvard GSD · Sep–Dec 2024

A generative AI creative-support tool designed to align with how designers actually work: non-linearly, incrementally, and reflectively. The system integrates with existing 3D modeling workflows, capturing design progress over time and allowing designers to selectively request AI suggestions grounded in prior iterations, supporting exploration while preserving authorship and control.


Rethinking Human-AI Collaboration in Complex Medical Decision Making

Human-Centered AI Lab@NEU · May–Sep 2023

A human-centered AI decision support system for sepsis care that examines how interface design shapes clinicians’ trust, interpretation, and use of model predictions. Through interface prototyping and user studies, the project investigates how AI can better support clinical reasoning in high-stakes, time-sensitive medical contexts.



© 2025 by Nomy Jianing Yu.

