Ph.D. Candidate · Cornell University · SciFi Lab
Advised by Prof. Cheng Zhang
Committee: Prof. Tanzeem Choudhury, Prof. Deborah Estrin
I build wearable systems that combine acoustic sensing, inertial measurement units (IMUs), and egocentric vision with machine learning to understand fine-grained human behavior, from full-body pose tracking to everyday activity recognition, on devices such as smartglasses, smartwatches, rings, and clothing.
News & Updates
Research
Active sonar on glasses for fine-grained, camera-free sensing of the body, face, and everyday behaviors, from recognizing daily activities and dietary actions to tracking 3D pose and gaze.

SonicGlasses: Smart Glasses Platform for Multi-Task Human Activity Tracking with Acoustic Sensing
Preprint · Under Review


MunchSonic: Tracking Fine-grained Dietary Actions through Active Acoustic Sensing on Eyeglasses
UbiComp / ISWC 2024

EchoGuide: Active Acoustic Guidance for LLM-Based Eating Event Analysis from Egocentric Videos
UbiComp / ISWC 2024
Extending acoustic, inertial, and capacitive sensing beyond glasses into commercial smartwatches, rings, and clothing to enable full-body pose tracking and continuous hand pose recognition on devices people already wear.
EchoMotion: Towards Full-Body Pose Tracking with a Single Smartwatch Using Active Acoustic Sensing
Preprint · Under Review
SonicFit: Cross-User Full-Body Pose Tracking for Fitness Exercises Using Active Acoustic and Inertial Sensing with a Single Commercial Smartwatch
Preprint · Under Review
Self-attention, autoencoder, and graph-based sequence models for human activity recognition and spatio-temporal prediction tasks on wearable and IoT sensor data.



Patents
Wearable Devices with Wireless Transmitter-Receiver Pairs for Acoustic Sensing of User Characteristics →
Press & Media
Glasses use sonar, AI to interpret upper body poses in 3D →
These sonar-equipped glasses could pave the way for better VR body tracking →
Spec-tacular Body Pose Estimation →
Smart glasses could boost privacy by swapping cameras for this 100-year-old technology →
PoseSonic: Sonar, AI Powers Glasses to Track Upper Body Movements in 3D →
New sonar-equipped glasses use AI to interpret upper body poses in 3D →
Sonar-equipped glasses for better body tracking →
AI-powered 'sonar' on smartglasses tracks gaze and facial expressions →
No-Camera Eye Tracking: Cornell Invents Tech To Track Gaze Minus Surveillance →
Smart glasses use sonar to work out where you're looking →
Odd-looking glasses track your eyes and facial expressions without cameras →
AI-powered 'sonar' on smartglasses tracks gaze, facial expressions →
AI-Powered Sonar on Smartglasses Promises to Monitor Gaze and Facial Expressions →
Service
Contact
Email: sm2446@cornell.edu
Office: 239 Gates Hall,
Cornell University, 107 Hoy Rd, Ithaca, NY 14853, USA