Haoxiang (Steven) Yu, 于浩翔

Research Scientist @ Meta Reality Labs


Curriculum vitae (CV)

Redwood City, CA, USA

Empowering the next generation of Multimodal & Contextual AI

I am a Research Scientist at Meta Reality Labs, building scalable AI systems that power the future of AR/VR and smart wearables. My expertise lies in adapting large-scale foundation models for real-world deployment, with a core focus on on-device machine learning, highly efficient edge inference, and multimodal understanding (including multimodal LLMs and vision-language architectures). I am driven by the challenge of bringing cutting-edge reasoning capabilities to resource-constrained devices without compromising latency or user experience.

My research centers on pervasive/ubiquitous computing and machine learning – specifically the collaboration among machine learning models (e.g., federated/collaborative learning), the interaction between models and the environment (e.g., edge AI, context-aware systems), and the synergy between models and humans (e.g., human-in-the-loop learning).

This work has generated 25+ publications and 5 U.S. patent applications, earning top honors including the IMWUT Distinguished Paper Award, the IEEE MASS Best Paper Award, and the IEEE PerCom Best Paper Runner-Up. More importantly, my research has demonstrated strong industry relevance, accumulating 300+ citations and influencing patent filings, research publications, and product roadmaps at companies such as Amazon, Samsung, and Toyota.

Prior to Meta, I earned my Ph.D. in Electrical and Computer Engineering from the University of Texas at Austin under Dr. Christine Julien, and conducted applied AI research at the Toyota Research Lab.

Outside of my research pursuits, I enjoy cooking, baking, and outdoor activities. I love bringing the world to my kitchen by experimenting with diverse dishes from around the globe.

news

Oct 15, 2026 Our paper “Sasha: Creative Goal-Oriented Reasoning in Smart Homes with Large Language Models” received the Distinguished Paper Award from ACM IMWUT (top 3.8%, 8 out of 208 papers)!
Aug 15, 2025 Ph.D. secured! I graduated from UT Austin.
May 27, 2025 Joined Meta Reality Labs as a Research Scientist working on wearable contextual AI.
Mar 19, 2025 Our paper “Teaching Things To Think: Bootstrapping Local Reasoning for Smart(er) Devices” received the Best Paper Runner-Up at IEEE PerCom 2025!

selected publications

  1. IMWUT
    Sasha: Creative Goal-Oriented Reasoning in Smart Homes with Large Language Models
    Evan King, Haoxiang Yu, Sangsu Lee, and 1 more author
    Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., Mar 2024
  2. SenSys
    Analysis of IFTTT Recipes to Study How Humans Use Internet-of-Things (IoT) Devices
    Haoxiang Yu, Jie Hua, and Christine Julien
    In Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems, Coimbra, Portugal, Nov 2021
  3. IEEE PerCom
    Teaching Things To Think: Bootstrapping Local Reasoning for Smart(er) Devices
    Evan King, Haoxiang Yu, Sahil Vartak, and 3 more authors
    In 2025 IEEE International Conference on Pervasive Computing and Communications (PerCom), Mar 2025