Siyou Pei, Ph.D.         

  



Extended Reality (XR) is an umbrella term for Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR).

I'm an XR enthusiast and an HCI researcher @HiLab, Department of Electrical & Computer Engineering, UCLA, advised by Prof. Yang Zhang. I have also been fortunate to work with mentors Alex Olwal, David Kim, and Ruofei Du at Google, and Blair MacIntyre and Feiyu Lu at JPMC XR Research.

To broaden the interaction bandwidth between humans and XR systems, I aim to make XR technologies natural and efficient for users of different backgrounds and levels of expertise through body embodiment. The human body is a complex and delicate system, shaped by millions of years of evolution and carrying vast implicit knowledge. By turning the body into an interaction medium, embodiment becomes the key to transferring users' knowledge of their own bodies to unseen interactions.

To achieve this goal, I propose the concept of Embodied Interaction for XR -- turning the body into XR interfaces through embodiment -- and explore how to design interaction techniques and invent enabling sensing technologies for body-embodied interfaces. I believe that interdisciplinary research across science, engineering, and design is key to effectively expanding the design space of human-centered interaction with emerging technologies.

I am on the job market, open to opportunities in both academia and industry. Here is my Research Statement.


Research



Haptic Artificial Muscle Skin for Extended Reality

Yuxuan Guo, Yang Luo, Roshan Plamthottam, Siyou Pei, Chen Wei, Ziqing Han, Jiacheng Fan, Mason Possinger, Kede Liu, Yingke Zhu, Zhangqing Fei, Isabelle Winardi, Hyeonji Hong, Yang Zhang, Lihua Jin, and Qibing Pei (Science Advances, 10(43), eadr1765, 2024)

We present a wearable haptic artificial muscle skin based on multilayer dielectric elastomer actuators (DEAs) that enhances immersion in Extended Reality (XR) systems.



UI Mobility Control in XR: Switching UI Positionings between Static, Dynamic, and Self Entities

Siyou Pei, David Kim, Alex Olwal, Yang Zhang, and Ruofei Du (CHI 2024)

We facilitate UI mobility between static, dynamic, and self entities with Finger Switches, based on users' in-situ needs.



WheelPose: Data Synthesis Techniques to Improve Pose Estimation Performance on Wheelchair Users

William Huang, Sam Ghahremani, Siyou Pei, and Yang Zhang (CHI 2024)

A data synthesis pipeline to address the underrepresentation of wheelchair users in data collection for pose estimation models.



Embodied Exploration: Facilitating Remote Accessibility Assessment for Wheelchair Users with Virtual Reality

Siyou Pei, Alexander Chen, Chen Chen, Franklin Mingzhe Li, Megan Fozzard, Hao-yun Chi, Nadir Weibel, Patrick Carrington, and Yang Zhang (ASSETS 2023)

A VR technique that lets wheelchair users evaluate accessibility remotely.



ForceSight: Non-Contact Force Sensing with Laser Speckle Imaging

Siyou Pei, Pradyumna Chari, Xue Wang, Xiaoying Yang, Achuta Kadambi, and Yang Zhang (UIST 2022)

Best Demo Honorable Mention

Object surfaces deform in the presence of force. This deformation, though very minute, manifests as observable and discernible laser speckle shifts, which we leverage to sense the applied force.
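For readers curious about the principle, below is a minimal illustrative sketch, not the ForceSight implementation: it treats force sensing as tracking the sub-pixel shift of the speckle pattern between two camera frames using OpenCV phase correlation. The file names and the shift-to-force calibration constant are hypothetical placeholders.

```python
# Minimal sketch (not the ForceSight pipeline): estimate the lateral shift
# of a laser speckle pattern between two camera frames via phase correlation.
import cv2
import numpy as np

def speckle_shift(frame_a: np.ndarray, frame_b: np.ndarray) -> tuple[float, float]:
    """Return the (dx, dy) sub-pixel shift of the speckle pattern."""
    a = np.float32(frame_a)
    b = np.float32(frame_b)
    # A Hanning window suppresses edge artifacts in the FFT-based correlation.
    window = cv2.createHanningWindow(a.shape[::-1], cv2.CV_32F)
    (dx, dy), _response = cv2.phaseCorrelate(a, b, window)
    return dx, dy

# Usage: compare a reference frame to one captured while force is applied.
ref = cv2.imread("speckle_ref.png", cv2.IMREAD_GRAYSCALE)
cur = cv2.imread("speckle_now.png", cv2.IMREAD_GRAYSCALE)
dx, dy = speckle_shift(ref, cur)
# Mapping shift magnitude to force requires per-surface calibration,
# e.g., a linear fit force ~ k * hypot(dx, dy) with k fit from known loads.
print(f"speckle shift: ({dx:.3f}, {dy:.3f}) px")
```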



Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions

Siyou Pei, Alexander Chen, Jaewook Lee, and Yang Zhang (CHI 2022)

Best Paper Honorable Mention

A new interaction technique that lets users' hands become virtual objects by imitating them. For example, a thumbs-up hand pose mimics a joystick.



AURITUS: An Open-Source Optimization Toolkit for Training and Development of Human Movement Models and Filters Using Earables

Swapnil Sayan Saha, Sandeep Singh Sandha, Siyou Pei, Vivek Jain, Ziqi Wang, Yuchen Li, Ankur Sarker, and Mani Srivastava (IMWUT 2021)

AURITUS is an extensible, open-source optimization toolkit designed to enhance and replicate earable applications, e.g., activity detection and head-pose tracking.

 


News


Nov 2024: Obtained my Ph.D. from UCLA!

Jun-Sep 2024: Interned at JPMC Immersive Technologies in New York.

May 2024: Presented UI Mobility in XR at CHI 2024.

Feb 2024: Reviewed CHI 2024 Late-Breaking Work submissions as a committee member.

Oct 2023: Presented Embodied Exploration at ASSETS 2023, New York.

Oct 2023: Reviewed CHI 2024 Papers.

Jul 2023: Organized LACC 2023, a non-profit summer program, at UCLA.

May 2023: Reviewed UIST 2023 Papers.

Apr 2023: Reviewed ISMAR 2023 Papers.

Jan-Apr 2023: Interned at Google in San Francisco.

Mar 2023: Reviewed DIS 2023 Papers.

Jan 2023: Reviewed CHI 2023 Late-Breaking Work.

Sep-Dec 2022: Interned at Google in Los Angeles.

Nov 2022: Received UIST Best Demo Honorable Mention Award for ForceSight.

Oct 2022: Presented ForceSight at UIST '22, Bend, OR.

Oct 2022: Reviewed CHI 2023 Papers.

Aug 2022: Passed the Oral Qualifying Examination in the Department of Electrical & Computer Engineering and became a Ph.D. candidate.

May 2022: Reviewed UIST 2022 Papers.

May 2022: Presented Hand Interfaces at CHI '22, New Orleans, LA.

Apr 2022: Received the CHI Best Paper Honorable Mention Award for Hand Interfaces.

Apr 2022: Reviewed CHI 2022 Late-Breaking Work.

Mar 2022: Passed the Preliminary Exam in the Department of Electrical & Computer Engineering.

Dec 2021: Submitted my M.S. thesis and received the M.S. degree.

Nov 2021: Reviewed CHI 2022 Papers.

Feb 2021: Reviewed CHI 2021 Late-Breaking Work.

Sep 2019: Started the M.S.-Ph.D. program at the University of California, Los Angeles.




© Siyou Pei, Source Code.