Hongyue is an HCI researcher, designer, and engineer based in Melbourne, Australia. He is a Ph.D. candidate at the Exertion Games Lab, Monash University, supervised by Prof. Florian ‘Floyd’ Mueller and Dr. Don Samitha Elvitigala. His work bridges tangible artifacts, multisensory interaction, and extended reality. He currently collaborates with culinary practitioners to explore auditory elements as programmable “ingredients” that support creative experimentation and enrich the dining experience.
He is interested in creating novel interfaces and engaging user experiences that merge emerging technologies with art, uncovering knowledge through hands-on design practice, field studies, and reflection. In particular, his research focuses on:
(1) exploring interactive digital elements as new dimensions of creative expression;
(2) designing seamless interfaces that integrate into daily life through unobtrusive interactions;
(3) developing edible interactions that combine technology, edibility, and perception to reimagine the future of eating experiences.
News
- 2025-10 I will be at ICHEC 2025 in Singapore, presenting my latest poster work. See you all there! :D
- 2025-01 We had two papers accepted to the CHI 2025 main track, along with two LBWs! See you all in Yokohama! 🎉
- 2024-03 Our two CHI 2024 late-breaking works have been accepted! Thanks to all the co-authors! 😊
- 2023-09 I officially joined the Exertion Games Lab to start my PhD journey, researching auditory interfaces for food interaction.
Full Paper Publications
- Field study unpacking how chefs and diners co-create sonic dining experiences to inform future multisensory design tools.
- Introduces speculative food artifacts that translate sonic cues into gustatory sensations, expanding co-creative culinary practices.
- A multi-institution agenda outlining opportunities and hurdles for future human-food interaction research.
- Presents a multimodal VR laboratory platform that blends smart sensing and instructional design for immersive chemistry education.
- Proposes a multimodal algorithm that models user comfort to improve collaborative intention understanding between humans and computers.
- Details a smart glove platform enabling multimodal perception for virtual-reality chemistry experiments that blend physical and digital actions.
- Introduces a smart glove that captures multimodal biosignals to infer user intent for future collaborative robotic systems.
- Describes smart glove hardware/software architecture enabling contextual sensing for collaborative learning scenarios.
Workshop Papers, Extended Abstracts, and Doctoral Consortium
- Mixed reality installation that couples breath sensing with ambient nature cues to foster restorative human–nature connections.
- Presents an auditory dining system concept co-designed with chefs to explore sonic augmentations of culinary practice.
- Investigates how multisensory VR scenes can enhance the experience of bubble tea rituals and storytelling.
- Demonstrates a playful pipeline where groups co-design edible 3D-printed appetizers to jump-start social interaction.
- Reports on a tangible pen-based system enabling collaborative exploration of geometric concepts in classrooms.
- Explores AR concepts for urban cycling that blend data overlays with experiential storytelling for safer, playful rides.
Academic Service
- 2025 CHI 2025 Late-Breaking Work Associate Chair; Reviewer for CHI 2025 and the International Journal of Food Design.
- 2024 Reviewer for the International Journal of Human-Computer Interaction, DIS 2024, and CHI 2024 Late-Breaking Work.