
Hi, I'm Haopeng Wang

Ph.D. Student
HCI Researcher
Natural UI Design

Hello!!

I'm Haopeng Wang

Ph.D. Student at Lancaster University

I work in the field of Human-Computer Interaction (HCI) as part of the GEMINI ERC project, supervised by Prof. Hans Gellersen. I explore and design multimodal interaction techniques for virtual reality, leveraging coordination between the eyes, head, and hands to improve efficiency and user experience.

My Resume

Publications

HeadDepth: Gaze Raycasting with Head Pitch for Depth Control

Haopeng Wang, Florian Weidner, Yasmeen Abdrabou, Ken Pfeuffer, and Hans Gellersen
ISMAR'25: IEEE International Symposium on Mixed and Augmented Reality · Video Figure · Repo

HeadShift: Head Pointing with Dynamic Control-Display Gain

Haopeng Wang, Ludwig Sidenmark, Florian Weidner, Joshua Newn, and Hans Gellersen
TOCHI: ACM Transactions on Computer-Human Interaction · Presentation Video · Video Figure · Repo

Gaze, Wall, and Racket: Combining Gaze and Hand-controlled Plane for 3D Selection in Virtual Reality

Uta Wagner, Matthias Albrecht, Andreas Asferg Jacobsen, Haopeng Wang, Hans Gellersen, Ken Pfeuffer
ISS'24: ACM International Conference on Interactive Surfaces and Spaces

Towards the Fusion of Gaze and Micro-Gestures

Kagan Taylor, Haopeng Wang, Florian Weidner, Hans Gellersen
IEEE VR'25: IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Understanding and Leveraging Head Movement as Input for Interaction

Haopeng Wang
IEEE VR'25: IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Projects & Prototypes


Motion Correlation

Magic Finger: selecting a menu item that follows the fingertip by tracing it with the eyes.

Motion Correlation

Stretch Ruler: selecting a rotating menu item as the hands move apart, by tracing the surrounding asteroids with the eyes.

OnGazeLeave

Gaze Trigger: confirming hand selection with a saccade to a pop-up gaze target.

OnGazeLeave

Hands-free 2D Positioning: attaching an object to the head pointer while the gaze is within its area, and releasing it when the gaze leaves.

OnGazeLeave

Hands-free Multi-target Selection: adding a target to the selection when the gaze lands on it first and the head pointer then follows onto it.

Cone and Bubble

Gaze + Hand: the gaze cone (green) intuitively filters out a subset of targets, allowing the hand to perform more precise and easier selections.

Cone and Bubble

Hand + Hand: the hand cone (green) intuitively filters out a subset of targets in the cone, allowing the other hand to perform more precise and easier selections.

OnGazeLeave

Camera Dragging: when the gaze is fixated on the center of the viewport, the virtual forward direction shifts horizontally based on (dragged by) head rotation.

Cone and Bubble

Voronoi Diagram: generating a Voronoi diagram from the targets located within a gaze cone, where target sizes are dynamically expanded to facilitate hand-based selection.

Small Bench

Moving an object along a hand ray, with depth controlled via the trackpad on the opposite hand's controller and rotation also manipulated by the other hand.

FishRod

Dynamic parabolic curve for ground-pointing, enabling natural selection at both close and far distances.
  • $$ ID = \log_2\left(\frac{D}{W_e} + 1\right) $$
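The formula above is the Shannon formulation of Fitts' index of difficulty with effective width. A minimal sketch of computing it (the distance and width values below are illustrative, not measurements from the project):

```python
import math

def index_of_difficulty(distance, effective_width):
    """Shannon formulation of Fitts' index of difficulty (in bits):
    ID = log2(D / W_e + 1), where D is the movement distance to the
    target and W_e is the effective target width."""
    return math.log2(distance / effective_width + 1)

# Illustrative values: a target 4 m away with a 0.5 m effective width.
print(index_of_difficulty(4.0, 0.5))  # log2(9) ≈ 3.17 bits
```

Larger distances or narrower effective widths raise the ID, predicting longer selection times.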

HeadDepth on the stage of ISMAR 2025!

Thrilled to share our work on eye-head coordination for interaction design with the XR community!

HeadShift Hits CHI 2025!

Proud to present our TOCHI paper on the big CHI stage!

IEEE VR 2025 Doctoral Consortium!

Presented my PhD research with the ocean as my backdrop!

GEMINI Team Meetup 2025!

An amazing hike through the beautiful Lake District with labmates, Hans, and Albrecht!

GEMINI Team Meetup 2024!

Such a lovely team! I love it so much.

Contact

h.wang73@lancaster.ac.uk

A44, InfoLab21,
Lancaster University,
LA1 4WA, United Kingdom

LinkedIn