Clench Interaction: Novel Biting Input Techniques

Conference Paper
Xuhai Xu, Chun Yu, Anind K. Dey, Jennifer Mankoff
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19)
Publication year: 2019

ABSTRACT

Biting is one of the most fundamental and natural actions that people perform every day. Existing work has explored tooth-click location and jaw movement as input techniques; however, clenching has the potential to add further control to this input channel. We propose clench interaction, which leverages clenching as an actively controlled physiological signal that can facilitate interactions. We conducted a user study to investigate users’ ability to control their clench force. We found that users can easily discriminate three force levels and can quickly confirm actions by unclenching (quick release). Based on these results, we developed a design space for clench interaction and investigated the usability of the clench interface. Participants preferred the clench interface over baselines and indicated a willingness to use clench-based interactions. This novel technique can provide an additional input method when users’ eyes or hands are busy, augment immersive experiences such as virtual/augmented reality, and assist individuals with disabilities.
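
As a rough illustration of the interaction loop described above, the sketch below classifies a normalized clench-force reading into three levels and confirms an action on a quick release. All thresholds, the 200 ms release window, and the class names are illustrative assumptions, not values from the paper.

```python
from collections import deque
import time

# Illustrative assumptions, not values from the paper.
REST = 0.10                                     # below this, the jaw is relaxed
CUTOFFS = [(0.40, "light"), (0.70, "medium")]   # normalized force boundaries
RELEASE_WINDOW_S = 0.2                          # a "quick release" must complete this fast

def classify_level(force):
    """Map a normalized clench force in (REST, 1] to one of three levels."""
    for cutoff, label in CUTOFFS:
        if force < cutoff:
            return label
    return "hard"

class QuickReleaseDetector:
    """Confirm an action when force drops from a held level to rest quickly."""
    def __init__(self):
        self.samples = deque()                  # (timestamp, force) pairs

    def update(self, force, now=None):
        now = time.monotonic() if now is None else now
        self.samples.append((now, force))
        # Keep only samples inside the release window.
        while self.samples and now - self.samples[0][0] > RELEASE_WINDOW_S:
            self.samples.popleft()
        oldest = self.samples[0][1]
        if oldest > REST and force < REST:      # clenched -> relaxed within the window
            level = classify_level(oldest)      # the force level being confirmed
            self.samples.clear()
            return level
        return None
```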

vMotion: Designing a Seamless Walking Experience in VR

Conference Paper
Misha Sra, Xuhai Xu, Aske Mottelson, Pattie Maes
Proceedings of the 2018 Designing Interactive Systems Conference (DIS ’18)
Publication year: 2018

ABSTRACT

Physically walking in virtual reality can provide a satisfying sense of presence. However, natural locomotion in virtual worlds larger than the tracked space remains a practical challenge. Numerous redirected walking techniques have been proposed to overcome space limitations, but they often require rapid head rotation, sometimes induced by distractors, to keep the scene rotation imperceptible. We propose a design methodology that seamlessly integrates redirection into the virtual experience by taking advantage of the perceptual phenomenon of inattentional blindness. Additionally, we present four novel visibility control techniques that work with our design methodology to minimize the disruption to the user experience common in existing redirection techniques. A user study (N = 16) shows that our techniques are imperceptible and that users report significantly less dizziness when using our methods. The illusion of unconstrained walking in a large area (16 × 8 m) is maintained even though users are limited to a smaller (3.5 × 3.5 m) physical space.
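
To make the redirection idea concrete, the following hypothetical sketch applies a stronger scene rotation only while a visibility-control event masks it (e.g., an occluding scene element), and a near-imperceptible one otherwise. The rotation rates and the `masked` flag are assumptions, not the paper’s parameters.

```python
# Illustrative rotation rates; the `masked` flag would be set by a
# visibility-control technique while the rotation is hidden from the user.
MAX_MASKED_RATE_DEG_S = 10.0   # rotation rate allowed while redirection is masked (assumed)
BASELINE_RATE_DEG_S = 0.5      # near-imperceptible rate applied otherwise (assumed)

def redirection_step(scene_yaw_deg, target_yaw_deg, masked, dt):
    """Return the yaw increment (degrees) to apply to the virtual scene
    this frame, steering it toward target_yaw_deg."""
    # Shortest signed angular difference in [-180, 180).
    error = (target_yaw_deg - scene_yaw_deg + 180.0) % 360.0 - 180.0
    rate = MAX_MASKED_RATE_DEG_S if masked else BASELINE_RATE_DEG_S
    # Clamp the per-frame step to the allowed rate.
    return max(-rate * dt, min(rate * dt, error))
```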

Hand Range Interface: Information Always at Hand with A Body-centric Mid-air Input Surface

Conference Paper
Xuhai Xu, Alexandru Dancu, Pattie Maes, Suranga Nanayakkara
Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’18)
Publication year: 2018

ABSTRACT

Most interfaces of our interactive devices, such as phones and laptops, are flat and built as external devices in our environment, disconnected from our bodies. We therefore need to carry them in a pocket or bag and accommodate our bodies to their design by sitting at a desk or holding the device in our hand. We propose Hand Range Interface, an input surface that is always at our fingertips. This body-centric interface is a semi-sphere attached to the user’s wrist, with a radius equal to the distance from the wrist to the index finger. We prototyped the concept in virtual reality and conducted a user study with a pointing task. The input surface can be designed to rotate with the wrist or to stay fixed relative to it. We evaluated and compared participants’ subjective physical comfort, pointing speed, and pointing accuracy on the interface, which was divided into 64 regions. We found that the fixed-orientation interface performed substantially better, with a 41.2% higher average comfort score, 40.6% shorter average pointing time, and 34.5% lower average error. Our results reveal insights into user performance and preference across different regions of the interface. We conclude with a set of guidelines for future designers and developers of this new type of body-centric input surface.
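
For a concrete picture of the region mapping, the sketch below assigns an index-finger position to one of the 64 regions, assuming the regions form an 8 × 8 azimuth/elevation grid on the wrist-centered hemisphere; the paper’s exact tessellation and coordinate conventions may differ.

```python
import math

AZ_BINS, EL_BINS = 8, 8  # 8 x 8 = 64 regions (an assumed tessellation)

def region_index(fingertip, wrist):
    """Return (azimuth_bin, elevation_bin) for a fingertip point on the
    wrist-centered hemisphere. Both points are (x, y, z) in a frame whose
    orientation stays fixed (the better-performing condition in the study)."""
    dx, dy, dz = (f - w for f, w in zip(fingertip, wrist))
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    if r == 0.0:
        raise ValueError("fingertip coincides with wrist")
    azimuth = math.atan2(dy, dx)                        # [-pi, pi] around the wrist
    elevation = math.asin(max(-1.0, min(1.0, dz / r)))  # [-pi/2, pi/2]
    az_bin = min(AZ_BINS - 1, int((azimuth + math.pi) / (2 * math.pi) * AZ_BINS))
    el_frac = max(0.0, elevation) / (math.pi / 2)       # hemisphere: clamp below horizon
    el_bin = min(EL_BINS - 1, int(el_frac * EL_BINS))
    return az_bin, el_bin
```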

ForceBoard: Subtle Text Entry Leveraging Pressure

Conference Paper
MingYuan Zhong, Chun Yu, Qian Wang, Xuhai Xu, Yuanchun Shi
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18)
Publication year: 2018

ABSTRACT

We present ForceBoard, a pressure-based input technique that enables text entry through subtle finger motion. To enter text, users apply pressure to control a multi-letter-wide sliding cursor on a one-dimensional keyboard in alphabetical order, and confirm the selection with a quick release. We examined the error model of pressure control for successive and error-tolerant input, and incorporated it into a Bayesian algorithm to infer user input. A user study showed that, after 10 minutes of training, the average text entry rate reached 4.2 WPM (words per minute) for character-level input and 11.0 WPM for word-level input. Users reported that ForceBoard was easy to learn and interesting to use. These results demonstrate the feasibility of using pressure as the main channel for text entry. We conclude by discussing the limitations of ForceBoard, as well as its potential to support interaction under constraints imposed by form factor, social concerns, and the physical environment.
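
The following minimal sketch illustrates the kind of Bayesian inference the abstract refers to: the pressure-driven cursor position is treated as a noisy observation over the one-dimensional alphabetical keyboard and combined with a letter prior. The Gaussian noise width and the uniform prior are assumptions, not the paper’s fitted error model.

```python
import math
import string

ALPHABET = string.ascii_lowercase
SIGMA = 1.5  # cursor-position noise in key widths (assumed, stand-in for the error model)

def letter_posterior(cursor_pos, prior=None):
    """P(letter | cursor position), with cursor_pos in key-index units (0..25)."""
    prior = prior or {c: 1.0 / 26 for c in ALPHABET}
    # Gaussian likelihood of the observed cursor position for each letter.
    likelihood = {
        c: math.exp(-((cursor_pos - i) ** 2) / (2 * SIGMA ** 2))
        for i, c in enumerate(ALPHABET)
    }
    z = sum(likelihood[c] * prior[c] for c in ALPHABET)
    return {c: likelihood[c] * prior[c] / z for c in ALPHABET}

# Example: a press that lands between 'd' (index 3) and 'e' (index 4).
post = letter_posterior(3.4)
print(max(post, key=post.get))  # -> 'd'
```

For word-level input, the same posterior could be accumulated per keystroke and combined with a word-frequency prior, which is one plausible reading of how a Bayesian decoder would reach the higher word-level rate.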

BreathVR: Leveraging Breathing as a Directly Controlled Interface for Virtual Reality Games

Conference Paper
Misha Sra*, Xuhai Xu*, Pattie Maes (* equal contribution)
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18)
Publication year: 2018

ABSTRACT

With virtual reality head-mounted displays rapidly becoming accessible to mass audiences, there is growing interest in new forms of natural input to enhance immersion and engagement for players. Research has explored physiological input for enhancing immersion in single-player games through indirectly controlled signals such as heart rate or galvanic skin response. In this paper, we propose breathing as a directly controlled physiological signal that can facilitate unique and engaging play experiences through natural interaction in single- and multiplayer virtual reality games. Our study (N = 16) shows that participants report a higher sense of presence and find the gameplay more fun and challenging when using our breathing actions. From our study observations and analysis, we present five design strategies that can aid virtual reality game designers interested in using directly controlled forms of physiological input.
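
As a purely illustrative sketch of turning a breath-sensor stream into discrete game actions, the detector below fires on a sustained deliberate exhale or inhale. The signal convention (positive airflow = exhale) and all thresholds are assumptions; the paper’s actual sensing pipeline and action set may differ.

```python
# Illustrative thresholds on a normalized airflow signal (assumed convention:
# positive = exhale, negative = inhale).
EXHALE_THRESHOLD = 0.6   # deliberate exhale level (assumed)
INHALE_THRESHOLD = -0.4  # deliberate inhale level (assumed)
MIN_ACTION_S = 0.15      # must hold the level this long to count (assumed)

class BreathActionDetector:
    def __init__(self):
        self.state = "idle"
        self.held = 0.0

    def update(self, airflow, dt):
        """Feed one normalized sample; returns 'exhale', 'inhale', or None."""
        if airflow > EXHALE_THRESHOLD:
            candidate = "exhale"
        elif airflow < INHALE_THRESHOLD:
            candidate = "inhale"
        else:
            candidate = "idle"
        if candidate == self.state and candidate != "idle":
            self.held += dt
            if self.held >= MIN_ACTION_S:
                self.held = float("-inf")   # fire once per sustained action
                return candidate
        else:
            self.state, self.held = candidate, 0.0
        return None
```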