Effects of Past Interactions on User Experience with Recommended Documents

Conference Paper
Farnaz Jahanbakhsh, Ahmed Hassan Awadallah, Susan T. Dumais, Xuhai Xu
Proceedings of the 2020 Conference on Human Information Interaction and Retrieval (CHIIR ’20)
Publication year: 2020

ABSTRACT

Recommender systems are commonly used in entertainment, news, e-commerce, and social media. Document recommendation is a new and under-explored application area, in which both re-finding and discovery of documents need to be supported. In this paper, we provide an initial exploration of users’ experience with recommended documents, with a focus on how prior interactions influence recognition and interest. Through a field study of more than 100 users, we investigate the effects of past interactions with recommended documents on users’ recognition of, prior intent to open, and interest in the documents. We examined different presentations of interaction history, and the recency and richness of prior interaction. We found that presentation influenced only recognition time. Our findings also indicate that people are more likely to recognize documents they had accessed recently, and to do so more quickly. Similarly, documents that people had interacted with more deeply were also recognized more frequently and quickly. However, people were more interested in older documents or those with which they had less involved interactions. This finding suggests that in addition to helping users quickly access documents they intend to re-find, document recommendation can add value in helping users discover other documents. Our results offer implications for designing document recommendation systems that help users fulfill different needs.
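For illustration only, a minimal sketch of the analysis shape this abstract describes: bucketing prior interactions by recency and comparing recognition rates across buckets. The records, bucket boundaries, and field layout below are hypothetical assumptions, not data or code from the paper.

```python
from collections import defaultdict

# Hypothetical interaction records: (days_since_last_access, recognized).
# None of this data comes from the paper; it only shows the analysis shape.
records = [
    (1, True), (2, True), (5, True), (12, False),
    (30, False), (45, True), (90, False), (3, True),
]

def recency_bucket(days):
    """Assumed bucket boundaries, chosen only for illustration."""
    if days <= 7:
        return "past week"
    if days <= 30:
        return "past month"
    return "older"

counts = defaultdict(lambda: [0, 0])  # bucket -> [recognized, total]
for days, recognized in records:
    bucket = counts[recency_bucket(days)]
    bucket[0] += recognized
    bucket[1] += 1

for name, (hits, total) in counts.items():
    print(f"{name}: recognition rate {hits / total:.0%} (n={total})")
```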

EarBuddy: Enabling On-Face Interaction via Wireless Earbuds

Conference Paper
Xuhai Xu, Haitian Shi, Xin Yi, Wenjia Liu, Yukang Yan, Yuanchun Shi, Alex Mariakakis, Jennifer Mankoff, Anind K. Dey
Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20)
Publication year: 2020

ABSTRACT

Past research on on-body interaction has typically required custom sensors, limiting its scalability and generalizability. We propose EarBuddy, a real-time system that leverages the microphone in commercial wireless earbuds to detect tapping and sliding gestures near the face and ears. We developed a design space to generate 27 valid gestures and conducted a user study (N=16) to select the eight gestures that were optimal for both human preference and microphone detectability. We collected a dataset of those eight gestures (N=20) and trained deep learning models for gesture detection and classification. Our optimized classifier achieved an accuracy of 95.3%. Finally, we conducted a user study (N=12) to evaluate EarBuddy’s usability. Our results show that EarBuddy can facilitate novel interaction and that users feel very positively about the system. EarBuddy provides a new eyes-free, socially acceptable input method that is compatible with commercial wireless earbuds and has the potential for scalability and generalizability.
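As a rough illustration of the detection/classification split the abstract describes, here is a minimal sketch of a detection stage that flags high-energy audio windows as candidate gestures. The paper trains deep learning models for both stages; the energy gate, sampling rate, and thresholds below are simplifying assumptions, not the authors' implementation.

```python
import numpy as np

SAMPLE_RATE = 16_000   # assumed earbud microphone sampling rate
FRAME = 512            # samples per analysis frame (assumption)

def detect_gesture_windows(audio, energy_thresh=0.02, min_frames=4):
    """Detection stage stand-in: flag contiguous runs of high short-time
    energy as candidate tap/slide events. The paper uses a learned
    detector; an energy gate is only the simplest illustration."""
    frames = audio[: len(audio) // FRAME * FRAME].reshape(-1, FRAME)
    energy = (frames ** 2).mean(axis=1)
    active = energy > energy_thresh
    windows, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = i
        elif not on and start is not None:
            if i - start >= min_frames:
                windows.append((start * FRAME, i * FRAME))
            start = None
    if start is not None and len(active) - start >= min_frames:
        windows.append((start * FRAME, len(active) * FRAME))
    return windows

# Classification stage would then map each window's spectrogram to one of
# the eight gestures with a trained network (not reproduced here).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    audio = rng.normal(0, 0.005, SAMPLE_RATE)     # quiet background
    audio[4000:6000] += rng.normal(0, 0.3, 2000)  # one synthetic "tap"
    print(detect_gesture_windows(audio))
```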

Understanding User Behavior For Document Recommendation

Conference Paper
Xuhai Xu, Ahmed Hassan Awadallah, Susan T. Dumais, Farheen Omar, Bogdan Popp, Robert Rounthwaite, Farnaz Jahanbakhsh
Proceedings of the 2020 World Wide Web Conference (WWW ’20)
Publication year: 2020

ABSTRACT

Personalized document recommendation systems aim to provide users with a quick shortcut to the documents they may want to access next, usually with an explanation of why each document is recommended. Previous work explored various methods for better recommendations and better explanations in different domains. However, few efforts have closely studied how users react to the recommended items in a document recommendation scenario. We conducted a large-scale log study of users’ interaction behavior with explainable recommendations on office.com, one of the largest cloud document platforms. Our analysis reveals a number of factors, including display position, file type, authorship, recency of last access, and, most importantly, the recommendation explanations, that are associated with whether users will recognize or open the recommended documents. Moreover, we focus specifically on explanations and conduct an online experiment to investigate the influence of different explanations on user behavior. Our analysis indicates that the recommendations help users access their documents significantly faster, but that users sometimes miss a recommendation and resort to other, more complicated methods to open the documents. Our results suggest opportunities to improve explanations and, more generally, the design of systems that provide and explain recommendations for documents.
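To make the log-analysis idea concrete, here is a hedged sketch of relating impression-level factors (display position, recency of last access, presence of an explanation) to whether a recommendation is opened, using a logistic model. All feature names and records are invented for illustration; the paper's actual analysis of production logs is far richer.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature rows per impression of a recommended document:
# [display_position, days_since_last_access, has_explanation].
# These numbers are made up; the real study analyzes production logs.
X = np.array([
    [1, 1, 1], [1, 3, 0], [2, 2, 1], [3, 10, 0],
    [1, 30, 1], [4, 5, 0], [2, 1, 1], [5, 60, 0],
])
y = np.array([1, 1, 1, 0, 0, 0, 1, 0])  # 1 = user opened the document

model = LogisticRegression().fit(X, y)
for name, coef in zip(["position", "recency", "explanation"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")  # sign hints at direction of association
```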

Clench Interaction: Novel Biting Input Techniques

Conference Paper
Xuhai Xu, Chun Yu, Anind K. Dey, Jennifer Mankoff
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19)
Publication year: 2019

ABSTRACT

Biting is one of the most fundamental and natural actions that people perform every day. Existing work has explored tooth-click location and jaw movement as input techniques; however, clenching has the potential to add control to this input channel. We propose clench interaction, which leverages clenching as an actively controlled physiological signal that can facilitate interactions. We conducted a user study to investigate users’ ability to control their clench force. We found that users can easily discriminate three force levels, and that they can quickly confirm actions by unclenching (quick release). We developed a design space for clench interaction based on the results and investigated the usability of the clench interface. Participants preferred the clench interface over baselines and indicated a willingness to use clench-based interactions. This novel technique can provide an additional input method in cases where users’ eyes or hands are busy, augment immersive experiences such as virtual/augmented reality, and assist individuals with disabilities.
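A minimal sketch of the interaction logic the abstract implies: quantize a continuous clench-force stream into three levels and confirm the held level on a quick release. The thresholds and release criterion below are assumptions for illustration, not the paper's implementation.

```python
# Assumed force thresholds (normalized 0..1), not values from the paper.
LEVELS = [(0.15, "light"), (0.45, "medium"), (0.75, "hard")]

def classify(force):
    """Return the highest level whose threshold the force exceeds, or None."""
    label = None
    for thresh, name in LEVELS:
        if force >= thresh:
            label = name
    return label

def confirm_on_release(samples, release_drop=0.3):
    """Emit the held level when force drops sharply (the 'quick release')."""
    held = None
    for prev, cur in zip(samples, samples[1:]):
        level = classify(cur)
        if level is not None:
            held = level
        elif held is not None and prev - cur >= release_drop:
            yield held          # action confirmed by unclenching
            held = None

stream = [0.0, 0.2, 0.5, 0.8, 0.82, 0.1, 0.0]  # synthetic force samples
print(list(confirm_on_release(stream)))         # -> ['hard']
```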

vMotion: Designing a Seamless Walking Experience in VR

Conference Paper
Misha Sra, Xuhai Xu, Aske Mottelson, Pattie Maes
Proceedings of the 2018 Designing Interactive Systems Conference (DIS ’18)
Publication year: 2018

ABSTRACT

Physically walking in virtual reality can provide a satisfying sense of presence. However, natural locomotion in virtual worlds larger than the tracked space remains a practical challenge. Numerous redirected walking techniques have been proposed to overcome space limitations, but they often require rapid head rotation, sometimes induced by distractors, to keep the scene rotation imperceptible. We propose a design methodology that seamlessly integrates redirection into the virtual experience by taking advantage of the perceptual phenomenon of inattentional blindness. Additionally, we present four novel visibility control techniques that work with our design methodology to minimize the disruption to the user experience commonly found in existing redirection techniques. A user study (N=16) shows that our techniques are imperceptible and that users report significantly less dizziness when using our methods. The illusion of unconstrained walking in a large area (16×8 m) is maintained even though users are limited to a smaller (3.5×3.5 m) physical space.
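To illustrate the underlying redirection mechanism, here is a toy sketch of injecting a small extra scene rotation, amplified during moments when the experience design captures the user's attention (the inattentional-blindness window). The gains and the function interface are invented for illustration and do not reproduce the paper's techniques.

```python
import math

# Illustrative constants, not values from the paper.
BASE_GAIN = 0.01    # extra scene yaw injected per radian of head turn
BLIND_BONUS = 4.0   # amplification while attention is captured elsewhere

def redirect(head_yaw_delta, toward_center, attention_captured):
    """Return extra scene yaw steering the user toward the tracked-space
    center; redirection is amplified inside inattentional-blindness
    windows created by the experience design."""
    gain = BASE_GAIN * (BLIND_BONUS if attention_captured else 1.0)
    # Rotate in whichever direction re-centers the user's physical path.
    return math.copysign(gain * abs(head_yaw_delta), toward_center)

# Example: a 10-degree head turn while a distraction holds attention.
extra = redirect(math.radians(10), toward_center=+1.0, attention_captured=True)
print(f"injected rotation: {math.degrees(extra):.2f} degrees")
```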

Hand Range Interface: Information Always at Hand with A Body-centric Mid-air Input Surface

Conference Paper
Xuhai Xu, Alexandru Dancu, Pattie Maes, Suranga Nanayakkara
Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’18)
Publication year: 2018

ABSTRACT

Most interfaces of our interactive devices, such as phones and laptops, are flat and are built as external devices in our environment, disconnected from our bodies. We therefore need to carry them in a pocket or a bag and accommodate our bodies to their design by sitting at a desk or holding the device in our hand. We propose Hand Range Interface, an input surface that is always at our fingertips. This body-centric interface is a semi-sphere attached to the user’s wrist, with a radius equal to the distance from the wrist to the index finger. We prototyped the concept in virtual reality and conducted a user study with a pointing task. The input surface can be designed either to rotate with the wrist or to stay fixed relative to the wrist. We evaluated and compared participants’ subjective physical comfort, pointing speed, and pointing accuracy on the interface, which was divided into 64 regions. We found that the orientation-fixed interface performed much better, with a 41.2% higher average comfort score, 40.6% shorter average pointing time, and 34.5% lower average error. Our results reveal insights into user performance and preference across the different regions of the interface. We conclude with a set of guidelines for future designers and developers of this type of body-centric input surface.
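As a concrete (and purely illustrative) reading of the 64-region surface, one could tessellate the wrist-centered hemisphere into an 8×8 grid of azimuth and elevation bins. The paper's actual region layout may differ; the frame convention and binning below are assumptions.

```python
import math

def hemisphere_region(x, y, z, bins=8):
    """Map a fingertip point (in an assumed wrist-centered frame, with the
    hemisphere opening along +z) to one of bins*bins regions. An 8x8 split
    yields the 64 regions mentioned in the abstract; the paper's actual
    tessellation may differ."""
    azimuth = math.atan2(y, x) % (2 * math.pi)        # 0..2pi around wrist
    elevation = math.atan2(z, math.hypot(x, y))       # 0..pi/2 above plane
    a = min(int(azimuth / (2 * math.pi) * bins), bins - 1)
    e = min(int(elevation / (math.pi / 2) * bins), bins - 1)
    return a * bins + e                               # region index 0..63

print(hemisphere_region(0.1, 0.05, 0.12))
```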

ForceBoard: Subtle Text Entry Leveraging Pressure

Conference Paper
MingYuan Zhong, Chun Yu, Qian Wang, Xuhai Xu, Yuanchun Shi
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18)
Publication year: 2018

ABSTRACT

We present ForceBoard, a pressure-based input technique that enables text entry through subtle finger motion. To enter text, users apply pressure to control a multi-letter-wide sliding cursor on a one-dimensional keyboard with alphabetical ordering, and confirm the selection with a quick release. We examined the error model of pressure control for successive and error-tolerant input, and incorporated it into a Bayesian algorithm to infer user input. A user study showed that, after 10 minutes of training, the average text entry rate reached 4.2 WPM (words per minute) for character-level input and 11.0 WPM for word-level input. Users reported that ForceBoard was easy to learn and interesting to use. These results demonstrate the feasibility of using pressure as the main channel for text entry. We conclude by discussing ForceBoard’s limitations, as well as its potential to support interaction under constraints of form factor, social setting, and physical environment.
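A minimal sketch of the Bayesian decoding idea: each letter occupies a nominal pressure on the one-dimensional alphabetical keyboard, and the decoder combines a Gaussian pressure error model with a letter prior. The noise parameter and uniform prior below are placeholders, not the error model fitted in the paper.

```python
import math
import string

# Each letter sits at a nominal pressure on the alphabetical keyboard.
# SIGMA and the uniform prior are placeholder assumptions.
LETTERS = string.ascii_lowercase
NOMINAL = {c: (i + 0.5) / len(LETTERS) for i, c in enumerate(LETTERS)}
SIGMA = 0.05

def posterior(observed_pressure, prior=None):
    """Posterior over letters: Gaussian likelihood of the observed
    pressure around each letter's nominal pressure, times a prior
    (a language model would go here)."""
    prior = prior or {c: 1 / len(LETTERS) for c in LETTERS}
    scores = {
        c: prior[c] * math.exp(-((observed_pressure - mu) ** 2) / (2 * SIGMA**2))
        for c, mu in NOMINAL.items()
    }
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

post = posterior(0.28)  # a press at ~28% of the pressure range
best = max(post, key=post.get)
print(best, round(post[best], 2))
```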

BreathVR: Leveraging Breathing as a Directly Controlled Interface for Virtual Reality Games

Conference Paper
Misha Sra*, Xuhai Xu*, Pattie Maes (*Equal contribution)
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18)
Publication year: 2018

ABSTRACT

With virtual reality head-mounted displays rapidly becoming accessible to mass audiences, there is growing interest in new forms of natural input techniques that enhance immersion and engagement for players. Research has explored physiological input for enhancing immersion in single-player games through indirectly controlled signals like heart rate or galvanic skin response. In this paper, we propose breathing as a directly controlled physiological signal that can facilitate unique and engaging play experiences through natural interaction in single-player and multiplayer virtual reality games. Our study (N=16) shows that participants report a higher sense of presence and find the gameplay more fun and challenging when using our breathing actions. From our study observations and analysis, we present five design strategies that can aid virtual reality game designers interested in using directly controlled forms of physiological input.
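As a toy illustration of breathing as a directly controlled signal, the sketch below derives discrete inhale/exhale events from a normalized breath-amplitude stream using a hysteresis band. The thresholds and the mapping to game actions are assumptions, not the paper's system.

```python
# Hysteresis band avoids jitter near a single threshold (assumed values).
INHALE_THRESH, EXHALE_THRESH = 0.6, 0.4

def breath_events(amplitudes):
    """Turn a normalized breath-amplitude stream into discrete events."""
    state = "idle"
    for a in amplitudes:
        if state != "inhaling" and a > INHALE_THRESH:
            state = "inhaling"
            yield "inhale"      # e.g. charge an action in the game
        elif state != "exhaling" and a < EXHALE_THRESH:
            state = "exhaling"
            yield "exhale"      # e.g. release the action

stream = [0.5, 0.55, 0.65, 0.7, 0.5, 0.35, 0.3, 0.62]
print(list(breath_events(stream)))  # -> ['inhale', 'exhale', 'inhale']
```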