Two New Papers: CHI 2019 and IUI SC 2019

02.7.2019 | research

I have been continuing my research on adaptive techniques for Voice User Interfaces (VUIs). Recently, I had two papers accepted based on studies related to my dissertation work. Our latest study from last summer has been accepted and will be presented at this year's CHI 2019! The paper focuses on better understanding the individual user characteristics that impact behavior patterns with a VUI. This study continues our evaluation of our VUI, DiscoverCal.

The Impact of User Characteristics and Preferences on Performance with an Unfamiliar Voice User Interface

Abstract: Voice User Interfaces (VUIs) are increasing in popularity. However, their invisible nature with no or limited visuals makes it difficult for users to interact with unfamiliar VUIs. We analyze the impact of user characteristics and preferences on how users interact with a VUI-based calendar, DiscoverCal. While recent VUI studies analyze user behavior through self-reported data, we extend this research by analyzing both VUI usage data and self-reported data to observe correlations between both data types. Results from our user study (n=50) led to four key findings: 1) programming experience did not have a widespread impact on performance metrics while 2) assimilation bias did, 3) participants with more technical confidence exhibited a trial-and-error approach, and 4) desiring more guidance from our VUI correlated with performance metrics that indicate cautious users.

I have also been accepted into IUI 2019's Student Consortium as one of 12 students! My accepted paper presents my latest design for an adaptive technique to improve the learnability of a VUI.

Adaptive Suggestions to Increase Learnability for Voice User Interfaces

Abstract: Voice User Interfaces (VUIs) are growing in popularity. However, their lack of visual guidance challenges users in learning how to proficiently operate these systems. My research focuses on adapting a VUI’s spoken feedback to suggest verbal commands to users encountering errors. Based on observations from my previous research, these adaptive suggestions adapt to the detected user proficiency and predicted user goal to customize feedback to support user needs. The objectives of this technique are to guide users to 1) learn what verbal commands execute VUI actions and 2) learn the actions supported to accomplish desired tasks with the system.
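To give a rough sense of the idea, here is a minimal, hypothetical sketch of an adaptive suggestion policy: given a predicted user goal and a simple proxy for detected proficiency (here, a consecutive-error count), it selects which verbal command to suggest in the VUI's spoken feedback. The command phrases, goal names, and thresholds are all illustrative assumptions, not the actual DiscoverCal implementation.

```python
# Hypothetical commands a calendar VUI might support, grouped by user goal.
# Each list goes from the short form to the most explicit, fully templated form.
COMMANDS_BY_GOAL = {
    "add_event": [
        "add event",
        "add event called <name> on <date>",
    ],
    "check_schedule": [
        "what is on my calendar",
        "what is on my calendar on <date>",
    ],
}

def suggest_command(predicted_goal: str, error_count: int) -> str:
    """Pick a command suggestion tailored to the predicted goal.

    Lower detected proficiency (approximated here by consecutive errors)
    yields the more explicit, fully templated suggestion; proficient
    users get the short form.
    """
    options = COMMANDS_BY_GOAL.get(predicted_goal)
    if options is None:
        # Unknown goal: fall back to generic guidance.
        return "say 'help' to hear what you can do"
    return options[-1] if error_count >= 2 else options[0]

# A struggling user gets the explicit template; a confident one the short form.
print(suggest_command("add_event", error_count=3))
print(suggest_command("check_schedule", error_count=0))
```

In a real system the proficiency estimate and goal prediction would come from models over usage data rather than a raw error count, but the feedback-selection step can stay this simple.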

I am extremely honored for these acceptances and look forward to continuing my research. If you are attending CHI 2019 or IUI 2019 this year, stay in touch! I would love to meet up.