eBay Open-Sources Technology that Uses Head Motion to Navigate User Interface on iPhone X
By: Muratcan Cicek, eBay Intern and PhD Candidate at University of California, Santa Cruz
We built a hands-free app to make online shopping easier for people with motor impairments.
Do you like shopping alone? I wish I could shop by myself, without my mother’s incessant color suggestions and my best friend’s unwarranted comments on my brand preferences.
As someone with extensive motor impairments, I do not have full control of my limbs. Consequently, I am unable to walk or grab anything with my hands. These limitations hinder my ability to perform everyday tasks, like going to the grocery store and shopping independently—even though I have my own income.
This year, as part of my internship project at eBay, my team and I developed HeadGaze, a reusable technology library that tracks head movement on your iPhone X. Starting today, the technology is available as open source on GitHub.com. The first of its kind, it uses Apple ARKit and the iPhone X camera to track your head motion so you can navigate your phone easily without using your hands.
Our team built a model that creates a virtual stylus that follows the motion of your head (up, down, side to side), taking the head’s 3D pose information from ARKit and applying 3D geometry mapping to locate the “cursor” on the screen. We also designed and implemented new user interface widgets that sense and respond to cursor interaction. Much as a mouse drives the cursor on a desktop, this design lets you point to any location on the screen with your head and activate designated “buttons”.
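The article doesn’t spell out HeadGaze’s exact projection math, but the core idea of turning head orientation into a screen cursor can be sketched as intersecting the head’s gaze ray with the screen plane. The function below is an illustrative Python approximation (the actual library is written in Swift against ARKit); the viewing distance, screen dimensions, and parameter names are assumptions for illustration, not HeadGaze’s API.

```python
import math

def cursor_from_head_pose(pitch, yaw, distance_cm=30.0,
                          screen_w_cm=6.2, screen_h_cm=13.5,
                          screen_w_px=1125, screen_h_px=2436):
    """Map head pitch/yaw (radians) to an on-screen cursor position.

    Simplified model: the head is a fixed point `distance_cm` in front
    of the screen, and its gaze ray is intersected with the screen
    plane. Real head tracking would also use the head's 3D position.
    """
    # Offsets on the screen plane, in cm, from the screen center
    x_cm = distance_cm * math.tan(yaw)
    y_cm = distance_cm * math.tan(pitch)
    # Convert to pixel coordinates (origin at top-left, y grows down)
    x_px = screen_w_px / 2 + x_cm * (screen_w_px / screen_w_cm)
    y_px = screen_h_px / 2 - y_cm * (screen_h_px / screen_h_cm)
    # Clamp the cursor to the screen bounds
    x_px = min(max(x_px, 0), screen_w_px)
    y_px = min(max(y_px, 0), screen_h_px)
    return x_px, y_px
```

Looking straight ahead lands the cursor at the screen center; turning the head right or tilting it up moves the cursor accordingly.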
We further designed new user interface components that register head gestures and respond accordingly. For example, they sense the cursor’s movement and how long it has lingered in one spot to trigger a click. Once activated, the buttons unlock different functions, like scrolling the screen, moving between pages, and selecting the product you wish to purchase, without touching the screen at all. The fusion of head-based control and the new UI widgets gives an existing iOS app hands-free interaction. Our modular code design lets developers integrate our features into their existing or future apps with minimal code change.
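The dwell-to-click behavior described above can be modeled as a small state machine: record when the cursor enters a widget and fire once it has stayed long enough. This is a hypothetical Python sketch of the concept only; the class and method names are not HeadGaze’s Swift API.

```python
class DwellButton:
    """A button that 'clicks' after the cursor dwells on it.

    Illustrative sketch of dwell-based activation; not HeadGaze code.
    """

    def __init__(self, rect, dwell_time=1.0):
        self.rect = rect              # (x, y, width, height)
        self.dwell_time = dwell_time  # seconds of hover before a click
        self.hover_start = None       # when the cursor entered the button

    def contains(self, x, y):
        rx, ry, rw, rh = self.rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    def update(self, x, y, now):
        """Feed the cursor position each frame; return True on a click."""
        if self.contains(x, y):
            if self.hover_start is None:
                self.hover_start = now  # cursor just arrived
            elif now - self.hover_start >= self.dwell_time:
                self.hover_start = None  # reset so the click fires once
                return True
        else:
            self.hover_start = None      # cursor left: cancel the dwell
        return False
```

Each frame, the app would feed every visible widget the current cursor position; whichever widget accumulates enough dwell time fires its action.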
We put this to the test by developing the HeadSwipe app (also open-sourced with HeadGaze), which lets users swipe through deals on eBay. Specifically, the app empowers people to browse and buy items with simple head motions. For example, pointing your head toward the up and down buttons scrolls through daily deals in different categories (a vertical swipe), while pointing toward the left and right buttons steps through items one after another (a horizontal swipe), like a carousel.
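The navigation model just described (vertical gestures switch categories, horizontal gestures step through deals within a category) can be illustrated with a toy carousel. The class and sample data below are hypothetical, for illustration only, and don’t reflect HeadSwipe’s actual implementation.

```python
class DealCarousel:
    """Toy model of HeadSwipe navigation: up/down changes category,
    left/right steps through items within the current category."""

    def __init__(self, categories):
        self.categories = categories      # dict: category name -> items
        self.names = list(categories)
        self.cat = 0                      # index of the current category
        self.item = 0                     # index within that category

    def swipe(self, direction):
        """Apply a head-gesture swipe and return the selected deal."""
        if direction == "down":
            self.cat = (self.cat + 1) % len(self.names)
            self.item = 0
        elif direction == "up":
            self.cat = (self.cat - 1) % len(self.names)
            self.item = 0
        else:  # "left" or "right": wrap around within the category
            items = self.categories[self.names[self.cat]]
            step = 1 if direction == "right" else -1
            self.item = (self.item + step) % len(items)
        return self.current()

    def current(self):
        name = self.names[self.cat]
        return name, self.categories[name][self.item]
```

In the real app, each `swipe(...)` call would be triggered by dwelling on the corresponding directional button rather than by a touch gesture.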
As part of the Partnership on AI, an association of businesses focused on establishing best practices for artificial intelligence systems and educating the public about AI, eBay is dedicated to investigating the profound impact that many expect AI will have on the world. It is because of HeadGaze’s potential to make a tremendous impact on the lives of many people that we are open-sourcing this tool. We want to encourage developers to build more apps that don’t require screen touch.
With the help of Assistive Technology (AT), I have been fortunate enough to pursue a PhD and complete my internship. Designed to improve the functional capabilities of anyone with a disability, AT helps me read my course books, type on a computer to complete my assignments, and write code. It has also enabled me to develop artificial intelligence (AI) to help the greater community. Yet while AT helps people with disabilities perform many everyday tasks, no existing tool considers our needs when shopping online. And with 39.5 million Americans currently considered physically disabled, according to the Centers for Disease Control and Prevention, we saw an opportunity to create a tool that would promote independence.
HeadGaze enables you to scroll and interact on your phone with only subtle head movements. Think of all the ways that this could be brought to life. Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?
In addition to this head-gazing experience, we’re exploring an experience that tracks eye movements. The fusion of these gazing experiences opens up broader possibilities for defining various hands-free gestures, enabling far more interesting applications.
HeadGaze, along with its example app HeadSwipe, is now available on GitHub.com.