
Use Cases of Eye Tracking on Mobile Phones

27. Jun 2019

Apple's iOS has it, Android has it. But what can we actually use it for? Modern smartphones are capable of accurately tracking the user's gaze point using their front-facing cameras, a feature that app developers have rarely picked up so far.

When Apple introduced ARKit 2 in June 2018, they not only improved the general AR experience but also quietly added a feature that lets developers keep track of the user's gaze. To do so, the framework optically recognizes the user's eyeballs in the video from the phone's front-facing camera, estimates their poses and infers a vector that represents the estimated viewing direction. One interpretation of this data is the screen coordinate the user is currently focusing on. All of this is easily accessible through the ARFaceAnchor.
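For illustration, here is a minimal Swift sketch of reading that gaze data from an ARFaceAnchor in an ARKit face tracking session. It only logs the raw eye transforms and the look-at point; mapping the viewing direction to an actual screen coordinate requires additional projection math and, ideally, calibration, which is omitted here.

```swift
import ARKit

// Minimal sketch: requires a device with a TrueDepth camera and the
// camera usage permission in Info.plist. Not a full gaze-to-screen pipeline.
class GazeViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // Face tracking is only available on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Poses of both eyeballs, relative to the face anchor.
        let leftEye = faceAnchor.leftEyeTransform
        let rightEye = faceAnchor.rightEyeTransform

        // Point (in face-anchor space) the eyes are estimated to converge on.
        let lookAtPoint = faceAnchor.lookAtPoint

        print("left eye: \(leftEye), right eye: \(rightEye), look-at: \(lookAtPoint)")
    }
}
```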



Direct interaction is the obvious approach when thinking about what to do with information about where the user is looking in an app. This idea has been around for quite a while, and there are good reasons why this input modality might be advantageous for some people. More interesting for the mass market, however, is indirect gaze interaction. Here are some thoughts on what this technology could be used for.

During my academic career we did a lot of research on eye tracking, and I am excited to use this new technology for our customers at DieProduktMacher as well.

QA & Usability Testing

A first and already quite concrete use case we came up with tackles the problem of quality assurance for mobile websites. Mobile apps can display arbitrary websites inside a containing element, a WebView. Eye tracking combined with speech input could then enable the user to annotate certain UI elements simply by looking at them while speaking their thoughts aloud. The beauty of this approach is that it now works with a single device. Before ARKit 2 and similar frameworks introduced this functionality, such an implementation would have required additional eye tracking hardware, either head-mounted or external.
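As a rough illustration of the annotation part, the following sketch assumes a gaze-to-screen mapping already exists (for example derived from the ARKit data above) and looks up the DOM element under the gaze point inside a WKWebView. The helper name, the note parameter and the coordinate assumptions are purely illustrative; the transcribed speech would be attached to the found element as an annotation.

```swift
import WebKit

// Hypothetical helper: maps an estimated gaze point (in view coordinates)
// to the DOM element the user is looking at inside a WKWebView and logs
// the spoken note for it. Assumes the web view is not zoomed, so native
// points roughly correspond to CSS pixels in the viewport.
func annotateElement(at gazePoint: CGPoint, in webView: WKWebView, note: String) {
    let js = """
    (function() {
        var el = document.elementFromPoint(\(gazePoint.x), \(gazePoint.y));
        return el ? el.tagName + (el.id ? '#' + el.id : '') : null;
    })();
    """
    webView.evaluateJavaScript(js) { result, error in
        if let selector = result as? String {
            print("Annotation for \(selector): \(note)")
        }
    }
}
```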

A very common use case for such additional eye tracking devices was usability testing. Instead of running dedicated studies, gaze data can now be collected during the day-to-day usage of the app itself: every user already has the required camera pointed roughly at their eyes most of the time anyway. Of course, this assumes that the user consents to being tracked in order to help improve the app's experience. Hawkeye - User Testing is one example of a dedicated user testing app that applies eye tracking to mobile websites.

Intelligent Human-Computer Interfaces

Some observations about the user’s behavior can be detected automatically. The scientific literature provides examples of so-called Intelligent Human-Computer Interfaces that do exactly that and even adapt themselves accordingly to improve the user’s experience.

  • What’s the user interested in? Our eyes can tell what we’re interested in. It is possible to predict how interesting a user would rate certain areas of a UI, a hypothesis that was shown to hold for listed apps in the Google Play Store. The prediction is based on the total time a user focused on a certain area, that time in relation to the total time spent in the app, and the time until the user focused on the area for the first time; with these features, interest ratings could be predicted with an accuracy of just over 90% (a sketch of these features follows this list). Insights like this can be used to arrange items in comparable listing interfaces, e.g. e-commerce sites, ordering the offered items based on historical attention data. Alternatively, the same data could be used to spot the most promising item positions.

  • Detect uncertainty: There are recognizable patterns in our gaze behavior that indicate uncertainty. Being able to detect these in real time allows an app to react immediately by providing the user with meaningful assistance. A nice example: a user is confronted with a text in a foreign language, and the app provides translations of certain words or sentences when it sees them struggling.
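As referenced in the first list item, here is a minimal, purely illustrative sketch (not the method from the cited study) of how the three interest features could be derived from a stream of timestamped gaze samples, assuming each sample has already been mapped to a UI area.

```swift
import Foundation

// Illustrative feature extraction: total dwell time per UI area, dwell time
// relative to the session duration, and time to first fixation. The sample
// format and area IDs are assumptions made for this example.
struct GazeSample {
    let timestamp: TimeInterval   // seconds since session start
    let areaID: String            // UI area the gaze point was mapped to
}

struct InterestFeatures {
    var dwellTime: TimeInterval = 0          // total time spent looking at the area
    var relativeDwellTime: Double = 0        // dwell time / total session duration
    var timeToFirstFixation: TimeInterval?   // when the area was first focused
}

func interestFeatures(from samples: [GazeSample],
                      samplingInterval: TimeInterval,
                      sessionDuration: TimeInterval) -> [String: InterestFeatures] {
    var features: [String: InterestFeatures] = [:]
    for sample in samples {
        var f = features[sample.areaID, default: InterestFeatures()]
        f.dwellTime += samplingInterval   // each sample contributes one interval
        if f.timeToFirstFixation == nil {
            f.timeToFirstFixation = sample.timestamp
        }
        features[sample.areaID] = f
    }
    // Normalize dwell time by the total session duration.
    return features.mapValues { f in
        var g = f
        g.relativeDwellTime = f.dwellTime / sessionDuration
        return g
    }
}
```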

Limitations

Eye tracking on mobile phones still has to improve in many ways, and it is going to be interesting to see how this process unfolds. Calibration methods, for example, will play a huge role. Every enhancement along the way will enable new features and applications, so it is crucial to stay on top of the current development. For now, it has to be decided on a case-by-case basis whether the technology is already good enough to satisfy the specific requirements.

Conclusion

Live eye tracking data just became a lot easier to obtain, and we will start making use of that. The ideas mentioned above are just some of many potential future use cases that could both help companies improve their products and make user interfaces a whole lot more intelligent. DieProduktMacher has been part of the ZeroUI movement from the early days, and we see eye tracking as the next building block towards meaningful and intuitive interaction. Don’t hesitate to get in touch if you’re interested in implementing this in your product.

