The Implications Of The Interface That Watches You

This coming week, we’re likely to get a peek at Samsung’s next-generation Galaxy flagship smartphone, and by most accounts it will have an auto-scrolling feature that uses head-movement cues to detect when you’re paying attention to what’s on the screen and when you look away. There’s no word on just how precise it will be, but others are prepping tech that tracks eye movement accurately enough to determine not just when someone is facing a screen, but exactly where their attention is focused.

Phones that pay at least as much attention to a user as the user pays to them are coming, whether in the form of the Galaxy S IV or not. And when they do, they’ll bring tremendous innovation potential to the application market, lucrative opportunities for mobile advertisers, and privacy concerns that make the ones raised by the rise of mobile location services over the past five years look minor by comparison.

Changes to Apps

Auto-scrolling only scratches the surface of what an app could do by detecting where a user’s attention is focused. It would certainly be useful, and it could make the notion of “pages” and even scroll bars irrelevant on mobile devices. But it’s minor compared to how software interfaces might change once developers have data about when and where people look and focus while using an app.
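To make the idea concrete, here’s a rough sketch of how gaze-driven auto-scroll might work. Nobody outside Samsung knows how the real feature is built, so the tracker input and thresholds below are purely hypothetical:

```python
# A minimal sketch of gaze-driven auto-scroll, assuming a hypothetical
# tracker that reports where on screen the user is looking (normalized
# 0.0 = top of screen, 1.0 = bottom) or None when the user looks away.

SCROLL_ZONE = 0.8      # gaze below this point means "reading near the bottom"
DWELL_FRAMES = 15      # require sustained gaze before scrolling (~0.5s at 30fps)
SCROLL_STEP = 40       # pixels to advance per trigger

def auto_scroll_delta(gaze_y, dwell_count):
    """Return (scroll_delta, new_dwell_count) for one tracker frame."""
    if gaze_y is None:              # user looked away: pause, don't scroll
        return 0, 0
    if gaze_y >= SCROLL_ZONE:       # user is reading near the bottom edge
        dwell_count += 1
        if dwell_count >= DWELL_FRAMES:
            return SCROLL_STEP, 0   # scroll and reset the dwell timer
        return 0, dwell_count
    return 0, 0                     # reading mid-screen: nothing to do
```

The dwell counter is the interesting design choice: it keeps a stray glance toward the bottom of the screen from yanking the page away, while looking away entirely simply freezes your place.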

Why stick with one design when you can generate one tailored to each user?

Imagine a dynamic interface that changes on the fly depending on a user’s habits: given enough data and enough clever engineering, app layout could become the next frontier for personalization. Just as developers today build recommendation engines to make sure that every time a user opens an app they’re greeted with the most relevant and engaging content possible, tomorrow we could see apps that eschew the “one design fits most” ethos in favor of a strategy that really can be everything to everyone.
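Here’s one rough sketch of what that might look like in practice, assuming an app that logs how long a user’s gaze rests on each section of its home screen. The section names and logging API are invented for illustration:

```python
# A minimal sketch of a gaze-personalized layout: accumulate dwell time
# (seconds of gaze) per UI section, then reorder the layout so the
# sections a user actually looks at rise to the top of the screen.

from collections import defaultdict

dwell_seconds = defaultdict(float)   # section name -> accumulated gaze time

def record_gaze(section, seconds):
    """Called by the (hypothetical) eye tracker as gaze data arrives."""
    dwell_seconds[section] += seconds

def personalized_order(default_order):
    """Reorder sections by observed attention; ties keep the default order."""
    return sorted(default_order, key=lambda s: -dwell_seconds[s])

record_gaze("headlines", 42.0)
record_gaze("weather", 3.5)
record_gaze("scores", 18.2)
print(personalized_order(["weather", "headlines", "scores"]))
# -> ['headlines', 'scores', 'weather']
```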

And speaking of content, when a device knows where your eye naturally shifts, it also has a much better idea of your content preferences. And not just what kind of content you like, but what specifically about each piece you’re most interested in or drawn to. It’s not far-fetched to imagine heat-map models even for motion video, analyzing what caught a viewer’s eye in individual scenes: which characters are testing well, which objects in a crowded frame strike the viewer as most interesting. All of this could sharpen existing personalization engines for apps like Flipboard, putting a finer point on the personalized web.
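A crude version of such a heat map could be as simple as bucketing gaze samples into a coarse grid per scene. The sketch below assumes normalized gaze coordinates from a hypothetical tracker, with invented sample data:

```python
# A minimal sketch of the heat-map idea: count gaze samples per cell of
# a coarse grid over the frame, revealing which regions drew attention.

GRID_W, GRID_H = 8, 5   # coarse attention grid over the video frame

def gaze_heatmap(samples):
    """samples: iterable of (x, y) gaze points, each normalized to [0, 1)."""
    grid = [[0] * GRID_W for _ in range(GRID_H)]
    for x, y in samples:
        grid[int(y * GRID_H)][int(x * GRID_W)] += 1
    return grid

# Three of these four invented samples cluster on one region of the frame,
# say, around a character's face; the heat map makes that cluster obvious.
scene_samples = [(0.7, 0.4), (0.72, 0.42), (0.1, 0.9), (0.71, 0.38)]
for row in gaze_heatmap(scene_samples):
    print(row)
```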

The Marketer’s Feast

Knowing where someone’s looking is a marketer’s dream: it can tell you exactly where to place an ad inside an app, give you remarkable insight into what is and isn’t attracting consumer attention, and, when aggregated with other demographic data, make targeting specific types of buyers that much more effective.
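In its most benign form, that placement decision could be as simple as the sketch below: aggregate gaze counts per ad slot across many users (the slot names and numbers here are invented) and put the ad where attention already is:

```python
# A minimal sketch of attention-aware ad placement: given per-slot gaze
# counts aggregated across users, pick the candidate slot that
# historically draws the most attention.

def best_ad_slot(attention_by_slot, candidate_slots):
    """Return the candidate slot with the highest aggregate attention."""
    return max(candidate_slots, key=lambda slot: attention_by_slot.get(slot, 0))

attention_by_slot = {"top_banner": 1200, "sidebar": 300, "inline": 4800}
print(best_ad_slot(attention_by_slot, ["top_banner", "inline"]))
# -> 'inline'
```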

It’s no secret that companies and advertisers, Google included, have been looking for a way to boost the ROI of mobile ads. Gathering facial feedback data could act like a cheat code that takes marketing to the next level, provided it isn’t wielded like a heavy, blunt club. The potential for abuse is tremendous: imagine ads that migrate to occupy the spots where you find the choicest content in an app, or autoplaying video ads that wait until you’re paying close attention before launching into a sales barrage.

We’ll see both good and bad uses of face- and eye-tracking tech in ads

As with any marketing tool, we’ll see both good and bad uses of face- and eye-tracking tech. But both will have to contend with the privacy tangle this new mobile data source entails.

Is Facial Data Where We Draw The Privacy Line?

Location information has raised a ruckus with consumers on more than a few occasions, and it still isn’t a technology everyone is completely comfortable with. Camera sensor data used to detect eye and face movement crosses a whole new boundary when it comes to personal privacy, and its use will be watched closely by watchdog agencies, concerned users, and likely legislators, too.

The question is whether gathering this type of data raises enough red flags to merit widespread resistance to its use. Location data mostly squeaked by, and it seems to be used by nearly every new mobile app that comes to market, even when there’s no clear benefit to users. Will facial tracking pass the same test? Will claims that the data is anonymized be enough to assuage concerns? I have no immediate answers, but the mobile-first generation seems more willing to share personal data than older users, so it may well get a pass after an initial knee-jerk reaction.

Ultimately, a phone that knows you is better than one that doesn’t, and a phone that can ‘watch’ you will know you better than one that can’t. Expect this tech to take its first few wobbly steps over the next few years before properly finding its footing, but we’ll see plenty of interesting use cases between now and then.