Qualcomm's New Vision-based Augmented Reality Platform Will Knock Your Block Off

Vision-based augmented reality — that is, pulling in data from a device’s camera and using it to position and rotate 3D models drawn on top of an on-screen view of the real world — isn’t really anything new. We started seeing tech demos of the concept 5+ years ago, and games like Sony’s Eye of Judgment have been doing it for nearly as long. More recently, the concept has been moving to mobile phones — a perfect fit, given that the camera and display are built into one unit.

Up until this point, however, the idea has been more or less exclusive to those with gobs of cash or manpower to spare. Anyone who built their own Vision-based AR tech generally kept it pretty close to their chest, so building a Vision-based AR app meant rebuilding things from the ground up.

It looks like the endless reinvention of the same wheel is coming to a close. At its recent Uplinq conference, Qualcomm announced plans to release a free Vision-based AR platform to mobile developers. Why? To sell more phones, of course.

How does a free platform like this sell phones? It all makes sense when you consider the first (and only confirmed) OS they’ll be supporting: Android. You see, all this image parsing and model rendering requires some pretty beefy hardware. Most of the “beefier” Android handsets out there are running Snapdragon processors — which just so happen to be made by Qualcomm. Get developers to build apps on this platform, convince people they need faster phones, sell more Snapdragon-powered phones. More phones sold = more chipsets ordered and more royalties flowing back to Qualcomm.

Before we dive any deeper, check out the demo video:

Here’s how it works: the user obtains a gameboard, whether by printing it at home, pulling it out of a cereal box, or whatever else. This gameboard — or, more accurately, the unique pattern on the gameboard — serves as the marker the AR app looks for. The app identifies the board, calculates its size and orientation on the fly, and then renders 3D objects on top. Rotate around the board, and the objects on screen rotate accordingly; move in closer, and the objects get larger relative to your movements. In time, the platform will grow to support tracking multiple objects at once and identifying non-flat objects.
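For the developers in the audience, here’s a rough sketch of what that kind of marker tracking boils down to under the hood. To be clear, this is a generic illustration built on OpenCV, not Qualcomm’s actual SDK (which hasn’t shipped yet), and the function names and parameters below are made up for the example. The gist: find a known flat target in the camera frame, then work out where the camera sits relative to it so your 3D models can be drawn in the right spot.

```python
# A generic sketch of the "find the gameboard, figure out where it is" step,
# built on OpenCV purely for illustration. This is NOT Qualcomm's SDK (their
# platform handles all of this for you); the names here are made up.

import cv2
import numpy as np


def estimate_board_pose(frame_gray, board_gray, camera_matrix, dist_coeffs, board_size_m):
    """Find the reference board image in a camera frame and return its pose."""
    orb = cv2.ORB_create()

    # Extract distinctive features from the reference board and the live frame.
    kp_board, des_board = orb.detectAndCompute(board_gray, None)
    kp_frame, des_frame = orb.detectAndCompute(frame_gray, None)
    if des_board is None or des_frame is None:
        return None

    # Match features between the two images and keep the strongest matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_board, des_frame), key=lambda m: m.distance)[:50]
    if len(matches) < 8:
        return None  # not enough evidence that the board is actually in view

    board_px = np.float32([kp_board[m.queryIdx].pt for m in matches])
    frame_px = np.float32([kp_frame[m.trainIdx].pt for m in matches])

    # The board is flat, so map its pixels onto a z=0 plane scaled to the
    # board's real-world width and height (in meters).
    h, w = board_gray.shape
    scale = np.float32([board_size_m[0] / w, board_size_m[1] / h])
    board_3d = np.hstack([board_px * scale, np.zeros((len(board_px), 1), np.float32)])

    # Recover the board's rotation and translation relative to the camera.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(board_3d, frame_px, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

Those rotation and translation values are what make the walk-around-the-board trick work: the renderer applies them to its model-view transform every frame, so the virtual objects stay glued to the real-world board.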

Even if you only brainstorm for a few seconds, the idea bin seems pretty limitless. A chess board could have rendered, fully animated chess pieces dancing on top. Medical students could point it at a skeleton, and look at and around each part of the body, layer by layer, before ever touching a cadaver. A laptop repair parts company could show you exactly where your new hard drive should go, with a 3D exploded view.

Qualcomm pulled in toy-maker Mattel as one of its first pre-launch partners, which is how the Rock’em Sock’em Robots demo in the video above came to be. The framerate wasn’t fantastic and there was a bit of obvious control lag — but given that the whole thing was built in a matter of weeks for the sake of the presentation, they get some slack.

More exciting, however, is Qualcomm’s other partner: Unity. Unity is a rather fantastic multi-platform game development tool — in other words, you build a game in their SDK, and it’ll auto-port to iPhone, Xbox 360, PC, and a slew of other platforms with minimal tweakage. They’re launching the Android arm of their product in the coming weeks, and when they do, it’ll support Qualcomm’s AR platform.

Vision-based AR just went from mostly untouchable to something that just about any developer worth their weight in semi-colons should be able to dive into.

Disclosure: I was a moderator on a panel at Qualcomm’s Uplinq conference, where this was announced. I don’t believe that influenced my decision to cover it in any way, but I’m mentioning it because transparency = a good thing.