As an Augmented Reality company, we know ARKit will be huge – here's why

The real impact of Apple's ARKit remains to be seen, but as you can tell from the number of new Augmented Reality demos and samples out there, it has already grabbed the attention of many. As ever, quantity does not always equal quality, but in this case it's a huge step forward that developers all around the world are giving it a try and sharing their results.

Here's one of our first demos made with ARKit – SEAN, our beloved alien from the fully interactive, live-controlled LiveAvatar AR experience, steps off the big screen and straight into our MobileAR system:

Over a live connection we're streaming real-time motion-capture data (with one of our colleagues wearing a mo-cap suit) to the iPad, and taking advantage of ARKit's exceptional environment-mapping capabilities, which allow us to place any 3D character right in front of us, in any real-world space, in real time. Fascinating, isn't it?
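For the curious, here is a minimal sketch of what the receiving end of such a setup could look like on the device. Everything below is illustrative only: the `MocapFrame` struct, the idea of decoding frames from a network stream, and the joint-name lookup are assumptions made for the example, not our actual MobileAR wire protocol.

```swift
import SceneKit

// Hypothetical wire format: one frame of motion-capture data,
// e.g. decoded from JSON arriving over the live connection.
struct MocapFrame: Decodable {
    // Joint name -> rotation quaternion as [x, y, z, w]
    let rotations: [String: [Float]]
}

final class MocapCharacter {
    private let rootNode: SCNNode
    private var joints: [String: SCNNode] = [:]

    init(rootNode: SCNNode) {
        self.rootNode = rootNode
        // Cache the character's skeleton nodes by name for fast lookup.
        rootNode.enumerateHierarchy { node, _ in
            if let name = node.name { self.joints[name] = node }
        }
    }

    // Apply one incoming frame to the rig; called whenever new data arrives.
    func apply(_ frame: MocapFrame) {
        for (jointName, q) in frame.rotations where q.count == 4 {
            joints[jointName]?.orientation = SCNQuaternion(x: q[0], y: q[1], z: q[2], w: q[3])
        }
    }
}
```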

Shameless plug: SEAN on MobileAR using ARKit will debut at the AT&T SHAPE Tech & Entertainment Expo in Los Angeles, July 14-15, where we'll also be exhibiting our upcoming Computer Vision analytics system, Wearable AR object recognition and HoloLens demos. On the first episode of our newly launched Augmented Reality podcast, our CTO, CPO and R&D Director share some really exciting background info and views on ARKit, HoloLens and Computer Vision – enjoy!

We are excited to be part of this year's edition of SHAPE, a Tech and Entertainment Expo sponsored by AT&T (Los Angeles, July 14-15) exploring the convergence of technology and entertainment. The public face of INDE usually shows our Augmented Reality installations for big screens, powered by our BroadcastAR system, and we have been lucky enough to travel the world sharing these experiences with hundreds of thousands of people. That travel has given us a deeper understanding of how and why Augmented and Mixed Reality will disrupt the way people interact with digital content, and that quest for understanding has been shaping the work of our multidisciplinary R&D department. We wanted to take the opportunity to bring a variety of early-stage products and prototypes to SHAPE – such as our Computer Vision analytics system, HoloLens and ARKit demos – with real-life examples of their applications.

Getting back to ARKit – why is it such a big deal? There are a lot of potential answers to that question, but let’s just look at two.

The framework

For someone like me, who used AR back when it meant cubes placed on black-framed "Hiro" markers, the accuracy of ARKit's tracking is stunning. Apple uses a technique called Visual Inertial Odometry, which combines the device's inertial sensors with what the camera sees. You can learn more about the technical details on Apple's dedicated developer page, but let me sum it up in a few words: it gives the device a pretty darn accurate digital awareness of the real world. By understanding the environment and detecting flat surfaces, it lets you place virtual objects accurately around you, blending digital content with the real world. It gets even better with ARKit's light estimation, which measures the overall lighting of the real room and tries to match the virtual lighting to it. You can see the result in the demo video above (and visit our booth at AT&T SHAPE in Los Angeles, July 14-15, to meet SEAN the alien and some amazing 3D dinosaurs!), as well as in the many, many other demo videos out there. My first reaction was similar to when I first tried the HoloLens, or when the Oculus DK2 came out.
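To make that concrete, here is a minimal sketch of what it takes to get world tracking, horizontal plane detection and light estimation running with ARKit and SceneKit. This is not our MobileAR code, just the bare setup; the placeholder box stands in for a real character model.

```swift
import UIKit
import ARKit

final class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
        // Let ARSCNView adjust scene lighting from ARKit's light estimate.
        sceneView.automaticallyUpdatesLighting = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking is the Visual Inertial Odometry described above.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it has detected a new anchor, e.g. a flat surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Drop a 10 cm placeholder box onto the detected plane;
        // a real app would load a 3D character asset here instead.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(x: 0, y: 0.05, z: 0)
        node.addChildNode(box)
    }
}
```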

To be fair, the technology has been around for a while now, just waiting to be perfected by someone, and now it's getting there. Apple – I'm guessing the Metaio guys played a big part in it – is close, and given the amount of resources, marketing and branding going into ARKit, it's no wonder it's having a seemingly bigger impact than other AR frameworks at the moment.

The market

The other aspect of ARKit's impact – and this is probably the biggest effect it will have on the AR landscape – is the sheer number of devices it can potentially run on. When iOS 11 comes out of beta and people start downloading and installing it on their devices, it will open up a big market for AR applications and solutions. We're talking about millions of devices with almost immediate access to AR.

"Wait a minute! Vuforia or Wikitude are already available for all those devices” – one might say. True, but there is no unified marketing and PR push behind them. When a powerful company like Apple gets behind a technology, it will always have more impact, and make growth faster. Suddenly AR is not only advertised to the developers, but it’s pushed to the end users. At least we’re hoping that’ll happen in September, or whenever iOS 11 is out of beta. How it will affect application developers remains to be seen but our hope is that it will lead to many very high quality applications. I’ve said it a few times, repeating it again: AR – as with VR – needs lots of high-quality and useful applications. Yes, we’re working on a few, and hope to be part of many in the future.

Our MobileAR offering will not change at its core, but adding ARKit will allow us to provide higher-quality tracking, which in turn will make the AR experience better. Not having to worry about the accuracy of the tracking will let us focus on the content and the AR experience itself, and experiment with new ways to interact with digital content. We're working towards a non-disruptive digital augmentation of the real world with natural, seamless interaction – with ARKit on iOS; Tango, Vuforia and others on Android; and yes, on wearable devices too. Exciting times.
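As a rough illustration of what "not changing at the core" can mean in practice, the tracking layer can be hidden behind a small interface so that content and interaction code never cares which tracker is underneath. The protocol below is purely hypothetical and not MobileAR's actual architecture; it is just one way to sketch that separation.

```swift
import simd

// Hypothetical abstraction over tracking backends (ARKit, Tango, Vuforia, ...).
// Content and interaction code depends only on this protocol, so swapping the
// tracker does not touch the AR experience itself.
protocol TrackingProvider {
    /// Latest camera pose in world space, or nil while tracking is initializing.
    var cameraTransform: simd_float4x4? { get }
    /// Called whenever a new flat surface (or other anchor) is found.
    var onSurfaceFound: ((simd_float4x4) -> Void)? { get set }
    func start()
    func stop()
}
```

An ARKit-backed implementation would wrap an ARSession; on Android, a Tango- or Vuforia-backed one would wrap those SDKs instead.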

