Friday, September 1, 2017

ARCore, ARKit and HoloLens - How do they compare?

Google released a preview of its Augmented Reality platform ARCore. In this post I provide some thoughts and resources about how it compares to the competition. My earlier post is more about Getting Started.

I've been into 3D and AR development since before it was cool and even did an ARToolkit port for Silverlight back in the day. Of course I also had to give ARCore a try.

The release of ARCore was nothing less than an answer to Apple's ARKit, although most of the ARCore algorithms originate from Google's Tango, which has been in development for years. Both provide a similar feature set with motion tracking, horizontal plane detection and ambient light estimation.
I don't own an Apple development environment, so I can't compare it directly to ARKit, but I tried out some of the experiments myself on a colleague's iPhone a few weeks ago. ARKit's tracking was really stable, and I think ARCore is a bit behind, but it's hard to say without a side-by-side comparison of my own. This video seems to confirm that observation, and it's not surprising considering Apple owns the whole stack and can calibrate its sensors and align the software much better. The next generation of their mobile devices will surely improve it even more, and Apple can move a bit faster here. But will users care about those slight tracking differences? I doubt it.
There's a great in-depth article comparing Apple's ARKit and Google's ARCore which also covers the tech under the hood: How is ARCore better than ARKit.

Wikitude and Vuforia are two larger third-party libraries that have provided SLAM-based marker-less tracking for many platforms for quite a while, long before ARKit / ARCore were out. I'm sure they will continue to innovate ahead, but I fear both will become less and less relevant, since licensing costs and third-party SDK integration are always a big disadvantage. More importantly, Google and Apple can really influence the hardware manufacturers and improve the calibration to reduce tracking errors to a minimum.

I've been a HoloLens developer since day one, and actually even before it was publicly available, so I can compare them to the Microsoft HoloLens, but ARCore/ARKit play in a different league and actually complement it.
The Microsoft HoloLens is a revolutionary device that might be a bit ahead of its time. The HoloLens shows us how mobile devices will work in the future. Its inside-out tracking, running on a custom chip, is still leading and unseen in any other untethered mobile device, and the NUI input is top notch with voice and gestures. The stereoscopic 3D see-through displays of an HMD, which let you see the world around you with your own eyes, provide a much more immersive UX than holding up your phone or tablet in an unnatural way to view the world through a monoscopic camera feed on a 2D screen.
ARCore/ARKit take a different approach and bring AR to the masses on existing hardware, and there's a place for all three. I can easily imagine building cross-platform solutions which target high-end experiences with the HoloLens for a smaller enterprise audience and a counterpart app with Google ARCore / Apple ARKit for the masses. Another benefit of classic mobile AR like ARCore/ARKit is outdoor support and easy access to GPS location data, which I leveraged in this demo.

The news about HoloLens v2 with the new HPU / AI chip is exciting, as it will open up amazing use cases. Microsoft is definitely leading the industry with innovation in the AR/MR space, and no competitor comes close to its bleeding-edge technology. HMDs like the HoloLens are the future of mobile AR/MR, but until HMDs become cheaper and come in form factors everyone can enjoy, ARCore and ARKit are great additions to the AR/MR landscape, providing AR on current-gen mobile hardware, driving mass adoption and helping to mature the product design challenges.

It's a great time to be a 3D developer!

Below are a few more GIFs from my experiments. See them all in full length on my YouTube channel.

At the ARCore of Tango - Getting Started

Google recently released a preview of its Augmented Reality platform called ARCore which is based on the amazing R&D that went into Google Tango.

This post provides a collection of useful information I came across and links to the sources of my own ARCore demos.

Google provides a few options for ARCore development, depending on your favorite development environment: Java, Unreal, Web or Unity3D.
There's even a Xamarin port of the Java sample available, made by my fellow MVP Morten.
The Getting Started with Unity guide is a great how-to that I successfully followed.

You can find some videos of my experiments with Google ARCore on my YouTube channel, like the one in the GIF above, where I combined Unity rigid body physics with a MeshCollider added to ARCore's detected horizontal planes and also used ARCore's ambient light estimation feature.
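The plane-plus-physics idea can be sketched roughly like this. Note this is only a sketch: the exact type and method names (like TrackedPlane and Frame.GetNewPlanes) are assumptions based on the 2017 ARCore Unity preview SDK and changed between releases, so check the SDK you are using.

```csharp
// Sketch: give each newly detected horizontal plane a MeshCollider so
// Unity rigid bodies can land on it. API names are assumptions based on
// the ARCore Unity preview, not a definitive reference.
using System.Collections.Generic;
using UnityEngine;
using GoogleARCore;

public class PhysicsPlaneSpawner : MonoBehaviour
{
    public GameObject planePrefab;  // prefab that visualizes the plane mesh
    private List<TrackedPlane> newPlanes = new List<TrackedPlane>();

    void Update()
    {
        // Ask ARCore for planes detected since the last frame.
        Frame.GetNewPlanes(ref newPlanes);
        foreach (var plane in newPlanes)
        {
            var go = Instantiate(planePrefab, Vector3.zero,
                                 Quaternion.identity, transform);
            // The prefab's visualizer keeps its mesh in sync with the
            // tracked plane; a MeshCollider makes it solid for physics.
            go.AddComponent<MeshCollider>();
        }
    }

    // Drop a physics-enabled sphere that falls onto the detected planes.
    public void SpawnBall(Vector3 position)
    {
        var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        ball.transform.localScale = Vector3.one * 0.1f;
        ball.transform.position = position;
        ball.AddComponent<Rigidbody>(); // gravity pulls it onto the collider
    }
}
```

With the MeshCollider in place, standard Unity rigid body physics takes care of the rest, so spawned objects bounce and settle on the real-world surfaces ARCore found.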

The source code for my Unity3D ARCore experiments is published on GitHub: ARCoreExperiments.

I own a Samsung Galaxy S8 which is fortunately one of the supported Android devices. If you don't own one of those devices you might still be lucky using this hack.

Google also hosts a nice collection of AR Experiments, some of which even include source code.
One of them is Portal Painter by Jane Friedhoff. She also wrote a great blog post about how it was built.

Have fun and build some cool stuff!

Below are a few more GIFs from my experiments. See them all in full length on my YouTube channel.

Wednesday, May 24, 2017

Behind the scenes of HoloLens Tire Explorer

Two weeks ago, my team and I were at Microsoft’s largest developer conference //build 2017, where we talked about Mixed Reality and unveiled our new HoloLens app called Tire Explorer.
I wrote a post for the Valorem blog which provides more technical details about the aquarium booth we had and the three key features of the app. You can read it here.
Below is a video of the experience.

Friday, May 5, 2017

Massive Mixed Reality - Content for the Vision VR/AR Summit 2017 Presentation

At the beginning of this week, I gave my new presentation at the Vision VR/AR Summit in Hollywood, California. I really enjoyed the conference with its nice people, good vibes, great content and interesting new devices.

The title of my talk was "Massive Mixed Reality - Leveraging large 3D models with mobile XR", and it covers different strategies for leveraging existing, large 3D models for rendering on a mobile XR device (VR/AR/MR) like the HoloLens. I also showed a live demo of a model reduction tool and Holographic Remoting in action. The feedback I got was very positive.

The slide deck can be viewed and downloaded here but the main content is in the session itself.
The session was recorded and the video is up on Unity's YouTube channel and embedded below.

Long time no hear - WriteableBitmapEx 1.5.1 is out

Even after all these years, WriteableBitmapEx is still quite popular, especially with WPF developers, and I always incorporate bug fixes and accept pull requests when I find the time.
Many contributions were integrated and lots of bugs fixed. Among those are some nice additions like clipped line drawing and a dotted line renderer.
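For anyone new to the library, here is a minimal WPF sketch of the drawing API. DrawLine is the long-standing core API; the dotted-line overload is one of the newer additions mentioned above, and its exact parameter list may differ between versions, so treat that signature as an assumption.

```csharp
// Minimal WriteableBitmapEx sketch for WPF. The dotted-line overload's
// parameters (dot spacing/length) are an assumed signature - check the
// repository samples for the exact API of your version.
using System.Windows.Media;
using System.Windows.Media.Imaging;

var bmp = BitmapFactory.New(512, 512);   // creates a Pbgra32 WriteableBitmap
using (bmp.GetBitmapContext())           // batches pixel access for speed
{
    bmp.Clear(Colors.White);
    bmp.DrawLine(10, 10, 500, 250, Colors.Black);            // plain line
    bmp.DrawLineDotted(10, 40, 500, 280, 4, 2, Colors.Red);  // dotted line
}
// Assign bmp to an Image.Source to display it in your UI.
```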

WriteableBitmapEx supports a variety of Windows platforms and versions: WPF, Silverlight, Windows 10 Universal Windows Platform (UWP), Windows 8/8.1, Windows Phone WinRT and Windows Phone Silverlight 7/8/8.1.

You can download the latest via the updated NuGet package. The packages contain the WriteableBitmapEx binaries. All samples and the source code can be found in the GitHub repository.

A big thank you to all the contributors, bug reporters and users of the library who help with feedback.