Augmented Exercise: A Meetup Review

By Travis Fischer | Aug 09, 2017

Houston has a vibrant iOS community, complete with a large iOS Developers Meetup. Between work commitments and three kids, it’s been a very long time since I’ve been able to attend one, but I made an extra effort to make the July meetup. I was impressed with how large the crowd was. We filled the Apple Highland Village briefing room:

A panorama image of a large number of people sitting around a long wooden table. Multiple TVs on the wall display a Keynote presentation.

The crowd at the Apple Highland Village briefing room.

I really wanted to hear what Mohammad Azam had to say about ARKit. ChaiOne has been doing our own tests with ARKit and I was interested to see what other local developers had found. Azam gave a good introduction to ARKit and how to get started building augmented reality experiences using SceneKit and SpriteKit, Apple’s 3D and 2D gaming engines.  Azam’s talk generated a good discussion with the group about ways to use the technology. ChaiOnenauts Alex DuBois and Fabian Buentello added to the discussion based on some of ChaiOne’s own learnings.

A man stands at the end of a table holding an iPhone pointed in the air. Behind him on the wall is a TV displaying source code in Swift and a video of the screen of the iPhone showing a grey cube floating above the table.

Azam demoing ARKit. The cube visible on the right side of the TV behind Azam is a virtual object floating above the table.

At one point, the group started discussing how to push ARKit to its limits. The core feature of ARKit is automatic plane detection, where the iPhone’s camera and motion sensors are used to detect a horizontal plane like a floor or table top. Once the plane is found, you can anchor virtual objects to it, whether it’s a 3D shape, map, or other graphic. Someone wondered how big a plane the iPhone could detect and remember. A second person wondered how quickly a plane could be detected, and whether the iPhone would see the road if he held his phone out of a car while driving. Suddenly, I knew we needed to test things.
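
For a sense of how little code that setup takes, here’s a minimal sketch of horizontal plane detection with SceneKit. It assumes an ARSCNView outlet named sceneView, and it isn’t Azam’s exact code, just the standard pattern:

```swift
import ARKit
import SceneKit
import UIKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed outlet name

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self  // receive the plane callbacks below
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the world and look for horizontal planes (floors, table tops).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new plane. Anything added to `node`
    // stays anchored to that real-world surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        node.addChildNode(SCNNode(geometry: box))  // a 10 cm virtual cube on the plane
    }
}
```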

In an effort to be more environmentally conscious and get some extra exercise, I frequently bike home from work. The 11-mile trek gives me a good (albeit exhausting) chance to unwind from a hard work day. The day after the meetup was one of my planned biking days. After lunch, Alex, Intern Zale, and I headed down to the parking lot to test out our theories. We mounted an iPhone 7 running iOS 11 beta 3 to the bike’s handlebars and started experimenting.

A man in jeans and a gray shirt stands in front of a black and blue bike. An iPhone is mounted on the handle bars of the bike.

Prepping the experimental testbed

We were using Apple’s demo ARKit application with debugging information turned on, so the detected plane would be drawn on the screen.
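
If you want similar debug visuals in your own app, ARSCNView has built-in debug options; a short sketch (Apple’s sample additionally draws its own grid for each detected plane):

```swift
// Draw the raw feature points ARKit is tracking, plus the world origin.
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]
sceneView.showsStatistics = true  // frame rate and rendering stats
```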

A screenshot from an iPhone application. The screen shows a parking lot surface made of gray bricks with white lines for parking. A grid of lines is overlaid on top of the parking lot surface. A 3D representation of a candle is sitting at the intersection of two white lines.

A virtual candle placed in an easy-to-compare location.

We used the demo app to place a virtual 3D candle on the ground in the parking lot. While we stood stationary, the app easily detected the plane of the parking lot, as represented by the grid visible in the picture above. Then, I was off!
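
Placing an object like that boils down to a hit test from a screen point onto a detected plane. Roughly, assuming the same sceneView, with a plain cylinder standing in for the demo app’s candle model:

```swift
// Hit-test from a tapped screen point down onto a detected plane and
// drop a node at the intersection point.
func placeCandle(at point: CGPoint) {
    guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }
    let candle = SCNNode(geometry: SCNCylinder(radius: 0.025, height: 0.15))
    let transform = hit.worldTransform  // where the ray met the plane, in world space
    candle.position = SCNVector3(transform.columns.3.x,
                                 transform.columns.3.y,
                                 transform.columns.3.z)
    sceneView.scene.rootNode.addChildNode(candle)
}
```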

An animated gif of a man wearing jeans and a gray shirt riding a bike away from the camera.

Ride like the wind!

Plane detection in ARKit works by detecting blemishes or patterns on horizontal surfaces. This is why it works great on wooden tables with natural grain, but fails on surfaces that are all one color or highly polished. In the parking lot, it could easily detect the surface using the pattern of the bricks. ARKit detects “features”, specific blemishes or patterns in the camera image; the more features it detects, the more of the surface it can map.

When stationary or at very low speeds, we were detecting around 250 features, so we had a well-defined plane. As I accelerated on the bike, the pattern of the parking lot began to blur and the feature count started to drop significantly. At the point in my loop farthest from my start, where I was going the fastest, the feature count finally dropped below 25 and the iPhone lost plane detection. However, as I started to decelerate, the feature count increased again and plane detection kicked back in. Most importantly, the iPhone successfully connected the new plane to the plane from the beginning of my ride. Returning to where I started, the virtual candle was in approximately the same place.
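
We read the feature counts off the debug overlay, but you can also poll them per frame via the session delegate; a sketch (assumes the view controller is set as sceneView.session.delegate):

```swift
// ARSessionDelegate callback, invoked every frame with fresh camera data.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // rawFeaturePoints is nil until ARKit has surface detail to track.
    guard let cloud = frame.rawFeaturePoints else { return }
    print("Tracking \(cloud.points.count) feature points")
}
```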

Two similar images of a screenshot from an iPhone application. The screen shows a parking lot surface made of gray bricks with white lines for parking. A grid of lines is overlaid on top of the parking lot surface. In the left image, a 3D representation of a candle is sitting at the intersection of two white lines. In the right image, the candle is two inches to the left of its location in the left image.

The candle shifted a couple of inches, but was still there after my ride.

It was a fun, silly experiment, but it really demonstrates the potential of ARKit.
