Houston has a vibrant iOS community, complete with a large iOS Developers Meetup. Between work commitments and three kids, it’s been a very long time since I’ve been able to attend one, but I made an extra effort to make the July meetup. I was impressed with how large the crowd was. We filled the Apple Highland Village briefing room:
I really wanted to hear what Mohammad Azam had to say about ARKit. ChaiOne has been doing our own tests with ARKit and I was interested to see what other local developers had found. Azam gave a good introduction to ARKit and how to get started building augmented reality experiences using SceneKit and SpriteKit, Apple’s 3D and 2D gaming engines. Azam’s talk generated a good discussion with the group about ways to use the technology. ChaiOnenauts Alex DuBois and Fabian Buentello added to the discussion based on some of ChaiOne’s own learnings.
At one point, the group started discussing how to push ARKit to its limits. The core feature of ARKit is automatic plane detection, where the iPhone’s camera and light sensors are used to detect a horizontal plane like a floor or table top. Once the plane is found, you can anchor virtual objects to it, whether it’s a 3D shape, map, or other graphic. Someone wondered how big a plane the iPhone could detect and remember. A second person wondered how quickly a plane could be detected, and whether the iPhone could pick up the road if he held his phone out of a moving car. Suddenly, I knew we needed to test this.
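The detect-then-anchor flow described above maps to a small amount of ARKit code. Here is a minimal sketch; the box geometry simply stands in for whatever virtual object you want to place, and is my own illustration rather than code from Apple’s sample:

```swift
import ARKit
import SceneKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to look for horizontal planes (floors, table tops).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new plane and adds an anchor for it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Attach a virtual object to the detected plane. The box here
        // stands in for a 3D model like the candle in Apple's demo app.
        let box = SCNBox(width: 0.05, height: 0.1, length: 0.05, chamferRadius: 0)
        node.addChildNode(SCNNode(geometry: box))
    }
}
```

Because the anchor is tied to the real-world plane, the attached node stays put as the device moves, which is exactly the behavior we set out to stress-test.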
In an effort to be more environmentally conscious and get some extra exercise, I frequently bike home from work. The 11-mile trek gives me a good (albeit exhausting) chance to unwind from a hard work day. The day after the meetup was one of my planned biking days. After lunch, Alex, Intern Zale, and I headed down to the parking lot to test out our theories. We mounted an iPhone 7 running iOS 11 beta 3 to the bike’s handlebars and did some experiments in the parking lot.
We were using Apple’s demo ARKit application with the debugging information on, so the plane detected would be drawn on the screen.
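If memory serves, those overlays come from ARSCNView’s debug options, so enabling the same visualization in your own app is just a matter of setting them before running the session:

```swift
// Draw ARKit's detected feature points and the world origin on screen,
// the same kind of debug overlay visible in Apple's demo app.
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]
```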
We used the demo app to place a virtual 3D candle on the ground in the parking lot. Standing stationary, the app easily detected the plane of the parking lot, represented by the grid visible in the above picture. Then, I was off!
Plane detection in ARKit works by detecting blemishes or patterns on horizontal surfaces. This is why it works great on wooden tables with natural grain, but fails on surfaces that are a single uniform color or highly polished. In the parking lot, it could easily detect the surface using the pattern of the bricks. ARKit calls these blemishes and patterns “features,” and the more features it detects, the more of the surface it can map. When stationary or at very low speeds, we were detecting around 250 features, so we had a well-defined plane. As I accelerated on the bike, the pattern of the parking lot began to blur and the feature count started to drop significantly. At the point in my loop farthest from my start, where I was going the fastest, the feature count finally dropped below 25 and the iPhone lost plane detection. However, as I started to decelerate, the feature count increased again and plane detection kicked back in. Most importantly, the iPhone successfully connected the new plane to the plane from the beginning of my ride. Returning to where I started, the virtual candle was in approximately the same place.
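You can watch the feature count yourself by sampling the raw point cloud ARKit extracts from each camera frame. A small sketch, assuming a running ARSCNView session:

```swift
// Sample the raw feature points ARKit extracted from the current frame.
// The count falls as motion blur washes out the surface detail.
if let frame = sceneView.session.currentFrame,
   let features = frame.rawFeaturePoints {
    print("Tracking \(features.points.count) feature points")
}
```

Polling this in the render loop is how you would reproduce the 250-down-to-25 numbers we saw as the bike sped up.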
It was a fun, silly experiment, but it really demonstrates the potential of ARKit.