Apple, with iOS 11.3, is updating its AR platform — ARKit. As we can read on their website, it adds a significant new feature — detection of vertical surfaces.
“In addition to horizontal surfaces like tables and chairs, ARKit can now recognize and place virtual objects on vertical surfaces like walls and doors (…)”
If this new feature works as well as horizontal surface detection, it could be really SOMETHING BIG!
Imagine all the apps allowing you to decorate your walls without drawing the wall outlines yourself. Wondering which wallpaper to pick? No problem, there would be an app that could show you how your walls would look. It could also improve the user experience in existing AR apps.
Let’s take, for example, the app that we described in our last article — Room Scanner. Currently, you have to move around the room to get an accurate measurement. If ARKit could detect walls, taking a measurement would be as simple as pointing the device at a wall.
Let’s put it to the test!
To test the new feature, I wrote an app that lets me see detected planes, both horizontal and vertical. For each detected plane, it draws a grid that roughly shows how ARKit sees the surface.
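The setup for such a test app boils down to two pieces: a world-tracking configuration with both plane-detection options enabled, and an `ARSCNViewDelegate` that drops a semi-transparent overlay on every plane ARKit reports. Below is a minimal sketch, not the actual app from the video — the class name is my own, and I use a flat tinted plane instead of a textured grid image:

```swift
import UIKit
import ARKit
import SceneKit

final class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // .vertical is the new option introduced in iOS 11.3 (ARKit 1.5)
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        node.addChildNode(makeOverlay(for: planeAnchor))
    }

    // Called when ARKit refines its estimate of an already-detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        node.childNodes.forEach { $0.removeFromParentNode() }
        node.addChildNode(makeOverlay(for: planeAnchor))
    }

    private func makeOverlay(for anchor: ARPlaneAnchor) -> SCNNode {
        // The anchor's extent spans the detected surface in its local X/Z plane.
        let plane = SCNPlane(width: CGFloat(anchor.extent.x),
                             height: CGFloat(anchor.extent.z))
        plane.firstMaterial?.diffuse.contents =
            UIColor.cyan.withAlphaComponent(0.4)
        let overlay = SCNNode(geometry: plane)
        overlay.position = SCNVector3(anchor.center.x, 0, anchor.center.z)
        // SCNPlane stands upright by default; rotate it to lie flat
        // in the anchor's local coordinate space.
        overlay.eulerAngles.x = -.pi / 2
        return overlay
    }
}
```

Note that the same delegate code handles both alignments — ARKit orients the anchor's node for you, so a wall and a floor are rendered by the exact same overlay logic.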
Enough theory. Here’s the video I made so you can see for yourself how well it’s working:
Disappointed? Me too…
Expectations vs (augmented) reality
Reading the note on Apple’s website, most of us expected detection of vertical surfaces to work at least as well as it does for horizontal ones. Unfortunately, that’s not the case. After testing, I came to the conclusion that it works only for surfaces that have distinguishable features — e.g. hanging photos, text, or art. If the surface is plain, like a one-colored wall, or has only a few small features, it’s not detected — which means that most walls are invisible to ARKit. Sad.
* * *
What do you think about the ARKit update? Does it work for you? Share your experience below in the comments!
Featured image by jaycalabresephotography