Scanning rooms with an iPhone

Roomscan featured image

About a month ago I worked on a prototype project which would allow users to measure their rooms using an iPhone. It was a very interesting experience and I would like to share what I have learned. The first question that comes to mind: is it even possible? Yes, it is possible, and it has been done before (see the RoomScan app on the App Store).

As usual with this kind of task, before writing the first line of code I started with some research on the subject. It turns out this topic is quite often raised by a variety of people on different sites, and almost all of them are told that it’s not possible, or very hard to accomplish. Don’t be discouraged by this, just as I wasn’t. I told myself that there is no learning without trying, and picked up the first physics book I was able to find. Yes, physics! You will see why…


Gathering data

As most of you know, the iPhone can detect its movement. What you probably didn’t know is that it has three sensors to do so. The first and most important is the accelerometer: it measures the acceleration affecting the device in 3D space.

iPhone accelerometer axis graph


The second one is the gyroscope, which measures the rate of rotation of the device around each of its three axes.

iPhone Gyroscope


Last but not least is the magnetometer, which detects the direction the phone is facing relative to the real-world magnetic north.

iPhone magnetometer



Calculating distance

The data from those sensors is easily accessible through the Core Motion framework. So where is the problem? Well, first of all, we can get the acceleration, the rotation and the heading, but not the distance. So how do we transform the data we have into the data we need?

At this point, we cannot proceed without some basic physics. What is acceleration, and how do we derive distance from it? Acceleration is a vector (a mathematical entity with a magnitude and a direction in 3D space, given by x, y, z components) which describes how velocity changes over time, as in the formula below:

acceleration = velocity change / time change

If the acceleration is constant, or the time interval is very small, we can say:

velocity = acceleration * time

Now we have the velocity which gets us one step closer to getting the distance. Velocity is also a vector – it describes the rate at which an object changes its position.

velocity = distance traveled / time of travel

Again, if the velocity is constant, or the time of travel is very small, we can say:

distance traveled = velocity * time


If the initial velocity is equal to zero then:

distance traveled = (acceleration * time) * time

And if not then:

distance traveled = initial velocity * time + (acceleration * time) * time

Simple? Maybe… But we have to remember that these stepwise formulas are only approximations: they assume the acceleration is constant within each measurement interval. (For truly constant acceleration the exact kinematic result is distance traveled = initial velocity * time + 1/2 * acceleration * time²; the stepwise sums converge to it as the interval shrinks.) Core Motion’s maximum sampling frequency may vary between devices, but should be in the range of 50–100 measurements per second. It’s not much, but it’s enough to get some usable data.
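The stepwise integration above can be sketched in a few lines of plain Python. This is an illustrative toy, not the original app’s code: the samples and the 100 Hz rate are made up, and real accelerometer data would be noisy.

```python
def integrate(samples, dt):
    """Integrate acceleration samples (m/s^2) taken dt seconds apart
    into velocity (m/s) and distance (m), using the stepwise formulas
    velocity += acceleration * dt and distance += velocity * dt."""
    velocity = 0.0
    distance = 0.0
    for a in samples:
        velocity += a * dt          # velocity = acceleration * time
        distance += velocity * dt   # distance = velocity * time
    return velocity, distance

# One second of constant 1 m/s^2 acceleration, sampled at 100 Hz:
v, d = integrate([1.0] * 100, dt=0.01)
print(round(v, 3), round(d, 3))  # 1.0 0.505
```

Note that the stepwise distance (0.505 m) slightly overshoots the exact result of 1/2 * a * t² = 0.5 m; the error shrinks as the sampling interval gets smaller.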

OK. We have displacement (the vector of distance), but its x, y, z components are expressed in the coordinate system of the device, which changes whenever we rotate the device (the z axis always points out of the screen).

This kind of data is useless unless we transform it into the world’s coordinate system. To do so we need the data from the other two sensors, the gyroscope and the magnetometer. For this kind of transformation we can use a rotation matrix (you can find more information on the subject here). Fortunately, we don’t have to calculate the matrix manually, as it’s already done by the Core Motion framework. Our only job is to use it to transform the initial displacement vector.
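To show what this transformation does, here is a minimal sketch in plain Python. The 90-degree matrix is built by hand for illustration; in a real app the matrix would come from Core Motion (as CMAttitude’s rotationMatrix) rather than being constructed like this.

```python
import math

def rotate(matrix, v):
    """Multiply a 3x3 rotation matrix by a 3-component vector."""
    return [sum(matrix[i][j] * v[j] for j in range(3)) for i in range(3)]

# A device rotated 90 degrees about its z axis (illustrative only):
angle = math.pi / 2
R = [
    [math.cos(angle), -math.sin(angle), 0.0],
    [math.sin(angle),  math.cos(angle), 0.0],
    [0.0,              0.0,             1.0],
]

# A displacement of 1 along the device's x axis maps to the world's y axis:
world = rotate(R, [1.0, 0.0, 0.0])
print([round(c, 6) for c in world])  # [0.0, 1.0, 0.0]
```

The same displacement reported by the device means a different world-space movement depending on how the phone is held, which is exactly why this step is needed.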

At this point, if we were living in a perfect world, we would get a perfect displacement vector, which means we would be able to accurately reproduce the device’s movement. Unfortunately, inaccuracy of motion sensors comes into play.


Boosting data accuracy

Don’t give up yet, as there is hope! If, instead of using the raw data from the sensors, we pass it through a well-chosen filter, we can get some nice results. A low-pass filter smooths out high-frequency noise in our measurements, while a high-pass filter removes slowly changing bias, such as the constant pull of gravity.


Accelerometer data diagram with and without low pass filter; Source.

With filtered data we get much better results: not perfect, but usable. To get the best results I tried a lot of different approaches. In the end I chose LERP (a form of linear interpolation which takes the previous data point into account when calculating the next one) for its simplicity and rather fine results.

xValue = oldXValue * lerpFactor + newXValue * (1 - lerpFactor)
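The formula above, applied sample after sample, acts as a simple low-pass filter. A minimal sketch in plain Python (the 0.9 factor is an illustrative choice, not the value used in the original app):

```python
def lerp_filter(samples, lerp_factor=0.9):
    """Blend each new raw sample with the previous filtered value.
    A lerp_factor close to 1 means heavy smoothing."""
    filtered = []
    old = samples[0]  # seed with the first raw sample
    for new in samples:
        old = old * lerp_factor + new * (1 - lerp_factor)
        filtered.append(old)
    return filtered

# A noisy spike in otherwise flat data is heavily damped:
print([round(x, 3) for x in lerp_filter([0.0, 0.0, 10.0, 0.0, 0.0])])
# [0.0, 0.0, 1.0, 0.9, 0.81]
```

The spike of 10 is reduced to 1 and then decays, which is the smoothing effect visible in the diagram above; the price is a slight lag behind the true signal.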

After about a week of research I was able to achieve results with about 20–40 cm of measurement error, which in my opinion is bearable. With a better filter the measurement error could be decreased further by a meaningful amount. Here are some screenshots from my app; the curved lines are paths I walked with the app turned on.

Basic room scanning app result

I walked through my room from corner to corner, touching each one (green points). It’s clearly visible that I didn’t walk in straight lines (because I had to bypass furniture). I touched the last two corners twice in the same spot, but the green points are misplaced. This happened because of the accumulated sensor inaccuracy.



So is indoor distance measurement usable for real-life applications, or is it just a meaningless feature? It depends on how you would like to use it and what level of accuracy you require. I’ll leave it for everyone to decide on their own.

PS. I was working on an iPhone 5. The iPhone 6 and 6 Plus have a new generation of motion sensors. I can’t say without testing, but it’s possible that distance can be measured with much better accuracy on the newer devices.

  • Patrick


    Thanks for your article.

    Could you please share the source code ?


    • Wojtek

      Hello Patrick,

We could do it under the General Public License. In that case, since we’re curious, would you mind sharing what you would use it for?

      Cheers and stay in touch!

      • Patrick

Yes, sure, there is no problem for me. I want to make a pedometer with only the sensors on the iPhone, to avoid using CMPedometer. I am not sure I understand the formula you use for estimating the distance, which is why I would like to have the source code (or just the part which performs the estimation).

      • louismullie

        Hello Wojtek,

        We are interested in this source code for an application to measure gait speed in elderly patients. The GPL license would work fine for our purposes. Are there any plans to put it up on Github?

      • Eli Levi


I’m really curious about this technology and I’m planning to use this feature in my app. I wonder what the situation is with making it public?


  • John


    Thanks for the walkthrough.
    Any chance you can share any code with us?
    Particularly the filter implementation.


  • Nicolas

    Hello Przemek

    I’m very impressed by your results !
    I’m currently working on a university project and trying to track a skier’s turns using an iPhone 6 (and its accelerometer and gyroscope). However, with increasing time, my calculated position drifts quite far away.

    I’m also using a lowpass filter combined with a median filter and then rotate the acceleration vectors using a quaternion (as opposed to your rotation matrix).

    Would you mind sharing your source code (under the General Public License)? I’m sure I could learn a lot from it to improve my own project :)

    Thank you,

  • Phillip Schmidt

    Thanks for linking to my blog with the accelerometer filtering graph.

I’m curious: if you leave your app running for 5 minutes in a single location, how far does it indicate the unit has wandered?

  • Iv

    distance traveled = (acceleration * time) * time / 2

  • Andrew Ho

I am working on a similar concept but a different application. It seems to me the mentioned LERP function is taking the average of 2 points if the LERP factor is 0.5. I am applying double integration, a high-pass filter, and averaging over more points to smooth out the data. The time interval between data points can’t be 0 due to the device’s limitations, so I make some system calls and subtract the timestamps to get the interval. The error so far is around 3-5 cm for a distance of 60 cm, and the error is unstable. Not sure if I can do anything better about the error.

    • Dor

      What device are you using to measure the acceleration?

  • Nadia Yudina

    Hi! I am working on the similar app, and I would really appreciate if you could share your code (or parts of your code). I work on the AR app which will allow to explore objects by moving the phone around them. It is non-commercial app. I am making it for my Mobile Applications class.

  • Pantelis Zirinis


    Thank you for the article.

I know that it has been a long time since the post was written. Can you please share the code? I am working on a kitesurfing application which will measure jump height.

  • code-matt

    Could anyone share a bit more about how to transform the data with the gyro so that the movement is always real world accurate no matter what orientation the device is in ?

  • jvogel

Hello Przemek, I’m starting to develop an indoor app, so I did some searching on dead reckoning and indoor positioning, and I have not found a lot of things except your article, and your results seem pretty promising :o But to avoid wasting a few days on a useless POC just to test, can you please share your sample code?
Even if not, thank you for this article and the formulas, good job!

  • Narender Kumar

    Hello Przemek,
Thanks for providing this logic and concept, very nice! But I would like more help, so could you please update this article with more logic, formulas and code? I would be very grateful.
Thank you
    Thank you