Android handsets contain a collection of tiny sensors which can tell your app which way up the phone is, how far it's tilted, whether it's moving and so on. This opens up a whole new world to developers, be it in utility apps, navigation, games or whatever they can dream up. Accessing these readings in code isn't always straightforward, and if you don't have a physical handset available it might seem impossible. Not so: in this tutorial we show how to make use of the values your app can listen to, and go through the process of building the necessary hooks in your code to test it without a real phone.
This app draws a realtime image showing the direction the handset is tilted in, as well as the "steepness" of that tilt. Here's what's on the screen when you tilt the phone down towards the top right corner:
Sensors working overtime
The collective name for the various signals Android can read is "sensors". This ability is one of the defining characteristics of these next-gen handsets, or "smartphones on steroids", to use a phrase recently seen on the net. The classes of readings available cover the handset's orientation (which way up it is, which way it is tilting and so on), how fast it is moving, and the magnetic field around it. In code these correspond to the ORIENTATION, ACCELEROMETER and MAGNETIC_FIELD sensor types. In general terms, you write your app so that at startup it tells Android you wish to be notified of these signals throughout its runtime. You then write handlers which Android calls with the new values, interrupting your main program flow, so you can make use of the readings and then continue running.
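As a minimal sketch of that register-then-handle flow, here is an Activity using the orientation sensor via the old SensorListener API of this era (deprecated in later Android releases); the class name TiltActivity is our own, and the rate SENSOR_DELAY_GAME is just one reasonable choice:

```java
import android.app.Activity;
import android.hardware.SensorListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class TiltActivity extends Activity implements SensorListener {

    private SensorManager sensorManager;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Tell Android we want orientation readings for as long as we run
        sensorManager.registerListener(this,
                SensorManager.SENSOR_ORIENTATION,
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        // Always unregister, or the sensor keeps running and drains the battery
        sensorManager.unregisterListener(this);
        super.onPause();
    }

    // Called by Android each time new readings arrive:
    // values[0] = azimuth, values[1] = pitch, values[2] = roll
    public void onSensorChanged(int sensor, float[] values) {
        if (sensor == SensorManager.SENSOR_ORIENTATION) {
            float azimuth = values[0];
            float pitch = values[1];
            float roll = values[2];
            // use the readings here, then return so the app carries on
        }
    }

    public void onAccuracyChanged(int sensor, int accuracy) {
        // nothing to do in this sketch
    }
}
```

Registering in onResume and unregistering in onPause keeps the sensor active only while our screen is in front of the user.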
Three values are sent from the orientation sensor: azimuth, pitch and roll. If you imagine you are holding the handset flat in your hand and want to describe accurately how you are tilting it, these are the values you would use.
Azimuth is rotation around the z axis, so in this example the handset stays flat but spins round on the palm of your hand. The amount of spin is given in degrees: 0 is North, turning through 90 at East and right round back to North again at 360, which is of course 0, as you are back where you started.
Pitch describes the tilt along the x axis, measured from -180 to +180. The x axis is the line which runs along your thumb when you hold your hand out in front of you with your thumb stretched out as far as it will go and your fingers straight forward.
Roll, which again ranges from -180 to +180, describes how far the handset is tilted along the y axis. The y axis runs along your fingers when you hold your hand straight out in front of you. So if you do this and move your hand so that the tips of your fingers go up and down (keeping them straight), you are changing the roll.
Our task is to get these readings into our code in a form we can use. The form they take depends on the purpose, but a common need is simply to know whether the handset is tilted, and if so how far. Another way of putting that is to say which corner of the handset is closest to the floor, and by how much. If you have these two values, you can write your code entirely in those terms alone.
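For instance, which corner is nearest the floor follows directly from the signs of pitch and roll. A rough plain-Java sketch; the sign conventions assumed in the comments are ours, and on a real handset you would verify them against the actual readings:

```java
public class CornerFinder {

    // Assumed convention: negative pitch means the top edge is dipping,
    // positive roll means the right edge is dipping. Check on your device.
    public static String lowestCorner(double pitch, double roll) {
        String vertical = (pitch < 0) ? "top" : "bottom";
        String horizontal = (roll > 0) ? "right" : "left";
        return vertical + "-" + horizontal;
    }

    public static void main(String[] args) {
        // Tilted down towards the top right corner
        System.out.println(lowestCorner(-20, 15)); // prints top-right
    }
}
```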
You may have noticed there was a lot of talk about axes and different parameters when describing the orientation sensor readings just then, and that's because they have been written in mathematical terms as co-ordinates. The precise term is rectangular co-ordinates, as they define points on a graph relative to a fixed origin and axes. There is an alternative way to describe the same points just as precisely: polar co-ordinates. Instead of saying "go so far along the x axis, then so far along the y axis", as with the rectangular system, you say "go a distance of n at an angle of so many degrees". Sound familiar? That's very much like the reading we want. Therefore, we are going to need some form of rectangular-to-polar co-ordinate conversion.
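That conversion can be sketched with the standard Math.atan2 and Math.hypot calls. The method name tiltToPolar and the choice of zero direction (straight towards the top of the handset) are our own assumptions, not anything fixed by Android:

```java
public class TiltMath {

    /**
     * Converts pitch and roll (rectangular form, in degrees) into polar form:
     * result[0] = direction of tilt in degrees, 0..360,
     * result[1] = steepness of the combined tilt in degrees.
     */
    public static double[] tiltToPolar(double pitch, double roll) {
        // atan2 gives the angle of the point (pitch, roll) in radians
        double direction = Math.toDegrees(Math.atan2(roll, pitch));
        if (direction < 0) {
            direction += 360; // normalise into the 0..360 range
        }
        // hypot gives the distance from the origin: how steep the tilt is
        double steepness = Math.hypot(pitch, roll);
        return new double[] { direction, steepness };
    }

    public static void main(String[] args) {
        // 30 degrees of pitch only: the whole tilt is in one direction
        double[] p = tiltToPolar(30, 0);
        System.out.println(p[0] + " " + p[1]); // prints 0.0 30.0
    }
}
```

The distance part of the polar pair is the "steepness" we want, and the angle part points at the sinking corner.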
Portions are modifications based on work created and shared by Google and used according to terms described in the Creative Commons 3.0 Attribution License. Android Academy is independent from Google. All trademarks acknowledged.