Android OS brought a cutting-edge approach to mobile development: handle differences in device hardware by standardizing pixel densities at the application layer. If you’ve ever worked on BlackBerry devices, you know how much of a problem it can be without such a concept in a mobile OS. Apple avoided this problem altogether by only releasing one phone model at a time (as opposed to opening up the platform to any manufacturer).
The idea is this: developers and graphic designers can rely on defining graphics for Android using one of four standard pixel densities:
- low density (LDPI): 120 DPI
- medium density (MDPI): 160 DPI
- high density (HDPI): 240 DPI
- extra high density (XHDPI): 320 DPI
This means that regardless of which device you target, you can start graphic asset design on Android by specifying asset sizes in inches. From there, you can figure out how many pixels each graphic asset needs for the screen density of the phone you’re targeting (note: density is not the same as screen size – that’s another discussion).
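The arithmetic is simple: pixels = inches × DPI. Here is a minimal sketch of that calculation across the four standard density buckets; the class and method names are illustrative, not part of any Android API:

```java
// Sketch: how many pixels a graphic asset needs at each standard
// Android density bucket, given a physical size in inches.
public class AssetSizing {
    // The four standard density buckets, in dots per inch.
    static final int LDPI = 120, MDPI = 160, HDPI = 240, XHDPI = 320;

    // pixels = inches * dots-per-inch, rounded to the nearest pixel
    static int pixelsForInches(double inches, int dpi) {
        return (int) Math.round(inches * dpi);
    }

    public static void main(String[] args) {
        // A half-inch icon needs a different pixel count at each density.
        for (int dpi : new int[] { LDPI, MDPI, HDPI, XHDPI }) {
            System.out.println(dpi + " dpi: " + pixelsForInches(0.5, dpi) + " px");
        }
    }
}
```

So a half-inch icon works out to 60, 80, 120, and 160 pixels for LDPI through XHDPI respectively – one source dimension, four exported assets.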
This is particularly important because, as every mobile graphic designer ultimately realizes, using native pixel sizes in your graphics specifications produces wildly varying results when the assets are displayed. A graphic of 160 × 160 native pixels will look one way on a particular screen, bigger on a lower-density screen, and smaller on a higher-density screen. Throw into the mix the fact that many phones’ physical screen sizes differ by fractions of an inch, and you quickly see your nice screen designs morph into a nightmare.
But for all the good standardized screen densities do, OEMs still don’t follow the letter of the law when it comes to API behavior. Dianne Hackborn recently explained this in a posting over at the Google Android Developer group:
In short, she explains that many Android phones don’t accurately report the number of pixels per inch. She goes on to explain that a device is more likely to have a reported screen density that is higher than its real DPI measurement rather than lower.
Here is her first posting on the matter (Jan 12, 2011 – see link above):
The density and densityDpi is an abstract density bucket the device manufacturer has decided makes sense for their UI to run in. This is what is used to evaluate things like “dp” units and select and scale bitmaps from resources.
The xdpi and ydpi are supposed to be the real DPI of the screen… though as you’ve seen, many devices don’t set it correctly. 😦 This is our fault, it isn’t actually used anywhere in the platform, so people don’t realize they have a bad value, and we haven’t had a CTS test to try to make sure it is sane (it’s not clear how that test should work). Worse, we shipped the original Droid with a graphics driver that reports the wrong value here… in fact it reported that same damnable 96.
Unfortunately, I don’t have a good solution if you want to get the real exact screen dots per inch. One thing you could do is compare xdpi/ydpi with densityDpi and if they are significantly far apart, assume the values are bad and just fall back on densityDpi as an approximation. Be careful on this, because a correctly working device may have densityDpi fairly different from the real dpi — for example the Samsung TAB uses high density even though its screen’s real density is a fair amount lower than 240.
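Her suggested heuristic – distrust xdpi/ydpi when it is far from densityDpi – can be sketched as a small pure function. The 0.5×/2× ratio threshold below is my own illustrative choice for “significantly far apart”; the posting doesn’t give a specific value:

```java
// Sketch of the fallback heuristic from Hackborn's posting: if the
// driver-reported physical dpi looks implausible next to densityDpi,
// fall back on densityDpi as an approximation.
public class DpiSanity {
    static float bestGuessDpi(float reportedXdpi, int densityDpi) {
        float ratio = reportedXdpi / densityDpi;
        // "Significantly far apart" -- threshold is an assumption here.
        if (ratio < 0.5f || ratio > 2.0f) {
            return densityDpi;   // reported value looks bogus; fall back
        }
        return reportedXdpi;     // plausible; trust the reported value
    }
}
```

On a real device you would feed this from DisplayMetrics (xdpi and densityDpi). Note her warning still applies: a correctly working device like the Galaxy Tab can legitimately have densityDpi well above its physical DPI, so this guess can be wrong by design.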
Android framework engineer
And her second posting in the same thread, in which we see a concrete example from another poster (his quotes marked with ‘> ‘):
Well a device is more likely to have a density that is higher than its real dpi rather than lower; making it higher makes the UI larger and thus more readable and usable, while making it smaller quickly makes the UI so small that it doesn’t work. I wouldn’t want to go any lower than what you’d see on say the G1, which is 180dpi and uses the mdpi (160) density. At any rate, the whole point of density is that it is not directly tied to the real screen dpi. It is quantized, so that there are a limited number of densities applications need to deal with. Manufacturers have some flexibility in picking it based on the feel they want for their device. In the case of Samsung, they wanted a larger more easily touched UI. Others may want the same thing, for example to make it easier for people who have poor eyesight to use their device or whatever other reason.
> I’ve [another poster] also just looked at my AC100 (Tegra 250), and it reports xdpi =
> 160, yet the correct value should be about 120.
> As you say, this seems to be completely broken, and the Tegra example
> shows that 96 cannot be used as a sentinel for a wrong value.
Well fortunately that one isn’t a compatible device so won’t have Market. :} Out of curiosity, what are you trying to do? I know you said you are trying to display a ruler, but what exactly do you want it to be used for? Just for people to hold stuff up to it to measure?
One of the reasons the devices haven’t been good in this regard is because after introducing these APIs, we have never actually found a single place in the standard UI where they should be used, so nobody realizes they are shipping with bad values. *sigh*
Android framework engineer
What does this mean for those of us working on Android phones? Even if you adhere to the standardized pixel dimensions identified in the Google documentation (see http://developer.android.com/guide/practices/screens_support.html), you’re likely to see small variances in graphic asset sizes when they are displayed.
This makes pixel-perfect alignment of screenshots during the Quality Assurance (QA) phase of application development difficult. While most implementations of the Android OS will display the same graphic at exactly the same size, there’s always a possibility that some won’t.
The solution I advocate: when doing quality checks, always allow for this variance from phone to phone. Don’t waste time trying to make things 100% perfect when it comes to asset sizes.
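One way to make “allow for this variance” concrete in QA is to compare expected and rendered asset sizes against a tolerance rather than demanding exact equality. This is a sketch of that idea; the helper name and the 5% tolerance in the usage below are illustrative choices, not a standard:

```java
// Sketch: a QA check that accepts small per-device variance in rendered
// asset size instead of requiring pixel-perfect equality.
public class SizeCheck {
    // tolerance is a fraction, e.g. 0.05 for 5%
    static boolean withinTolerance(int expectedPx, int actualPx, double tolerance) {
        return Math.abs(actualPx - expectedPx) <= expectedPx * tolerance;
    }
}
```

With a 5% tolerance, an asset specified at 160 px that renders at 164 px passes, while one that renders at 200 px is flagged – enough to catch real layout bugs without chasing device-to-device density quirks.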