I'm not sure if you're a developer, but in any case: if the developer doesn't know the specs of the hardware the app is going to run on, it makes the job a lot harder.
For example, if you're developing for iOS you know that most users will be on the latest OS (iOS 6.x), and the screen size will be either 3.5 or 4 inches for phones and 7.9 or 9.7 inches for tablets. (I've actually noticed that on Android, even though it tries its best, the UI scaling has blemishes that are noticeable from device to device.) If you're developing for Android, you have no idea what OS version the user will be running (possibly the latest? you just won't know), and even worse, you don't know the hardware. You could have a device with a 6-inch screen and 2GB of RAM, which will run an app smoothly, but you could also have a device with a 4-inch screen and 500MB of RAM, which just won't have enough memory or processing power for what you need and will crash. (Bad example, but you get where I'm coming from.)
I found this not too long ago - it's a statement from Sky about why Sky Go doesn't support a lot of Android devices: "We have two equally resourced teams that work on app development for Sky Go, one for Apple development and one for Android. However, due to the nature of the Android platform -- in terms of both the variety of operating systems and the sheer number of devices -- the reality is that developing for Android throws up a number of additional challenges when compared to working on iOS devices."
Android will always be fragmented, but there are ways to limit the fragmentation. If Google set a minimum spec for hardware (e.g. 800MB of RAM, an 800MHz CPU, etc.), then the OS would be "less" fragmented, but until that happens, the variety of devices will keep making developers' lives a nightmare.
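To make that concrete, here's a minimal sketch of the kind of minimum-spec gate a developer ends up writing themselves in the absence of a platform-enforced floor. It's plain Java with a made-up `DeviceSpecs` container (not a real Android API), and the 800MB / 800MHz thresholds are just the hypothetical figures from above:

```java
public class MinSpecCheck {

    // Illustrative container for the specs we care about; on a real device
    // you'd populate this from platform APIs, but the class itself is made up.
    static class DeviceSpecs {
        final int ramMb;
        final int cpuMhz;

        DeviceSpecs(int ramMb, int cpuMhz) {
            this.ramMb = ramMb;
            this.cpuMhz = cpuMhz;
        }
    }

    // Returns true if the device meets the hypothetical minimum spec
    // (800MB of RAM and an 800MHz CPU, as suggested above).
    static boolean meetsMinimumSpec(DeviceSpecs specs) {
        return specs.ramMb >= 800 && specs.cpuMhz >= 800;
    }

    public static void main(String[] args) {
        // The two hypothetical devices from the earlier example.
        DeviceSpecs highEnd = new DeviceSpecs(2048, 1500); // 6-inch, 2GB device
        DeviceSpecs lowEnd  = new DeviceSpecs(500, 600);   // 4-inch, 500MB device

        System.out.println(meetsMinimumSpec(highEnd)); // true
        System.out.println(meetsMinimumSpec(lowEnd));  // false
    }
}
```

The point isn't the code itself, which is trivial; it's that every app team has to invent and maintain its own version of this check, whereas a platform-wide minimum would make it unnecessary.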
I have nothing against Android - it's good that iOS has competition, and it runs on many, many devices - but there are still downsides to both iOS and Android.