The trick is that iOS runs both the driver that polls the touch sensor and the driver for the graphics rendering pipeline that displays the result of the touch at a higher priority, and/or more often, than the Android kernel does.
There's not much the touchscreen vendor or the app developer can do if the OS thinks it should be busy doing other stuff.
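To illustrate how limited the app side is: on Android, an app can only *hint* at how urgently its input-handling thread should run; the kernel scheduler still arbitrates against everything else on the system. A minimal sketch (the thread name and the polling loop are hypothetical, only the priority call is a real API):

```kotlin
import android.os.Process

// Hypothetical input-handling thread. The app can request the priority
// normally given to threads feeding the display pipeline, but this is a
// hint to the kernel scheduler, not a guarantee of when the thread runs.
fun startTouchThread(): Thread {
    val touchThread = Thread {
        Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_DISPLAY)

        while (!Thread.currentThread().isInterrupted) {
            // ...react to incoming touch events here (hypothetical workload)...
        }
    }
    touchThread.name = "touch-handler"  // hypothetical name
    touchThread.start()
    return touchThread
}
```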
But it's more than that, I believe...
In the OS: it's more than priority. Apple also has better algorithms for predicting what the user is trying to do, so the UI can start responding to a touch or swipe before the gesture finishes.
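One common form such prediction takes (my illustration, not Apple's actual algorithm) is extrapolating the recent touch trajectory so the renderer can anticipate where the finger will be a frame from now. A toy sketch, assuming simple linear extrapolation from the last two samples:

```kotlin
// A touch sample: screen position plus timestamp in milliseconds.
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

// Naive linear extrapolation: estimate where the touch will be `aheadMs`
// milliseconds after the last sample, based on velocity over the last two.
// Real predictors are far more sophisticated (filtering, curvature, ML).
fun predictTouch(samples: List<TouchSample>, aheadMs: Long): TouchSample? {
    if (samples.size < 2) return null
    val (prev, last) = samples.takeLast(2)
    val dt = (last.timeMs - prev.timeMs).coerceAtLeast(1L)
    val vx = (last.x - prev.x) / dt   // px per ms
    val vy = (last.y - prev.y) / dt
    return TouchSample(
        x = last.x + vx * aheadMs,
        y = last.y + vy * aheadMs,
        timeMs = last.timeMs + aheadMs
    )
}

fun main() {
    val samples = listOf(
        TouchSample(100f, 200f, 0),
        TouchSample(110f, 208f, 8),   // ~120 Hz touch sampling
    )
    // Guess where the finger will be one 60 Hz frame (~16 ms) later.
    println(predictTouch(samples, aheadMs = 16))
}
```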
In the screen: I don't think all capacitive touch screens are the same, regardless of vendor. Hasn't Apple spent time and resources studying the materials and the lamination of layers in their screens, and where and how to place the sensors in relation to the surface of the glass? Apple isn't just asking for an off-the-shelf capacitive screen -- they're providing the blueprint.