No, they won't. Unless Apple achieves some tech that disproves both optical law and quantum physics, they simply can't add more meaningful pixels (meaning pixels that capture real data rather than noise or artifacts).
They can't use a larger sensor, because a larger sensor would mean a longer lens for the same angle of view (optical laws), and it would stick out even more from the back of the phone.
With a fixed-size sensor, extra pixels mean extra noise in low light (quantum physics). I'm not talking about noise due to electronics, which you can improve, but noise due to the fact that light is made of photons and thus of discrete amounts of energy. You can always run algorithms to reduce noise, but they can't work miracles and create information that isn't there.
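To put rough numbers on it (my own illustrative photon counts, nothing measured): photon arrival is a Poisson process, so shot-noise SNR scales with the square root of the photon count, and splitting a fixed sensor area into more pixels means fewer photons per pixel.

```python
import math

def shot_noise_snr(photons):
    """SNR from shot noise alone: signal is N photons, noise is sqrt(N)."""
    return photons / math.sqrt(photons)  # = sqrt(N)

# Same sensor area, same light: doubling the pixel count halves
# the photons each pixel collects, so per-pixel SNR drops by sqrt(2).
big_pixel = shot_noise_snr(10000)   # hypothetical large pixel: 10k photons
small_pixel = shot_noise_snr(5000)  # twice the pixels, half the photons each
print(big_pixel, small_pixel)  # 100.0 vs ~70.7
```

No amount of better electronics removes this floor; it's a property of the light itself.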
Moreover, a fixed-size sensor means the resolution floor set by diffraction stays the same. On my D800, with 4.89 micron pixels on a 24x36 sensor, I'm diffraction limited around f/11: the Airy disk becomes bigger than 2-3 pixels, and that really destroys resolution (and can potentially cause color artifacts with the Bayer filter).
If you take a point & shoot like the PowerShot G9, a 12 MP camera with 1.9 micron pixels, the Airy disk becomes too big around f/4 - meaning that if you want to actually use the 12 MP, you have to shoot at f/4 or wider.
Now, the iPhone 6+ has 1.5 micron pixels and a sensor smaller than the G9's. You hit the limit very fast, probably around f/2.8. So the resolution is already limited by optical laws; adding more pixels would make the camera diffraction limited even at full aperture.
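You can check those apertures yourself with the usual Airy-disk formula, d = 2.44 * lambda * N, assuming green light at 550 nm (the pixel pitches are the ones quoted above):

```python
WAVELENGTH_UM = 0.55  # green light, in microns (assumption for the estimate)

def airy_disk_diameter(f_number):
    """Diameter (to the first minimum) of the Airy disk, in microns."""
    return 2.44 * WAVELENGTH_UM * f_number

# (camera, pixel pitch in microns, f-number where diffraction bites)
cameras = [("D800", 4.89, 11), ("G9", 1.9, 4), ("iPhone 6+", 1.5, 2.8)]
for name, pitch, f in cameras:
    d = airy_disk_diameter(f)
    print(f"{name}: Airy disk {d:.1f} um = {d / pitch:.1f} pixels")
```

In each case the disk spans roughly 2.5-3 pixels at the quoted aperture, which is exactly the "bigger than 2-3 pixels" threshold where resolution starts to suffer.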
And coming up with a better lens won't help either. This is caused by optical laws; the limit exists even with an ideal lens.
The only room for improvement would be a bigger sensor and a lens with a wider aperture. But both mean a thicker camera unit that sticks out even more. And the latter causes problems for the AF system, which must be very precise or it will limit resolution too.
You're kidding? The iPhone 6 has one of the best cameras on the market, and it has been reviewed as such by many photography sites. Both DxO and DPReview gave it excellent reviews. It's way better than anything in the Android world.
For instance:
http://connect.dpreview.com/post/6303555427/apple-iphone-6-plus-camera-review?page=6
But it's a smartphone. There is no way it can approach, even under ideal conditions, the quality you get from a DSLR...
It's science and technology, not magic. A tiny sensor will always be plagued by noise, even when conditions are perfect. For instance, look at blue skies in broad daylight in photos from smartphones: you will very often see noise there. Because the sensor is tiny, a blue sky is effectively low light as far as the pixels under the red and green patches of the Bayer filter are concerned. Clever algorithms can hide that by recognizing it's a sky, but you will have this problem every time you shoot something monochromatic.
The best the algorithm can do is not remove all the noise: focus on the ugly chroma noise, which is easy to remove with little impact on quality, and leave most of the luma noise in (because removing too much of it gives a plastic look and destroys details). And that's exactly the clever choice Apple made. You can always apply more noise reduction with software or an app, but you can never recover details lost to aggressive noise reduction (as on some Android phones).
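The chroma-first idea is easy to sketch (a toy illustration, not Apple's actual pipeline): separate the image into luma and chroma, smooth only the chroma channels, and leave luma alone. The conversion below uses the standard BT.601 coefficients; the noisy "sky" patch is synthetic.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """BT.601 RGB -> (Y, Cb, Cr), zero-centered chroma, float arrays."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def box_blur(ch, k=5):
    """Plain k x k mean filter with edge padding."""
    pad = k // 2
    p = np.pad(ch, pad, mode="edge")
    out = np.zeros_like(ch)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + ch.shape[0], dx:dx + ch.shape[1]]
    return out / (k * k)

# A flat blue "sky" patch with per-channel sensor noise added.
rng = np.random.default_rng(0)
sky = np.full((64, 64, 3), (80.0, 120.0, 200.0)) + rng.normal(0, 10, (64, 64, 3))

y, cb, cr = rgb_to_ycbcr(sky)
cb_clean, cr_clean = box_blur(cb), box_blur(cr)  # luma y is kept untouched
print(f"chroma noise std before/after: {cb.std():.2f} / {cb_clean.std():.2f}")
```

Because the patch is monochromatic, the chroma channels carry no detail to lose, so blurring them kills the color blotches cheaply, while the untouched luma keeps whatever texture (and luma grain) was there.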