I think this also answers one persistent question that keeps coming up: "is there any future in DX?" Yes, there is. The reason Sony would drop FF is rooted in economics: the dollar amounts don't really change with sensors the way they do with most semiconductor chips. With CPUs, for instance, cost gets driven out by making the chip smaller and using smaller transistors (process size). That doesn't really apply to sensors, as the sensor size itself isn't changing, and the light detection mechanism doesn't particularly benefit from smaller transistors (the supporting circuitry does). It's just very, very costly to make FF/FX sensors. That puts them into the prosumer and pro categories only, which means there's not a lot of volume. Not having great volume makes it more difficult to reduce costs, and the cycle just repeats.
As I've written before, if an FX sensor costs US$500 then a DX sensor probably costs US$50. And a cellphone sensor these days costs less than US$5 (including lens in many cases). The two ways to get lower sensor costs (other than size) are to increase the wafer size (200mm -> 300mm -> 450mm) or to somehow increase the yield. But both of these tend to produce small gains at a time and give proportional benefits at every sensor size (e.g., if you could reduce FX costs to US$400, then a DX sensor is going to cost US$40). Moving to a smaller process (e.g., going from the current 65 nanometer sizes to 45 or 28 or even smaller) doesn't reduce FX sensor cost, but it might benefit noise handling on smaller sensors. This may be why Sony appears about to concentrate solely on DX-sized sensors at the big end: they may see that they can get to FX-type performance with DX-sized sensors, in which case the cost benefit of doing so is huge.
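To see where that roughly 10x cost gap comes from, here's a minimal sketch of the die-cost arithmetic. Everything in it is an illustrative assumption: the US$5,000 per-wafer cost, the 0.3 defects/cm2 defect density, and the rounded die dimensions are placeholders, not actual fab figures. The point is the shape of the math: a bigger die means fewer candidates per wafer and a worse yield, and the two effects multiply.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_w_mm, die_h_mm):
    """Common approximation: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    die_area = die_w_mm * die_h_mm
    gross = math.pi * radius ** 2 / die_area
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area)
    return int(gross - edge_loss)

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: a bigger die is more likely to catch a defect."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_w_mm, die_h_mm,
                      defects_per_cm2):
    candidates = dies_per_wafer(wafer_diameter_mm, die_w_mm, die_h_mm)
    good = candidates * poisson_yield(die_w_mm * die_h_mm, defects_per_cm2)
    return wafer_cost / good

# Assumed numbers: 300mm wafer, US$5000 per processed wafer, 0.3 defects/cm2.
# FX die rounded to 36x24mm, DX die rounded to 24x16mm.
fx = cost_per_good_die(5000, 300, 36, 24, 0.3)
dx = cost_per_good_die(5000, 300, 24, 16, 0.3)
print(f"FX: US${fx:.0f}, DX: US${dx:.0f}, ratio: {fx / dx:.1f}x")
```

With these assumed inputs the model prints a ratio of roughly 10x, in line with the US$500 vs US$50 figures above. It also shows why wafer-size or yield improvements help both formats proportionally: halve the wafer cost or the defect density and both die costs fall together, so the FX/DX ratio barely moves.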