You should read up on the technology used by Starlink. Old communication satellites blast out a signal in all directions; Starlink satellites instead beam-steer signals onto spots on the earth's surface, so one satellite can simultaneously send/receive to multiple areas. They also fly much lower, so latency is far lower than with geostationary satellites - and for long-distance links it can even beat terrestrial fiber routes.
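A rough back-of-envelope sketch of the long-distance latency claim: light in fiber travels at roughly 2/3 of its vacuum speed, while the satellite path runs at full vacuum speed but has to climb to orbit and back. The path model here (ground distance plus two altitude hops, relayed via inter-satellite links) is a deliberate simplification, not Starlink's actual routing.

```python
# back-of-envelope: when does a LEO path beat terrestrial fiber?
# assumptions (illustrative, not exact): light does ~299,792 km/s in
# vacuum but only ~200,000 km/s in optical fiber (refractive index
# ~1.47), and the satellite path is approximated as the ground
# distance plus one 540 km hop up and one back down.

C_VACUUM_KM_S = 299_792
C_FIBER_KM_S = 200_000   # ~2/3 c due to fiber's refractive index
ALTITUDE_KM = 540        # Starlink shell altitude

def fiber_one_way_ms(distance_km: float) -> float:
    return distance_km / C_FIBER_KM_S * 1000

def leo_one_way_ms(distance_km: float) -> float:
    path = distance_km + 2 * ALTITUDE_KM  # crude: ignores hop geometry
    return path / C_VACUUM_KM_S * 1000

for d in (500, 5_000, 10_000):
    print(f"{d:>6} km: fiber {fiber_one_way_ms(d):5.1f} ms, "
          f"LEO {leo_one_way_ms(d):5.1f} ms")
```

For short hops the extra climb to orbit dominates and fiber wins; around intercontinental distances the vacuum-speed advantage takes over.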
phased array antennas are nothing new - the US versions of Apple's 5G iPhones all have a tiny patch of them for 28GHz mmWave. but they need
- unobstructed line of sight
- space
- power
starlink relies on the Ku and Ka bands, so roughly 12-18GHz and 26-40GHz, and as far as I know they've also been awarded rights to use V-band (~60-70GHz) and E-band (~72-90GHz). from about 26GHz upwards these sit in the mmWave range, meaning the wavelength is on the order of millimeters (roughly 1-10mm). generally speaking, higher frequencies behave more like light: line of sight is crucial and you need a lot of power - something your phone can't supply. it's also worth noting that Starlink uses relatively large dishes (far larger than a cellular device itself) that draw close to 100W and get hot. you wouldn't want to hold one next to your head, and it has to face the sky - precisely because of its phased-array nature.
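A quick sanity check on the "mmWave" label via λ = c / f - the band edges below are the ones mentioned above, the 28GHz entry being the iPhone's mmWave band:

```python
# wavelength check for the bands discussed above (lambda = c / f);
# "mmWave" conventionally means wavelengths of roughly 1-10 mm,
# i.e. frequencies of about 30-300 GHz

C = 299_792_458  # speed of light, m/s

def wavelength_mm(freq_ghz: float) -> float:
    return C / (freq_ghz * 1e9) * 1000

for name, f in [("Ku low", 12), ("Ka low", 26), ("28GHz mmWave", 28),
                ("Ka high", 40), ("V-band", 60), ("E-band", 90)]:
    print(f"{name:>12}: {f:>3} GHz -> {wavelength_mm(f):5.2f} mm")
```

Even at 90GHz the wavelength is still above 3mm, so the whole range stays comfortably in "millimeters", never below 1mm.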
starlink - extremely simplified - is very similar to a wifi repeater, because there's no content up there. whenever you stream a video, the traffic has to be routed through a nearby terrestrial ground station, up to the satellite in view and then down to the receiver. the same applies to direct-to-cell (DTC) services: even peer-to-peer calls have to go through the packet gateway (PGW), which is not in orbit.
as for the distance: low earth orbit ranges from 160km to 2000km, with many sats sitting in the 700-800km range. the main starlink shell is at around 540km altitude, so roughly 100x farther away than your average cell tower. they planned a lower shell around 340km using V-band, but it makes little sense because of the constant boosting needed to counter orbital decay and avoid burning up.
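That altitude puts a hard floor under the round-trip time: a request and its response each have to go up and come back down, so with a satellite straight overhead (the best case) a packet traverses at least four altitude-lengths of vacuum. A minimal sketch of that floor:

```python
# latency floor implied by the altitudes above: user -> sat -> gateway
# for the request, gateway -> sat -> user for the response, so at
# least 4 x altitude of travel, assuming the satellite is directly
# overhead (real slant paths are longer)

C_KM_S = 299_792

def min_rtt_ms(altitude_km: float) -> float:
    return 4 * altitude_km / C_KM_S * 1000

for name, alt in [("cell tower ~5 km away", 5),
                  ("Starlink shell (540 km)", 540),
                  ("upper LEO (2000 km)", 2000)]:
    print(f"{name:>24}: >= {min_rtt_ms(alt):6.2f} ms")
```

At 540km that's a ~7ms floor before any queuing or processing; a geostationary satellite at ~36,000km would put the same floor near half a second, which is why the low orbit matters.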
we use lower frequencies in cellular communication because they balance
- good enough propagation
- lower power consumption for the same distance
- reasonably high bandwidth
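The propagation/power tradeoff in that list can be put in numbers with the standard free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44 - a textbook idealization that ignores obstacles and atmosphere, so treat it as a lower bound:

```python
# free-space path loss grows 20 dB per decade of frequency, so at the
# same distance a higher carrier eats more of the link budget; the
# band choices below are illustrative cellular/mmWave examples
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for f in (700, 1800, 3500, 28_000):
    print(f"{f:>6} MHz over 1 km: {fspl_db(1, f):5.1f} dB")
```

Going from 700MHz to 3500MHz costs about 14 dB (20·log10(5)) at the same distance - power that has to come from somewhere: a bigger antenna array, more transmit power, or shorter range.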
fun fact: spectral-efficiency-wise there's only about a 10% difference between LTE and 5G NR, so from the same 20MHz of spectrum you can get about 180-200Mbps with LTE and around 10% more with 5G new radio. the big boost comes from being able to use 100-120MHz of spectrum at ~3500MHz - but there the issues are less favorable propagation (shorter reach) and higher attenuation from obstacles (walls, coated windows). go higher in frequency and you face different challenges: in the middle of the V-band you get oxygen absorption, and throughout the mmWave range you get varying amounts of rain fade and absorption from water vapor. the only countermeasures are higher transmission power (limited by regulation on your phone) or simpler, more error-proof modulation (lower speed).
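The arithmetic behind those numbers is just bandwidth × spectral efficiency; the ~9.5 and ~10.5 bit/s/Hz figures below are illustrative values chosen to match the quoted rates, not exact 3GPP numbers:

```python
# rough peak-rate math: throughput ~= bandwidth x spectral efficiency;
# the bit/s/Hz values are assumptions picked to reproduce the
# 180-200 Mbps / +10% figures above

def peak_mbps(bandwidth_mhz: float, bits_per_hz: float) -> float:
    return bandwidth_mhz * bits_per_hz

lte_20 = peak_mbps(20, 9.5)     # ~190 Mbps: inside the 180-200 range
nr_20 = peak_mbps(20, 10.5)     # ~10% more in the same 20 MHz
nr_100 = peak_mbps(100, 10.5)   # ~1 Gbps: the real win is bandwidth
print(lte_20, nr_20, nr_100)
```

Which is the point: 5G's headline speeds come overwhelmingly from the wider channels, not from squeezing more bits out of each hertz.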
that's why I said: there's no free lunch.