The Google event was held on the planet Earth. As a result, it is a fact that the camera on the Pixel 4 is not even one lightyear away from the camera on the iPhone, so there’s no way it can be multiple lightyears AHEAD of it. There may be qualitative differences that make the Pixel 4 better in some respects, but “relative distance from iPhones” is not really one of them.
“Impressive for a demo, but even the Google employees at the event wanted there to be a delay before just dropping you back into your last app. There’s actually a good reason why getting back to your phone on iOS requires a distinct additional step.”
And, you can bet they’re going to change that “instant”.
So let's recap:
1. The camera on the Pixel 4 is lightyears ahead of the one on the iPhone.
2. The display is as well.
3. FaceID on Pixel is instant.
...and somehow, people here still find ways to lie to themselves that the iPhone's better. Amazing!
“The event was a complete borefest with the most unenthusiastic team of presenters I've ever seen. Interesting that Google is now trying to pretend they are a privacy-focused company, when their business model betrays this messaging.”
Apple pretends too, they just do it better.
1. nope
4. No one will ever use it.
“iPhone XR and 11 have noticeable bezels”
-people: ew, it’s ugly!!!
“Pixel 4 has noticeable (huuuge) bezels”
-people: it’s OK.
“It appeared Google’s computational photography smarts have leapfrogged Apple’s again.”
What makes this a little confounding is that none of what Google is doing, as far as I can tell, is new. Not sure why Apple isn't making more of an effort to get ahead in this area.
The likes of MKBHD have been hyping the 90Hz display up as a "pro" feature that, apparently, is a must on current-day flagships.
The human eye cannot see 90 Hz. I don't understand the push for increased refresh rate. No harm but who can tell the difference?
Well, this is an Apple-centric site, so... By the way, how many of these amazing Pixel 4s is Google going to sell? I don't think the Pixel 3 sold very well.
“Honestly those Pixel Buds look amazing from a design standpoint. AirPods should look like that instead of the comical long white shaft sticking out of the ear.”
Though I'm a bit concerned by the blue and red light rings they project on your head.
The camera bundle looks a lot like the one on another recently released phone. Is Samsung going to do it next?
“The human eye can very well see 90 or more Hz on a display. If you've ever used an iPad Pro with 120Hz, you'll know the difference is clearly noticeable. Everything is just much smoother with less ghosting and blur. I hope Apple finally adds 120Hz in the 2020 iPhone.”
I stand corrected. There does appear to be one low-quality study suggesting that at least high-end gamers can tell the difference in refresh rate, and it also supports that, at least in peripheral vision, we can definitely tell the difference.
Are they serious with the top and bottom bezels?
I am long GOOGL, but they should stick to ads and data mining.
“Loving the offline transcribing as well, that will be very useful.”
That's something that Apple appears to be sitting on for now. iOS 13 has an offline accessibility mode powered by Siri, so we know there's at least potential for offline transcription there.
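For anyone curious what that "potential" looks like on the iOS side, here is a minimal sketch of the on-device option iOS 13 added to the Speech framework (SFSpeechRecognizer's supportsOnDeviceRecognition and the request's requiresOnDeviceRecognition flag). It illustrates the capability the comment alludes to, nothing more: it is not Google's Recorder pipeline, and the local file URL (audioFileURL) is a hypothetical placeholder.

```swift
import Speech

// Sketch, iOS 13+: transcribe a local recording without sending audio to a server.
// audioFileURL is a hypothetical URL to an audio file already on the device.
func transcribeOffline(audioFileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else {
            // On-device support is gated by device model and language.
            print("On-device speech recognition unavailable")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
        request.requiresOnDeviceRecognition = true  // stay offline; no server round trip

        // In a real app you'd keep the returned task around so you can cancel it.
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error)")
            }
        }
    }
}
```

The supportsOnDeviceRecognition check matters because Apple only enables offline recognition for a subset of devices and languages; everything else falls back to the server-based path unless requiresOnDeviceRecognition forces a failure instead.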