When are these iPads gonna get OLED?

Not any time soon. Apple has chosen to go with mini-LED in the short term. It offers better wide-color gamut and no burn-in issues for Apple's productivity devices.
 
The camera is not only used for taking photos; it's very useful on the pro devices for 'scanning' documents (by taking pictures of them), converting them into PDFs, and using the Pencil to edit/sign them.

Would be nice if it was ONE camera with higher resolution for that purpose, rather than that 3D-sensing 2-3 camera crap.
 
I use the front camera on my iPad constantly for video chatting but I can't remember the last time I used the rear camera. If I'm going to scan documents it's way easier to just use my phone.

It blows my mind when I see people using full sized iPads to take photos at theme parks. o_O
 
I'm sure some value the camera on their iPad, but I wish Apple would offer a variant of the iPad Pro that includes no rear camera, or at least one that doesn't protrude. It's hard to imagine an iPad Pro owner that doesn't also have a good smartphone camera, and I believe many of us would value a flat-backed iPad more than a high-quality rear camera.
Completely agree with you. I would rather Apple continue to invest in super high quality front cameras for FaceTime and Skype, since the iPad Pros are continually eroding the need for a laptop/desktop. I know a lot of people with iPads and not one of them uses it as a camera.
 
I would buy an iPad without camera. Never use it on mine.

Everyone on here says this, so when I saw a middle-aged lady hold up her huge iPad in a restaurant this past weekend to take a photo of her family, I had to chuckle (at which my wife gave me a confused look).

Point of the story: At least ONE person uses the iPad camera.
 
I like it better when Apple releases new iPad Pros right after the iPhone. At least the Spring is better than the Summer. Nothing like buying an expensive new iPad Pro only to have the new iPhone that comes out two months later either match or beat it in performance because it uses a newer generation processor.
Hmm, but the iPads (at least the iPad Pro for sure now) always use the "X" version of that year's A-series chip, with better graphics performance and such that the iPad can utilize. So technically the iPad Pro will have a better chip than the iPhone no matter what, right?

Apple is clearly trying to better segregate the iPhone and iPad categories with the new iPadOS, "X" processors, and iPad-specific apps (which, it bears mentioning, have their own program, Project Catalyst, for bringing said apps to the Mac). Idk, I feel like it's just not as easy to compare the iPhone and iPad these days, and I mean that in a good way, because Apple is finally starting to treat the iPad more like the capable computer it is.
 
It will be interesting to see how this whole 3D sensor thing works. I've been a fan of stereoscopic imaging (and movies, and games - I've been running stereoscopic gaming on my game rigs for nearly 10 years now) for a long time, and one of the problems with creating a stereoscopic image from a single POV is occlusion. With two points of view spaced far enough apart (our eyes), you get true 3D, but just as importantly, each POV can see part of what is behind any occluding object that the other POV can't see.

In video games this leads to the whole problem with VR (and proper stereoscopic gaming even without VR) taking such massive power - they actually have to render the game twice to avoid areas of the scene being occluded. In other "fast" 3D modes, where a game world is built using the depth map from a single POV, there are always areas that are occluded - which the software then has to fill in with guesswork, causing a distortion around objects usually called "halos".

So while I could see an iPhone or iPad with dual lenses at either end of the device taking true 3D images, I'm not sure I understand how these "3D sensing" cameras get around the issue that, from any single POV, there are areas blocked from sight that wouldn't be with two properly spaced lenses. How are they filling in this missing data? Anyone understand the technology well enough to explain it? (Personally, I hope it works very well. I would love to take 3D photos on vacation and then display those on our 3D projector or even view them in VR.)
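The disocclusion problem described above can be sketched numerically. This is a toy example with made-up camera constants (not how any shipping depth sensor actually works): forward-warp a 1-D scanline of pixels into a second viewpoint using only a single-POV depth map, and holes appear exactly where the background was hidden behind the foreground object.

```python
import numpy as np

# Toy 1-D scanline: a near object (depth 1) in front of a far background (depth 5).
depth = np.full(20, 5.0)
depth[8:12] = 1.0  # occluding foreground object

# Disparity (pixel shift between views) is inversely proportional to depth.
baseline_focal = 10.0  # arbitrary camera constant for this toy example
disparity = np.round(baseline_focal / depth).astype(int)

# Forward-warp each pixel to its position in the second (virtual) view.
# Target pixels no source maps to are "holes" -- the disocclusions the
# post describes, which the software must fill in with guesswork.
warped = np.full(20, -1, dtype=int)  # -1 marks a hole; otherwise source index
for x in range(20):
    tx = x + disparity[x]
    if 0 <= tx < 20:
        # Nearer pixels (larger disparity) win conflicts, like a z-buffer.
        if warped[tx] == -1 or depth[x] < depth[warped[tx]]:
            warped[tx] = x

holes = np.where(warped == -1)[0]
print("hole positions in the virtual view:", holes.tolist())
```

The holes cluster just past the foreground object, where the second viewpoint can "see around" it but the single depth map carries no data; inpainting those regions from neighboring pixels is what produces the "halo" artifacts mentioned above.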
That's not what Apple is going to deliver. They are following in Samsung's footsteps. You can find details on how this technology works in Samsung Galaxy Note 10 reviews and discussions.
 
What’s the point in adding a sophisticated camera to a device that is most likely not used as a primary camera? Looks cool tho!

3D rear cameras will enable many AR applications. Apple is making a huge push into AR.
 
An added plus for all the grandparents who bring their iPads to family events. You know the ones - with the covers hanging down as they reach over the table to get the picture...
 
I would buy an iPad without camera. Never use it on mine.
I use it to scan documents, and for those few occasions when quality makes no difference, I want an iPad with a camera. A 5 MP one is more than enough; even a flash isn't necessary at all.
 
holy christ that top gear link had me in stitches, thank you
 
AR Glasses, actually.

Then you have to have a camera in the glasses and, so far, nobody likes the idea of someone having a camera staring at them. (Google Glass) I would love to have an AR display in my glasses, though. Like a HUD in a jet.
 
As Apple pushes the iPad Pro as an alternative to a MacBook, I think they will decouple more and more feature updates between the phone and the iPad. It's no longer just a bigger phone (if it ever was). Apple will want it judged on its own merits. So I am not swayed by what Apple did historically on updates (the point of the article, not your quote).
Yeah, it would be great if they could make an all-new processor line for the iPad that is much more performant.
 
I would buy an iPad without camera. Never use it on mine.

Anyone seen using an iPad to take a photo when they have a more-than-capable camera phone in their pocket should be shot on sight!!!!

I only use it to take photos of my iPhone when it's time to sell it every 2 to 3 years. Other than that, it is never used.

I'm sure some value the camera on their iPad, but I wish Apple would offer a variant of the iPad Pro that includes no rear camera, or at least one that doesn't protrude. It's hard to imagine an iPad Pro owner that doesn't also have a good smartphone camera, and I believe many of us would value a flat-backed iPad more than a high-quality rear camera.
I predict there will soon be some use cases for 3d scanning and AR that will be very compelling on the bigger screen of an iPad, and the camera will become indispensable to many who never saw value in it before.

This seems like a solution in search of a problem. This would be much more useful on a phone than an iPad. Even more useful if you just buy a 3D camera and connect it to an iPad Mini. Something you can actually hold easily.
I’d have to disagree. Holding two devices, one in each hand, and looking at one device while pointing the other in another direction sounds more difficult than holding one device in two hands and pointing where you look.
Edit: If you meant connecting the 3D camera to the iPad physically, it would be much easier for the user to have it built in. If you mean connecting it to a mini specifically because it's smaller but won't get the 3D camera upgrade - true, a mini is easier to hold, but depending on the specific application, a bigger screen could also be more important than the extra portability. And adding the mass of an external camera would decrease the portability of the mini a bit as well.
 
Last edited:
Hmm, but the iPads (at least the iPad Pro for sure now) always use the "X" version of that year's A-series chip, with better graphics performance and such that the iPad can utilize. So technically the iPad Pro will have a better chip than the iPhone no matter what, right?

Apple is clearly trying to better segregate the iPhone and iPad categories with the new iPadOS, "X" processors, and iPad-specific apps (which, it bears mentioning, have their own program, Project Catalyst, for bringing said apps to the Mac). Idk, I feel like it's just not as easy to compare the iPhone and iPad these days, and I mean that in a good way, because Apple is finally starting to treat the iPad more like the capable computer it is.
Nope. The 2017 iPad Pro, released in June 2017, was slower in both single-core and multi-core benchmarks than both the iPhone 8, released in September 2017, and the iPhone X, released in November 2017. Its Metal score was only barely faster.
 