??? Samsung had ToF on the Note 10, not the Note 20, as this article misleads you to believe.

Lemme ask you, are you aware that Samsung has traveled in time and back so they can copy Apple and release a phone with a foldable display and the next one later this year will have the camera UNDER the display? If that wasn't enough, Samsung also traveled in time so they could copy Tile and release an NFC tracker and release 3 Bluetooth trackers including one that uses UWB before Apple.

Samsung also leapt into the future to include a spectrometer on the Galaxy S22/S23.

Amazing how they can release technology first and still be accused of copying Apple.
Once Apple is rumored to be working on something, Samsung rushes to get that technology to market. Wash, rinse, repeat. The unfortunate thing is that Samsung's execution is always worse than Apple's subsequent execution.
 
  • Like
Reactions: the future
Let's not kid ourselves. The sensors are for a car, and Apple is throwing them on their current devices as a stepping stone.
Another example is the Touch Bar: Apple testing their integration for future M1 chip setups.

Under Steve they'd maybe do this under R&D and only release the final product. Tim takes a more kanban/agile approach and releases things where most of it just works. Getting to production faster makes the stockholders happy, until something goes wrong from cutting too many corners.
 
  • Disagree
  • Like
Reactions: diandi and 4087258
Apple is already late in offering any real-world usage for the LiDAR sensor. Currently it is expensive technology that only offers a marginal improvement in photo shooting. It is a gimmick!
 
  • Like
Reactions: wnorris
??? Samsung had ToF on the Note 10, not the Note 20, as this article misleads you to believe.

Lemme ask you, are you aware that Samsung has traveled in time and back so they can copy Apple and release a phone with a foldable display and the next one later this year will have the camera UNDER the display? If that wasn't enough, Samsung also traveled in time so they could copy Tile and release an NFC tracker and release 3 Bluetooth trackers including one that uses UWB before Apple.

Samsung also leapt into the future to include a spectrometer on the Galaxy S22/S23.

Amazing how they can release technology first and still be accused of copying Apple.
It's about the time machine. That's how they copy Apple.
 
Once Apple is rumored to be working on something, Samsung rushes to get that technology to market. Wash, rinse, repeat. The unfortunate thing is that Samsung's execution is always worse than Apple's subsequent execution.
Wow, so they rushed that out in 2019 then. Pretty good for Samsung to rush it out a whole year before Apple. I reckon those periscope camera lenses must be a copycat action too. I mean, by the time Apple gets theirs out, Samsung will have had them for 2-3 years, but definitely copying. It's the same with the foldable phone. We know Samsung has been working on it since at least 2013, but they have clearly rushed theirs out to copy Apple. Can these companies actually do anything, or do they just wait around to hear what Apple is doing and then rush it to market?
 
  • Haha
  • Like
Reactions: mayga1 and Ramchi
Samsung clearly doesn't want to expand or change the smartphone business with augmented reality. No company saw or made what Apple saw before the iPhone came out, and the same may happen with AR.

This laser sensor is all about integration with future Apple glasses and, of course, testing. If it turns out to be obsolete, OK, no problem. For now, I see it as a decent tool.
 
I have just measured my house using an iPad Pro and LiDAR. I could of course use a Leica laser scanner, but the cost difference is just massive ($50,000+).

The model created by the camera-LiDAR combo is of course quite rough, but it is certainly useful!

Also, LiDAR improves AR a lot; Apple is just so far ahead on AR at the moment compared to Android.

I don't even think you can publish AR content from Adobe Aero, Substance Painter, etc. to Android yet.

Of course it's a lot of early concepts and a lot of rough apps, but remember the lousy camera in the iPhone 3G and how, in the end, iPhones changed the camera market.

Will you be able to scan objects and environments with an iPhone 15, creating 3D environments instead of just panoramas? Yes, I certainly believe that.
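
For anyone curious how the scanning side works, here is a minimal Swift/ARKit sketch of turning on LiDAR scene reconstruction. The class name and printed messages are just mine for illustration; the ARKit calls are the standard ones, but a real scanning app would do far more than print vertex counts.

import ARKit

// Minimal sketch: enable LiDAR scene reconstruction in an ARKit session.
// sceneReconstruction is only supported on devices with the LiDAR scanner.
final class RoomScanner: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
            print("No LiDAR scanner on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh   // rough mesh of the room, built from LiDAR
        config.frameSemantics = .sceneDepth  // per-frame depth map, if you want that too
        session.delegate = self
        session.run(config)
    }

    // ARKit delivers the room as ARMeshAnchor chunks while you walk around.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("Mesh chunk with \(mesh.geometry.vertices.count) vertices")
        }
    }
}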
 
  • Like
Reactions: 4087258
Funny. Apple is clearly keeping some of the purpose in the dark. So Samsung first copied, then realized it has no plan of its own for the use.
Let me predict: if Apple shows its cards, Samsung will re-add the sensor in no time.
Copied?
Samsung introduced ToF in the S20 series which was announced in Feb 2020... seven months *before* Apple added it to their phone range.
 
Copied?
Samsung introduced ToF in the S20 series which was announced in Feb 2020... seven months *before* Apple added it to their phone range.
It's not like either Samsung or Apple invented LiDAR.

(It's not entirely clear to me how PrimeSense's technology in Face ID differs from LiDAR. If it doesn't, iPhones have had front-facing LiDAR since 2017.)
 
Samsung copied something Android devs won't support because Android people don't like to pay for innovation.

No news here.
 
  • Disagree
Reactions: miq
Copied?
Samsung introduced ToF in the S20 series which was announced in Feb 2020... seven months *before* Apple added it to their phone range.
Actually, they introduced it with the Galaxy Note 10+, which was released in August 2019.
 
  • Like
Reactions: miq
I cannot disagree with Samsung on the feature. As it stands, there are very few good applications for it on a consumer-level device. At the pro level, the cell phone's LiDAR is far from good, so it's the same boat: not a good use for a relatively expensive sensor.
 
Since when does Samsung use LiDAR technology? I thought they employed a different type of ToF, similar to the one used in Face ID, which is not so great when it comes to longer distances and light interference. Samsung uses an infrared flood illuminator, not laser arrays.
 
  • Like
Reactions: Ajaxpinch
Wow, so they rushed that out in 2019 then. Pretty good for Samsung to rush it out a whole year before Apple. I reckon those periscope camera lenses must be a copycat action too. I mean, by the time Apple gets theirs out, Samsung will have had them for 2-3 years, but definitely copying. It's the same with the foldable phone. We know Samsung has been working on it since at least 2013, but they have clearly rushed theirs out to copy Apple. Can these companies actually do anything, or do they just wait around to hear what Apple is doing and then rush it to market?
Yes. Apple takes their time to get the technology right. Samsung thinks being first to market is the ticket. All of the tech companies copy each other to some degree or pick up on certain similar trends, but Samsung is particularly egregious in taking Apple rumors and beating Apple to market. Unfortunately, Samsung's execution is almost always lacking compared to Apple's execution of the same idea.
 
Apple is already late in offering any real-world usage for the LiDAR sensor. Currently it is expensive technology that only offers a marginal improvement in photo shooting. It is a gimmick!
Am I the only one who sees the potential in LiDAR for us consumers?
What about using it to scan your body's measurements, save them to your online profile, and then shop for clothes/shoes online without having to worry about whether they will fit? You could have a 3D avatar on your screen and see the T-shirt directly on your avatar to check the fit before buying.

Also for measurements... your phone can now replace your tape measure. Need the width for ordering your new fridge? Take out your phone and it's done.

Those are two simple examples, but there are many more that we are just not aware of yet.
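
To make the tape-measure idea concrete, here is a rough Swift sketch of how an app could measure between two tapped points with ARKit raycasting. It assumes an ARSCNView called arView that is already running a world-tracking session; this is not Apple's Measure app code, just the general approach. On LiDAR devices the raycasts land on real scanned geometry instead of estimated planes, which is why the results are steadier.

import ARKit
import simd

// Sketch: measure the straight-line distance between two tapped points.
var firstPoint: simd_float3?

func handleTap(at screenPoint: CGPoint, in arView: ARSCNView) {
    // Cast a ray from the tapped pixel into the scene.
    guard let query = arView.raycastQuery(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .any),
          let hit = arView.session.raycast(query).first else { return }

    // The hit's worldTransform holds the 3D position in its last column.
    let position = simd_make_float3(hit.worldTransform.columns.3.x,
                                    hit.worldTransform.columns.3.y,
                                    hit.worldTransform.columns.3.z)

    if let start = firstPoint {
        let meters = simd_distance(start, position)
        print(String(format: "Distance: %.1f cm", meters * 100))
        firstPoint = nil
    } else {
        firstPoint = position
    }
}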
 
  • Like
Reactions: _Spinn_
I hope Apple will keep improving their LiDAR. I want it for more accurate AR overlays and occlusion, more precise measurement in the measure app, better quality 3D scanning with no visible artifacts and camera features like accurate portrait mode and portrait mode video.

Sadly, this is looking more like another 3D Touch at the moment. It's a cool feature that works well in some regards, but one most people don't care about or even know exists.
They're building out their virtual world for AR; there is ZERO chance Apple drops this. It's key to their roadmap.
 
  • Like
Reactions: Ajaxpinch
switch to AR mode and physically walk into your photos
How does that work?? It doesn't have enough information to show that. If you shift the point of view, let's say, forward, you should be able to see things previously covered by a person, but the camera couldn't see them at capture time, so that data was never recorded. Even if you have the depth data, it is physically impossible to reconstruct the full AR experience.

The only way I can think of is to show blank regions when you move around: show a 3D view when possible, and blank regions when not.
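
In code terms, all a LiDAR-assisted photo gives you is one depth value per pixel from the camera's viewpoint. Unprojecting that depth map (a rough sketch below, assuming you have already pulled the depth values and camera intrinsics out of the capture yourself) only ever yields the surfaces the camera saw; everything behind them was simply never measured.

import simd

// Sketch: turn a per-pixel depth map into 3D points with the pinhole camera model.
// depth[v][u] is distance in meters; fx, fy, cx, cy are the camera intrinsics.
// Every resulting point lies on a surface the camera could see. Nothing behind
// those surfaces exists in the data, so "walking around" the subject has no information.
func unproject(depth: [[Float]], fx: Float, fy: Float, cx: Float, cy: Float) -> [simd_float3] {
    var points: [simd_float3] = []
    for (v, row) in depth.enumerated() {
        for (u, z) in row.enumerated() where z > 0 {
            let x = (Float(u) - cx) * z / fx
            let y = (Float(v) - cy) * z / fy
            points.append(simd_float3(x, y, z))  // one point per visible pixel, nothing more
        }
    }
    return points
}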

A joke:

Imagine your boyfriend (if you are a girl) takes a photo of you butt naked from behind with LiDAR on, then "turns on the AR experience" after the photo has been taken and moves the point of view around to the other side of you. Can he see your boobs? If it's AR, he sure should be able to.
 
I have just measured my house using an iPad Pro and LiDAR. I could of course use a Leica laser scanner, but the cost difference is just massive ($50,000+).

The model created by the camera-LiDAR combo is of course quite rough, but it is certainly useful!

Also, LiDAR improves AR a lot; Apple is just so far ahead on AR at the moment compared to Android.

I don't even think you can publish AR content from Adobe Aero, Substance Painter, etc. to Android yet.

Of course it's a lot of early concepts and a lot of rough apps, but remember the lousy camera in the iPhone 3G and how, in the end, iPhones changed the camera market.

Will you be able to scan objects and environments with an iPhone 15, creating 3D environments instead of just panoramas? Yes, I certainly believe that.
I don't have a device with LiDAR yet, but I know the software I use with a proprietary 3D camera (Matterport) can make use of the LiDAR in iOS devices, letting me leave the big, heavy 3D camera in the car on smaller projects. Looking forward to trying it out when I order my next iPhone.
 