The Kinect required expensive 3D scanning hardware, which Microsoft ultimately couldn't sustain and discontinued. (Kinect games even attracted an additional royalty, which I recall was rumored to be $10.) This is all done with computer vision.
That's funny, I didn't think the Xbox was a mobile phone??
How is that product and technology doing, now?
Are none of you aware the TrueDepth Camera is made with PrimeSense technology, the same tech in Kinect? It works the same way, just miniaturized.


And, MS continues to use that technology in Hololens.

Which is a shame, I still have mine around.
Same here.
 
Are none of you aware the TrueDepth Camera is made with PrimeSense technology, the same tech in Kinect? It works the same way, just miniaturized.

I am aware. The technology was not invented by PrimeSense and was in use years before they existed. PrimeSense's innovation was to use pseudo-random patterns instead of structured patterns.

TrueDepth works on the scale of no more than 2 feet. When you get to longer distances, you need higher resolution in the detectors, more expensive optics, more complex processing due to the wider FOV, and more sensitive detectors, since eye safety limits how much you can boost the emitter.

Intel sells similar technology, the RealSense SR305, for which they suggest a range of no more than 1.5 m.

Again, this isn't how this new API works; here it's 100% computer vision.
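For what it's worth, Apple's Vision framework exposes this as an ordinary image request, with no depth hardware involved. A minimal Swift sketch (assuming you already have a `CGImage` called `image`; error handling kept to a bare `try`):

```swift
import Vision

// Run human body pose detection on a single still image.
let handler = VNImageRequestHandler(cgImage: image, options: [:])
let request = VNDetectHumanBodyPoseRequest()
try handler.perform([request])

// Each observation is one detected person; points come back in
// normalized image coordinates with a per-joint confidence.
if let observation = request.results?.first as? VNHumanBodyPoseObservation {
    let points = try observation.recognizedPoints(.all)
    if let wrist = points[.leftWrist], wrist.confidence > 0.3 {
        print("Left wrist at \(wrist.location)")
    }
}
```

The same request also works frame-by-frame on camera output, which is how the fitness-style demos are built.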
 
And you think Apple will do it right? This might be useful in a very limited set of circumstances. With the regular interface, one just needs to move a finger to interact. With these new features, one needs to move a hand or the entire body. Why would anyone want to do that? Apple introduced AR support three years ago. The demos were cool, but it is hardly used.

Why would anyone want to do this?
Uh, duh! For skinning!
Look at the examples given. If you want to teach yourself anything involving your body (dancing, sports, workout, physical therapy, maybe even playing a musical instrument) there's massive value in being able to
- extract how the student is moving their body
- display how the body SHOULD be moved by skinning the user's body on top of a skeleton posed in the correct orientation.
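Comparing "how the student is moving" against "how the body SHOULD be moved" usually reduces to comparing joint angles: given three recognized points (say shoulder, elbow, wrist), you compute the angle at the middle joint and check it against the reference pose. A small self-contained sketch; the `Point` type and `jointAngle` helper are mine for illustration, not part of the Vision API:

```swift
import Foundation

// Minimal 2D point; Vision returns normalized CGPoint locations,
// which map directly onto this.
struct Point { var x: Double; var y: Double }

// Angle at joint b (in degrees) between the rays b->a and b->c.
func jointAngle(_ a: Point, _ b: Point, _ c: Point) -> Double {
    let v1 = (a.x - b.x, a.y - b.y)
    let v2 = (c.x - b.x, c.y - b.y)
    let dot = v1.0 * v2.0 + v1.1 * v2.1
    let m1 = (v1.0 * v1.0 + v1.1 * v1.1).squareRoot()
    let m2 = (v2.0 * v2.0 + v2.1 * v2.1).squareRoot()
    return acos(dot / (m1 * m2)) * 180.0 / Double.pi
}
```

A workout app could then flag any rep where, say, the elbow angle strays more than some tolerance from the reference.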

This is the difference between Apple and other companies! Most can't see value in anything further than three days in the future; Apple is operating on a ten year schedule.
Yes, all the AR stuff appears mostly useless now. Then when the Apple Glasses ship...
 
Why would anyone want to do this?
Uh, duh! For skinning!
Look at the examples given. If you want to teach yourself anything involving your body (dancing, sports, workout, physical therapy, maybe even playing a musical instrument) there's massive value in being able to
- extract how the student is moving their body
- display how the body SHOULD be moved by skinning the user's body on top of a skeleton posed in the correct orientation.

This is the difference between Apple and other companies! Most can't see value in anything further than three days in the future; Apple is operating on a ten year schedule.
Yes, all the AR stuff appears mostly useless now. Then when the Apple Glasses ship...
As I said, there is a limited set of useful features in there, and what you described is a very limited set. Apple was not even the first company to offer AR; Google did it before them. I am not sure where your excitement about Apple being unique comes from. They have not even implemented NFC properly, and that some ten years after Android vendors did.
 
[Image: Apple Vision framework human body pose detection of a jumping jack]

For those of you who've dreamed of becoming the constellation Orion.
 
Didn’t Apple buy the Israeli company Microsoft partnered with to develop Kinect? That’s how FaceID came to be right?
 
Seems a lot like what Xbox was able to do with the Kinect 10 years ago
The Kinect was an atrocity that Apple would never put out. If I had to put it on the scale of recent Apple fails, though—AirPower x1000 in terms of what they TOLD people it could do versus what it actually DID. The only problem is that Microsoft actually put it out and kept it out for so long, promising people it could do what it couldn’t even when there was blatant evidence otherwise.
(Forgive me. Day 1 360 owner who was so excited by the prospects of Kinect and is still bitter about it 15 years later.)
I don’t need to say what others have said about the fact that this being in a mobile device is amazing.

Beyond excited for this. Especially as far as it goes for fitness. If implemented properly, fitness/workout apps could help so much with making sure you’re using the proper form for things as basic as stair stepping, etc.
I also wonder how this could work for people with disabilities. Emergency SOS is already an amazing feature, but, what if there were some kind of particular hand motion one could make to enact it? This would arguably be easier for some and harder for others, depending on how it’s done. Just a creative spitball.

(PS—Yes, I go up and down a stair/step to work out. It still works, and I get to watch all the mindless crap on my television whilst feeling like I’m doing something! :p)
 
Except they didn't do it right. You notice how they don't pursue Kinect anymore?
This may be why the iPad has LIDAR.
The Kinect required expensive 3D scanning hardware, which Microsoft ultimately couldn't sustain and discontinued. (Kinect games even attracted an additional royalty, which I recall was rumored to be $10.) This is all done with computer vision.

It’s funny you guys mention Microsoft Kinect. What you guys probably don’t know is that Microsoft Kinect was based on IP from PrimeSense.

Apple bought PrimeSense. Most of what a Kinect sensor was is actually in FaceID.

 