> Doesn't seem as useful as removing certain objects from the background for photo retouching.

Perfect for people who like cloning artifacts.
> And that's why I find it a little bit terrifying, Mike. We've gone from hours for a skilled professional to a couple of clicks by a complete idiot.

True, but the world doesn't really need artists wasting their time and energy on grunt work. It was only out of necessity that they had to acquire these technical skills in the first place. This kind of tech frees talented folks up to do more creative things.
> Doesn't seem as useful as removing certain objects from the background for photo retouching.

Well, they're identifying and accurately defining the bounds of objects, so removing them would be trivial. Any decent GAN can infill the background once the subject is gone. This kind of thing will probably turn up as an editing option in Photos.
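The pipeline that comment describes — segment the subject, delete it, infill the hole — can be sketched in a few lines. This is a toy illustration, not Apple's implementation: it uses a brightness threshold as the "segmentation" and an iterative neighbor average in place of a GAN inpainter.

```python
import numpy as np

def remove_and_infill(image, mask, passes=200):
    """Erase pixels where mask is True, then fill the hole by repeatedly
    averaging each hole pixel with its four neighbors. A crude stand-in
    for learned (GAN) inpainting: values diffuse in from the hole's rim."""
    out = image.astype(float).copy()
    hole = mask.copy()
    out[hole] = 0.0
    for _ in range(passes):
        # np.roll wraps at the image border; harmless here because the
        # hole is interior and only hole pixels are ever rewritten.
        up    = np.roll(out, -1, axis=0)
        down  = np.roll(out,  1, axis=0)
        left  = np.roll(out, -1, axis=1)
        right = np.roll(out,  1, axis=1)
        avg = (up + down + left + right) / 4.0
        out[hole] = avg[hole]
    return out

# Toy image: uniform background (10) with a bright square "subject" (200).
img = np.full((32, 32), 10.0)
img[12:20, 12:20] = 200.0
subject_mask = img > 100          # trivial stand-in for real segmentation
filled = remove_and_infill(img, subject_mask)
```

After enough passes the hole converges to the surrounding background value, which is why removal "just works" on simple backgrounds and why real inpainting models are needed for textured ones.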
> Yes, people look at these "features" and see them as standalone, as though Apple builds these features as islands. Not the case. They build "capabilities" into the platforms, and those capabilities may be realized through certain features, such as this one, but the bigger picture is that the platform gains new smarts that can be applied elsewhere, too. The research and development run deeper and wider than most users realize.

Great point. 👍
> Doesn't work for me with the iPhone mini on beta 1, but if I long press there is a "copy subject" option that works. If I long press and drag, it grabs the entire photo.

You need to press a little quicker, not a long press.
> True, but the world doesn't really need artists wasting their time and energy on grunt work. It was only out of necessity that they had to acquire these technical skills in the first place. This kind of tech frees talented folks up to do more creative things.

Have you seen the edges of the auto-selected subjects? There is true artistry in creating a great mask.
If so, I want Craig Federighi to explain why.
> Nope, works on my 13 Pro Max.

It was a complete joke, considering there are no M1s in the iOS lineup (yet).
> And that's why I find it a little bit terrifying, Mike. We've gone from hours for a skilled professional to a couple of clicks by a complete idiot.

I'm a skilled artist.
> Probably M1 only 😅

Gotta be some reason for that 16-core Neural Engine.
> As a graphic artist, I can't tell you how many thousands of hours I've spent carefully outlining the subjects of photos. Adobe rolled out a similar feature a couple of years ago in their high-priced apps. Yay for Apple for baking it into the new versions of their (free) OSes!

As a graphic designer of 20+ years, I've yet to be impressed by any of Adobe's subject removal capabilities. It's always felt overly complicated and inaccurate.
> Have you seen the edges of the auto-selected subjects? There is true artistry in creating a great mask.

Well, I think it's more technique than artistry, but that gets into semantics. No doubt there's a high level of skill there, but it could most definitely be replicated by an ML system. (It's worth noting that object removal doesn't have to worry about a clean edge: once the bulk of the object is gone, the model can infill the empty space with "new" content. In fact, the object removal demo almost certainly has horrific edges as well before the infilling runs.) I'm just saying that the intention of cutting a subject out of a context or background poses a technical challenge, not necessarily an artistic one. What you do with it after that is another matter, and that's generally where the artistry comes in.
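The point about rough edges being acceptable for removal can be illustrated: grow (dilate) a sloppy mask by a few pixels, and the infill then covers the whole sloppy edge band, so mask precision barely matters. A toy NumPy sketch (real pipelines would use morphological operators from SciPy or OpenCV):

```python
import numpy as np

def dilate(mask, r=2):
    """Grow a boolean mask by r pixels (square structuring element).
    For object *removal*, a rough mask plus dilation is enough, since
    the infill step repaints everything inside the grown region."""
    out = mask.copy()
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

mask = np.zeros((20, 20), bool)
mask[8:12, 8:12] = True      # rough 4x4 subject mask
grown = dilate(mask, 2)      # 8x8: the sloppy edge band is now inside it
```

For a *cutout*, by contrast, the mask edge is the final product, which is where the artistry of a good matte comes in.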
Thanks, Dan. I'm looking forward to this feature!
With iOS 16, Apple introduced a curious new feature that's kind of like instant Photoshop, as you can use it to pull the subject out of any image or photo, pasting it into another photo or using it as a sticker in the Messages app. It's nifty enough that we thought we'd show it off in our latest YouTube video.
Apple calls this feature "Lift subject from background," and it is part of the Visual Look Up suite of functions. To use it, just long press on the subject of an image; you can then drag it out into a different app or copy it.
It works in Photos, Screenshot, Quick Look, Safari, and even videos, and it is available on the iPhone, iPad, and Mac. There's not a lot of practical use for it, but it is certainly impressive and fun to use. Make sure to check out our video to see it in action.
Article Link: Video: Apple's Coolest iOS 16 Feature Lets You Drag Subjects Right Out of Images
> I'm a skilled artist.

Uh, hopefully you have other "skills" besides tracing around images...
So what do you actually do?
I painstakingly trace around images.
So, like, elementary school art?
No. I'm a skilled artist.
> And that's why I find it a little bit terrifying, Mike. We've gone from hours for a skilled professional to a couple of clicks by a complete idiot.

If it was taking hours for a skilled professional, I'd have to question how skilled they are. And if they are that skilled, they're now freed up to work on better projects.
> Nope, works on my 13 Pro Max.

Yes, but does it work on Intel Ventura?
> And that's why I find it a little bit terrifying, Mike. We've gone from hours for a skilled professional to a couple of clicks by a complete idiot.

The professionals who take the time to do it properly are always going to get better results than this. Look at Apple's portrait mode in the camera app, for example: it's terrible and doesn't come close to the result a skilled operator will achieve. What it does do is let casuals do something quickly and nastily, and that covers the lion's share of iOS users.