I really dislike how Apple implemented Visual Intelligence on the iPhone, unless I'm missing something here. On the S25 Ultra I can touch and hold the bottom of the phone literally anywhere, in any app, then circle anything I'm looking for and it will find it. I can be on a web page, hold the bottom, circle what I want it to find, and boom, it finds it. Apple's integration of this is just the camera, and it's not nearly as useful as it should be. Is there a way to make it work outside of taking a picture of something? Why is it only through the camera? Apple really needs to fix this and take a cue from how Samsung integrated it, because it's 1000 times more useful.
Or, am I missing something and not using it correctly on the iPhone?