In iOS 15, it would be much better if swiping up on a picture in the Photos app gave the user more than just the location. I would like to see all of the file's information, with the option to rename the file right there. Adjusting the file size there would also be handy.
On the lock screen, it would be cool to see my current speed and the speed limit for the road I'm driving on when I tap the screen. Perhaps also a small icon as a direct link to open Maps.
It might be cool to have small security-camera icons in Maps that a person could tap to view the camera's feed.
Why are current weather conditions still not on the lock screen?
Icons for the last three or four apps I was using could show up on the lock screen. With one tap I could continue using an app.
iOS needs native call recording.
When the device finishes charging to 100%, it could play a sound, or Siri could say, "Your iPhone is finished charging."
The Apple Watch could show the iPhone's charge level.
It would be quite cool if macOS and iOS could recognize text and handwriting in a PDF. For example, while viewing a PDF on macOS, I could drag a selection box over an area with the mouse and have it recognize the text or handwriting there.
The camera could have a LiDAR view. The longer and closer a person focuses on an object, the more refined the wire mesh of points on that object or space could become.