Why Apple ditched Steve Jobs' secret haptic texture technology

Discussion in 'iOS 7' started by darkgoob, Apr 20, 2014.

  1. darkgoob macrumors 6502

    Joined:
    Oct 16, 2008
    #1
    Vision and touch are in fact tied together: "eye-hand coordination" goes far beyond simply being able to coordinate action. It also entails Hebbian adaptation: the ability to "feel" a touch sensation *before* you have actually touched something. Seeing a texture primes you to feel it; this predictive ability of the imagination and subconscious is evolution's solution to the problem of lag: by "feeling" something before you touch it, you can pull back from something harmful before it's too late (think of a hot stove).

    All sensations of texture take place fully in the mind. Sensory data from the fingers (touch) is one source of what causes the mind to perceive a texture, but sight data from the eyes can also trigger it, as can memories. Just as you can imagine touching a leaf and remember the feeling of it, you can also see a picture of a leaf and reach out to touch it; even if it's behind glass, the act of doing so can cue your mind to "feel" the leaf more realistically than if you closed your eyes and simply tried to imagine it.

    On a certain level, you don't need "real" (physical, mechanical) haptic feedback to have haptic feedback, because, to quote Morpheus, "Your mind makes it real." You just need realistic, familiar textures and realistic-looking skeuomorphic design, and boom, you have haptic feedback -- probably better than whatever crude mechanical feedback we'll see whenever the physical kind finally arrives. On a subconscious level, your brain anticipates feeling the textures it sees, and it supplies that feeling even when the neural impulses never actually come from the fingers.

    Steve Jobs understood this. He famously said regarding OS X's Aqua interface, "One of the design goals was when you saw it you wanted to lick it." Steve understood there is a subconscious element to things and that a good UI ought to leverage all the processing power not only of the computer, but of the user's subconscious mind.

    After all, Neo, what's the difference between something that tastes like chicken, and "real" chicken?

    ...

    So why did Apple give up that technology in iOS 7? I would argue that they were forced to, and it was not necessarily by choice. They realized that in order to migrate iOS to a variety of screen sizes, they could not keep kicking the pixel can down the road.

    The only legitimate problem with skeuomorphic design is that it's highly reliant on bitmaps. We saw with the iPad mini that the UI simply shrank, something Jobs wanted to avoid because the original iPad UI was designed to be just the right physical size for actual fingers to use. However, some people have better manual dexterity and close-up eyesight than others, and Apple sells plenty of iPad minis as a result. I don't see a problem there.

    But on the iPhone, Apple knows it can't just lower the screen DPI and make the display physically bigger while remaining competitive. They want a UI that scales properly across devices while its elements remain the same physical size.

    That's why in iOS 7 they got rid of so much UI chrome: they want the UI to be based solely on vector elements (like fonts and lines) that scale perfectly when the screen size changes. That's also why they are pushing developers to use AutoLayout constraints -- a set of rules that define where each element is drawn on screen according to its distance from other elements or from the screen edges, instead of a fixed position on a coordinate grid mapped directly to pixels.
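    To make that concrete, here's a minimal sketch (view names are just illustrative, and the modern constraint syntax is used purely for illustration) of the difference between a frame-based layout and a constraint-based one:

    ```swift
    import UIKit

    // Frame-based layout bakes in one screen size; Auto Layout describes
    // the button's position relative to its container, so the system can
    // re-solve the layout for any screen.
    let container = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 568))
    let button = UIButton(type: .system)
    button.setTitle("Done", for: .normal)
    container.addSubview(button)

    // Old style: an absolute position tied to a 320-point-wide screen.
    // button.frame = CGRect(x: 230, y: 20, width: 70, height: 44)

    // Auto Layout style: rules, not coordinates.
    button.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        button.trailingAnchor.constraint(equalTo: container.trailingAnchor, constant: -20),
        button.topAnchor.constraint(equalTo: container.topAnchor, constant: 20),
        button.widthAnchor.constraint(greaterThanOrEqualToConstant: 44)
    ])
    ```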

    I've been in favor of a fully vector-based UI for OS X and iOS for many years, since in a world of varying screen sizes and varying screen DPIs it would be the only way to have a truly resolution-independent, WYSIWYG UI.

    However, even though Apple said several years ago that they would move to a vector-based UI for OS X, they never did, because it would have been too much to ask everyone to convert all their UIs into vector-based ones. Every app would have had to be redone. It's a monumental undertaking, and on the Mac Apple did not have a way to force developers to do it. Besides, on a Mac you can always change your screen resolution if you want UI elements to look bigger, or use the zoom feature. And since Mac apps run inside their own windows, it matters less if a window gets smaller.

    On iOS, by contrast, Apple does have the ability to force developers to do things. That's good, because it means they can require the use of APIs like AutoLayout and TextKit to make apps that scale to differing screen sizes. This becomes a manageable task once you stop worrying about scaling textures and design all your icons as vector-based ones. You can take a desktop-publishing-style approach to design and achieve a nice look through means other than skeuomorphism.
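    On the text side, the same idea looks roughly like this (a sketch, not a full app): letting the system's text styles drive sizing instead of hard-coded point sizes or baked-in bitmaps:

    ```swift
    import UIKit

    // Text set through the Dynamic Type styles that ship alongside Text Kit
    // scales with the user's settings and the device, with no bitmap assets.
    // (adjustsFontForContentSizeCategory is shown for illustration; it
    // arrived in a later iOS release than the one discussed here.)
    let label = UILabel()
    label.font = UIFont.preferredFont(forTextStyle: .body)
    label.adjustsFontForContentSizeCategory = true
    label.numberOfLines = 0
    label.text = "Slide to unlock"
    ```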

    ...

    But have we lost something important with the move away from skeuomorphism? I'd argue that perhaps we have. How can we get back there and strike more of a balance between resolution independence, and psychohaptic feedback?

    One idea would be to use dynamically generated textures based on fractals or other forms of procedural math, or to leverage OpenGL to scale bitmapped textures more gracefully, with bump-mapping and dynamic light sources linked to the accelerometer.
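    As a rough sketch of the procedural-texture idea (nothing to do with Apple's actual implementation, and the function name and parameters are hypothetical), a paper-like grain can be generated at whatever pixel density the device reports, instead of being shipped as a fixed-size bitmap:

    ```swift
    import UIKit

    // Generate a subtle paper-grain texture procedurally at the target scale.
    func makePaperTexture(size: CGSize, scale: CGFloat) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(size, true, scale)
        defer { UIGraphicsEndImageContext() }
        guard let ctx = UIGraphicsGetCurrentContext() else { return nil }

        // Flat off-white base colour.
        ctx.setFillColor(UIColor(white: 0.97, alpha: 1).cgColor)
        ctx.fill(CGRect(origin: .zero, size: size))

        // Scatter faint random speckles. Because they are generated at the
        // target scale, the grain never looks stretched or blurry.
        for _ in 0 ..< Int(size.width * size.height / 40) {
            let x = CGFloat.random(in: 0 ..< size.width)
            let y = CGFloat.random(in: 0 ..< size.height)
            let shade = CGFloat.random(in: 0.88 ... 0.95)
            ctx.setFillColor(UIColor(white: shade, alpha: 1).cgColor)
            ctx.fill(CGRect(x: x, y: y, width: 1 / scale, height: 1 / scale))
        }
        return UIGraphicsGetImageFromCurrentImageContext()
    }
    ```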

    This is where I think we are ultimately headed, probably not as soon as iOS 8, but somewhere down the road.

    Check out the new Frax app for iOS to get a sense of the kinds of real-time textures that are possible:
    http://www.pocketmeta.com/frax-hd-ipad-beautiful-strange-ingenious-5940/

    The issue right now, of course, is battery life: processor-expensive textures everywhere would kill it. However, if the textures are not updated in real time but are simply rendered once, when the screen size is determined, and then cached, the cost becomes negligible.
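    The render-once-then-cache pattern is simple enough to sketch (again hypothetical, reusing the makePaperTexture() function from the earlier example): the expensive generation runs once per screen size, and every later draw pulls the cached image:

    ```swift
    import UIKit

    // Cache rendered textures keyed by screen size and scale so the
    // expensive procedural generation only runs once per configuration.
    let textureCache = NSCache<NSString, UIImage>()

    func cachedPaperTexture(for bounds: CGRect, scale: CGFloat) -> UIImage? {
        let key = "paper-\(Int(bounds.width))x\(Int(bounds.height))@\(scale)x" as NSString
        if let cached = textureCache.object(forKey: key) { return cached }
        guard let rendered = makePaperTexture(size: bounds.size, scale: scale) else { return nil }
        textureCache.setObject(rendered, forKey: key)
        return rendered
    }
    ```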

    Apple has a lot of groundwork to do before it can realize something like resolution-independent skeuomorphism in a battery-efficient way through system APIs, with fractal textures, OpenGL, and so on. However, I would not rule it out, or assume that they moved away from skeuomorphism entirely by choice.

    Because if you look closely, skeuomorphism is still present in certain places in iOS 7: the frosted, translucent glass of Control Center and Notification Center; the light paper texture of the Notes app. It survives precisely in the things that can scale independently of resolution -- which is what you'd predict if I'm right that they haven't completely eschewed skeuomorphism, but rather were forced away from it in order to migrate to a variety of screen sizes.

    I'm not saying Jony Ive doesn't understand the value of psychohaptic feedback; I think they wanted to make a bold move and push the envelope in a new direction to freshen things up. That's not to say the pendulum couldn't swing back toward implementing psychohaptic feedback via skeuomorphism in the places where it really does help the interface feel more interactive and draw the user in.

    I'd like to hear all your thoughts on the matter.
     
  2. warden macrumors member

    Joined:
    Aug 28, 2013
    #2
    This is genuinely one of the most intelligent threads I think I've ever read on this site. You understand UI affordances, and the need for technology to accommodate our psychological hardwiring, better than most. Some people get so tied up with modern-looking aesthetics that they forget the environment that shaped our brains and senses throughout our evolution has been a physical one -- complete with depth, shadow, texture. Haptic feedback is the logical extension of touch-based UI design. iOS 7 certainly seems less suited to a haptic feedback implementation, though. I think you're right to attribute its design to a move towards resolution independence rather than Ive simply not understanding basic principles of psychology and design (in part, anyway) -- he's not the only designer at Apple, and will have learnt much from the others. We may well see a return of some skeuomorphic principles (think of the 7.1 slide-to-power-off control, which is physically manipulable).
     
  3. ScottishCaptain macrumors 6502a

    Joined:
    Oct 4, 2008
    #3
    Why do you think vectors are scalable?

    If you have a line that measures 2px wide and scale it up by 1.25x, you'll end up with a line that measures 2.5px wide, and now one or both of its edges will fall on fractional pixels and therefore appear blurry (anti-aliased into a soft edge).

    I suppose you could use a display with a high enough DPI that this just doesn't matter, but you'll still run into edge cases where certain lines *do* fall on whole pixels and others don't -- which looks bizarre when they sit next to each other on the same screen (one appears perfectly sharp, the other doesn't).

    So vector graphics may be scalable, but they're rarely cleanly scalable. If you're going to target different resolutions with an even multiplier by making sure all your important bits fall on pixels divisible by that number, then you might as well be using raster graphics anyway, because it won't matter in the end.
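    To make the arithmetic concrete (a quick sketch, not anything from UIKit itself), the usual workaround is to snap values to the device's pixel grid before drawing:

    ```swift
    import UIKit

    // Snap a point value to the nearest physical pixel for a given screen scale.
    func snappedToPixel(_ value: CGFloat, scale: CGFloat) -> CGFloat {
        return (value * scale).rounded() / scale
    }

    // A 2 pt line scaled by 1.25x becomes 2.5 pt.
    let scaled: CGFloat = 2.0 * 1.25
    print(snappedToPixel(scaled, scale: 1.0)) // 3.0 -- forced onto whole pixels on a 1x screen
    print(snappedToPixel(scaled, scale: 2.0)) // 2.5 -- already 5 physical pixels on a 2x screen
    ```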

    IMHO, Ive is off his rocker. Apple used to build minimalistic hardware coupled with texture-rich software, and the contrast between the two is what made everything feel premium. The hardware stayed out of your way and let you focus on the software. Now that they've gotten rid of this, everything feels cheap, everything feels the same, and frankly their software is boring as ****.

    -SC
     
  4. chambone macrumors 6502a

    chambone

    Joined:
    Dec 24, 2011
    Location:
    Netherlands
    #4
    They could use subpixel rendering, just as text does. It would come with its own problems, but I think interfaces going vector is inevitable. And it's going to happen on their mobile products first, because of the near-100% penetration of hi-res screens there.
     
  5. TC03 macrumors 65816

    Joined:
    Aug 17, 2008
    #5
    This is a rendering problem rather than a vector problem. You could use anti-aliasing techniques similar to those used in font rendering.
     
  6. Xenc macrumors 65816

    Xenc

    Joined:
    May 8, 2010
    Location:
    London, England
    #6
    It's not really a secret, and Apple aren't the only players in this field. I'd wager it's a case of the technology simply not being ready, more than anything else.
     
  7. chambone macrumors 6502a

    chambone

    Joined:
    Dec 24, 2011
    Location:
    Netherlands
    #7
    Exactly. You can bet money that Google and Microsoft are pouring in millions as well.
     
