The pencil isn't very smart..

Discussion in 'iPad Accessories' started by David58117, Nov 22, 2015.

  1. David58117 macrumors 65816

    Joined:
    Jan 24, 2013
    #1
    There's something I don't understand.

    If the pencil connects to the iPad via bluetooth - why does my finger/side of hand make marks on the screen?

     Shouldn't it differentiate touch input vs. pencil input?

     If I remember correctly - the SP3 differentiated them, and a finger touching the screen would be recognized as such, while on the iPad Pro it seems like it can't tell them apart. I'm assuming this is why a lot of apps require 2 fingers to scroll..

    The ONLY exception I've found so far is GoodNotes, which connects to the Pencil as a "smart Stylus." (But even then - it still requires 2 fingers to scroll.)

     The stock Notes app, Notability, PDF Expert (which claimed to be updated for the pencil..), forScore, iAnnotate - all do it.

     Considering all the apps do this, including Apple's own Notes app - I'm starting to wonder if this is a limitation of the hardware.

    To be clear - I'm not talking about palm rejection while writing.

    This is - you lift the pencil off the iPad, accidentally touch the screen, and the app thinks your finger/side of hand etc was the pencil and starts writing as if it were.

    It's extremely annoying, and I don't remember this on the Surface..
     
  2. thedon1 macrumors 6502

    Joined:
    Jun 26, 2010
    #2
     It might be because some people use a finger along with the stylus. There are a couple of Adobe apps where it's easier to resize a shape (by dragging, or long pressing) with the hand that isn't holding the pencil. Other apps like Paper let you use a finger to smudge and blend.

     I'm sure the developers could implement something to reject touch inputs when the Pencil is in use. The device clearly knows when the pencil is around, as it adjusts the screen's refresh rate for it. They probably don't, though, because it would be very frustrating for the user if their touches weren't recognised while holding the pencil.
     
  3. David58117 thread starter macrumors 65816

    Joined:
    Jan 24, 2013
    #3
    I think we're talking about different things.

    The problem is that touch inputs are being recognized as pencil inputs.

     On the Surface - only the pen would make a pen mark, and touching the screen would be recognized as something else - not a pen input.
     
  4. Pakaku macrumors 68000

    Pakaku

    Joined:
    Aug 29, 2009
    #4
     I guess the interval between lifting the pen and your finger touching was too short for the iPad to realize it was a finger and not the pen. You wouldn't want the tablet to decide too quickly that a new contact is a finger, or quickly tapping the pen somewhere else on the screen wouldn't work properly.
     
  5. David58117 thread starter macrumors 65816

    Joined:
    Jan 24, 2013
    #5
    Why can't it differentiate that though?

     I don't remember that being an issue with the Surface..
     
  6. Slim Boy Fat macrumors member

    Joined:
    Apr 25, 2015
    #6
     I think your answer is in every one of your posts in this thread. Go back to the Surface ;)
     
  7. sunapple macrumors 6502a

    sunapple

    Joined:
    Jul 16, 2013
    Location:
    The Netherlands
    #7
    If this is a real issue (sounds like it, but I can't try it out myself yet..), it will probably be addressed in a future software update.
     
  8. MH01 macrumors G4

    MH01

    Joined:
    Feb 11, 2008
    #8
    I did notice this myself. I would assume the software will mature.
     
  9. iunnohead macrumors regular

    Joined:
    May 10, 2008
    #9
     Microsoft's OneNote app for the iPad Pro differentiates perfectly -- touching the screen with 1 finger lets you drag around, 2 fingers can zoom, and only with the pencil will actual lines be drawn (there is a toggle to allow finger input as well if that is desired).
     
  10. fenjen, Nov 23, 2015
    Last edited: Nov 23, 2015

    fenjen macrumors 6502

    fenjen

    Joined:
    Nov 9, 2012
    #10
     The iPad can differentiate Pencil input from finger input through the digitizer itself. Third-party styluses did indeed distinguish stylus from finger by using Bluetooth, basically paying attention to the timing of the touches, and because Bluetooth has a delay, there was a small window in which finger input would be recognized as stylus input every time the stylus was put to the screen. That is not the case with the Pencil. The iPad can also track the Pencil and a finger simultaneously and tell them apart, so it is entirely up to the developer to decide what finger input does while a Pencil is also in use. I'm sure developers will support the Pencil correctly soon. Let's not forget that this product is still in its infancy, so developers still have to find out what settings work best for their apps. Give them some time and I'm sure it will get better soon.
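
     Roughly what that looks like for a developer: every touch event says whether it came from the Pencil or a finger, so an app can route the two however it likes. A minimal Swift sketch - the class and helper names here are made up, not any real app's code:

     import UIKit

     // Minimal sketch: route Pencil and finger input differently.
     // SketchCanvasView and both helpers are made-up names for illustration.
     class SketchCanvasView: UIView {
         override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
             for touch in touches {
                 if touch.type == .stylus {
                     // Apple Pencil touches start an ink stroke.
                     beginStroke(at: touch.location(in: self))
                 } else {
                     // Finger touches can pan, smudge, or be ignored, as the app prefers.
                     handleFingerTouch(at: touch.location(in: self))
                 }
             }
         }

         private func beginStroke(at point: CGPoint) {
             print("Pencil stroke starting at \(point)")
         }

         private func handleFingerTouch(at point: CGPoint) {
             print("Finger touch at \(point)")
         }
     }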
     
  11. tekchic macrumors 65816

    tekchic

    Joined:
    Apr 19, 2010
    Location:
    Phoenix, AZ
    #11
    I've had far more issues with palm rejection on the Surface Pro (owned one for 2 years, selling it now) than iPad Pro.

    Stray marks and lines all over the place in Photoshop and Sketchbook Pro with Surface Pro. Resorted to drawing with a fingerless glove.

    iPad Pro in stock Notes so far has been the best sketching and inking experience I've had on a device since the 2002 Toshiba M Series laptops (which had no capacitive screen, thus no palm rejection issues).
     
  12. joeallen macrumors member

    joeallen

    Joined:
    Sep 3, 2014
    #12
    It's not a hardware issue...

     The iPad actually only detects the Pencil's location and 'touch' from its antennae; it makes you realise the engineering that goes into making it work so well...

     The iPad can't actually detect the physical touch of the plastic Pencil tip... You can try this yourself by carefully holding the Pencil about 1mm from the screen's surface without touching it, and it will still register as a touch from the pencil. (It takes some practice, but you will see it.)

     So despite what their explainer video shows, the iPad isn't detecting a 'touch' or any contact from the Pencil with the display glass. The pressure and tilt are both also broadcast from the Pencil to the iPad in the same way.

    Since these apps have always previously allowed users to use their finger to draw, this has remained unchanged. Remember that many cheap styli just mimic a standard finger touch.

     It's down to the developer if they want to offer users the option to disable all finger touches as pen/brush strokes while the Pencil is connected.

    I always have my pencil connected but sometimes just make a very quick stroke with my finger just like normal.
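
     For what it's worth, the pressure and tilt mentioned above come through on each touch event, so apps can read them directly. A minimal Swift sketch, with made-up names, just to show the idea:

     import UIKit

     // Minimal sketch: reading pressure and tilt from a Pencil touch.
     // InkView is a made-up name for illustration.
     class InkView: UIView {
         override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
             guard let touch = touches.first, touch.type == .stylus else { return }
             // Pressure, normalized to 0...1 against the maximum the hardware reports.
             let pressure = touch.force / touch.maximumPossibleForce
             // Tilt: pi/2 when the Pencil is perpendicular to the screen, smaller as it leans over.
             let tilt = touch.altitudeAngle
             // Direction the Pencil is pointing, in this view's coordinate space.
             let azimuth = touch.azimuthAngle(in: self)
             print("pressure \(pressure), tilt \(tilt), azimuth \(azimuth)")
         }
     }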
     
  13. TurboPGT! macrumors 68000

    Joined:
    Sep 25, 2015
    #13
    It is not a stylus. It is for drawing. Not navigating the display.
     
  14. TurboPGT! macrumors 68000

    Joined:
    Sep 25, 2015
    #14
     The problem is you don't understand what the pencil is for. Let me guess, you got one just because you thought you were supposed to?

    I'm starting to understand now. So many non-artists buying the pencil. People think it is a stylus.
     
  15. David58117 thread starter macrumors 65816

    Joined:
    Jan 24, 2013
    #15
    No one is talking about navigating the UI.

    This is when you're taking notes.
     
  16. jclardy macrumors 68040

    jclardy

    Joined:
    Oct 6, 2008
    #16
     It does differentiate them. Every touch on the screen made with the pen is marked as type "stylus". You are asking for the software to behave differently, which it could (an app could assign one tool to the pencil and one tool to the finger, for example).

     The problem is the software doesn't know if you are actively using the pencil or if you just have it lying next to your iPad... On the Surface, a finger touch is pretty much always a mouse cursor action. Apple always wants the finger to be the primary input (and these apps are currently designed mostly for iPads without an Apple Pencil), so the finger still counts as the main input to the app.
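
     One concrete way an app could split the two - a minimal sketch, assuming a scroll view with a drawing view inside it; the names are made up and this isn't any shipping app's code:

     import UIKit

     // Minimal sketch: let fingers scroll while Pencil touches reach the drawing layer.
     // NotesLikeViewController and canvasView are made-up names for illustration.
     class NotesLikeViewController: UIViewController {
         let scrollView = UIScrollView()
         let canvasView = UIView()   // stand-in for a real drawing view

         override func viewDidLoad() {
             super.viewDidLoad()
             scrollView.frame = view.bounds
             canvasView.frame = scrollView.bounds
             scrollView.addSubview(canvasView)
             view.addSubview(scrollView)
             // Only direct (finger) touches drive scrolling...
             scrollView.panGestureRecognizer.allowedTouchTypes =
                 [NSNumber(value: UITouch.TouchType.direct.rawValue)]
             // ...so stylus touches are left alone for the canvas to draw with.
         }
     }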
     
  17. the future macrumors 6502a

    Joined:
    Jul 17, 2002
    #17
    To be able to use the Pencil and your fingers in parallel is not a bug, it's a feature. And one which I really like.

     But as others have said, it is up to the individual app to implement a setting where finger touches are completely ignored. Technically, that should be trivial.
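
     Something like this would do it - a minimal Swift sketch of such a setting, with made-up names, not any particular app's code:

     import UIKit

     // Minimal sketch of the kind of per-app setting described above.
     // CanvasView and pencilOnlyMode are made-up names for illustration.
     class CanvasView: UIView {
         // When true, only Pencil input is allowed to draw.
         var pencilOnlyMode = true

         override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
             for touch in touches {
                 if pencilOnlyMode && touch.type != .stylus {
                     continue  // ignore finger touches entirely
                 }
                 print("drawing at \(touch.location(in: self))")
             }
         }
     }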
     
  18. David58117 thread starter macrumors 65816

    Joined:
    Jan 24, 2013
    #18
    Being able to use fingers and pencil together is the whole point of this question.

     Here's an analogy - you plug a mouse into a computer, and every time you press either the left or right click button - it does a right click.

    Right click = right click.
    Left click = right click.

    That's how it is now, where both your finger and pencil get the same response.

    But I want a left click to be a left click, and right click to be a right click.

     I want the pen to get an inking response, while your finger gets a touch response...like on the Surface.

     Anyway - apparently the Surface has a "PixelSense" chip or layer (?) that is used to distinguish this across the OS. I get the impression Apple is doing it differently, and is relying on app developers to implement their own version of it, which is a bit disappointing.

     What concerns me is: why didn't Apple implement it (if it was doable) in their own Notes app?

    The more I use this, the more I think they're aiming this at graphic artists, people who place the pencil on the screen and generally keep it there to do their work.
     
  19. AdonisSMU macrumors 603

    Joined:
    Oct 23, 2010
  20. ddrulez macrumors member

    Joined:
    Dec 12, 2012
    Location:
    Germany
    #20
     In Procreate, for example, you can define the pencil and finger behavior pretty well. I don't have problems with wrist detection in this app at all.
     
  21. username: macrumors 6502a

    Joined:
    Dec 16, 2013
    #21
     From what I have seen, the iPad Pro hardware does differentiate between pencil and finger.

     But the app settings in Notes, for example, are still set to register both for drawing.

    I have seen videos in some apps where the pencil registers drawing, while the finger can be used as an eraser or to smudge.

     So your statement that the iPad Pro does not differentiate them is rubbish.
     
  22. Krevnik macrumors 68040

    Krevnik

    Joined:
    Sep 8, 2003
    #22
    Let me ask you this: What should happen if the pencil is used in an app that doesn't explicitly support it? Should it just not do anything? What should happen if the person left their pencil behind? Should the app just not let them do anything?

    I actually think that how it was done on the Surface was wrong. It is inflexible. To do anything interesting with the stylus, you have to integrate it in a different way than your other touch inputs (even though fundamentally, they are both touch inputs). If I want to expose functionality that can use either/or, then it gets even more complicated.

     The reality is this: Apple exposes all touch events the same way. If, as a developer, you want to distinguish between them, you can do so very easily. Much more easily than it would take to support the Surface Pen. If not, then it still works as a pointing device, much like Windows allows with pen touch input when it isn't in a canvas configured by an app.

    But the other fundamental here is that if I haven't bought a Pencil, I should not be blocked from functionality that otherwise works. I should be able to draw in Paper. I should be able to draw in Notes, etc.
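
     To make that either/or point concrete, a minimal Swift sketch (made-up names, purely illustrative): both finger and Pencil draw, so nobody without a Pencil is locked out, but Pencil strokes get the pressure response.

     import UIKit

     // Minimal sketch: finger and Pencil both draw, but only Pencil strokes
     // use pressure. InclusiveCanvasView is a made-up name for illustration.
     class InclusiveCanvasView: UIView {
         override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
             guard let touch = touches.first else { return }
             let width: CGFloat
             if touch.type == .stylus {
                 // Pencil: vary stroke width with pressure.
                 width = 1 + 9 * (touch.force / touch.maximumPossibleForce)
             } else {
                 // Finger: fixed-width stroke, so the app still works without a Pencil.
                 width = 3
             }
             print("stroke point \(touch.location(in: self)), width \(width)")
         }
     }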
     
  23. Jal217 macrumors regular

    Jal217

    Joined:
    Oct 28, 2015
    #23
     There is absolutely no hardware limitation. The Apple Pencil is not recognized by touch at all (as someone else previously stated). Each app has two sets of input that it can receive now, one by touch and one by Apple Pencil communication. All apps have to accept both, and by default they accept both as the same thing. The developer can then go and change their code so the Apple Pencil does separate things from a finger. However, most apps don't want to force people to use the Apple Pencil, so the only difference they implement is pressure sensitivity and tilt. This includes Apple with its Notes app, because they know 90 percent of users don't use the Apple Pencil.
     
  24. xraydoc macrumors demi-god

    xraydoc

    Joined:
    Oct 9, 2005
    Location:
    192.168.1.1
    #24
     OneNote and GoodNotes are two apps that come to mind that won't draw on the screen in response to finger touches.
    I'm presuming this is up to the developer to implement or not.
     
  25. David58117 thread starter macrumors 65816

    Joined:
    Jan 24, 2013
    #25
     The apps I used on the Surface often had a "finger drawing" option.

     I also think the pen on the Surface would just be used as a stylus in apps that didn't explicitly support it. Pressing the buttons on the pen would act as left/right click.
     
