
Atomic Walrus
Original poster
Update: I've found the source of this error in the TouchCanvas example code. See post #20 for details. Original post unedited below.

------------

Got my Pencil today, been spending a lot of time with it. Generally impressed, but there's a major issue right now that wasn't visible on the in-store demos because of the software Apple provided.

Basically, on a certain percentage of strokes the first data point that comes in has no pressure data and receives a placeholder value of 50%.

Might be related to latency of BT vs the digitizer (first contact event is detected before any pressure comes in). May be related to the digitizer recognizing Pencil input just a hair above the actual screen, which would result in a touch event that has no pressure data associated with it. (I would make a video of this second thing happening, but I'm not sure it would be possible to tell on video).

Reproduction:
In Procreate, create a brush which has no dynamics aside from Pencil pressure size variation. No speed, tilt, taper, etc. effects to mess up the results. Make sure it has a wide pressure range. Draw a series of quick strokes with light pressure. Some of them will be way fatter than the others to a degree that could not be an accidental application of pressure.

What's going on? Watch the video (slow motion on the second row) and ignore the fact that it's in portrait:
I want to reiterate that those fat lines are absolutely not the product of applying significantly more pressure. I tested this extensively. I also want to make it clear that pressure sensitivity once you've started a stroke is excellent (link)

3 cases:
-Works fine, everything is normal.
-An initial large circle appears because of the missing pressure data, but is corrected. No problem.
-The correction either doesn't happen, or comes in late and the brush engine begins interpolation from that initial (wrong) pressure data towards the real (much lower) values that are coming in. Part of this could be resolved if Procreate just got rid of the pressure smoothing, but that's not the point.

Here's a different case: Notability doesn't ever correct the initial bad pressure data, but it also doesn't seem to use any pressure smoothing so we get to see the error unfiltered.

[Screenshot: Notability strokes showing the blown-out starting points]

Notice how the error value isn't 0% or 100% pressure, but some value in between.

The only app out there right now that doesn't display this issue is Notes, but I've seen it perform a correction to fix it. It's much less common, but I've spotted a dark pencil line replaced with a lighter one.

So... that's quite a thing. I don't want to make any conclusions yet. It may be as simple as other devs needing to fix their prediction/correction implementations so they work like Notes. I was actually hesitant to post this right now, but getting these things fixed requires visibility.
 
Before we all start shouting 'Pencil gate', it would be good to see this replicated on another pencil/iPP i.e. to be sure there isn't a fault with your hardware.
 
Before we all start shouting 'Pencil gate', it would be good to see this replicated on another pencil/iPP i.e. to be sure there isn't a fault with your hardware.

Already done; it was reported by another user on the Procreate forums and is being investigated by the Procreate devs right now.

*Edit: I'm trying to get footage of Notes correcting this error and will post it here when I do.
 
Got my Pencil today, been spending a lot of time with it. Generally impressed, but there's a major issue right now that wasn't visible on the in-store demos because of the software Apple provided.

Seems that if it's not visible in Apple's software, then it's probably a Procreate issue rather than a major issue on Apple's part, right?
 
Already done; it was reported by another user on the Procreate forums and is being investigated by the Procreate devs right now.

*Edit: I'm trying to get footage of Notes correcting this error and will post it here when I do.

Yeah, part of how things work to reduce latency is that the UITouch may tell the app developer that it isn't a "final" pressure value. At that point, the app is supposed to register for an update notification that will deliver the final pressure and let you apply it retroactively; all touches are given unique identifiers (effectively time stamps) so that predicted data (predicted locations or predicted pressure) can be matched up and corrected later. So what you describe in the first post is definitely a sign of an app not listening for those updates.
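For anyone who wants to see what that looks like in code, here's a rough Swift sketch of the idea (not the TouchCanvas sample; StrokePoint, stroke, and pendingUpdates are made-up names). The touch handler records which points are still waiting on a real force value, and touchesEstimatedPropertiesUpdated swaps the placeholder out when the correction arrives:

Code:
import UIKit

struct StrokePoint {
    var location: CGPoint
    var force: CGFloat
    var expectsForceUpdate: Bool
}

class PencilCanvasView: UIView {
    private var stroke: [StrokePoint] = []
    // Maps UIKit's estimationUpdateIndex to the point a later correction should fix up.
    private var pendingUpdates: [NSNumber: Int] = [:]

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let expectsForce = touch.estimatedPropertiesExpectingUpdates.contains(.force)
        stroke.append(StrokePoint(location: touch.location(in: self),
                                  force: touch.force,   // may still be an estimated placeholder
                                  expectsForceUpdate: expectsForce))
        if expectsForce, let index = touch.estimationUpdateIndex {
            pendingUpdates[index] = stroke.count - 1
        }
        setNeedsDisplay()
    }

    // UIKit calls this when the real (final) values arrive.
    override func touchesEstimatedPropertiesUpdated(_ touches: Set<UITouch>) {
        for touch in touches {
            guard let index = touch.estimationUpdateIndex,
                  let position = pendingUpdates.removeValue(forKey: index) else { continue }
            // Replace the placeholder force with the corrected value and redraw.
            stroke[position].force = touch.force
            stroke[position].expectsForceUpdate = false
        }
        setNeedsDisplay()
    }
}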
 
My Pencil arrives today. Note to self: "Stay away from all the Pencil issues threads."

There is no issue. I've used Notes and the accuracy is the same as using a pen and paper. I have not had or seen an issue. I really don't understand what this thread is about.
 
Thanks for reporting in such detail on these issues! The blown-out ends on some strokes look very similar to the glitchy strokes the Pencil produces in Astropad (posted by me in another thread), so I'm inclined to think there may be some issues with how the Pencil's drivers handle strokes predictively. I've also noticed other problems, such as an inability to taper lines effectively (some apps will fake it on a long enough stroke, but a short flicked stroke will always come out as a dead-weight line).

I'm disappointed because the hype made it sound like the Pencil was definitively without peer and could easily stand up to the performance of a Cintiq; it's definitely great in some areas, but it's far from a flawless experience. I've already boxed up my Pencil again and will more than likely be returning it within the next few days. Super bummed. :( Hope you're able to find some solution to these issues for your own needs!
 
Seems that if it's not visible in Apple's software, then it's probably a Procreate issue rather than a major issue on Apple's part, right?

Notes is the ONLY app that doesn't show this issue, so I don't think it's fair to call it a Procreate issue. Notes resolves it by "correcting" the error out (it appears and is quickly erased) so hopefully it's something in the API that other developers will be able to fix... but so far none of them have.

Yeah, part of how things work to reduce latency is that the UITouch may tell the app developer that it isn't a "final" pressure value. At that point, the app is supposed to register for an update notification that will deliver the final pressure and let you apply it retroactively; all touches are given unique identifiers (effectively time stamps) so that predicted data (predicted locations or predicted pressure) can be matched up and corrected later. So what you describe in the first post is definitely a sign of an app not listening for those updates.

Thanks for the details on this. I especially wondered about the time stamp. What I think is happening with Procreate is that even though it is taking the corrected values (we see some of the larger spots disappear in the slow-motion video), it's also doing some interpolation on the pressure values, and when it gets a correction it doesn't go back and redo the other values that propagated the original bad data.

The problem is that I believe a lot of apps will be doing something like this since developers are not used to seeing this kind of predictive input system.

There is no issue. I've used Notes and the accuracy is the same as using a pen and paper. I have not had or seen an issue. I really don't understand what this thread is about.

The fact that you don't understand it doesn't mean it isn't real. If you're only using Notes then you won't see the issue, but every other app has this pressure sensitivity glitch, most importantly the primary art app (Procreate). It ruins pressure output in strokes, which isn't a small problem.

I want to reiterate that I believe this is an issue with developers not knowing how to deal with the predictive API, not a hardware fault with the iPP itself. There's no reason to return your iPP/Pencil over this, this discussion is primarily for app developers.
 
Notes is the ONLY app that doesn't show this issue, so I don't think it's fair to call it a Procreate issue. Notes resolves it by "correcting" the error out (it appears and is quickly erased) so hopefully it's something in the API that other developers will be able to fix... but so far none of them have.

It totally is. See my earlier post for why. (Disclosure: I am a developer. I have worked with the APIs)
 
It totally is. See my earlier post for why. (Disclosure: I am a developer. I have worked with the APIs)

In that case I want to ask you about how you'd handle the situation Procreate's dealing with.

They perform some smoothing on decreasing pressure data to create a taper effect when you reduce pressure suddenly, which I believe is what's causing them trouble. So, for example, if their brush engine gets a 100% pressure value followed immediately by twenty 10% values, it will smooth out that jump over, say, the first ten of the smaller values by increasing their size.

Now let's say a correction comes in for the initial 100% value after the second input prediction has occurred. You change its size, but the original data has already propagated down the rest of the stroke. It seems like you'd have no choice but to completely re-render the stroke using the corrected data, right?

If that's the case, then they should probably just disable this smoothing effect entirely. The pressure sensitivity on the Pencil is impressively fine already; I can't see it being necessary.
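To illustrate what I mean (a toy example only, nothing from Procreate's actual code), a basic smoother shows how one bad first sample bleeds into everything that follows:

Code:
import Foundation

// One wrong 100% sample followed by twenty genuine 10% samples.
func smoothed(_ pressures: [Double], factor: Double = 0.5) -> [Double] {
    var result: [Double] = []
    var previous = pressures.first ?? 0
    for p in pressures {
        previous = previous * factor + p * (1 - factor)
        result.append(previous)
    }
    return result
}

let raw = [1.0] + Array(repeating: 0.1, count: 20)
print(smoothed(raw).prefix(6).map { String(format: "%.2f", $0) })
// The first several smoothed values sit well above 0.10, which is the fat
// start you see on the strokes, even though only one input sample was wrong.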

-----

Want to add that I'm sorry if this discussion comes off as trying to start an echo-chamber storm, a ___gate, or whatever. I'm very impressed with the Pencil and I want to get this solved before the bulk of users start drawing with it. I'm changing the thread title a bit to make it less inflammatory.
 
In that case I want to ask you about how you'd handle the situation Procreate's dealing with.

They perform some smoothing on decreasing pressure data to create a taper effect when you reduce pressure suddenly, which I believe is what's causing them trouble. So, for example, if their brush engine gets a 100% pressure value followed immediately by twenty 10% values, it will smooth out that jump over, say, the first ten of the smaller values by increasing their size.

Now let's say a correction comes in for the initial 100% value after the second input prediction has occurred. You change its size, but the original data has already propagated down the rest of the stroke. It seems like you'd have no choice but to completely re-render the stroke using the corrected data, right?

If that's the case, then they should probably just disable this smoothing effect entirely. The pressure sensitivity on the Pencil is impressively fine already; I can't see it being necessary.

So, I will say that I haven't been the one working directly with our own "brush engine" as you refer to it (we use a different name internally). So deciding what to redraw is actually not something I've thought about directly. Instead, I've been working more on the integration between the incoming events and the engine. The engine decides what to redraw based on what new data it gets.

But here's the thing: corrected inputs from predictions are now something you should constantly be considering with the Pencil. You don't get touches one at a time, and you are constantly being fed predictions for latency reasons. You get events like so:

Touch Event [Touches 1-4, Predicted Touches 5-8] -> Touch Event [Touches 5-8, Predicted Touches 9-12] -> Etc

So, to cut down on latency, you really should be using those predicted touches, as that buys you 1/60th of a second of latency (almost 17ms). Now, interspersed with these will be the updated force information. Some of the real touches will have real force data, some won't, so you need to be listening to that. And all these updates from prediction to actual come with timestamps so you can know what input data needs to be refreshed when those updates do appear.
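To make that concrete, here's a minimal sketch of draining the coalesced and predicted touches each event (not from any shipping app; SketchCanvas and its two point arrays are made-up):

Code:
import UIKit

class SketchCanvas: UIView {
    private var committedPoints: [CGPoint] = []
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }

        // Predictions from the previous event were provisional; replace them every frame.
        predictedPoints.removeAll()

        // Coalesced touches: all the real samples delivered since the last event.
        for coalesced in event.coalescedTouches(for: touch) ?? [touch] {
            committedPoints.append(coalesced.location(in: self))
        }

        // Predicted touches: UIKit's guess at the next few samples, drawn now to
        // hide roughly a frame (~17 ms) of latency.
        for predicted in event.predictedTouches(for: touch) ?? [] {
            predictedPoints.append(predicted.location(in: self))
        }

        setNeedsDisplay()
    }
}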

If you know how your smoothing works as a developer, you know which points in the stroke are affected by other points. I.e., if I smooth touches 1-3 as a group, then when I update touch 1, I need to redraw all three. I can very easily calculate the bounding box from that information. So, as long as smoothing is not applied along the entire stroke (which is really bad for performance reasons anyway), you can limit your redraws to just the affected portion of the stroke. And this sort of logic is already required if you want to use those predicted touches.
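Something like this, roughly (purely illustrative; the points array, brush radius, and fixed three-sample window are assumptions, not anyone's actual engine):

Code:
import UIKit

// Returns the region that must be redrawn when the point at `index` gets corrected:
// every point inside the smoothing window around it, expanded by the brush stamp.
func dirtyRect(forUpdatedPointAt index: Int,
               in points: [CGPoint],
               brushRadius: CGFloat,
               smoothingWindow: Int = 3) -> CGRect {
    guard points.indices.contains(index) else { return .null }
    let first = max(0, index - (smoothingWindow - 1))
    let last = min(points.count - 1, index + (smoothingWindow - 1))
    var rect = CGRect.null
    for point in points[first...last] {
        rect = rect.union(CGRect(x: point.x - brushRadius, y: point.y - brushRadius,
                                 width: brushRadius * 2, height: brushRadius * 2))
    }
    return rect
}

// Usage: setNeedsDisplay(dirtyRect(forUpdatedPointAt: i, in: strokePoints, brushRadius: 12))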

Turning off the smoothing, I think, is a bit of a mistake, because smoothing can't make good data any worse. It can only make bad data (like poor predictions left in place) look funny when you don't correct it.
 
Pardon the guesses from a layperson, but it sounds like Apple solved the active-stylus issues (versus EMR technology) with some seriously clever and powerful algorithms rather than strictly a hardware leap?

This gives me a small hope that MS might eventually be able to solve the N-trig limitations with software in the future. I sure hope all this stylus competition knocks Wacom off their high horse and we can actually start seeing some awesome styli from all companies.
 
Got my Pencil today, been spending a lot of time with it. Generally impressed, but there's a major issue right now that wasn't visible on the in-store demos because of the software Apple provided.

Basically, on a certain percentage of strokes the first data point that comes in has no pressure data and receives a placeholder value of 50%.

[...]

It looks like ink is leaking.
 
So, I will say that I haven't been the one working directly with our own "brush engine" as you refer to it (we use a different name internally). So deciding what to redraw is actually not something I've thought about directly. Instead, I've been working more on the integration between the incoming events and the engine. The engine decides what to redraw based on what new data it gets.

But here's the thing: corrected inputs from predictions are now something you should constantly be considering with the Pencil. You don't get touches one at a time, and you are constantly being fed predictions for latency reasons. You get events like so:

Touch Event [Touches 1-4, Predicted Touches 5-8] -> Touch Event [Touches 5-8, Predicted Touches 9-12] -> Etc

So, to cut down on latency, you really should be using those predicted touches, as that buys you 1/60th of a second of latency (almost 17ms). Now, interspersed with these will be the updated force information. Some of the real touches will have real force data, some won't, so you need to be listening to that. And all these updates from prediction to actual come with timestamps so you can know what input data needs to be refreshed when those updates do appear.

If you know how your smoothing works as a developer, you know which points in the stroke are affected by other points. I.e., if I smooth touches 1-3 as a group, then when I update touch 1, I need to redraw all three. I can very easily calculate the bounding box from that information. So, as long as smoothing is not applied along the entire stroke (which is really bad for performance reasons anyway), you can limit your redraws to just the affected portion of the stroke. And this sort of logic is already required if you want to use those predicted touches.

Turning off the smoothing, I think, is a bit of a mistake, because smoothing can't make good data any worse. It can only make bad data (like poor predictions left in place) look funny when you don't correct it.

Appreciate the detailed response. I can see that this is going to be a serious learning curve for developers, because you've basically just described multiplayer game engine networking (prediction, backstepping and repeating simulation ticks with corrected data, not always getting all the data in order or in one piece). In that situation you can always get out of a jam by just getting the latest data from the server and overriding the client simulation, because the path is less important as long as all moving objects end up in the right places and don't appear to behave erratically.

In this case the path itself is the goal so any corrections need to be processed in terms of their effects on the events that came after.

Let me ask you a now completely tangential question while you're here: You don't have the option to turn off vsync, right? Do you know if iOS is using something equivalent to triple buffering? I ask because that could easily be a few frames of lag and I can't see the screen tearing mattering when the only thing changing on screen is a single line being drawn.

Pardon the guesses from a layperson, but it sounds like Apple solved the active-stylus issues (versus EMR technology) with some seriously clever and powerful algorithms rather than strictly a hardware leap?

This gives me a small hope that MS might eventually be able to solve the N-trig limitations with software in the future. I sure hope all this stylus competition knocks Wacom off their high horse and we can actually start seeing some awesome styli from all companies.

It's definitely hard to say how much is software vs. hardware here. Apple's touch digitizers have always had very low latency compared to the competition, but the predictive input must have made a solid difference too.

I believe the linearity improvements point to a higher density digitizer, which is a hardware solution, but not really a hard one to implement (that would be more about the manufacturing costs).

Pressure curve quality is part software, part hardware. You can only do so much with data from a poor-quality sensor. Tilt is a similar thing (requires more sensors).

It's definitely a great time for digital artists though. Wacom will have to do something about their expensive, heavy, fat, overheating Cintiq Companion. MS is obviously interested in improving the N-trig tech so I expect to see big things in the SP5 too.

My Pencil arrives today. Note to self: "Stay away from all the Pencil issues threads."

Would it help if I added that I can't stop sketching on this thing regardless of the issue being discussed here?
 
Would it help if I added that I can't stop sketching on this thing regardless of the issue being discussed here?
And Marco Arment tweeted that the Pencil is "amazing". I only mention that because I wouldn't call Marco an Apple apologist by any means. He pissed all over the new MacBook earlier this year.
 
Appreciate the detailed response. I can see that this is going to be a serious learning curve for developers, because you've basically just described multiplayer game engine networking (prediction, backstepping and repeating simulation ticks with corrected data, not always getting all the data in order or in one piece). In that situation you can always get out of a jam by just getting the latest data from the server and overriding the client simulation, because the path is less important as long as all moving objects end up in the right places and don't appear to behave erratically.

In this case the path itself is the goal so any corrections need to be processed in terms of their effects on the events that came after.

I think the key difference is that Apple does what they can to make it very easy for developers to craft a solution. In general, you should be keeping the current stroke available with more details while the user is manipulating it, and then stripping down the data once the stroke is done and you can jettison things like the timestamps.

The other difference is that a stroke is simply a dataset, and the effects are VERY deterministic. Editing existing data isn't as big a deal as it would be in a larger simulation. Every stroke is isolated, and once it is on the page, you don't need to go mucking with it again.
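As a rough sketch of that "rich while live, stripped once committed" idea (LivePoint and CommittedPoint are made-up types, not anything from a real app):

Code:
import UIKit

struct LivePoint {
    var location: CGPoint
    var force: CGFloat
    var estimationUpdateIndex: NSNumber?   // kept so late corrections can find this point
    var expectsForceUpdate: Bool
}

struct CommittedPoint {
    let location: CGPoint
    let force: CGFloat
}

func commit(_ live: [LivePoint]) -> [CommittedPoint] {
    // Once no further updates can arrive, only the geometry needs to survive.
    return live.map { CommittedPoint(location: $0.location, force: $0.force) }
}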

Let me ask you a now completely tangential question while you're here: You don't have the option to turn off vsync, right? Do you know if iOS is using something equivalent to triple buffering? I ask because that could easily be a few frames of lag and I can't see the screen tearing mattering when the only thing changing on screen is a single line being drawn.

Hah, another area that I have an interest in, but haven't spent much time in, personally. No, vsync is always on with iOS. As for what buffering it uses? It depends on what part of the stack you are working with. If you are playing with OGL or Metal, I believe you have the option to turn on triple buffering at the expense of RAM. But if you are built on top of UIKit? I'm not sure. I'd honestly be surprised if they would use the RAM to pursue it. It would mean every app would be using more RAM just for a bit of latency.
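For what it's worth, the usual Metal pattern here is to keep up to three frames in flight, each with its own dynamic-data buffer, gated by a semaphore; that's exactly the RAM trade-off I mean. A sketch (maxFramesInFlight, the buffer size, and drawFrame are illustrative):

Code:
import Metal
import Dispatch

let maxFramesInFlight = 3
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let inFlight = DispatchSemaphore(value: maxFramesInFlight)
// One dynamic-data buffer per in-flight frame: the RAM cost of the technique.
let uniformBuffers = (0..<maxFramesInFlight).map { _ in
    device.makeBuffer(length: 256, options: .storageModeShared)!
}
var frameIndex = 0

func drawFrame() {
    inFlight.wait()                       // block if three frames are already queued
    let commandBuffer = queue.makeCommandBuffer()!
    let buffer = uniformBuffers[frameIndex]
    // ... write this frame's data into `buffer` and encode draw calls here ...
    commandBuffer.addCompletedHandler { _ in
        inFlight.signal()                 // GPU finished with this frame's buffer
    }
    commandBuffer.commit()
    frameIndex = (frameIndex + 1) % maxFramesInFlight
}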
 
*edit: Blah blah a lot of words...

This entire bug comes down to a SINGLE MISPLACED EXCLAMATION POINT (the NOT operator, in programming terms) breaking the whole prediction/correction system for the demo app.

Every developer almost certainly copied code from this demo app because there's almost zero other info on using the new touch data.

Basically, it was saying this:
If this point is expecting an update for [FORCE/POSITION/ETC] then DO NOT DO that update.
If this point is NOT expecting an update for [FORCE/POSITION/ETC] then DO that update.

Remove the !, and the problem is solved; it even fixed the issue in the example app where points on the line never changed from orange (update pending) to green (update completed, data from coalesced source).

Not only is this a software issue, it's one with an incredibly simple solution that was propagated by reckless copy-paste. The best way to mess up a room full of programmers is to give them an example app with an existing bug, because we'll happily copy it and assume it works.

The line in question is in TouchCanvas's Line.swift, in LinePoint's updateWithTouch function.
Code:
// Original (buggy): skips the update exactly when this property is still expecting one.
guard !estimatedPropertiesExpectingUpdates.contains(expectedProperty) else { continue }
should be
Code:
// Fixed: skip only when the property is NOT expecting an update.
guard estimatedPropertiesExpectingUpdates.contains(expectedProperty) else { continue }

Someone else, spread the word; no one likes to listen to me because of my long-winded tech rants, even when I finally get to the answer. :p
 
Every developer almost certainly copied code from this demo app because there's almost zero other info on using the new touch data.

Interesting (but not surprising) that sample code has a bug in it. I can see why this propagated when there was no good way to test it until you got a Pencil. I will say that not every developer copied the code. I expect those lucky enough to get access to build the stage demos got better support, before that code even existed.
 
@AtomicWalrus -- pretty astute observation! I wonder what would be the best way to disseminate this to the devs.

Can't believe it was just a "notting" issue. It's like everyone's copying bad StackOverflow code! Not that I've eeeever done the same. ;)
 
Can't believe it was just a "notting" issue. It's like everyone's copying bad StackOverflow code! Not that I've eeeever done the same. ;)

I'd say you really shouldn't be submitting a feature to the public without testing it first (even if you are copying bad StackOverflow code)... but there really isn't a good way to meet both the clamoring user base who wants day one support, and also test with hardware Apple won't share with you before release. I think you are screwed no matter what you do. Ship on day one, get hit with bugs like this. Ship later so you can validate it, and people ask why you aren't out on the App Store with your update yet.
 