It wouldn’t be a “decrease accuracy” thing; it would be something more akin to raising the allowable dead-pixel count on a display.
Yep. Think of it this way. Say the dot-projector spec was 30,000 dots with a tolerance of +any / −100, i.e., a 29,900-dot minimum. At that minimum the manufacturing yields sucked: only 20% of parts passed. So Apple asks the manufacturers what the parts are actually coming out like, and they say that dropping the minimum to 29,000 would raise the acceptance rate to 80%.
Then Apple decides: well, even at 25,000 dots our data showed we were still getting an acceptable rate of false negatives and false positives, so we’ll drop the minimum to 29,000. All they’ve done at that point is eat into their degradation margin, not accuracy.
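To make the arithmetic concrete, here’s a minimal sketch. It assumes the part-to-part dot count is roughly normally distributed; the mean and sigma are hypothetical values picked so the two cutoffs land near the 20% and 80% yields described above (real numbers would come from measured fab data):

```python
from statistics import NormalDist

# Hypothetical process: emitted-dot count per part modeled as roughly
# normal. mu/sigma are illustrative assumptions, not Apple's real data.
dots = NormalDist(mu=29_450, sigma=535)

def yield_at(min_dots: int) -> float:
    """Fraction of parts emitting at least `min_dots` dots."""
    return 1.0 - dots.cdf(min_dots)

for spec_min in (29_900, 29_000):
    print(f"min spec {spec_min:>6,} dots -> {yield_at(spec_min):5.1%} yield")
```

Under those assumptions it prints roughly 20% yield at the 29,900 minimum and 80% at 29,000: same parts, same process, just a different line for what counts as a reject.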
Strawman. Your example assumes they had to loosen the tolerance by 33%; nowhere was it suggested that the tolerance changed that much. They could have loosened it by 5% and still dramatically increased the parts-acceptance rate.
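That sensitivity is easy to show. Using the same hypothetical distribution as the sketch above, sweeping the minimum spec by just a few percent moves yield enormously, because the cutoff sits on the steep shoulder of the distribution:

```python
from statistics import NormalDist

# Same hypothetical dot-count distribution as the earlier sketch.
dots = NormalDist(mu=29_450, sigma=535)

# Loosen the 29,900-dot minimum by small percentages and watch the yield.
for pct in (0, 1, 2, 3, 5):
    spec_min = round(29_900 * (1 - pct / 100))
    yield_frac = 1.0 - dots.cdf(spec_min)
    print(f"loosen min spec {pct}% -> {spec_min:,} dots, {yield_frac:5.1%} yield")
```

With these assumed numbers, a 3% loosening of the cutoff takes yield from about 20% to about 80%, and 5% pushes it past 97%. Nothing close to a 33% change is needed.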