The Apple review wouldn't need to make that determination. Any positive matches would be against images in the NCMEC database, which are already catalogued and whose subjects are known to be underage.
False positive matches are purely coincidental. They are no more likely to happen to porn-like images than to any other kind of image; they are chance mathematical collisions with completely random images. (Apple claims its testing shows a less than 1-in-a-trillion chance of this happening to you, per year. Across something like a billion accounts, that works out to roughly one occurrence to somebody, somewhere, every 1,000 years.)
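To sanity-check that "once every 1,000 years" figure, here's a quick back-of-the-envelope calculation. The account count of one billion is my assumption, not Apple's number; the per-account rate is the 1-in-a-trillion figure from the claim above.

```python
# Rough check of the "one false flag every 1,000 years" estimate.
# ASSUMPTION: ~1 billion active accounts (not an Apple-published figure).
accounts = 1e9

# Apple's claimed false-flag rate: 1 in a trillion per account, per year.
rate_per_account_per_year = 1e-12

# Expected false flags across all accounts in one year.
expected_events_per_year = accounts * rate_per_account_per_year  # ~0.001

# Average gap between false flags, in years.
years_between_events = 1 / expected_events_per_year  # ~1,000

print(f"Expected false flags per year: {expected_events_per_year:g}")
print(f"Roughly one false flag every {years_between_events:,.0f} years")
```

So under these assumptions the system-wide expectation is about one false flag per millennium, which is where the figure in the previous paragraph comes from.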
Because of this, the Apple reviewer has almost nothing to do. They would only need to glance at the collection of thumbnails and decide whether the collection looks suspect, or whether it's that one-in-1,000-years event and looks completely random.
It's possible that when this event happens, in the year 2785 or so, the false positive collection could coincidentally contain legal adult porn and get passed to NCMEC as 'suspect' even though it's legal. But that would be an incredibly unlucky coincidence. Even if that did happen, NCMEC would get the report, check the thumbnails against the real images on record, and immediately realise they aren't a match. I suppose they might still want to investigate whether the subjects are underage, but because they would only have the low-res derivative to work from, it's probably not going to be productive.