From a technical perspective, this would not work. A simple hash scan would be ineffective: if you used a cryptographic hash, changing a single pixel would break the match. Instead you would need to convert each image into a feature vector (a perceptual hash or ML embedding), make the comparison robust to rotation and scaling, and match that vector against a very large database. That kind of approximate matching would produce a very large number of false positives. For every positive, the original, unvectorised image would have to be uploaded and compared by a human against the suspected child porn image. Since most of those false positives would be very intimate images, this would be an extreme violation of privacy.
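To illustrate the first point, here is a minimal sketch in plain Python (my own toy example, not any vendor's actual system). It compares an exact cryptographic hash with a crude 8x8 average hash standing in for a real perceptual hash or embedding: a one-level change to a single pixel produces a completely different SHA-256 digest, while the perceptual hash has to be matched approximately by Hamming distance, which is exactly where the false-positive problem comes from.

```python
import hashlib

def crypto_hash(pixels):
    """Cryptographic hash: any single-pixel change yields a totally different digest."""
    flat = bytes(v for row in pixels for v in row)
    return hashlib.sha256(flat).hexdigest()

def average_hash(pixels, size=8):
    """Toy perceptual hash: downsample to size x size blocks, threshold against the mean.
    Small edits barely move the bits, so matching must use a distance threshold,
    not equality, and unrelated images can land within that threshold."""
    h, w = len(pixels), len(pixels[0])
    down = []
    for by in range(size):
        for bx in range(size):
            ys = range(by * h // size, (by + 1) * h // size)
            xs = range(bx * w // size, (bx + 1) * w // size)
            block = [pixels[y][x] for y in ys for x in xs]
            down.append(sum(block) / len(block))
    mean = sum(down) / len(down)
    return [1 if v > mean else 0 for v in down]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Synthetic 64x64 grayscale "image" (values 0-255)
img = [[(x + y) * 2 % 256 for x in range(64)] for y in range(64)]
edited = [row[:] for row in img]
edited[10][10] = (edited[10][10] + 1) % 256   # change a single pixel by one level

print(crypto_hash(img) == crypto_hash(edited))           # False: exact hash broken
print(hamming(average_hash(img), average_hash(edited)))  # 0 or near 0: still "matches"
```

Real systems would use far more robust features (rotation- and scale-invariant, ML-derived), but the trade-off is the same: the looser the match, the more legitimate private photos get flagged for human review.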