So it is able to classify an image as porn and also classify an image as containing children. Seems like it should be able to apply an AND operation to those two results and identify new images that are not part of the training set.
The AI doesn’t even need to apply the AND. Two AI queries. Then AND the results with one non-AI operation.
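A minimal sketch of what that could look like, assuming each model returns a confidence score (the function name, scores, and threshold here are all made up for illustration):

```python
# Hypothetical: two independent classifier scores, combined with a plain
# boolean AND outside the models. Scores and threshold are invented.

def flag_image(porn_score: float, child_score: float,
               threshold: float = 0.9) -> bool:
    """Flag only when BOTH classifiers are independently confident."""
    is_porn = porn_score >= threshold
    contains_child = child_score >= threshold
    return is_porn and contains_child  # the non-AI AND step

# With made-up scores:
flag_image(0.95, 0.97)  # both confident -> flagged
flag_image(0.95, 0.40)  # only one confident -> not flagged
```

The AND itself is trivial; the hard part, as the reply below this notes, is whether the two scores mean anything outside the training distribution.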
No, it found elements in an image that it tends to find in images labelled porn in the training data, and elements that it tends to find in images labelled child in the training data. If the training data is not representative, then the statistical inference is meaningless. Images unlike any in the training set may not trigger either category if they lack the features the AI expects to find, and those features may be quite irrelevant to what humans care about.