I highly encourage anyone who hasn't actually read the article to go back and do so. Aside from the manufacturer's claims seeming outright fraudulent, the reporter brought up a good point from a CMU scientist.
Dave Touretzky, a senior research scientist in the computer science department
at Carnegie Mellon University, doubts Exotrope's claims.
"How do you tell the difference between a woman in a bikini in a sailboat which is
not racy and a naked woman in a sailboat?" Touretzky asks. "The only difference
is a couple of nipples and a patch of pubic hair. You're not going to be able to find
that with a neural network."
"If they don't disclose the training data, there's no way to figure out what's going
on," Touretzky says. "But anyone who knows anything about neural networks
knows there's no way it can do what they're claiming."
I'm not exactly a proponent of pr0n, but it seems to me that the whole idea of a neural network judging whether or not images are 'obscene' or 'adult' in nature is laughable. You can't even get two randomly selected people to agree on what obscenity (or pornography) actually is.
You can't do it on skin tones alone. Adult cartoons will slip right past, and many innocent pictures will get canned. I could be wrong, but to give this even a shot at working, it seems to me you would need a Big Blue-class machine doing realtime pattern recognition against hundreds of thousands, if not millions, of patterns.
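To see why skin tones alone fall apart, here's a toy sketch of that approach. This is purely hypothetical, not anything Exotrope has disclosed; the RGB range and the 40% threshold are my own made-up numbers, just to show how such a filter flags a beach photo while waving a line-art cartoon through:

```python
# Naive skin-tone filter: flag an image if too many pixels fall in a
# rough "skin-colored" RGB range. Hypothetical sketch only -- the color
# range and threshold below are arbitrary assumptions.

def looks_like_skin(r, g, b):
    # Crude heuristic for skin-like RGB values (an assumption, not a standard).
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def flag_image(pixels, threshold=0.4):
    """pixels: list of (r, g, b) tuples; flag if skin ratio exceeds threshold."""
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) > threshold

# A beach photo -- lots of exposed (but innocent) skin -- gets flagged:
beach = [(210, 150, 120)] * 70 + [(30, 90, 200)] * 30
print(flag_image(beach))    # True: a false positive

# An adult cartoon drawn in black-and-white line art slips right past:
cartoon = [(255, 255, 255)] * 90 + [(0, 0, 0)] * 10
print(flag_image(cartoon))  # False: a false negative
```

Both failure modes here are exactly the ones Touretzky is getting at: the pixels carry almost no information about whether the content is obscene.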