NSFWDetector scans images for nudity. It was trained with CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling Instagram-model-like pictures and porn apart.
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        break
    }
})
If you want to enforce stricter boundaries for your platform, just apply a lower threshold for the confidence.
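For example, a stricter policy could look like the sketch below (the 0.7 cutoff is just a placeholder, pick whatever fits your platform):

detector.check(image: image, completion: { result in
    // 0.7 instead of 0.9: more images get treated as NSFW.
    if case let .success(nsfwConfidence: confidence) = result, confidence > 0.7 {
        // Hide, blur or flag the image.
    }
})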
NSFWDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:
pod 'NSFWDetector', :git => 'https://github.com/lovoo/NSFWDetector.git'
If you don't want to use the Pod, just download the MLModel file from the latest Release and add it to your project.
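Without the Pod you also don't get the detector class, so you drive the model through Vision yourself. The following is only a rough sketch: the generated class name (NSFW), the function name and the label names are assumptions, not part of the library.

import UIKit
import CoreML
import Vision

// Sketch only: "NSFW" is the class Xcode generates from the downloaded
// .mlmodel file once it is added to the project.
@available(iOS 12.0, *)
func checkNSFW(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: NSFW().model) else {
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // One VNClassificationObservation per label (e.g. "NSFW" / "SFW"),
        // highest confidence first.
        guard let best = request.results?.first as? VNClassificationObservation else {
            return
        }
        print(best.identifier, best.confidence)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}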
Michael Berg, [email protected]
NSFWDetector is available under the BSD license. See the LICENSE file for more info.