NSFWDetector is a small (17 kB) CoreML model that scans images for nudity. It was trained using CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on distinguishing Instagram-model-style pictures from porn.
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        break
    }
})
If you want to enforce stricter boundaries on your platform, simply apply a lower confidence threshold.
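For example, a sketch using the same check(image:completion:) call as above, with an illustrative stricter cutoff of 0.7 instead of 0.9:

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        // 0.7 is only an example value; pick the threshold that fits your platform's rules.
        if confidence > 0.7 {
            // treat the image as NSFW
        } else {
            // treat the image as appropriate
        }
    default:
        break
    }
})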
NSFWDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:
pod 'NSFWDetector'
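A minimal Podfile might look like this (the target name "MyApp" is just a placeholder; the iOS 12 deployment target matches the #available check above):

platform :ios, '12.0'
use_frameworks!

target 'MyApp' do
  pod 'NSFWDetector'
end

After editing the Podfile, run pod install and open the generated workspace.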
The Machine Learning model is only 17 kB in size, so app size won't be affected, compared to other libraries that use the Yahoo model.
If you don't want to use the Detection Code, you can also just download the MLModel file directly from the latest Release.
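As a rough sketch of using the model file without the detection code, assuming the downloaded model has been added to your app target (the resource name "NSFW" and the classification label "NSFW" are assumptions; check the actual file and label names):

import CoreML
import Vision

@available(iOS 12.0, *)
func printNSFWConfidence(for cgImage: CGImage) {
    // Load the bundled, compiled Core ML model and wrap it for Vision.
    guard let modelURL = Bundle.main.url(forResource: "NSFW", withExtension: "mlmodelc"),
          let mlModel = try? MLModel(contentsOf: modelURL),
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // The model is an image classifier, so results are classification observations.
        let observations = request.results as? [VNClassificationObservation] ?? []
        let confidence = observations.first(where: { $0.identifier == "NSFW" })?.confidence ?? 0
        print("NSFW confidence: \(confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}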
If you recognize issues with certain kinds of pictures, feel free to reach out via Mail or Twitter.
Michael Berg, [email protected]
NSFWDetector is available under the BSD license. See the LICENSE file for more info.