NSFWDetector


NSFWDetector is able to scan images for nudity. It was trained using CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling Instagram-model-style pictures apart from porn.

Usage

import NSFWDetector

// The detector requires iOS 12.
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        // Detection was not successful, e.g. the image could not be processed.
        break
    }
})

If you want to enforce stricter boundaries for your platform, just apply a lower threshold for the confidence.
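As a small sketch of what that could look like (the 0.7 threshold is only an illustrative value, not a recommendation; tune it against your own content):

// A stricter platform might already flag content above 0.7 instead of 0.9.
detector.check(image: image, completion: { result in
    if case let .success(nsfwConfidence: confidence) = result, confidence > 0.7 {
        // Treat as NSFW, e.g. hide the image or flag it for review.
    }
})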

Installation

NSFWDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:

pod 'NSFWDetector', :git => 'https://github.com/lovoo/NSFWDetector.git'

App Size

The machine learning model is only 17 kB, so app size is barely affected, in contrast to other libraries that use the Yahoo model.

Using just the Model

If you don't want to use the Detection Code, you can also just download the MLModel file directly from the latest Release.
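As a rough sketch of how the raw model can be run through Vision (this is not the library's own code; the generated class name NSFW and the label "NSFW" are assumptions, so check the model file and its class labels from the Release):

import CoreML
import Vision

@available(iOS 12.0, *)
func nsfwConfidence(for cgImage: CGImage, completion: @escaping (Float) -> Void) {
    // Wrap the Core ML model generated from the downloaded .mlmodel file.
    guard let visionModel = try? VNCoreMLModel(for: NSFW().model) else {
        completion(0)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        // Pick the confidence the classifier assigned to the NSFW label.
        let confidence = observations.first { $0.identifier == "NSFW" }?.confidence ?? 0
        completion(confidence)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}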

Author

Michael Berg, [email protected]

License

NSFWDetector is available under the BSD license. See the LICENSE file for more info.