WOPLivenessDetect is a third-party liveness detection library for iOS. It provides basic face feature detection and requires users to perform a few actions to prove that they are real people.
WOPLivenessDetect checks for the following actions and properties:
- Only one face
- The face is in the specified display view
- Smile
- Open mouth
- Shake the head left and right
To run the example project, clone the repo, and run `pod install` from the Example directory first.
WOPLivenessDetect works on iOS 10.0+ and depends on Google MLKit and AVFoundation.
WOPLivenessDetect is available through CocoaPods. To install it, simply add the following line to your Podfile:
pod 'WOPLivenessDetect'
Using WOPLivenessDetect is easy: just initialize and present WOPLivenessDetectViewController, and all of the detection is completed inside it. A block or delegate callback delivers the result photos, so the caller can run its own face-comparison logic.
You can use a block to receive a callback on success:
WOPLivenessDetectViewController *detectVC = [WOPLivenessDetectViewController new];
detectVC.livenessDetectSuccessBlock = ^(NSArray<UIImage *> * _Nonnull imageArray) {
    // Run your own face-comparison logic on the captured images here.
};
detectVC.modalPresentationStyle = UIModalPresentationOverFullScreen;
[self presentViewController:detectVC animated:YES completion:nil];
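Because detection is driven by the device camera through AVFoundation, the host app should hold camera permission before presenting the controller (and must declare NSCameraUsageDescription in its Info.plist). Below is a minimal sketch using the standard AVFoundation authorization API; the method name `presentDetectorWithCameraPermission` is just illustrative.

```objc
#import <AVFoundation/AVFoundation.h>

// Ask for camera access before presenting the liveness detector.
- (void)presentDetectorWithCameraPermission {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL granted) {
        // The handler may run on an arbitrary queue; hop back to main for UI work.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                WOPLivenessDetectViewController *detectVC = [WOPLivenessDetectViewController new];
                detectVC.modalPresentationStyle = UIModalPresentationOverFullScreen;
                [self presentViewController:detectVC animated:YES completion:nil];
            } else {
                // Camera access denied: direct the user to Settings.
            }
        });
    }];
}
```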
If you prefer a delegate instead of a block, implement the detect-success callback:
/** Liveness detection succeeded */
- (void)livenessDetectSuccessWithAllTaskImage:(NSArray<UIImage *> *)imageArray;
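A minimal delegate-based sketch follows; note that the `WOPLivenessDetectDelegate` protocol and `delegate` property names are assumptions, so verify the exact identifiers against the pod's headers.

```objc
// Minimal sketch; `WOPLivenessDetectDelegate` and the `delegate`
// property are assumed names — check the pod's headers.
#import "WOPLivenessDetectViewController.h"

@interface MyViewController () <WOPLivenessDetectDelegate>
@end

@implementation MyViewController

- (void)startLivenessDetection {
    WOPLivenessDetectViewController *detectVC = [WOPLivenessDetectViewController new];
    detectVC.delegate = self; // assumed property name
    detectVC.modalPresentationStyle = UIModalPresentationOverFullScreen;
    [self presentViewController:detectVC animated:YES completion:nil];
}

/** Liveness detection succeeded */
- (void)livenessDetectSuccessWithAllTaskImage:(NSArray<UIImage *> *)imageArray {
    // imageArray holds one photo per completed task; run face comparison here.
    NSLog(@"Liveness passed with %lu images", (unsigned long)imageArray.count);
}

@end
```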
Author: Jack, [email protected]
WOPLivenessDetect is available under the MIT license. See the LICENSE file for more info.