Using the Flutter Liveness Detection and Scanning Plugin flutter_accurascan_fm_liveness
Introduction
Accura Scan OCR performs optical character recognition.
Accura Scan Face Match matches two face images, a source face and a target face: it compares the user's selfie against the user's photo in a document.
Accura Scan Authentication is used for customer verification and authentication, unlocking a user's real identity through 3D selfie technology.
The following steps integrate the Accura Scan SDK into your project.
Notes
Add the flutter_accurascan_fm_liveness dependency to your pubspec.yaml file:
dependencies:
  flutter_accurascan_fm_liveness: ^latest_version # replace with the released version
Import the Flutter library:
import 'package:flutter_accurascan_fm_liveness/flutter_accurascan_fm_liveness.dart';
1. Android Setup
Add the following permissions to your AndroidManifest.xml file:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
Add the repository at the end of the root-level build.gradle file:
allprojects {
repositories {
google()
jcenter()
maven {
url 'https://jitpack.io'
credentials { username 'jp_ssguccab6c5ge2l4jitaj92ek2' }
}
}
}
Add packaging options in the app/build.gradle file:
packagingOptions {
pickFirst 'lib/arm64-v8a/libcrypto.so'
pickFirst 'lib/arm64-v8a/libssl.so'
pickFirst 'lib/armeabi-v7a/libcrypto.so'
pickFirst 'lib/armeabi-v7a/libssl.so'
pickFirst 'lib/x86/libcrypto.so'
pickFirst 'lib/x86/libssl.so'
pickFirst 'lib/x86_64/libcrypto.so'
pickFirst 'lib/x86_64/libssl.so'
}
2. iOS Setup
- Install Git LFS.
- Run pod install.
- Add the following permissions to the Info.plist file:
<key>NSCameraUsageDescription</key>
<string>The app uses the camera to scan documents.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>The app uses the photo library to pick document images.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>The app uses the photo library to save document images.</string>
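The text says to install Git LFS but does not show the command. As a sketch of this setup step, assuming macOS with Homebrew (the usual environment for iOS builds; see the Git LFS site for other platforms):

```shell
# Install the Git LFS binary (assumes Homebrew on macOS)
brew install git-lfs
# Register the LFS filters for your user account, then re-run `pod install`
git lfs install
```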
3. Add the Accura Scan License to Your Project
For Android
Create an assets folder under app/src/main and add the license file to it:
- accuraface.license // for Accura Scan Face Match
Generate your Accura Scan license from https://accurascan.com/developer/dashboard.
For iOS
Place the license file in the project's Runner directory and add it to the target.
4. Liveness Detection Method
An example liveness detection method:
Future<void> startLiveness() async {
  SystemChrome.setPreferredOrientations([DeviceOrientation.portraitUp]);
  try {
    var accuraConfs = {
      // Pass an empty string if you only need the liveness score;
      // pass a face image path to also get a face match score.
      "face_uri": this.faceMatchURL,
    };
    await AccuraLiveness.setLivenessFeedbackTextSize(18);
    await AccuraLiveness.setLivenessFeedBackframeMessage("Fit your face in the frame");
    await AccuraLiveness.setLivenessFeedBackAwayMessage("Move the phone away");
    await AccuraLiveness.setLivenessFeedBackOpenEyesMessage("Keep your eyes open");
    await AccuraLiveness.setLivenessFeedBackCloserMessage("Move the phone closer");
    await AccuraLiveness.setLivenessFeedBackCenterMessage("Center your face");
    await AccuraLiveness.setLivenessFeedbackMultipleFaceMessage("Multiple faces detected");
    await AccuraLiveness.setLivenessFeedBackFaceSteadymessage("Keep your head steady");
    await AccuraLiveness.setLivenessFeedBackBlurFaceMessage("Blur detected");
    await AccuraLiveness.setLivenessFeedBackGlareFaceMessage("Glare detected");
    await AccuraLiveness.setLivenessBlurPercentage(80);
    await AccuraLiveness.setLivenessGlarePercentage_0(-1);
    await AccuraLiveness.setLivenessGlarePercentage_1(-1);
    await AccuraLiveness.setLivenessFeedBackLowLightMessage("Low light detected");
    await AccuraLiveness.setLivenessfeedbackLowLightTolerence(39);
    await AccuraLiveness.setLivenessURL("Your Liveness Url");
    await AccuraLiveness.startLiveness([accuraConfs]).then((value) {
      setState(() {
        dynamic result = json.decode(value); // requires import 'dart:convert';
      });
    }).onError((error, stackTrace) {
      // Handle SDK errors here.
    });
  } on PlatformException {
    // The native call failed.
  }
}
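The liveness result arrives as a JSON string. The following self-contained sketch shows the decoding step in plain Dart; note that the key names ("livenessScore", "score") are hypothetical and must be checked against the actual Accura Scan SDK response:

```dart
import 'dart:convert';

/// Decodes the JSON string passed to the liveness callback.
/// NOTE: the key names below are assumed for illustration; consult the
/// Accura Scan documentation for the real field names.
Map<String, double?> parseLivenessResult(String value) {
  final Map<String, dynamic> result = json.decode(value) as Map<String, dynamic>;
  return {
    'liveness': (result['livenessScore'] as num?)?.toDouble(),
    // A face-match score is only present when "face_uri" was supplied.
    'match': result.containsKey('score') ? (result['score'] as num).toDouble() : null,
  };
}

void main() {
  const sample = '{"livenessScore": 0.97, "score": 0.88}';
  print(parseLivenessResult(sample));
}
```

Reading fields defensively like this (nullable casts, `containsKey`) avoids runtime type errors when the SDK omits a key for a given configuration.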
5. Face Match Methods
Open the Gallery
Gallery 1
Future<void> openGallery() async {
  try {
    var accuraConfs = {
      "face1": this.facematchURI,
      "face2": this.facematchURI2
    };
    await AccuraOcr.getGallery1([accuraConfs]).then((value) {
      setState(() {
        _result = json.decode(value);
        facematchURI = _result["Image"];
        if (_result.containsKey("score")) {
          Score = _result["score"];
        }
        print("RESULT:- $_result");
      });
    }).onError((error, stackTrace) {
      // Handle SDK errors here.
    });
  } on PlatformException {
    // The native call failed.
  }
  if (!mounted) return;
}
Gallery 2
Future<void> openGallery2() async {
  try {
    var accuraConfs = {
      "face1": this.facematchURI,
      "face2": this.facematchURI2
    };
    await AccuraOcr.getGallery2([accuraConfs]).then((value) {
      setState(() {
        _result = json.decode(value);
        facematchURI2 = _result["Image"];
        if (_result.containsKey("score")) {
          Score = _result["score"];
        }
        print("RESULT:- $_result");
      });
    }).onError((error, stackTrace) {
      // Handle SDK errors here.
    });
  } on PlatformException {
    // The native call failed.
  }
  if (!mounted) return;
}
Open the Camera for Face Match
Camera 1
Future<void> openCamera() async {
  try {
    var accuraConfs = {
      "face1": this.facematchURI,
      "face2": this.facematchURI2
    };
    await AccuraFacematch.setFaceMatchFeedbackTextSize(18);
    await AccuraFacematch.setFaceMatchFeedBackframeMessage("Fit your face in the frame");
    await AccuraFacematch.setFaceMatchFeedBackAwayMessage("Move the phone away");
    await AccuraFacematch.setFaceMatchFeedBackOpenEyesMessage("Keep your eyes open");
    await AccuraFacematch.setFaceMatchFeedBackCloserMessage("Move the phone closer");
    await AccuraFacematch.setFaceMatchFeedBackCenterMessage("Center your face");
    await AccuraFacematch.setFaceMatchFeedbackMultipleFaceMessage("Multiple faces detected");
    await AccuraFacematch.setFaceMatchFeedBackFaceSteadymessage("Keep your head steady");
    await AccuraFacematch.setFaceMatchFeedBackLowLightMessage("Low light detected");
    await AccuraFacematch.setFaceMatchFeedBackBlurFaceMessage("Blur detected");
    await AccuraFacematch.setFaceMatchFeedBackGlareFaceMessage("Glare detected");
    await AccuraFacematch.setFaceMatchBlurPercentage(80);
    await AccuraFacematch.setFaceMatchGlarePercentage_0(-1);
    await AccuraFacematch.setFaceMatchGlarePercentage_1(-1);
    await AccuraFacematch.getCamera1([accuraConfs]).then((value) {
      setState(() {
        _result = json.decode(value);
        facematchURI = _result["Image"];
        if (_result.containsKey("score")) {
          Score = _result["score"];
        }
        print("RESULT:- $_result");
      });
    });
  } on PlatformException {
    // The native call failed.
  }
  if (!mounted) return;
}
Camera 2
Future<void> openCamera2() async {
  try {
    var accuraConfs = {
      "face1": this.facematchURI,
      "face2": this.facematchURI2
    };
    await AccuraFacematch.setFaceMatchFeedbackTextSize(18);
    await AccuraFacematch.setFaceMatchFeedBackframeMessage("Fit your face in the frame");
    await AccuraFacematch.setFaceMatchFeedBackAwayMessage("Move the phone away");
    await AccuraFacematch.setFaceMatchFeedBackOpenEyesMessage("Keep your eyes open");
    await AccuraFacematch.setFaceMatchFeedBackCloserMessage("Move the phone closer");
    await AccuraFacematch.setFaceMatchFeedBackCenterMessage("Center your face");
    await AccuraFacematch.setFaceMatchFeedbackMultipleFaceMessage("Multiple faces detected");
    await AccuraFacematch.setFaceMatchFeedBackFaceSteadymessage("Keep your head steady");
    await AccuraFacematch.setFaceMatchFeedBackLowLightMessage("Low light detected");
    await AccuraFacematch.setFaceMatchFeedBackBlurFaceMessage("Blur detected");
    await AccuraFacematch.setFaceMatchFeedBackGlareFaceMessage("Glare detected");
    await AccuraFacematch.setFaceMatchBlurPercentage(80);
    await AccuraFacematch.setFaceMatchGlarePercentage_0(-1);
    await AccuraFacematch.setFaceMatchGlarePercentage_1(-1);
    await AccuraFacematch.getCamera2([accuraConfs]).then((value) {
      setState(() {
        _result = json.decode(value);
        facematchURI2 = _result["Image"];
        if (_result.containsKey("score")) {
          Score = _result["score"];
        }
        print("RESULT:- $_result");
      });
    });
  } on PlatformException {
    // The native call failed.
  }
  if (!mounted) return;
}
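The Camera 1 and Camera 2 methods repeat the same feedback configuration. As a sketch (using the AccuraFacematch setters from the snippets above; this is plugin-bound code and cannot run outside a Flutter app with the SDK installed), the shared setup can be factored into one helper:

```dart
// Shared face-match feedback configuration, factored out so that
// openCamera() and openCamera2() can both await it before launching
// the camera instead of repeating every setter.
Future<void> applyFaceMatchFeedbackConfig() async {
  await AccuraFacematch.setFaceMatchFeedbackTextSize(18);
  await AccuraFacematch.setFaceMatchFeedBackframeMessage("Fit your face in the frame");
  await AccuraFacematch.setFaceMatchFeedBackAwayMessage("Move the phone away");
  await AccuraFacematch.setFaceMatchFeedBackOpenEyesMessage("Keep your eyes open");
  await AccuraFacematch.setFaceMatchFeedBackCloserMessage("Move the phone closer");
  await AccuraFacematch.setFaceMatchFeedBackCenterMessage("Center your face");
  await AccuraFacematch.setFaceMatchFeedbackMultipleFaceMessage("Multiple faces detected");
  await AccuraFacematch.setFaceMatchFeedBackFaceSteadymessage("Keep your head steady");
  await AccuraFacematch.setFaceMatchFeedBackLowLightMessage("Low light detected");
  await AccuraFacematch.setFaceMatchFeedBackBlurFaceMessage("Blur detected");
  await AccuraFacematch.setFaceMatchFeedBackGlareFaceMessage("Glare detected");
  await AccuraFacematch.setFaceMatchBlurPercentage(80);
  await AccuraFacematch.setFaceMatchGlarePercentage_0(-1);
  await AccuraFacematch.setFaceMatchGlarePercentage_1(-1);
}
```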
Example Code
import 'package:flutter/material.dart';
import 'package:flutter_accurascan_fm_liveness_example/FaceMatch.dart';
import 'package:flutter_accurascan_fm_liveness_example/Liveness.dart';
void main() {
runApp(const MaterialApp(
home: HomePage(),
debugShowCheckedModeBanner: false,
));
}
class HomePage extends StatefulWidget {
const HomePage({Key? key}) : super(key: key);
@override
State<HomePage> createState() => _HomePageState();
}
class _HomePageState extends State<HomePage> {
@override
void initState() {
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: const Text("Accura Scan"),
backgroundColor: Colors.red[800],
),
body: Container(
decoration: const BoxDecoration(
image: DecorationImage(
image: AssetImage("assets/images/bg_home.png"),
fit: BoxFit.cover
)
),
child: SingleChildScrollView(
child: Center(
child: Column(
children: [
const SizedBox(height: 30,),
Container(
width: 180,
child: ElevatedButton(
style: ElevatedButton.styleFrom(
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(5.0),
), backgroundColor: Colors.red[800],
padding: const EdgeInsets.only(
top: 10, bottom: 10, right: 20, left: 20)),
child: Row(
children: [
Image.asset(
"assets/images/ic_liveness.png",
height: 30,
width: 30,
),
const SizedBox(
width: 10,
),
const Text(
"LIVENESS",
style: TextStyle(color: Colors.white),
),
],
),
onPressed: () {
Navigator.push(context, MaterialPageRoute(builder: (context) => const MyApp()));
},
),
),
const SizedBox(height: 30,),
Container(
width: 180,
child: ElevatedButton(
style: ElevatedButton.styleFrom(
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(5.0),
), backgroundColor: Colors.red[800],
padding: const EdgeInsets.only(
top: 10, bottom: 10, right: 20, left: 20)),
child: Row(
children: [
Image.asset(
"assets/images/ic_facematch.png",
height: 30,
width: 30,
),
const SizedBox(
width: 10,
),
const Text(
"FACEMATCH",
style: TextStyle(color: Colors.white),
),
],
),
onPressed: () {
Navigator.push(context, MaterialPageRoute(builder: (context) => const FaceMatch()));
},
),
)
],
),
)
)
),
);
}
}
For more hands-on tutorials on using the Flutter liveness detection and scanning plugin flutter_accurascan_fm_liveness, visit https://www.itying.com/category-92-b0.html
Certainly. Below is an example of how to integrate and use the flutter_accurascan_fm_liveness plugin for liveness detection and scanning in a Flutter project. This assumes the plugin is published on pub.dev, or that you have obtained it some other way.
1. Add the Dependency
First, add the flutter_accurascan_fm_liveness dependency to your pubspec.yaml file:
dependencies:
  flutter:
    sdk: flutter
  flutter_accurascan_fm_liveness: ^latest_version # replace with the actual latest release
Then run flutter pub get to install the dependency.
2. Import the Plugin
Import the plugin in your Dart file:
import 'package:flutter_accurascan_fm_liveness/flutter_accurascan_fm_liveness.dart';
3. Configure Permissions
Since liveness detection uses the camera and may access storage, add the corresponding permissions to AndroidManifest.xml and Info.plist.
Android
Add to AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
iOS
Add to Info.plist:
<key>NSCameraUsageDescription</key>
<string>App needs access to the camera</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>App needs access to the photo library</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>App needs access to the photo library</string>
4. Use the Plugin for Liveness Detection and Scanning
In your Flutter app, you can use the plugin like this:
import 'package:flutter/material.dart';
import 'package:flutter_accurascan_fm_liveness/flutter_accurascan_fm_liveness.dart';
void main() {
runApp(MyApp());
}
class MyApp extends StatefulWidget {
@override
_MyAppState createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
late FlutterAccurascanFmLiveness _accurascan;
@override
void initState() {
super.initState();
_accurascan = FlutterAccurascanFmLiveness();
_startLivenessDetection();
}
Future<void> _startLivenessDetection() async {
try {
// Configure the liveness detection parameters (assuming the plugin exposes configuration methods)
// e.g. _accurascan.configure(someConfiguration);
// Start liveness detection
var result = await _accurascan.startLivenessDetection();
if (result.success) {
// Handle a successful scan result
print("Liveness detection successful: ${result.data}");
} else {
// Handle the failure case
print("Liveness detection failed: ${result.error}");
}
} catch (e) {
// Handle exceptions
print("Error during liveness detection: $e");
}
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: Text('Flutter Accurascan FM Liveness Demo'),
),
body: Center(
child: CircularProgressIndicator(), // show a loading indicator until liveness detection completes
),
),
);
}
}
Notes
- Plugin methods: the configure and startLivenessDetection methods above are assumed; refer to the plugin's official documentation for the actual method names and parameters.
- UI updates: in a real app you will likely need to update the UI during detection, for example drawing a frame when a face is detected or showing progress.
- Error handling: make sure to handle the possible failure cases, such as denied permissions or unsupported devices.
- Plugin version: the plugin may be updated over time, so refer to the latest plugin documentation and sample code.
This example provides a basic skeleton that you can extend and adapt to your needs.