Using the Flutter Audio Plugin zego_zim_audio
Overview
With the ZIM Audio SDK's complete voice-processing capabilities, including voice capture, playback, decoding, acoustic noise suppression (ANS), and automatic gain control (AGC), developers can easily send and receive high-definition voice messages without worrying about the underlying audio-processing details.
Features
Voice recording
Start, complete, and cancel voice recording, and track the current recording state through the related events. The recorded voice file is stored at the file path you provide.
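Based on the recorder APIs used in the complete sample further down, a minimal recording flow looks like this (the file path and the 60-second cap are illustrative values):

```dart
import 'package:zego_zim_audio/zego_zim_audio.dart';

Future<void> recordVoiceMessage(String filePath) async {
  // Fires when recording finishes normally; totalDuration is in milliseconds.
  ZIMAudioEventHandler.onRecorderCompleted = (int totalDuration) {
    print('Recorded $totalDuration ms to $filePath');
  };

  // Start recording to the given path, capped at 60 seconds here.
  await ZIMAudio.getInstance()
      .startRecord(ZIMAudioRecordConfig(filePath, maxDuration: 60000));

  // Later, when the user releases the record button:
  // completeRecord() keeps the file, cancelRecord() discards it.
  await ZIMAudio.getInstance().completeRecord();
}
```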
Voice playback
Start and stop voice playback.
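A matching playback sketch, using the startPlay/stopPlay calls from the sample below (routeType selects the loudspeaker or the earpiece):

```dart
import 'package:zego_zim_audio/zego_zim_audio.dart';

Future<void> playVoiceMessage(String filePath) async {
  // Fires when the file finishes playing on its own.
  ZIMAudioEventHandler.onPlayerEnded = () {
    print('Playback finished');
  };

  // Play through the loudspeaker; use ZIMAudioRouteType.receiver for the earpiece.
  await ZIMAudio.getInstance().startPlay(
      ZIMAudioPlayConfig(filePath, routeType: ZIMAudioRouteType.speaker));
}

// Stops whatever is currently playing.
Future<void> stopVoiceMessage() => ZIMAudio.getInstance().stopPlay();
```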
Volume gain control and noise suppression
Dynamically adjust the gain of the audio input and enable noise suppression to improve the quality of voice recordings.
Use cases
Sending voice messages
You can use the ZIM Audio SDK to record and play voice messages, then send the recorded voice file to other users as a rich-media message.
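The recorded file can then be delivered with the companion zego_zim messaging SDK. The sketch below is an assumption-heavy illustration: ZIMAudioMessage, sendMediaMessage, and every parameter name come from the zego_zim package, not this plugin, and should be verified against its documentation before use:

```dart
import 'package:zego_zim/zego_zim.dart';

// Hypothetical helper: sends a recorded voice file as a rich-media message.
// All API names below are assumptions taken from the zego_zim SDK.
Future<void> sendVoiceMessage(String filePath, String toConversationID) async {
  final message = ZIMAudioMessage(filePath);
  await ZIM.getInstance()!.sendMediaMessage(
    message,
    toConversationID,
    ZIMConversationType.peer,
    ZIMMessageSendConfig(),
    ZIMMediaMessageSendNotification(),
  );
}
```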
Getting started
This project is a Flutter plugin package that includes platform-specific implementation code for Android and/or iOS.
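Before any recording or playback, initialize the SDK once; the calls below mirror the sample code (the empty string stands in for the license argument used there):

```dart
import 'package:zego_zim_audio/zego_zim_audio.dart';

Future<void> initZimAudio() async {
  // Initialize the SDK; the argument is the license string (empty in the sample).
  ZIMAudio.getInstance().init('');

  // Query the native SDK version, useful for logging and diagnostics.
  final String version = await ZIMAudio.getVersion();
  print('ZIM Audio version: $version');
}
```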
Complete sample code
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:zego_zim_audio/zego_zim_audio.dart';
import 'package:path_provider/path_provider.dart';
import 'dart:io' show Directory, File;
void main() {
runApp(MyApp());
}
class MyApp extends StatefulWidget {
const MyApp({super.key});
@override
_MyAppState createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
String _version = "";
List<String> _recordings = [];
Map<String, int> recordDurationMap = {};
bool _isRecording = false;
int _currentRecordDuration = 0;
String _currentPlaying = '';
int _currentPlayingDuration = 0;
String _waitingPlaying = '';
bool _isPlaying = false;
bool _isSpeaker = true;
bool isCancelRecord = false;
ScrollController _scrollController = ScrollController();
final GlobalKey<_MyAppState> key = GlobalKey();
Future<void> clearApplicationDocumentsDirectory() async {
Directory appDocDir = await getApplicationDocumentsDirectory();
if (appDocDir.existsSync()) {
await for (var entity in appDocDir.list()) {
if (entity is File) {
await entity.delete();
}
}
}
}
@override
void initState() {
super.initState();
clearApplicationDocumentsDirectory();
    ZIMAudio.getVersion().then((value) {
      setState(() => _version = value);
    });
    // Initialize the SDK; the argument is the license string (empty here).
    ZIMAudio.getInstance().init("");
ZIMAudioEventHandler.onError = (ZIMAudioError errorInfo) {
ErrorDiaLog.showFailedDialog(
key.currentContext!,
errorInfo.code.toString(),
'onError.message:${errorInfo.message}');
};
ZIMAudioEventHandler.onRecorderCompleted = (int totalDuration) async {
setState(() {
_isRecording = false;
recordDurationMap['${_recordings.length + 1}'] = totalDuration;
_recordings.add('${_recordings.length + 1}');
});
Future.delayed(Duration(milliseconds: 500), () {
_scrollToTop();
});
};
ZIMAudioEventHandler.onRecorderFailed = (int errorCode) async {
setState((){
_isRecording = false;
ErrorDiaLog.showFailedDialog(key.currentContext!, errorCode.toString(), 'onRecorderFailed');
});
};
ZIMAudioEventHandler.onRecorderStarted = () async {
_isRecording = true;
_currentRecordDuration = 0;
setState(() {});
};
ZIMAudioEventHandler.onRecorderCancelled = () {
_isRecording = false;
setState(() {});
};
ZIMAudioEventHandler.onRecorderProgress = (int currentDuration) {
if (kDebugMode) {
print('[Flutter] onRecorderProgress:$currentDuration');
}
_currentRecordDuration = currentDuration;
setState(() {});
};
ZIMAudioEventHandler.onPlayerStarted = (int duration) {
setState(() {
_isPlaying = true;
_currentPlaying = _waitingPlaying;
_currentPlayingDuration = duration;
});
};
ZIMAudioEventHandler.onPlayerEnded = () {
setState(() {
_currentPlaying = '';
_isPlaying = false;
});
};
ZIMAudioEventHandler.onPlayerStopped = () {
setState(() {
_currentPlaying = '';
_isPlaying = false;
});
};
ZIMAudioEventHandler.onPlayerInterrupted = () {
setState(() {
_currentPlaying = '';
_isPlaying = false;
});
};
ZIMAudioEventHandler.onPlayerProgress = (int currentDuration) {
setState(() {
_currentPlayingDuration = currentDuration;
});
};
ZIMAudioEventHandler.onPlayerFailed = (int errorCode) {
setState(() {
_isPlaying = false;
_currentPlaying = '';
ErrorDiaLog.showFailedDialog(key.currentContext!, errorCode.toString(), 'onPlayerFailed');
});
};
}
void _startRecording() async {
try {
      final Directory dic = await getApplicationDocumentsDirectory();
      final String path = '${dic.path}/${_recordings.length + 1}.mp3';
await ZIMAudio.getInstance().startRecord(ZIMAudioRecordConfig(path, maxDuration: 120000));
} catch (e) {
print("Error starting recording: $e");
}
}
void _CancelRecording() async {
try {
await ZIMAudio.getInstance().cancelRecord();
} catch (e) {
print("Error cancel recording: $e");
}
}
void _CompleteRecording() async {
try {
await ZIMAudio.getInstance().completeRecord();
} catch (e) {
print("Error complete recording: $e");
}
}
void _startPlaying(String recording) async {
if (_isRecording) {
return;
}
try {
      final Directory dic = await getApplicationDocumentsDirectory();
      final String path = '${dic.path}/$recording.mp3';
await ZIMAudio.getInstance().startPlay(ZIMAudioPlayConfig(path, routeType: _isSpeaker ? ZIMAudioRouteType.speaker : ZIMAudioRouteType.receiver));
_waitingPlaying = recording;
} catch (e) {
print("Error starting playback: $e");
}
}
Future<void> _stopPlaying() async {
try {
await ZIMAudio.getInstance().stopPlay();
} catch (e) {
print("Error stopping playback: $e");
}
}
  // Animates the list to its end so the newest recording is visible.
  void _scrollToTop() {
    _scrollController.animateTo(
      _scrollController.position.maxScrollExtent,
      duration: Duration(milliseconds: 500),
      curve: Curves.easeInOut,
    );
  }
String formatMillisecondsTime(int milliseconds) {
int seconds = (milliseconds / 1000).floor();
int remainingMilliseconds = milliseconds % 1000;
int minutes = (seconds / 60).floor();
int remainingSeconds = seconds % 60;
String minutesStr = minutes.toString();
String secondsStr = remainingSeconds.toString().padLeft(2, '0');
String millisecondsStr = (remainingMilliseconds ~/ 10).toString().padLeft(2, '0');
return '$minutesStr:$secondsStr.$millisecondsStr';
}
String formatSecondTime(int milliseconds) {
int seconds = (milliseconds / 1000).floor();
int remainingSeconds = seconds % 60;
int minutes = (seconds / 60).floor();
String minutesStr = minutes.toString().padLeft(1, '0');
String secondsStr = remainingSeconds.toString().padLeft(2, '0');
return '$minutesStr:$secondsStr';
}
  @override
  void dispose() {
    // Release the scroll controller when this State is removed from the tree.
    _scrollController.dispose();
    super.dispose();
  }
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: Text('ZIM Audio:$_version'),
),
key: key,
body: Column(
children: [
Flexible(
flex: 2,
child: ListView(
controller: _scrollController,
reverse: false,
children: _recordings
.map(
(recording) => ListTile(
title: Text('Recording $recording'),
subtitle: Text('${formatSecondTime(recordDurationMap[recording]!)}'),
trailing: ElevatedButton(
onPressed: () {
if (recording == _currentPlaying) {
_stopPlaying();
} else {
_startPlaying(recording);
}
HapticFeedback.mediumImpact();
},
style: ButtonStyle(
backgroundColor: recording == _currentPlaying
? MaterialStateProperty.all(Colors.red)
: null,
),
child: Text(
recording == _currentPlaying && _isPlaying
? '${formatSecondTime(recordDurationMap[recording]! - _currentPlayingDuration)}'
: 'Play',
),
),
),
)
.toList(),
),
),
SizedBox(height: 20),
IconButton(
onPressed: () {
_isSpeaker = !_isSpeaker;
HapticFeedback.mediumImpact();
if (_isSpeaker == true) {
ZIMAudio.getInstance().setAudioRouteType(ZIMAudioRouteType.speaker);
} else {
ZIMAudio.getInstance().setAudioRouteType(ZIMAudioRouteType.receiver);
}
setState(() {});
},
icon: Icon(
_isSpeaker ? Icons.volume_up : Icons.hearing,
color: Colors.grey,
),
iconSize: 50,
splashRadius: 30,
padding: EdgeInsets.all(16),
color: Colors.blue,
),
Align(
alignment: Alignment.bottomCenter,
child: Padding(
padding: const EdgeInsets.only(bottom: 20.0),
child: GestureDetector(
onLongPressStart: (details) async {
_startRecording();
HapticFeedback.mediumImpact();
},
onLongPressEnd: (_) {
if (isCancelRecord) {
_CancelRecording();
} else {
HapticFeedback.mediumImpact();
_CompleteRecording();
}
isCancelRecord = false;
},
onLongPressMoveUpdate: (LongPressMoveUpdateDetails details) {
if (details.localPosition.dy < -50) {
isCancelRecord = true;
}
},
child: Container(
padding: EdgeInsets.all(20),
decoration: BoxDecoration(
color: _isRecording ? Colors.red : Colors.blue,
borderRadius: BorderRadius.circular(10),
),
child: Text(
_isRecording ? 'Swipe up to cancel ${formatSecondTime(_currentRecordDuration)}' : 'Press and hold to record',
style: TextStyle(
color: Colors.white,
fontSize: 18,
),
),
),
),
),
),
],
),
),
);
}
}
class ErrorDiaLog {
static Future<bool?> showFailedDialog(BuildContext context, String code, String message) {
return showDialog<bool>(
context: context,
builder: (context) {
return AlertDialog(
title: const Text(
"Error",
),
content: Text('code:' + code + '\n\n' + 'message:' + message),
actions: <Widget>[
TextButton(
onPressed: (() {
Navigator.of(context).pop();
}),
child: const Text('OK'))
],
);
});
}
}
For more hands-on tutorials about using the zego_zim_audio Flutter audio plugin, visit https://www.itying.com/category-92-b0.html
zego_zim_audio is an audio-communication plugin provided by ZEGO for implementing real-time audio features in Flutter apps. It builds on ZEGO's instant messaging (ZIM) and real-time audio/video (ZEGO Express) technology and offers a simple API for audio calls, voice messages, and similar features.
The basic steps for using the zego_zim_audio plugin are as follows:
1. Add the dependency
First, add the zego_zim_audio dependency to your pubspec.yaml file:
dependencies:
  flutter:
    sdk: flutter
  zego_zim_audio: ^1.0.0 # use the latest version
Then run flutter pub get to install it.
2. Initialize ZegoZimAudio
Initialize ZegoZimAudio when the app starts:
import 'package:zego_zim_audio/zego_zim_audio.dart';
void main() async {
  WidgetsFlutterBinding.ensureInitialized();

  // Initialize ZegoZimAudio
  await ZegoZimAudio.init(
    appID: yourAppID, // AppID assigned by ZEGO
    appSign: yourAppSign, // AppSign assigned by ZEGO
  );

  runApp(MyApp());
}
3. Log in to ZIM
Before using the audio-communication features, the user must first log in to ZIM:
await ZegoZimAudio.login(userID, userName);
userID: the user's unique identifier.
userName: the user's name, displayed during the call.
4. Start an audio call
Use the callUser method to start an audio call:
await ZegoZimAudio.callUser(
  targetUserID, // target user ID
  callID: 'yourCallID', // call ID that uniquely identifies this call
  extendedData: 'customData', // custom data
);
5. Accept or reject an audio call
When a call invitation arrives, use acceptCall or rejectCall to accept or reject it:
// Accept the call
await ZegoZimAudio.acceptCall(callID);

// Reject the call
await ZegoZimAudio.rejectCall(callID);
6. End the call
Use the endCall method to end an ongoing call:
await ZegoZimAudio.endCall(callID);
7. Listen for call-state changes
You can observe call-state changes through a listener:
ZegoZimAudio.onCallStateChanged.listen((callState) {
  // Handle call-state changes
  switch (callState) {
    case ZegoCallState.Calling:
      // Outgoing call in progress
      break;
    case ZegoCallState.Connected:
      // Call connected
      break;
    case ZegoCallState.Ended:
      // Call ended
      break;
    default:
      break;
  }
});
8. Handle errors
Listen for error events so you can react when something goes wrong:
ZegoZimAudio.onError.listen((error) {
  // Handle the error
  print('Error: ${error.code}, ${error.message}');
});
9. Log out
When the app exits or the user signs out, remember to call the logout method:
await ZegoZimAudio.logout();