Implementing WebRTC in Flutter with flutter_webrtc mainly involves signaling exchange, media negotiation, and connection establishment. The key steps and example code follow:
1. Add the dependency
Add to pubspec.yaml:
dependencies:
  flutter_webrtc: ^0.9.44
2. Configure permissions
Android (android/app/src/main/AndroidManifest.xml):
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
iOS (ios/Runner/Info.plist):
<key>NSCameraUsageDescription</key>
<string>Camera access is required for video calls</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for calls</string>
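On Android 6+ and iOS, capture permissions are also granted at runtime; getUserMedia normally triggers the system prompts itself, but if you prefer to ask up front, one hedged option is the permission_handler package (an extra dependency, not part of the steps above):

import 'package:permission_handler/permission_handler.dart';

// Optionally request camera and microphone access before starting capture
Future<bool> requestMediaPermissions() async {
  final statuses = await [Permission.camera, Permission.microphone].request();
  return statuses.values.every((status) => status.isGranted);
}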
3. Core implementation
import 'package:flutter_webrtc/flutter_webrtc.dart';

class WebRTCExample {
  RTCPeerConnection? _peerConnection;
  MediaStream? _localStream;
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
  final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();

  // Initialize the video renderers before they are used
  Future<void> initRenderers() async {
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();
  }
  // Capture the local audio/video stream
  Future<MediaStream> createLocalStream() async {
    final Map<String, dynamic> constraints = {
      'audio': true,
      'video': {
        'mandatory': {
          'minWidth': '640',
          'minHeight': '480',
          'minFrameRate': '30',
        },
        'facingMode': 'user',
      }
    };
    return await navigator.mediaDevices.getUserMedia(constraints);
  }
  // Create the peer connection. Named _createPeerConnection so it does not
  // shadow the top-level createPeerConnection() from flutter_webrtc.
  Future<RTCPeerConnection> _createPeerConnection() async {
    final Map<String, dynamic> configuration = {
      'iceServers': [
        {'urls': 'stun:stun.l.google.com:19302'}
      ]
    };
    final Map<String, dynamic> offerSdpConstraints = {
      'mandatory': {
        'OfferToReceiveAudio': true,
        'OfferToReceiveVideo': true,
      },
      'optional': [],
    };
    return await createPeerConnection(configuration, offerSdpConstraints);
  }
  // Caller side: set up media, wire up callbacks, and create an offer
  Future<void> connect() async {
    await initRenderers();
    _localStream = await createLocalStream();
    _localRenderer.srcObject = _localStream;
    _peerConnection = await _createPeerConnection();

    // Add local tracks to the connection
    _localStream!.getTracks().forEach((track) {
      _peerConnection!.addTrack(track, _localStream!);
    });

    // With addTrack (Unified Plan), remote media arrives via onTrack,
    // not the deprecated onAddStream
    _peerConnection!.onTrack = (RTCTrackEvent event) {
      if (event.streams.isNotEmpty) {
        _remoteRenderer.srcObject = event.streams[0];
      }
    };

    // Forward local ICE candidates to the other peer via your signaling server
    _peerConnection!.onIceCandidate = (RTCIceCandidate candidate) {
      // signaling.sendCandidate(candidate);
    };

    // Create and send the offer
    final RTCSessionDescription offer = await _peerConnection!.createOffer();
    await _peerConnection!.setLocalDescription(offer);
    // Send the offer through your signaling server (signaling logic not shown)
    // signaling.sendOffer(offer);
  }
  // Apply the remote answer received via signaling
  Future<void> handleAnswer(RTCSessionDescription answer) async {
    await _peerConnection!.setRemoteDescription(answer);
  }

  // Add a remote ICE candidate received via signaling
  Future<void> handleIceCandidate(RTCIceCandidate candidate) async {
    await _peerConnection!.addCandidate(candidate);
  }

  // Release all resources
  void dispose() {
    _peerConnection?.close();
    _localStream?.getTracks().forEach((track) => track.stop());
    _localStream?.dispose();
    _localRenderer.dispose();
    _remoteRenderer.dispose();
  }
}
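The class above only implements the caller side. For the answering peer, a hedged sketch of an extra handleOffer method (not in the original; it would be added to WebRTCExample after the same renderer, stream, and _createPeerConnection setup as in connect()) could look like this:

  // Callee side: apply the remote offer and produce an answer
  Future<RTCSessionDescription> handleOffer(RTCSessionDescription offer) async {
    await _peerConnection!.setRemoteDescription(offer);
    final RTCSessionDescription answer = await _peerConnection!.createAnswer();
    await _peerConnection!.setLocalDescription(answer);
    // Return the answer so it can be sent back through the signaling server
    return answer;
  }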
4. Signaling server
You must implement a signaling channel (Socket.IO, SignalR, a plain WebSocket, etc.) to exchange the SDP offer/answer and the ICE candidates between the two peers.
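As an illustration only, here is a minimal client-side signaling channel built on dart:io's WebSocket (so it will not run on Flutter web). The serverUrl parameter, the JSON message format ('type', 'sdp', 'candidate', ...), and the callback names are assumptions; adapt them to whatever protocol your own signaling server speaks.

import 'dart:convert';
import 'dart:io';
import 'package:flutter_webrtc/flutter_webrtc.dart';

class SignalingClient {
  WebSocket? _socket;
  void Function(RTCSessionDescription)? onRemoteAnswer;
  void Function(RTCIceCandidate)? onRemoteCandidate;

  // Connect to the signaling server and start listening for messages
  Future<void> connect(String serverUrl) async {
    _socket = await WebSocket.connect(serverUrl);
    _socket!.listen((data) {
      final msg = jsonDecode(data as String) as Map<String, dynamic>;
      switch (msg['type']) {
        case 'answer':
          onRemoteAnswer?.call(RTCSessionDescription(msg['sdp'], 'answer'));
          break;
        case 'candidate':
          onRemoteCandidate?.call(RTCIceCandidate(
              msg['candidate'], msg['sdpMid'], msg['sdpMLineIndex']));
          break;
      }
    });
  }

  void sendOffer(RTCSessionDescription offer) {
    _socket?.add(jsonEncode({'type': 'offer', 'sdp': offer.sdp}));
  }

  void sendCandidate(RTCIceCandidate candidate) {
    _socket?.add(jsonEncode({
      'type': 'candidate',
      'candidate': candidate.candidate,
      'sdpMid': candidate.sdpMid,
      'sdpMLineIndex': candidate.sdpMLineIndex,
    }));
  }

  void close() {
    _socket?.close();
  }
}

This is the kind of object the commented-out signaling.sendOffer/sendCandidate calls in connect() would talk to.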
5. Using it in a Widget
@override
Widget build(BuildContext context) {
  return Scaffold(
    body: Column(
      children: [
        // Expanded gives each RTCVideoView a bounded height inside the Column
        Expanded(child: RTCVideoView(_localRenderer)),
        Expanded(child: RTCVideoView(_remoteRenderer)),
      ],
    ),
  );
}
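For completeness, a sketch (not from the original) of a StatefulWidget that owns a WebRTCExample instance and drives its lifecycle; the class name CallPage is made up, and it assumes WebRTCExample exposes its renderers through localRenderer/remoteRenderer getters, which you would need to add:

import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

class CallPage extends StatefulWidget {
  const CallPage({super.key});

  @override
  State<CallPage> createState() => _CallPageState();
}

class _CallPageState extends State<CallPage> {
  final WebRTCExample _webrtc = WebRTCExample();

  @override
  void initState() {
    super.initState();
    _webrtc.connect(); // start renderers, local capture, and the offer flow
  }

  @override
  void dispose() {
    _webrtc.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Column(
        children: [
          Expanded(child: RTCVideoView(_webrtc.localRenderer)),
          Expanded(child: RTCVideoView(_webrtc.remoteRenderer)),
        ],
      ),
    );
  }
}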
Key notes:
- Signaling exchange: you must provide your own server to relay the SDP and ICE candidates
- ICE servers: production deployments need properly configured STUN/TURN servers (see the sketch after this list)
- State management: Provider/Bloc is recommended for managing connection state
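As a sketch only, an iceServers configuration that adds a TURN server next to the public STUN server; the turn:turn.example.com URL and the user/pass credentials are placeholders for your own TURN deployment (for example coturn):

final Map<String, dynamic> configuration = {
  'iceServers': [
    {'urls': 'stun:stun.l.google.com:19302'},
    {
      // Placeholder TURN server; replace with your own host and credentials
      'urls': 'turn:turn.example.com:3478',
      'username': 'user',
      'credential': 'pass',
    },
  ],
};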
With the steps above you get basic video calling; a real deployment also needs proper error handling and reconnection logic.