Using the Flutter video call plugin video_call_sdk

Published 1 week ago · by caililin · in Flutter


VideoCall SDK

Easily add video calling functionality to your app.


Supported platforms

VideoCall SDK for Flutter is designed to run on all platforms supported by Flutter:

  • Android
  • iOS
  • Web
  • macOS
  • Windows
  • Linux

Example app

We provide a multi-user conference app in the example/ folder as a sample. LiveKit is cross-platform compatible: you can join the same room with any of the supported real-time SDKs.

iOS

You need to declare camera and microphone usage permissions in your Info.plist file.

<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>
</dict>

If Background Modes is enabled, your app can keep a voice call running when it is switched to the background. In Xcode, select the app target, open the Capabilities tab, enable Background Modes, and check Audio, AirPlay, and Picture in Picture.

Your Info.plist should then contain the following entry:

<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>
</dict>

Notes

Because Xcode 14 no longer supports 32-bit builds, and our latest releases are based on libwebrtc m104+, the iOS framework no longer provides 32-bit builds. Upgrading to Flutter 3.3.0+ is strongly recommended. If you are using Flutter 3.0.0 or earlier, there is a high chance your Flutter app will fail to compile because the i386 and 32-bit ARM frameworks are missing (see issues #132 and #172).

You can try modifying {projects_dir}/ios/Podfile to work around this:

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    target.build_configurations.each do |config|
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line
    end
  end
end

For iOS, the minimum supported deployment target is 13.0. You need to add the following to your Podfile:

platform :ios, '13.0'

After updating the deployment target, you may need to delete Podfile.lock and re-run pod install.


Android

A set of permissions required by Flutter WebRTC must be declared in your AndroidManifest.xml file:

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.your.package">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
  <uses-permission android:name="android.permission.WRITE_INTERNAL_STORAGE" />
  ...
</manifest>

In your Android AndroidManifest.xml file, add the following service under the <application> tag:

<service
  android:name="com.foregroundservice.ForegroundService"
  android:foregroundServiceType="mediaProjection">
</service>

Import the SDK:

import 'package:video_call_sdk/video_call_sdk.dart';

Authenticate with the system to enable the SDK:

final authSuccess = await MTVideoCallPlugin.instance.authenticate(apiKey: 'your_api_key');
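
authenticate returns a bool. Below is a minimal sketch of guarding the rest of your setup on that result; the error handling is illustrative and not part of the SDK:

if (!authSuccess) {
  // Authentication failed: the SDK calls below will not work,
  // so stop here and surface the error in your own way.
  debugPrint('VideoCall SDK authentication failed');
  return;
}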

Get the device hardware information:

List<MediaDevice> audioInputs = MTVideoCallPlugin.instance.getDeviceAudioInput();
List<MediaDevice> videoInputs = MTVideoCallPlugin.instance.getDeviceVideoInput();
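
Both calls return a List<MediaDevice>. As an illustrative default (mirroring what the full example below does), you could simply pick the first entry of each list:

// Illustrative defaults; the lists can be empty on devices without a camera or microphone.
final MediaDevice? defaultAudioInput =
    audioInputs.isNotEmpty ? audioInputs.first : null;
final MediaDevice? defaultVideoInput =
    videoInputs.isNotEmpty ? videoInputs.first : null;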

Get the supported queues:

List<MTQueue> queues = await MTVideoCallPlugin.instance.getQueues();

Start a video call

Using the video call UI template provided by the SDK

Navigator.push(
  context,
  MaterialPageRoute(
    builder: (context) => MTCallingPage(
      user: MTUser(
        name: "name_of_end_user", // 必填
        email: "email_of_end_user",
        phone: "phone_of_end_user",
      ),
      queue: queueSelected,
      device: videoInputSelected,
    ),
  ),
);

Using the methods provided by the SDK

First, add the listeners from MTRoomEventListener or MTTrackListener:

class YourPage extends StatefulWidget {
  const YourPage({super.key});

  @override
  State<YourPage> createState() => _YourPageState();
}

class _YourPageState extends State<YourPage> with MTRoomEventListener, MTTrackListener {

  @override
  void initState() {
    // Register the room and track listeners.
    super.initState();
    MTVideoCallPlugin.instance.addMTRoomEventListener(this);
    MTVideoCallPlugin.instance.addMTTrackEventListener(this);
  }

  @override
  Widget build(BuildContext context) {
    return const Placeholder(); // build your UI here
  }

  @override
  void onConnectedRoom(Room room, String? metaData) {
    // Called when the local user has connected to the room.
    super.onConnectedRoom(room, metaData);
  }

  @override
  void onDisconnectedRoom(DisconnectReason? reason) async {
    // Called when the local user has disconnected from the room.
    super.onDisconnectedRoom(reason);
    // Remove the listeners.
    MTVideoCallPlugin.instance.removeMTRoomEventListener(this);
    MTVideoCallPlugin.instance.removeMTTrackEventListener(this);
  }

  @override
  void onParticipantConnectedRoom(RemoteParticipant participant) async {
    // Called when a remote participant joins the room.
    super.onParticipantConnectedRoom(participant);
  }

  @override
  void onParticipantDisconnectedRoom(RemoteParticipant participant) async {
    // Called when a remote participant leaves the room.
    super.onParticipantDisconnectedRoom(participant);
  }

  @override
  void onRemoteUnMutedTrack(TrackPublication<Track> publication, Participant<TrackPublication<Track>> participant) {
    // Called when a remote participant unmutes a track.
    super.onRemoteUnMutedTrack(publication, participant);
  }

  @override
  void onRemoteMutedTrack(TrackPublication<Track> publication, Participant participant) {
    // Called when a remote participant mutes a track.
    super.onRemoteMutedTrack(publication, participant);
  }

  @override
  void onLocalTrackPublished(LocalParticipant localParticipant, LocalTrackPublication<LocalTrack> publication) {
    // Called when a local track is published.
    super.onLocalTrackPublished(localParticipant, publication);
  }

  @override
  void onLocalTrackUnPublished(LocalParticipant localParticipant, LocalTrackPublication<LocalTrack> publication) {
    // Called when a local track is unpublished.
    super.onLocalTrackUnPublished(localParticipant, publication);
  }

  @override
  void onReceiveData(List<int> data, RemoteParticipant? participant, String? topic) {
    // Called when data is received from a remote participant.
    super.onReceiveData(data, participant, topic);
  }

  @override
  void onTrackSubscribed(RemoteTrackPublication<RemoteTrack> publication, RemoteParticipant participant, Track track) {
    // Called when a remote track is subscribed.
    super.onTrackSubscribed(publication, participant, track);
  }

  @override
  void onTrackUnSubscribed(RemoteTrackPublication<RemoteTrack> publication, RemoteParticipant participant, Track track) {
    // Called when a remote track is unsubscribed.
    super.onTrackUnSubscribed(publication, participant, track);
  }
}

Initialize and connect to a room using the following calls:

final user = MTUser(
  name: "sample_name",
  email: "sample_email",
  phone: "sample_phone",
);
final queue = queues.first; // example
final room = await MTVideoCallPlugin.instance.startVideoCall(
  user: user, // MTUser object
  queue: queue,
);
// inputVideo is a MediaDevice selected from getDeviceVideoInput()
await MTVideoCallPlugin.instance.setInputVideo(inputVideo);
final isCnSuccess = await MTVideoCallPlugin.instance.connect2Room(
  queue: queue,
  user: user,
  room: room,
);

Enable or disable recording:

MTVideoCallPlugin.instance.enableRecording(true); // or false
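
For example, a minimal sketch of driving this from a toggle in your own State class; the isRecording field and handler are illustrative, only enableRecording is the SDK call:

bool isRecording = false;

// Illustrative toggle handler.
void toggleRecording() {
  setState(() {
    isRecording = !isRecording;
  });
  MTVideoCallPlugin.instance.enableRecording(isRecording);
}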

Switch the camera, or turn the camera and microphone on or off during a video call:

MTVideoCallPlugin.instance.changeVideoTrack(mediaDevice);

MTVideoCallPlugin.instance.enableVideo(true); // or false

MTVideoCallPlugin.instance.enableMicrophone(true); // or false
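
A minimal sketch of wiring these calls to simple on/off flags in your own State class; the flags and handlers are illustrative assumptions, only the MTVideoCallPlugin calls come from the SDK:

bool isCameraOn = true;
bool isMicOn = true;

// Illustrative handlers for camera and microphone buttons.
void toggleCamera() {
  setState(() => isCameraOn = !isCameraOn);
  MTVideoCallPlugin.instance.enableVideo(isCameraOn);
}

void toggleMicrophone() {
  setState(() => isMicOn = !isMicOn);
  MTVideoCallPlugin.instance.enableMicrophone(isMicOn);
}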

Finally, disconnect the video call:

bool isSuccess = await MTVideoCallPlugin.instance.disconnectVideoCall();
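
For example, a minimal hang-up handler that disconnects and then leaves the call screen; the navigation part is illustrative and depends on your own routing:

// Illustrative hang-up handler: disconnect first, then pop the call page.
Future<void> hangUp(BuildContext context) async {
  final bool isSuccess = await MTVideoCallPlugin.instance.disconnectVideoCall();
  if (isSuccess && context.mounted) {
    Navigator.pop(context);
  }
}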

Example code

Here is the complete example code:

import 'package:flutter/material.dart';
import 'package:fluttertoast/fluttertoast.dart';
import 'package:video_call_sdk/models/queue.dart';
import 'package:video_call_sdk/models/user.dart';
import 'package:video_call_sdk/video_call_sdk.dart';
import 'package:video_call_sdk/view/page/calling_page.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'VideoCall Demo',
      theme: ThemeData(
        colorScheme: ColorScheme.fromSeed(seedColor: Colors.blue),
        useMaterial3: true,
      ),
      home: const MyHomePage(),
    );
  }
}

class MyHomePage extends StatefulWidget {
  const MyHomePage({super.key});

  @override
  State<MyHomePage> createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  bool isAuthenticating = true;
  late List<MTQueue> queues;
  late List<MediaDevice> audioInputs;
  late List<MediaDevice> videoInputs;
  MTQueue? queueSelected;
  MediaDevice? audioInputSelected;
  MediaDevice? videoInputSelected;
  double space = 18;
  double heightDropdown = 60;
  bool isRecording = false;

  final TextEditingController _nameCtrl = TextEditingController();
  final TextEditingController _phoneCtrl = TextEditingController();
  final TextEditingController _emailCtrl = TextEditingController();

  @override
  void initState() {
    super.initState();
    authenticate();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      backgroundColor: Colors.white,
      appBar: AppBar(
        title: const Text(
          "Video Call SDK Demo",
          style: TextStyle(
            color: Colors.white,
            fontSize: 18,
            fontWeight: FontWeight.bold,
          ),
        ),
        centerTitle: true,
        flexibleSpace: Container(
          decoration: const BoxDecoration(
            gradient: LinearGradient(
              begin: Alignment.topCenter,
              end: Alignment.bottomCenter,
              colors: [Colors.cyan, Colors.blue],
            ),
          ),
        ),
      ),
      body: Padding(
        padding: const EdgeInsets.all(12.0),
        child: Center(
          child: isAuthenticating
              ? const CircularProgressIndicator()
              : Column(
                  crossAxisAlignment: CrossAxisAlignment.center,
                  children: [
                    userSection(),
                    infoCallSection(),
                  ],
                ),
        ),
      ),
      floatingActionButton: (!isAuthenticating)
          ? FloatingActionButton(
              onPressed: () async {
                if (_nameCtrl.text.isEmpty) {
                  Fluttertoast.showToast(
                    msg: "Vui lòng nhập tên",
                    toastLength: Toast.LENGTH_SHORT,
                    gravity: ToastGravity.BOTTOM,
                    timeInSecForIosWeb: 1,
                    backgroundColor: Colors.black,
                    textColor: Colors.white,
                    fontSize: 14.0,
                  );
                  return;
                }
                Navigator.push(
                  context,
                  MaterialPageRoute(
                    builder: (context) => MTCallingPage(
                      user: MTUser(
                        name: _nameCtrl.text,
                        email: _emailCtrl.text,
                        phone: _phoneCtrl.text,
                      ),
                      queue: queueSelected ?? queues.first,
                      videoInput: videoInputSelected ?? videoInputs.first,
                      audioInput: audioInputSelected ?? audioInputs.first,
                    ),
                  ),
                );
              },
              backgroundColor: Colors.blue,
              child: const Icon(
                Icons.video_call,
                color: Colors.white,
                size: 36,
              ),
            )
          : null,
    );
  }

  Widget userSection() {
    return Column(
      crossAxisAlignment: CrossAxisAlignment.start,
      mainAxisAlignment: MainAxisAlignment.center,
      children: [
        label("Name", true),
        tF(_nameCtrl, "Name"),
        SizedBox(height: space),
        label("Phone", false),
        tF(_phoneCtrl, "Phone"),
        SizedBox(height: space),
        label("Email", false),
        tF(_emailCtrl, "Email"),
        SizedBox(height: space),
      ],
    );
  }

  Widget label(String label, [bool isRequired = false]) {
    const double fontSize = 16;
    return isRequired
        ? Row(
            crossAxisAlignment: CrossAxisAlignment.center,
            children: [
              Text(
                label,
                style: const TextStyle(
                  color: Colors.black,
                  fontSize: fontSize,
                ),
              ),
              const Text(
                "*",
                style: TextStyle(
                  color: Colors.red,
                ),
              )
            ],
          )
        : Text(
            label,
            style: const TextStyle(
              color: Colors.black,
              fontSize: fontSize,
            ),
          );
  }

  Widget infoCallSection() {
    return Column(
      crossAxisAlignment: CrossAxisAlignment.start,
      mainAxisAlignment: MainAxisAlignment.center,
      children: [
        label("Queue"),
        DropdownButton(
          value: queueSelected,
          items: queues
              .map(
                (e) => DropdownMenuItem<MTQueue>(
                  value: e,
                  child: Text(e.queueName),
                ),
              )
              .toList(),
          onChanged: (queue) {
            setState(() {
              queueSelected = queue;
            });
          },
          hint: Text("Queue"),
        ),
        SizedBox(height: space),
        label("Video input"),
        DropdownButton(
          value: videoInputSelected,
          items: videoInputs
              .map(
                (e) => DropdownMenuItem<MediaDevice>(
                  value: e,
                  child: Text(e.label),
                ),
              )
              .toList(),
          onChanged: (value) {
            setState(() {
              videoInputSelected = value;
            });
          },
          hint: Text("Video input"),
        ),
        SizedBox(height: space),
        label("Audio input"),
        DropdownButton(
          value: audioInputSelected,
          items: audioInputs
              .map(
                (e) => DropdownMenuItem<MediaDevice>(
                  value: e,
                  child: Text(e.label),
                ),
              )
              .toList(),
          onChanged: (value) {
            setState(() {
              audioInputSelected = value;
            });
          },
          hint: Text("Audio input"),
        ),
      ],
    );
  }

  Future<void> authenticate() async {
    final authSuccess =
        await MTVideoCallPlugin.instance.authenticate(apiKey: 'your_api_key');
    if (authSuccess) {
      queues = await MTVideoCallPlugin.instance.getQueues();
      queueSelected = queues.first;
      audioInputs = MTVideoCallPlugin.instance.getDeviceAudioInput();
      audioInputSelected = audioInputs.first;
      videoInputs = MTVideoCallPlugin.instance.getDeviceVideoInput();
      videoInputSelected = videoInputs.first;
      setState(() {
        isAuthenticating = false;
      });
    } else {
      // Authentication failed: handle the error here (e.g. show a message to the user).
    }
  }

  Widget tF(TextEditingController controller, String hint) {
    return Container(
      width: double.infinity,
      height: 55,
      alignment: Alignment.center,
      padding: const EdgeInsets.symmetric(horizontal: 12),
      decoration: BoxDecoration(
        color: Colors.white,
        borderRadius: BorderRadius.circular(8),
        border: Border.all(color: Colors.grey),
      ),
      child: TextField(
        controller: controller,
        style: const TextStyle(fontSize: 14),
        decoration: InputDecoration(
          hintText: hint,
          border: InputBorder.none,
        ),
      ),
    );
  }
}

More hands-on tutorials on using the Flutter video call plugin video_call_sdk are available at https://www.itying.com/category-92-b0.html

1 Reply



To implement video calling in Flutter, you can use the third-party plugin video_call_sdk. The plugin typically wraps the underlying audio/video communication and exposes a simple API for developers. The basic steps for using video_call_sdk are as follows:

1. Add the dependency

First, add the video_call_sdk dependency to your pubspec.yaml file:

dependencies:
  flutter:
    sdk: flutter
  video_call_sdk: ^1.0.0  # use the latest version

Then run flutter pub get to install the dependency.

2. Initialize the SDK

Before using video_call_sdk, the SDK usually needs to be initialized first. You can do this in main.dart or during app startup:

import 'package:video_call_sdk/video_call_sdk.dart';

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  
  // Initialize the SDK
  await VideoCallSdk.initialize(
    appId: 'YOUR_APP_ID',  // replace with your App ID
    appToken: 'YOUR_APP_TOKEN',  // replace with your App Token
  );

  runApp(MyApp());
}

3. Create a video call page

Next, create a video call page and use the API provided by VideoCallSdk to start and manage the video call.

import 'package:flutter/material.dart';
import 'package:video_call_sdk/video_call_sdk.dart';

class VideoCallPage extends StatefulWidget {
  final String callId;

  VideoCallPage({required this.callId});

  @override
  _VideoCallPageState createState() => _VideoCallPageState();
}

class _VideoCallPageState extends State<VideoCallPage> {
  late VideoCallController _controller;

  @override
  void initState() {
    super.initState();
    _controller = VideoCallController();
    _startCall();
  }

  void _startCall() async {
    try {
      await _controller.startCall(
        callId: widget.callId,
        localVideoView: _controller.getLocalVideoView(),
        remoteVideoView: _controller.getRemoteVideoView(),
      );
    } catch (e) {
      print('Failed to start call: $e');
    }
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Video Call'),
      ),
      body: Column(
        children: [
          Expanded(
            child: _controller.getRemoteVideoView(),
          ),
          Expanded(
            child: _controller.getLocalVideoView(),
          ),
        ],
      ),
    );
  }
}

4. Start a video call

In your app, you can start a video call by navigating to VideoCallPage:

Navigator.push(
  context,
  MaterialPageRoute(
    builder: (context) => VideoCallPage(callId: 'unique_call_id'),
  ),
);

5. Handle call events

video_call_sdk usually also provides event callbacks, such as call connected, call disconnected, and error handling. You can listen to these events to handle the different call states:

_controller.onCallConnected = () {
  print('Call connected');
};

_controller.onCallDisconnected = () {
  print('Call disconnected');
};

_controller.onError = (error) {
  print('Error occurred: $error');
};

6. End the call

When the call is over, call _controller.endCall() to end it:

_controller.endCall();