Using the Flutter audio recording plugin flutter_audio_recorder2

Posted 1 week ago by sinazl in Flutter


Introduction

flutter_audio_recorder2 is a Flutter plugin that supports Record / Pause / Resume / Stop and exposes audio metering properties such as average power and peak power. It works on both Android and iOS.

Example

Installation

Add flutter_audio_recorder2 to your pubspec.yaml file:

dependencies:
  flutter_audio_recorder2: ^latest_version

iOS Permissions

  1. Add the usage description to Info.plist:
<key>NSMicrophoneUsageDescription</key>
<string>Can We Use Your Microphone Please</string>
  2. Use the hasPermissions API to request the user's permission when needed:
bool hasPermission = await FlutterAudioRecorder2.hasPermissions;

Android Permissions

  1. Add the permissions to ./android/app/src/main/AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
  2. Use the hasPermissions API to request the user's permission when needed:
bool hasPermission = await FlutterAudioRecorder2.hasPermissions;

Configuration

iOS Deployment Target

Make sure the iOS deployment target is 8.0 or above.

Android

  • AndroidX: use the latest version (e.g. 0.5.x)
  • Legacy Android: use an older version (e.g. 0.4.9)

Usage

Recommended API call order: hasPermissions -> init -> start -> (pause <-> resume) * n -> stop. Call init again before starting a new recording.
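
A minimal sketch of the full sequence (the file path and format below are only placeholders; each step is covered in detail in the sections that follow):

var hasPermission = await FlutterAudioRecorder2.hasPermissions ?? false;
if (hasPermission) {
  var recorder = FlutterAudioRecorder2("file_path", audioFormat: AudioFormat.AAC);
  await recorder.initialized;           // init
  await recorder.start();               // start
  await recorder.pause();               // (pause <-> resume) * n
  await recorder.resume();
  var result = await recorder.stop();   // stop
  print(result?.path);
}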

Check Permissions

Always check the permission first (if the permission has not yet been set to true/false this will request it from the user; otherwise it returns the current recording-permission state):

bool hasPermission = await FlutterAudioRecorder2.hasPermissions;

Initialization

Run this before starting a recording; it also checks whether a file with the given name already exists:

var recorder = FlutterAudioRecorder2("file_path.mp4"); // .wav .aac .m4a
await recorder.initialized;

Or specify the audio format explicitly:

var recorder = FlutterAudioRecorder2("file_path", audioFormat: AudioFormat.AAC); // or AudioFormat.WAV
await recorder.initialized;

Set the Sample Rate

var recorder = FlutterAudioRecorder2("file_path", audioFormat: AudioFormat.AAC, sampleRate: 22000); // the default sample rate is 16000
await recorder.initialized;

Audio Extension and Format Mapping

Audio format   Extensions
AAC            .m4a, .aac, .mp4
WAV            .wav

Start Recording

await recorder.start();
var recording = await recorder.current(channel: 0);

Get Recording Details

var current = await recorder.current(channel: 0);
// print(current.status);

You can use a periodic timer to fetch the details every 50 milliseconds (cancel the timer once the recording has finished):

const tick = const Duration(milliseconds: 50);
new Timer.periodic(tick, (Timer t) async {
  var current = await recorder.current(channel: 0);
  // print(current.status);
  setState(() {});
});

Recording

Name          Type
path          String
extension     String
duration      Duration
audioFormat   AudioFormat
metering      AudioMetering
status        RecordingStatus

Audio Metering

Name               Type
peakPower          double
averagePower       double
isMeteringEnabled  bool
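
Both the recording details and the metering values can be read from the Recording object returned by current() (a minimal sketch, assuming a recorder that has already been started):

var current = await recorder.current(channel: 0);
print('path: ${current?.path}, status: ${current?.status}');
print('avg power: ${current?.metering?.averagePower}, peak power: ${current?.metering?.peakPower}');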

Recording Status

  • Unset
  • Initialized
  • Recording
  • Paused
  • Stopped

Pause

await recorder.pause();

Resume

await recorder.resume();

Stop

var result = await recorder.stop();
File file = widget.localFileSystem.file(result.path);

Example Code

The following complete example shows how to use the flutter_audio_recorder2 plugin to record, pause, resume, and stop audio:

import 'dart:async';
import 'dart:io' as io;

import 'package:audioplayers/audioplayers.dart';
import 'package:file/file.dart';
import 'package:file/local.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_audio_recorder2/flutter_audio_recorder2.dart';
import 'package:path_provider/path_provider.dart';

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  SystemChrome.setEnabledSystemUIOverlays([]);
  return runApp(new MyApp());
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => new _MyAppState();
}

class _MyAppState extends State<MyApp> {
  @override
  Widget build(BuildContext context) {
    return new MaterialApp(
      home: new Scaffold(
        body: SafeArea(
          child: new RecorderExample(),
        ),
      ),
    );
  }
}

class RecorderExample extends StatefulWidget {
  final LocalFileSystem localFileSystem;

  RecorderExample({localFileSystem}) : this.localFileSystem = localFileSystem ?? LocalFileSystem();

  @override
  State<StatefulWidget> createState() => new RecorderExampleState();
}

class RecorderExampleState extends State<RecorderExample> {
  FlutterAudioRecorder2? _recorder;
  Recording? _current;
  RecordingStatus _currentStatus = RecordingStatus.Unset;

  @override
  void initState() {
    super.initState();
    _init();
  }

  @override
  Widget build(BuildContext context) {
    return new Center(
      child: new Padding(
        padding: new EdgeInsets.all(8.0),
        child: new Column(
            mainAxisAlignment: MainAxisAlignment.spaceAround,
            children: <Widget>[
              new Row(
                mainAxisAlignment: MainAxisAlignment.center,
                children: <Widget>[
                  Padding(
                    padding: const EdgeInsets.all(8.0),
                    child: TextButton(
                      onPressed: () {
                        switch (_currentStatus) {
                          case RecordingStatus.Initialized:
                            {
                              _start();
                              break;
                            }
                          case RecordingStatus.Recording:
                            {
                              _pause();
                              break;
                            }
                          case RecordingStatus.Paused:
                            {
                              _resume();
                              break;
                            }
                          case RecordingStatus.Stopped:
                            {
                              _init();
                              break;
                            }
                          default:
                            break;
                        }
                      },
                      child: _buildText(_currentStatus),
                      style: ButtonStyle(
                          backgroundColor: MaterialStateProperty.all<Color>(
                            Colors.lightBlue,
                          )),
                    ),
                  ),
                  new TextButton(
                    onPressed: _currentStatus != RecordingStatus.Unset ? _stop : null,
                    child: new Text("Stop", style: TextStyle(color: Colors.white)),
                    style: ButtonStyle(
                        backgroundColor: MaterialStateProperty.all<Color>(
                          Colors.blueAccent.withOpacity(0.5),
                        )),
                  ),
                  SizedBox(
                    width: 8,
                  ),
                  new TextButton(
                    onPressed: onPlayAudio,
                    child: new Text("Play", style: TextStyle(color: Colors.white)),
                    style: ButtonStyle(
                        backgroundColor: MaterialStateProperty.all<Color>(
                          Colors.blueAccent.withOpacity(0.5),
                        )),
                  ),
                ],
              ),
              new Text("Status : $_currentStatus"),
              new Text('Avg Power: ${_current?.metering?.averagePower}'),
              new Text('Peak Power: ${_current?.metering?.peakPower}'),
              new Text("File path of the record: ${_current?.path}"),
              new Text("Format: ${_current?.audioFormat}"),
              new Text("isMeteringEnabled: ${_current?.metering?.isMeteringEnabled}"),
              new Text("Extension : ${_current?.extension}"),
              new Text("Audio recording duration : ${_current?.duration.toString()}")
            ]),
      ),
    );
  }

  _init() async {
    try {
      bool hasPermission = await FlutterAudioRecorder2.hasPermissions ?? false;

      if (hasPermission) {
        String customPath = '/flutter_audio_recorder_';
        io.Directory appDocDirectory;
        if (io.Platform.isIOS) {
          appDocDirectory = await getApplicationDocumentsDirectory();
        } else {
          appDocDirectory = (await getExternalStorageDirectory())!;
        }

        customPath = appDocDirectory.path +
            customPath +
            DateTime.now().millisecondsSinceEpoch.toString();

        _recorder = FlutterAudioRecorder2(customPath, audioFormat: AudioFormat.WAV);

        await _recorder!.initialized;
        var current = await _recorder!.current(channel: 0);
        print(current);
        setState(() {
          _current = current;
          _currentStatus = current!.status!;
          print(_currentStatus);
        });
      } else {
        ScaffoldMessenger.of(context).showSnackBar(
            SnackBar(content: new Text("You must accept permissions")));
      }
    } catch (e) {
      print(e);
    }
  }

  _start() async {
    try {
      await _recorder!.start();
      var recording = await _recorder!.current(channel: 0);
      setState(() {
        _current = recording;
      });

      const tick = const Duration(milliseconds: 50);
      new Timer.periodic(tick, (Timer t) async {
        if (_currentStatus == RecordingStatus.Stopped) {
          t.cancel();
        }

        var current = await _recorder!.current(channel: 0);
        setState(() {
          _current = current;
          _currentStatus = _current!.status!;
        });
      });
    } catch (e) {
      print(e);
    }
  }

  _resume() async {
    await _recorder!.resume();
    setState(() {});
  }

  _pause() async {
    await _recorder!.pause();
    setState(() {});
  }

  _stop() async {
    var result = await _recorder!.stop();
    print("Stop recording: ${result!.path}");
    print("Stop recording: ${result.duration}");
    File file = widget.localFileSystem.file(result.path);
    print("File length: ${await file.length()}");
    setState(() {
      _current = result;
      _currentStatus = _current!.status!;
    });
  }

  Widget _buildText(RecordingStatus status) {
    var text = "";
    switch (_currentStatus) {
      case RecordingStatus.Initialized:
        {
          text = 'Start';
          break;
        }
      case RecordingStatus.Recording:
        {
          text = 'Pause';
          break;
        }
      case RecordingStatus.Paused:
        {
          text = 'Resume';
          break;
        }
      case RecordingStatus.Stopped:
        {
          text = 'Init';
          break;
        }
      default:
        break;
    }
    return Text(text, style: TextStyle(color: Colors.white));
  }

  void onPlayAudio() async {
    AudioPlayer audioPlayer = AudioPlayer();
    await audioPlayer.play(_current!.path!, isLocal: true);
  }
}

Hopefully this example helps you understand and use the flutter_audio_recorder2 plugin. If you have any questions or need further help, feel free to ask!


More hands-on tutorials about using the Flutter audio recording plugin flutter_audio_recorder2 are available at https://www.itying.com/category-92-b0.html

1 Reply



Sure, here is a code example showing how to record audio with the flutter_audio_recorder2 plugin, which provides audio recording for Flutter apps.

First, add the dependency to your pubspec.yaml file:

dependencies:
  flutter:
    sdk: flutter
  flutter_audio_recorder2: ^2.4.4  # make sure this is the latest version

Then run flutter pub get to fetch the dependency.

Next, let's look at a basic recording example. It covers initializing the recorder, starting a recording, stopping it, and saving the audio file.

Main code file: main.dart

import 'dart:io';

import 'package:flutter/material.dart';
import 'package:flutter_audio_recorder2/flutter_audio_recorder2.dart';
import 'package:path_provider/path_provider.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: AudioRecorderScreen(),
    );
  }
}

class AudioRecorderScreen extends StatefulWidget {
  @override
  _AudioRecorderScreenState createState() => _AudioRecorderScreenState();
}

class _AudioRecorderScreenState extends State<AudioRecorderScreen> {
  FlutterAudioRecorder2? _recorder;
  late String _localPath;
  bool _isRecording = false;

  @override
  void initState() {
    super.initState();
    _initRecorder();
  }

  Future<void> _initRecorder() async {
    // Get a local directory to store the recordings in
    final Directory appDocDir = await getApplicationDocumentsDirectory();
    _localPath = appDocDir.path;
  }

  Future<void> _startRecording() async {
    String path = '$_localPath/audio_${DateTime.now().millisecondsSinceEpoch}.aac';

    // Create a recorder for this file, initialize it, then start recording
    _recorder = FlutterAudioRecorder2(path, audioFormat: AudioFormat.AAC, sampleRate: 44100);
    await _recorder!.initialized;
    await _recorder!.start();

    setState(() {
      _isRecording = true;
    });
  }

  Future<void> _stopRecording() async {
    // stop() returns a Recording whose path points at the finished file
    final result = await _recorder!.stop();

    setState(() {
      _isRecording = false;
    });

    final String? path = result?.path;
    if (path != null) {
      // Handle the recorded file here, e.g. play it back or share it
      print('Recorded file path: $path');
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Audio Recorder'),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: <Widget>[
            ElevatedButton(
              onPressed: _isRecording ? null : _startRecording,
              child: Text('Start Recording'),
            ),
            SizedBox(height: 20),
            ElevatedButton(
              onPressed: _isRecording ? _stopRecording : null,
              child: Text('Stop Recording'),
            ),
          ],
        ),
      ),
    );
  }

}

Permission Handling

On Android, add the required permissions to your AndroidManifest.xml file:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourapp">

    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>

    <!-- other configuration -->

</manifest>

On iOS, add the permission usage description to your Info.plist file:

<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is needed to record audio</string>

You may also need to handle the permission request at runtime in your iOS project; this is usually done with Flutter's permission_handler plugin.
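
A minimal sketch of such a check with permission_handler (note that Permission.microphone comes from the permission_handler package, not from flutter_audio_recorder2):

import 'package:permission_handler/permission_handler.dart';

Future<bool> requestMicPermission() async {
  // Prompts the user if necessary and reports whether microphone access was granted.
  final status = await Permission.microphone.request();
  return status.isGranted;
}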

Notes

  1. Permission requests: in a real app, handle permission requests and make sure the user has granted the required permissions.
  2. Error handling: in a production app, add more error handling, for example for recording failures (see the sketch after this list).
  3. UI polish: the UI in the example above is intentionally minimal; adapt it to your needs.
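
As a minimal sketch for note 2 (assuming the _recorder and _isRecording fields from the example above), the start call can be wrapped like this:

Future<void> _startRecordingSafely() async {
  try {
    await _recorder!.start();
    setState(() => _isRecording = true);
  } catch (e) {
    // Surface the failure to the user instead of failing silently.
    ScaffoldMessenger.of(context).showSnackBar(
      SnackBar(content: Text('Recording failed: $e')),
    );
  }
}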

This example demonstrates basic audio recording with the flutter_audio_recorder2 plugin. You can extend and modify it further to fit your specific requirements.
