How to implement a custom video recording page and a custom recording countdown in HarmonyOS Next

Posted 1 week ago by yuanlaile in HarmonyOS

How can I implement a custom video recording page and a custom video recording countdown in HarmonyOS Next?

3 Replies

You can refer to the video recording guide: https://developer.huawei.com/consumer/cn/doc/harmonyos-guides-V5/camera-recording-case-V5

Reference code:

import { media } from '@kit.MediaKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { abilityAccessCtrl, PermissionRequestResult, Permissions, bundleManager, common } from '@kit.AbilityKit';
import { camera } from '@kit.CameraKit';
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { fileIo as fs } from '@kit.CoreFileKit';

@Entry
@Component
struct XComponentPage {
  mXComponentController: XComponentController = new XComponentController;
  surfaceId: string = '';
  format: string = 'mm:ss'
  textTimerController: TextTimerController = new TextTimerController()
  isShow:boolean = true

  timer(){
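    // Countdown overlay: a 3-second TextTimer shown over the camera preview; when it finishes, recording starts.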
    Flex({
      alignItems:ItemAlign.Center,
      justifyContent:FlexAlign.Center
    }){
      TextTimer({
        isCountDown: true,
        count: 3000,
        controller: this.textTimerController
      })
      .format(this.format)
      .fontColor(Color.White)
      .fontSize(50)
      .onTimer(async (utc: number, elapsedTime: number) => {
        console.info('textTimer countDown utc is: ' + utc + ', elapsedTime: ' + elapsedTime)
        if(elapsedTime === 3){
          console.info("videoRecorderDemo")
          await videoRecording(getContext(),this.surfaceId)
        }
      })
    }
    .visibility(this.isShow?Visibility.Visible:Visibility.Hidden)
    .width("100%")
    .height("100%")
  }

  build(){
    Flex({ wrap: FlexWrap.Wrap }) {
      Row(){
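        // Camera preview surface: its surfaceId (read in onLoad) is passed to videoRecording() and used for the preview output.
        // 'libraryname' is only needed when rendering through a native (C/C++) library; it can stay empty here.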
        XComponent({
          id: '',
          type: 'surface',
          libraryname: '',
          controller: this.mXComponentController
        })
        .onLoad(() => {
          console.log(`loading start`);
          this.surfaceId = this.mXComponentController.getXComponentSurfaceId();
          console.log(`loading end surfaceId = ${JSON.stringify(this.surfaceId)}`);
        })
      }
      .width('100%')
      .height(480)
      .overlay(this.timer())
      Column(){
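        // Request the CAMERA and MICROPHONE permissions at runtime (they must also be declared in module.json5),
        // then restart the 3-second countdown.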
        SaveButton()
        .onClick(async () => {
          let permissions: Permissions[] = ['ohos.permission.CAMERA','ohos.permission.MICROPHONE']
          let atManager = abilityAccessCtrl.createAtManager();
          await atManager.requestPermissionsFromUser(getContext(), permissions);
          this.textTimerController.reset()
          this.textTimerController.start()
        })
      }.width("100%").padding({top:10})
    }
  }
}

async function videoRecording(context: common.Context, surfaceId: string): Promise<void> {
  let cameraManager: camera.CameraManager = camera.getCameraManager(context);
  if (!cameraManager) {
    console.error("camera.getCameraManager error");
    return;
  }
  cameraManager.on('cameraStatus', (err: BusinessError, cameraStatusInfo: camera.CameraStatusInfo) => {
    if (err !== undefined && err.code !== 0) {
      console.error('cameraStatus with errorCode = ' + err.code);
      return;
    }
    console.info(`camera : ${cameraStatusInfo.camera.cameraId}`);
    console.info(`status: ${cameraStatusInfo.status}`);
  });
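  // Enumerate the camera devices; this sample uses the first one (cameraArray[0]).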
  let cameraArray: Array<camera.CameraDevice> = [];
  try {
    cameraArray = cameraManager.getSupportedCameras();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`getSupportedCameras call failed. error code: ${err.code}`);
  }
  if (cameraArray.length <= 0) {
    console.error("cameraManager.getSupportedCameras error");
    return;
  }
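  // Make sure the selected camera supports the NORMAL_VIDEO scene mode.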
  let sceneModes: Array<camera.SceneMode> = cameraManager.getSupportedSceneModes(cameraArray[0]);
  let isSupportVideoMode: boolean = sceneModes.indexOf(camera.SceneMode.NORMAL_VIDEO) >= 0;
  if (!isSupportVideoMode) {
    console.error('video mode not support');
    return;
  }
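  // Query the preview/photo/video profiles supported in video mode.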
  let cameraOutputCap: camera.CameraOutputCapability = cameraManager.getSupportedOutputCapability(cameraArray[0], camera.SceneMode.NORMAL_VIDEO);
  if (!cameraOutputCap) {
    console.error("cameraManager.getSupportedOutputCapability error")
    return;
  }
  console.info("outputCapability: " + JSON.stringify(cameraOutputCap));
  let previewProfilesArray: Array<camera.Profile> = cameraOutputCap.previewProfiles;
  if (!previewProfilesArray) {
    console.error("createOutput previewProfilesArray == null || undefined");
  }
  let photoProfilesArray: Array<camera.Profile> = cameraOutputCap.photoProfiles;
  if (!photoProfilesArray) {
    console.error("createOutput photoProfilesArray == null || undefined");
  }
  let videoProfilesArray: Array<camera.VideoProfile> = cameraOutputCap.videoProfiles;
  if (!videoProfilesArray) {
    console.error("createOutput videoProfilesArray == null || undefined");
  }
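  // Pick a 640x480 profile from the supported video profiles.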
  let videoSize: camera.Size = {
    width: 640,
    height: 480
  }
  let videoProfile: undefined | camera.VideoProfile = videoProfilesArray.find((profile: camera.VideoProfile) => {
    return profile.size.width === videoSize.width && profile.size.height === videoSize.height;
  });
  if (!videoProfile) {
    console.error('videoProfile is not found');
    return;
  }
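  // Recording parameters: AAC audio and AVC (H.264) video in an MP4 container at 30 fps.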
  let aVRecorderProfile: media.AVRecorderProfile = {
    audioBitrate: 48000,
    audioChannels: 2,
    audioCodec: media.CodecMimeType.AUDIO_AAC,
    audioSampleRate: 48000,
    fileFormat: media.ContainerFormatType.CFT_MPEG_4,
    videoBitrate: 2000000,
    videoCodec: media.CodecMimeType.VIDEO_AVC,
    videoFrameWidth: videoSize.width,
    videoFrameHeight: videoSize.height,
    videoFrameRate: 30
  };
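  // Create an mp4 asset in the media library and open it to obtain a writable file descriptor.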
  let options: photoAccessHelper.CreateOptions = {
    title: Date.now().toString()
  };
  let accessHelper: photoAccessHelper.PhotoAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
  let videoUri: string = await accessHelper.createAsset(photoAccessHelper.PhotoType.VIDEO, 'mp4', options);
  console.info("videoUri:",videoUri)
  let file: fs.File = fs.openSync(videoUri, fs.OpenMode.READ_WRITE | fs.OpenMode.CREATE);
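  // Recorder configuration: microphone audio, surface video source, output written to the asset's fd.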
  let aVRecorderConfig: media.AVRecorderConfig = {
    audioSourceType: media.AudioSourceType.AUDIO_SOURCE_TYPE_MIC,
    videoSourceType: media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV,
    profile: aVRecorderProfile,
    url: `fd://${file.fd.toString()}`, 
    rotation: 0, 
    location: { latitude: 30, longitude: 130 }
  };
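  // Create and prepare the AVRecorder, then obtain its input surface for the camera video output.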
  let avRecorder: media.AVRecorder | undefined = undefined;
  try {
    avRecorder = await media.createAVRecorder();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`createAVRecorder call failed. error code: ${err.code}`);
  }
  if (avRecorder === undefined) {
    return;
  }
  try {
    await avRecorder.prepare(aVRecorderConfig);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`prepare call failed. error code: ${err.code}`);
  }
  let videoSurfaceId: string | undefined = undefined; 
  try {
    videoSurfaceId = await avRecorder.getInputSurface();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`getInputSurface call failed. error code: ${err.code}`);
  }
  if (videoSurfaceId === undefined) {
    return;
  }
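  // Create the camera video output that feeds frames into the recorder's input surface.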
  let videoOutput: camera.VideoOutput | undefined = undefined;
  try {
    videoOutput = cameraManager.createVideoOutput(videoProfile, videoSurfaceId);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to create the videoOutput instance. error: ${JSON.stringify(err)}`);
  }
  if (videoOutput === undefined) {
    return;
  }
  videoOutput.on('error', (error: BusinessError) => {
    console.error(`Preview output error code: ${error.code}`);
  });
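  // Create the video session, add the camera input, preview output and video output, then commit the configuration and start the session.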
  let videoSession: camera.VideoSession | undefined = undefined;
  try {
    videoSession = cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to create the session instance. error: ${JSON.stringify(err)}`);
  }
  if (videoSession === undefined) {
    return;
  }
  videoSession.on('error', (error: BusinessError) => {
    console.error(`Video session error code: ${error.code}`);
  });
  try {
    videoSession.beginConfig();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to beginConfig. error: ${JSON.stringify(err)}`);
  }
  let cameraInput: camera.CameraInput | undefined = undefined;
  try {
    cameraInput = cameraManager.createCameraInput(cameraArray[0]);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to createCameraInput. error: ${JSON.stringify(err)}`);
  }
  if (cameraInput === undefined) {
    return;
  }
  cameraInput.on('error', cameraArray[0], (error: BusinessError) => {
    console.error(`Camera input error code: ${error.code}`);
  });
  try {
    await cameraInput.open();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to open cameraInput. error: ${JSON.stringify(err)}`);
  }
  try {
    videoSession.addInput(cameraInput);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to add cameraInput. error: ${JSON.stringify(err)}`);
  }
  let previewOutput: camera.PreviewOutput | undefined = undefined;
  try {
    previewOutput = cameraManager.createPreviewOutput(previewProfilesArray[0], surfaceId);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to create the PreviewOutput instance. error: ${JSON.stringify(err)}`);
  }
  if (previewOutput === undefined) {
    return;
  }
  try {
    videoSession.addOutput(previewOutput);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to add previewOutput. error: ${JSON.stringify(err)}`);
  }
  try {
    videoSession.addOutput(videoOutput);
  } catch (error) {
    let err = error as BusinessError;
    console.error(`Failed to add videoOutput. error: ${JSON.stringify(err)}`);
  }
  try {
    await videoSession.commitConfig();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`videoSession commitConfig error: ${JSON.stringify(err)}`);
  }
  try {
    await videoSession.start();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`videoSession start error: ${JSON.stringify(err)}`);
  }
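  // Start the camera video output first, then start the AVRecorder to begin writing the file.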
  videoOutput.start((err: BusinessError) => {
    if (err) {
      console.error(`Failed to start the video output. error: ${JSON.stringify(err)}`);
      return;
    }
    console.info('Callback invoked to indicate the video output start success.');
  });
  try {
    await avRecorder.start();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`avRecorder start error: ${JSON.stringify(err)}`);
  }
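  // Stop recording. In this sample stop is called right after start; in a real page it would be triggered by the user.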
  videoOutput.stop((err: BusinessError) => {
    if (err) {
      console.error(`Failed to stop the video output. error: ${JSON.stringify(err)}`);
      return;
    }
    console.info('Callback invoked to indicate the video output stop success.');
  });
  try {
    await avRecorder.stop();
  } catch (error) {
    let err = error as BusinessError;
    console.error(`avRecorder stop error: ${JSON.stringify(err)}`);
  }
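  // Stop the session, close the file and camera input, and release the outputs and the session.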
  videoSession.stop();
  fs.closeSync(file);
  cameraInput.close();
  previewOutput.release();
  videoOutput.release();
  videoSession.release();
  videoSession = undefined;
}

For more hands-on tutorials on implementing a custom video recording page and recording countdown in HarmonyOS Next, see https://www.itying.com/category-93-b0.html


In HarmonyOS Next, a custom video recording page and a custom recording countdown can be implemented with the following steps:

  1. Custom video recording page

    • Use the CameraKit and MediaRecorder classes for video recording.
    • Build a custom layout containing a preview window, a record button, a countdown display and other controls.
    • Use a SurfaceView or TextureView as the container for the video preview.
    • Obtain the camera data through the Camera / Camera2 API and render it into the preview window.
    • Listen for clicks on the record button and call MediaRecorder's start() and stop() methods to control recording.
  2. Custom recording countdown

    • Use a CountDownTimer-style helper, or a simple setInterval timer as in the snippet below, to implement the countdown.
    • While the countdown is running, update the UI with the remaining time.
    • When the countdown finishes, start video recording automatically.
    • Animations or sound prompts can be added during the countdown to improve the user experience.

Sample code snippet:

// Custom countdown: 10 seconds, updated once per second.
// updateCountDownUI() and mediaRecorder are placeholders for your own UI-update
// method and a prepared recorder (e.g. AVRecorder) instance.
let remainingSeconds: number = 10;
const timerId = setInterval(() => {
  remainingSeconds--;
  updateCountDownUI(remainingSeconds); // refresh the countdown display
  if (remainingSeconds <= 0) {
    clearInterval(timerId);            // countdown finished
    startRecording();                  // start recording automatically
  }
}, 1000);

// Start recording
function startRecording() {
  mediaRecorder.start();
}

In HarmonyOS Next, a custom video recording page can be implemented with the Camera and MediaRecorder components. First use the Camera component to obtain the camera feed, then record the video with MediaRecorder. The custom countdown can be implemented with a Timer or TaskDispatcher: schedule a timed task and update the countdown display on the UI thread. A code sketch follows:

// Initialize Camera and MediaRecorder (Java-style pseudocode; HarmonyOS Next itself uses the ArkTS APIs shown above)
Camera camera = new Camera(context);
MediaRecorder recorder = new MediaRecorder();

// Set up the countdown
TaskDispatcher dispatcher = TaskDispatcherFactory.getInstance().getMainTaskDispatcher();
dispatcher.delayDispatch(() -> {
    // update the countdown display in the UI
}, 1000); // runs once after a 1-second delay; re-dispatch to tick every second

In this way, a custom video recording page and countdown can be implemented flexibly.
