iOS Mobile Live Streaming: Pushing Custom Captured Video Data


Common Scenario

If you have already implemented audio/video capture and preprocessing (beautification, filters, and so on) yourself and only need the SDK for encoding and stream pushing, the custom capture data interfaces provided by TXLiteAVSDK cover this scenario.

Solution

  1. With custom capture, the TXLivePush startPreview interface no longer needs to be called;
  2. Set the customModeType property via TXLivePushConfig; the two available types are CUSTOM_MODE_VIDEO_CAPTURE (custom video capture) and CUSTOM_MODE_AUDIO_CAPTURE (custom audio capture);
  3. Set the output resolution of the CMSampleBuffer via TXLivePushConfig's sampleBufferSize or autoSampleBufferSize property;
  4. Feed video data to the SDK with the sendVideoSampleBuffer interface; hardware encoding accepts NV12 and BGRA data, software encoding accepts YUV420p data.

A short example (using AVCaptureSession for capture) is shown below:

Code (Objective-C):
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _audioOutput) {
        // Feed audio data to the SDK
        [_txLivePublisher sendAudioSampleBuffer:sampleBuffer withType:RPSampleBufferTypeAudioMic];
    } else {
        // Feed video data to the SDK
        [_txLivePublisher sendVideoSampleBuffer:sampleBuffer];
    }
}
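
// --- Capture session setup (illustrative sketch, not from the original
// article): the delegate above assumes an AVCaptureSession configured with
// separate video and audio data outputs. The ivars _session, _videoOutput,
// _audioOutput and the name setupCaptureSession are assumed names. The video
// output is set to NV12, matching the SDK's hardware-encoding path.
- (void)setupCaptureSession {
    _session = [[AVCaptureSession alloc] init];
    _session.sessionPreset = AVCaptureSessionPreset640x480;

    // Video input and output
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([_session canAddInput:videoInput]) [_session addInput:videoInput];

    _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoOutput.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) // NV12
    };
    [_videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("video.capture", DISPATCH_QUEUE_SERIAL)];
    if ([_session canAddOutput:_videoOutput]) [_session addOutput:_videoOutput];

    // Audio input and output (the SDK requires a 16-bit sample width)
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
    if ([_session canAddInput:audioInput]) [_session addInput:audioInput];

    _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    [_audioOutput setSampleBufferDelegate:self queue:dispatch_queue_create("audio.capture", DISPATCH_QUEUE_SERIAL)];
    if ([_session canAddOutput:_audioOutput]) [_session addOutput:_audioOutput];

    [_session startRunning];
}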

// Configure custom capture parameters and start pushing the stream
- (void)startRtmp {

    if (_txLivePublisher != nil) {

        TXLivePushConfig* config = [[TXLivePushConfig alloc] init];
        // [Example 1] Enable custom video capture (do not call startPreview in this mode)
        config.customModeType |= CUSTOM_MODE_VIDEO_CAPTURE;
        config.autoSampleBufferSize = YES;

        // [Example 2] Enable custom audio capture (the audio sample bit width must be 16)
        //config.customModeType |= CUSTOM_MODE_AUDIO_CAPTURE;
        //config.audioSampleRate = AUDIO_SAMPLE_RATE_48000;

        // Apply the config and start pushing
        [_txLivePublisher setConfig:config];
        _txLivePublisher.delegate = self;
        [_txLivePublisher startPush:rtmpUrl];
    }
}
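
// --- Teardown counterpart (illustrative sketch; stopRtmp and the _session
// ivar are assumed names from the setup sketch above, while stopPush is
// TXLivePush's documented counterpart to startPush) ---
- (void)stopRtmp {
    [_session stopRunning];           // stop delivering sample buffers first
    _txLivePublisher.delegate = nil;
    [_txLivePublisher stopPush];
}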

// Convert YUV data to a CVPixelBuffer (optional step)
- (void)didReceivedYUV420pPacket:(APYUV420pPacket)packet {
    int sYLineSize = packet.yLineSize;
    int sULineSize = packet.uLineSize;
    int sVLineSize = packet.vLineSize;
    int sYSize = sYLineSize * packet.height;
    int sUSize = sULineSize * packet.height/2;
    int sVSize = sVLineSize * packet.height/2;

    int dWidth = packet.width;
    int dHeight = packet.height;

    CVPixelBufferRef pxbuffer = NULL;   // initialize so the error paths can safely CFRelease
    CVReturn rc;

    rc = CVPixelBufferCreate(NULL, dWidth, dHeight, kCVPixelFormatType_420YpCbCr8PlanarFullRange, NULL, &pxbuffer);
    if (rc != 0) {
        NSLog(@"CVPixelBufferCreate failed %d", rc);
        if (pxbuffer) { CFRelease(pxbuffer); }
        return;
    }

    rc = CVPixelBufferLockBaseAddress(pxbuffer, 0);

    if (rc != 0) {
        NSLog(@"CVPixelBufferLockBaseAddress falied %d", rc);
        if (pxbuffer) { CFRelease(pxbuffer); }
        return;
    } else {
        uint8_t *y_copyBaseAddress = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pxbuffer, 0);
        uint8_t *u_copyBaseAddress = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pxbuffer, 1);
        uint8_t *v_copyBaseAddress = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pxbuffer, 2);

        int dYLineSize = (int)CVPixelBufferGetBytesPerRowOfPlane(pxbuffer, 0);
        int dULineSize = (int)CVPixelBufferGetBytesPerRowOfPlane(pxbuffer, 1);
        int dVLineSize = (int)CVPixelBufferGetBytesPerRowOfPlane(pxbuffer, 2);

        // Note: these flat copies assume the destination strides (dYLineSize,
        // dULineSize, dVLineSize) equal the source line sizes; see the
        // stride-aware variant after this listing.
        memcpy(y_copyBaseAddress, packet.dataBuffer,                   sYSize);
        memcpy(u_copyBaseAddress, packet.dataBuffer + sYSize,          sUSize);
        memcpy(v_copyBaseAddress, packet.dataBuffer + sYSize + sUSize, sVSize);


        rc = CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        if (rc != 0) {
            NSLog(@"CVPixelBufferUnlockBaseAddress falied %d", rc);
        }
    }

    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pxbuffer, &videoInfo);

    CMSampleTimingInfo timing = {kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid};
    CMSampleBufferRef dstSampleBuffer = NULL;
    rc = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, YES, NULL, NULL, videoInfo, &timing, &dstSampleBuffer);

    if (rc) {
        NSLog(@"CMSampleBufferCreateForImageBuffer error: %d", rc);
    } else {
        [self.txLivePublisher sendVideoSampleBuffer:dstSampleBuffer];
    }

    if (pxbuffer) { CFRelease(pxbuffer); }
    if (videoInfo) { CFRelease(videoInfo); }
    if (dstSampleBuffer) { CFRelease(dstSampleBuffer); }
}
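
A caveat on the conversion above: CVPixelBufferCreate may pad each plane's bytes-per-row beyond the source line size, in which case the flat memcpy calls would corrupt the image. A stride-aware, row-by-row copy is the safe variant; copyPlane below is a hypothetical helper for illustration, not an SDK function:

// Copy one image plane row by row so that differing source/destination
// strides (bytes per row) are handled correctly.
static void copyPlane(uint8_t *dst, size_t dstStride,
                      const uint8_t *src, size_t srcStride,
                      size_t width, size_t height) {
    for (size_t row = 0; row < height; row++) {
        // Copy only the visible width; destination padding is left untouched.
        memcpy(dst + row * dstStride, src + row * srcStride, width);
    }
}

For the Y plane the call would be copyPlane(y_copyBaseAddress, dYLineSize, packet.dataBuffer, sYLineSize, dWidth, dHeight); the U and V planes use half the width and height.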

How It Works

Interface Description

Code (Objective-C):
/**
 * Send a custom SampleBuffer (replaces sendCustomVideoData).
 * Simple frame-rate control is applied internally: frames that arrive too fast are dropped; on timeout the last frame is re-sent.
 * @note For the related settings see TXLivePushConfig; autoSampleBufferSize takes precedence over sampleBufferSize.
 * @property sampleBufferSize Sets the output resolution; if it differs from the resolution of the data in the sampleBuffer, the video is scaled.
 * @property autoSampleBufferSize Makes the output resolution equal the input resolution, i.e. the actual resolution of the data in the sampleBuffer.
 */
- (void)sendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;

/**
 * Send custom audio packets from ReplayKit.
 * @param sampleBuffer     The audio data.
 * @param sampleBufferType RPSampleBufferTypeAudioApp or RPSampleBufferTypeAudioMic.
 *
 * When both types are sent, they are mixed internally; otherwise the single stream is sent as-is.
 */
- (void)sendAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType;
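
Because sendAudioSampleBuffer takes ReplayKit's RPSampleBufferType, a natural place to call both methods is an RPBroadcastSampleHandler subclass in a broadcast upload extension. A minimal routing sketch, assuming the handler keeps the pusher in a hypothetical txLivePush property:

// Route ReplayKit sample buffers to the SDK (RPBroadcastSampleHandler override).
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            [self.txLivePush sendVideoSampleBuffer:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp: // app audio
        case RPSampleBufferTypeAudioMic: // microphone audio; mixed internally when both are sent
            [self.txLivePush sendAudioSampleBuffer:sampleBuffer withType:sampleBufferType];
            break;
        default:
            break;
    }
}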

Custom capture data flow diagram

Notes

  1. Encoding failures: the sampleBuffer data passed to the SDK through sendVideoSampleBuffer must be NV12 or BGRA for hardware encoding, or YUV420p for software encoding; any other format will cause encoding to fail (a format check is sketched after this list).
  2. For the CMSampleBuffer output resolution, prefer autoSampleBufferSize over sampleBufferSize; if you do set sampleBufferSize, make sure the resolution of the video data passed to the SDK matches it. For example, if the video data is 360x640, set _config.sampleBufferSize = CGSizeMake(360, 640);
  3. The width (or height) of the push resolution specified with setVideoResolution must be less than or equal to the width (or height) of the camera preview. For example, if the preview resolution is 960x720, a push resolution of 960x540 is acceptable.
  4. If you do not use the custom capture interfaces, do not set the customModeType property of TXLivePushConfig.
  5. When customModeType in TXLivePushConfig is set to CUSTOM_MODE_VIDEO_CAPTURE, the SDK still captures audio data itself.
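
As referenced in note 1, checking the pixel format before handing a frame to the SDK can save time when encoding fails silently. A minimal sketch, assuming the frame carries a CVPixelBuffer:

// Returns YES if the buffer uses a pixel format the SDK can encode:
// NV12 or BGRA for hardware encoding, planar YUV420 for software encoding.
static BOOL isSupportedPixelFormat(CMSampleBufferRef sampleBuffer) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return NO;
    }
    OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);
    return format == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange  // NV12 (video range)
        || format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange   // NV12 (full range)
        || format == kCVPixelFormatType_32BGRA                        // BGRA
        || format == kCVPixelFormatType_420YpCbCr8PlanarFullRange;    // YUV420p
}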


Click here for the complete custom capture demo.
