
Loading an AVAsset in AVPlayer from a file that an external source is simultaneously appending to (macOS and iOS)


I have a question about AVFoundation's AVPlayer (it probably applies to both iOS and macOS). I am trying to play audio (uncompressed WAV) data that arrives over a channel other than a standard HTTP Live Stream.

The case:
Audio packets arrive compressed over a channel, along with other data the application needs to use. For example, video and audio come over the same channel and are separated by headers.
After filtering, I take the audio data and decompress it to WAV format (no header is added at this stage).
Once the packets are ready (24 kHz, 2400 bytes each, stereo 16-bit audio), they are passed to an instance of AVPlayer (according to Apple, AVAudioPlayer is not suitable for streaming audio).

Given that AVPlayer (item or asset) does not load from memory (there is no initWithData:(NSData *)) and requires either an HTTP Live Stream URL or a file URL, I create a file on disk (macOS or iOS), add the WAV header, and append the uncompressed data to it.
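For reference, the 44-byte canonical PCM WAV header that gets prepended can be sketched in C as below (a hypothetical helper matching the stream's format; since the file is still growing, the RIFF and data chunk sizes are not yet known, so one common workaround is to write a large placeholder there):

```c
#include <stdint.h>
#include <string.h>

static void put_u32le(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16); p[3] = (uint8_t)(v >> 24);
}
static void put_u16le(uint8_t *p, uint16_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
}

/* Fill buf (>= 44 bytes) with a canonical PCM WAV header; returns the header
   size. data_size is the payload length in bytes -- pass a large placeholder
   such as 0xFFFFFFFF while the file is still being appended to. */
size_t write_wav_header(uint8_t *buf, uint32_t sample_rate, uint16_t channels,
                        uint16_t bits_per_sample, uint32_t data_size) {
    uint16_t block_align = channels * (bits_per_sample / 8);
    uint32_t byte_rate = sample_rate * block_align;

    memcpy(buf, "RIFF", 4);
    put_u32le(buf + 4, 36 + data_size);      /* RIFF chunk size */
    memcpy(buf + 8, "WAVE", 4);
    memcpy(buf + 12, "fmt ", 4);
    put_u32le(buf + 16, 16);                 /* fmt chunk size (PCM) */
    put_u16le(buf + 20, 1);                  /* audio format: PCM */
    put_u16le(buf + 22, channels);
    put_u32le(buf + 24, sample_rate);
    put_u32le(buf + 28, byte_rate);
    put_u16le(buf + 32, block_align);
    put_u16le(buf + 34, bits_per_sample);
    memcpy(buf + 36, "data", 4);
    put_u32le(buf + 40, data_size);          /* data chunk size */
    return 44;
}
```

For the stream above you would call it with (24000, 2, 16), giving a byte rate of 96000 and a block align of 4.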

Back to AVPlayer: I create the following:

AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempAudioFile] options:nil];
AVPlayerItem *audioItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
AVPlayer *audioPlayer = [[AVPlayer alloc] initWithPlayerItem:audioItem];

I add KVO and then try to start playback:

[audioPlayer play];

The result is that the audio plays for 1-2 seconds and then stops (at AVPlayerItemDidPlayToEndTimeNotification, to be exact), while data continues to be appended to the file. Since the whole thing runs in a loop, [audioPlayer play] starts and pauses (rate == 0) multiple times.

The whole concept in simplified form:

-(void)PlayAudioWithData:(NSData *)data //data in encoded format
{
    NSData *decodedSound = [AudioDecoder DecodeData:data]; //decodes the data from the compressed format (Opus) to WAV
    [Player CreateTemporaryFiles]; //This creates the temporary file by appending the header and waiting for input.

    [Player SendDataToPlayer:decodedSound]; //this sends the decoded data to the Player to be stored to file. See below for appending.

    Boolean prepared = [Player isPrepared]; //a check if AVPlayer, Item and Asset are initialized
    if (!prepared) [Player Prepare]; //creates the objects like above
    Boolean playing = [Player isAudioPlaying]; //a check done on the AVPlayer if rate == 1
    if (!playing) [Player startPlay]; //this is actually [audioPlayer play]; on AVPlayer Instance
}

-(void)SendDataToPlayer:(NSData *)data
{
    //Two different methods here. First with NSFileHandle — not so sure about this though as it definitely locks the file.
    //Initializations and deallocations happen elsewhere, just condensing code to give you an idea
    NSFileHandle *audioFile = [NSFileHandle fileHandleForWritingAtPath:_tempAudioFile]; //happens else where
    [audioFile seekToEndOfFile];
    [audioFile writeData:data];
    [audioFile closeFile]; //happens else where

    //Second method, with NSOutputStream:
    NSOutputStream *audioFileStream = [NSOutputStream outputStreamWithURL:[NSURL fileURLWithPath:_tempStreamFile] append:YES];
    [audioFileStream open];
    [audioFileStream write:(const uint8_t *)[data bytes] maxLength:data.length];
    [audioFileStream close];
}

Both NSFileHandle and NSOutputStream produce fully working WAV files that QuickTime, iTunes, VLC, etc. play just fine. Also, if I bypass [Player SendDataToPlayer:decodedSound] and preload the temporary audio file with a standard WAV, it also plays correctly.

So far, two things hold: a) the audio data is decompressed and ready to play, and b) the data is being saved correctly.

What I am trying to do is a continuous send-write-read. This makes me think that saving the data to the file acquires exclusive access to the file resource and does not allow AVPlayer to continue playing.

Does anybody know how to keep the file accessible to both NSFileHandle/NSOutputStream and AVPlayer?

Or even better... is there an AVPlayer initWithData? (heh...)

Any help is deeply appreciated! Thanks in advance.

1 Answer


    You can use AVAssetResourceLoader to provide your own data and metadata to an AVAsset, which you can then play with an AVPlayer. In effect, this gives you [[AVPlayer alloc] initWithData:...]:

    - (AVPlayer *)playerWithWavData:(NSData* )wavData {
        self.strongDelegateReference = [[NSDataAssetResourceLoaderDelegate alloc] initWithData:wavData contentType:AVFileTypeWAVE];
    
        NSURL *url = [NSURL URLWithString:@"ns-data-scheme://"];
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    
        // or some other queue != main queue
        [asset.resourceLoader setDelegate:self.strongDelegateReference queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
    
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        return [[AVPlayer alloc] initWithPlayerItem:item];
    }
    

    Which you can use like this:

    [self setupAudioSession];
    
    NSURL *wavUrl = [[NSBundle mainBundle] URLForResource:@"foo" withExtension:@"wav"];
    NSData *wavData = [NSData dataWithContentsOfURL:wavUrl];
    
    self.player = [self playerWithWavData:wavData];
    
    [self.player play];
    

    The thing is, AVAssetResourceLoader is pretty powerful (unless you want to use AirPlay), so you can probably do better than handing the audio data to AVPlayer in one lump -- you can stream it into the AVAssetResourceLoader delegate as it becomes available.

    Here is the simple "one lump" AVAssetResourceLoader delegate. To modify it for streaming, it should be enough to set a contentLength larger than the amount of data you currently have.

    The header file:

    #import <Foundation/Foundation.h>
    #import <AVFoundation/AVFoundation.h>
    
    @interface NSDataAssetResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>
    
    - (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType;
    
    @end
    

    The implementation file:

    #import "NSDataAssetResourceLoaderDelegate.h"

    @interface NSDataAssetResourceLoaderDelegate()
    
    @property (nonatomic) NSData *data;
    @property (nonatomic) NSString *contentType;
    
    @end
    
    @implementation NSDataAssetResourceLoaderDelegate
    
    - (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType {
        if (self = [super init]) {
            self.data = data;
            self.contentType = contentType;
        }
        return self;
    }
    
    - (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
        AVAssetResourceLoadingContentInformationRequest* contentRequest = loadingRequest.contentInformationRequest;
    
        // TODO: check that loadingRequest.request is actually our custom scheme        
    
        if (contentRequest) {
            contentRequest.contentType = self.contentType;
            contentRequest.contentLength = self.data.length;
            contentRequest.byteRangeAccessSupported = YES;
        }
    
        AVAssetResourceLoadingDataRequest* dataRequest = loadingRequest.dataRequest;
    
        if (dataRequest) {
            // TODO: handle requestsAllDataToEndOfResource
            NSRange range = NSMakeRange((NSUInteger)dataRequest.requestedOffset, (NSUInteger)dataRequest.requestedLength);
            [dataRequest respondWithData:[self.data subdataWithRange:range]];
            [loadingRequest finishLoading];
        }
    
        return YES;
    }
    
    @end
    
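One caveat with the subdataWithRange: call above: once you switch to streaming and advertise a contentLength larger than the data you actually hold, requestedOffset + requestedLength can run past the end of self.data, and subdataWithRange: will throw. The clamping the delegate would then need can be sketched in C (names hypothetical):

```c
#include <stdint.h>

/* Clamp a requested byte range [offset, offset + requested) to the bytes
   actually available. Returns how many bytes can be served right now
   (0 if the offset is at or past the end of the available data). */
uint64_t clamp_request(uint64_t offset, uint64_t requested, uint64_t available) {
    if (offset >= available)
        return 0;
    uint64_t remaining = available - offset;
    return requested < remaining ? requested : remaining;
}
```

In the delegate you would respond with the clamped subrange and only call finishLoading once the full request has been satisfied; until then, keep the loading request around and serve more bytes as they arrive.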
