How to get the actual size and a screenshot of Bitmovin Player content on iOS

Hi Bitmovin,

We have successfully replaced AVPlayer with the Bitmovin Player in our iOS application.
We only need to play HLS and MP4 videos for now.
However, we are facing two issues where we are not able to get the following values:

1 - We are not able to get a screenshot of the Bitmovin Player, which we previously got with the following code when using AVPlayer:

- (UIImage *)currentItemAsset
{
    CVPixelBufferRef buffer = [self.itemVideoOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
    CVPixelBufferRelease(buffer);
    UIImage *currentVideoFrame = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
    return currentVideoFrame;
}

2 - We are not able to get the size of the player content, i.e. the actual frame of the displayed video. Previously we got this with the following AVPlayer-based code:

/*!
	@property	videoBounds
	@abstract	The current size and position of the video image as displayed within the receiver's view's bounds.
 */
@property (nonatomic, readonly) CGRect videoBounds;

//Uses
CGRect avPlayerBounds = self.avCustomPlayerController.videoBounds;

Please help us get the correct data from the Bitmovin Player.

Here is how I initialized the Bitmovin Player:

-(void)initBitmovinPlayer {
    self.view.backgroundColor = UIColor.blackColor;
    // Create player configuration
    BMPPlayerConfig *playerConfig = [BMPPlayerConfig new];
    // Set your player license key on the player configuration
    playerConfig.key = kBitmovinPlayerLicenceKey;
    playerConfig.styleConfig.isUiEnabled = NO;
    
    playerConfig.styleConfig.scalingMode = BMPScalingModeFit;
    
    // Create player based on the player configuration
    self.bitmovinPlayer = [BMPPlayerFactory createWithPlayerConfig:playerConfig];
    // Create player view and pass the player instance to it
    self.bitmovinPlayerView = [[BMPPlayerView alloc] initWithPlayer:self.bitmovinPlayer frame:CGRectZero];
    // Listen to player events
    [self.bitmovinPlayer addPlayerListener:self];
         
    self.bitmovinPlayerView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
    self.bitmovinPlayerView.frame = self.dotContainerView.bounds;
    
    [self.dotContainerView addSubview:self.bitmovinPlayerView];
    [self.dotContainerView bringSubviewToFront:self.bitmovinPlayerView];
         
}

Any help would be appreciated.

Thanks

Hi @iamravigautam,

Unfortunately, we do not support those two APIs you used before. Please open a feature request by following this post.

I have managed to do that with the following workaround for the Bitmovin Player.

I got the size with the following code:

AVPlayerLayer *layer = (AVPlayerLayer*)self.bitmovinPlayerView.layer;
NSLog(@"layer.width-->%f",layer.videoRect.size.width);
NSLog(@"layer.height-->%f",layer.videoRect.size.height);

And I got the screenshot with the following code:

AVPlayerLayer *layer = (AVPlayerLayer*)self.bitmovinPlayerView.layer;
self.avCustomPlayerController = [[AVCustomPlayerViewController alloc] initWithItem:layer.player.currentItem];


- (instancetype)initWithItem:(AVPlayerItem *)playerItem
{
    if (self = [super init])
    {
        // Attach a video output to the player item so individual frames
        // can later be copied out as pixel buffers.
        NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
        self.itemVideoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
        self.playerItem = playerItem;
        [self.playerItem addOutput:self.itemVideoOutput];
    }
    return self;
}
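
One caveat with this setup: after the output has been attached to the item, it can take a moment before a frame is actually available for copying. A small check that can be used before grabbing a frame (a sketch reusing the same itemVideoOutput and playerItem properties; the method name is my own):

- (BOOL)hasVideoFrameAvailable
{
    // AVPlayerItemVideoOutput reports whether a pixel buffer that has not been
    // copied yet is available for the given item time.
    return [self.itemVideoOutput hasNewPixelBufferForItemTime:[self.playerItem currentTime]];
}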

- (UIImage *)currentItemAsset
{
    // Copy the most recent frame from the video output; this can return NULL
    // if no pixel buffer is available for the current item time yet.
    CVPixelBufferRef buffer = [self.itemVideoOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    if (buffer == NULL) {
        return nil;
    }

    // Render the pixel buffer into a CGImage via Core Image and wrap it in a UIImage.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
    CVPixelBufferRelease(buffer);

    UIImage *currentVideoFrame = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
    return currentVideoFrame;
}
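
For completeness, a rough usage sketch (snapshotImageView is just a hypothetical place to show the result; the returned image can be nil when no frame is available yet):

UIImage *currentVideoFrame = [self.avCustomPlayerController currentItemAsset];
if (currentVideoFrame != nil) {
    // Hypothetical UIImageView used here only to illustrate consuming the screenshot.
    self.snapshotImageView.image = currentVideoFrame;
} else {
    NSLog(@"No video frame available for a screenshot yet");
}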

I hope this can help others.

Thanks

While this workaround technically works, accessing the underlying AVPlayer instance is not recommended. We do not support it, and unexpected issues can occur.