Implement painting for MSE videos using -[AVSampleBufferDisplayLayer copyDisplayedPixelBuffer]
https://bugs.webkit.org/show_bug.cgi?id=241788
rdar://94325004

Reviewed by Eric Carlson.

In r292811, we enabled MSE inline painting on iOS 16 and macOS Ventura; this was intentionally
limited to these versions, since CoreMedia made refinements in these OS versions to prune the
`AVSampleBufferVideoOutput` queue more frequently, in order to avoid a large increase in memory use
while playing MSE videos, due to accumulating excess video output frame data. However, this more
frequent pruning has led to significantly increased power use when playing MSE video, due to the
extra work done every time the pruning timer fires.

To ensure that Live Text in MSE video and MSE to canvas painting still work in iOS 16 and macOS
Ventura, we instead adopt new AVFoundation SPI that allows us to ask `AVSampleBufferDisplayLayer`
directly for the currently displayed pixel buffer. Unlike the `AVSampleBufferVideoOutput`-based
approach, this only kicks in if MSE video inline painting is actually requested (either by the page,
or from within the engine, in the case of Live Text), which avoids both the increased memory use and
the increased power use.
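
For illustration, the core of the new readback path boils down to the following sketch (a
simplified stand-in, not the exact change; see the `updateLastPixelBuffer` hunk below). The helper
name is hypothetical; the `-respondsToSelector:` guard and the `adoptCF` of the +1 reference match
the real code:

    #import <AVFoundation/AVSampleBufferDisplayLayer.h>
    #import <wtf/RetainPtr.h>

    // Hypothetical helper; the staging category in AVFoundationSPI.h declares
    // -copyDisplayedPixelBuffer when building against older SDKs.
    static RetainPtr<CVPixelBufferRef> copyDisplayedPixelBufferIfSupported(AVSampleBufferDisplayLayer *layer)
    {
        // -copyDisplayedPixelBuffer follows the CF Create/Copy rule, so adopt the +1 reference.
        if ([layer respondsToSelector:@selector(copyDisplayedPixelBuffer)])
            return adoptCF([layer copyDisplayedPixelBuffer]);
        return nullptr; // No SPI available; callers fall back to AVSampleBufferVideoOutput.
    }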

On versions of macOS and iOS that don't have the new SPI, we simply fall back to the
`AVSampleBufferVideoOutput`-based snapshotting approach that we currently use. We also fall back to
using the video output if the display layer is empty, in which case the backing `CAImageQueue` won't
contain _any_ displayed surfaces (which means `-copyDisplayedPixelBuffer` will always end up
returning null). By factoring the logic that creates and sets `m_videoOutput` out into a helper
method (`updateVideoOutput`) that's invoked after we've finished setting up the sample buffer
display layer, we can transition as needed between setting and unsetting the video output, based on
whether or not the display layer is actually displaying any content.
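
In condensed form, the readback method is chosen roughly like this (a sketch of the
`readbackMethod()` hunk below; the two booleans stand in for the real SPI-availability and
non-empty-bounds checks):

    MediaPlayerPrivateMediaSourceAVFObjC::VideoOutputReadbackMethod MediaPlayerPrivateMediaSourceAVFObjC::readbackMethod() const
    {
        if (!MediaSessionManagerCocoa::mediaSourceInlinePaintingEnabled())
            return VideoOutputReadbackMethod::None;

        // Prefer the new SPI, but only when the display layer (or the content box it will be
        // sized to) is non-empty; an empty layer's backing CAImageQueue has no displayed
        // surface, so -copyDisplayedPixelBuffer would always return null.
        if (canCopyDisplayedPixelBuffer && layerOrContentBoxIsNonEmpty)
            return VideoOutputReadbackMethod::CopyPixelBufferFromDisplayLayer;

        // Otherwise, fall back to AVSampleBufferVideoOutput-based snapshotting.
        if (PAL::getAVSampleBufferVideoOutputClass())
            return VideoOutputReadbackMethod::UseVideoOutput;

        return VideoOutputReadbackMethod::None;
    }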

There should be no change in behavior, apart from less memory and power use due to not spinning up
the `AVSampleBufferVideoOutput` queue whenever we play MSE videos. See below for more details.

* Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml:

Gate MSE inline painting on `HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)`, instead of
`HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)`.

* Source/WTF/wtf/PlatformHave.h:

Add a new feature flag to guard the availability of the new AVFoundation SPI,
`-[AVSampleBufferDisplayLayer copyDisplayedPixelBuffer]`.

* Source/WebCore/PAL/pal/spi/cocoa/AVFoundationSPI.h:

Add a staging declaration for `-copyDisplayedPixelBuffer`, so that we can maintain source
compatibility when building against older versions of the iOS 16 or macOS Ventura SDKs.

* Source/WebCore/html/canvas/WebGLRenderingContextBase.cpp:
(WebCore::WebGLRenderingContextBase::texImageSourceHelper):
* Source/WebCore/platform/graphics/MediaPlayer.cpp:
* Source/WebCore/platform/graphics/MediaPlayer.h:
* Source/WebCore/platform/graphics/MediaPlayerPrivate.h:

Replace more uses of `HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)` with the new flag
`HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)`, which is now used to guard availability
of MSE inline painting. The purpose of guarding this logic behind
`!HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)` in the first place seems to have been to limit the
`willBeAskedToPaintGL()` codepaths to versions of macOS and iOS where we can't enable MSE inline
painting due to lack of system support. Since "system support" now depends on the availability of
`-copyDisplayedPixelBuffer`, we should use that flag rather than one about pruning interval
frequency. This also allows us to remove the `HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)` flag
altogether, now that no remaining code needs to be guarded by it.

* Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h:
* Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateLastPixelBuffer):

Adjust this logic to ask `m_sampleBufferDisplayLayer` for a copy of the last displayed pixel buffer
when possible, instead of grabbing it from the video output.

(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::shouldEnsureLayer const):
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged):
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateVideoOutput):

Factor the logic for creating or destroying the video output out into a separate helper method
that's invoked after updating the display layer.

(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer):
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::readbackMethod const):

Replace `isVideoOutputAvailable()` with a new helper method that returns a strongly typed enum
indicating which readback method to use. `None` indicates that readback isn't supported,
`CopyPixelBufferFromDisplayLayer` indicates that we'll use the new AVFoundation SPI method, and
`UseVideoOutput` indicates that we'll fall back to `AVSampleBufferVideoOutput`.

(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::isVideoOutputAvailable const): Deleted.
* Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h:
* Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in:
* Source/WebKit/GPUProcess/media/cocoa/RemoteMediaPlayerProxyCocoa.mm:
* Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp:
* Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h:

Canonical link: https://commits.webkit.org/251761@main


git-svn-id: http://svn.webkit.org/repository/webkit/trunk@295756 268f45cc-cd09-0410-ab3c-d52691b4dbfc
diff --git a/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml b/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml
index 36acbf0..cbb7afb 100644
--- a/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml
+++ b/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml
@@ -962,7 +962,7 @@
     WebKitLegacy:
       default: false
     WebKit:
-      "HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)": true
+      "HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)": true
       default: false
 
 ModelDocumentEnabled:
diff --git a/Source/WTF/wtf/PlatformHave.h b/Source/WTF/wtf/PlatformHave.h
index 8b81ebd..50fbc3a 100644
--- a/Source/WTF/wtf/PlatformHave.h
+++ b/Source/WTF/wtf/PlatformHave.h
@@ -1273,3 +1273,11 @@
 #define HAVE_MEDIA_VOLUME_PER_ELEMENT 1
 #endif
 
+#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 130000) \
+    || ((PLATFORM(IOS) || PLATFORM(MACCATALYST)) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 160000) \
+    || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 90000) \
+    || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 160000)
+#if !defined(HAVE_AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
+#define HAVE_AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER 1
+#endif
+#endif
diff --git a/Source/WebCore/PAL/pal/spi/cocoa/AVFoundationSPI.h b/Source/WebCore/PAL/pal/spi/cocoa/AVFoundationSPI.h
index df03bfe..971a169 100644
--- a/Source/WebCore/PAL/pal/spi/cocoa/AVFoundationSPI.h
+++ b/Source/WebCore/PAL/pal/spi/cocoa/AVFoundationSPI.h
@@ -366,6 +366,12 @@
 NS_ASSUME_NONNULL_END
 #endif // __has_include(<AVFoundation/AVSampleBufferDisplayLayer.h>)
 
+#if HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
+@interface AVSampleBufferDisplayLayer (Staging_94324932)
+- (nullable CVPixelBufferRef)copyDisplayedPixelBuffer;
+@end
+#endif
+
 #if __has_include(<AVFoundation/AVSampleBufferAudioRenderer.h>)
 #import <AVFoundation/AVSampleBufferAudioRenderer.h>
 NS_ASSUME_NONNULL_BEGIN
diff --git a/Source/WebCore/html/canvas/WebGLRenderingContextBase.cpp b/Source/WebCore/html/canvas/WebGLRenderingContextBase.cpp
index eec49a6..6d7f6e8 100644
--- a/Source/WebCore/html/canvas/WebGLRenderingContextBase.cpp
+++ b/Source/WebCore/html/canvas/WebGLRenderingContextBase.cpp
@@ -5109,7 +5109,7 @@
             && type == GraphicsContextGL::UNSIGNED_BYTE
             && !level) {
             if (auto player = video->player()) {
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
                 player->willBeAskedToPaintGL();
 #endif
                 if (m_context->copyTextureFromMedia(*player, texture->object(), target, level, internalformat, format, type, m_unpackPremultiplyAlpha, m_unpackFlipY)) {
diff --git a/Source/WebCore/platform/graphics/MediaPlayer.cpp b/Source/WebCore/platform/graphics/MediaPlayer.cpp
index 0b27b4c..03d9b7a 100644
--- a/Source/WebCore/platform/graphics/MediaPlayer.cpp
+++ b/Source/WebCore/platform/graphics/MediaPlayer.cpp
@@ -1090,7 +1090,7 @@
 }
 
 
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
 void MediaPlayer::willBeAskedToPaintGL()
 {
     m_private->willBeAskedToPaintGL();
diff --git a/Source/WebCore/platform/graphics/MediaPlayer.h b/Source/WebCore/platform/graphics/MediaPlayer.h
index d97e2c5..71607a6 100644
--- a/Source/WebCore/platform/graphics/MediaPlayer.h
+++ b/Source/WebCore/platform/graphics/MediaPlayer.h
@@ -474,7 +474,7 @@
     bool copyVideoTextureToPlatformTexture(GraphicsContextGL*, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY);
 #endif
 
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     void willBeAskedToPaintGL();
 #endif
 
diff --git a/Source/WebCore/platform/graphics/MediaPlayerPrivate.h b/Source/WebCore/platform/graphics/MediaPlayerPrivate.h
index f85e613..d4b2c5c 100644
--- a/Source/WebCore/platform/graphics/MediaPlayerPrivate.h
+++ b/Source/WebCore/platform/graphics/MediaPlayerPrivate.h
@@ -182,7 +182,7 @@
 #if !USE(AVFOUNDATION)
     virtual bool copyVideoTextureToPlatformTexture(GraphicsContextGL*, PlatformGLObject, GCGLenum, GCGLint, GCGLenum, GCGLenum, GCGLenum, bool, bool) { return false; }
 #endif
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     virtual void willBeAskedToPaintGL() { }
 #endif
 
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
index e7deb4d..dd3cb58 100644
--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
@@ -222,7 +222,7 @@
     bool updateLastImage();
     void paint(GraphicsContext&, const FloatRect&) override;
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     void willBeAskedToPaintGL() final;
 #endif
     RefPtr<VideoFrame> videoFrameForCurrentTime() final;
@@ -268,7 +268,14 @@
 
     bool shouldBePlaying() const;
 
-    bool isVideoOutputAvailable() const;
+    enum class VideoOutputReadbackMethod : uint8_t {
+        None,
+        CopyPixelBufferFromDisplayLayer,
+        UseVideoOutput,
+    };
+    VideoOutputReadbackMethod readbackMethod() const;
+
+    void updateVideoOutput();
 
     bool setCurrentTimeDidChangeCallback(MediaPlayer::CurrentTimeDidChangeCallback&&) final;
 
@@ -346,7 +353,7 @@
     bool m_seeking;
     SeekState m_seekCompleted { SeekCompleted };
     mutable bool m_loadingProgressed;
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     bool m_hasBeenAskedToPaintGL { false };
 #endif
     bool m_hasAvailableVideoFrame { false };
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
index cc87dc3..5183566 100644
--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
@@ -666,6 +666,16 @@
     }
 #endif
 
+#if HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
+    if ([m_sampleBufferDisplayLayer respondsToSelector:@selector(copyDisplayedPixelBuffer)]) {
+        if (auto pixelBuffer = adoptCF([m_sampleBufferDisplayLayer copyDisplayedPixelBuffer])) {
+            ALWAYS_LOG(LOGIDENTIFIER, "displayed pixelbuffer copied for time ", currentMediaTime());
+            m_lastPixelBuffer = WTFMove(pixelBuffer);
+            return true;
+        }
+    }
+#endif
+
     if (m_sampleBufferDisplayLayer || !m_decompressionSession)
         return false;
 
@@ -725,7 +735,7 @@
     context.drawNativeImage(*image, imageRect.size(), outputRect, imageRect);
 }
 
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
 void MediaPlayerPrivateMediaSourceAVFObjC::willBeAskedToPaintGL()
 {
     // We have been asked to paint into a WebGL canvas, so take that as a signal to create
@@ -766,11 +776,11 @@
 
 bool MediaPlayerPrivateMediaSourceAVFObjC::shouldEnsureLayer() const
 {
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     if (!m_hasBeenAskedToPaintGL)
         return true;
 #endif
-    return isVideoOutputAvailable();
+    return readbackMethod() != VideoOutputReadbackMethod::None;
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged()
@@ -782,6 +792,25 @@
         destroyLayer();
         ensureDecompressionSession();
     }
+    updateVideoOutput();
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::updateVideoOutput()
+{
+#if HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)
+    if (readbackMethod() == VideoOutputReadbackMethod::UseVideoOutput) {
+        if (!m_videoOutput)
+            m_videoOutput = adoptNS([PAL::allocAVSampleBufferVideoOutputInstance() init]);
+        ASSERT(m_videoOutput);
+    } else
+        m_videoOutput = nil;
+
+    if (m_videoOutput == [m_sampleBufferDisplayLayer output])
+        return;
+
+    ALWAYS_LOG(LOGIDENTIFIER, m_videoOutput ? "Setting up" : "Tearing down", " sample buffer video output.");
+    [m_sampleBufferDisplayLayer setOutput:m_videoOutput.get()];
+#endif // HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::notifyActiveSourceBuffersChanged()
@@ -868,15 +897,6 @@
         return;
     }
 
-#if HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)
-    ASSERT(!m_videoOutput);
-    if (isVideoOutputAvailable()) {
-        m_videoOutput = adoptNS([PAL::allocAVSampleBufferVideoOutputInstance() init]);
-        ASSERT(m_videoOutput);
-        [m_sampleBufferDisplayLayer setOutput:m_videoOutput.get()];
-    }
-#endif
-
     if ([m_sampleBufferDisplayLayer respondsToSelector:@selector(setPreventsDisplaySleepDuringVideoPlayback:)])
         m_sampleBufferDisplayLayer.get().preventsDisplaySleepDuringVideoPlayback = NO;
 
@@ -946,13 +966,35 @@
     return m_playing && !seeking() && allRenderersHaveAvailableSamples() && m_readyState >= MediaPlayer::ReadyState::HaveFutureData;
 }
 
-bool MediaPlayerPrivateMediaSourceAVFObjC::isVideoOutputAvailable() const
+MediaPlayerPrivateMediaSourceAVFObjC::VideoOutputReadbackMethod MediaPlayerPrivateMediaSourceAVFObjC::readbackMethod() const
 {
 #if HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)
-    return MediaSessionManagerCocoa::mediaSourceInlinePaintingEnabled() && PAL::getAVSampleBufferVideoOutputClass();
-#else
-    return false;
+    if (!MediaSessionManagerCocoa::mediaSourceInlinePaintingEnabled())
+        return VideoOutputReadbackMethod::None;
 #endif
+
+#if HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
+    static auto canCopyDisplayedPixelBuffer = [&] {
+        return [PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(copyDisplayedPixelBuffer)];
+    }();
+
+    if (canCopyDisplayedPixelBuffer) {
+        if (m_sampleBufferDisplayLayer) {
+            if (!CGRectIsEmpty([m_sampleBufferDisplayLayer bounds]))
+                return VideoOutputReadbackMethod::CopyPixelBufferFromDisplayLayer;
+        } else {
+            if (!m_player->playerContentBoxRect().isEmpty())
+                return VideoOutputReadbackMethod::CopyPixelBufferFromDisplayLayer;
+        }
+    }
+#endif // HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
+
+#if HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)
+    if (PAL::getAVSampleBufferVideoOutputClass())
+        return VideoOutputReadbackMethod::UseVideoOutput;
+#endif
+
+    return VideoOutputReadbackMethod::None;
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame(bool flag)
diff --git a/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h b/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h
index 9fc0fadc..c0b11c2 100644
--- a/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h
+++ b/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h
@@ -341,7 +341,7 @@
     void setVideoInlineSizeIfPossible(const WebCore::FloatSize&);
     void nativeImageForCurrentTime(CompletionHandler<void(std::optional<WTF::MachSendRight>&&, WebCore::DestinationColorSpace)>&&);
     void colorSpace(CompletionHandler<void(WebCore::DestinationColorSpace)>&&);
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     void willBeAskedToPaintGL();
 #endif
 #endif
diff --git a/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in b/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in
index 885338d..c1c8cfc 100644
--- a/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in
+++ b/Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.messages.in
@@ -125,7 +125,7 @@
 #if PLATFORM(COCOA)
     NativeImageForCurrentTime() -> (std::optional<MachSendRight> sendRight, WebCore::DestinationColorSpace colorSpace) Synchronous
     ColorSpace() -> (WebCore::DestinationColorSpace colorSpace) Synchronous
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     void WillBeAskedToPaintGL()
 #endif
 #endif
diff --git a/Source/WebKit/GPUProcess/media/cocoa/RemoteMediaPlayerProxyCocoa.mm b/Source/WebKit/GPUProcess/media/cocoa/RemoteMediaPlayerProxyCocoa.mm
index eec9499..7e2654c 100644
--- a/Source/WebKit/GPUProcess/media/cocoa/RemoteMediaPlayerProxyCocoa.mm
+++ b/Source/WebKit/GPUProcess/media/cocoa/RemoteMediaPlayerProxyCocoa.mm
@@ -134,7 +134,7 @@
     completionHandler(m_player->colorSpace());
 }
 
-#if !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
 void RemoteMediaPlayerProxy::willBeAskedToPaintGL()
 {
     if (m_player)
diff --git a/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp b/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp
index 8854fe5..2d13d93 100644
--- a/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp
+++ b/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.cpp
@@ -1024,7 +1024,7 @@
 }
 #endif
 
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
 void MediaPlayerPrivateRemote::willBeAskedToPaintGL()
 {
     if (m_hasBeenAskedToPaintGL)
diff --git a/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h b/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h
index 9a9f0b3..26070c2 100644
--- a/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h
+++ b/Source/WebKit/WebProcess/GPU/media/MediaPlayerPrivateRemote.h
@@ -295,7 +295,7 @@
 #if !USE(AVFOUNDATION)
     bool copyVideoTextureToPlatformTexture(WebCore::GraphicsContextGL*, PlatformGLObject, GCGLenum, GCGLint, GCGLenum, GCGLenum, GCGLenum, bool, bool) final;
 #endif
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     void willBeAskedToPaintGL() final;
 #endif
     RefPtr<WebCore::VideoFrame> videoFrameForCurrentTime() final;
@@ -476,7 +476,7 @@
 #endif
     std::optional<WebCore::VideoFrameMetadata> m_videoFrameMetadata;
     bool m_isGatheringVideoFrameMetadata { false };
-#if PLATFORM(COCOA) && !HAVE(LOW_AV_SAMPLE_BUFFER_PRUNING_INTERVAL)
+#if PLATFORM(COCOA) && !HAVE(AVSAMPLEBUFFERDISPLAYLAYER_COPYDISPLAYEDPIXELBUFFER)
     bool m_hasBeenAskedToPaintGL { false };
 #endif
 };