WebKit is a cross-platform web browser engine. On iOS and macOS, it powers Safari, Mail, iBooks, and many other applications.
Visit the WebKit Feature Status page to see which Web APIs have been implemented, are in development, or are under consideration.
On macOS, download Safari Technology Preview to test the latest version of WebKit. On Linux, download Epiphany Technology Preview. On Windows, you'll have to build it yourself.
Once your bug is filed, you will receive email updates at each stage of the bug life cycle. After the bug is considered fixed, you may be asked to download the latest nightly build and confirm that the fix works for you.
On Windows, follow the instructions on our website.
Run the following command to clone WebKit's Git repository:
git clone https://github.com/WebKit/WebKit.git WebKit
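Cloning the full history can take a while. If you only need the latest sources and don't plan to use the Subversion tracking described below, a shallow clone is a reasonable shortcut (this is a standard Git option, not a WebKit-specific tool):
git clone --depth 1 https://github.com/WebKit/WebKit.git WebKit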
If you want to be able to track Subversion revisions from your Git checkout, run the following command:
Tools/Scripts/git-webkit setup-git-svn
For information about this, and other aspects of using Git with WebKit, read the wiki page.
Install Xcode and its command line tools if you haven't done so already:
xcode-select --install
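If you're unsure whether the command line tools are already installed, you can print the active developer directory as a quick check; xcode-select -p is a standard Xcode flag, shown here only as a sanity check:
xcode-select -p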
Run the following command to build a debug build with debugging symbols and assertions:
Tools/Scripts/build-webkit --debug
For performance testing, and other purposes, use --release instead.
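For example, the equivalent release build is the same script with the flag swapped, per the note above:
Tools/Scripts/build-webkit --release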
You can open WebKit.xcworkspace to build and debug WebKit within Xcode.
If you don't use a custom build location in Xcode preferences, you have to update the workspace settings to use the WebKitBuild directory. In the menu bar, choose File > Workspace Settings, click the Advanced button, select “Custom” and “Relative to Workspace”, then enter WebKitBuild for both Products and Intermediates.
iOS, tvOS and watchOS are all considered embedded builds. The first time after you install a new Xcode, you will need to run:
sudo Tools/Scripts/configure-xcode-for-embedded-development
Without this step, you will see the error message “target specifies product type ‘com.apple.product-type.tool’, but there’s no such product type for the ‘iphonesimulator’ platform.” when building the JSCLLIntOffsetsExtractor target of the JavaScriptCore project.
Run the following command to build a debug build with debugging symbols and assertions for embedded simulators:
Tools/Scripts/build-webkit --debug --<platform>-simulator
or embedded devices:
Tools/Scripts/build-webkit --debug --<platform>-device
where platform is ios, tvos, or watchos.
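As a concrete example, substituting ios for the platform placeholder gives a debug build for the iOS simulator:
Tools/Scripts/build-webkit --debug --ios-simulator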
For production builds of WebKitGTK+:
cmake -DPORT=GTK -DCMAKE_BUILD_TYPE=RelWithDebInfo -GNinja
ninja
sudo ninja install
For development builds:
Tools/gtk/install-dependencies
Tools/Scripts/update-webkitgtk-libs
Tools/Scripts/build-webkit --gtk --debug
For more information on building WebKitGTK+, see the wiki page.
For production builds of WPE WebKit:
cmake -DPORT=WPE -DCMAKE_BUILD_TYPE=RelWithDebInfo -GNinja
ninja
sudo ninja install
For development builds:
Tools/wpe/install-dependencies
Tools/Scripts/update-webkitwpe-libs
Tools/Scripts/build-webkit --wpe --debug
For building WebKit on Windows, see the wiki page.
Run the following command to launch Safari with your local build of WebKit:
Tools/Scripts/run-safari --debug
The run-safari script sets the DYLD_FRAMEWORK_PATH environment variable to point to your build products, and then launches /Applications/Safari.app. DYLD_FRAMEWORK_PATH tells the system loader to prefer your build products over the frameworks installed in /System/Library/Frameworks.
To run other applications with your local build of WebKit, run the following command:
Tools/Scripts/run-webkit-app <application-path>
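For example, to launch Mail (one of the applications mentioned above that uses WebKit) with your local build; the application path here is purely illustrative:
Tools/Scripts/run-webkit-app /Applications/Mail.app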
Run the following command to launch the iOS simulator with your local build of WebKit:
run-safari --debug --ios-simulator
In both cases, if you have built release builds instead, use --release instead of --debug.
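For instance, launching Safari against a release build is the same command with the flag swapped:
Tools/Scripts/run-safari --release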
If you have a development build, you can use the run-minibrowser script, e.g.:
run-minibrowser --debug --wpe
Pass one of --gtk, --jsc-only, or --wpe to indicate the port to use.
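For example, the GTK equivalent of the command above is:
run-minibrowser --debug --gtk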
Congratulations! You’re up and running. Now you can begin coding in WebKit and contribute your fixes and new features to the project. For details on submitting your code to the project, read Contributing Code.