[MSE][GStreamer] Revert WebKitMediaSrc rework temporarily
https://bugs.webkit.org/show_bug.cgi?id=203078

Reviewed by Carlos Garcia Campos.

.:

* Source/cmake/GStreamerChecks.cmake:

Source/WebCore:

While the WebKitMediaSrc rework fixed a number of tests and introduced
design improvements in MSE, it also exposed a number of bugs related
to the playbin3 switch.

Fixing these has turned out to be tricky, so in order not to ship known
user-facing bugs, I'm reverting it for now until a workable solution
is available.
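
For context, this is not a behavior change beyond undoing the rework: the
practical effect on MSE is that the pipeline factory goes back to legacy
playbin. The snippet below is an illustrative excerpt of the element-selection
check restored by this revert (it mirrors the MediaPlayerPrivateGStreamer.cpp
hunk further down), not additional new code:

    // MSE doesn't support playbin3. Mediastream requires playbin3. Regular
    // playback can use playbin3 on-demand with the WEBKIT_GST_USE_PLAYBIN3
    // environment variable.
    const gchar* playbinName = "playbin";
    if ((!isMediaSource() && g_getenv("WEBKIT_GST_USE_PLAYBIN3")) || url.protocolIs("mediastream"))
        playbinName = "playbin3";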

* platform/GStreamer.cmake:
* platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
(WTF::refGPtr<GstMiniObject>): Deleted.
(WTF::derefGPtr<GstMiniObject>): Deleted.
* platform/graphics/gstreamer/GRefPtrGStreamer.h:
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
(WebCore::MediaPlayerPrivateGStreamer::changePipelineState):
(WebCore::MediaPlayerPrivateGStreamer::paused const):
(WebCore::MediaPlayerPrivateGStreamer::updateTracks):
(WebCore::MediaPlayerPrivateGStreamer::enableTrack):
(WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
(WebCore::MediaPlayerPrivateGStreamer::videoSinkCapsChangedCallback):
(WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideoCaps):
(WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
(WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
(WebCore::MediaPlayerPrivateGStreamer::configurePlaySink):
(WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition): Deleted.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
(WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
(WebCore::MediaPlayerPrivateGStreamerBase::sizeChanged):
(WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):
(WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const): Deleted.
(WebCore::MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes const): Deleted.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
* platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
* platform/graphics/gstreamer/mse/AppendPipeline.cpp:
(WebCore::AppendPipeline::appsinkNewSample):
(WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
* platform/graphics/gstreamer/mse/AppendPipeline.h:
(WebCore::AppendPipeline::appsinkCaps):
(WebCore::AppendPipeline::track):
(WebCore::AppendPipeline::streamType):
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
(WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
(WebCore::MediaPlayerPrivateGStreamerMSE::load):
(WebCore::MediaPlayerPrivateGStreamerMSE::pause):
(WebCore::MediaPlayerPrivateGStreamerMSE::seek):
(WebCore::MediaPlayerPrivateGStreamerMSE::configurePlaySink):
(WebCore::MediaPlayerPrivateGStreamerMSE::changePipelineState):
(WebCore::MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime):
(WebCore::MediaPlayerPrivateGStreamerMSE::doSeek):
(WebCore::MediaPlayerPrivateGStreamerMSE::maybeFinishSeek):
(WebCore::MediaPlayerPrivateGStreamerMSE::updatePlaybackRate):
(WebCore::MediaPlayerPrivateGStreamerMSE::seeking const):
(WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
(WebCore::MediaPlayerPrivateGStreamerMSE::waitForSeekCompleted):
(WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
(WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
(WebCore::MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone):
(WebCore::MediaPlayerPrivateGStreamerMSE::mediaSourceClient):
(WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
(WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
(WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):
(WebCore::MediaPlayerPrivateGStreamerMSE::markEndOfStream):
(WebCore::MediaPlayerPrivateGStreamerMSE::currentMediaTime const):
(WebCore::MediaPlayerPrivateGStreamerMSE::play): Deleted.
(WebCore::MediaPlayerPrivateGStreamerMSE::reportSeekCompleted): Deleted.
(WebCore::MediaPlayerPrivateGStreamerMSE::didEnd): Deleted.
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
* platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:
(WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
(WebCore::MediaSourceClientGStreamerMSE::markEndOfStream):
(WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
(WebCore::MediaSourceClientGStreamerMSE::flush):
(WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
(WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):
(WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples): Deleted.
(WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples): Deleted.
* platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
* platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:
(WebCore::MediaSourceGStreamer::markEndOfStream):
(WebCore::MediaSourceGStreamer::unmarkEndOfStream):
(WebCore::MediaSourceGStreamer::waitForSeekCompleted):
(WebCore::MediaSourceGStreamer::seekCompleted):
* platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
* platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Added.
(getStreamByTrackId):
(getStreamBySourceBufferPrivate):
(pushSample):
(WebCore::PlaybackPipeline::setWebKitMediaSrc):
(WebCore::PlaybackPipeline::webKitMediaSrc):
(WebCore::PlaybackPipeline::addSourceBuffer):
(WebCore::PlaybackPipeline::removeSourceBuffer):
(WebCore::PlaybackPipeline::attachTrack):
(WebCore::PlaybackPipeline::reattachTrack):
(WebCore::PlaybackPipeline::notifyDurationChanged):
(WebCore::PlaybackPipeline::markEndOfStream):
(WebCore::PlaybackPipeline::flush):
(WebCore::PlaybackPipeline::enqueueSample):
(WebCore::PlaybackPipeline::allSamplesInTrackEnqueued):
(WebCore::PlaybackPipeline::pipeline):
* platform/graphics/gstreamer/mse/PlaybackPipeline.h: Copied from Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h.
* platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:
(WebCore::SourceBufferPrivateGStreamer::enqueueSample):
(WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
(WebCore::SourceBufferPrivateGStreamer::setReadyForMoreSamples):
(WebCore::SourceBufferPrivateGStreamer::notifyReadyForMoreSamples):
(WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):
* platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
(disabledAppsrcNeedData):
(disabledAppsrcEnoughData):
(disabledAppsrcSeekData):
(enabledAppsrcEnoughData):
(enabledAppsrcSeekData):
(getStreamByAppsrc):
(webkitMediaSrcChain):
(webkit_media_src_init):
(webKitMediaSrcFinalize):
(webKitMediaSrcSetProperty):
(webKitMediaSrcGetProperty):
(webKitMediaSrcDoAsyncStart):
(webKitMediaSrcDoAsyncDone):
(webKitMediaSrcChangeState):
(webKitMediaSrcGetSize):
(webKitMediaSrcQueryWithParent):
(webKitMediaSrcUpdatePresentationSize):
(webKitMediaSrcLinkStreamToSrcPad):
(webKitMediaSrcLinkSourcePad):
(webKitMediaSrcFreeStream):
(webKitMediaSrcCheckAllTracksConfigured):
(webKitMediaSrcUriGetType):
(webKitMediaSrcGetProtocols):
(webKitMediaSrcGetUri):
(webKitMediaSrcSetUri):
(webKitMediaSrcUriHandlerInit):
(seekNeedsDataMainThread):
(notifyReadyForMoreSamplesMainThread):
(webKitMediaSrcSetMediaPlayerPrivate):
(webKitMediaSrcSetReadyForSamples):
(webKitMediaSrcPrepareSeek):
(WebKitMediaSrcPrivate::streamByName): Deleted.
(): Deleted.
(WTF::refGPtr<WebKitMediaSrcPad>): Deleted.
(WTF::derefGPtr<WebKitMediaSrcPad>): Deleted.
(webkit_media_src_pad_class_init): Deleted.
(Stream::Stream): Deleted.
(Stream::StreamingMembers::StreamingMembers): Deleted.
(Stream::StreamingMembers::durationEnqueued const): Deleted.
(findPipeline): Deleted.
(webkit_media_src_class_init): Deleted.
(debugProbe): Deleted.
(copyCollectionAndAddStream): Deleted.
(copyCollectionWithoutStream): Deleted.
(gstStreamType): Deleted.
(webKitMediaSrcAddStream): Deleted.
(webKitMediaSrcRemoveStream): Deleted.
(webKitMediaSrcActivateMode): Deleted.
(webKitMediaSrcPadLinked): Deleted.
(webKitMediaSrcStreamNotifyLowWaterLevel): Deleted.
(webKitMediaSrcLoop): Deleted.
(webKitMediaSrcEnqueueObject): Deleted.
(webKitMediaSrcEnqueueSample): Deleted.
(webKitMediaSrcEnqueueEvent): Deleted.
(webKitMediaSrcEndOfStream): Deleted.
(webKitMediaSrcIsReadyForMoreSamples): Deleted.
(webKitMediaSrcNotifyWhenReadyForMoreSamples): Deleted.
(webKitMediaSrcStreamFlushStart): Deleted.
(webKitMediaSrcStreamFlushStop): Deleted.
(webKitMediaSrcFlush): Deleted.
(webKitMediaSrcSeek): Deleted.
(countStreamsOfType): Deleted.
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Added.

Tools:

* Scripts/webkitpy/style/checker.py:

LayoutTests:

* platform/gtk/TestExpectations:


git-svn-id: http://svn.webkit.org/repository/webkit/trunk@251365 268f45cc-cd09-0410-ab3c-d52691b4dbfc
diff --git a/ChangeLog b/ChangeLog
index 1999a9b..2f2788e 100644
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,3 +1,12 @@
+2019-10-21  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Revert WebKitMediaSrc rework temporarily
+        https://bugs.webkit.org/show_bug.cgi?id=203078
+
+        Reviewed by Carlos Garcia Campos.
+
+        * Source/cmake/GStreamerChecks.cmake:
+
 2019-10-11  Konstantin Tokarev  <annulen@yandex.ru>
 
         [cmake] Use HINTS instead of PATHS when searching in paths from pkg-config
diff --git a/LayoutTests/ChangeLog b/LayoutTests/ChangeLog
index 88d5890..5a6369e 100644
--- a/LayoutTests/ChangeLog
+++ b/LayoutTests/ChangeLog
@@ -1,3 +1,12 @@
+2019-10-21  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Revert WebKitMediaSrc rework temporarily
+        https://bugs.webkit.org/show_bug.cgi?id=203078
+
+        Reviewed by Carlos Garcia Campos.
+
+        * platform/gtk/TestExpectations:
+
 2019-10-19  Ryosuke Niwa  <rniwa@webkit.org>
 
         Flaky Test: fast/events/resize-subframe-in-rendering-update.html
diff --git a/LayoutTests/platform/gtk/TestExpectations b/LayoutTests/platform/gtk/TestExpectations
index b6be727..41ce4d1 100644
--- a/LayoutTests/platform/gtk/TestExpectations
+++ b/LayoutTests/platform/gtk/TestExpectations
@@ -234,7 +234,6 @@
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-endofstream.html [ Failure Crash ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-errors.html [ Failure ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-preload.html [ Failure ]
-webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-remove.html [ Failure ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-seekable.html [ Failure ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-sequencemode-append-buffer.html [ Failure ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-sourcebuffer-mode-timestamps.html [ Failure ]
@@ -1292,6 +1291,16 @@
 
 webkit.org/b/201275 fast/events/focus-anchor-with-tabindex-hang.html [ Crash ]
 
+# Known issues that were fixed by the WebKitMediaSrc rework that is now reverted.
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-seek-beyond-duration.html [ Failure Crash ]
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-remove.html [ Pass Crash ]
+webkit.org/b/203078 media/media-source/media-source-remove-unload-crash.html [ Pass Crash ]
+webkit.org/b/203078 media/media-source/media-source-seek-detach-crash.html [ Pass Crash ]
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-getvideoplaybackquality.html [ Failure Crash ]
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-replay.html [ Failure Crash ]
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-seek-during-pending-seek.html [ Pass Crash ]
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-redundant-seek.html [ Pass Crash ]
+
 #////////////////////////////////////////////////////////////////////////////////////////
 # End of Crashing tests
 #////////////////////////////////////////////////////////////////////////////////////////
@@ -3297,7 +3306,6 @@
 
 webkit.org/b/176020 imported/w3c/web-platform-tests/media-source/mediasource-removesourcebuffer.html [ Crash Pass ]
 
-webkit.org/b/171726 imported/w3c/web-platform-tests/media-source/mediasource-seek-beyond-duration.html [ Failure ]
 webkit.org/b/171726 media/media-source/media-source-init-segment-duration.html [ Failure ]
 
 webkit.org/b/172270 fast/text/font-interstitial-invisible-width-while-loading.html [ Failure ]
@@ -3867,6 +3875,9 @@
 
 webkit.org/b/202750 http/tests/download/anchor-download-attribute-content-disposition-no-extension-text-plain.html [ Failure ]
 
+# Known issues that were fixed by the WebKitMediaSrc rework that is now reverted.
+webkit.org/b/203078 imported/w3c/web-platform-tests/media-source/mediasource-config-change-mp4-v-framesize.html [ Pass Failure ]
+
 #////////////////////////////////////////////////////////////////////////////////////////
 # End of non-crashing, non-flaky tests failing
 #////////////////////////////////////////////////////////////////////////////////////////
diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index b1433e3..5ff2b6e 100644
--- a/Source/WebCore/ChangeLog
+++ b/Source/WebCore/ChangeLog
@@ -1,3 +1,188 @@
+2019-10-21  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Revert WebKitMediaSrc rework temporarily
+        https://bugs.webkit.org/show_bug.cgi?id=203078
+
+        Reviewed by Carlos Garcia Campos.
+
+        While the WebKitMediaSrc rework fixed a number of tests and introduced
+        design improvements in MSE, it also exposed a number of bugs related
+        to the playbin3 switch.
+
+        Fixing these has turned out to be tricky, so in order not to ship
+        known user-facing bugs, I'm reverting it for now until a workable
+        solution is available.
+
+        * platform/GStreamer.cmake:
+        * platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
+        (WTF::refGPtr<GstMiniObject>): Deleted.
+        (WTF::derefGPtr<GstMiniObject>): Deleted.
+        * platform/graphics/gstreamer/GRefPtrGStreamer.h:
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
+        (WebCore::MediaPlayerPrivateGStreamer::changePipelineState):
+        (WebCore::MediaPlayerPrivateGStreamer::paused const):
+        (WebCore::MediaPlayerPrivateGStreamer::updateTracks):
+        (WebCore::MediaPlayerPrivateGStreamer::enableTrack):
+        (WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
+        (WebCore::MediaPlayerPrivateGStreamer::videoSinkCapsChangedCallback):
+        (WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideoCaps):
+        (WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
+        (WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
+        (WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
+        (WebCore::MediaPlayerPrivateGStreamer::configurePlaySink):
+        (WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition): Deleted.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
+        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
+        (WebCore::MediaPlayerPrivateGStreamerBase::sizeChanged):
+        (WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):
+        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const): Deleted.
+        (WebCore::MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes const): Deleted.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
+        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        * platform/graphics/gstreamer/mse/AppendPipeline.cpp:
+        (WebCore::AppendPipeline::appsinkNewSample):
+        (WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
+        * platform/graphics/gstreamer/mse/AppendPipeline.h:
+        (WebCore::AppendPipeline::appsinkCaps):
+        (WebCore::AppendPipeline::track):
+        (WebCore::AppendPipeline::streamType):
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
+        (WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::load):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::pause):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::configurePlaySink):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::changePipelineState):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::doSeek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::maybeFinishSeek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::updatePlaybackRate):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seeking const):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::waitForSeekCompleted):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::mediaSourceClient):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::markEndOfStream):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::currentMediaTime const):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::play): Deleted.
+        (WebCore::MediaPlayerPrivateGStreamerMSE::reportSeekCompleted): Deleted.
+        (WebCore::MediaPlayerPrivateGStreamerMSE::didEnd): Deleted.
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
+        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:
+        (WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
+        (WebCore::MediaSourceClientGStreamerMSE::markEndOfStream):
+        (WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
+        (WebCore::MediaSourceClientGStreamerMSE::flush):
+        (WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
+        (WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):
+        (WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples): Deleted.
+        (WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples): Deleted.
+        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
+        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:
+        (WebCore::MediaSourceGStreamer::markEndOfStream):
+        (WebCore::MediaSourceGStreamer::unmarkEndOfStream):
+        (WebCore::MediaSourceGStreamer::waitForSeekCompleted):
+        (WebCore::MediaSourceGStreamer::seekCompleted):
+        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Added.
+        (getStreamByTrackId):
+        (getStreamBySourceBufferPrivate):
+        (pushSample):
+        (WebCore::PlaybackPipeline::setWebKitMediaSrc):
+        (WebCore::PlaybackPipeline::webKitMediaSrc):
+        (WebCore::PlaybackPipeline::addSourceBuffer):
+        (WebCore::PlaybackPipeline::removeSourceBuffer):
+        (WebCore::PlaybackPipeline::attachTrack):
+        (WebCore::PlaybackPipeline::reattachTrack):
+        (WebCore::PlaybackPipeline::notifyDurationChanged):
+        (WebCore::PlaybackPipeline::markEndOfStream):
+        (WebCore::PlaybackPipeline::flush):
+        (WebCore::PlaybackPipeline::enqueueSample):
+        (WebCore::PlaybackPipeline::allSamplesInTrackEnqueued):
+        (WebCore::PlaybackPipeline::pipeline):
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.h: Copied from Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h.
+        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:
+        (WebCore::SourceBufferPrivateGStreamer::enqueueSample):
+        (WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
+        (WebCore::SourceBufferPrivateGStreamer::setReadyForMoreSamples):
+        (WebCore::SourceBufferPrivateGStreamer::notifyReadyForMoreSamples):
+        (WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):
+        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
+        (disabledAppsrcNeedData):
+        (disabledAppsrcEnoughData):
+        (disabledAppsrcSeekData):
+        (enabledAppsrcEnoughData):
+        (enabledAppsrcSeekData):
+        (getStreamByAppsrc):
+        (webkitMediaSrcChain):
+        (webkit_media_src_init):
+        (webKitMediaSrcFinalize):
+        (webKitMediaSrcSetProperty):
+        (webKitMediaSrcGetProperty):
+        (webKitMediaSrcDoAsyncStart):
+        (webKitMediaSrcDoAsyncDone):
+        (webKitMediaSrcChangeState):
+        (webKitMediaSrcGetSize):
+        (webKitMediaSrcQueryWithParent):
+        (webKitMediaSrcUpdatePresentationSize):
+        (webKitMediaSrcLinkStreamToSrcPad):
+        (webKitMediaSrcLinkSourcePad):
+        (webKitMediaSrcFreeStream):
+        (webKitMediaSrcCheckAllTracksConfigured):
+        (webKitMediaSrcUriGetType):
+        (webKitMediaSrcGetProtocols):
+        (webKitMediaSrcGetUri):
+        (webKitMediaSrcSetUri):
+        (webKitMediaSrcUriHandlerInit):
+        (seekNeedsDataMainThread):
+        (notifyReadyForMoreSamplesMainThread):
+        (webKitMediaSrcSetMediaPlayerPrivate):
+        (webKitMediaSrcSetReadyForSamples):
+        (webKitMediaSrcPrepareSeek):
+        (WebKitMediaSrcPrivate::streamByName): Deleted.
+        (): Deleted.
+        (WTF::refGPtr<WebKitMediaSrcPad>): Deleted.
+        (WTF::derefGPtr<WebKitMediaSrcPad>): Deleted.
+        (webkit_media_src_pad_class_init): Deleted.
+        (Stream::Stream): Deleted.
+        (Stream::StreamingMembers::StreamingMembers): Deleted.
+        (Stream::StreamingMembers::durationEnqueued const): Deleted.
+        (findPipeline): Deleted.
+        (webkit_media_src_class_init): Deleted.
+        (debugProbe): Deleted.
+        (copyCollectionAndAddStream): Deleted.
+        (copyCollectionWithoutStream): Deleted.
+        (gstStreamType): Deleted.
+        (webKitMediaSrcAddStream): Deleted.
+        (webKitMediaSrcRemoveStream): Deleted.
+        (webKitMediaSrcActivateMode): Deleted.
+        (webKitMediaSrcPadLinked): Deleted.
+        (webKitMediaSrcStreamNotifyLowWaterLevel): Deleted.
+        (webKitMediaSrcLoop): Deleted.
+        (webKitMediaSrcEnqueueObject): Deleted.
+        (webKitMediaSrcEnqueueSample): Deleted.
+        (webKitMediaSrcEnqueueEvent): Deleted.
+        (webKitMediaSrcEndOfStream): Deleted.
+        (webKitMediaSrcIsReadyForMoreSamples): Deleted.
+        (webKitMediaSrcNotifyWhenReadyForMoreSamples): Deleted.
+        (webKitMediaSrcStreamFlushStart): Deleted.
+        (webKitMediaSrcStreamFlushStop): Deleted.
+        (webKitMediaSrcFlush): Deleted.
+        (webKitMediaSrcSeek): Deleted.
+        (countStreamsOfType): Deleted.
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Added.
+
 2019-10-21  Tim Horton  <timothy_horton@apple.com>
 
         Clean up some includes to improve WebKit2 build speed
diff --git a/Source/WebCore/platform/GStreamer.cmake b/Source/WebCore/platform/GStreamer.cmake
index f93ce6b..f240ec7 100644
--- a/Source/WebCore/platform/GStreamer.cmake
+++ b/Source/WebCore/platform/GStreamer.cmake
@@ -32,6 +32,7 @@
         platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
         platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
         platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
+        platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
         platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
         platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp
index 669a0eb..b15f7f3 100644
--- a/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp
@@ -25,25 +25,6 @@
 
 namespace WTF {
 
-template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr)
-{
-    return GRefPtr<GstMiniObject>(ptr, GRefPtrAdopt);
-}
-
-template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr)
-{
-    if (ptr)
-        gst_mini_object_ref(ptr);
-
-    return ptr;
-}
-
-template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr)
-{
-    if (ptr)
-        gst_mini_object_unref(ptr);
-}
-
 template <> GRefPtr<GstElement> adoptGRef(GstElement* ptr)
 {
     ASSERT(!ptr || !g_object_is_floating(ptr));
diff --git a/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h
index 3d04b1b..a1965d4 100644
--- a/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h
+++ b/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h
@@ -34,10 +34,6 @@
 
 namespace WTF {
 
-template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr);
-template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr);
-template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr);
-
 template<> GRefPtr<GstElement> adoptGRef(GstElement* ptr);
 template<> GstElement* refGPtr<GstElement>(GstElement* ptr);
 template<> void derefGPtr<GstElement>(GstElement* ptr);
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
index 1a47c15..7f29495 100644
--- a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
@@ -356,16 +356,8 @@
 MediaTime MediaPlayerPrivateGStreamer::playbackPosition() const
 {
     GST_TRACE_OBJECT(pipeline(), "isEndReached: %s, seeking: %s, seekTime: %s", boolForPrinting(m_isEndReached), boolForPrinting(m_seeking), m_seekTime.toString().utf8().data());
-    if (m_isEndReached) {
-        // Position queries on a pipeline that is not running return 0. This is the case when the prerolling
-        // from a seek is still not done and after EOS. In these cases we want to report the seek time or the
-        // duration respectively.
-        if (m_seeking)
-            return m_seekTime;
-
-        MediaTime duration = durationMediaTime();
-        return duration.isInvalid() ? MediaTime::zeroTime() : duration;
-    }
+    if (m_isEndReached && m_seeking)
+        return m_seekTime;
 
     // This constant should remain lower than HTMLMediaElement's maxTimeupdateEventFrequency.
     static const Seconds positionCacheThreshold = 200_ms;
@@ -421,7 +413,7 @@
         gst_element_state_get_name(currentState), gst_element_state_get_name(pending));
 
 #if USE(GSTREAMER_GL)
-    if (currentState <= GST_STATE_READY && newState >= GST_STATE_PAUSED)
+    if (currentState == GST_STATE_READY && newState == GST_STATE_PAUSED)
         ensureGLVideoSinkContext();
 #endif
 
@@ -667,10 +659,10 @@
         return false;
     }
 
-    GstState state, pending;
-    gst_element_get_state(m_pipeline.get(), &state, &pending, 0);
+    GstState state;
+    gst_element_get_state(m_pipeline.get(), &state, nullptr, 0);
     bool paused = state <= GST_STATE_PAUSED;
-    GST_LOG_OBJECT(pipeline(), "Paused: %s (pending state: %s)", toString(paused).utf8().data(), gst_element_state_get_name(pending));
+    GST_LOG_OBJECT(pipeline(), "Paused: %s", toString(paused).utf8().data());
     return paused;
 }
 
@@ -716,11 +708,13 @@
 #if ENABLE(VIDEO_TRACK)
 #define CREATE_TRACK(type, Type) G_STMT_START {                         \
         m_has##Type = true;                                             \
-        RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
-        m_##type##Tracks.add(track->id(), track);                       \
-        m_player->add##Type##Track(*track);                             \
-        if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
-            m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
+        if (!useMediaSource) {                                          \
+            RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
+            m_##type##Tracks.add(track->id(), track);                   \
+            m_player->add##Type##Track(*track);                         \
+            if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
+                m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
+        }                                                               \
     } G_STMT_END
 #else
 #define CREATE_TRACK(type, Type) G_STMT_START { \
@@ -734,8 +728,6 @@
 
     bool useMediaSource = isMediaSource();
     unsigned length = gst_stream_collection_get_size(m_streamCollection.get());
-    GST_DEBUG_OBJECT(pipeline(), "Inspecting stream collection: %s %" GST_PTR_FORMAT,
-        gst_stream_collection_get_upstream_id(m_streamCollection.get()), m_streamCollection.get());
 
     bool oldHasAudio = m_hasAudio;
     bool oldHasVideo = m_hasVideo;
@@ -775,13 +767,21 @@
 
 void MediaPlayerPrivateGStreamer::enableTrack(TrackPrivateBaseGStreamer::TrackType trackType, unsigned index)
 {
+    // FIXME: Remove isMediaSource() test below when fixing https://bugs.webkit.org/show_bug.cgi?id=182531.
+    if (isMediaSource()) {
+        GST_FIXME_OBJECT(m_pipeline.get(), "Audio/Video/Text track switching is not yet supported by the MSE backend.");
+        return;
+    }
+
     const char* propertyName;
     const char* trackTypeAsString;
     Vector<String> selectedStreams;
     String selectedStreamId;
 
+    GstStream* stream = nullptr;
+
     if (!m_isLegacyPlaybin) {
-        GstStream* stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
+        stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
         if (!stream) {
             GST_WARNING_OBJECT(pipeline(), "No stream to select at index %u", index);
             return;
@@ -862,8 +862,7 @@
     if (UNLIKELY(!m_pipeline || !m_source))
         return;
 
-    ASSERT(m_isLegacyPlaybin);
-    ASSERT(!isMediaSource());
+    ASSERT(m_isLegacyPlaybin || isMediaSource());
 
     gint numTracks = 0;
     bool useMediaSource = isMediaSource();
@@ -916,6 +915,19 @@
     m_player->client().mediaPlayerEngineUpdated(m_player);
 }
 
+void MediaPlayerPrivateGStreamer::videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer* player)
+{
+    player->m_notifier->notify(MainThreadNotification::VideoCapsChanged, [player] {
+        player->notifyPlayerOfVideoCaps();
+    });
+}
+
+void MediaPlayerPrivateGStreamer::notifyPlayerOfVideoCaps()
+{
+    m_videoSize = IntSize();
+    m_player->client().mediaPlayerEngineUpdated(m_player);
+}
+
 void MediaPlayerPrivateGStreamer::audioChangedCallback(MediaPlayerPrivateGStreamer* player)
 {
     player->m_notifier->notify(MainThreadNotification::AudioChanged, [player] {
@@ -1834,7 +1846,6 @@
 
 void MediaPlayerPrivateGStreamer::sourceSetup(GstElement* sourceElement)
 {
-    ASSERT(!isMediaSource());
     GST_DEBUG_OBJECT(pipeline(), "Source element set-up for %s", GST_ELEMENT_NAME(sourceElement));
 
     if (WEBKIT_IS_WEB_SRC(m_source.get()) && GST_OBJECT_PARENT(m_source.get()))
@@ -2085,21 +2096,15 @@
 bool MediaPlayerPrivateGStreamer::handleSyncMessage(GstMessage* message)
 {
     if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_STREAM_COLLECTION && !m_isLegacyPlaybin) {
-        // GStreamer workaround:
-        // Unfortunately, when we have a stream-collection aware source (like WebKitMediaSrc) parsebin and decodebin3 emit
-        // their own stream-collection messages, but late, and sometimes with duplicated streams. Let's only listen for
-        // stream-collection messages from the source in the MSE case to avoid these issues.
-        if (isMediaSource() && message->src != GST_OBJECT(m_source.get()))
-            return true;
-
         GRefPtr<GstStreamCollection> collection;
         gst_message_parse_stream_collection(message, &collection.outPtr());
-        ASSERT(collection);
-        m_streamCollection.swap(collection);
 
-        m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
-            this->updateTracks();
-        });
+        if (collection) {
+            m_streamCollection.swap(collection);
+            m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
+                this->updateTracks();
+            });
+        }
     }
 
     return MediaPlayerPrivateGStreamerBase::handleSyncMessage(message);
@@ -2395,9 +2400,10 @@
 {
     const gchar* playbinName = "playbin";
 
-    // MSE and Mediastream require playbin3. Regular playback can use playbin3 on-demand with the
-    // WEBKIT_GST_USE_PLAYBIN3 environment variable.
-    if ((isMediaSource() || url.protocolIs("mediastream") || g_getenv("WEBKIT_GST_USE_PLAYBIN3")))
+    // MSE doesn't support playbin3. Mediastream requires playbin3. Regular
+    // playback can use playbin3 on-demand with the WEBKIT_GST_USE_PLAYBIN3
+    // environment variable.
+    if ((!isMediaSource() && g_getenv("WEBKIT_GST_USE_PLAYBIN3")) || url.protocolIs("mediastream"))
         playbinName = "playbin3";
 
     if (m_pipeline) {
@@ -2480,6 +2486,8 @@
 
     g_object_set(m_pipeline.get(), "video-sink", createVideoSink(), "audio-sink", createAudioSink(), nullptr);
 
+    configurePlaySink();
+
     if (m_preservesPitch) {
         GstElement* scale = gst_element_factory_make("scaletempo", nullptr);
 
@@ -2499,6 +2507,10 @@
         } else
             GST_WARNING("The videoflip element is missing, video rotation support is now disabled. Please check your gst-plugins-good installation.");
     }
+
+    GRefPtr<GstPad> videoSinkPad = adoptGRef(gst_element_get_static_pad(m_videoSink.get(), "sink"));
+    if (videoSinkPad)
+        g_signal_connect_swapped(videoSinkPad.get(), "notify::caps", G_CALLBACK(videoSinkCapsChangedCallback), this);
 }
 
 void MediaPlayerPrivateGStreamer::simulateAudioInterruption()
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
index e041235..511325e 100644
--- a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
+++ b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
@@ -114,12 +114,14 @@
 
     void loadStateChanged();
     void timeChanged();
+    void didEnd();
     virtual void durationChanged();
     void loadingFailed(MediaPlayer::NetworkState, MediaPlayer::ReadyState = MediaPlayer::HaveNothing, bool forceNotifications = false);
 
     virtual void sourceSetup(GstElement*);
 
     GstElement* audioSink() const override;
+    virtual void configurePlaySink() { }
 
     void simulateAudioInterruption() override;
 
@@ -204,7 +206,7 @@
     GstState m_requestedState;
     bool m_resetPipeline;
     bool m_seeking;
-    bool m_seekIsPending; // Set when the user requests a seek but gst can't handle it yet, so it's deferred until we're >=PAUSED.
+    bool m_seekIsPending;
     MediaTime m_seekTime;
     GRefPtr<GstElement> m_source;
     bool m_volumeAndMuteInitialized;
@@ -212,6 +214,7 @@
     void readyTimerFired();
 
     void notifyPlayerOfVideo();
+    void notifyPlayerOfVideoCaps();
     void notifyPlayerOfAudio();
 
 #if ENABLE(VIDEO_TRACK)
@@ -222,13 +225,11 @@
     void ensureAudioSourceProvider();
     void setAudioStreamProperties(GObject*);
 
-    virtual void didEnd();
-    void invalidateCachedPosition() { m_lastQueryTime.reset(); }
-
     static void setAudioStreamPropertiesCallback(MediaPlayerPrivateGStreamer*, GObject*);
 
     static void sourceSetupCallback(MediaPlayerPrivateGStreamer*, GstElement*);
     static void videoChangedCallback(MediaPlayerPrivateGStreamer*);
+    static void videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer*);
     static void audioChangedCallback(MediaPlayerPrivateGStreamer*);
 #if ENABLE(VIDEO_TRACK)
     static void textChangedCallback(MediaPlayerPrivateGStreamer*);
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
index d64787b..7f2b041 100644
--- a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
@@ -560,7 +560,6 @@
 // Returns the size of the video
 FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
 {
-    ASSERT(isMainThread());
 #if USE(GSTREAMER_HOLEPUNCH)
     // When using the holepuch we may not be able to get the video frames size, so we can't use
     // it. But we need to report some non empty naturalSize for the player's GraphicsLayer
@@ -575,7 +574,6 @@
         return m_videoSize;
 
     auto sampleLocker = holdLock(m_sampleMutex);
-
     if (!GST_IS_SAMPLE(m_sample.get()))
         return FloatSize();
 
@@ -583,14 +581,6 @@
     if (!caps)
         return FloatSize();
 
-    m_videoSize = naturalSizeFromCaps(caps);
-    GST_DEBUG_OBJECT(pipeline(), "Natural size: %.0fx%.0f", m_videoSize.width(), m_videoSize.height());
-    return m_videoSize;
-}
-
-FloatSize MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps(GstCaps* caps) const
-{
-    ASSERT(caps);
 
     // TODO: handle possible clean aperture data. See
     // https://bugzilla.gnome.org/show_bug.cgi?id=596571
@@ -641,7 +631,9 @@
         height = static_cast<guint64>(originalSize.height());
     }
 
-    return FloatSize(static_cast<int>(width), static_cast<int>(height));
+    GST_DEBUG_OBJECT(pipeline(), "Natural size: %" G_GUINT64_FORMAT "x%" G_GUINT64_FORMAT, width, height);
+    m_videoSize = FloatSize(static_cast<int>(width), static_cast<int>(height));
+    return m_videoSize;
 }
 
 void MediaPlayerPrivateGStreamerBase::setVolume(float volume)
@@ -695,6 +687,11 @@
     return m_readyState;
 }
 
+void MediaPlayerPrivateGStreamerBase::sizeChanged()
+{
+    notImplemented();
+}
+
 void MediaPlayerPrivateGStreamerBase::setMuted(bool mute)
 {
     if (!m_volumeElement)
@@ -827,28 +824,12 @@
     m_drawCondition.notifyOne();
 }
 
-bool MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const
-{
-    ASSERT(sampleA);
-    ASSERT(sampleB);
-
-    GstCaps* capsA = gst_sample_get_caps(sampleA);
-    GstCaps* capsB = gst_sample_get_caps(sampleB);
-
-    if (LIKELY(capsA == capsB))
-        return false;
-
-    return naturalSizeFromCaps(capsA) != naturalSizeFromCaps(capsB);
-}
-
 void MediaPlayerPrivateGStreamerBase::triggerRepaint(GstSample* sample)
 {
     bool triggerResize;
     {
         auto sampleLocker = holdLock(m_sampleMutex);
-        triggerResize = !m_sample || doSamplesHaveDifferentNaturalSizes(m_sample.get(), sample);
-        if (triggerResize)
-            m_videoSize = FloatSize(); // Force re-calculation in next call to naturalSize().
+        triggerResize = !m_sample;
         m_sample = sample;
     }
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
index 312c705..ab5a93d 100644
--- a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
+++ b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
@@ -122,6 +122,7 @@
 
     void setVisible(bool) override { }
     void setSize(const IntSize&) override;
+    void sizeChanged();
 
     // Prefer MediaTime based methods over float based.
     float duration() const override { return durationMediaTime().toFloat(); }
@@ -250,6 +251,7 @@
 
     enum MainThreadNotification {
         VideoChanged = 1 << 0,
+        VideoCapsChanged = 1 << 1,
         AudioChanged = 1 << 2,
         VolumeChanged = 1 << 3,
         MuteChanged = 1 << 4,
@@ -269,11 +271,10 @@
     MediaPlayer::ReadyState m_readyState;
     mutable MediaPlayer::NetworkState m_networkState;
     IntSize m_size;
-
     mutable Lock m_sampleMutex;
     GRefPtr<GstSample> m_sample;
-    mutable FloatSize m_videoSize;
 
+    mutable FloatSize m_videoSize;
     bool m_usingFallbackVideoSink { false };
     bool m_renderingCanBeAccelerated { false };
 
@@ -315,10 +316,6 @@
 #endif
 
     Optional<GstVideoDecoderPlatform> m_videoDecoderPlatform;
-
-private:
-    FloatSize naturalSizeFromCaps(GstCaps*) const;
-    bool doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const;
 };
 
 }
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp
index 912aa5f..f5a7dd6 100644
--- a/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp
@@ -43,7 +43,7 @@
 
     auto createMediaTime =
         [](GstClockTime time) -> MediaTime {
-            return MediaTime(time, GST_SECOND);
+            return MediaTime(GST_TIME_AS_USECONDS(time), G_USEC_PER_SEC);
         };
 
     if (GST_BUFFER_PTS_IS_VALID(buffer))
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
index b03944e..5a2bded 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
@@ -458,12 +458,6 @@
         return;
     }
 
-    if (!GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get()))) {
-        // When demuxing Vorbis, matroskademux creates several PTS-less frames with header information. We don't need those.
-        GST_DEBUG("Ignoring sample without PTS: %" GST_PTR_FORMAT, gst_sample_get_buffer(sample.get()));
-        return;
-    }
-
     auto mediaSample = WebCore::MediaSampleGStreamer::create(WTFMove(sample), m_presentationSize, trackId());
 
     GST_TRACE("append: trackId=%s PTS=%s DTS=%s DUR=%s presentationSize=%.0fx%.0f",
@@ -752,9 +746,6 @@
     // Only one stream per demuxer is supported.
     ASSERT(!gst_pad_is_linked(sinkSinkPad.get()));
 
-    // As it is now, resetParserState() will cause the pads to be disconnected, so they will later be re-added on the next initialization segment.
-    bool firstTimeConnectingTrack = m_track == nullptr;
-
     GRefPtr<GstCaps> caps = adoptGRef(gst_pad_get_current_caps(GST_PAD(demuxerSrcPad)));
 
 #ifndef GST_DISABLE_GST_DEBUG
@@ -795,7 +786,7 @@
     }
 
     m_appsinkCaps = WTFMove(caps);
-    m_playerPrivate->trackDetected(this, m_track, firstTimeConnectingTrack);
+    m_playerPrivate->trackDetected(this, m_track, true);
 }
 
 void AppendPipeline::disconnectDemuxerSrcPadFromAppsinkFromAnyThread(GstPad*)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h
index bd06ede..952970d 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h
@@ -52,9 +52,8 @@
     void pushNewBuffer(GRefPtr<GstBuffer>&&);
     void resetParserState();
     Ref<SourceBufferPrivateGStreamer> sourceBufferPrivate() { return m_sourceBufferPrivate.get(); }
-    const GRefPtr<GstCaps>& appsinkCaps() { return m_appsinkCaps; }
+    GstCaps* appsinkCaps() { return m_appsinkCaps.get(); }
     RefPtr<WebCore::TrackPrivateBase> track() { return m_track; }
-    MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
     MediaPlayerPrivateGStreamerMSE* playerPrivate() { return m_playerPrivate; }
 
 private:
@@ -82,6 +81,7 @@
     GstElement* appsrc() { return m_appsrc.get(); }
     GstElement* appsink() { return m_appsink.get(); }
     GstCaps* demuxerSrcPadCaps() { return m_demuxerSrcPadCaps.get(); }
+    WebCore::MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
 
     void disconnectDemuxerSrcPadFromAppsinkFromAnyThread(GstPad*);
     void connectDemuxerSrcPadToAppsinkFromStreamingThread(GstPad*);
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
index ac0122f..e6880d9 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
@@ -3,9 +3,9 @@
  * Copyright (C) 2007 Collabora Ltd.  All rights reserved.
  * Copyright (C) 2007 Alp Toker <alp@atoker.com>
  * Copyright (C) 2009 Gustavo Noronha Silva <gns@gnome.org>
- * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017, 2018, 2019 Igalia S.L
+ * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017 Igalia S.L
  * Copyright (C) 2015 Sebastian Dröge <sebastian@centricular.com>
- * Copyright (C) 2015, 2016, 2017, 2018, 2019 Metrological Group B.V.
+ * Copyright (C) 2015, 2016, 2017 Metrological Group B.V.
  *
  * This library is free software; you can redistribute it and/or
  * modify it under the terms of the GNU Library General Public
@@ -37,6 +37,7 @@
 #include "MediaDescription.h"
 #include "MediaPlayer.h"
 #include "NotImplemented.h"
+#include "PlaybackPipeline.h"
 #include "SourceBufferPrivateGStreamer.h"
 #include "TimeRanges.h"
 #include "VideoTrackPrivateGStreamer.h"
@@ -99,7 +100,13 @@
 #endif
     m_appendPipelinesMap.clear();
 
-    m_source.clear();
+    if (m_source) {
+        webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), nullptr);
+        g_signal_handlers_disconnect_by_data(m_source.get(), this);
+    }
+
+    if (m_playbackPipeline)
+        m_playbackPipeline->setWebKitMediaSrc(nullptr);
 }
 
 void MediaPlayerPrivateGStreamerMSE::load(const String& urlString)
@@ -111,6 +118,9 @@
         return;
     }
 
+    if (!m_playbackPipeline)
+        m_playbackPipeline = PlaybackPipeline::create();
+
     MediaPlayerPrivateGStreamer::load(urlString);
 }
 
@@ -120,18 +130,10 @@
     load(makeString("mediasource", url));
 }
 
-void MediaPlayerPrivateGStreamerMSE::play()
-{
-    GST_DEBUG_OBJECT(pipeline(), "Play requested");
-    m_paused = false;
-    updateStates();
-}
-
 void MediaPlayerPrivateGStreamerMSE::pause()
 {
-    GST_DEBUG_OBJECT(pipeline(), "Pause requested");
     m_paused = true;
-    updateStates();
+    MediaPlayerPrivateGStreamer::pause();
 }
 
 MediaTime MediaPlayerPrivateGStreamerMSE::durationMediaTime() const
@@ -144,50 +146,310 @@
 
 void MediaPlayerPrivateGStreamerMSE::seek(const MediaTime& time)
 {
-    GST_DEBUG_OBJECT(pipeline(), "Seeking to %s", time.toString().utf8().data());
+    if (UNLIKELY(!m_pipeline || m_errorOccured))
+        return;
+
+    GST_INFO("[Seek] seek attempt to %s secs", toString(time).utf8().data());
+
+    // Avoid useless seeking.
+    MediaTime current = currentMediaTime();
+    if (time == current) {
+        if (!m_seeking)
+            timeChanged();
+        return;
+    }
+
+    if (isLiveStream())
+        return;
+
+    if (m_seeking && m_seekIsPending) {
+        m_seekTime = time;
+        return;
+    }
+
+    GST_DEBUG("Seeking from %s to %s seconds", toString(current).utf8().data(), toString(time).utf8().data());
+
+    MediaTime previousSeekTime = m_seekTime;
     m_seekTime = time;
-    m_seeking = true;
+
+    if (!doSeek()) {
+        m_seekTime = previousSeekTime;
+        GST_WARNING("Seeking to %s failed", toString(time).utf8().data());
+        return;
+    }
+
     m_isEndReached = false;
-
-    webKitMediaSrcSeek(WEBKIT_MEDIA_SRC(m_source.get()), toGstClockTime(m_seekTime), m_playbackRate);
-
-    invalidateCachedPosition();
-    m_canFallBackToLastFinishedSeekPosition = true;
-
-    // Notify MediaSource and have new frames enqueued (when they're available).
-    m_mediaSource->seekToTime(time);
+    GST_DEBUG("m_seeking=%s, m_seekTime=%s", boolForPrinting(m_seeking), toString(m_seekTime).utf8().data());
 }
 
-void MediaPlayerPrivateGStreamerMSE::reportSeekCompleted()
+void MediaPlayerPrivateGStreamerMSE::configurePlaySink()
 {
-    m_seeking = false;
-    m_player->timeChanged();
+    MediaPlayerPrivateGStreamer::configurePlaySink();
+
+    GRefPtr<GstElement> playsink = adoptGRef(gst_bin_get_by_name(GST_BIN(m_pipeline.get()), "playsink"));
+    if (playsink) {
+        // The default value (0) means "send events to all the sinks", instead
+        // of "only to the first that returns true". This is needed for MSE seek.
+        g_object_set(G_OBJECT(playsink.get()), "send-event-mode", 0, nullptr);
+    }
 }
 
+bool MediaPlayerPrivateGStreamerMSE::changePipelineState(GstState newState)
+{
+    if (seeking()) {
+        GST_DEBUG("Rejected state change to %s while seeking",
+            gst_element_state_get_name(newState));
+        return true;
+    }
+
+    return MediaPlayerPrivateGStreamer::changePipelineState(newState);
+}
+
+void MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime(const MediaTime& seekTime)
+{
+    // Reenqueue samples needed to resume playback in the new position.
+    m_mediaSource->seekToTime(seekTime);
+
+    GST_DEBUG("MSE seek to %s finished", toString(seekTime).utf8().data());
+
+    if (!m_gstSeekCompleted) {
+        m_gstSeekCompleted = true;
+        maybeFinishSeek();
+    }
+}
+
+bool MediaPlayerPrivateGStreamerMSE::doSeek(const MediaTime&, float, GstSeekFlags)
+{
+    // Use doSeek() instead. If anybody is calling this version of doSeek(), something is wrong.
+    ASSERT_NOT_REACHED();
+    return false;
+}
+
+bool MediaPlayerPrivateGStreamerMSE::doSeek()
+{
+    MediaTime seekTime = m_seekTime;
+    double rate = m_player->rate();
+    GstSeekFlags seekType = static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE);
+
+    // Always move to seeking state to report correct 'currentTime' while pending for actual seek to complete.
+    m_seeking = true;
+
+    // Check if playback pipeline is ready for seek.
+    GstState state, newState;
+    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
+    if (getStateResult == GST_STATE_CHANGE_FAILURE || getStateResult == GST_STATE_CHANGE_NO_PREROLL) {
+        GST_DEBUG("[Seek] cannot seek, current state change is %s", gst_element_state_change_return_get_name(getStateResult));
+        webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
+        m_seeking = false;
+        return false;
+    }
+    if ((getStateResult == GST_STATE_CHANGE_ASYNC
+        && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED))
+        || state < GST_STATE_PAUSED
+        || m_isEndReached
+        || !m_gstSeekCompleted) {
+        CString reason = "Unknown reason";
+        if (getStateResult == GST_STATE_CHANGE_ASYNC) {
+            reason = makeString("In async change ",
+                gst_element_state_get_name(state), " --> ",
+                gst_element_state_get_name(newState)).utf8();
+        } else if (state < GST_STATE_PAUSED)
+            reason = "State less than PAUSED";
+        else if (m_isEndReached)
+            reason = "End reached";
+        else if (!m_gstSeekCompleted)
+            reason = "Previous seek is not finished yet";
+
+        GST_DEBUG("[Seek] Delaying the seek: %s", reason.data());
+
+        m_seekIsPending = true;
+
+        if (m_isEndReached) {
+            GST_DEBUG("[Seek] reset pipeline");
+            m_resetPipeline = true;
+            m_seeking = false;
+            if (!changePipelineState(GST_STATE_PAUSED))
+                loadingFailed(MediaPlayer::Empty);
+            else
+                m_seeking = true;
+        }
+
+        return m_seeking;
+    }
+
+    // Stop accepting new samples until actual seek is finished.
+    webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), false);
+
+    // Correct seek time if it helps to fix a small gap.
+    if (!isTimeBuffered(seekTime)) {
+        // Look if a near future time (<0.1 sec.) is buffered and change the seek target time.
+        if (m_mediaSource) {
+            const MediaTime miniGap = MediaTime(1, 10);
+            MediaTime nearest = m_mediaSource->buffered()->nearest(seekTime);
+            if (nearest.isValid() && nearest > seekTime && (nearest - seekTime) <= miniGap && isTimeBuffered(nearest + miniGap)) {
+                GST_DEBUG("[Seek] Changed the seek target time from %s to %s, a near point in the future", toString(seekTime).utf8().data(), toString(nearest).utf8().data());
+                seekTime = nearest;
+            }
+        }
+    }
+
+    // Check if MSE has samples for requested time and defer actual seek if needed.
+    if (!isTimeBuffered(seekTime)) {
+        GST_DEBUG("[Seek] Delaying the seek: MSE is not ready");
+        GstStateChangeReturn setStateResult = gst_element_set_state(m_pipeline.get(), GST_STATE_PAUSED);
+        if (setStateResult == GST_STATE_CHANGE_FAILURE) {
+            GST_DEBUG("[Seek] Cannot seek, failed to pause playback pipeline.");
+            webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
+            m_seeking = false;
+            return false;
+        }
+        m_readyState = MediaPlayer::HaveMetadata;
+        notifySeekNeedsDataForTime(seekTime);
+        ASSERT(!m_mseSeekCompleted);
+        return true;
+    }
+
+    // Complete previous MSE seek if needed.
+    if (!m_mseSeekCompleted) {
+        m_mediaSource->monitorSourceBuffers();
+        ASSERT(m_mseSeekCompleted);
+        // Note: seekCompleted will recursively call us.
+        return m_seeking;
+    }
+
+    GST_DEBUG("We can seek now");
+
+    MediaTime startTime = seekTime, endTime = MediaTime::invalidTime();
+
+    if (rate < 0) {
+        startTime = MediaTime::zeroTime();
+        endTime = seekTime;
+    }
+
+    if (!rate)
+        rate = 1;
+
+    GST_DEBUG("Actual seek to %s, end time:  %s, rate: %f", toString(startTime).utf8().data(), toString(endTime).utf8().data(), rate);
+
+    // This will call notifySeekNeedsData() after some time to tell that the pipeline is ready for sample enqueuing.
+    webKitMediaSrcPrepareSeek(WEBKIT_MEDIA_SRC(m_source.get()), seekTime);
+
+    m_gstSeekCompleted = false;
+    if (!gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType, GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime))) {
+        webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
+        m_seeking = false;
+        m_gstSeekCompleted = true;
+        GST_DEBUG("doSeek(): gst_element_seek() failed, returning false");
+        return false;
+    }
+
+    // The samples will be enqueued in notifySeekNeedsData().
+    GST_DEBUG("doSeek(): gst_element_seek() succeeded, returning true");
+    return true;
+}
+
+void MediaPlayerPrivateGStreamerMSE::maybeFinishSeek()
+{
+    if (!m_seeking || !m_mseSeekCompleted || !m_gstSeekCompleted)
+        return;
+
+    GstState state, newState;
+    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
+
+    if (getStateResult == GST_STATE_CHANGE_ASYNC
+        && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED)) {
+        GST_DEBUG("[Seek] Delaying seek finish");
+        return;
+    }
+
+    if (m_seekIsPending) {
+        GST_DEBUG("[Seek] Committing pending seek to %s", toString(m_seekTime).utf8().data());
+        m_seekIsPending = false;
+        if (!doSeek()) {
+            GST_WARNING("[Seek] Seeking to %s failed", toString(m_seekTime).utf8().data());
+            m_cachedPosition = MediaTime::invalidTime();
+        }
+        return;
+    }
+
+    GST_DEBUG("[Seek] Seeked to %s", toString(m_seekTime).utf8().data());
+
+    webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
+    m_seeking = false;
+    m_cachedPosition = MediaTime::invalidTime();
+    // The pipeline can still have a pending state; in that case a position query would fail.
+    // Allow m_seekTime to be used as a fallback for now.
+    m_canFallBackToLastFinishedSeekPosition = true;
+    timeChanged();
+}
+
+void MediaPlayerPrivateGStreamerMSE::updatePlaybackRate()
+{
+    notImplemented();
+}
+
+bool MediaPlayerPrivateGStreamerMSE::seeking() const
+{
+    return m_seeking;
+}
+
+// FIXME: MediaPlayerPrivateGStreamer manages the ReadyState on its own. We shouldn't change it manually.
 void MediaPlayerPrivateGStreamerMSE::setReadyState(MediaPlayer::ReadyState readyState)
 {
     if (readyState == m_readyState)
         return;
 
-    GST_DEBUG("MediaPlayerPrivateGStreamerMSE::setReadyState(%p): %s -> %s", this, dumpReadyState(m_readyState), dumpReadyState(readyState));
-    m_readyState = readyState;
-    updateStates();
+    if (seeking()) {
+        GST_DEBUG("Skip ready state change(%s -> %s) due to seek\n", dumpReadyState(m_readyState), dumpReadyState(readyState));
+        return;
+    }
 
-    // Both readyStateChanged() and timeChanged() check for "seeked" condition, which requires all the following three things:
-    //   1. HTMLMediaPlayer.m_seekRequested == true.
-    //   2. Our seeking() method to return false (that is, we have completed the seek).
-    //   3. readyState > HaveMetadata.
-    //
-    // We normally would set m_seeking = false in seekCompleted(), but unfortunately by that time, playback has already
-    // started which means that the "playing" event is emitted before "seeked". In order to avoid that wrong order,
-    // we do it here already.
-    if (m_seeking && readyState > MediaPlayer::ReadyState::HaveMetadata)
-        m_seeking = false;
+    GST_DEBUG("Ready State Changed manually from %u to %u", m_readyState, readyState);
+    MediaPlayer::ReadyState oldReadyState = m_readyState;
+    m_readyState = readyState;
+    GST_DEBUG("m_readyState: %s -> %s", dumpReadyState(oldReadyState), dumpReadyState(m_readyState));
+
+    if (oldReadyState < MediaPlayer::HaveCurrentData && m_readyState >= MediaPlayer::HaveCurrentData) {
+        GST_DEBUG("[Seek] Reporting load state changed to trigger seek continuation");
+        loadStateChanged();
+    }
     m_player->readyStateChanged();
 
-    // The readyState change may be a result of monitorSourceBuffers() finding that currentTime == duration, which
-    // should cause the video to be marked as ended. Let's have the player check that.
-    m_player->timeChanged();
+    GstState pipelineState;
+    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &pipelineState, nullptr, 250 * GST_NSECOND);
+    bool isPlaying = (getStateResult == GST_STATE_CHANGE_SUCCESS && pipelineState == GST_STATE_PLAYING);
+
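+    // If the ready state drops back to HaveMetadata while the pipeline is still playing (presumably because
+    // the data needed to continue playback is no longer available), bring the pipeline back to PAUSED.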
+    if (m_readyState == MediaPlayer::HaveMetadata && oldReadyState > MediaPlayer::HaveMetadata && isPlaying) {
+        GST_TRACE("Changing pipeline to PAUSED...");
+        bool ok = changePipelineState(GST_STATE_PAUSED);
+        GST_TRACE("Changed pipeline to PAUSED: %s", ok ? "Success" : "Error");
+    }
+}
+
+void MediaPlayerPrivateGStreamerMSE::waitForSeekCompleted()
+{
+    if (!m_seeking)
+        return;
+
+    GST_DEBUG("Waiting for MSE seek completed");
+    m_mseSeekCompleted = false;
+}
+
+void MediaPlayerPrivateGStreamerMSE::seekCompleted()
+{
+    if (m_mseSeekCompleted)
+        return;
+
+    GST_DEBUG("MSE seek completed");
+    m_mseSeekCompleted = true;
+
+    doSeek();
+
+    if (!seeking() && m_readyState >= MediaPlayer::HaveFutureData)
+        changePipelineState(GST_STATE_PLAYING);
+
+    if (!seeking())
+        m_player->timeChanged();
 }
 
 void MediaPlayerPrivateGStreamerMSE::setRate(float)
@@ -203,33 +465,168 @@
 void MediaPlayerPrivateGStreamerMSE::sourceSetup(GstElement* sourceElement)
 {
     m_source = sourceElement;
+
     ASSERT(WEBKIT_IS_MEDIA_SRC(m_source.get()));
+
+    m_playbackPipeline->setWebKitMediaSrc(WEBKIT_MEDIA_SRC(m_source.get()));
+
     MediaSourceGStreamer::open(*m_mediaSource.get(), *this);
+    g_signal_connect_swapped(m_source.get(), "video-changed", G_CALLBACK(videoChangedCallback), this);
+    g_signal_connect_swapped(m_source.get(), "audio-changed", G_CALLBACK(audioChangedCallback), this);
+    g_signal_connect_swapped(m_source.get(), "text-changed", G_CALLBACK(textChangedCallback), this);
+    webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), this);
 }
 
 void MediaPlayerPrivateGStreamerMSE::updateStates()
 {
-    bool shouldBePlaying = !m_paused && readyState() >= MediaPlayer::ReadyState::HaveFutureData;
-    GST_DEBUG_OBJECT(pipeline(), "shouldBePlaying = %d, m_isPipelinePlaying = %d", static_cast<int>(shouldBePlaying), static_cast<int>(m_isPipelinePlaying));
-    if (shouldBePlaying && !m_isPipelinePlaying) {
-        if (!changePipelineState(GST_STATE_PLAYING))
-            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PLAYING failed");
-        m_isPipelinePlaying = true;
-    } else if (!shouldBePlaying && m_isPipelinePlaying) {
-        if (!changePipelineState(GST_STATE_PAUSED))
-            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PAUSED failed");
-        m_isPipelinePlaying = false;
+    if (UNLIKELY(!m_pipeline || m_errorOccured))
+        return;
+
+    MediaPlayer::NetworkState oldNetworkState = m_networkState;
+    MediaPlayer::ReadyState oldReadyState = m_readyState;
+    GstState state, pending;
+
+    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &pending, 250 * GST_NSECOND);
+
+    bool shouldUpdatePlaybackState = false;
+    switch (getStateResult) {
+    case GST_STATE_CHANGE_SUCCESS: {
+        GST_DEBUG("State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
+
+        // If we are at EOS and the state changed to READY, do nothing: this avoids recreating the player
+        // in HTMLMediaElement and lets the video 'ended' event be generated properly.
+        if (m_isEndReached && state == GST_STATE_READY)
+            break;
+
+        m_resetPipeline = (state <= GST_STATE_READY);
+        if (m_resetPipeline)
+            m_mediaTimeDuration = MediaTime::zeroTime();
+
+        // Update ready and network states.
+        switch (state) {
+        case GST_STATE_NULL:
+            m_readyState = MediaPlayer::HaveNothing;
+            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+            m_networkState = MediaPlayer::Empty;
+            break;
+        case GST_STATE_READY:
+            m_readyState = MediaPlayer::HaveMetadata;
+            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+            m_networkState = MediaPlayer::Empty;
+            break;
+        case GST_STATE_PAUSED:
+        case GST_STATE_PLAYING:
+            if (seeking()) {
+                m_readyState = MediaPlayer::HaveMetadata;
+                // FIXME: Should we manage NetworkState too?
+                GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+            } else {
+                if (m_readyState < MediaPlayer::HaveFutureData)
+                    m_readyState = MediaPlayer::HaveFutureData;
+                GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+                m_networkState = MediaPlayer::Loading;
+            }
+
+            if (m_eosMarked && state == GST_STATE_PLAYING)
+                m_eosPending = true;
+
+            break;
+        default:
+            ASSERT_NOT_REACHED();
+            break;
+        }
+
+        // Sync states where needed.
+        if (state == GST_STATE_PAUSED) {
+            if (!m_volumeAndMuteInitialized) {
+                notifyPlayerOfVolumeChange();
+                notifyPlayerOfMute();
+                m_volumeAndMuteInitialized = true;
+            }
+
+            if (!seeking() && !m_paused && m_playbackRate) {
+                GST_DEBUG("[Buffering] Restarting playback.");
+                changePipelineState(GST_STATE_PLAYING);
+            }
+        } else if (state == GST_STATE_PLAYING) {
+            m_paused = false;
+
+            if (!m_playbackRate) {
+                GST_DEBUG("[Buffering] Pausing stream for buffering.");
+                changePipelineState(GST_STATE_PAUSED);
+            }
+        } else
+            m_paused = true;
+
+        if (m_requestedState == GST_STATE_PAUSED && state == GST_STATE_PAUSED) {
+            shouldUpdatePlaybackState = true;
+            GST_DEBUG("Requested state change to %s was completed", gst_element_state_get_name(state));
+        }
+
+        break;
+    }
+    case GST_STATE_CHANGE_ASYNC:
+        GST_DEBUG("Async: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
+        // Change in progress.
+        break;
+    case GST_STATE_CHANGE_FAILURE:
+        GST_WARNING("Failure: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
+        // Change failed.
+        return;
+    case GST_STATE_CHANGE_NO_PREROLL:
+        GST_DEBUG("No preroll: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
+
+        // Live pipelines go to PAUSED without prerolling.
+        m_isStreaming = true;
+
+        if (state == GST_STATE_READY) {
+            m_readyState = MediaPlayer::HaveNothing;
+            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+        } else if (state == GST_STATE_PAUSED) {
+            m_readyState = MediaPlayer::HaveEnoughData;
+            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
+            m_paused = true;
+        } else if (state == GST_STATE_PLAYING)
+            m_paused = false;
+
+        if (!m_paused && m_playbackRate)
+            changePipelineState(GST_STATE_PLAYING);
+
+        m_networkState = MediaPlayer::Loading;
+        break;
+    default:
+        GST_DEBUG("Else : %d", getStateResult);
+        break;
+    }
+
+    m_requestedState = GST_STATE_VOID_PENDING;
+
+    if (shouldUpdatePlaybackState)
+        m_player->playbackStateChanged();
+
+    if (m_networkState != oldNetworkState) {
+        GST_DEBUG("Network State Changed from %u to %u", oldNetworkState, m_networkState);
+        m_player->networkStateChanged();
+    }
+    if (m_readyState != oldReadyState) {
+        GST_DEBUG("Ready State Changed from %u to %u", oldReadyState, m_readyState);
+        m_player->readyStateChanged();
+    }
+
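+    // Once the pipeline has settled in PAUSED or beyond, give maybeFinishSeek() a chance to complete an
+    // in-flight seek.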
+    if (getStateResult == GST_STATE_CHANGE_SUCCESS && state >= GST_STATE_PAUSED) {
+        updatePlaybackRate();
+        maybeFinishSeek();
     }
 }
-
-void MediaPlayerPrivateGStreamerMSE::didEnd()
+void MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone()
 {
-    GST_DEBUG_OBJECT(pipeline(), "EOS received, currentTime=%s duration=%s", currentMediaTime().toString().utf8().data(), durationMediaTime().toString().utf8().data());
-    m_isEndReached = true;
-    invalidateCachedPosition();
-    // HTMLMediaElement will emit ended if currentTime >= duration (which should now be the case).
-    ASSERT(currentMediaTime() == durationMediaTime());
-    m_player->timeChanged();
+    if (UNLIKELY(!m_pipeline || m_errorOccured))
+        return;
+
+    if (m_seeking)
+        maybeFinishSeek();
+    else
+        updateStates();
 }
 
 bool MediaPlayerPrivateGStreamerMSE::isTimeBuffered(const MediaTime &time) const
@@ -244,6 +641,11 @@
     m_mediaSourceClient = client.ptr();
 }
 
+RefPtr<MediaSourceClientGStreamerMSE> MediaPlayerPrivateGStreamerMSE::mediaSourceClient()
+{
+    return m_mediaSourceClient;
+}
+
 void MediaPlayerPrivateGStreamerMSE::blockDurationChanges()
 {
     ASSERT(isMainThread());
@@ -256,6 +658,7 @@
     ASSERT(isMainThread());
     if (m_shouldReportDurationWhenUnblocking) {
         m_player->durationChanged();
+        m_playbackPipeline->notifyDurationChanged();
         m_shouldReportDurationWhenUnblocking = false;
     }
 
@@ -278,9 +681,10 @@
     // Avoid emiting durationchanged in the case where the previous duration was 0 because that case is already handled
     // by the HTMLMediaElement.
     if (m_mediaTimeDuration != previousDuration && m_mediaTimeDuration.isValid() && previousDuration.isValid()) {
-        if (!m_areDurationChangesBlocked)
+        if (!m_areDurationChangesBlocked) {
             m_player->durationChanged();
-        else
+            m_playbackPipeline->notifyDurationChanged();
+        } else
             m_shouldReportDurationWhenUnblocking = true;
         m_mediaSource->durationChanged(m_mediaTimeDuration);
     }
@@ -290,18 +694,20 @@
 {
     ASSERT(appendPipeline->track() == newTrack);
 
-    GRefPtr<GstCaps> caps = appendPipeline->appsinkCaps();
+    GstCaps* caps = appendPipeline->appsinkCaps();
     ASSERT(caps);
-    GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps.get());
+    GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps);
 
-    if (doCapsHaveType(caps.get(), GST_VIDEO_CAPS_TYPE_PREFIX)) {
-        Optional<FloatSize> size = getVideoResolutionFromCaps(caps.get());
+    if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
+        Optional<FloatSize> size = getVideoResolutionFromCaps(caps);
         if (size.hasValue())
             m_videoSize = size.value();
     }
 
     if (firstTrackDetected)
-        webKitMediaSrcAddStream(WEBKIT_MEDIA_SRC(m_source.get()), newTrack->id(), appendPipeline->streamType(), WTFMove(caps));
+        m_playbackPipeline->attachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
+    else
+        m_playbackPipeline->reattachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
 }
 
 void MediaPlayerPrivateGStreamerMSE::getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types)
@@ -338,6 +744,34 @@
     return finalResult;
 }
 
+void MediaPlayerPrivateGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
+{
+    if (status != MediaSourcePrivate::EosNoError)
+        return;
+
+    GST_DEBUG("Marking end of stream");
+    m_eosMarked = true;
+    updateStates();
+}
+
+MediaTime MediaPlayerPrivateGStreamerMSE::currentMediaTime() const
+{
+    MediaTime position = MediaPlayerPrivateGStreamer::currentMediaTime();
+
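+    // When an EOS is pending and playback has reached the duration, report the network state as Loaded and
+    // notify timeChanged() so HTMLMediaElement can emit 'ended'.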
+    if (m_eosPending && position >= durationMediaTime()) {
+        if (m_networkState != MediaPlayer::Loaded) {
+            m_networkState = MediaPlayer::Loaded;
+            m_player->networkStateChanged();
+        }
+
+        m_eosPending = false;
+        m_isEndReached = true;
+        m_cachedPosition = m_mediaTimeDuration;
+        m_player->timeChanged();
+    }
+    return position;
+}
+
 MediaTime MediaPlayerPrivateGStreamerMSE::maxMediaTimeSeekable() const
 {
     if (UNLIKELY(m_errorOccured))
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h b/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h
index fe6179f..3bcc0fd 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h
@@ -55,12 +55,13 @@
     void updateDownloadBufferingFlag() override { };
 
     bool isLiveStream() const override { return false; }
+    MediaTime currentMediaTime() const override;
 
-    void play() override;
     void pause() override;
+    bool seeking() const override;
     void seek(const MediaTime&) override;
-    void reportSeekCompleted();
-    void updatePipelineState(GstState);
+    void configurePlaySink() override;
+    bool changePipelineState(GstState) override;
 
     void durationChanged() override;
     MediaTime durationMediaTime() const override;
@@ -72,38 +73,50 @@
     void sourceSetup(GstElement*) override;
 
     void setReadyState(MediaPlayer::ReadyState);
+    void waitForSeekCompleted();
+    void seekCompleted();
     MediaSourcePrivateClient* mediaSourcePrivateClient() { return m_mediaSource.get(); }
 
+    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
+
     void trackDetected(RefPtr<AppendPipeline>, RefPtr<WebCore::TrackPrivateBase>, bool firstTrackDetected);
+    void notifySeekNeedsDataForTime(const MediaTime&);
 
     void blockDurationChanges();
     void unblockDurationChanges();
 
-    void asyncStateChangeDone() override { }
-
-protected:
-    void didEnd() override;
-
 private:
     static void getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>&);
     static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&);
 
+    // FIXME: Reduce code duplication.
     void updateStates() override;
 
+    bool doSeek(const MediaTime&, float, GstSeekFlags) override;
+    bool doSeek();
+    void maybeFinishSeek();
+    void updatePlaybackRate() override;
+    void asyncStateChangeDone() override;
+
     // FIXME: Implement videoPlaybackQualityMetrics.
     bool isTimeBuffered(const MediaTime&) const;
 
     bool isMediaSource() const override { return true; }
 
     void setMediaSourceClient(Ref<MediaSourceClientGStreamerMSE>);
+    RefPtr<MediaSourceClientGStreamerMSE> mediaSourceClient();
 
     HashMap<RefPtr<SourceBufferPrivateGStreamer>, RefPtr<AppendPipeline>> m_appendPipelinesMap;
+    bool m_eosMarked = false;
+    mutable bool m_eosPending = false;
+    bool m_gstSeekCompleted = true;
     RefPtr<MediaSourcePrivateClient> m_mediaSource;
     RefPtr<MediaSourceClientGStreamerMSE> m_mediaSourceClient;
     MediaTime m_mediaTimeDuration;
+    bool m_mseSeekCompleted = true;
     bool m_areDurationChangesBlocked = false;
     bool m_shouldReportDurationWhenUnblocking = false;
-    bool m_isPipelinePlaying = true;
+    RefPtr<PlaybackPipeline> m_playbackPipeline;
 };
 
 } // namespace WebCore
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
index e8d9d01..fae74f4 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
@@ -23,6 +23,7 @@
 
 #include "AppendPipeline.h"
 #include "MediaPlayerPrivateGStreamerMSE.h"
+#include "PlaybackPipeline.h"
 #include "WebKitMediaSourceGStreamer.h"
 #include <gst/gst.h>
 
@@ -59,13 +60,14 @@
 {
     ASSERT(WTF::isMainThread());
 
+    ASSERT(m_playerPrivate.m_playbackPipeline);
     ASSERT(sourceBufferPrivate);
 
     RefPtr<AppendPipeline> appendPipeline = adoptRef(new AppendPipeline(*this, *sourceBufferPrivate, m_playerPrivate));
     GST_TRACE("Adding SourceBuffer to AppendPipeline: this=%p sourceBuffer=%p appendPipeline=%p", this, sourceBufferPrivate.get(), appendPipeline.get());
     m_playerPrivate.m_appendPipelinesMap.add(sourceBufferPrivate, appendPipeline);
 
-    return MediaSourcePrivate::Ok;
+    return m_playerPrivate.m_playbackPipeline->addSourceBuffer(sourceBufferPrivate);
 }
 
 const MediaTime& MediaSourceClientGStreamerMSE::duration()
@@ -135,17 +137,25 @@
     appendPipeline->pushNewBuffer(WTFMove(buffer));
 }
 
+void MediaSourceClientGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
+{
+    ASSERT(WTF::isMainThread());
+
+    m_playerPrivate.markEndOfStream(status);
+}
+
 void MediaSourceClientGStreamerMSE::removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
 {
     ASSERT(WTF::isMainThread());
 
+    ASSERT(m_playerPrivate.m_playbackPipeline);
+
     // Remove the AppendPipeline from the map. This should cause its destruction since there should be no alive
     // references at this point.
     ASSERT(m_playerPrivate.m_appendPipelinesMap.get(sourceBufferPrivate)->hasOneRef());
     m_playerPrivate.m_appendPipelinesMap.remove(sourceBufferPrivate);
 
-    if (!sourceBufferPrivate->trackId().isNull())
-        webKitMediaSrcRemoveStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), sourceBufferPrivate->trackId());
+    m_playerPrivate.m_playbackPipeline->removeSourceBuffer(sourceBufferPrivate);
 }
 
 void MediaSourceClientGStreamerMSE::flush(AtomString trackId)
@@ -154,41 +164,21 @@
 
     // This is only for on-the-fly reenqueues after appends. When seeking, the seek will do its own flush.
     if (!m_playerPrivate.m_seeking)
-        webKitMediaSrcFlush(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
+        m_playerPrivate.m_playbackPipeline->flush(trackId);
 }
 
-void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample, AtomString trackId)
+void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample)
 {
     ASSERT(WTF::isMainThread());
 
-    GST_TRACE("enqueing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
-        trackId.string().utf8().data(), sample->presentationTime().toFloat(),
-        sample->presentationSize().width(), sample->presentationSize().height(),
-        GST_TIME_ARGS(WebCore::toGstClockTime(sample->presentationTime())),
-        GST_TIME_ARGS(WebCore::toGstClockTime(sample->duration())));
-
-    GRefPtr<GstSample> gstSample = sample->platformSample().sample.gstSample;
-    ASSERT(gstSample);
-    ASSERT(gst_sample_get_buffer(gstSample.get()));
-
-    webKitMediaSrcEnqueueSample(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, WTFMove(gstSample));
-}
-
-bool MediaSourceClientGStreamerMSE::isReadyForMoreSamples(const AtomString& trackId)
-{
-    return webKitMediaSrcIsReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
-}
-
-void MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples(const AtomString& trackId, SourceBufferPrivateClient* sourceBuffer)
-{
-    webKitMediaSrcNotifyWhenReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, sourceBuffer);
+    m_playerPrivate.m_playbackPipeline->enqueueSample(WTFMove(sample));
 }
 
 void MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued(const AtomString& trackId)
 {
     ASSERT(WTF::isMainThread());
 
-    webKitMediaSrcEndOfStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
+    m_playerPrivate.m_playbackPipeline->allSamplesInTrackEnqueued(trackId);
 }
 
 GRefPtr<WebKitMediaSrc> MediaSourceClientGStreamerMSE::webKitMediaSrc()
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h
index 8e52f94..5008a06 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h
@@ -43,6 +43,7 @@
     // From MediaSourceGStreamer.
     MediaSourcePrivate::AddStatus addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>, const ContentType&);
     void durationChanged(const MediaTime&);
+    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
 
     // From SourceBufferPrivateGStreamer.
     void abort(RefPtr<SourceBufferPrivateGStreamer>);
@@ -50,12 +51,9 @@
     void append(RefPtr<SourceBufferPrivateGStreamer>, Vector<unsigned char>&&);
     void removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer>);
     void flush(AtomString);
-    void enqueueSample(Ref<MediaSample>&&, AtomString trackId);
+    void enqueueSample(Ref<MediaSample>&&);
     void allSamplesInTrackEnqueued(const AtomString&);
 
-    bool isReadyForMoreSamples(const AtomString&);
-    void notifyClientWhenReadyForMoreSamples(const AtomString&, SourceBufferPrivateClient*);
-
     const MediaTime& duration();
     GRefPtr<WebKitMediaSrc> webKitMediaSrc();
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
index 51adf82..e62cd56 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
@@ -92,18 +92,14 @@
     m_client->durationChanged(m_mediaSource->duration());
 }
 
-void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus)
+void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus status)
 {
-    // We don't need to do anything in the AppendPipeline nor the playback pipeline. Instead, SourceBuffer knows better
-    // when .endOfStream() has been called and there are no more samples to enqueue, which it will signal with a call
-    // to SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(), where we enqueue an EOS event into WebKitMediaSrc.
-
-    // At this point it would be dangerously early to do that! There may be samples waiting to reach WebKitMediaSrc
-    // (e.g. because high water level is hit) that will not be shown if we enqueue an EOS now.
+    m_client->markEndOfStream(status);
 }
 
 void MediaSourceGStreamer::unmarkEndOfStream()
 {
+    notImplemented();
 }
 
 MediaPlayer::ReadyState MediaSourceGStreamer::readyState() const
@@ -118,11 +114,12 @@
 
 void MediaSourceGStreamer::waitForSeekCompleted()
 {
+    m_playerPrivate.waitForSeekCompleted();
 }
 
 void MediaSourceGStreamer::seekCompleted()
 {
-    m_playerPrivate.reportSeekCompleted();
+    m_playerPrivate.seekCompleted();
 }
 
 void MediaSourceGStreamer::sourceBufferPrivateDidChangeActiveState(SourceBufferPrivateGStreamer* sourceBufferPrivate, bool isActive)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h
index 0a44746..c9a09fa 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h
@@ -40,6 +40,8 @@
 #include <wtf/Forward.h>
 #include <wtf/HashSet.h>
 
+typedef struct _WebKitMediaSrc WebKitMediaSrc;
+
 namespace WebCore {
 
 class SourceBufferPrivateGStreamer;
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
new file mode 100644
index 0000000..b005d0d
--- /dev/null
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
@@ -0,0 +1,400 @@
+/*
+ * Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
+ * Copyright (C) 2016 Metrological Group B.V.
+ * Copyright (C) 2016 Igalia S.L
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include "config.h"
+#include "PlaybackPipeline.h"
+
+#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
+
+#include "AudioTrackPrivateGStreamer.h"
+#include "GStreamerCommon.h"
+#include "MediaSampleGStreamer.h"
+#include "MediaSample.h"
+#include "SourceBufferPrivateGStreamer.h"
+#include "VideoTrackPrivateGStreamer.h"
+
+#include <gst/app/gstappsrc.h>
+#include <gst/gst.h>
+#include <wtf/MainThread.h>
+#include <wtf/RefCounted.h>
+#include <wtf/glib/GRefPtr.h>
+#include <wtf/glib/GUniquePtr.h>
+#include <wtf/text/AtomString.h>
+
+GST_DEBUG_CATEGORY_EXTERN(webkit_mse_debug);
+#define GST_CAT_DEFAULT webkit_mse_debug
+
+static Stream* getStreamByTrackId(WebKitMediaSrc*, AtomString);
+static Stream* getStreamBySourceBufferPrivate(WebKitMediaSrc*, WebCore::SourceBufferPrivateGStreamer*);
+
+static Stream* getStreamByTrackId(WebKitMediaSrc* source, AtomString trackIdString)
+{
+    // WebKitMediaSrc should be locked at this point.
+    for (Stream* stream : source->priv->streams) {
+        if (stream->type != WebCore::Invalid
+            && ((stream->audioTrack && stream->audioTrack->id() == trackIdString)
+                || (stream->videoTrack && stream->videoTrack->id() == trackIdString))) {
+            return stream;
+        }
+    }
+    return nullptr;
+}
+
+static Stream* getStreamBySourceBufferPrivate(WebKitMediaSrc* source, WebCore::SourceBufferPrivateGStreamer* sourceBufferPrivate)
+{
+    for (Stream* stream : source->priv->streams) {
+        if (stream->sourceBuffer == sourceBufferPrivate)
+            return stream;
+    }
+    return nullptr;
+}
+
+// FIXME: Use gst_app_src_push_sample() instead when we switch to the appropriate GStreamer version.
+static GstFlowReturn pushSample(GstAppSrc* appsrc, GstSample* sample)
+{
+    g_return_val_if_fail(GST_IS_SAMPLE(sample), GST_FLOW_ERROR);
+
+    GstCaps* caps = gst_sample_get_caps(sample);
+    if (caps)
+        gst_app_src_set_caps(appsrc, caps);
+    else
+        GST_WARNING_OBJECT(appsrc, "received sample without caps");
+
+    GstBuffer* buffer = gst_sample_get_buffer(sample);
+    if (UNLIKELY(!buffer)) {
+        GST_WARNING_OBJECT(appsrc, "received sample without buffer");
+        return GST_FLOW_OK;
+    }
+
+    // gst_app_src_push_buffer() steals the reference, we need an additional one.
+    return gst_app_src_push_buffer(appsrc, gst_buffer_ref(buffer));
+}
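+// A sketch of what the FIXME above refers to: gst_app_src_push_sample() (available since GStreamer 1.6)
+// updates the appsrc caps from the sample and pushes its buffer without taking ownership of the sample,
+// so the helper above would essentially reduce to:
+//
+//     return gst_app_src_push_sample(appsrc, sample);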
+
+namespace WebCore {
+
+void PlaybackPipeline::setWebKitMediaSrc(WebKitMediaSrc* webKitMediaSrc)
+{
+    GST_DEBUG("webKitMediaSrc=%p", webKitMediaSrc);
+    m_webKitMediaSrc = webKitMediaSrc;
+}
+
+WebKitMediaSrc* PlaybackPipeline::webKitMediaSrc()
+{
+    return m_webKitMediaSrc.get();
+}
+
+MediaSourcePrivate::AddStatus PlaybackPipeline::addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
+{
+    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
+
+    if (priv->allTracksConfigured) {
+        GST_ERROR_OBJECT(m_webKitMediaSrc.get(), "Adding new source buffers after first data not supported yet");
+        return MediaSourcePrivate::NotSupported;
+    }
+
+    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "State %d", int(GST_STATE(m_webKitMediaSrc.get())));
+
+    Stream* stream = new Stream{ };
+    stream->parent = m_webKitMediaSrc.get();
+    stream->appsrc = gst_element_factory_make("appsrc", nullptr);
+    stream->appsrcNeedDataFlag = false;
+    stream->sourceBuffer = sourceBufferPrivate.get();
+
+    // No track has been attached yet.
+    stream->type = Invalid;
+    stream->caps = nullptr;
+    stream->audioTrack = nullptr;
+    stream->videoTrack = nullptr;
+    stream->presentationSize = WebCore::FloatSize();
+    stream->lastEnqueuedTime = MediaTime::invalidTime();
+
+    gst_app_src_set_callbacks(GST_APP_SRC(stream->appsrc), &enabledAppsrcCallbacks, stream->parent, nullptr);
+    gst_app_src_set_emit_signals(GST_APP_SRC(stream->appsrc), FALSE);
+    gst_app_src_set_stream_type(GST_APP_SRC(stream->appsrc), GST_APP_STREAM_TYPE_SEEKABLE);
+
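+    // The appsrc keeps at most 2 MB queued and signals need-data once its level drops below 20%, which drives
+    // the setReadyForMoreSamples()/notifyReadyForMoreSamples() flow in SourceBufferPrivateGStreamer.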
+    gst_app_src_set_max_bytes(GST_APP_SRC(stream->appsrc), 2 * WTF::MB);
+    g_object_set(G_OBJECT(stream->appsrc), "block", FALSE, "min-percent", 20, "format", GST_FORMAT_TIME, nullptr);
+
+    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
+    priv->streams.append(stream);
+    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+
+    gst_bin_add(GST_BIN(m_webKitMediaSrc.get()), stream->appsrc);
+    gst_element_sync_state_with_parent(stream->appsrc);
+
+    return MediaSourcePrivate::Ok;
+}
+
+void PlaybackPipeline::removeSourceBuffer(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
+{
+    ASSERT(WTF::isMainThread());
+
+    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "Element removed from MediaSource");
+    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
+    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
+    Stream* stream = getStreamBySourceBufferPrivate(m_webKitMediaSrc.get(), sourceBufferPrivate.get());
+    if (stream)
+        priv->streams.removeFirst(stream);
+    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+
+    if (stream)
+        webKitMediaSrcFreeStream(m_webKitMediaSrc.get(), stream);
+}
+
+void PlaybackPipeline::attachTrack(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate, RefPtr<TrackPrivateBase> trackPrivate, GstCaps* caps)
+{
+    WebKitMediaSrc* webKitMediaSrc = m_webKitMediaSrc.get();
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    Stream* stream = getStreamBySourceBufferPrivate(webKitMediaSrc, sourceBufferPrivate.get());
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    ASSERT(stream);
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    unsigned padId = stream->parent->priv->numberOfPads;
+    stream->parent->priv->numberOfPads++;
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    const char* mediaType = capsMediaType(caps);
+    GST_DEBUG_OBJECT(webKitMediaSrc, "Configured track %s: appsrc=%s, padId=%u, mediaType=%s", trackPrivate->id().string().utf8().data(), GST_ELEMENT_NAME(stream->appsrc), padId, mediaType);
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    stream->type = Unknown;
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    GRefPtr<GstPad> sourcePad = adoptGRef(gst_element_get_static_pad(stream->appsrc, "src"));
+    ASSERT(sourcePad);
+
+    // FIXME: Is padId the best way to identify the Stream? What about trackId?
+    g_object_set_data(G_OBJECT(sourcePad.get()), "padId", GINT_TO_POINTER(padId));
+    webKitMediaSrcLinkSourcePad(sourcePad.get(), caps, stream);
+
+    ASSERT(stream->parent->priv->mediaPlayerPrivate);
+    int signal = -1;
+
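+    // Classify the stream from the caps media type, remember the track and emit the matching
+    // {audio,video,text}-changed signal so the player private can update its track lists.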
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    if (doCapsHaveType(caps, GST_AUDIO_CAPS_TYPE_PREFIX)) {
+        stream->type = Audio;
+        stream->parent->priv->numberOfAudioStreams++;
+        signal = SIGNAL_AUDIO_CHANGED;
+        stream->audioTrack = RefPtr<WebCore::AudioTrackPrivateGStreamer>(static_cast<WebCore::AudioTrackPrivateGStreamer*>(trackPrivate.get()));
+    } else if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
+        stream->type = Video;
+        stream->parent->priv->numberOfVideoStreams++;
+        signal = SIGNAL_VIDEO_CHANGED;
+        stream->videoTrack = RefPtr<WebCore::VideoTrackPrivateGStreamer>(static_cast<WebCore::VideoTrackPrivateGStreamer*>(trackPrivate.get()));
+    } else if (doCapsHaveType(caps, GST_TEXT_CAPS_TYPE_PREFIX)) {
+        stream->type = Text;
+        stream->parent->priv->numberOfTextStreams++;
+        signal = SIGNAL_TEXT_CHANGED;
+
+        // FIXME: Support text tracks.
+    }
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    if (signal != -1)
+        g_signal_emit(G_OBJECT(stream->parent), webKitMediaSrcSignals[signal], 0, nullptr);
+}
+
+void PlaybackPipeline::reattachTrack(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate, RefPtr<TrackPrivateBase> trackPrivate, GstCaps* caps)
+{
+    GST_DEBUG("Re-attaching track");
+
+    // FIXME: Maybe remove this method. Now the caps change is managed by gst_app_src_push_sample() in
+    // enqueueSample() and flushAndEnqueueNonDisplayingSamples().
+
+    WebKitMediaSrc* webKitMediaSrc = m_webKitMediaSrc.get();
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    Stream* stream = getStreamBySourceBufferPrivate(webKitMediaSrc, sourceBufferPrivate.get());
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    ASSERT(stream && stream->type != Invalid);
+
+    int signal = -1;
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    if (doCapsHaveType(caps, GST_AUDIO_CAPS_TYPE_PREFIX)) {
+        ASSERT(stream->type == Audio);
+        signal = SIGNAL_AUDIO_CHANGED;
+        stream->audioTrack = RefPtr<WebCore::AudioTrackPrivateGStreamer>(static_cast<WebCore::AudioTrackPrivateGStreamer*>(trackPrivate.get()));
+    } else if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
+        ASSERT(stream->type == Video);
+        signal = SIGNAL_VIDEO_CHANGED;
+        stream->videoTrack = RefPtr<WebCore::VideoTrackPrivateGStreamer>(static_cast<WebCore::VideoTrackPrivateGStreamer*>(trackPrivate.get()));
+    } else if (doCapsHaveType(caps, GST_TEXT_CAPS_TYPE_PREFIX)) {
+        ASSERT(stream->type == Text);
+        signal = SIGNAL_TEXT_CHANGED;
+
+        // FIXME: Support text tracks.
+    }
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    if (signal != -1)
+        g_signal_emit(G_OBJECT(stream->parent), webKitMediaSrcSignals[signal], 0, nullptr);
+}
+
+void PlaybackPipeline::notifyDurationChanged()
+{
+    gst_element_post_message(GST_ELEMENT(m_webKitMediaSrc.get()), gst_message_new_duration_changed(GST_OBJECT(m_webKitMediaSrc.get())));
+    // WebKitMediaSrc will ask MediaPlayerPrivateGStreamerMSE for the new duration later, when somebody asks for it.
+}
+
+void PlaybackPipeline::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus)
+{
+    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
+
+    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "Have EOS");
+
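+    // On the first EOS, complete track configuration (no-more-pads and async-done); then EOS is signalled on
+    // every appsrc so the streams can drain.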
+    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
+    bool allTracksConfigured = priv->allTracksConfigured;
+    if (!allTracksConfigured)
+        priv->allTracksConfigured = true;
+    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+
+    if (!allTracksConfigured) {
+        gst_element_no_more_pads(GST_ELEMENT(m_webKitMediaSrc.get()));
+        webKitMediaSrcDoAsyncDone(m_webKitMediaSrc.get());
+    }
+
+    Vector<GstAppSrc*> appsrcs;
+
+    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
+    for (Stream* stream : priv->streams) {
+        if (stream->appsrc)
+            appsrcs.append(GST_APP_SRC(stream->appsrc));
+    }
+    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+
+    for (GstAppSrc* appsrc : appsrcs)
+        gst_app_src_end_of_stream(appsrc);
+}
+
+void PlaybackPipeline::flush(AtomString trackId)
+{
+    ASSERT(WTF::isMainThread());
+
+    GST_DEBUG("flush: trackId=%s", trackId.string().utf8().data());
+
+    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
+    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
+
+    if (!stream) {
+        GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+        return;
+    }
+
+    stream->lastEnqueuedTime = MediaTime::invalidTime();
+    GstElement* appsrc = stream->appsrc;
+    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
+
+    if (!appsrc)
+        return;
+
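+    // Only flush if the pipeline position can be determined; the flush-stop(false) below keeps the running
+    // time, so the flush does not disturb the current playback position.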
+    gint64 position = GST_CLOCK_TIME_NONE;
+    GRefPtr<GstQuery> query = adoptGRef(gst_query_new_position(GST_FORMAT_TIME));
+    if (gst_element_query(pipeline(), query.get()))
+        gst_query_parse_position(query.get(), 0, &position);
+
+    GST_TRACE("Position: %" GST_TIME_FORMAT, GST_TIME_ARGS(position));
+
+    if (static_cast<guint64>(position) == GST_CLOCK_TIME_NONE) {
+        GST_DEBUG("Can't determine position, avoiding flush");
+        return;
+    }
+
+    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_start()))
+        GST_WARNING("Failed to send flush-start event for trackId=%s", trackId.string().utf8().data());
+
+    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_stop(false)))
+        GST_WARNING("Failed to send flush-stop event for trackId=%s", trackId.string().utf8().data());
+
+    GST_DEBUG("trackId=%s flushed", trackId.string().utf8().data());
+}
+
+void PlaybackPipeline::enqueueSample(Ref<MediaSample>&& mediaSample)
+{
+    ASSERT(WTF::isMainThread());
+
+    AtomString trackId = mediaSample->trackID();
+
+    GST_TRACE("enqueing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
+        trackId.string().utf8().data(), mediaSample->presentationTime().toFloat(),
+        mediaSample->presentationSize().width(), mediaSample->presentationSize().height(),
+        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->presentationTime())),
+        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->duration())));
+
+    // No need to lock to access the Stream here. The only operation that could conflict with this read (and with
+    // the use of the sample fields below) is the deletion of the stream, but that can only happen on the main
+    // thread, and we are already there. Locking would only penalize readers running on other threads.
+    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
+
+    if (!stream) {
+        GST_WARNING("No stream!");
+        return;
+    }
+
+    if (!stream->sourceBuffer->isReadyForMoreSamples(trackId)) {
+        GST_DEBUG("enqueueSample: skip adding new sample for trackId=%s, SB is not ready yet", trackId.string().utf8().data());
+        return;
+    }
+
+    // This field doesn't change after creation, no need to lock.
+    GstElement* appsrc = stream->appsrc;
+
+    // Only modified by the main thread, no need to lock.
+    MediaTime lastEnqueuedTime = stream->lastEnqueuedTime;
+
+    ASSERT(mediaSample->platformSample().type == PlatformSample::GStreamerSampleType);
+    GRefPtr<GstSample> gstSample = mediaSample->platformSample().sample.gstSample;
+    if (gstSample && gst_sample_get_buffer(gstSample.get())) {
+        GstBuffer* buffer = gst_sample_get_buffer(gstSample.get());
+        lastEnqueuedTime = mediaSample->presentationTime();
+
+        GST_BUFFER_FLAG_UNSET(buffer, GST_BUFFER_FLAG_DECODE_ONLY);
+        pushSample(GST_APP_SRC(appsrc), gstSample.get());
+        // gst_app_src_push_sample() uses transfer-none for gstSample.
+
+        stream->lastEnqueuedTime = lastEnqueuedTime;
+    }
+}
+
+void PlaybackPipeline::allSamplesInTrackEnqueued(const AtomString& trackId)
+{
+    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
+    if (!stream || !stream->appsrc)
+        return;
+
+    gst_app_src_end_of_stream(GST_APP_SRC(stream->appsrc));
+}
+
+GstElement* PlaybackPipeline::pipeline()
+{
+    if (!m_webKitMediaSrc || !GST_ELEMENT_PARENT(GST_ELEMENT(m_webKitMediaSrc.get())))
+        return nullptr;
+
+    return GST_ELEMENT_PARENT(GST_ELEMENT_PARENT(GST_ELEMENT(m_webKitMediaSrc.get())));
+}
+
+} // namespace WebCore.
+
+#endif // ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h
new file mode 100644
index 0000000..fc7ca0e
--- /dev/null
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h
@@ -0,0 +1,80 @@
+/*
+ * Copyright (C) 2016 Metrological Group B.V.
+ * Copyright (C) 2016 Igalia S.L
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
+
+// PlaybackPipeline is (sort of) a friend class of WebKitMediaSourceGStreamer.
+
+#include "WebKitMediaSourceGStreamer.h"
+#include "WebKitMediaSourceGStreamerPrivate.h"
+
+#include <gst/gst.h>
+#include <wtf/Condition.h>
+#include <wtf/glib/GRefPtr.h>
+
+namespace WTF {
+template<> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc*);
+template<> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc*);
+template<> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc*);
+};
+
+namespace WebCore {
+
+class ContentType;
+class SourceBufferPrivateGStreamer;
+class MediaSourceGStreamer;
+
+class PlaybackPipeline: public RefCounted<PlaybackPipeline> {
+public:
+    static Ref<PlaybackPipeline> create()
+    {
+        return adoptRef(*new PlaybackPipeline());
+    }
+
+    virtual ~PlaybackPipeline() = default;
+
+    void setWebKitMediaSrc(WebKitMediaSrc*);
+    WebKitMediaSrc* webKitMediaSrc();
+
+    MediaSourcePrivate::AddStatus addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>);
+    void removeSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>);
+    void attachTrack(RefPtr<SourceBufferPrivateGStreamer>, RefPtr<TrackPrivateBase>, GstCaps*);
+    void reattachTrack(RefPtr<SourceBufferPrivateGStreamer>, RefPtr<TrackPrivateBase>, GstCaps*);
+    void notifyDurationChanged();
+
+    // From MediaSourceGStreamer.
+    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
+
+    // From SourceBufferPrivateGStreamer.
+    void flush(AtomString);
+    void enqueueSample(Ref<MediaSample>&&);
+    void allSamplesInTrackEnqueued(const AtomString&);
+
+    GstElement* pipeline();
+private:
+    PlaybackPipeline() = default;
+    GRefPtr<WebKitMediaSrc> m_webKitMediaSrc;
+};
+
+} // namespace WebCore.
+
+#endif // ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
index 8761cba..7c1e799 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
@@ -46,9 +46,6 @@
 #include "NotImplemented.h"
 #include "WebKitMediaSourceGStreamer.h"
 
-GST_DEBUG_CATEGORY_EXTERN(webkit_mse_debug);
-#define GST_CAT_DEFAULT webkit_mse_debug
-
 namespace WebCore {
 
 Ref<SourceBufferPrivateGStreamer> SourceBufferPrivateGStreamer::create(MediaSourceGStreamer* mediaSource, Ref<MediaSourceClientGStreamerMSE> client, const ContentType& contentType)
@@ -112,9 +109,11 @@
     m_client->flush(trackId);
 }
 
-void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString& trackId)
+void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString&)
 {
-    m_client->enqueueSample(WTFMove(sample), trackId);
+    m_notifyWhenReadyForMoreSamples = false;
+
+    m_client->enqueueSample(WTFMove(sample));
 }
 
 void SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(const AtomString& trackId)
@@ -122,12 +121,23 @@
     m_client->allSamplesInTrackEnqueued(trackId);
 }
 
-bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString& trackId)
+bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString&)
+{
+    return m_isReadyForMoreSamples;
+}
+
+void SourceBufferPrivateGStreamer::setReadyForMoreSamples(bool isReady)
 {
     ASSERT(WTF::isMainThread());
-    bool isReadyForMoreSamples = m_client->isReadyForMoreSamples(trackId);
-    GST_DEBUG("SourceBufferPrivate(%p) - isReadyForMoreSamples: %d", this, (int) isReadyForMoreSamples);
-    return isReadyForMoreSamples;
+    m_isReadyForMoreSamples = isReady;
+}
+
+void SourceBufferPrivateGStreamer::notifyReadyForMoreSamples()
+{
+    ASSERT(WTF::isMainThread());
+    setReadyForMoreSamples(true);
+    if (m_notifyWhenReadyForMoreSamples)
+        m_sourceBufferPrivateClient->sourceBufferPrivateDidBecomeReadyForMoreSamples(m_trackId);
 }
 
 void SourceBufferPrivateGStreamer::setActive(bool isActive)
@@ -139,7 +149,8 @@
 void SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples(const AtomString& trackId)
 {
     ASSERT(WTF::isMainThread());
-    return m_client->notifyClientWhenReadyForMoreSamples(trackId, m_sourceBufferPrivateClient);
+    m_notifyWhenReadyForMoreSamples = true;
+    m_trackId = trackId;
 }
 
 void SourceBufferPrivateGStreamer::didReceiveInitializationSegment(const SourceBufferPrivateClient::InitializationSegment& initializationSegment)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h
index 38cea50..e5fe406 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h
@@ -69,13 +69,15 @@
     void setActive(bool) final;
     void notifyClientWhenReadyForMoreSamples(const AtomString&) final;
 
+    void setReadyForMoreSamples(bool);
+    void notifyReadyForMoreSamples();
+
     void didReceiveInitializationSegment(const SourceBufferPrivateClient::InitializationSegment&);
     void didReceiveSample(MediaSample&);
     void didReceiveAllPendingSamples();
     void appendParsingFailed();
 
     ContentType type() const { return m_type; }
-    AtomString trackId() const { return m_trackId; }
 
 private:
     SourceBufferPrivateGStreamer(MediaSourceGStreamer*, Ref<MediaSourceClientGStreamerMSE>, const ContentType&);
@@ -85,6 +87,8 @@
     ContentType m_type;
     Ref<MediaSourceClientGStreamerMSE> m_client;
     SourceBufferPrivateClient* m_sourceBufferPrivateClient { nullptr };
+    bool m_isReadyForMoreSamples = true;
+    bool m_notifyWhenReadyForMoreSamples = false;
     AtomString m_trackId;
 };
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
index 862305b..1ad4424 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
@@ -3,8 +3,8 @@
  *  Copyright (C) 2013 Collabora Ltd.
  *  Copyright (C) 2013 Orange
  *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
- *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
- *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
+ *  Copyright (C) 2015, 2016 Metrological Group B.V.
+ *  Copyright (C) 2015, 2016 Igalia, S.L
  *
  *  This library is free software; you can redistribute it and/or
  *  modify it under the terms of the GNU Lesser General Public
@@ -24,227 +24,172 @@
 #include "config.h"
 #include "WebKitMediaSourceGStreamer.h"
 
+#include "PlaybackPipeline.h"
+
 #if ENABLE(VIDEO) && ENABLE(MEDIA_SOURCE) && USE(GSTREAMER)
 
+#include "AudioTrackPrivateGStreamer.h"
 #include "GStreamerCommon.h"
+#include "MediaDescription.h"
+#include "MediaPlayerPrivateGStreamerMSE.h"
+#include "MediaSample.h"
+#include "MediaSourceGStreamer.h"
+#include "NotImplemented.h"
+#include "SourceBufferPrivateGStreamer.h"
+#include "TimeRanges.h"
 #include "VideoTrackPrivateGStreamer.h"
+#include "WebKitMediaSourceGStreamerPrivate.h"
 
-#include <gst/gst.h>
+#include <gst/pbutils/pbutils.h>
+#include <gst/video/video.h>
 #include <wtf/Condition.h>
-#include <wtf/DataMutex.h>
-#include <wtf/HashMap.h>
 #include <wtf/MainThread.h>
-#include <wtf/MainThreadData.h>
 #include <wtf/RefPtr.h>
-#include <wtf/glib/WTFGType.h>
-#include <wtf/text/AtomString.h>
-#include <wtf/text/AtomStringHash.h>
 #include <wtf/text/CString.h>
 
-using namespace WTF;
-using namespace WebCore;
-
 GST_DEBUG_CATEGORY_STATIC(webkit_media_src_debug);
 #define GST_CAT_DEFAULT webkit_media_src_debug
 
-static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%s", GST_PAD_SRC,
+#define webkit_media_src_parent_class parent_class
+#define WEBKIT_MEDIA_SRC_CATEGORY_INIT GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "websrc element");
+
+static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%u", GST_PAD_SRC,
     GST_PAD_SOMETIMES, GST_STATIC_CAPS_ANY);
 
-enum {
-    PROP_0,
-    PROP_N_AUDIO,
-    PROP_N_VIDEO,
-    PROP_N_TEXT,
-    PROP_LAST
+static void enabledAppsrcNeedData(GstAppSrc*, guint, gpointer);
+static void enabledAppsrcEnoughData(GstAppSrc*, gpointer);
+static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer);
+
+static void disabledAppsrcNeedData(GstAppSrc*, guint, gpointer) { };
+static void disabledAppsrcEnoughData(GstAppSrc*, gpointer) { };
+static gboolean disabledAppsrcSeekData(GstAppSrc*, guint64, gpointer)
+{
+    return FALSE;
 };
 
-struct Stream;
+GstAppSrcCallbacks enabledAppsrcCallbacks = {
+    enabledAppsrcNeedData,
+    enabledAppsrcEnoughData,
+    enabledAppsrcSeekData,
+    { 0 }
+};
 
-struct WebKitMediaSrcPrivate {
-    HashMap<AtomString, RefPtr<Stream>> streams;
-    Stream* streamByName(const AtomString& name)
-    {
-        Stream* stream = streams.get(name);
-        ASSERT(stream);
-        return stream;
+GstAppSrcCallbacks disabledAppsrcCallbacks = {
+    disabledAppsrcNeedData,
+    disabledAppsrcEnoughData,
+    disabledAppsrcSeekData,
+    { 0 }
+};
+
+static Stream* getStreamByAppsrc(WebKitMediaSrc*, GstElement*);
+static void seekNeedsDataMainThread(WebKitMediaSrc*);
+static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc*, Stream*);
+
+static void enabledAppsrcNeedData(GstAppSrc* appsrc, guint, gpointer userData)
+{
+    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
+    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
+
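+    // After a seek, every appsrc first reports seek-data and then need-data. Only when all of them have done so
+    // (both counters reach the number of streams) is the pending action run, e.g. notifying the main thread that
+    // the seek needs data.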
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    OnSeekDataAction appsrcSeekDataNextAction = webKitMediaSrc->priv->appsrcSeekDataNextAction;
+    Stream* appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
+    bool allAppsrcNeedDataAfterSeek = false;
+
+    if (webKitMediaSrc->priv->appsrcSeekDataCount > 0) {
+        if (appsrcStream && !appsrcStream->appsrcNeedDataFlag) {
+            ++webKitMediaSrc->priv->appsrcNeedDataCount;
+            appsrcStream->appsrcNeedDataFlag = true;
+        }
+        int numAppsrcs = webKitMediaSrc->priv->streams.size();
+        if (webKitMediaSrc->priv->appsrcSeekDataCount == numAppsrcs && webKitMediaSrc->priv->appsrcNeedDataCount == numAppsrcs) {
+            GST_DEBUG("All needDatas completed");
+            allAppsrcNeedDataAfterSeek = true;
+            webKitMediaSrc->priv->appsrcSeekDataCount = 0;
+            webKitMediaSrc->priv->appsrcNeedDataCount = 0;
+            webKitMediaSrc->priv->appsrcSeekDataNextAction = Nothing;
+
+            for (Stream* stream : webKitMediaSrc->priv->streams)
+                stream->appsrcNeedDataFlag = false;
+        }
     }
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
 
-    // Used for stream-start events, shared by all streams.
-    const unsigned groupId { gst_util_group_id_next() };
+    if (allAppsrcNeedDataAfterSeek) {
+        GST_DEBUG("All expected appsrcSeekData() and appsrcNeedData() calls performed. Running next action (%d)", static_cast<int>(appsrcSeekDataNextAction));
 
-    // Every time a track is added or removed this collection is swapped by an updated one and a STREAM_COLLECTION
-    // message is posted in the bus.
-    GRefPtr<GstStreamCollection> collection { adoptGRef(gst_stream_collection_new("WebKitMediaSrc")) };
+        switch (appsrcSeekDataNextAction) {
+        case MediaSourceSeekToTime:
+            webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::SeekNeedsData, [webKitMediaSrc] {
+                seekNeedsDataMainThread(webKitMediaSrc);
+            });
+            break;
+        case Nothing:
+            break;
+        }
+    } else if (appsrcSeekDataNextAction == Nothing) {
+        LockHolder locker(webKitMediaSrc->priv->streamLock);
 
-    // Changed on seeks.
-    GstClockTime startTime { 0 };
-    double rate { 1.0 };
+        GST_OBJECT_LOCK(webKitMediaSrc);
 
-    // Only used by URI Handler API implementation.
-    GUniquePtr<char> uri;
-};
+        // Search again for the Stream, just in case it was removed between the previous lock and this one.
+        appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
 
-static void webKitMediaSrcUriHandlerInit(gpointer, gpointer);
-static void webKitMediaSrcFinalize(GObject*);
-static GstStateChangeReturn webKitMediaSrcChangeState(GstElement*, GstStateChange);
-static gboolean webKitMediaSrcActivateMode(GstPad*, GstObject*, GstPadMode, gboolean activate);
-static void webKitMediaSrcLoop(void*);
-static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>&);
-static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>&, bool resetTime);
-static void webKitMediaSrcGetProperty(GObject*, unsigned propId, GValue*, GParamSpec*);
+        if (appsrcStream && appsrcStream->type != WebCore::Invalid)
+            webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::ReadyForMoreSamples, [webKitMediaSrc, appsrcStream] {
+                notifyReadyForMoreSamplesMainThread(webKitMediaSrc, appsrcStream);
+            });
 
-#define webkit_media_src_parent_class parent_class
-
-struct WebKitMediaSrcPadPrivate {
-    RefPtr<Stream> stream;
-};
-
-struct WebKitMediaSrcPad {
-    GstPad parent;
-    WebKitMediaSrcPadPrivate* priv;
-};
-
-struct WebKitMediaSrcPadClass {
-    GstPadClass parent;
-};
-
-namespace WTF {
-
-template<> GRefPtr<WebKitMediaSrcPad> adoptGRef(WebKitMediaSrcPad* ptr)
-{
-    ASSERT(!ptr || !g_object_is_floating(ptr));
-    return GRefPtr<WebKitMediaSrcPad>(ptr, GRefPtrAdopt);
+        GST_OBJECT_UNLOCK(webKitMediaSrc);
+    }
 }
 
-template<> WebKitMediaSrcPad* refGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
+static void enabledAppsrcEnoughData(GstAppSrc *appsrc, gpointer userData)
 {
-    if (ptr)
-        gst_object_ref_sink(GST_OBJECT(ptr));
+    // No need to lock webKitMediaSrc: we're on the main thread and nobody is going to remove the stream in the meantime.
+    ASSERT(WTF::isMainThread());
 
-    return ptr;
+    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
+    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
+    Stream* stream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
+
+    // This callback might have been scheduled from a child thread before the stream was removed, in which
+    // case the removal code may already have run by the time this callback executes. This check resolves
+    // that race condition.
+    if (!stream || stream->type == WebCore::Invalid)
+        return;
+
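+    // The appsrc has buffered enough; tell the SourceBuffer to stop enqueuing more samples for now.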
+    stream->sourceBuffer->setReadyForMoreSamples(false);
 }
 
-template<> void derefGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
+static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer userData)
 {
-    if (ptr)
-        gst_object_unref(ptr);
+    ASSERT(WTF::isMainThread());
+
+    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
+
+    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    webKitMediaSrc->priv->appsrcSeekDataCount++;
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    return TRUE;
 }
 
-} // namespace WTF
-
-static GType webkit_media_src_pad_get_type();
-WEBKIT_DEFINE_TYPE(WebKitMediaSrcPad, webkit_media_src_pad, GST_TYPE_PAD);
-#define WEBKIT_TYPE_MEDIA_SRC_PAD (webkit_media_src_pad_get_type())
-#define WEBKIT_MEDIA_SRC_PAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), WEBKIT_TYPE_MEDIA_SRC_PAD, WebKitMediaSrcPad))
-
-static void webkit_media_src_pad_class_init(WebKitMediaSrcPadClass*)
+static Stream* getStreamByAppsrc(WebKitMediaSrc* source, GstElement* appsrc)
 {
+    for (Stream* stream : source->priv->streams) {
+        if (stream->appsrc == appsrc)
+            return stream;
+    }
+    return nullptr;
 }
 
-G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_ELEMENT,
+G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_BIN,
     G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, webKitMediaSrcUriHandlerInit);
-    G_ADD_PRIVATE(WebKitMediaSrc);
-    GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "WebKit MSE source element"));
+    WEBKIT_MEDIA_SRC_CATEGORY_INIT);
 
-struct Stream : public ThreadSafeRefCounted<Stream> {
-    Stream(WebKitMediaSrc* source, GRefPtr<GstPad>&& pad, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps, GRefPtr<GstStream>&& streamInfo)
-        : source(source)
-        , pad(WTFMove(pad))
-        , name(name)
-        , type(type)
-        , streamInfo(WTFMove(streamInfo))
-        , streamingMembersDataMutex(WTFMove(initialCaps), source->priv->startTime, source->priv->rate, adoptGRef(gst_event_new_stream_collection(source->priv->collection.get())))
-    { }
-
-    WebKitMediaSrc* const source;
-    GRefPtr<GstPad> const pad;
-    AtomString const name;
-    WebCore::MediaSourceStreamTypeGStreamer type;
-    GRefPtr<GstStream> streamInfo;
-
-    // The point of having a queue in WebKitMediaSource is to limit the number of context switches per second.
-    // If we had no queue, the main thread would have to be awaken for every frame. On the other hand, if the
-    // queue had unlimited size WebKit would end up requesting flushes more often than necessary when frames
-    // in the future are re-appended. As a sweet spot between these extremes we choose to allow enqueueing a
-    // few seconds worth of samples.
-
-    // `isReadyForMoreSamples` follows the classical two water levels strategy: initially it's true until the
-    // high water level is reached, then it becomes false until the queue drains down to the low water level
-    // and the cycle repeats. This way we avoid stalls and minimize context switches.
-
-    static const uint64_t durationEnqueuedHighWaterLevel = 5 * GST_SECOND;
-    static const uint64_t durationEnqueuedLowWaterLevel = 2 * GST_SECOND;
-
-    struct StreamingMembers {
-        StreamingMembers(GRefPtr<GstCaps>&& initialCaps, GstClockTime startTime, double rate, GRefPtr<GstEvent>&& pendingStreamCollectionEvent)
-            : pendingStreamCollectionEvent(WTFMove(pendingStreamCollectionEvent))
-            , pendingInitialCaps(WTFMove(initialCaps))
-        {
-            gst_segment_init(&segment, GST_FORMAT_TIME);
-            segment.start = segment.time = startTime;
-            segment.rate = rate;
-
-            GstStreamCollection* collection;
-            gst_event_parse_stream_collection(this->pendingStreamCollectionEvent.get(), &collection);
-            ASSERT(collection);
-        }
-
-        bool hasPushedFirstBuffer { false };
-        bool wasStreamStartSent { false };
-        bool doesNeedSegmentEvent { true };
-        GstSegment segment;
-        GRefPtr<GstEvent> pendingStreamCollectionEvent;
-        GRefPtr<GstCaps> pendingInitialCaps;
-        GRefPtr<GstCaps> previousCaps;
-
-        Condition padLinkedOrFlushedCondition;
-        Condition queueChangedOrFlushedCondition;
-        Deque<GRefPtr<GstMiniObject>> queue;
-        bool isFlushing { false };
-        bool doesNeedToNotifyOnLowWaterLevel { false };
-
-        uint64_t durationEnqueued() const
-        {
-            // Find the first and last GstSample in the queue and subtract their DTS.
-            auto frontIter = std::find_if(queue.begin(), queue.end(), [](const GRefPtr<GstMiniObject>& object) {
-                return GST_IS_SAMPLE(object.get());
-            });
-
-            // If there are no samples in the queue, that makes total duration of enqueued frames of zero.
-            if (frontIter == queue.end())
-                return 0;
-
-            auto backIter = std::find_if(queue.rbegin(), queue.rend(), [](const GRefPtr<GstMiniObject>& object) {
-                return GST_IS_SAMPLE(object.get());
-            });
-
-            const GstBuffer* front = gst_sample_get_buffer(GST_SAMPLE(frontIter->get()));
-            const GstBuffer* back = gst_sample_get_buffer(GST_SAMPLE(backIter->get()));
-            return GST_BUFFER_DTS_OR_PTS(back) - GST_BUFFER_DTS_OR_PTS(front);
-        }
-    };
-    DataMutex<StreamingMembers> streamingMembersDataMutex;
-
-    struct ReportedStatus {
-        // Set to true when the pad is removed. In the case where a reference to the Stream object is alive because of
-        // a posted task to notify isReadyForMoreSamples, the notification must not be delivered if this flag is true.
-        bool wasRemoved { false };
-
-        bool isReadyForMoreSamples { true };
-        SourceBufferPrivateClient* sourceBufferPrivateToNotify { nullptr };
-    };
-    MainThreadData<ReportedStatus> reportedStatus;
-};
-
-static GRefPtr<GstElement> findPipeline(GRefPtr<GstElement> element)
-{
-    while (true) {
-        GRefPtr<GstElement> parentElement = adoptGRef(GST_ELEMENT(gst_element_get_parent(element.get())));
-        if (!parentElement)
-            return element;
-        element = parentElement;
-    }
-}
+guint webKitMediaSrcSignals[LAST_SIGNAL] = { 0 };
 
 static void webkit_media_src_class_init(WebKitMediaSrcClass* klass)
 {
@@ -252,14 +197,18 @@
     GstElementClass* eklass = GST_ELEMENT_CLASS(klass);
 
     oklass->finalize = webKitMediaSrcFinalize;
+    oklass->set_property = webKitMediaSrcSetProperty;
     oklass->get_property = webKitMediaSrcGetProperty;
 
-    gst_element_class_add_static_pad_template_with_gtype(eklass, &srcTemplate, webkit_media_src_pad_get_type());
+    gst_element_class_add_pad_template(eklass, gst_static_pad_template_get(&srcTemplate));
 
-    gst_element_class_set_static_metadata(eklass, "WebKit MediaSource source element", "Source/Network", "Feeds samples coming from WebKit MediaSource object", "Igalia <aboya@igalia.com>");
+    gst_element_class_set_static_metadata(eklass, "WebKit Media source element", "Source", "Handles Blob uris", "Stephane Jadaud <sjadaud@sii.fr>, Sebastian Dröge <sebastian@centricular.com>, Enrique Ocaña González <eocanha@igalia.com>");
 
-    eklass->change_state = webKitMediaSrcChangeState;
-
+    // Allows setting the URI using the 'location' property, which is used, for example, by gst_element_make_from_uri().
+    g_object_class_install_property(oklass,
+        PROP_LOCATION,
+        g_param_spec_string("location", "location", "Location to read from", nullptr,
+        GParamFlags(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
     g_object_class_install_property(oklass,
         PROP_N_AUDIO,
         g_param_spec_int("n-audio", "Number Audio", "Total number of audio streams",
@@ -272,531 +221,428 @@
         PROP_N_TEXT,
         g_param_spec_int("n-text", "Number Text", "Total number of text streams",
         0, G_MAXINT, 0, GParamFlags(G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)));
+
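+    // Emitted whenever the set of audio/video/text streams changes; the names match playbin's *-changed signals.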
+    webKitMediaSrcSignals[SIGNAL_VIDEO_CHANGED] =
+        g_signal_new("video-changed", G_TYPE_FROM_CLASS(oklass),
+        G_SIGNAL_RUN_LAST,
+        G_STRUCT_OFFSET(WebKitMediaSrcClass, videoChanged), nullptr, nullptr,
+        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
+    webKitMediaSrcSignals[SIGNAL_AUDIO_CHANGED] =
+        g_signal_new("audio-changed", G_TYPE_FROM_CLASS(oklass),
+        G_SIGNAL_RUN_LAST,
+        G_STRUCT_OFFSET(WebKitMediaSrcClass, audioChanged), nullptr, nullptr,
+        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
+    webKitMediaSrcSignals[SIGNAL_TEXT_CHANGED] =
+        g_signal_new("text-changed", G_TYPE_FROM_CLASS(oklass),
+        G_SIGNAL_RUN_LAST,
+        G_STRUCT_OFFSET(WebKitMediaSrcClass, textChanged), nullptr, nullptr,
+        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
+
+    eklass->change_state = webKitMediaSrcChangeState;
+
+    g_type_class_add_private(klass, sizeof(WebKitMediaSrcPrivate));
+}
+
+static GstFlowReturn webkitMediaSrcChain(GstPad* pad, GstObject* parent, GstBuffer* buffer)
+{
+    GRefPtr<WebKitMediaSrc> self = adoptGRef(WEBKIT_MEDIA_SRC(gst_object_get_parent(parent)));
+
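+    // Combine per-pad flow returns so that a single flushing or not-linked pad does not make the whole element fail.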
+    return gst_flow_combiner_update_pad_flow(self->priv->flowCombiner.get(), pad, gst_proxy_pad_chain_default(pad, GST_OBJECT(self.get()), buffer));
 }
 
 static void webkit_media_src_init(WebKitMediaSrc* source)
 {
-    ASSERT(isMainThread());
-
-    GST_OBJECT_FLAG_SET(source, GST_ELEMENT_FLAG_SOURCE);
-    source->priv = G_TYPE_INSTANCE_GET_PRIVATE((source), WEBKIT_TYPE_MEDIA_SRC, WebKitMediaSrcPrivate);
+    source->priv = WEBKIT_MEDIA_SRC_GET_PRIVATE(source);
     new (source->priv) WebKitMediaSrcPrivate();
+    source->priv->seekTime = MediaTime::invalidTime();
+    source->priv->appsrcSeekDataCount = 0;
+    source->priv->appsrcNeedDataCount = 0;
+    source->priv->appsrcSeekDataNextAction = Nothing;
+    source->priv->flowCombiner = GUniquePtr<GstFlowCombiner>(gst_flow_combiner_new());
+    source->priv->notifier = WebCore::MainThreadNotifier<WebKitMediaSrcMainThreadNotification>::create();
+
+    // No need to reset Stream.appsrcNeedDataFlag because there are no Streams at this point yet.
 }
 
-static void webKitMediaSrcFinalize(GObject* object)
+void webKitMediaSrcFinalize(GObject* object)
 {
-    ASSERT(isMainThread());
+    ASSERT(WTF::isMainThread());
 
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
-    source->priv->~WebKitMediaSrcPrivate();
+    WebKitMediaSrcPrivate* priv = source->priv;
+
+    Vector<Stream*> oldStreams;
+    source->priv->streams.swap(oldStreams);
+
+    for (Stream* stream : oldStreams)
+        webKitMediaSrcFreeStream(source, stream);
+
+    priv->seekTime = MediaTime::invalidTime();
+
+    source->priv->notifier->invalidate();
+
+    if (priv->mediaPlayerPrivate)
+        webKitMediaSrcSetMediaPlayerPrivate(source, nullptr);
+
+    // We used placement new for construction, so the destructor won't be called automatically.
+    priv->~_WebKitMediaSrcPrivate();
+
     GST_CALL_PARENT(G_OBJECT_CLASS, finalize, (object));
 }
 
-static GstPadProbeReturn debugProbe(GstPad* pad, GstPadProbeInfo* info, void*)
-{
-    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
-    GST_TRACE_OBJECT(stream->source, "track %s: %" GST_PTR_FORMAT, stream->name.string().utf8().data(), info->data);
-    return GST_PAD_PROBE_OK;
-}
-
-// GstStreamCollection are immutable objects once posted. THEY MUST NOT BE MODIFIED once they have been posted.
-// Instead, when stream changes occur a new collection must be made. The following functions help to create
-// such new collections:
-
-static GRefPtr<GstStreamCollection> copyCollectionAndAddStream(GstStreamCollection* collection, GRefPtr<GstStream>&& stream)
-{
-    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
-
-    unsigned n = gst_stream_collection_get_size(collection);
-    for (unsigned i = 0; i < n; i++)
-        gst_stream_collection_add_stream(newCollection.get(), static_cast<GstStream*>(gst_object_ref(gst_stream_collection_get_stream(collection, i))));
-    gst_stream_collection_add_stream(newCollection.get(), stream.leakRef());
-
-    return newCollection;
-}
-
-static GRefPtr<GstStreamCollection> copyCollectionWithoutStream(GstStreamCollection* collection, const GstStream* stream)
-{
-    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
-
-    unsigned n = gst_stream_collection_get_size(collection);
-    for (unsigned i = 0; i < n; i++) {
-        GRefPtr<GstStream> oldStream = gst_stream_collection_get_stream(collection, i);
-        if (oldStream.get() != stream)
-            gst_stream_collection_add_stream(newCollection.get(), oldStream.leakRef());
-    }
-
-    return newCollection;
-}
-
-static GstStreamType gstStreamType(WebCore::MediaSourceStreamTypeGStreamer type)
-{
-    switch (type) {
-    case WebCore::MediaSourceStreamTypeGStreamer::Video:
-        return GST_STREAM_TYPE_VIDEO;
-    case WebCore::MediaSourceStreamTypeGStreamer::Audio:
-        return GST_STREAM_TYPE_AUDIO;
-    case WebCore::MediaSourceStreamTypeGStreamer::Text:
-        return GST_STREAM_TYPE_TEXT;
-    default:
-        GST_ERROR("Received unexpected stream type");
-        return GST_STREAM_TYPE_UNKNOWN;
-    }
-}
-
-void webKitMediaSrcAddStream(WebKitMediaSrc* source, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps)
-{
-    ASSERT(isMainThread());
-    ASSERT(!source->priv->streams.contains(name));
-
-    GRefPtr<GstStream> streamInfo = adoptGRef(gst_stream_new(name.string().utf8().data(), initialCaps.get(), gstStreamType(type), GST_STREAM_FLAG_SELECT));
-    source->priv->collection = copyCollectionAndAddStream(source->priv->collection.get(), GRefPtr<GstStream>(streamInfo));
-    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
-
-    GRefPtr<WebKitMediaSrcPad> pad = WEBKIT_MEDIA_SRC_PAD(g_object_new(webkit_media_src_pad_get_type(), "name", makeString("src_", name).utf8().data(), "direction", GST_PAD_SRC, NULL));
-    gst_pad_set_activatemode_function(GST_PAD(pad.get()), webKitMediaSrcActivateMode);
-
-    {
-        RefPtr<Stream> stream = adoptRef(new Stream(source, GRefPtr<GstPad>(GST_PAD(pad.get())), name, type, WTFMove(initialCaps), WTFMove(streamInfo)));
-        pad->priv->stream = stream;
-        source->priv->streams.set(name, WTFMove(stream));
-    }
-
-    if (gst_debug_category_get_threshold(webkit_media_src_debug) >= GST_LEVEL_TRACE)
-        gst_pad_add_probe(GST_PAD(pad.get()), static_cast<GstPadProbeType>(GST_PAD_PROBE_TYPE_DATA_DOWNSTREAM | GST_PAD_PROBE_TYPE_EVENT_FLUSH), debugProbe, nullptr, nullptr);
-
-    // Workaround: gst_element_add_pad() should already call gst_pad_set_active() if the element is PAUSED or
-    // PLAYING. Unfortunately, as of GStreamer 1.14.4 it does so with the element lock taken, causing a deadlock
-    // in gst_pad_start_task(), who tries to post a `stream-status` message in the element, which also requires
-    // the element lock. Activating the pad beforehand avoids that codepath.
-    GstState state;
-    gst_element_get_state(GST_ELEMENT(source), &state, nullptr, 0);
-    if (state > GST_STATE_READY)
-        gst_pad_set_active(GST_PAD(pad.get()), true);
-
-    gst_element_add_pad(GST_ELEMENT(source), GST_PAD(pad.get()));
-}
-
-void webKitMediaSrcRemoveStream(WebKitMediaSrc* source, const AtomString& name)
-{
-    ASSERT(isMainThread());
-    Stream* stream = source->priv->streamByName(name);
-
-    source->priv->collection = copyCollectionWithoutStream(source->priv->collection.get(), stream->streamInfo.get());
-    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
-
-    // Flush the source element **and** downstream. We want to stop the streaming thread and for that we need all elements downstream to be idle.
-    webKitMediaSrcStreamFlushStart(stream);
-    webKitMediaSrcStreamFlushStop(stream, false);
-    // Stop the thread now.
-    gst_pad_set_active(stream->pad.get(), false);
-
-    stream->reportedStatus->wasRemoved = true;
-    gst_element_remove_pad(GST_ELEMENT(source), stream->pad.get());
-    source->priv->streams.remove(name);
-}
-
-static gboolean webKitMediaSrcActivateMode(GstPad* pad, GstObject* source, GstPadMode mode, gboolean active)
-{
-    if (mode != GST_PAD_MODE_PUSH) {
-        GST_ERROR_OBJECT(source, "Unexpected pad mode in WebKitMediaSrc");
-        return false;
-    }
-
-    if (active)
-        gst_pad_start_task(pad, webKitMediaSrcLoop, pad, nullptr);
-    else {
-        // Unblock the streaming thread.
-        RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
-        {
-            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-            streamingMembers->isFlushing = true;
-            streamingMembers->padLinkedOrFlushedCondition.notifyOne();
-            streamingMembers->queueChangedOrFlushedCondition.notifyOne();
-        }
-        // Following gstbasesrc implementation, this code is not flushing downstream.
-        // If there is any possibility of the streaming thread being blocked downstream the caller MUST flush before.
-        // Otherwise a deadlock would occur as the next function tries to join the thread.
-        gst_pad_stop_task(pad);
-        {
-            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-            streamingMembers->isFlushing = false;
-        }
-    }
-    return true;
-}
-
-static void webKitMediaSrcPadLinked(GstPad* pad, GstPad*, void*)
-{
-    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
-    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-    streamingMembers->padLinkedOrFlushedCondition.notifyOne();
-}
-
-static void webKitMediaSrcStreamNotifyLowWaterLevel(const RefPtr<Stream>& stream)
-{
-    RunLoop::main().dispatch([stream]() {
-        if (stream->reportedStatus->wasRemoved)
-            return;
-
-        stream->reportedStatus->isReadyForMoreSamples = true;
-        if (stream->reportedStatus->sourceBufferPrivateToNotify) {
-            // We need to set sourceBufferPrivateToNotify BEFORE calling sourceBufferPrivateDidBecomeReadyForMoreSamples(),
-            // not after, since otherwise it would destroy a notification request should the callback request one.
-            SourceBufferPrivateClient* sourceBuffer = stream->reportedStatus->sourceBufferPrivateToNotify;
-            stream->reportedStatus->sourceBufferPrivateToNotify = nullptr;
-            sourceBuffer->sourceBufferPrivateDidBecomeReadyForMoreSamples(stream->name);
-        }
-    });
-}
-
-// Called with STREAM_LOCK.
-static void webKitMediaSrcLoop(void* userData)
-{
-    GstPad* pad = GST_PAD(userData);
-    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
-
-    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-    if (streamingMembers->isFlushing) {
-        gst_pad_pause_task(pad);
-        return;
-    }
-
-    // Since the pad can and will be added when the element is in PLAYING state, this task can start running
-    // before the pad is linked. Wait for the pad to be linked to avoid buffers being lost to not-linked errors.
-    GST_OBJECT_LOCK(pad);
-    if (!GST_PAD_IS_LINKED(pad)) {
-        g_signal_connect(pad, "linked", G_CALLBACK(webKitMediaSrcPadLinked), nullptr);
-        GST_OBJECT_UNLOCK(pad);
-
-        streamingMembers->padLinkedOrFlushedCondition.wait(streamingMembers.mutex());
-
-        g_signal_handlers_disconnect_by_func(pad, reinterpret_cast<void*>(webKitMediaSrcPadLinked), nullptr);
-        if (streamingMembers->isFlushing)
-            return;
-    } else
-        GST_OBJECT_UNLOCK(pad);
-    ASSERT(gst_pad_is_linked(pad));
-
-    // By keeping the lock we are guaranteed that a flush will not happen while we send essential events.
-    // These events should never block downstream, so the lock should be released in little time in every
-    // case.
-
-    if (streamingMembers->pendingStreamCollectionEvent)
-        gst_pad_push_event(stream->pad.get(), streamingMembers->pendingStreamCollectionEvent.leakRef());
-
-    if (!streamingMembers->wasStreamStartSent) {
-        GUniquePtr<char> streamId(g_strdup_printf("mse/%s", stream->name.string().utf8().data()));
-        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_stream_start(streamId.get()));
-        gst_event_set_group_id(event.get(), stream->source->priv->groupId);
-        gst_event_set_stream(event.get(), stream->streamInfo.get());
-
-        bool wasStreamStartSent = gst_pad_push_event(pad, event.leakRef());
-        streamingMembers->wasStreamStartSent = wasStreamStartSent;
-    }
-
-    if (streamingMembers->pendingInitialCaps) {
-        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_caps(streamingMembers->pendingInitialCaps.get()));
-
-        gst_pad_push_event(pad, event.leakRef());
-
-        streamingMembers->previousCaps = WTFMove(streamingMembers->pendingInitialCaps);
-        ASSERT(!streamingMembers->pendingInitialCaps);
-    }
-
-    streamingMembers->queueChangedOrFlushedCondition.wait(streamingMembers.mutex(), [&]() {
-        return !streamingMembers->queue.isEmpty() || streamingMembers->isFlushing;
-    });
-    if (streamingMembers->isFlushing)
-        return;
-
-    // We wait to get a sample before emitting the first segment. This way, if we get a seek before any
-    // enqueue, we're sending only one segment. This also ensures that when such a seek is made, where we also
-    // omit the flush (see webKitMediaSrcFlush) we actually emit the updated, correct segment.
-    if (streamingMembers->doesNeedSegmentEvent) {
-        gst_pad_push_event(pad, gst_event_new_segment(&streamingMembers->segment));
-        streamingMembers->doesNeedSegmentEvent = false;
-    }
-
-    GRefPtr<GstMiniObject> object = streamingMembers->queue.takeFirst();
-    if (GST_IS_SAMPLE(object.get())) {
-        GRefPtr<GstSample> sample = adoptGRef(GST_SAMPLE(object.leakRef()));
-
-        if (!gst_caps_is_equal(gst_sample_get_caps(sample.get()), streamingMembers->previousCaps.get())) {
-            // This sample needs new caps (typically because of a quality change).
-            gst_pad_push_event(stream->pad.get(), gst_event_new_caps(gst_sample_get_caps(sample.get())));
-            streamingMembers->previousCaps = gst_sample_get_caps(sample.get());
-        }
-
-        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel && streamingMembers->durationEnqueued() <= Stream::durationEnqueuedLowWaterLevel) {
-            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
-            webKitMediaSrcStreamNotifyLowWaterLevel(RefPtr<Stream>(stream));
-        }
-
-        GRefPtr<GstBuffer> buffer = gst_sample_get_buffer(sample.get());
-        sample.clear();
-
-        if (!streamingMembers->hasPushedFirstBuffer) {
-            GUniquePtr<char> fileName { g_strdup_printf("playback-pipeline-before-playback-%s", stream->name.string().utf8().data()) };
-            GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(GST_BIN(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(stream->source))).get()),
-                GST_DEBUG_GRAPH_SHOW_ALL, fileName.get());
-            streamingMembers->hasPushedFirstBuffer = true;
-        }
-
-        // Push the buffer without the streamingMembers lock so that flushes can happen while it travels downstream.
-        streamingMembers.lockHolder().unlockEarly();
-
-        ASSERT(GST_BUFFER_PTS_IS_VALID(buffer.get()));
-        GstFlowReturn ret = gst_pad_push(pad, buffer.leakRef());
-        if (ret != GST_FLOW_OK && ret != GST_FLOW_FLUSHING) {
-            GST_ERROR_OBJECT(pad, "Pushing buffer returned %s", gst_flow_get_name(ret));
-            gst_pad_pause_task(pad);
-        }
-    } else if (GST_IS_EVENT(object.get())) {
-        // EOS events and other enqueued events are also sent unlocked so they can react to flushes if necessary.
-        GRefPtr<GstEvent> event = GRefPtr<GstEvent>(GST_EVENT(object.leakRef()));
-
-        streamingMembers.lockHolder().unlockEarly();
-        bool eventHandled = gst_pad_push_event(pad, GRefPtr<GstEvent>(event).leakRef());
-        if (!eventHandled)
-            GST_DEBUG_OBJECT(pad, "Pushed event was not handled: %" GST_PTR_FORMAT, event.get());
-    } else
-        ASSERT_NOT_REACHED();
-}
-
-static void webKitMediaSrcEnqueueObject(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstMiniObject>&& object)
-{
-    ASSERT(isMainThread());
-    ASSERT(object);
-
-    Stream* stream = source->priv->streamByName(streamName);
-    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-    streamingMembers->queue.append(WTFMove(object));
-    if (stream->reportedStatus->isReadyForMoreSamples && streamingMembers->durationEnqueued() > Stream::durationEnqueuedHighWaterLevel) {
-        stream->reportedStatus->isReadyForMoreSamples = false;
-        streamingMembers->doesNeedToNotifyOnLowWaterLevel = true;
-    }
-    streamingMembers->queueChangedOrFlushedCondition.notifyOne();
-}
-
-void webKitMediaSrcEnqueueSample(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstSample>&& sample)
-{
-    ASSERT(GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get())));
-    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(sample.leakRef())));
-}
-
-static void webKitMediaSrcEnqueueEvent(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstEvent>&& event)
-{
-    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(event.leakRef())));
-}
-
-void webKitMediaSrcEndOfStream(WebKitMediaSrc* source, const AtomString& streamName)
-{
-    webKitMediaSrcEnqueueEvent(source, streamName, adoptGRef(gst_event_new_eos()));
-}
-
-bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName)
-{
-    ASSERT(isMainThread());
-    Stream* stream = source->priv->streamByName(streamName);
-    return stream->reportedStatus->isReadyForMoreSamples;
-}
-
-void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName, WebCore::SourceBufferPrivateClient* sourceBufferPrivate)
-{
-    ASSERT(isMainThread());
-    Stream* stream = source->priv->streamByName(streamName);
-    ASSERT(!stream->reportedStatus->isReadyForMoreSamples);
-    stream->reportedStatus->sourceBufferPrivateToNotify = sourceBufferPrivate;
-}
-
-static GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
-{
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
-    if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) {
-        while (!source->priv->streams.isEmpty())
-            webKitMediaSrcRemoveStream(source, source->priv->streams.begin()->key);
-    }
-    return GST_ELEMENT_CLASS(webkit_media_src_parent_class)->change_state(element, transition);
-}
-
-static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>& stream)
-{
-    ASSERT(isMainThread());
-    {
-        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-
-        streamingMembers->isFlushing = true;
-        streamingMembers->queueChangedOrFlushedCondition.notifyOne();
-        streamingMembers->padLinkedOrFlushedCondition.notifyOne();
-    }
-
-    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_start());
-}
-
-static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>& stream, bool resetTime)
-{
-    ASSERT(isMainThread());
-
-    // By taking the stream lock we are waiting for the streaming thread task to stop if it hadn't yet.
-    GST_PAD_STREAM_LOCK(stream->pad.get());
-    {
-        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-
-        streamingMembers->isFlushing = false;
-        streamingMembers->doesNeedSegmentEvent = true;
-        streamingMembers->queue.clear();
-        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel) {
-            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
-            webKitMediaSrcStreamNotifyLowWaterLevel(stream);
-        }
-    }
-
-    // Since FLUSH_STOP is a synchronized event, we send it while we still hold the stream lock of the pad.
-    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_stop(resetTime));
-
-    gst_pad_start_task(stream->pad.get(), webKitMediaSrcLoop, stream->pad.get(), nullptr);
-    GST_PAD_STREAM_UNLOCK(stream->pad.get());
-}
-
-void webKitMediaSrcFlush(WebKitMediaSrc* source, const AtomString& streamName)
-{
-    ASSERT(isMainThread());
-    Stream* stream = source->priv->streamByName(streamName);
-
-    bool hasPushedFirstBuffer;
-    {
-        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-        hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
-    }
-
-    if (hasPushedFirstBuffer) {
-        // If no buffer has been pushed there is no need for flush... and flushing at that point could
-        // expose bugs in downstream which may have not completely initialized (e.g. decodebin3 not
-        // having linked the chain so far and forgetting to do it after the flush).
-        webKitMediaSrcStreamFlushStart(stream);
-    }
-
-    GstClockTime pipelineStreamTime;
-    gst_element_query_position(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(source))).get(), GST_FORMAT_TIME,
-        reinterpret_cast<gint64*>(&pipelineStreamTime));
-    // -1 is returned when the pipeline is not yet pre-rolled (e.g. just after a seek). In this case we don't need to
-    // adjust the segment though, as running time has not advanced.
-    if (GST_CLOCK_TIME_IS_VALID(pipelineStreamTime)) {
-        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-        // We need to increase the base by the running time accumulated during the previous segment.
-
-        GstClockTime pipelineRunningTime = gst_segment_to_running_time(&streamingMembers->segment, GST_FORMAT_TIME, pipelineStreamTime);
-        assert(GST_CLOCK_TIME_IS_VALID(pipelineRunningTime));
-        streamingMembers->segment.base = pipelineRunningTime;
-
-        streamingMembers->segment.start = streamingMembers->segment.time = static_cast<GstClockTime>(pipelineStreamTime);
-    }
-
-    if (hasPushedFirstBuffer)
-        webKitMediaSrcStreamFlushStop(stream, false);
-}
-
-void webKitMediaSrcSeek(WebKitMediaSrc* source, uint64_t startTime, double rate)
-{
-    ASSERT(isMainThread());
-    source->priv->startTime = startTime;
-    source->priv->rate = rate;
-
-    for (auto& pair : source->priv->streams) {
-        const RefPtr<Stream>& stream = pair.value;
-        bool hasPushedFirstBuffer;
-        {
-            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-            hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
-        }
-
-        if (hasPushedFirstBuffer) {
-            // If no buffer has been pushed there is no need for flush... and flushing at that point could
-            // expose bugs in downstream which may have not completely initialized (e.g. decodebin3 not
-            // having linked the chain so far and forgetting to do it after the flush).
-            webKitMediaSrcStreamFlushStart(stream);
-        }
-
-        {
-            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
-            streamingMembers->segment.base = 0;
-            streamingMembers->segment.rate = rate;
-            streamingMembers->segment.start = streamingMembers->segment.time = startTime;
-        }
-
-        if (hasPushedFirstBuffer)
-            webKitMediaSrcStreamFlushStop(stream, true);
-    }
-}
-
-static int countStreamsOfType(WebKitMediaSrc* source, WebCore::MediaSourceStreamTypeGStreamer type)
-{
-    // Barring pipeline dumps someone may add during debugging, WebKit will only read these properties (n-video etc.) from the main thread.
-    return std::count_if(source->priv->streams.begin(), source->priv->streams.end(), [type](auto item) {
-        return item.value->type == type;
-    });
-}
-
-static void webKitMediaSrcGetProperty(GObject* object, unsigned propId, GValue* value, GParamSpec* pspec)
+void webKitMediaSrcSetProperty(GObject* object, guint propId, const GValue* value, GParamSpec* pspec)
 {
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
 
     switch (propId) {
-    case PROP_N_AUDIO:
-        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Audio));
-        break;
-    case PROP_N_VIDEO:
-        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Video));
-        break;
-    case PROP_N_TEXT:
-        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Text));
+    case PROP_LOCATION:
+        gst_uri_handler_set_uri(reinterpret_cast<GstURIHandler*>(source), g_value_get_string(value), nullptr);
         break;
     default:
         G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
+        break;
     }
 }
 
-// URI handler interface. It's only purpose is for the element to be instantiated by playbin on "mediasourceblob:"
-// URIs. The actual URI does not matter.
-static GstURIType webKitMediaSrcUriGetType(GType)
+void webKitMediaSrcGetProperty(GObject* object, guint propId, GValue* value, GParamSpec* pspec)
+{
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
+    WebKitMediaSrcPrivate* priv = source->priv;
+
+    GST_OBJECT_LOCK(source);
+    switch (propId) {
+    case PROP_LOCATION:
+        g_value_set_string(value, priv->location.get());
+        break;
+    case PROP_N_AUDIO:
+        g_value_set_int(value, priv->numberOfAudioStreams);
+        break;
+    case PROP_N_VIDEO:
+        g_value_set_int(value, priv->numberOfVideoStreams);
+        break;
+    case PROP_N_TEXT:
+        g_value_set_int(value, priv->numberOfTextStreams);
+        break;
+    default:
+        G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
+        break;
+    }
+    GST_OBJECT_UNLOCK(source);
+}
+
+void webKitMediaSrcDoAsyncStart(WebKitMediaSrc* source)
+{
+    source->priv->asyncStart = true;
+    GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
+        gst_message_new_async_start(GST_OBJECT(source)));
+}
+
+void webKitMediaSrcDoAsyncDone(WebKitMediaSrc* source)
+{
+    WebKitMediaSrcPrivate* priv = source->priv;
+    if (priv->asyncStart) {
+        GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
+            gst_message_new_async_done(GST_OBJECT(source), GST_CLOCK_TIME_NONE));
+        priv->asyncStart = false;
+    }
+}
+
+GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
+{
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
+    WebKitMediaSrcPrivate* priv = source->priv;
+
+    switch (transition) {
+    case GST_STATE_CHANGE_READY_TO_PAUSED:
+        priv->allTracksConfigured = false;
+        webKitMediaSrcDoAsyncStart(source);
+        break;
+    default:
+        break;
+    }
+
+    GstStateChangeReturn result = GST_ELEMENT_CLASS(parent_class)->change_state(element, transition);
+    if (G_UNLIKELY(result == GST_STATE_CHANGE_FAILURE)) {
+        GST_WARNING_OBJECT(source, "State change failed");
+        webKitMediaSrcDoAsyncDone(source);
+        return result;
+    }
+
+    switch (transition) {
+    case GST_STATE_CHANGE_READY_TO_PAUSED:
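+        // Stay in ASYNC until every stream has its track configured; webKitMediaSrcCheckAllTracksConfigured()
+        // posts async-done at that point.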
+        result = GST_STATE_CHANGE_ASYNC;
+        break;
+    case GST_STATE_CHANGE_PAUSED_TO_READY:
+        webKitMediaSrcDoAsyncDone(source);
+        priv->allTracksConfigured = false;
+        break;
+    default:
+        break;
+    }
+
+    return result;
+}
+
+gint64 webKitMediaSrcGetSize(WebKitMediaSrc* webKitMediaSrc)
+{
+    gint64 duration = 0;
+    for (Stream* stream : webKitMediaSrc->priv->streams)
+        duration = std::max<gint64>(duration, gst_app_src_get_size(GST_APP_SRC(stream->appsrc)));
+    return duration;
+}
+
+gboolean webKitMediaSrcQueryWithParent(GstPad* pad, GstObject* parent, GstQuery* query)
+{
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(GST_ELEMENT(parent));
+    gboolean result = FALSE;
+
+    switch (GST_QUERY_TYPE(query)) {
+    case GST_QUERY_DURATION: {
+        GstFormat format;
+        gst_query_parse_duration(query, &format, nullptr);
+
+        GST_DEBUG_OBJECT(source, "duration query in format %s", gst_format_get_name(format));
+        GST_OBJECT_LOCK(source);
+        switch (format) {
+        case GST_FORMAT_TIME: {
+            if (source->priv && source->priv->mediaPlayerPrivate) {
+                MediaTime duration = source->priv->mediaPlayerPrivate->durationMediaTime();
+                if (duration > MediaTime::zeroTime()) {
+                    gst_query_set_duration(query, format, WebCore::toGstClockTime(duration));
+                    GST_DEBUG_OBJECT(source, "Answering: duration=%" GST_TIME_FORMAT, GST_TIME_ARGS(WebCore::toGstClockTime(duration)));
+                    result = TRUE;
+                }
+            }
+            break;
+        }
+        case GST_FORMAT_BYTES: {
+            if (source->priv) {
+                gint64 duration = webKitMediaSrcGetSize(source);
+                if (duration) {
+                    gst_query_set_duration(query, format, duration);
+                    GST_DEBUG_OBJECT(source, "size: %" G_GINT64_FORMAT, duration);
+                    result = TRUE;
+                }
+            }
+            break;
+        }
+        default:
+            break;
+        }
+
+        GST_OBJECT_UNLOCK(source);
+        break;
+    }
+    case GST_QUERY_URI:
+        if (source) {
+            GST_OBJECT_LOCK(source);
+            if (source->priv)
+                gst_query_set_uri(query, source->priv->location.get());
+            GST_OBJECT_UNLOCK(source);
+        }
+        result = TRUE;
+        break;
+    default: {
+        GRefPtr<GstPad> target = adoptGRef(gst_ghost_pad_get_target(GST_GHOST_PAD_CAST(pad)));
+        // Forward the query to the proxy target pad.
+        if (target)
+            result = gst_pad_query(target.get(), query);
+        break;
+    }
+    }
+
+    return result;
+}
+
+void webKitMediaSrcUpdatePresentationSize(GstCaps* caps, Stream* stream)
+{
+    GST_OBJECT_LOCK(stream->parent);
+    if (WebCore::doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
+        Optional<WebCore::FloatSize> size = WebCore::getVideoResolutionFromCaps(caps);
+        if (size.hasValue())
+            stream->presentationSize = size.value();
+        else
+            stream->presentationSize = WebCore::FloatSize();
+    } else
+        stream->presentationSize = WebCore::FloatSize();
+
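+    // The caller keeps its own reference to caps, so take an extra one before adopting it into the GRefPtr.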
+    gst_caps_ref(caps);
+    stream->caps = adoptGRef(caps);
+    GST_OBJECT_UNLOCK(stream->parent);
+}
+
+void webKitMediaSrcLinkStreamToSrcPad(GstPad* sourcePad, Stream* stream)
+{
+    unsigned padId = static_cast<unsigned>(GPOINTER_TO_INT(g_object_get_data(G_OBJECT(sourcePad), "padId")));
+    GST_DEBUG_OBJECT(stream->parent, "linking stream to src pad (id: %u)", padId);
+
+    GUniquePtr<gchar> padName(g_strdup_printf("src_%u", padId));
+    GstPad* ghostpad = WebCore::webkitGstGhostPadFromStaticTemplate(&srcTemplate, padName.get(), sourcePad);
+
+    auto proxypad = adoptGRef(GST_PAD(gst_proxy_pad_get_internal(GST_PROXY_PAD(ghostpad))));
+    gst_flow_combiner_add_pad(stream->parent->priv->flowCombiner.get(), proxypad.get());
+    gst_pad_set_chain_function(proxypad.get(), static_cast<GstPadChainFunction>(webkitMediaSrcChain));
+    gst_pad_set_query_function(ghostpad, webKitMediaSrcQueryWithParent);
+
+    gst_pad_set_active(ghostpad, TRUE);
+    gst_element_add_pad(GST_ELEMENT(stream->parent), ghostpad);
+}
+
+void webKitMediaSrcLinkSourcePad(GstPad* sourcePad, GstCaps* caps, Stream* stream)
+{
+    ASSERT(caps && stream->parent);
+    if (!caps || !stream->parent) {
+        GST_ERROR("Unable to link parser");
+        return;
+    }
+
+    webKitMediaSrcUpdatePresentationSize(caps, stream);
+
+    // FIXME: drop webKitMediaSrcLinkStreamToSrcPad() and move its code here.
+    if (!gst_pad_is_linked(sourcePad)) {
+        GST_DEBUG_OBJECT(stream->parent, "pad not linked yet");
+        webKitMediaSrcLinkStreamToSrcPad(sourcePad, stream);
+    }
+
+    webKitMediaSrcCheckAllTracksConfigured(stream->parent);
+}
+
+void webKitMediaSrcFreeStream(WebKitMediaSrc* source, Stream* stream)
+{
+    if (GST_IS_APP_SRC(stream->appsrc)) {
+        // Disable this appsrc's callbacks so that nothing uses the stream from this point on.
+        gst_app_src_set_callbacks(GST_APP_SRC(stream->appsrc), &disabledAppsrcCallbacks, nullptr, nullptr);
+        gst_app_src_end_of_stream(GST_APP_SRC(stream->appsrc));
+    }
+
+    GST_OBJECT_LOCK(source);
+    switch (stream->type) {
+    case WebCore::Audio:
+        source->priv->numberOfAudioStreams--;
+        break;
+    case WebCore::Video:
+        source->priv->numberOfVideoStreams--;
+        break;
+    case WebCore::Text:
+        source->priv->numberOfTextStreams--;
+        break;
+    default:
+        break;
+    }
+    GST_OBJECT_UNLOCK(source);
+
+    if (stream->type != WebCore::Invalid) {
+        GST_DEBUG("Freeing track-related info on stream %p", stream);
+
+        LockHolder locker(source->priv->streamLock);
+
+        if (stream->caps)
+            stream->caps = nullptr;
+
+        if (stream->audioTrack)
+            stream->audioTrack = nullptr;
+        if (stream->videoTrack)
+            stream->videoTrack = nullptr;
+
+        int signal = -1;
+        switch (stream->type) {
+        case WebCore::Audio:
+            signal = SIGNAL_AUDIO_CHANGED;
+            break;
+        case WebCore::Video:
+            signal = SIGNAL_VIDEO_CHANGED;
+            break;
+        case WebCore::Text:
+            signal = SIGNAL_TEXT_CHANGED;
+            break;
+        default:
+            break;
+        }
+        stream->type = WebCore::Invalid;
+
+        if (signal != -1)
+            g_signal_emit(G_OBJECT(source), webKitMediaSrcSignals[signal], 0, nullptr);
+
+        source->priv->streamCondition.notifyOne();
+    }
+
+    GST_DEBUG("Releasing stream: %p", stream);
+    delete stream;
+}
+
+void webKitMediaSrcCheckAllTracksConfigured(WebKitMediaSrc* webKitMediaSrc)
+{
+    bool allTracksConfigured = false;
+
+    GST_OBJECT_LOCK(webKitMediaSrc);
+    if (!webKitMediaSrc->priv->allTracksConfigured) {
+        allTracksConfigured = true;
+        for (Stream* stream : webKitMediaSrc->priv->streams) {
+            if (stream->type == WebCore::Invalid) {
+                allTracksConfigured = false;
+                break;
+            }
+        }
+        if (allTracksConfigured)
+            webKitMediaSrc->priv->allTracksConfigured = true;
+    }
+    GST_OBJECT_UNLOCK(webKitMediaSrc);
+
+    if (allTracksConfigured) {
+        GST_DEBUG("All tracks attached. Completing async state change operation.");
+        gst_element_no_more_pads(GST_ELEMENT(webKitMediaSrc));
+        webKitMediaSrcDoAsyncDone(webKitMediaSrc);
+    }
+}
+
+// URI handler interface.
+GstURIType webKitMediaSrcUriGetType(GType)
 {
     return GST_URI_SRC;
 }
 
-static const gchar* const* webKitMediaSrcGetProtocols(GType)
+const gchar* const* webKitMediaSrcGetProtocols(GType)
 {
     static const char* protocols[] = {"mediasourceblob", nullptr };
     return protocols;
 }
 
-static gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
+gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
 {
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
     gchar* result;
 
     GST_OBJECT_LOCK(source);
-    result = g_strdup(source->priv->uri.get());
+    result = g_strdup(source->priv->location.get());
     GST_OBJECT_UNLOCK(source);
     return result;
 }
 
-static gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
+gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
 {
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
 
     if (GST_STATE(source) >= GST_STATE_PAUSED) {
         GST_ERROR_OBJECT(source, "URI can only be set in states < PAUSED");
-        return false;
+        return FALSE;
     }
 
     GST_OBJECT_LOCK(source);
-    source->priv->uri = GUniquePtr<char>(g_strdup(uri));
+    WebKitMediaSrcPrivate* priv = source->priv;
+    priv->location = nullptr;
+    if (!uri) {
+        GST_OBJECT_UNLOCK(source);
+        return TRUE;
+    }
+
+    URL url(URL(), uri);
+
+    priv->location = GUniquePtr<gchar>(g_strdup(url.string().utf8().data()));
     GST_OBJECT_UNLOCK(source);
     return TRUE;
 }
 
-static void webKitMediaSrcUriHandlerInit(void* gIface, void*)
+void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer)
 {
     GstURIHandlerInterface* iface = (GstURIHandlerInterface *) gIface;
 
@@ -806,6 +652,83 @@
     iface->set_uri = webKitMediaSrcSetUri;
 }
 
+static void seekNeedsDataMainThread(WebKitMediaSrc* source)
+{
+    GST_DEBUG("Buffering needed before seek");
+
+    ASSERT(WTF::isMainThread());
+
+    GST_OBJECT_LOCK(source);
+    MediaTime seekTime = source->priv->seekTime;
+    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
+
+    if (!mediaPlayerPrivate) {
+        GST_OBJECT_UNLOCK(source);
+        return;
+    }
+
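+    // Mark every valid stream as ready again so its SourceBuffer can enqueue samples around the seek target
+    // before the player is notified below.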
+    for (Stream* stream : source->priv->streams) {
+        if (stream->type != WebCore::Invalid)
+            stream->sourceBuffer->setReadyForMoreSamples(true);
+    }
+    GST_OBJECT_UNLOCK(source);
+    mediaPlayerPrivate->notifySeekNeedsDataForTime(seekTime);
+}
+
+static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc* source, Stream* appsrcStream)
+{
+    GST_OBJECT_LOCK(source);
+
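+    // The stream may have been freed while this notification was pending; bail out if it is gone.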
+    auto it = std::find(source->priv->streams.begin(), source->priv->streams.end(), appsrcStream);
+    if (it == source->priv->streams.end()) {
+        GST_OBJECT_UNLOCK(source);
+        return;
+    }
+
+    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
+    if (mediaPlayerPrivate && !mediaPlayerPrivate->seeking())
+        appsrcStream->sourceBuffer->notifyReadyForMoreSamples();
+
+    GST_OBJECT_UNLOCK(source);
+}
+
+void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc* source, WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate)
+{
+    GST_OBJECT_LOCK(source);
+
+    // Set to nullptr on MediaPlayerPrivateGStreamer destruction, never a dangling pointer.
+    source->priv->mediaPlayerPrivate = mediaPlayerPrivate;
+    GST_OBJECT_UNLOCK(source);
+}
+
+void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc* source, bool isReady)
+{
+    if (source) {
+        GST_OBJECT_LOCK(source);
+        for (Stream* stream : source->priv->streams)
+            stream->sourceBuffer->setReadyForMoreSamples(isReady);
+        GST_OBJECT_UNLOCK(source);
+    }
+}
+
+void webKitMediaSrcPrepareSeek(WebKitMediaSrc* source, const MediaTime& time)
+{
+    GST_OBJECT_LOCK(source);
+    source->priv->seekTime = time;
+    source->priv->appsrcSeekDataCount = 0;
+    source->priv->appsrcNeedDataCount = 0;
+
+    for (Stream* stream : source->priv->streams) {
+        stream->appsrcNeedDataFlag = false;
+        // Don't allow samples far from the seekTime to be enqueued.
+        stream->lastEnqueuedTime = time;
+    }
+
+    // The pending action will run once every appsrc has reported seek-data and need-data.
+    source->priv->appsrcSeekDataNextAction = MediaSourceSeekToTime;
+    GST_OBJECT_UNLOCK(source);
+}
+
 namespace WTF {
 template <> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr)
 {
@@ -826,7 +749,7 @@
     if (ptr)
         gst_object_unref(ptr);
 }
-} // namespace WTF
+};
 
 #endif // USE(GSTREAMER)
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h
index e0cf11c..6c45eb6 100644
--- a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h
@@ -3,8 +3,8 @@
  *  Copyright (C) 2013 Collabora Ltd.
  *  Copyright (C) 2013 Orange
  *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
- *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
- *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
+ *  Copyright (C) 2015, 2016 Metrological Group B.V.
+ *  Copyright (C) 2015, 2016 Igalia, S.L
  *
  *  This library is free software; you can redistribute it and/or
  *  modify it under the terms of the GNU Lesser General Public
@@ -39,7 +39,7 @@
 
 enum MediaSourceStreamTypeGStreamer { Invalid, Unknown, Audio, Video, Text };
 
-} // namespace WebCore
+}
 
 G_BEGIN_DECLS
 
@@ -49,38 +49,32 @@
 #define WEBKIT_IS_MEDIA_SRC(obj)         (G_TYPE_CHECK_INSTANCE_TYPE ((obj), WEBKIT_TYPE_MEDIA_SRC))
 #define WEBKIT_IS_MEDIA_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), WEBKIT_TYPE_MEDIA_SRC))
 
-struct WebKitMediaSrcPrivate;
+typedef struct _WebKitMediaSrc        WebKitMediaSrc;
+typedef struct _WebKitMediaSrcClass   WebKitMediaSrcClass;
+typedef struct _WebKitMediaSrcPrivate WebKitMediaSrcPrivate;
 
-struct WebKitMediaSrc {
-    GstElement parent;
+struct _WebKitMediaSrc {
+    GstBin parent;
 
     WebKitMediaSrcPrivate *priv;
 };
 
-struct WebKitMediaSrcClass {
-    GstElementClass parentClass;
+struct _WebKitMediaSrcClass {
+    GstBinClass parentClass;
+
+    // Notify the application that the number of audio/video/text streams has changed.
+    void (*videoChanged)(WebKitMediaSrc*);
+    void (*audioChanged)(WebKitMediaSrc*);
+    void (*textChanged)(WebKitMediaSrc*);
 };
 
 GType webkit_media_src_get_type(void);
 
-void webKitMediaSrcAddStream(WebKitMediaSrc*, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer, GRefPtr<GstCaps>&& initialCaps);
-void webKitMediaSrcRemoveStream(WebKitMediaSrc*, const AtomString& name);
+void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc*, WebCore::MediaPlayerPrivateGStreamerMSE*);
 
-void webKitMediaSrcEnqueueSample(WebKitMediaSrc*, const AtomString& streamName, GRefPtr<GstSample>&&);
-void webKitMediaSrcEndOfStream(WebKitMediaSrc*, const AtomString& streamName);
-
-bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName);
-void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName, WebCore::SourceBufferPrivateClient*);
-
-void webKitMediaSrcFlush(WebKitMediaSrc*, const AtomString& streamName);
-void webKitMediaSrcSeek(WebKitMediaSrc*, guint64 startTime, double rate);
+void webKitMediaSrcPrepareSeek(WebKitMediaSrc*, const MediaTime&);
+void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc*, bool);
 
 G_END_DECLS
 
-namespace WTF {
-template<> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr);
-template<> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
-template<> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
-} // namespace WTF
-
 #endif // USE(GSTREAMER)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h
new file mode 100644
index 0000000..413dd3f
--- /dev/null
+++ b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h
@@ -0,0 +1,155 @@
+/*
+ * Copyright (C) 2016 Metrological Group B.V.
+ * Copyright (C) 2016 Igalia S.L
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
+
+#include "AudioTrackPrivateGStreamer.h"
+#include "GUniquePtrGStreamer.h"
+#include "MainThreadNotifier.h"
+#include "SourceBufferPrivateGStreamer.h"
+#include "VideoTrackPrivateGStreamer.h"
+#include "WebKitMediaSourceGStreamer.h"
+
+#include <gst/app/gstappsrc.h>
+#include <gst/gst.h>
+#include <wtf/Forward.h>
+#include <wtf/glib/GRefPtr.h>
+
+namespace WebCore {
+
+class MediaPlayerPrivateGStreamerMSE;
+
+};
+
+void webKitMediaSrcUriHandlerInit(gpointer, gpointer);
+
+#define WEBKIT_MEDIA_SRC_GET_PRIVATE(obj) (G_TYPE_INSTANCE_GET_PRIVATE((obj), WEBKIT_TYPE_MEDIA_SRC, WebKitMediaSrcPrivate))
+
+typedef struct _Stream Stream;
+
+struct _Stream {
+    // Fields filled when the Stream is created.
+    WebKitMediaSrc* parent;
+
+    // AppSrc. Never modified after first assignment.
+    GstElement* appsrc;
+
+    // Never modified after first assignment.
+    WebCore::SourceBufferPrivateGStreamer* sourceBuffer;
+
+    // Fields filled when the track is attached.
+    WebCore::MediaSourceStreamTypeGStreamer type;
+    GRefPtr<GstCaps> caps;
+
+    // Only audio, video or nothing at a given time.
+    RefPtr<WebCore::AudioTrackPrivateGStreamer> audioTrack;
+    RefPtr<WebCore::VideoTrackPrivateGStreamer> videoTrack;
+    WebCore::FloatSize presentationSize;
+
+    // This flag helps ensure that WebKitMediaSrcPrivate.appsrcNeedDataCount counts each appsrc's
+    // needData at most once.
+    bool appsrcNeedDataFlag;
+
+    // Used to enforce continuity in the appended data and avoid breaking the decoder.
+    // Only used from the main thread.
+    MediaTime lastEnqueuedTime;
+};
+
+enum {
+    PROP_0,
+    PROP_LOCATION,
+    PROP_N_AUDIO,
+    PROP_N_VIDEO,
+    PROP_N_TEXT,
+    PROP_LAST
+};
+
+enum {
+    SIGNAL_VIDEO_CHANGED,
+    SIGNAL_AUDIO_CHANGED,
+    SIGNAL_TEXT_CHANGED,
+    LAST_SIGNAL
+};
+
+enum OnSeekDataAction {
+    Nothing,
+    MediaSourceSeekToTime
+};
+
+enum WebKitMediaSrcMainThreadNotification {
+    ReadyForMoreSamples = 1 << 0,
+    SeekNeedsData = 1 << 1
+};
+
+struct _WebKitMediaSrcPrivate {
+    // Used to coordinate the release of Stream track info.
+    Lock streamLock;
+    Condition streamCondition;
+
+    // Streams are only added/removed in the main thread.
+    Vector<Stream*> streams;
+
+    GUniquePtr<gchar> location;
+    int numberOfAudioStreams;
+    int numberOfVideoStreams;
+    int numberOfTextStreams;
+    bool asyncStart;
+    bool allTracksConfigured;
+    unsigned numberOfPads;
+
+    MediaTime seekTime;
+
+    // On seek, we wait for all the seekDatas, then for all the needDatas, and then run the nextAction.
+    OnSeekDataAction appsrcSeekDataNextAction;
+    int appsrcSeekDataCount;
+    int appsrcNeedDataCount;
+
+    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate;
+
+    RefPtr<WebCore::MainThreadNotifier<WebKitMediaSrcMainThreadNotification>> notifier;
+    GUniquePtr<GstFlowCombiner> flowCombiner;
+};
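+
+// Sketch of the seek bookkeeping above (assumed flow, for illustration only):
+// every appsrc first emits seek-data and later need-data; the counters track
+// how many appsrcs have reported each, and the pending action runs once all of
+// them have.
+//
+//     // seek-data callback (per appsrc):
+//     priv->appsrcSeekDataCount++;
+//     if (priv->appsrcSeekDataCount == static_cast<int>(priv->streams.size()))
+//         priv->appsrcSeekDataNextAction = MediaSourceSeekToTime;
+//
+//     // need-data callback, counted once per appsrc via stream->appsrcNeedDataFlag;
+//     // once appsrcNeedDataCount also reaches the stream count, the MediaSource is
+//     // asked for data at priv->seekTime and the action is reset to Nothing.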
+
+extern guint webKitMediaSrcSignals[LAST_SIGNAL];
+extern GstAppSrcCallbacks enabledAppsrcCallbacks;
+extern GstAppSrcCallbacks disabledAppsrcCallbacks;
+
+void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer ifaceData);
+void webKitMediaSrcFinalize(GObject*);
+void webKitMediaSrcSetProperty(GObject*, guint propertyId, const GValue*, GParamSpec*);
+void webKitMediaSrcGetProperty(GObject*, guint propertyId, GValue*, GParamSpec*);
+void webKitMediaSrcDoAsyncStart(WebKitMediaSrc*);
+void webKitMediaSrcDoAsyncDone(WebKitMediaSrc*);
+GstStateChangeReturn webKitMediaSrcChangeState(GstElement*, GstStateChange);
+gint64 webKitMediaSrcGetSize(WebKitMediaSrc*);
+gboolean webKitMediaSrcQueryWithParent(GstPad*, GstObject*, GstQuery*);
+void webKitMediaSrcUpdatePresentationSize(GstCaps*, Stream*);
+void webKitMediaSrcLinkStreamToSrcPad(GstPad*, Stream*);
+void webKitMediaSrcLinkSourcePad(GstPad*, GstCaps*, Stream*);
+void webKitMediaSrcFreeStream(WebKitMediaSrc*, Stream*);
+void webKitMediaSrcCheckAllTracksConfigured(WebKitMediaSrc*);
+GstURIType webKitMediaSrcUriGetType(GType);
+const gchar* const* webKitMediaSrcGetProtocols(GType);
+gchar* webKitMediaSrcGetUri(GstURIHandler*);
+gboolean webKitMediaSrcSetUri(GstURIHandler*, const gchar*, GError**);
+
+#endif // ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
diff --git a/Source/cmake/GStreamerChecks.cmake b/Source/cmake/GStreamerChecks.cmake
index 3a7cec7..18b12f8 100644
--- a/Source/cmake/GStreamerChecks.cmake
+++ b/Source/cmake/GStreamerChecks.cmake
@@ -37,8 +37,8 @@
     SET_AND_EXPOSE_TO_BUILD(USE_GSTREAMER TRUE)
 endif ()
 
-if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.16")
-    message(FATAL_ERROR "GStreamer 1.16 is needed for ENABLE_MEDIA_SOURCE.")
+if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.14")
+    message(FATAL_ERROR "GStreamer 1.14 is needed for ENABLE_MEDIA_SOURCE.")
 endif ()
 
 if (ENABLE_MEDIA_STREAM OR ENABLE_WEB_RTC)
diff --git a/Tools/ChangeLog b/Tools/ChangeLog
index 025e1bc..2631e53 100644
--- a/Tools/ChangeLog
+++ b/Tools/ChangeLog
@@ -1,3 +1,12 @@
+2019-10-21  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Revert WebKitMediaSrc rework temporarily
+        https://bugs.webkit.org/show_bug.cgi?id=203078
+
+        Reviewed by Carlos Garcia Campos.
+
+        * Scripts/webkitpy/style/checker.py:
+
 2019-10-21  Carlos Garcia Campos  <cgarcia@igalia.com>
 
         Unreviewed. Mark some more WTF unit tests as slow for GTK and WPE
diff --git a/Tools/Scripts/webkitpy/style/checker.py b/Tools/Scripts/webkitpy/style/checker.py
index 946ca2f..e529afe 100644
--- a/Tools/Scripts/webkitpy/style/checker.py
+++ b/Tools/Scripts/webkitpy/style/checker.py
@@ -219,7 +219,6 @@
       # variables and functions containing underscores.
       os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'VideoSinkGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'WebKitWebSourceGStreamer.cpp'),
-      os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'mse', 'WebKitMediaSourceGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'audio', 'gstreamer', 'WebKitWebAudioSourceGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'gstreamer', 'GStreamerMediaStreamSource.h'),
       os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'gstreamer', 'GStreamerMediaStreamSource.cpp'),