The media router is a component in Chrome responsible for matching clients that wish to render content outside the browser (media sources) with devices and endpoints capable of rendering that content (media sinks). When a media source is linked with a media sink (in general, requiring user permission), a media route is created that allows two-way messaging between the client and the sink. Via this messaging channel, the client can negotiate a peer-to-peer media streaming session with the media sink (e.g., over WebRTC or Cast Streaming), also known as "mirroring." The media route can also be used to control remotely rendered media without an associated peer-to-peer streaming session, also known as "flinging." The media route can be terminated at user or client request, which revokes the application's access to the media sink.
The Web Presentation API allows a Web application to request display of Web content on a secondary (wired or wireless) screen. The content may be rendered locally and streamed to the display, or rendered remotely. The Web application controls the content via two-way messaging.
Note that the non-Blink parts of the media router will be implemented only in desktop Chrome and ChromeOS. Presentation API functionality will be implemented in Chrome for Android using analogous platform components such as the Android Media Route Provider framework.
Also note that a separate design is in progress for offscreen rendering, capture, and streaming of WebContents (required for full Presentation API support).
The objectives of this project:
The following are non-goals but may be objectives for future work:
The media router consists of four distinct components:
The following diagram illustrates the architecture of the components described above.
The Chrome Media Router is a browser-resident service that serves as a media-protocol-agnostic platform for parties interested in media routing. It provides its clients with a set of APIs for media routing related queries and operations, including:
The Chrome Media Router itself does not interact directly with media sinks. Instead, it delegates these requests and responses to a media route provider in the component extension. The Chrome Media Router keeps its own records of established routes, pending route requests, and other related resources, so it does not have to request this information from the route provider each time.
The following pseudocode describes how a client of the Chrome Media Router (through its C++ API) would use it to initiate and control a media sharing session.
The Media Router interacts with the component extension via a Mojo service, the Media Router API, that exposes functionality whose implementation is delegated to the extension.
The component extension manages discovery of and network interaction with individual media sinks. For the purposes of this discussion, a sink is a LAN-connected device that speaks the Cast or DIAL protocol, but in theory it could be any other type of endpoint that supports media rendering and two-way messaging. The extension consists of three types of components:
A component extension is used, rather than implementing this functionality directly in the browser, because remote display support is implemented by first and third parties using a mix of open source and proprietary code, and must be released on a schedule independent of Chrome's (e.g., tied to specific hardware release dates). We only plan to open source the DIAL media route provider.
Initially, Media Route Providers will be implemented for Cast and DIAL devices, with others to follow. Over time, media route providers that do not rely on proprietary protocols will be unbundled and included in the Chromium repository, once script packaging and deployment issues are resolved. As an external component, the extension is installed on the first run of the browser. It is built around an event page: the extension registers itself with the Media Router and with discovery APIs to be notified of display availability, and then suspends. The component extension will only be active when there are applications with pending sink availability requests or media routes, or when there is active network traffic between the extension and a media sink.
The extension is split into several modules that are loaded on demand; the main event page bundles total 238 KB. Extension updates are independent of Chrome releases.
Tab and desktop mirroring will request routing of a media source with a URN such as urn:google:tab:3, representing tab contents. When the component extension receives a request to route this source, the media route provider manager will query route providers to enumerate sinks that can render streamed tab contents. Once the user selects a sink, the mirroring service will create the appropriate MediaStream using the chrome.tabCapture extension API. The MediaStream will then be passed to a Cast Streaming or WebRTC session, depending on the preferred protocol of the selected sink. When the media route is terminated, the associated streaming session and media capture are also terminated. Desktop mirroring will follow the same approach, but using chrome.desktopCapture instead.
Media routing of Web content will primarily be done through the Presentation API. Some media sinks (e.g., Cast) can render a subset of Web content natively, or render an equivalent app experience (e.g., via DIAL). For generic Web documents, we plan to render the content in an offscreen WebContents and then use the tab mirroring approach outlined above. The design of the offscreen rendering capability will be added to this document later.
The Presentation API implementation in Blink will live in content/ and will operate at the frame level. It will delegate calls to the embedder's media router implementation (the Android Media Router on Android, the Chrome Media Router on desktop Chrome) via a common PresentationServiceDelegate interface. A draft Mojo interface follows (not yet complete):
Here is how the presentation API will roughly map to Chrome Media Router API:
End user control of media routing is done through the Media Router Dialog, a constrained, tab-modal dialog implemented using WebUI. It appears at the top center of the browser and supports a number of views, including a screen selector, screen status, error/warning messages, and informational messages. To avoid excess whitespace, the dialog auto-resizes to fit the currently rendered view.
The media router dialog is activated by clicking the Cast icon, which is always available to the user. The Cast icon implements the action icon interface and appears either in the toolbar action menu (normally) or in the omnibox when a Casting experience is available (i.e., a media sink has been detected).
Clicking the Cast icon brings up a menu of available media sinks that are compatible with the current content. For Web documents not using the Presentation API, these will include sinks that can render tab or desktop capture. For Web documents using the Presentation API, the menu will include media sinks compatible with the URL requested for presentation (once the API allows this URL to be declared, so we can proactively filter to compatible displays).
Media Route Providers may customize the appearance of the active media activity and inject custom controls into the WebUI (subject to UX guidelines). We are prototyping this approach using <extensionview>.
We will use the <extensionview> HTML tag to embed the custom media controller UX. This allows the component extension to flexibly customize and control the UX, rather than having the functionality implemented directly in the browser. ExtensionView lets us embed a page from the component extension into the Media Router WebUI; we will use its load API to pass in the full media controller URL.
The extension will use chrome.runtime.* messaging to communicate between the controller embedded in the ExtensionView and the extension itself.
The entire project should be security reviewed from a holistic and architectural perspective. Specific security-related aspects:
The patches to implement the Media Router have been developed in an internal repository. They will be upstreamed into mainline Chromium with the primary code location of
for the media router, with other components living in appropriate locations according to their type.