Replies: 23 comments
-
Proposal on Friday, 14th October from @haudiobe and @dsilhavy is to coalesce the Media Stream Handler into the same process as the 5GMS-Aware Application, wrapping Exoplayer. This approach internalises the M7 Media Stream Handling API between the 5GMS-Aware Application and the Media Stream Handler. (The Media Stream Handler wrapper around Exoplayer could still be structured as a started foreground service to provide a clean realisation of the M7 API, however.) Any further thoughts, @davidjwbbc?
-
For an initial implementation I think this would be fine as the 5GMS-Aware Application would be the only one using this, but ultimately it ought to be a separate Service which provides the M7d API interface to multiple applications and triggers Exoplayer when asked to play some media. This then allows you to do things like continuing to play the media in a small overlay if the controlling application transitions to the background, and allows other applications which come to the foreground to change the playing media or at least coordinate playback (e.g. play after current media finishes).
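As a rough plain-Java sketch of the coordination behaviour described here (all names hypothetical; on Android this would be a bound/foreground Service exposing the M7d interface over Binder/AIDL rather than a plain class):

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch of an M7d-style playback service shared by several
// applications. Only the coordination logic is shown; actual media playback
// would be delegated to Exoplayer.
class MediaStreamHandlerService {
    private String currentMediaUrl;                        // media currently playing
    private final Queue<String> pending = new ArrayDeque<>(); // queued by other apps

    // Immediate playback request (e.g. from the foreground application).
    public void play(String mediaUrl) {
        currentMediaUrl = mediaUrl;
    }

    // Coordinated playback: queue media to start after the current item ends.
    public void playAfterCurrent(String mediaUrl) {
        if (currentMediaUrl == null) {
            play(mediaUrl);
        } else {
            pending.add(mediaUrl);
        }
    }

    // Invoked when the underlying player reports end of stream.
    public void onPlaybackEnded() {
        currentMediaUrl = pending.poll(); // may be null if nothing is queued
    }

    public String getCurrentMediaUrl() {
        return currentMediaUrl;
    }
}
```

The point of the sketch is that a single service instance can arbitrate between several client applications, which a handler compiled into one application's process cannot do.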
-
I'm wondering whether a Media Stream Handler service would ever be released independently of the 5GMS-Aware Application in a separate installable Android APK, @davidjwbbc. The only concrete use case I can think of today would be a 5GMS Application Provider with multiple 5GMS-Aware Applications that depend on the same Media Stream Handler. For the majority of cases, though, each 5GMS Application Provider only has a single 5GMS-Aware Application, and it's unlikely that different 5GMS Application Providers would agree amongst themselves to develop a common Media Stream Handler service.

The exception to this might be if Google itself decided to bundle an implementation of a 5GMS Media Stream Handler with Android. (This would be very approximately analogous to the Android System WebView that provides a basic HTML rendering capability for use by third-party applications that aren't fully-fledged web browsers.) This hypothetical "Android System Media Stream Handler" could then expose a de facto implementation of the 3GPP M7 client APIs to 5GMS-Aware Applications from any 5GMS Application Provider. (In this case, though, it might make more sense for Google to provide a composite "Android System 5GMS Client" that bundles the Media Stream Handler service and Media Session Handler service together as a single installable APK offering both M6 and M7 client API implementations to 5GMS-Aware Applications.)

Given that adoption of 3GPP 5G Media Streaming by mobile operating system vendors is still some way off, bundling the Media Stream Handler with the 5GMS-Aware Application in the same APK feels like a reasonable approach to software release for now. But developing the Media Stream Handler as a separate Android service in this repository before marrying it in a single installable APK with a test application that depends on the Media Stream Handler service feels like a nice way of factoring the source code that makes it more reusable in the future.
-
**General**

I basically agree with what @rjb1000 wrote. Highlighting again that I am not an Android developer, so parts of my opinion might not reflect best practices for Android development. Looking at 26.512 Section 13, the M7 interface is used for the following:

All of this communication needs to be exposed by the media player. The methods will probably be simple API calls, while status information and notifications are dispatched via events that an application can register for. Similar to this:

```javascript
player.play()
player.pause()
player.on(dashjs.MediaPlayer.events["PLAYBACK_ENDED"], function () {
    console.log('Playback ended')
});
```

All of this needs to be set up as part of the application code. For both media players (dash.js and Exoplayer) we probably need a wrapper to be compliant with the API calls and events defined in 26.512. At least for Exoplayer, the methods and events will most likely be named slightly differently. An essential part is the content decryption, decoding and rendering that is managed by the media player using the native platform functionality (e.g. EME for license acquisition in a web-based scenario).

**Use case: dash.js**

Assuming we want to implement the architecture depicted in 26.501 and 26.512 with dash.js on Android: dash.js needs MSE and EME for playback of encrypted media streams. For that, we need a web-based environment, so we would probably implement the 5GMSd-Aware Application as a WebView. Within that WebView there is a simple HTML page that includes dash.js as a library. In addition, there is some glue/wrapper code around the APIs offered by dash.js, for instance to trigger playback and to register for certain events. Communication with the Media Session Handler is hopefully possible from a WebView via IPC.

**Use case: Exoplayer**

Similar to what is described above for dash.js. However, we launch a native view (activity). The Exoplayer is added as a Gradle dependency in the build.gradle. Communication with the Media Session Handler is done via IPC. The Media Session Handler runs in the background as a bound service.

**General options**

As Richard pointed out, we could also have dedicated repositories for Exoplayer and dash.js that can be included as a dependency in a 5GMS-Aware Application. Both of these repositories would more or less only use dash.js or Exoplayer and provide a wrapper around the player functions and methods.

**Conclusion**

At least from my understanding, approach 2) feels like the most natural one. As a service provider, I decide which library I use for media playback. Then I implement my app on top of that library. I suggest we start by adding the Media Stream Handler (Exoplayer) directly to the application and implement the required wrapper and interface to the Media Session Handler as part of the application. Also, communication with the Media Session Handler (running as a background service for multiple applications) would be handled here (see also the question below).

**Additional questions**
-
Tried to put this into a diagram for discussion in the calls.

One question in this context: are notifications, errors and status updates by the media player proxied through the MSH to the application? If so, the app could also go the direct way via the M7 Adapter and receive this information from the media player directly.
-
One question, I'm a bit confused about what is part of the:
What I did in Android is to take a generic app, introduce a layout and put the ExoPlayer inside that layout. In this case I assume Exoplayer is part of the 5GMS-Aware Application (yet without the MediaStreamHandler). Would the MediaStreamHandler be another module that would incorporate Exoplayer inside it, or (I think better) just a new function that handles the communication between Exoplayer and any 5GMS-related activity, therefore isolating ExoPlayer from 5GMS? This goes in a similar direction to what Daniel is commenting in his last comment. It would be good to understand how a generic OTT app works and try to minimize any changes to that app.
-
If I understand your description correctly, I think this would fit into the architecture above. You would include the MediaStreamHandler as a dependency in your application. The MediaStreamHandler incorporates Exoplayer and the two adapters. From your application's point of view, you would talk to the M7 adapter. At this point I am not sure if the app needs to talk to the MSH adapter for setup or anything. In that sense the Exoplayer would be isolated from the app, as you only talk to the adapters. However, that probably does not align completely with what 26.501 and 26.512 say.
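To make the adapter idea concrete, here is a minimal plain-Java sketch of the event-mapping side of such an M7 adapter. The application registers for M7-style named events, and the adapter translates player-native state changes into them. All constant names below are illustrative assumptions; 26.512 does not fix these identifiers, and the real wrapper would attach to Exoplayer's listener API rather than take strings.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the event-mapping part of a Media Stream Handler
// wrapper: M7-style named events on the outside, player-native states inside.
class MediaStreamHandlerWrapper {
    // Callbacks registered by the application, keyed by M7-style event name.
    private final Map<String, List<Runnable>> listeners = new HashMap<>();

    public void on(String event, Runnable callback) {
        listeners.computeIfAbsent(event, k -> new ArrayList<>()).add(callback);
    }

    // Adapter: translate a player-native state change into the M7 event name.
    public void onPlayerStateChanged(String playerState) {
        if ("STATE_ENDED".equals(playerState)) {   // Exoplayer-style constant
            dispatch("PLAYBACK_ENDED");            // M7-style event name
        }
    }

    private void dispatch(String event) {
        for (Runnable cb : listeners.getOrDefault(event, List.of())) {
            cb.run();
        }
    }
}
```

The same shape would work for a dash.js wrapper, where the mapping is often closer to a pass-through since dash.js already exposes named events.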
-
-
Good point, we could also do it like this to make it easier to use the Media Stream Handler with existing applications. For that to work, the app could pass an instance of the Exoplayer to the Media Stream Handler? We should clarify that in our next call. Looking at the specs, the Media Stream Handler is essentially the media player. This is why the Exoplayer is incorporated in the Media Stream Handler in my architecture.
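The "pass in a player instance" idea could look roughly like this in plain Java. All names are hypothetical, and the `Player` interface merely stands in for Exoplayer's API; the real constructor would accept an `ExoPlayer` that the application already owns.

```java
// Hypothetical sketch: instead of constructing its own player, the Media
// Stream Handler accepts a player instance supplied by the application, so
// existing apps can keep their own player setup and UI integration.
interface Player {
    void setMediaItem(String url);
    void play();
    boolean isPlaying();
}

class MediaStreamHandler {
    private final Player player;

    // The application injects the player it already owns.
    MediaStreamHandler(Player player) {
        this.player = player;
    }

    // M7-style entry point delegating to the injected player.
    public void startPlayback(String mediaEntryUrl) {
        player.setMediaItem(mediaEntryUrl);
        player.play();
    }
}
```

This inverts the containment shown in the architecture diagram: the application owns the player, and the Media Stream Handler only adapts it.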
-
Good discussion, @jordijoangimenez and @dsilhavy. In designing the client software architecture, we should take a few things into consideration:
-
I'm unclear about the distinction between the two purple adapter boxes in the above figure.
The way I envisage things is that there is a more monolithic M7 server that exposes services to the 5GMS-Aware Application and to the Media Session Handler (as you correctly identified in the specifications) and which adapts the 3GPP-defined API to the chosen Media Player, using a different plug-in for each supported type. (This is somewhat analogous to the plug-in architecture @davidjwbbc used for the 5GMS Application Server so that it can be integrated with different web proxy servers.) The service users then invoke M7 methods on the adapter and can register callbacks to receive notifications.
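A minimal sketch of that plug-in arrangement, assuming one plug-in per supported player type (all names here are hypothetical illustrations, not anything from the specifications):

```java
import java.util.HashMap;
import java.util.Map;

// One plug-in per Media Player type (Exoplayer, dash.js in a WebView, ...).
interface MediaPlayerPlugin {
    String name();
    void play(String url);
}

// Hypothetical monolithic M7 front-end: service users call 3GPP-defined
// methods on it, and it delegates to whichever plug-in has been selected.
class M7Server {
    private final Map<String, MediaPlayerPlugin> plugins = new HashMap<>();
    private MediaPlayerPlugin active;

    public void register(MediaPlayerPlugin plugin) {
        plugins.put(plugin.name(), plugin);
    }

    public void selectPlayer(String name) {
        active = plugins.get(name);
        if (active == null) {
            throw new IllegalArgumentException("unknown player: " + name);
        }
    }

    // An M7 method invoked by the service users (app or Media Session Handler).
    public void play(String url) {
        active.play(url);
    }
}
```

Adding support for another player then means writing one new plug-in class, with no change to the M7-facing surface.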
-
I agree with @jordijoangimenez that, in practice, the M7 service aspects of the Media Stream Handler are going to be pretty tightly integrated with the 5GMS-Aware Application. But it might be clearer to illustrate these two as separate entities that happen to run inside the same app process. It's then a separate discussion about whether the 5GMS-Aware Application part of the app invokes the M7 API on the Media Stream Handler part of the app using local method calls or IPC.
-
Per my comment about the Media Session Handler design on 5G-MAG/rt-5gms-media-session-handler#1, I think it would be instructive to draw dotted lines on the software architecture diagram indicating Android process boundaries (and application packaging scope).
-
@rjb1000 @jordijoangimenez Thank you for the comments. I tried to clarify some aspects with a new version below.
-
Getting better, @dsilhavy. I think the main thing still missing is a second blob inside each of the dotted blue boxes marked "5GMS-Aware Application". This is the main code providing the user interface furniture and business logic that also uses/provides the M6 and M7 APIs, perhaps by instantiating the appropriate purple adapter classes inside itself.
By factoring it this way, both kinds of deployment can be supported from a single code repository. This would make the code more reusable by different flavours of application.
-
Thank you all for the comments in the last call. Based on our discussion, please find an updated diagram below.
-
Thanks for the update, @dsilhavy. This is getting clearer now. Some quick feedback: My suggestion last week was to show two separate red boxes in the app process:
With this arrangement, you can then show reference point M7d as the line (currently labelled "methods") between the 5GMS-Aware Application red box and the Media Player red box, making this implementation architecture compliant with the 5GMSd reference architecture. In this particular deployment, the two red boxes are peers inside the same app process.

The reason for depicting it this way is that in a different kind of deployment (which it would be useful to draw as a separate diagram), the 5GMS-Aware Application runs in one Android process (dotted blue line box) and the Media Player runs as an Android Foreground Service (in a different dotted blue line box). In this case, invoking the M7d methods requires Inter-Process Communication since it crosses a process boundary. And the Media Player is installed from a separate APK to the 5GMS-Aware Application.
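The difference between the two deployments can be sketched in plain Java: the application codes against a single M7d interface, which is realised either by a local in-process object or by a proxy that marshals the call across a process boundary. All names are hypothetical, and a simple queue stands in here for what would be Binder/AIDL IPC on Android.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// The M7d surface the 5GMS-Aware Application codes against, regardless of
// where the Media Player actually runs.
interface M7dClient {
    void startPlayback(String mediaEntryUrl);
}

// Deployment 1: Media Player in the same app process; a plain method call.
class LocalMediaPlayer implements M7dClient {
    String lastUrl;
    public void startPlayback(String mediaEntryUrl) {
        lastUrl = mediaEntryUrl;
    }
}

// Deployment 2: Media Player in another process; the proxy serialises the
// request onto a transport (stand-in for the Binder channel).
class M7dIpcProxy implements M7dClient {
    private final Queue<String> transport;
    M7dIpcProxy(Queue<String> transport) {
        this.transport = transport;
    }
    public void startPlayback(String mediaEntryUrl) {
        transport.add("startPlayback:" + mediaEntryUrl);
    }
}
```

Either way the application only sees `M7dClient`, which is what makes the two-red-box depiction work for both diagrams.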
-
@rjb1000 Thank you for the quick feedback; is this roughly what you had in mind for the first kind of deployment? What I had in mind with the previous architecture was a 5GMS-Aware Application that contains business logic and includes the 5GMS-adapted Media Player / Media Stream Handler as a dependency. That's why the smaller red box was included in the bigger red box. Then I do local calls from my 5GMS-Aware Application to an instance of it.
-
Yes, @dsilhavy. (I would attach the "methods" invocation arrow directly to the M7d client.) Of the two depictions, I prefer the one with two red boxes because the 5GMS-adapted Media Player isn't really included inside the 5GMS-Aware Application. To my mind it's not a containment relationship. Showing two red boxes will make drawing the alternative implementation architecture diagram simpler, and makes it clear that these are two separate pieces of software to be developed.
-
Updated version including dash.js.
-
In drawing the three different component configuration models, @dsilhavy, it would be useful to name them so that we can talk about them unambiguously. Here is my take:
-
Thanks @rjb1000, that sounds good. Based on your description I updated the illustrations; the three different versions are below:
-
I am converting this to a discussion as we have the initial implementation in place now. Our current implementation follows configuration option B.
-
This issue is supposed to be used to discuss and answer the following questions: