The presentation and recording module of JSR 272 controls the presentation and recording of time-based media types such as audio, video and animated content. It also covers container formats, such as MPEG-4 Systems and SMIL, that describe the synchronization and layout aspects of a presentation. The JSR 272 presentation and recording module references JSR 135 (MMAPI) and JSR 234 (AMMS), described below, for time-based media handling. Where needed, it adds clarifications to these APIs when they are used in the context of JSR 272. In addition, it extends these APIs in a well-defined manner with some additional media capabilities.
The rest of this chapter is divided into the following sections:

1 MMAPI
2 AMMS
3 Container Formats
4 Auxiliary Broadcast Objects
1 MMAPI

MMAPI provides a few key API concepts to handle time-based multimedia. A Player represents a media handler that provides the methods to control aspects of media presentation common to all time-based media types. Its capabilities include control of media progression: start, stop, setting the media time, and so on. Media-type-specific features such as audio, video and MIDI are defined as specialized Control interfaces that can be fetched from the Player. The following defines the specific behavior and implementation requirements for an MMAPI implementation in the context of JSR 272.
1.1 Player Creation
MMAPI Players provide the API to control the presentation of audio and video components. They are created as a result of tuning to a broadcast service with the Service Selection API (see ServiceContext). Creating Players directly with javax.microedition.media.Manager from URIs referencing the broadcast content is NOT supported. After a Player is created from the Service Selection API, it is in the Realized state, so Controls can be obtained from the Player immediately.
JSR 272 makes no assumption about how many Players will be created as a result of tuning to a broadcast. For example, it is conceivable that if the broadcast contains one audio and one video track, one Player will be created to play back the synchronized audio and video. However, if there is more than one video track, each with matching synchronized audio tracks, more than one Player may be created to allow flexible placement of the video components and individual control of each Player.
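A minimal sketch of this flow is shown below. The ServiceContext accessor and selection method names (getInstance, select, getPlayers) and the service URI are assumptions for illustration only; consult the Service Selection API chapter for the normative signatures.

    import javax.microedition.media.Player;
    import javax.microedition.media.control.VolumeControl;

    // Hypothetical sketch: the exact ServiceContext method names are assumptions.
    ServiceContext ctx = ServiceContext.getInstance();   // obtain a service context
    ctx.select("broadcast://example-service");           // tune; placeholder service URI
    Player[] players = ctx.getPlayers();                 // Players created by tuning
    // Each Player is already realized, so Controls can be fetched right away:
    VolumeControl vol = (VolumeControl) players[0].getControl("VolumeControl");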
1.2 Media Time
For broadcast content, JSR 272 loosely adopts the notion of "Normal Play Time" (the continuous timeline over the duration of the broadcast) as the semantics for the Player's media time. As such, the media time of the Player may not begin at 0, and the duration of the content may not be known. In the cases when the duration of the media is not known, Player.getDuration MUST return Player.TIME_UNKNOWN.
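For example, an application can guard its UI against unknown durations and non-zero start times using standard MMAPI calls (a sketch):

    long duration = player.getDuration();
    if (duration == Player.TIME_UNKNOWN) {
        // Open-ended broadcast: do not render a finite seek bar.
    }
    // Media time follows "Normal Play Time" and need not start at 0:
    long now = player.getMediaTime();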
1.3 Video Display and Graphics Overlays
When the system property "microedition.broadcast.supports.overlay" returns "true", graphics overlay MUST be supported as described below. JSR 272 supports overlaying graphics and GUI elements by having the video displayed at the base display plane of the graphics subsystem. Any GUI or graphics elements created by the GUI toolkit (e.g. LCDUI) must be displayed on a layer above the video.
For example, if a video Player is initialized in USE_DIRECT_VIDEO mode to be displayed on an LCDUI canvas (see the presentation example), any other graphics or GUI components created in the same area occupied by the video player must be rendered on top of the video.
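The following sketch shows the standard MMAPI initialization this refers to; the Canvas subclass and the layout values are illustrative:

    import javax.microedition.lcdui.Canvas;
    import javax.microedition.media.MediaException;
    import javax.microedition.media.control.VideoControl;

    // Inside a Canvas subclass, with a realized broadcast Player:
    VideoControl vc = (VideoControl) player.getControl("VideoControl");
    if (vc != null) {
        vc.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this); // 'this' is the Canvas
        try {
            vc.setDisplayLocation(0, 0);
            vc.setDisplaySize(getWidth(), getHeight());
        } catch (MediaException e) {
            // resizing not supported; fall back to the default size
        }
        vc.setVisible(true);
    }
    // Anything painted on this Canvas over the video region renders above it.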
Transparency and/or alpha compositing of a graphics component or GUI item can only be supported if there is a well-defined way to specify the overlay object's transparency and/or alpha value. For graphics components, if the graphics format itself supports transparency (e.g. GIF) or an alpha channel (e.g. PNG), then transparency or alpha compositing MUST be supported if the system supports overlay. Note, however, that for GUI items, MIDP 1 and MIDP 2 do not support specification of transparency or alpha values.
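As an illustration, a PNG with an alpha channel drawn in the Canvas paint() method would then composite over the video; the resource name below is a placeholder:

    import java.io.IOException;
    import javax.microedition.lcdui.Graphics;
    import javax.microedition.lcdui.Image;

    // In the same Canvas subclass; "/overlay.png" is a hypothetical resource
    // whose alpha channel drives the compositing over the video plane.
    private Image overlay;

    protected void paint(Graphics g) {
        try {
            if (overlay == null) {
                overlay = Image.createImage("/overlay.png");
            }
            g.drawImage(overlay, 0, 0, Graphics.TOP | Graphics.LEFT);
        } catch (IOException e) {
            // overlay resource missing; the video shows without the overlay
        }
    }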
1.4 Recording
Real-time recording of media content is supported in MMAPI with javax.microedition.media.control.RecordControl. If an implementation supports recording and the application has permission to record the content, a RecordControl can be fetched from the Player.

Recordings can also be scheduled ahead of time. This is handled by RecordingScheduler and its associated classes.
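A sketch of the basic recording flow with RecordControl follows; the destination URL is a placeholder:

    import java.io.IOException;
    import javax.microedition.media.MediaException;
    import javax.microedition.media.control.RecordControl;

    RecordControl rc = (RecordControl) player.getControl("RecordControl");
    if (rc != null) { // null if recording is unsupported or not permitted
        try {
            rc.setRecordLocation("file:///root1/clip.3gp"); // hypothetical destination
            rc.startRecord();
            // ... record for some time ...
            rc.stopRecord();
            rc.commit(); // finalize the recording
        } catch (IOException e) {
            // destination not writable
        } catch (MediaException e) {
            // recording not supported for this content
        }
    }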
Recorded content can be played back using MMAPI by creating a Player from the URL of the recorded content, as long as the application has permission (DRM rights) to play the content. If the application does not have playback permission for the content, playback will fail according to the MMAPI specifications.

Recorded content may or may not be superdistributed (shared, exported to other applications or devices) as specified by the rights objects associated with the recorded content. For recordings that are allowed to be superdistributed, an application can read the raw recorded content using the FileConnection API from the Generic Connection Framework. For recordings that are not allowed to be superdistributed, access to the raw data MUST fail according to the specifications of the FileConnection API.

See the DRM section for more discussion of DRM-related issues.
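A sketch of reading a superdistributable recording with the JSR 75 FileConnection API; the file URL is a placeholder:

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.file.FileConnection;

    try {
        // Placeholder URL; for protected recordings, opening the connection or
        // the stream MUST fail per the FileConnection specification
        // (e.g. with an IOException or SecurityException).
        FileConnection fc = (FileConnection)
                Connector.open("file:///root1/clip.3gp", Connector.READ);
        InputStream in = fc.openInputStream();
        // ... read or export the raw recorded bytes ...
        in.close();
        fc.close();
    } catch (IOException e) {
        // no access or I/O failure
    }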
1.5 Mandatory Controls
The following table outlines the MMAPI controls that are mandatory for a JSR 272 implementation:
MMAPI Controls    Implementation Requirements
ToneControl       Mandatory (per MMAPI requirements)
VolumeControl     Mandatory
VideoControl      Mandatory if the device supports video playback
RecordControl     Mandatory if the device supports recording
RateControl       Mandatory if the device supports time-shifted playback
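Since several of these controls are conditional on device capabilities, an application should probe for them at run time (standard MMAPI usage):

    import javax.microedition.media.control.RateControl;
    import javax.microedition.media.control.RecordControl;

    // A null return means the capability is absent for this device or content.
    RecordControl rec  = (RecordControl) player.getControl("RecordControl");
    RateControl   rate = (RateControl)   player.getControl("RateControl");
    boolean canTimeShift = (rate != null);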
Content that is locally stored or is transmitted point-to-point via a client-server control protocol (e.g. RTSP) allows maximum control of the player's media time. For example, setting or changing the player's media time can function properly over the full duration of the media.

Content that is strictly broadcast without local caching limits the level of control that can be applied to the player's media time, so setting or changing the playback media time will not function.
Some implementations may provide some amount of media caching in real time during playback. In this case, the implementation can provide more flexible control of the player's media time even for broadcast content. This is known as time-shifted playback or time-shifting. JSR 272 provides an extension, TimeShiftControl, to support time-shifting. Time-shifting allows the media time to be changed and the playback rate to be set, up to the extent of the duration of the cached content.
The following table outlines the limitations for the time-related controls with respect to both pure broadcast and time-shifted playback:
Player.setMediaTime
    Broadcast playback (TimeShiftControl not supported): Setting the media time is not supported. A MediaException will be thrown.
    Time-shifted playback (TimeShiftControl enabled): Setting the media time is supported, to any position within the cached content.
    Time-shifted playback (TimeShiftControl not enabled): The media time is not changed. The method returns the current media time.

Player.setLoopCount
    Broadcast playback: Setting the loop count on broadcast content will succeed. However, since broadcast content practically may not end and will not generate an end-of-media event, the Player may never loop back.
    Time-shifted playback (enabled or not): The same limitation applies.

Player.setTimeBase
    Broadcast playback: Setting the time base on broadcast content is not supported. A MediaException will be thrown.
    Time-shifted playback (enabled or not): The same limitation applies.

RateControl
    Broadcast playback: Setting the playback rate is not supported. The Player MUST NOT return a RateControl for broadcast content.
    Time-shifted playback (TimeShiftControl enabled): The Player MUST return a RateControl, and the playback rate can be changed. After a new rate is set, the content is played back at the new rate up to the limits of the cached content. If the playback rate is less than 100000 (slower than real time), playback can be sustained until the cached buffer is filled. If the playback rate is greater than 100000 (faster than real time), playback can be sustained until the cached buffer is emptied. When playback reaches a cached buffer boundary, the rate is set to 100000 by the implementation and a PLAYBACK_REACHED_TIMESHIFT_BUFFER_LIMIT event is posted to the PlayerListener.
    Time-shifted playback (TimeShiftControl not enabled): The Player MUST return a RateControl, but RateControl.getMaxRate and getMinRate return 100000.

FramePositioningControl
    Broadcast playback: Frame positioning is not supported. The Player MUST NOT return a FramePositioningControl for broadcast content.
    Time-shifted playback (TimeShiftControl enabled): The Player MAY return a FramePositioningControl. The application can seek or skip to a frame as long as the given frame lies within the duration of the cached content.
    Time-shifted playback (TimeShiftControl not enabled): The Player MAY return a FramePositioningControl; seek() returns the current frame and skip() returns 0. The mapping methods can be used normally.
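As a sketch of how an application might drive time-shifted trick play: the 2x rate value is illustrative, and the string form of the buffer-limit event is an assumption here (the defining constant is a JSR 272 extension; consult the TimeShiftControl documentation):

    import javax.microedition.media.Player;
    import javax.microedition.media.PlayerListener;
    import javax.microedition.media.control.RateControl;

    RateControl rate = (RateControl) player.getControl("RateControl");
    if (rate != null && rate.getMaxRate() > 100000) {
        rate.setRate(200000); // 2x: catch up through the cached buffer
    }
    player.addPlayerListener(new PlayerListener() {
        public void playerUpdate(Player p, String event, Object data) {
            // Event name as described above; its defining class is assumed.
            if ("PLAYBACK_REACHED_TIMESHIFT_BUFFER_LIMIT".equals(event)) {
                // The implementation has already reset the rate to 100000 (real time).
            }
        }
    });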
2 AMMS

AMMS builds on top of MMAPI by adding an extensive set of Control interfaces for advanced multimedia capabilities. Support for AMMS is not mandated for JSR 272. However, if present, AMMS provides additional advanced media controls. These controls MAY include:
JSR 234 Controls and their purposes:

AudioVirtualizerControl: For example, for choosing the audio rendering algorithm when a multichannel audio feed is listened to without a multichannel speaker system.

EqualizerControl: For setting the audio equalization.

ImageTonalityControl: For adjusting the brightness, contrast and gamma of the video.

ImageTransformControl: For cropping, zooming, flipping, stretching and rotating the video.

OverlayControl: For overlaying and compositing images (not GUI objects) on top of video. (When MIDP 3 becomes available, it may support compositing GUI objects, including images, on video. For MIDP 2 and below, OverlayControl provides the flexibility to overlay and composite static images onto video.)

AudioFormatControl, VideoFormatControl, ContainerFormatControl: These controls allow the media formats to be queried and set in AMMS. However, when used in the context of JSR 272 Players, only querying of formats is required to be supported. The following methods MUST be supported:
getEstimatedBitRate
getFormat
getIntParameterValue
getStrParameterValue
getSupportedIntParameters
getSupportedMetadataKeys
getSupportedStrParameters
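For example, an application might query (but not set) the video format of a JSR 272 Player through the AMMS control, assuming AMMS is present; the fully qualified control name below follows the JSR 234 package layout:

    import javax.microedition.amms.control.VideoFormatControl;

    VideoFormatControl vfc = (VideoFormatControl)
            player.getControl("javax.microedition.amms.control.VideoFormatControl");
    if (vfc != null) {
        String format = vfc.getFormat();                      // current video format
        String[] intParams = vfc.getSupportedIntParameters(); // queryable parameters
        String[] metaKeys = vfc.getSupportedMetadataKeys();
    }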
3 Container Formats

Some broadcasts may carry, in addition to the elementary media streams, descriptions of the layout and/or scripted interactive behaviors. These might be achieved with container formats such as MPEG-4 Systems, SMIL, SVG and LASeR, to name a few.
A presentation described in these container formats can still be represented and controlled as a Player. In these cases, the Player's time-based control methods control the overall time base of the synchronized presentation; i.e., starting or stopping the Player will start or stop the overall synchronized presentation.

Some container formats may also offer finer-grained programmatic control to override or dynamically change the layout or synchronization aspects of the presentation at the level of individual elementary components. An example of that is SVG, where individual elements such as audio, video or any other graphical elements can be manipulated via the DOM. The API to control these elements, in the case of SVG, is provided by JSR 226, which is specialized for SVG content.
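An illustration of such per-element control via JSR 226 follows; how the SVG document stream is delivered to the application is implementation-specific, and the element id is a placeholder:

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.m2g.SVGImage;
    import org.w3c.dom.Document;
    import org.w3c.dom.svg.SVGElement;

    try {
        // 'svgStream' is assumed to carry the broadcast SVG document.
        SVGImage svg = (SVGImage) SVGImage.createImage(svgStream, null);
        Document doc = svg.getDocument();
        SVGElement clip = (SVGElement) doc.getElementById("video1"); // placeholder id
        clip.setTrait("visibility", "hidden"); // hide one component of the presentation
    } catch (IOException e) {
        // SVG document could not be loaded
    }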
4 Auxiliary Broadcast Objects

Each broadcast may also contain non-time-based media that can be presented. Examples are static images, interactive animated content, and markup languages like XHTML. These contents are not controlled or presented as a Player. Rather, they are treated as auxiliary broadcast objects and can be accessed as raw data via the Broadcast Object API (see BroadcastConnection). Other APIs and JSRs can be used to present them as appropriate.
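A sketch of fetching such an object as raw data through the Generic Connection Framework; the URL scheme and the cast to ContentConnection are assumptions here, and the normative access path is the Broadcast Object API (BroadcastConnection):

    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.ContentConnection;

    try {
        // Placeholder URL; the actual scheme is defined by the Broadcast Object API.
        ContentConnection c = (ContentConnection)
                Connector.open("broadcast://example/object");
        InputStream in = c.openInputStream();
        String mime = c.getType(); // hand off to an appropriate renderer (image, XHTML, ...)
        // ... read the raw object data ...
        in.close();
        c.close();
    } catch (IOException e) {
        // object unavailable
    }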