Graphics Composition

Composition is the process of putting together the output elements from different sources to create the screen display that the end user sees on the device.

In a multi-tasking device, many of the activities taking place simultaneously generate output for display on the screen. The output can include words, pictures, video, games and the screen furniture (scroll bars, buttons, icons, borders, tabs, menus, title bars) familiar to every computer user.

Many of these output elements can appear at the same time, either next to each other or overlapping each other. They can be opaque, such that they obscure anything behind, or semi-transparent, such that the elements underneath are partially visible.

The diagram below illustrates how the display that the viewer sees (looking down from the top) is a two-dimensional representation composed from a series of layers or scene elements.

Figure: The display is an orthogonal view of a series of layers.

In ScreenPlay, sources that generate complex graphical output render directly to surfaces, which are pixel buffers for holding an image or part of a scene. The Window Server delegates the composition of surfaces to the composition engine, which device creators can adapt to take advantage of graphics processing hardware.

The Window Server keeps track of the position, size, visibility, transparency and z-order of all the application windows and maps this information to scene elements that are ultimately passed to the composition engine.

Composition involves:

  • Calculations based on the size, position, visibility, transparency and ordering of the scene elements to determine the structure of the scene that will be displayed. This is a logic exercise.

  • Handling of the scene content, which is defined by a set of surfaces that contain the pixel data to be displayed. The surfaces are bound to the scene elements that define the scene structure. This is a data processing exercise.

While a scene element is a simple, lightweight object that is easy to manipulate, a surface stores a large amount of data, so its handling requires more consideration.
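To make the distinction concrete, the following sketch shows roughly what each kind of object holds. The type and field names are purely illustrative and are not ScreenPlay APIs.

```cpp
#include <e32std.h>   // TRect, TSize and the basic integer types

// Illustrative only: these are not ScreenPlay classes.
// A scene element is a lightweight description of where and how a piece
// of content appears in the scene.
struct TSceneElementSketch
    {
    TRect  iExtent;        // position and size on the display
    TInt   iZOrder;        // front-to-back ordering
    TBool  iVisible;       // whether the element is currently shown
    TUint8 iAlpha;         // transparency (0 = invisible, 255 = opaque)
    TInt   iSurfaceHandle; // reference to the surface that supplies the pixels
    };

// A surface, by contrast, owns the pixel buffer itself. A full-screen
// 32 bits-per-pixel buffer can run to several megabytes, so copying or
// moving it is far more expensive than manipulating a scene element.
struct TSurfaceSketch
    {
    TSize    iSize;   // dimensions in pixels
    TInt     iStride; // bytes per scan line
    TUint32* iPixels; // the (large) pixel buffer
    };
```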

The diagram below is a simplistic representation of how applications create output that is rendered, composited and displayed.

Figure: Graphics composition

In a device with graphics acceleration hardware (a Graphics Processing Unit or GPU), there may be memory managed by the GPU in addition to the memory managed by the CPU. Image data can therefore be considered to have been software rendered (onto a surface in CPU memory) or hardware rendered (onto a surface in hardware-accelerated GPU memory).

Composition in Symbian^3

In Symbian^3, the UI is rendered onto a single surface (called the UI surface) before being composited with any background surfaces. The UI surface is displayed on a layer placed in front of all of the others. The UI surface is created for the Window Server during system startup and is then passed to the composition engine as a special-case surface for composition.

Because surfaces are composited according to their origin, the physical composition process behaves differently from the logical composition process, which is based on what the user and the UI are doing. Logically the windows in the UI and the memory-rendered surfaces may be on interleaved layers, yet the memory-rendered surfaces are physically composed behind the UI surface.

Surfaces are associated with windows in the Window Server using the RWindowBase::SetBackgroundSurface() method. These surfaces are then called external surfaces. The Window Server is able to include external surfaces in its logical composition and make provision for them during data composition. A window with its background set to an external surface in this way becomes transparent in the UI surface.
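For example, a client that has obtained a surface ID from some content producer (such as a video decoder) might bind it to a window along the following lines. This is a minimal sketch; the header locations and error handling are assumptions, and only the RWindowBase::SetBackgroundSurface() call itself is taken from the text above.

```cpp
#include <e32std.h>
#include <w32std.h>            // RWindow, RWindowBase
#include <graphics/surface.h>  // TSurfaceId (assumed header location)

// Minimal sketch: make an externally rendered surface the background of a
// window, so that the Window Server treats it as an external surface and
// allows for it during composition.
// aWindow is an already constructed and activated window; aSurfaceId
// identifies a surface produced elsewhere, for example by a video decoder.
void AttachExternalSurfaceL(RWindow& aWindow, const TSurfaceId& aSurfaceId)
    {
    // After this call the window is rendered as transparent in the UI
    // surface, so the external surface shows through at the window's
    // position on the display.
    User::LeaveIfError(aWindow.SetBackgroundSurface(aSurfaceId));
    }
```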

In most cases this difference between logical and physical composition is not apparent to the viewer: surfaces that are physically composed behind the UI still appear in the correct position on the two-dimensional display. The diagram below illustrates how Window Server-rendered UI content and external surfaces are composited using the UI surface, with applications and other graphical data sources rendering to surfaces in both software and hardware.

Figure: Hardware composition and the flattened UI

Here is a second version of the diagram at the top of the page, showing how the same composition might be achieved in practice. The UI menus, windows and dialogs are composited by the Window Server onto the single UI surface. The light green layer displays a hardware-rendered surface, so it is actually composed behind the layer on which it appears.

Figure: The UI surface

The next figure illustrates the use of hardware accelerated surfaces and the UI surface in Symbian^3.

Figure: Video rendered to a surface mapped to a layer behind the UI surface

Although this method of composition is flexible and powerful, it has limitations: it does not support semi-transparent hardware-accelerated surfaces or a semi-transparent surface in front of the UI surface.

Composition in Symbian^4

Symbian^4 has a new application and UI framework that is based on Qt and Orbit. This new framework represents a major move away from using the Window Server to render the UI. Instead, the application developer creates the UI using Qt and Orbit, and the Qt framework uses OpenVG or OpenGL ES to render the UI directly to an EGL window surface. This is an external surface that is bound to the application's window using RWindowBase::SetBackgroundSurface(), just as in Symbian^3. However, unlike in Symbian^3, the external surface can be transparent or semi-transparent.
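The binding that the framework performs can be pictured with a simplified EGL sketch. This is not what Qt actually does internally; the choice of RWindow as the native window type, the config attributes and the omitted error handling are all assumptions made for illustration.

```cpp
#include <EGL/egl.h>
#include <w32std.h>   // RWindow

// Simplified sketch: create an EGL window surface on top of a Window
// Server window. In Symbian^4 the Qt framework performs an equivalent
// binding internally and then renders the UI with OpenVG or OpenGL ES.
EGLSurface CreateUiWindowSurface(RWindow& aWindow)
    {
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, NULL, NULL);

    // Ask for a config with an alpha channel so that the resulting
    // surface can be transparent or semi-transparent when composited.
    const EGLint attribs[] =
        {
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_ALPHA_SIZE, 8,
        EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
        EGL_NONE
        };
    EGLConfig config = 0;
    EGLint numConfigs = 0;
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);

    // On Symbian the EGL native window type is the Window Server window.
    return eglCreateWindowSurface(display, config,
                                  (EGLNativeWindowType)&aWindow, NULL);
    }
```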

This support for transparency enables semi-transparent Qt applications to appear over an opaque background surface that is provided by a Qt system application and shared among all applications. The framework also provides support for semi-transparent Qt system dialogs to appear over other Qt applications, as shown in the next figure.

Figure: Video, Qt application and Qt system dialog
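As a rough illustration of the application-side view, a Qt dialog can request a translucent background using standard Qt attributes. This is a minimal sketch, not code from the Orbit framework, and whether full window translucency is honoured depends on the platform's composition support.

```cpp
#include <QApplication>
#include <QDialog>
#include <QLabel>
#include <QVBoxLayout>

// Minimal sketch: a dialog whose background is semi-transparent, so that
// content composited behind it remains partially visible.
int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QDialog dialog;
    // Request a translucent window background; the composition engine
    // blends whatever lies behind the dialog into the visible result.
    dialog.setAttribute(Qt::WA_TranslucentBackground);
    // Paint a 50% opaque dark background instead of the default opaque one.
    dialog.setStyleSheet("background-color: rgba(0, 0, 0, 128);");

    QVBoxLayout *layout = new QVBoxLayout(&dialog);
    layout->addWidget(new QLabel("System dialog"));

    dialog.show();
    return app.exec();
}
```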

The UI surface is no longer always the topmost layer and external surfaces can appear in front of it. However, the UI surface is only present if there are legacy applications rendering using AVKON and the Window Server's CWindowGc API.
