Touch UI

From Series 40 6th Edition FP 1 and S60 5th Edition onwards, the Series 40 and Symbian platforms support touch interaction on mobile devices with a touch screen. The touch screen is sensitive to the user's finger and device stylus, thus replacing or complementing the physical keys of the device as the main means of interaction. The touch UI allows the user to directly manipulate objects on the screen, enabling a more natural interaction with the device.

Touch devices enable a variety of design possibilities that cannot be implemented in traditional key-based mobile applications. If a device has a physical keypad in addition to a touch screen, both can be used to interact with applications.

On Series 40 and Symbian devices, touch interaction is supported by all LCDUI components. On Symbian devices, touch interaction is also supported by all eSWT components. You can thus create touch UIs using the same UI components as when creating traditional key-based UIs.

High-level LCDUI components and eSWT components use the predefined touch implementation provided by the device, so you do not need to separately program touch interaction for them. Low-level LCDUI components, however, do not automatically implement touch functionality, so you need to separately listen for and handle touch events for these components. In other words, MIDlets that use only high-level LCDUI components or eSWT components work on touch devices automatically and do not need to separately handle touch interaction, whereas MIDlets that use low-level LCDUI components need specialized methods to handle touch events and implement touch functionality.
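For example, a Canvas-based MIDlet receives touch input through the pointer event callbacks defined by the Canvas class. The following minimal sketch (the class and field names are illustrative, not part of the platform API) checks for pointer support and tracks the latest touch position:

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Minimal sketch of handling touch events on a low-level LCDUI Canvas.
// The class and field names are illustrative, not part of the platform API.
public class TouchCanvas extends Canvas {

    private final boolean touchSupported;
    private int lastX = -1;
    private int lastY = -1;

    public TouchCanvas() {
        // hasPointerEvents() reports whether the device delivers pointer
        // (touch) events to this Canvas; key handling can serve as a
        // fallback on non-touch devices.
        touchSupported = hasPointerEvents();
    }

    // Called when the user touches the screen (touch down).
    protected void pointerPressed(int x, int y) {
        lastX = x;
        lastY = y;
        repaint();
    }

    // Called while the finger or stylus slides over the screen (drag).
    protected void pointerDragged(int x, int y) {
        lastX = x;
        lastY = y;
        repaint();
    }

    // Called when the user lifts the finger or stylus (touch release).
    protected void pointerReleased(int x, int y) {
        repaint();
    }

    protected void paint(Graphics g) {
        g.setColor(0xFFFFFF);
        g.fillRect(0, 0, getWidth(), getHeight());
        g.setColor(0x000000);
        if (touchSupported && lastX >= 0) {
            g.drawString("Touch at " + lastX + "," + lastY, 2, 2,
                         Graphics.TOP | Graphics.LEFT);
        }
    }
}

The pointerPressed, pointerDragged, and pointerReleased callbacks correspond to the touch down, drag, and touch release events described in the Basic touch actions section below.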

To find out which devices have a touch screen, see the Forum Nokia Device Specifications and filter devices based on "Touch Screen".

Basic touch actions

The following table describes the basic touch actions and the corresponding touch events or event combinations registered by the touch UI.

Table: Basic touch actions

Action: Touch
Description: The user presses the finger or stylus against the screen.
Touch events: touch down

Action: Release
Description: The user lifts the pressed finger or stylus from the screen.
Touch events: touch release

Action: Tap
Description: The user presses the finger or stylus against the screen for a brief moment and then lifts it from the screen.
Touch events: touch down + touch release ("touch down and release")

Action: Long tap
Description: The user presses the finger or stylus against the screen and holds it there for a set amount of time. The time-out value depends on the platform. Depending on the object that is long-tapped, this action can also constitute a key repeat.
Touch events: touch down and hold

Action: Drag, drag and drop
Description: The user presses the finger or stylus against the screen and then slides it over the screen. This action can be used for scrolling or swiping content, including the whole screen, and for dragging and dropping objects. In drag and drop, the user "grabs" an object on the screen by touching it, drags it to a different location on the screen, and then "drops" it by releasing the touch.
Note: High-level UI components do not support drag and drop. You must implement drag and drop separately by using low-level UI components.
On Symbian devices, there is by default a safety area of a few millimeters around the initial press point from which drag events are discarded for half a second. This avoids unnecessary drag events on short precision taps. For more information about this safety area, see section Tap detection.
Touch events: drag: touch down + drag; drag and drop: touch down + drag + touch down ("stop") + touch release

Action: Flick
Description: The user presses the finger or stylus against the screen, slides it over the screen, and then quickly lifts it from the screen in mid-slide. The user can also slide the finger or stylus off the screen. The content continues scrolling with the appropriate momentum before finally stopping.
Touch events: touch down + drag + touch release while dragging
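To illustrate how these event combinations map to actions in a MIDlet, the sketch below distinguishes a tap, a long tap, and a drag on a low-level Canvas. The time-out and movement thresholds are assumed example values, not the platform values:

import java.util.Timer;
import java.util.TimerTask;
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Illustrative sketch: mapping touch down, drag, and touch release
// combinations to tap, long tap, and drag on a low-level Canvas.
// LONG_TAP_TIMEOUT and DRAG_THRESHOLD are assumed example values;
// real devices use platform-specific values.
public class GestureCanvas extends Canvas {

    private static final long LONG_TAP_TIMEOUT = 800; // ms, assumed value
    private static final int DRAG_THRESHOLD = 10;     // pixels, assumed value

    private final Timer timer = new Timer();
    private TimerTask longTapTask;
    private int startX;
    private int startY;
    private boolean dragging;
    private boolean longTapFired;

    protected void pointerPressed(int x, int y) {
        startX = x;
        startY = y;
        dragging = false;
        longTapFired = false;
        // Schedule a long tap unless the touch is released or dragged first.
        longTapTask = new TimerTask() {
            public void run() {
                longTapFired = true;
                onLongTap(startX, startY);
            }
        };
        timer.schedule(longTapTask, LONG_TAP_TIMEOUT);
    }

    protected void pointerDragged(int x, int y) {
        if (Math.abs(x - startX) > DRAG_THRESHOLD
                || Math.abs(y - startY) > DRAG_THRESHOLD) {
            dragging = true;
            cancelLongTap();
            onDrag(x, y);
        }
    }

    protected void pointerReleased(int x, int y) {
        cancelLongTap();
        if (!dragging && !longTapFired) {
            onTap(x, y); // touch down + touch release without movement
        }
    }

    private void cancelLongTap() {
        if (longTapTask != null) {
            longTapTask.cancel();
            longTapTask = null;
        }
    }

    // Application-specific reactions; empty here for brevity.
    private void onTap(int x, int y) { }
    private void onLongTap(int x, int y) { }
    private void onDrag(int x, int y) { }

    protected void paint(Graphics g) {
        g.setColor(0xFFFFFF);
        g.fillRect(0, 0, getWidth(), getHeight());
    }
}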

Moving content on the screen

Touch devices allow users to scroll content without using the scrollbar. The content can be scrolled by directly dragging or flicking the content. If the content is flicked, the device applies kinetic momentum to the scroll effect: the content continues scrolling in the direction of the flick with decreasing speed, as though it has physical mass.

In MIDlets, Forms and Lists support direct content scrolling automatically. For low-level LCDUI components, direct content scrolling and any scroll effects must be implemented separately.
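One possible way to approximate drag and flick scrolling on a Canvas is sketched below: the content follows the finger while dragging, and if the touch is released in mid-drag, a timer keeps scrolling the content with decaying velocity. The content size, friction factor, and timer period are assumed example values:

import java.util.Timer;
import java.util.TimerTask;
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Illustrative sketch of drag and flick scrolling on a low-level Canvas.
// CONTENT_HEIGHT, the friction factor, and TIMER_PERIOD are assumed values.
public class ScrollingCanvas extends Canvas {

    private static final int CONTENT_HEIGHT = 2000; // assumed content size in pixels
    private static final int TIMER_PERIOD = 50;     // ms between animation steps

    private final Timer timer = new Timer();
    private TimerTask flickTask;
    private int scrollY;   // current scroll offset
    private int lastY;     // last pointer y position
    private int velocity;  // pixels scrolled per animation step

    protected void pointerPressed(int x, int y) {
        stopFlick();       // touching the screen stops any ongoing flick
        lastY = y;
        velocity = 0;
    }

    protected void pointerDragged(int x, int y) {
        int delta = y - lastY;
        lastY = y;
        velocity = delta;  // remember the most recent movement
        scrollBy(delta);
    }

    protected void pointerReleased(int x, int y) {
        // Released while still moving: continue scrolling with momentum.
        if (Math.abs(velocity) > 2) {
            flickTask = new TimerTask() {
                public void run() {
                    scrollBy(velocity);
                    velocity = (velocity * 9) / 10; // simple friction
                    if (Math.abs(velocity) <= 1) {
                        stopFlick();
                    }
                }
            };
            timer.schedule(flickTask, TIMER_PERIOD, TIMER_PERIOD);
        }
    }

    private void scrollBy(int delta) {
        scrollY -= delta;
        if (scrollY < 0) {
            scrollY = 0;
        }
        int max = CONTENT_HEIGHT - getHeight();
        if (scrollY > max) {
            scrollY = max;
        }
        repaint();
    }

    private void stopFlick() {
        if (flickTask != null) {
            flickTask.cancel();
            flickTask = null;
        }
    }

    protected void paint(Graphics g) {
        g.setColor(0xFFFFFF);
        g.fillRect(0, 0, getWidth(), getHeight());
        g.setColor(0x000000);
        // Draw simple numbered lines shifted by the current scroll offset.
        for (int line = 0; line * 20 < CONTENT_HEIGHT; line++) {
            g.drawString("Line " + line, 2, line * 20 - scrollY,
                         Graphics.TOP | Graphics.LEFT);
        }
    }
}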

The following table shows from which platform release onwards the different scroll effects are supported.

Table: Supported scroll effects when moving content on the screen

Scroll effect: Drag
Supported since (Series 40): Series 40 6th Edition FP 1
Supported since (Symbian): S60 5th Edition with Java Runtime 1.4
Note: On S60 5th Edition devices, content can be scrolled by dragging the focus on the screen.

Scroll effect: Flick
Supported since (Series 40): Series 40 6th Edition FP 1
Supported since (Symbian): S60 5th Edition with Java Runtime 1.4
Note: Early S60 5th Edition devices do not support this effect.

Multipoint touch

Symbian^3 devices with JRT 2.2 support multipoint touch events with up to two touch points on the LCDUI Canvas and GameCanvas. For more information, see Multipoint touch.
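As a rough sketch only: the example below assumes the com.nokia.mid.ui.multipointtouch package with a MultipointTouch class and a MultipointTouchListener callback, so verify the exact class and method names against the Multipoint touch section before relying on them.

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;
import com.nokia.mid.ui.multipointtouch.MultipointTouch;
import com.nokia.mid.ui.multipointtouch.MultipointTouchListener;

// Sketch only: assumes the MultipointTouch API shape described above.
public class MultiTouchCanvas extends Canvas implements MultipointTouchListener {

    public MultiTouchCanvas() {
        // Register for multipoint touch events instead of the single-point
        // pointerPressed/pointerDragged/pointerReleased callbacks.
        MultipointTouch touch = MultipointTouch.getInstance();
        touch.addMultipointTouchListener(this);
    }

    // Called when the state of one or more touch points changes.
    public void pointersChanged(int[] pointerIds) {
        for (int i = 0; i < pointerIds.length; i++) {
            int id = pointerIds[i];
            int state = MultipointTouch.getState(id);
            int x = MultipointTouch.getX(id);
            int y = MultipointTouch.getY(id);
            if (state == MultipointTouch.POINTER_PRESSED
                    || state == MultipointTouch.POINTER_DRAGGED) {
                // Track the touch point here, for example to detect a
                // two-finger gesture such as a pinch.
            }
        }
        repaint();
    }

    protected void paint(Graphics g) {
        g.setColor(0xFFFFFF);
        g.fillRect(0, 0, getWidth(), getHeight());
    }
}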

Text input

On touch-only devices, where no physical keyboard is available to the user, tapping a supported editor opens split view input.

The following Java editors support split view input:

The following Java editors support full screen input:

For more information on text input, see:

More information

For more information about touch interaction in the Series 40 and Symbian platforms, see: