Interface Summary | Description
---|---
GestureEvent | The GestureEvent interface class is used by an application to receive gesture recognition events from the platform. |
GestureListener | This interface is used by applications which need to receive gesture events from the implementation. |
Class Summary | Description
---|---
GestureInteractiveZone | The GestureInteractiveZone class is used by an application to define an area of the screen that reacts to a set of specified gestures. |
GestureRegistrationManager | The GestureRegistrationManager class provides the ability to register a GestureListener to be notified when a gesture event occurs within a container. |
The Gesture API provides support for a number of platform-defined gestures: single tap, long press, long press repeated, drag and drop, and flick.
- Single tap: recognised by a quick touch down and release.
- Long press: a touch and hold.
- Long press repeated: generated repeatedly while a long press is held down.
- Drag and drop: a touch down, moving the finger while keeping contact with the touch screen, stopping, and then releasing.
- Flick: a touch down, a move, and a release before the finger movement stops.
The Nokia Gesture API makes the platform's gesture recognition engine available to the MIDlet. This simplifies MIDlet development because the MIDlet does not need to implement its own gesture recognition engine, and it helps to ensure that the MIDlet's user experience matches that of the native Series 40 Touch and Type UI platform.
The Gesture API uses the Observer design pattern. To use this API, a MIDlet must first create a GestureInteractiveZone, which defines a bounding rectangle for gesture event notifications; by default the bounding rectangle is the entire screen. Only gesture events that are initiated within the confines of the zone are passed to the MIDlet. The GestureInteractiveZone also defines the types of gesture events to register for.
// Defines a GestureInteractiveZone for the whole screen and all Gesture types.
GestureInteractiveZone giz = new GestureInteractiveZone( GestureInteractiveZone.GESTURE_ALL );
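A zone does not have to cover the whole screen or all gestures. As a sketch, assuming the individual gesture-type constants and the setRectangle method of GestureInteractiveZone in the Nokia SDK, a zone can be limited to specific gestures and to a sub-area of the screen:
// Sketch (assumed API): a 100 x 60 pixel zone that only reacts to tap and flick gestures.
GestureInteractiveZone buttonZone = new GestureInteractiveZone(
        GestureInteractiveZone.GESTURE_TAP | GestureInteractiveZone.GESTURE_FLICK );
buttonZone.setRectangle( 20, 40, 100, 60 ); // x, y, width, height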
This zone is then registered with the GestureRegistrationManager by passing in the container (either a Canvas or CustomItem) and the GestureInteractiveZone.
// Register the GestureInteractiveZone for my Canvas.
GestureRegistrationManager.register( canvas, giz );
The MIDlet must then define a class that implements the GestureListener interface and set it as the listener for a container via the GestureRegistrationManager, passing in the container (either a Canvas or a CustomItem) and the GestureListener. Each container can have only one listener associated with it.
// Set the gestureListener for my Canvas.
GestureRegistrationManager.setListener(canvas, gestureListener);
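Putting the registration and listener steps together, a common pattern is for the Canvas itself to implement GestureListener and register in its constructor. The following is a minimal sketch, assuming the API lives in the com.nokia.mid.ui.gestures package:
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;
import com.nokia.mid.ui.gestures.GestureEvent;
import com.nokia.mid.ui.gestures.GestureInteractiveZone;
import com.nokia.mid.ui.gestures.GestureListener;
import com.nokia.mid.ui.gestures.GestureRegistrationManager;

// Sketch: a Canvas that registers a full-screen zone and listens for its own gesture events.
public class GestureCanvas extends Canvas implements GestureListener {

    public GestureCanvas() {
        GestureInteractiveZone giz =
                new GestureInteractiveZone( GestureInteractiveZone.GESTURE_ALL );
        GestureRegistrationManager.register( this, giz );
        GestureRegistrationManager.setListener( this, this );
    }

    public void gestureAction( Object container,
                               GestureInteractiveZone gestureZone,
                               GestureEvent gestureEvent ) {
        // See the expanded example at the end of this description.
        repaint();
    }

    protected void paint( Graphics g ) {
        // Drawing code omitted from this sketch.
    }
}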
The GestureListener interface defines a single method, gestureAction, which is called when the platform's gesture recognition engine detects a gesture in one of the registered GestureInteractiveZones. Each time it is called, gestureAction receives a GestureEvent instance that holds the properties of the recognised gesture, such as its type (TAP, DRAG, and so on). For all event types the MIDlet can get the x and y location. For DRAG and DROP events the MIDlet can also get the change in x and y distance since the last drag event, and for FLICK events the MIDlet can get the flick speed and direction.
public void gestureAction(Object container,
                          GestureInteractiveZone gestureZone,
                          GestureEvent gestureEvent)
{
    // TODO: add custom code here.
}
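As a sketch of what a gestureAction implementation might look like inside the listener class above, the following dispatches on the event type. The accessor names used here (getType, getStartX, getDragDistanceX, getFlickSpeed, and so on) are assumptions based on the description above and should be checked against the GestureEvent reference:
public void gestureAction(Object container,
                          GestureInteractiveZone gestureZone,
                          GestureEvent gestureEvent)
{
    // Accessor names below are assumptions; verify against the GestureEvent reference.
    switch (gestureEvent.getType()) {
        case GestureInteractiveZone.GESTURE_TAP:
            System.out.println("Tap at " + gestureEvent.getStartX()
                    + "," + gestureEvent.getStartY());
            break;
        case GestureInteractiveZone.GESTURE_DRAG:
        case GestureInteractiveZone.GESTURE_DROP:
            // Change in position since the last drag event.
            System.out.println("Moved by " + gestureEvent.getDragDistanceX()
                    + "," + gestureEvent.getDragDistanceY());
            break;
        case GestureInteractiveZone.GESTURE_FLICK:
            // Flick speed (pixels per second) and direction (radians).
            System.out.println("Flick speed " + gestureEvent.getFlickSpeed()
                    + ", direction " + gestureEvent.getFlickDirection());
            break;
        default:
            break;
    }
}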