To handle gesture events and thereby implement a touch UI, a MIDlet must be able to receive gesture events for its UI elements. The MIDlet must separately set each UI element to receive gesture events.
The following figure shows the relationship between the Gesture API classes required for receiving gesture events.
Figure: Relationship between Gesture API classes
Note: The following instructions and code snippets focus only on the Gesture API. For instructions on creating a full MIDlet, see sections Getting started and MIDlet lifecycle. For instructions on using the Canvas class in a MIDlet, see section Canvas.
To receive gesture events:
1. Create the UI element. The element can be either a Canvas or a CustomItem.

The following code snippet creates a custom Canvas class called GestureCanvas.
// Import the necessary classes
import com.nokia.mid.ui.gestures.GestureInteractiveZone;
import com.nokia.mid.ui.gestures.GestureRegistrationManager;
import javax.microedition.lcdui.Canvas;

public class GestureCanvas extends Canvas {

    public GestureCanvas() {
        // Register a GestureListener for GestureCanvas
        // (see step 2 of this example)

        // Register a GestureInteractiveZone for GestureCanvas
        // (see step 3 of this example)
    }

    // Create the Canvas UI by implementing the paint method and
    // other necessary methods (see the LCDUI instructions)
    // ...
}
2. Register a GestureListener for the UI element. The GestureListener notifies the MIDlet of gesture events associated with the UI element. Each UI element can have only one GestureListener registered for it. However, you can register the same GestureListener for multiple UI elements.

The following code snippet registers a custom GestureListener called MyGestureListener for GestureCanvas.
// Create a listener instance
MyGestureListener myGestureListener = new MyGestureListener();

// Set the listener to register events for GestureCanvas (this)
GestureRegistrationManager.setListener(this, myGestureListener);
For detailed information about the MyGestureListener
implementation, see section Handling gesture
events.
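As a rough sketch of what such a listener can look like, the following class implements the gestureAction callback of the GestureListener interface and branches on the gesture event type. The handling logic in the case branches is only a placeholder; see section Handling gesture events for the full treatment.

```java
// Minimal GestureListener sketch. The gestureAction callback and
// GestureEvent.getType are part of the Gesture API; the per-gesture
// handling below is placeholder logic only.
import com.nokia.mid.ui.gestures.GestureEvent;
import com.nokia.mid.ui.gestures.GestureInteractiveZone;
import com.nokia.mid.ui.gestures.GestureListener;

public class MyGestureListener implements GestureListener {

    public void gestureAction(Object container,
                              GestureInteractiveZone zone,
                              GestureEvent event) {
        switch (event.getType()) {
            case GestureInteractiveZone.GESTURE_TAP:
                // Handle a tap, for example repaint the container
                break;
            case GestureInteractiveZone.GESTURE_LONG_PRESS:
                // Handle a long tap
                break;
            default:
                // Ignore other gesture events
                break;
        }
    }
}
```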
3. Register a GestureInteractiveZone for the UI element.
The GestureInteractiveZone
defines the touchable
screen area, a rectangle-shaped interactive zone, from which the GestureListener
receives gesture events for the UI element.
The GestureInteractiveZone
also defines which types
of gesture events the listener receives from this area. Each type
corresponds to a basic touch action. To specify which gesture events are received,
use the following value constants when creating the GestureInteractiveZone
instance:
Value | Description
---|---
GESTURE_ALL | Receive all gesture events.
GESTURE_TAP | Receive taps.
GESTURE_DOUBLE_TAP | Receive double taps.
GESTURE_LONG_PRESS | Receive long taps.
GESTURE_LONG_PRESS_REPEATED | Receive repeated long taps.
GESTURE_DRAG | Receive drag events.
GESTURE_DROP | Receive drop events.
GESTURE_FLICK | Receive flicks.
GESTURE_PINCH | Receive pinches. Note: This gesture event is supported from Java Runtime 2.0.0 for Series 40 onwards.
GESTURE_RECOGNITION_START | Receive gesture recognition start events. A gesture recognition start event represents the initial state of gesture recognition. Touch down always generates a gesture recognition start event, signaling that gesture recognition has started. Use gesture recognition start events together with gesture recognition end events to keep track of the state of gesture recognition. Note: You cannot use gesture recognition start events to track individual fingers. To track individual fingers, use the Multipoint Touch API. Note: This gesture event is supported from Java Runtime 2.0.0 for Series 40 onwards.
GESTURE_RECOGNITION_END | Receive gesture recognition end events. A gesture recognition end event represents the end state of gesture recognition. Touch release always generates a gesture recognition end event, signaling that gesture recognition has ended. Use gesture recognition end events together with gesture recognition start events to keep track of the state of gesture recognition. Note: You cannot use gesture recognition end events to track individual fingers. To track individual fingers, use the Multipoint Touch API. Note: This gesture event is supported from Java Runtime 2.0.0 for Series 40 onwards.
Note: GESTURE_DOUBLE_TAP is supported only on Nokia Asha software platform devices.
To change
the set of gesture events received from the area, use the setGestures
method after you create the GestureInteractiveZone
instance.
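The gesture type constants are bit flags, so several types can be passed at once by combining them with the bitwise OR operator. The snippet below is a small illustration of this, assuming myGestureZone is an existing GestureInteractiveZone instance and that setGestures replaces the zone's current gesture set (which is why the types are combined in one call rather than set one by one).

```java
// Replace the zone's gesture set with taps and flicks in a single call,
// combining the bit-flag constants with bitwise OR
myGestureZone.setGestures(GestureInteractiveZone.GESTURE_TAP
        | GestureInteractiveZone.GESTURE_FLICK);
```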
By default, the touchable
screen area corresponds to the area taken up by the UI element. To
define a different area, use the setRectangle
method on the GestureInteractiveZone
instance. The location of the area is defined relative to the upper
left corner of the UI element.
You can register a specific GestureInteractiveZone
for only a single UI element. However,
a UI element can have multiple GestureInteractiveZones
registered for it. This is useful, for example, when you create
a Canvas
with multiple Images
and
want to make each Image
an independent interactive
element. In this case, you register multiple GestureInteractiveZones
for the Canvas
, one for each Image
, and set the touchable areas to match the areas taken up by the Images
.
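The pattern described above can be sketched as follows. The coordinates and sizes are made up for illustration; in a real MIDlet they should match the areas where the paint method actually draws the Images. This code would run in the constructor of the Canvas subclass, so this refers to the Canvas.

```java
// Sketch: one interactive zone per image drawn on the Canvas.
// The rectangles below are illustrative only; they should match
// the areas where the paint method draws the images.
GestureInteractiveZone firstImageZone =
    new GestureInteractiveZone(GestureInteractiveZone.GESTURE_TAP);
firstImageZone.setRectangle(10, 10, 80, 80);   // area of the first image
GestureRegistrationManager.register(this, firstImageZone);

GestureInteractiveZone secondImageZone =
    new GestureInteractiveZone(GestureInteractiveZone.GESTURE_TAP);
secondImageZone.setRectangle(10, 100, 80, 80); // area of the second image
GestureRegistrationManager.register(this, secondImageZone);
```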
GestureInteractiveZones
can
overlap. The GestureListener
associated with each
overlapping zone receives all gesture events for that zone.
The following code snippet registers a single GestureInteractiveZone for GestureCanvas. The GestureInteractiveZone is set to receive taps and long taps, and its screen area is set to 40x20 pixels positioned in the upper left corner (0,0) of GestureCanvas.
// Create an interactive zone and set it to receive taps
GestureInteractiveZone myGestureZone =
    new GestureInteractiveZone(GestureInteractiveZone.GESTURE_TAP);

// Set the interactive zone to receive both taps and long taps
// (setGestures replaces the gesture set, so combine the flags)
myGestureZone.setGestures(GestureInteractiveZone.GESTURE_TAP
        | GestureInteractiveZone.GESTURE_LONG_PRESS);

// Set the location (relative to the container) and size of the interactive zone:
// x=0, y=0, width=40px, height=20px
myGestureZone.setRectangle(0, 0, 40, 20);

// Register the interactive zone for GestureCanvas (this)
GestureRegistrationManager.register(this, myGestureZone);
Now that you have created the UI element and set it to receive gesture events, define how it handles the gesture events by implementing the GestureListener interface.