
ActionScript AIR multitouch notes


by AlrepondTech 2020. 9. 23. 02:46


 

 

=================================

Source: http://www.adobe.com/devnet/flash/articles/multitouch_gestures.html

This article describes the new multitouch APIs available in Adobe Flash Player 10.1 beta and Adobe AIR 2 beta. As more platforms become multitouch enabled, and as users increasingly expect to be able to interact with devices using touch interactions, the Flash Platform will provide developers with the easiest and most efficient way to deliver ubiquitous touch-enabled experiences.

Multitouch and gestures defined

The term multitouch refers to the ability not only to detect physical touches and movements on a screen, but to detect and track multiple touches and movements simultaneously. Touch events are similar to mouse events, except that you can receive and track more than one of them at once, and touch events do not support mouse-specific concepts such as hovering.

Gestures are the synthesis of multiple touch events into a single event. Examples of gestures include "pinching" an image to scale it, or "swiping" to delete something from a list. Some platforms explicitly support the concept of gestures, reducing the amount of work that a developer needs to do in order to detect and react to them, and some platforms require developers to capture multiple touch events and synthesize them into gestures themselves. The Flash Platform automatically synthesizes the most common gestures across different platforms, but also provides developers with the APIs necessary to synthesize their own.

Although multitouch technology has been around for many years, it was the popularity of the Apple iPhone that really introduced the concept to the mainstream. The advantages of interacting with devices directly, rather than through buttons or using a stylus, has become so apparent that the behavior is making its way into desktop computing, as well. Windows 7 supports multitouch right out of the box, HP has been selling touch-enabled TouchSmart computers since 2007, and Microsoft launched the gesture- and touch-centric Microsoft Surface in 2008. Additionally, Apple introduced their multitouch trackpad with the MacBook Air, and has since incorporated the technology into their entire laptop line. Apple's newest mouse—the Magic Mouse—even has limited gesture support. Touch, multitouch, gesture, and haptic-based computing is becoming so prominent that almost all new high-end, hand-held devices support one or more of these interaction models.

Multitouch and gesture support

The Flash Platform currently supports multitouch and gestures in Flash Player 10.1 in the browser, SWF content published for the iPhone or iPod touch, and in AIR 2. However, multitouch support is dependent on a combination of hardware capabilities and the target platform.

Below is a list of where multitouch and gestures were supported at the time this article was posted:

Multitouch

  • Windows 7 and later (with touch-enabled hardware), including both browser-based Flash Player 10.1 SWF content and AIR 2 applications
  • iPhone OS 3.0 and later

Gestures

  • Windows 7 and later (with touch-enabled hardware), including both browser-based Flash 10.1 SWF content and AIR 2 applications
  • Macs running Mac OS X 10.5.3 and later (with multitouch trackpads)
  • iPhone OS 3.0 and later

Below is a list of which gesture events are supported on various platforms:

Windows 7

  • TransformGestureEvent.GESTURE_PAN
  • TransformGestureEvent.GESTURE_ROTATE
  • TransformGestureEvent.GESTURE_ZOOM
  • GestureEvent.GESTURE_TWO_FINGER_TAP
  • PressAndTapGestureEvent.GESTURE_PRESS_AND_TAP

Mac OS X 10.5.3 and later

  • TransformGestureEvent.GESTURE_PAN
  • TransformGestureEvent.GESTURE_ROTATE
  • TransformGestureEvent.GESTURE_SWIPE
  • TransformGestureEvent.GESTURE_ZOOM

iPhone and iPod touch

  • TransformGestureEvent.GESTURE_PAN
  • TransformGestureEvent.GESTURE_ROTATE
  • TransformGestureEvent.GESTURE_SWIPE
  • TransformGestureEvent.GESTURE_ZOOM

Windows Mobile 6.5

  • TransformGestureEvent.GESTURE_PAN

The Multitouch class

All multitouch and gesture interactions start with the Multitouch class. It contains several important properties necessary for authoring a multitouch or gesture-enabled application.

Discovering support

Before registering for any multitouch or gesture events, it's a good idea to use the Multitouch.supportsGestureEvents and Multitouch.supportsTouchEvents properties to determine whether the device on which your content is running supports the types of events that your application needs. If you're writing an application specifically for the iPhone, you might not need to use these properties; but if you're writing content that you want to run in multiple places, these two properties will prove very useful.
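
As a quick sketch (the trace messages are only illustrative), a capability check might look like this:

import flash.ui.Multitouch;

// Check what the current device and runtime can deliver before wiring up input.
if (Multitouch.supportsTouchEvents)
{
    trace("Raw touch events are available");
}
if (Multitouch.supportsGestureEvents)
{
    trace("Synthesized gesture events are available");
}
if (!Multitouch.supportsTouchEvents && !Multitouch.supportsGestureEvents)
{
    trace("No touch support detected; falling back to mouse events");
}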

Setting the input mode

Setting the Multitouch.inputMode property is necessary to tell the runtime what types of events you want to receive. The three options are:

  • MultitouchInputMode.GESTURE: Use this mode if you want multitouch events synthesized into gesture events. Simple events (like taps) are interpreted as mouse events.
  • MultitouchInputMode.TOUCH_POINT: Use this mode if you are interested only in touch events and no mouse or gesture events. You can use this mode to synthesize your own gestures if you want to support gestures that are not supported by the runtime, or if you need to support both multitouch and gestures.
  • MultitouchInputMode.NONE: Use this mode if you want to handle all touches as mouse events. This is appropriate for content that you want to run on touch-enabled and non-touch-enabled devices without having to branch your code.

Note: Multitouch.inputMode can also be queried at any time to determine the current interaction mode.
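
As a sketch, the capability checks above can be combined with the input mode setting; the fallback order here is just one possible policy:

import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;

// Prefer raw touch points, then synthesized gestures, then plain mouse handling.
if (Multitouch.supportsTouchEvents)
{
    Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
}
else if (Multitouch.supportsGestureEvents)
{
    Multitouch.inputMode = MultitouchInputMode.GESTURE;
}
else
{
    Multitouch.inputMode = MultitouchInputMode.NONE;
}
trace("Current input mode: " + Multitouch.inputMode);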

Determining which gestures are supported

If you set the Multitouch.inputMode property to GESTURE, it's a good idea to check the Multitouch.supportedGestures property to see which gestures are supported by the particular device on which you're running. The property returns a Vector of strings that are consistent with the event type constants in GestureEvent, PressAndTapGestureEvent, and TransformGestureEvent. That means you can use the strings in this Vector directly to register for supported gesture events.
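
A hedged sketch of that pattern, where onZoom is a hypothetical handler defined elsewhere:

import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TransformGestureEvent;

Multitouch.inputMode = MultitouchInputMode.GESTURE;

// supportedGestures holds event type strings, so they can be passed
// straight to addEventListener after checking for the one you need.
var gestures:Vector.<String> = Multitouch.supportedGestures;
if (gestures != null && gestures.indexOf(TransformGestureEvent.GESTURE_ZOOM) != -1)
{
    stage.addEventListener(TransformGestureEvent.GESTURE_ZOOM, onZoom);
}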

Determining the number of supported touch points

If you set the Multitouch.inputMode property to TOUCH_POINT and register for touch events, it's a good idea to check the Multitouch.maxTouchPoints property to tell you how many touch points the device on which you're currently running supports. Trying to handle more touch points than the device supports can have unintended effects (like all the active touch points disappearing).
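
One way to guard against that, sketched below, is to track active touch point IDs and simply ignore any touches beyond Multitouch.maxTouchPoints:

import flash.ui.Multitouch;
import flash.events.TouchEvent;

private var tracked:Object = {};
private var trackedCount:int = 0;

private function onTouchBegin(e:TouchEvent):void
{
    if (trackedCount >= Multitouch.maxTouchPoints)
    {
        return; // more simultaneous touches than the device supports
    }
    tracked[e.touchPointID] = true;
    trackedCount++;
}

private function onTouchEnd(e:TouchEvent):void
{
    if (tracked[e.touchPointID])
    {
        delete tracked[e.touchPointID];
        trackedCount--;
    }
}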

Registering for touch and gesture events

Both touch and gesture events are dispatched by InteractiveObject, which means that you can register for touch or gesture events on any object that inherits from InteractiveObject, such as SimpleButton, TextField, Loader, Sprite, and Stage.

Handling multitouch events

Eight different touch events can be registered on InteractiveObject or any class that extends InteractiveObject:

  • TOUCH_BEGIN: Indicates that a touch event has begun.
  • TOUCH_END: Indicates that a touch event has ended.
  • TOUCH_MOVE: Indicates that a touch point is being moved (in other words, the user is moving his or her finger across the screen).
  • TOUCH_OVER: Indicates that a touch point has entered an InteractiveObject. Unlike TOUCH_ROLL_OVER, this fires for each child of the InteractiveObject.
  • TOUCH_OUT: Indicates that a touch point has left an InteractiveObject. Unlike TOUCH_ROLL_OUT, this fires for each child of the InteractiveObject.
  • TOUCH_ROLL_OVER: Indicates that a touch point has entered an InteractiveObject. Unlike TOUCH_OVER, this will not fire for each child of the InteractiveObject, indicating that it is aware of being a container.
  • TOUCH_ROLL_OUT: Indicates that a touch point has left an InteractiveObject. Unlike TOUCH_OUT, this will not fire for each child of the InteractiveObject, indicating that it is aware of being a container.
  • TOUCH_TAP: Indicates that a tap (a quick TOUCH_BEGIN and TOUCH_END) has occurred.
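
A short sketch of a few of these events, where button is a hypothetical Sprite already on the display list:

import flash.display.Sprite;
import flash.events.TouchEvent;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;

Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
button.addEventListener(TouchEvent.TOUCH_ROLL_OVER, onRollOver);
button.addEventListener(TouchEvent.TOUCH_ROLL_OUT, onRollOut);
button.addEventListener(TouchEvent.TOUCH_TAP, onTap);

private function onRollOver(e:TouchEvent):void
{
    // A finger has moved onto the button; fires once for the container
    Sprite(e.currentTarget).alpha = 0.5;
}

private function onRollOut(e:TouchEvent):void
{
    Sprite(e.currentTarget).alpha = 1.0;
}

private function onTap(e:TouchEvent):void
{
    trace("Tapped by touch point " + e.touchPointID);
}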

Touch event properties

Touch events have several of the same properties that mouse events have, but they also have some additional properties specific to touch interactions:

  • isPrimaryTouchPoint: Indicates whether the first point of contact is mapped to mouse events. This is important to know in instances where your application handles both mouse and touch events.
  • pressure: A value between 0.0 and 1.0 indicating force of the contact with the device. This property is device- and platform-specific, which means it is not supported on all multitouch enabled devices.
  • sizeX: The width of the contact area. This property is not supported on all multitouch enabled devices.
  • sizeY: The height of the contact area. This property is not supported on all multitouch enabled devices.
  • touchPointID: A unique identification number assigned to the touch point. This is useful for keeping track of multiple touch points at once.
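
A small sketch reading a few of these properties inside a TOUCH_BEGIN handler; keep in mind that pressure and size may simply be unavailable on a given device:

import flash.events.TouchEvent;

private function onTouchBegin(e:TouchEvent):void
{
    if (e.isPrimaryTouchPoint)
    {
        // This touch point is the one also mapped to mouse events
        trace("Primary touch at " + e.stageX + ", " + e.stageY);
    }
    if (e.pressure > 0)
    {
        trace("Pressure: " + e.pressure + ", contact size: " + e.sizeX + " x " + e.sizeY);
    }
}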

New touch functions on Sprite

In addition to new touch-related classes, there are also two new touch-related functions on Sprite:

  • startTouchDrag: Lets the user drag the specified sprite on a touch-enabled device.
  • stopTouchDrag: Ends the startTouchDrag() method.
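
A minimal sketch pairing the two calls, where box is a hypothetical Sprite; each finger that lands on it drags it until that same finger lifts:

import flash.display.Sprite;
import flash.events.TouchEvent;

box.addEventListener(TouchEvent.TOUCH_BEGIN, onBoxTouchBegin);
box.addEventListener(TouchEvent.TOUCH_END, onBoxTouchEnd);

private function onBoxTouchBegin(e:TouchEvent):void
{
    box.startTouchDrag(e.touchPointID, false);
}

private function onBoxTouchEnd(e:TouchEvent):void
{
    box.stopTouchDrag(e.touchPointID);
}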

Putting it all together

Below is a code snippet from MultitouchExample, provided with this article, that shows registering for touch events and then handling TOUCH_BEGIN events using several of the APIs I've presented here. Notice how the sprite called dot is saved in a class member Object variable called this.dots with the event's touchPointID used as its key. This allows for the sprite to be looked up by its touchPointID in the onTouchEnd function so that it can be removed from the display list.

// Route raw touches to TouchEvent listeners rather than mouse or gesture events
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
this.stage.addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
this.stage.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
this.stage.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);

private function onTouchBegin(e:TouchEvent):void
{
    // Draw a circle under the new touch point and let it follow that finger
    var dot:Sprite = this.getCircle();
    dot.x = e.stageX;
    dot.y = e.stageY;
    this.stage.addChild(dot);
    dot.startTouchDrag(e.touchPointID, true);
    // Key the sprite by touchPointID so it can be found again in onTouchEnd
    this.dots[e.touchPointID] = dot;
}

private function onTouchEnd(e:TouchEvent):void
{
    // Look up the sprite created for this touch point and remove it
    var dot:Sprite = this.dots[e.touchPointID];
    this.stage.removeChild(dot);
}

Handling gesture events

There are three different kinds of gesture events: GestureEvent (from which the other two types of gesture events inherit), PressAndTapGestureEvent, and TransformGestureEvent. Below are the event types supported by each gesture event:

  • GestureEvent.GESTURE_TWO_FINGER_TAP: Indicates a gesture defined by tapping with two fingers.
  • PressAndTapGestureEvent.GESTURE_PRESS_AND_TAP: Indicates a gesture defined by a user touching the screen with one finger, then tapping with another. This is a Windows convention which can be used for invoking context menus.
  • TransformGestureEvent.GESTURE_PAN: Indicates a gesture to pan content that may be too big to fit on a small screen.
  • TransformGestureEvent.GESTURE_ROTATE: Indicates a gesture defined by two touch points rotating around each other in order to rotate content.
  • TransformGestureEvent.GESTURE_SWIPE: Indicates a gesture defined by the quick movement of a touch point in order to scroll a list, delete an item from a list, etc.
  • TransformGestureEvent.GESTURE_ZOOM: Indicates a gesture defined by two touch points moving either toward or away from each other to zoom content in or out.

Gesture event properties

The GestureEvent class has many of the same properties found in MouseEvent, but PressAndTapGestureEvent and TransformGestureEvent add several properties specific to certain types of gestures.

PressAndTapGestureEvent contains the following properties:

  • tapLocalX and tapLocalY indicate the horizontal and vertical coordinates of the tap relative to the containing interactive object.
  • tapStageX and tapStageY indicate the horizontal and vertical coordinates of the tap in global Stage coordinates.
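
A sketch of the context-menu use case, where menuSprite is a hypothetical Sprite already on the display list:

import flash.events.PressAndTapGestureEvent;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;

Multitouch.inputMode = MultitouchInputMode.GESTURE;
stage.addEventListener(PressAndTapGestureEvent.GESTURE_PRESS_AND_TAP, onPressAndTap);

private function onPressAndTap(e:PressAndTapGestureEvent):void
{
    // Show the menu where the secondary finger tapped
    menuSprite.x = e.tapStageX;
    menuSprite.y = e.tapStageY;
    menuSprite.visible = true;
}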

TransformGestureEvent contains the following properties:

  • offsetX and offsetY indicate the horizontal or vertical translation of the display object since the previous gesture event.
  • scaleX and scaleY indicate the horizontal or vertical scale of the display object since the previous gesture event.
  • rotation indicates the current rotation angle, in degrees, of the display object along the z-axis since the previous gesture event.
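
Because these values are deltas relative to the previous gesture event, they can be applied incrementally. A sketch of a pan handler, where map is a hypothetical Sprite larger than the visible area:

import flash.events.TransformGestureEvent;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;

Multitouch.inputMode = MultitouchInputMode.GESTURE;
map.addEventListener(TransformGestureEvent.GESTURE_PAN, onPan);

private function onPan(e:TransformGestureEvent):void
{
    // offsetX/offsetY are the movement since the previous pan event
    map.x += e.offsetX;
    map.y += e.offsetY;
}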

Putting it all together

Below is a code snippet from GestureExample, provided with this article, that shows registering for the TransformGestureEvent.GESTURE_ZOOM and TransformGestureEvent.GESTURE_ROTATE events and then using the data in the TransformGestureEvents to manipulate an image of an elephant.

// Let the runtime synthesize gesture events from raw touches
Multitouch.inputMode = MultitouchInputMode.GESTURE;
elephant.addEventListener(TransformGestureEvent.GESTURE_ZOOM, onZoom);
elephant.addEventListener(TransformGestureEvent.GESTURE_ROTATE, onRotate);

private function onZoom(e:TransformGestureEvent):void
{
    // scaleX/scaleY are relative to the previous gesture event, so multiply
    var elephant:Sprite = e.target as Sprite;
    elephant.scaleX *= e.scaleX;
    elephant.scaleY *= e.scaleY;
}

private function onRotate(e:TransformGestureEvent):void
{
    // rotation is the change in degrees since the previous gesture event, so add
    var elephant:Sprite = e.target as Sprite;
    elephant.rotation += e.rotation;
}

Where to go from here

For more information about using multitouch and gestures in Flash, see the ActionScript 3.0 documentation for the Multitouch, TouchEvent, and GestureEvent classes.


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License

 


=================================
