Tuesday, 19 February 2013

Event Delivery | iPhone Apps Tutorial

=> Event Delivery

The delivery of an event to an object for handling occurs along a specific path. As described in “Core Application Architecture”, when users touch the screen of a device, iPhone OS recognizes the set of touches and packages them in a UIEvent object that it places in the active application’s event queue. If the system interprets the shaking of the device as a motion event, an event object representing that event is also placed in the application’s event queue. The singleton UIApplication object managing the application takes an event from the top of the queue and dispatches it for handling. Typically, it sends the event to the application’s key window (the window currently the focus for user events), and the UIWindow object representing that window sends the event to an initial object for handling. That object is different for touch events and motion events.
=>  Touch events. The window object uses hit-testing and the responder chain to find the view to receive the touch event. In hit-testing, a window calls hitTest:withEvent: on the top-most view of the view hierarchy; this method proceeds by recursively calling pointInside:withEvent: on each view in the view hierarchy that returns YES, proceeding down the hierarchy until it finds the subview within whose bounds the touch took place. That view becomes the hit-test view.
If the hit-test view cannot handle the event, the event travels up the responder chain as described in “Responder Objects and the Responder Chain” until the system finds a view that can handle it. A touch object (described in “Touch Events”) is associated with its hit-test view for its lifetime, even if the touch represented by the object subsequently moves outside the view.
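The recursive descent described above can be sketched as a simplified reimplementation of hitTest:withEvent: (for illustration only; UIKit’s actual implementation also skips views that are hidden, transparent, or have user interaction disabled):

```objc
// Simplified sketch of the hit-testing algorithm (illustration only).
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (![self pointInside:point withEvent:event]) {
        return nil;   // the touch is outside this view's bounds
    }
    // Ask subviews front-to-back; the deepest view whose bounds
    // contain the point becomes the hit-test view.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit) {
            return hit;
        }
    }
    return self;      // no subview claimed the point, so this view is the hit-test view
}
```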
=> Motion events. The window object sends the motion event to the first responder for handling. (The first responder is described in “Responder Objects and the Responder Chain.”)
Although the hit-test view and the first responder are often the same view object, they do not have to be the same.
The UIApplication object and each UIWindow object dispatch events in the sendEvent: method. (These classes declare a method with the same signature.) Because these methods are funnel points for events coming into an application, you can subclass UIApplication or UIWindow and override sendEvent: to monitor events (which is something few applications would need to do). If you override these methods, be sure to call the superclass implementation (that is, [super sendEvent:theEvent]); never tamper with the distribution of events.
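A minimal sketch of this monitoring technique, using a hypothetical UIWindow subclass that logs each event before letting UIKit deliver it normally:

```objc
// Hypothetical UIWindow subclass that observes events at the funnel point.
@interface EventMonitorWindow : UIWindow
@end

@implementation EventMonitorWindow
- (void)sendEvent:(UIEvent *)event {
    NSLog(@"Dispatching event of type %d", (int)event.type);
    // Always call super; never tamper with the distribution of events.
    [super sendEvent:event];
}
@end
```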

=> Responder Objects and the Responder Chain

The preceding discussion mentions the concept of responders. What is a responder object and how does it fit into the architecture for event delivery?
A responder object is an object that can respond to events and handle them. UIResponder is the base class for all responder objects, known simply as responders. It defines the programmatic interface not only for event handling but also for common responder behavior. UIApplication, UIView, and all UIKit classes that descend from UIView (including UIWindow) inherit directly or indirectly from UIResponder, and thus their instances are responder objects.
The first responder is the responder object in an application (usually a UIView object) that is designated to be the first recipient of events other than touch events. A UIWindow object sends the first responder these events in messages, giving it the first shot at handling them. To receive these messages, the responder object must implement canBecomeFirstResponder to return YES; it must also receive a becomeFirstResponder message (which it can invoke on itself). The first responder is the first view in a window to receive the following types of events and messages:
=> Motion events via calls to the UIResponder motion-handling methods described in “Motion Events” 
=> Action messages sent when the user manipulates a control (such as a button or slider) and no target is specified for the action message
=> Editing-menu messages sent when users tap the commands of the editing menu (described in “Copy, Cut, and Paste Operations”)
The first responder also plays a role in text editing. A text view or text field that is the focus of editing is made the first responder, which causes the virtual keyboard to appear.
Note: Applications must explicitly set a first responder to handle motion events, action messages, and editing-menu messages; UIKit automatically sets the text field or text view a user taps to be the first responder.
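The first-responder requirements above can be sketched in a hypothetical custom view that opts in to first-responder status so it can receive shake (motion) events:

```objc
// Hypothetical view that claims first-responder status to receive motion events.
@interface ShakeAwareView : UIView
@end

@implementation ShakeAwareView
- (BOOL)canBecomeFirstResponder {
    return YES;   // required before becomeFirstResponder will succeed
}

- (void)didMoveToWindow {
    [super didMoveToWindow];
    [self becomeFirstResponder];   // explicitly claim first-responder status
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        NSLog(@"Device was shaken");
    }
}
@end
```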
If the first responder or the hit-test view doesn’t handle an event, it may pass the event (via message) to the next responder in the responder chain to see if it can handle it.
The responder chain is a linked series of responder objects along which an event or action message (or editing-menu message) is passed. It allows responder objects to transfer responsibility for handling an event to other, higher-level objects. An event proceeds up the responder chain as the application looks for an object capable of handling the event. Because the hit-test view is also a responder object, an application may also take advantage of the responder chain when handling touch events. The responder chain consists of a series of “next responders” in the sequence depicted in the figure below.
Figure: The responder chain in iPhone OS
When the system delivers an event, it first sends it to a specific view. For touch events, that view is the one returned by hitTest:withEvent:; for motion events and action messages, that view is the first responder.
If the initial view doesn’t handle the event, it travels up the responder chain along a particular path:
1. The hit-test view or first responder passes the event or action message to its view controller if it has one; if the view doesn’t have a view controller, it passes the event or action message to its superview.
2. If a view or its view controller cannot handle the event or action message, it passes it to the superview of the view.
3. Each subsequent superview in the hierarchy follows the pattern described in the first two steps if it cannot handle the event or action message.
4. The topmost view in the view hierarchy, if it doesn’t handle the event or action message, passes it to the UIWindow object for handling.
5. The UIWindow object, if it doesn’t handle the event or action message, passes it to the singleton UIApplication object.
If the application object cannot handle the event or action message, it discards it.
If you implement a custom view to handle events or action messages, you should not forward the event or message to nextResponder directly to send it up the responder chain. Instead, invoke the superclass implementation of the current event-handling method and let UIKit handle the traversal of the responder chain.
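For example, a hypothetical custom view might handle single touches itself but defer everything else to UIKit’s responder-chain traversal by calling super:

```objc
// Hypothetical touch handler in a custom UIView subclass.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 1) {
        // ... handle the single touch here ...
    } else {
        // Don't message nextResponder directly; calling super lets
        // UIKit forward the event up the responder chain correctly.
        [super touchesBegan:touches withEvent:event];
    }
}
```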

=> Regulating Event Delivery

UIKit gives applications programmatic means to simplify event handling or to turn off the stream of events completely. The following list summarizes these approaches:
=> Turning off delivery of touch events. By default, a view receives touch events, but you can set its userInteractionEnabled property to NO to turn off delivery of events. A view also does not receive events if it’s hidden or if it’s transparent.
=> Turning off delivery of touch events for a period. An application can call the UIApplication method beginIgnoringInteractionEvents and later call the endIgnoringInteractionEvents method. The first method stops the application from receiving touch-event messages entirely; the second resumes the receipt of such messages. You sometimes want to turn off event delivery while your code is performing animations.
=> Turning on delivery of multiple touches. By default, a view ignores all but the first touch during a multi-touch sequence. If you want the view to handle multiple touches you must enable this capability for the view. You do this programmatically by setting the multipleTouchEnabled property of your view to YES, or in Interface Builder by setting the related attribute in the inspector for the related view.
=> Restricting event delivery to a single view. By default, a view’s exclusiveTouch property is set to NO, which means that this view does not block other views in a window from receiving touches. If you set the property to YES, you mark the view so that, if it is tracking touches, it is the only view in the window that is tracking touches. Other views in the window cannot receive those touches. However, a view that is marked “exclusive touch” does not receive touches that are associated with other views in the same window. If a finger contacts an exclusive-touch view, then that touch is delivered only if that view is the only view tracking a finger in that window. If a finger touches a non-exclusive view, then that touch is delivered only if there is not another finger tracking in an exclusive-touch view.
=> Restricting event delivery to subviews. A custom UIView class can override hitTest:withEvent: to restrict the delivery of multi-touch events to its subviews.
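The regulation techniques above amount to a property setting or a pair of method calls each; a minimal sketch (myView is a hypothetical UIView instance):

```objc
// Turn off touch delivery for one view entirely:
myView.userInteractionEnabled = NO;

// Suspend all touch events application-wide, e.g. during an animation:
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];
// ... perform the animation ...
[[UIApplication sharedApplication] endIgnoringInteractionEvents];

// Opt a view in to receiving all touches of a multi-touch sequence:
myView.multipleTouchEnabled = YES;

// Make the view's touch tracking exclusive within its window:
myView.exclusiveTouch = YES;
```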
