Android provides a class called GestureDetector that applications can use to recognize common touch gestures. This class can recognize gestures such as a confirmed single tap; a double tap, which is essentially two single taps in rapid succession; and a fling, which is a press followed by a drag-and-release motion with a reasonably high velocity. To use a GestureDetector, your activity creates an instance of the GestureDetector class and gives it an object that implements the GestureDetector.OnGestureListener interface. The activity then overrides its onTouchEvent method, which is the method that gets called when the activity receives a touch event, and that method delegates each motion event to the gesture detector's onTouchEvent method. Let's look at an example application that uses a GestureDetector to recognize a fling gesture. This application is called TouchGestureViewFlipper, and when it starts, it presents a TextView that displays a number. If the user performs a right-to-left fling gesture, the TextView scrolls off the left side of the screen while a new TextView displaying a new number scrolls in behind it from the right. Let's see that application in action. Now I'll start up the TouchGestureViewFlipper application. When it starts up, the screen shows a TextView displaying the number 0. If I perform a fling gesture, that is, if I press and hold the view, quickly swipe towards the left side of the screen, and finally lift my finger off the screen, then we'll see the animation I mentioned earlier. Let me do that now. As you can see, the TextView with the number 0 slid off the screen to the left, and a new TextView displaying the number 1 slid onto the screen from the right. Let me do that a few more times. Notice that this gesture only works if I swipe from right to left.
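The setup just described follows a standard pattern. Here's a minimal sketch of it; the class name and the listener body are illustrative, not the actual TouchGestureViewFlipper code:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.GestureDetector;
import android.view.MotionEvent;

// Minimal sketch of the GestureDetector pattern described above.
public class GestureActivity extends Activity {
    private GestureDetector mGestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Give the detector a listener. SimpleOnGestureListener supplies
        // no-op defaults, so we override only the callbacks we care about.
        mGestureDetector = new GestureDetector(this,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        // React to the fling here
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Delegate every touch event the activity receives to the detector
        return mGestureDetector.onTouchEvent(event);
    }
}
```

The key point is the delegation in onTouchEvent: the activity does no gesture analysis itself; the detector watches the stream of MotionEvents and calls back into the listener when it recognizes a complete gesture.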
If I try it in the other direction, nothing will happen. Let's take a look at the source code for this application. So here's the application open in the IDE. Now I'll open the main activity. First of all, this application uses the ViewFlipper class to handle the animations. I won't go into that much here, but feel free to study the code after we finish this segment. For now, let's focus on how this application detects the fling gesture. In onCreate, you can see that the code creates a new GestureDetector, and in the constructor for this object, the code passes in a new SimpleOnGestureListener. This object defines an onFling method, which will be called when the GestureDetector detects a fling gesture. We'll come back to this method in a few seconds. Right now, let's look at the onTouchEvent method for this activity. This method gets called when a touch event occurs and no view in the activity handles it. When this method is called, it simply delegates the call to the GestureDetector. If the GestureDetector eventually decides that it has seen a complete fling gesture, the onFling method that I just showed you will be called. This onFling method receives a parameter, in this case called velocityX, that tells how fast and in which direction the swipe gesture was performed. In this example, if the swipe was moving from right to left at a speed of more than ten pixels per second, then the code invokes a method called switchLayoutStateTo, which starts the animation of the TextViews. If the velocity does not meet that criterion, for instance, if it's a slow drag instead of a fling, or if it's traveling in the wrong direction, left to right instead of right to left, then the fling gesture is ignored. To recognize more complex gestures, you can use Android's GestureBuilder application to create and then save custom gestures. This application comes bundled with the SDK.
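The direction-and-speed test inside onFling boils down to a simple predicate on velocityX. Here's a standalone sketch of that logic; the method name and constant are mine, not the app's, though the ten-pixels-per-second threshold comes from the description above:

```java
public class FlingCheck {
    // Threshold from the description above: more than ten pixels per second.
    static final float MIN_SPEED_PX_PER_SEC = 10.0f;

    // In Android's coordinate system, x increases to the right, so a
    // right-to-left swipe produces a negative horizontal velocity. We
    // therefore accept only sufficiently negative values of velocityX.
    static boolean isRightToLeftFling(float velocityX) {
        return velocityX < -MIN_SPEED_PX_PER_SEC;
    }

    public static void main(String[] args) {
        System.out.println(isRightToLeftFling(-500f)); // fast right-to-left: true
        System.out.println(isRightToLeftFling(500f));  // left-to-right: false
        System.out.println(isRightToLeftFling(-5f));   // slow drag: false
    }
}
```

This is why a left-to-right fling does nothing in the demo: its velocityX is positive, so the check fails and the gesture is ignored.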
At runtime, you can use the GestureLibraries class to load your custom gestures and to recognize when a user performs one of those gestures. To make this work, you include a GestureOverlayView in your application. This view essentially intercepts user gestures and then invokes your application code to handle them. Here's a screenshot of the GestureBuilder application. As you can see, I've created four custom gestures: Next, which is a horizontal swipe from left to right; No, which looks a bit like an X that you make using a single stroke; Prev, or previous, which is a horizontal swipe from right to left; and Yes, which looks like a check mark. On the emulator, GestureBuilder saves your custom gestures to a file called /mnt/sdcard/gestures. To use these gestures, you'll need to copy this file into your application's res/raw directory. Let's look at the TouchGestures application. This application displays a small view with a candidate color for the entire application's background. The background color for the whole application is initially set to gray, and the user can use the four custom gestures I showed earlier to interact with this application. For example, if the user performs the Next gesture, the background color cycles forward. If the user performs the Prev gesture, the background color cycles back. If the user performs the Yes gesture, the application sets the whole application's background to the current color. And if the user performs the No gesture, the application's background color is reset to gray. Let's see the running application. So here's my device. Now I'll start up the TouchGestures application. When it starts up, the application's background is initially gray, but there's a colored square in the middle of the screen. If I swipe the screen from left to right, the color of that square in the middle changes. And if I do it again, the color changes again.
And I can go back to the previous color by swiping, this time from right to left instead of left to right. If I decide that I like the current color, I can perform the Yes gesture, like so. And as you see, the whole application now has a background of that color. But if I change my mind, I can perform the No gesture, like so. And as you can see, the application's background goes back to its initial gray, and the colored square reappears in the middle of the layout. I can keep issuing gestures to look at new candidate colors. Let's take a look at the source code for this application. Here's the application open in the IDE. Now I'll open the main activity. Notice that this activity implements the OnGesturePerformedListener interface, which means that it provides an onGesturePerformed method. In onCreate, the code gets a reference to the frame layout, which it stores in a variable called mFrame; this is where the candidate background colors appear. The code also gets a reference to a relative layout, which it stores in a variable called mLayout; this is the layout for the entire application. Next, here's the code that reads the gestures file from the res/raw directory, using the GestureLibraries.fromRawResource method. This method returns a GestureLibrary object, and the code then goes on to call the load method on that GestureLibrary. After that, the code finds the GestureOverlayView in the layout and adds the current activity as a listener for gestures intercepted by the GestureOverlayView. When the GestureOverlayView detects a gesture, it calls the onGesturePerformed method shown here. This method first calls the recognize method, which analyzes the detected gesture and scores each custom gesture recorded in the gesture file according to how closely the detected gesture resembles it.
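The loading-and-listening steps just described can be sketched as follows. This is a minimal sketch, not the TouchGestures app's actual code: the resource IDs R.raw.gestures and R.id.gestures_overlay are placeholders for whatever the real layout and raw resource are named.

```java
import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.os.Bundle;

// Sketch of loading a gestures file and listening for gestures.
public class GesturesActivity extends Activity
        implements GestureOverlayView.OnGesturePerformedListener {

    private GestureLibrary mLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Load the gestures file that was copied into res/raw
        mLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!mLibrary.load()) {
            finish(); // couldn't read the gestures file
        }

        // Register this activity to be called back when the
        // GestureOverlayView in the layout intercepts a gesture
        GestureOverlayView overlay =
                (GestureOverlayView) findViewById(R.id.gestures_overlay);
        overlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Recognition logic goes here (see below in the segment)
    }
}
```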
Next, the code gets the highest-ranked prediction, and if that prediction has a high enough score, the code carries out the action associated with that gesture. For example, if the gesture was the Yes gesture, the code sets the layout's background color to the current candidate color.
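That recognition step can be sketched like this. The gesture names match the four gestures described above, but the score threshold (1.0 here) is a commonly used cutoff I've assumed, not a value given in this segment, and the action bodies are placeholders:

```java
import android.gesture.Gesture;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;

import java.util.ArrayList;

// Sketch of scoring a detected gesture against the stored custom gestures.
@Override
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    // Score each stored custom gesture against the gesture just drawn;
    // the list comes back sorted by score, highest first.
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);

    if (!predictions.isEmpty()) {
        Prediction prediction = predictions.get(0);

        // Assumed cutoff: act only on reasonably confident matches
        if (prediction.score > 1.0) {
            if (prediction.name.equals("Yes")) {
                // set the layout's background to the candidate color
            } else if (prediction.name.equals("No")) {
                // reset the background to gray
            } else if (prediction.name.equals("Next")) {
                // cycle the candidate color forward
            } else if (prediction.name.equals("Prev")) {
                // cycle the candidate color back
            }
        }
    }
}
```

Because recognize returns every stored gesture with a similarity score, checking only the first element and thresholding it is how the app both picks the best match and ignores sloppy strokes that don't resemble any recorded gesture.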