In this video, we will discuss input from other user input devices. Most of them are not yet on the consumer market, or do not come with standard HMDs. From a development perspective, this normally involves using third-party libraries or APIs, and thus they are trickier to implement. The first type of user input I'd like to mention is actually fairly common: the microphone. I haven't talked about this as part of the standard HMD devices because we don't normally use our voice to move around or select objects. In the real world, we normally only use our voice to interact socially with other people. However, I do sometimes use my voice to interact with objects in the world. For instance, some lights are controlled by sound, and when I'm driving, I can tell my GPS where I want to go to get directions. If you have tried both, you would agree that sound-triggered security cameras or lights are fairly reliable, but asking the GPS for directions does not always work, because it might not understand you. In virtual reality, similar rules apply. It's easy to program certain aspects of the environment using sound control, by setting a volume threshold or making graphics animate according to the pitch of the sound. However, it is a bit more difficult if you would like to have a meaningful conversation with a virtual character in the environment, as this often involves complicated speech recognition and artificial intelligence. The second type of user input is devices which track the user's hand or body movements. For instance, Leap Motion is an infrared camera which you can mount on your HMD facing outwards. It tracks the position and movement of your hands, and also individual finger movements. So, we could use Leap Motion to select objects in VR, but there are two problems. First, I will have to be looking more or less in the direction of the object in order for my hands to be tracked.
If I look away when trying to grab an object, as my tracking space shifts, Leap Motion could easily lose track of my hands. Secondly, when I grab an object, or when the object I'm holding collides with another object, for instance when I put a cup on a table in VR, ideally I would get some type of haptic feedback in the form of a small vibration. This is supported by many VR controllers, but with Leap Motion, as we are tracking everything with an infrared camera and are not holding any physical device, we won't receive any haptic feedback unless we pair Leap Motion with something else. This is similar to Kinect, which is another device that tracks the user's body movements with an infrared camera. Kinect tracks the whole body instead of just the hands, and is good for tracking posture and gesture. It is normally used to control our own avatars in a game, or to socially interact with other people in VR. Finally, there are various kinds of motion capture suits you can use to track the body movement of the user. They are normally quite expensive and can be a bit complicated to set up and calibrate, but in return you get better-quality tracking data. Other than the user's body movements, you can also track their physiological signals and eye gaze. Physiological signals could include heart rate, EEG, which is brain activity, or EMG, which is muscle activity. This normally requires some expertise in real-time data processing, and it is normally used in training and therapies. For instance, for someone who has a fear of heights, we can monitor their heart rate when putting them on top of a virtual table in VR, and only increase the height of that virtual table when they get used to the current height and their heart rate drops below a certain threshold. There are many other ways that you can get user input for your VR application. It really depends on what you want to do, and what kind of market you're looking at.
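The heart-rate-gated exposure idea above can be sketched in a few lines. This is a minimal illustration, not a real therapy system: the threshold, step size, and function names are all hypothetical, and a real application would stream samples from a physiological sensor SDK and update a game-engine transform instead of returning a number.

```python
# Hypothetical constants for a fear-of-heights exposure scenario.
RESTING_THRESHOLD_BPM = 90   # assumed "acclimatised" heart-rate threshold
HEIGHT_STEP_M = 0.5          # raise the virtual table in 0.5 m steps
MAX_HEIGHT_M = 5.0           # cap on the virtual table height

def next_table_height(current_height_m, recent_bpm_samples):
    """Raise the virtual table only once the user's average heart rate
    over the recent samples has dropped below the comfort threshold."""
    avg_bpm = sum(recent_bpm_samples) / len(recent_bpm_samples)
    if avg_bpm < RESTING_THRESHOLD_BPM and current_height_m < MAX_HEIGHT_M:
        return min(current_height_m + HEIGHT_STEP_M, MAX_HEIGHT_M)
    return current_height_m  # user not yet comfortable: hold the height
```

Calling this once per update loop gives the behaviour described: the environment only escalates when the physiological signal says the user is ready.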
For instance, we can get our users to run on a treadmill and map the speed of the physical treadmill to the speed of running in the virtual world, or our users can cycle in VR on a real exercise bike in the gym, which controls both the speed and the direction of travel in VR. So, instead of seeing sweaty and angry-looking people in the gym, they can be seeing beautiful people cycling with them, and instead of cycling in a basement gym, they can be cycling on the Great Wall of China.
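The treadmill and exercise-bike mapping comes down to converting a physical sensor reading into a virtual velocity. Here is a minimal sketch under stated assumptions: the function name and parameters are hypothetical, and in practice the speed and steering values would come from the device's own API (for example, a Bluetooth fitness-machine service), not be passed in by hand.

```python
import math

def virtual_velocity(wheel_speed_mps, handlebar_angle_deg, scale=1.0):
    """Convert a bike's wheel speed and steering angle into a 2D
    (forward, sideways) velocity vector for the virtual rider.
    `scale` lets 1 m/s on the bike feel faster or slower in VR."""
    angle = math.radians(handlebar_angle_deg)
    speed = wheel_speed_mps * scale
    return (speed * math.cos(angle), speed * math.sin(angle))
```

A treadmill is the degenerate case with a fixed heading: only the forward component is used. Applying this vector to the user's position each frame produces the speed-matched locomotion described above.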