[MUSIC] We've been going through this list of sensors that you can use to augment your device, and in particular the sensors you can use to augment experiences as your device moves through the physical world. The next one we'd like to tackle is the light sensor, and we'll give you an example of how to use it. Now, the light sensor is really just the cameras: the readings from the front camera and the back camera are fused together to give the device the ability to measure the light level out in the world, and you can use that to trigger different kinds of events, different kinds of effects if you want.

The high-level bit here, though, is that you can't get light levels directly. Presumably because of privacy concerns about turning on the camera when the user isn't aware of it, it's very difficult to get access to the camera for this purpose. In fact, as far as I know, you can't do it using public APIs. You can do some of this with private APIs or on jailbroken phones, but no app will get past App Store review that way. So the best you can do, without circumventing the APIs that are in place, is to get at the light level indirectly, and the way you do that is through the brightness controls that are built into the device.

The key idea is that if the user has Auto-Brightness turned on, by going into Settings and tapping Auto-Brightness here, the brightness of the screen reacts to the environment the device is in. As the user moves around in the world and walks into a sunnier spot, the screen becomes brighter so they can still see it in that environment. The video on the right shows this happening automatically; it's a screen recording of a physical device being moved into the sun and then back out of it. That screen brightness is something you can get access to. So if the user has Auto-Brightness turned on, and not all users do, you can detect the light level by reading the screen brightness as a proxy: brightness changes with the ambient light, so brightness and ambient light are correlated as long as Auto-Brightness is on.

Like I said, it's not possible with public APIs to get the camera information directly. It's also not possible to determine whether Auto-Brightness is on or off, so because you can't detect that automatically, if the feature is mandatory for your app you have to specifically tell the user and ask them to turn Auto-Brightness on. Again, there are private APIs for some of this, but Apple won't allow your app through the App Store if you use them.

Once Auto-Brightness is turned on, your device will react to the environment, and you can get a reading back using the code here: the UIScreen class, called with the mainScreen class method, gives you the main screen back, and then you can ask for its brightness. When you ask for the brightness, you get back a number between 0.0 and 1.0, where 0.0 refers to a dark environment and 1.0 refers to a bright environment. Now, you don't have a perfect ability to know what the light level is, because the range the brightness actually covers is modified by how the user interacts with their device. If the user tends to like a brighter screen, the readings will tend toward the brighter end of the range.
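As a minimal sketch of that one-shot read, assuming Auto-Brightness is on (the helper name here is just illustrative, not something from the lecture):

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper: returns the current screen brightness, which only
// tracks ambient light when the user has Auto-Brightness enabled in Settings.
static CGFloat approximateAmbientLightLevel(void) {
    // 0.0 ~ dark environment, 1.0 ~ bright environment; in practice the
    // usable range is compressed toward whichever end the user's own
    // brightness preference favors.
    return [UIScreen mainScreen].brightness;
}
```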
And if the user prefers a darker screen and slides the brightness down, even with Auto-Brightness on, the readings will tend toward the darker side. So you usually don't see the full range of readings from zero to one, but you do get a pretty good spread, and the numbers can go as low as zero and as high as one.

You can ask for the brightness directly, like we saw on the previous slide, but you can also ask for callbacks. This is a common pattern across a lot of different sensor readings: you can ask for the current reading, or you can ask for readings whenever they change. To get readings whenever they change, we use the NSNotificationCenter pattern. We ask for the default notification center, which goes into the variable center here, and then we add the class this code is inside of, usually a view controller class, as an observer on that center. For the selector, we pick the method we'd like the notification center to call when the event occurs. The name of the event we're interested in is UIScreenBrightnessDidChangeNotification, and in this case we can set the object parameter to nil.

There are some caveats to this, in addition to the ones I've already mentioned. The first is that, again, you can't test this in the simulator, because the simulator, at least as far as I can tell, doesn't have the ability to simulate brightness for the simulated device. And when you test on a physical device, remember that iOS determines the light level by sensor fusion, fusing the readings coming in from the front camera and the back camera, and not only that, it also folds in the proximity sensor. So if you want to simulate a dark environment by covering the light sensors, you have to cover both the front camera and the back camera, because they're fused. The other thing you have to be careful about is that if your device has a proximity sensor, like this one right here, you don't want to cover it to simulate darkness, because the proximity sensor is used to determine whether the phone is next to your face during a phone call. If it's covered, the sensor fusion assumes it got dark because the phone is next to the user's face, not because of the environment. So to make it seem like a dark environment, cover the front camera and the back camera, but be careful not to cover that proximity sensor, or the brightness won't auto-adjust in response and, in code, you won't see the change happen at all.

So what I'd like to do now is show you how we can build this. It's actually pretty simple. Here is the concept we'd like to work with: it's a meter, it looks like a gauge here, and this is a screen recording of the app actually running on an iPad. What you see is the gauge going higher as the device moves into a bright environment and lower as it goes into a darker one. It's kind of slow; it doesn't react instantly. It eases down smoothly as the sensor readings become more confident that the change is real, and not merely the phone passing through a shadow or briefly catching a reflection. So that's the case study we want to work up, and we're going to build it using these indirect auto-brightness levels.
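Here is a minimal sketch of that observer pattern, assuming a view controller drives the gauge; the class name and the gauge-update placeholder are assumptions for illustration, not the lecture's actual code:

```objc
#import <UIKit/UIKit.h>

// Hypothetical view controller that listens for brightness changes
// and uses them as an indirect light-level reading.
@interface LightMeterViewController : UIViewController
@end

@implementation LightMeterViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Ask the default notification center to call brightnessDidChange:
    // whenever Auto-Brightness adjusts the screen.
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
    [center addObserver:self
               selector:@selector(brightnessDidChange:)
                   name:UIScreenBrightnessDidChangeNotification
                 object:nil];
}

- (void)brightnessDidChange:(NSNotification *)notification {
    // Read the new value: 0.0 (dark) through 1.0 (bright).
    CGFloat level = [UIScreen mainScreen].brightness;

    // ... update the gauge from `level` here ...
    NSLog(@"Indirect light level: %.2f", level);
}

- (void)dealloc {
    // Stop observing when this controller goes away.
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end
```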
So, in summary, the light levels are only available indirectly, but you can use that indirect light level for some clever user experiences. One example I've seen that took advantage of this is an application that changed its color palette when the light level got too low: it switched from a daylight color palette to a nighttime color palette, and that happened automatically by monitoring the auto-brightness level. That's a clever way to use auto-brightness to detect low levels of light. The next thing we're going to do is I'm going to show you this working in code, and that will be our case study for this idea of using light levels as a sensor. Thank you. [MUSIC]
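As a minimal sketch of that palette-switching idea, reusing the observer registration from the earlier sketch; the threshold value and the color choices are assumptions, not taken from the app described in the lecture:

```objc
#import <UIKit/UIKit.h>

// Hypothetical cutoff below which we treat the environment as "dark".
static const CGFloat kNightPaletteThreshold = 0.25;

// Goes inside the LightMeterViewController implementation from the earlier
// sketch, as its handler for UIScreenBrightnessDidChangeNotification.
// Swaps between a daylight and a nighttime palette as the light level changes.
- (void)brightnessDidChange:(NSNotification *)notification {
    BOOL isDark = [UIScreen mainScreen].brightness < kNightPaletteThreshold;
    self.view.backgroundColor = isDark ? [UIColor blackColor] : [UIColor whiteColor];
    // ... restyle labels, buttons, and other views for the chosen palette here ...
}
```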