Hello, and welcome back to Course 11, the final course that we're doing in this part of the CIPM. This one is about continual improvement: continually measuring, monitoring, and improving your privacy program. Imagine now that in our case study, Metaforce1, you've come in as the senior person for privacy, you've established your team, you've done your assessments, you've documented your privacy program, you've taken care of security, you've trained your staff, and you deal with people's rights requests. Things are now rolling along at a steady pace. But of course, you've got management who are going to ask you questions: how do we know we're doing well? Assure us, tell us that we're performing in the way we want to.

This next section of the course is about exactly that: performance measurement. Going back to the Plan-Do-Check-Act model, we've had to understand exactly what's expected of us, those requirements and expectations, and we've got to deliver against them. Having done the Plan and the Do, we're now in the Check phase and moving towards the Act phase, deciding what we want to improve. Of course, we have to deliver against the requirements and expectations placed on us, not only by the law, but by all of our stakeholders: our customers, our management, the data subjects we serve.

Going back to designing a privacy program, this is intrinsic to a privacy program. We've done our assessments, we've done our policies and procedures, we've done our training. We've implemented privacy controls to manage our risks. We're now monitoring compliance, we're now reporting back to our stakeholders on whether we've done the job properly, and we're now reviewing what we need to do to continually improve. You'll see a lot in the exam about monitoring, measuring, and reporting. The first question here is: what exactly do we want to monitor and measure? And that's up to you.
You can monitor and measure almost anything: the number of subject access requests you've received, the number of policies and procedures that have been created, implemented, or read, the number of staff that have been trained, the number of security breaches, the number of erasure requests, the number of regulatory interactions. Some people define the success or failure of a privacy program purely by whether the regulator has investigated them. That's a pretty bad way of measuring success, I believe. What we should really think about doing is going back a little bit and thinking about what our goals and objectives were.

These measurements should actually change year on year, because what you're trying to achieve in Year 1 of the privacy program is not the same as what you're trying to achieve in Year 5. You can have too many metrics. You can measure and monitor everything, but really take yourself back to why you're measuring, what you're measuring, and what your objectives, goals, and plans are for this year, or this six months, of your privacy program. Year 1 should be about carrying out assessments, doing DPIAs, and starting to document. Year 2 might be about training. Year 3 might be about responding to access requests. These measurements can change over time, but the most important thing to understand is how they relate to your goals for that year, or that period, of your privacy program.

You can take this at the top level for the entire privacy program, or take it down a few levels: a process, a system, a data flow. There are plenty of examples here, as I said: quality, speed, legal retention periods, volume, types, customer satisfaction, different things we might want to focus on. And by Year 7, 8, or 9, everything may be locked down superbly.
We might then really want to focus on differences in responses between different divisions, perhaps; who knows? But essentially, the most important things to determine are: what you want to measure, why you are taking that measurement, and how the measurement relates to the objective you're trying to achieve with your privacy program for the given period. As I said, the measurements are different in Year 1 than they are in Year 2 than they are in Year 10.

The exam does have a couple of bits of terminology that you're going to need to know here. Metric owners: this is probably not going to be your privacy team; these are the individuals who are going to be creating a metric for you. It could be you in the privacy team, but equally it could be the security division, or the HR division. Who knows? The metric owner is responsible for producing the metric, giving you the metric or measurement at the right time and in the right place. It may not even be you who's responsible for coming up with that statistic or measurement.

The exam also talks about metric audiences. This is essentially the converse: who the metric is for, who is going to be reading and looking at that metric, the consumer of that metric, the person who requires the assurance. The exam splits this down into primary, secondary, and tertiary audiences. Essentially, the primary audience is who the metric was intended for: the process owner, the privacy team, the people the metric is meant to serve. But there might be other audiences as well. A secondary audience might be management; a tertiary audience could be external stakeholders, other people who need to see the metric. Essentially, primary, secondary, and tertiary are just how far away from the metric you are: are you the intended audience, a secondary audience, or a tertiary audience?

The final thing we're going to talk about in this Check phase is what you want to measure.
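To make the owner-and-audience terminology concrete, here is a minimal sketch in Python. The class and field names, and the example values, are my own illustration; the only exam terms it encodes are the metric owner and the primary, secondary, and tertiary audiences:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyMetric:
    """A single metric, plus the people who produce and consume it."""
    name: str
    owner: str                # who produces the metric (often outside the privacy team)
    primary_audience: list    # who the metric is intended for
    secondary_audience: list = field(default_factory=list)
    tertiary_audience: list = field(default_factory=list)

# Hypothetical example: training completion is produced by HR,
# consumed first by the privacy team, then by management, then externally.
training_metric = PrivacyMetric(
    name="Staff privacy training completion rate",
    owner="HR division",
    primary_audience=["Privacy team"],
    secondary_audience=["Senior management"],
    tertiary_audience=["External auditors"],
)
```

The point of the structure is simply that every metric names who must deliver it and how far each reader sits from it.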
You can measure a lot of different things about your privacy program. But to me, the most important thing is not to review these measurements in isolation. I think you get a much better picture if you collect them together: if you get management, or some information governance board or team, to sit down at a given interval and transition between the Check and Act phases by reviewing all of these measurements and metrics together.

There are a number of different sources here that feed our performance management. These sources are going to include the measurements we just suggested: one-time dips we take to say how X is occurring over a given period. How many staff are trained? How many subject access requests have we received? Monitors: now, there is a difference between monitoring and measurement. A monitor happens repeatedly. We're not just measuring at a point in time; we're constantly monitoring. That way we can get a grasp of trends: we can understand how things go up and down, how things stall or move forward. Our objectives and our plans: have we achieved them? Audits: we'll talk about audits in a future session of this final part of the course, but when we're checking ourselves, when we're examining ourselves, what audit findings are there, from internal audits and external audits? Feedback is really important: ask your stakeholders, especially the data subjects, what their feedback is on how the privacy team and the privacy program are performing. Risks: what risks are there? Are there old risks, are there new risks? How are we managing our risks in our risk assessment? What's coming out of the DPIAs? What changes are being made as a result of the DPIAs? Are the DPIAs being done at all? Security breaches: there are a number of different things to think about there.
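The measurement-versus-monitor distinction can be sketched as: a measurement is a single point-in-time reading, while a monitor is the same reading taken repeatedly so the trend becomes visible. A hypothetical illustration in Python, with made-up figures:

```python
from datetime import date

# A measurement: one dip at a point in time.
sars_this_quarter = 42  # e.g. subject access requests received in Q1 (hypothetical)

# A monitor: the same reading taken repeatedly, so trends become visible.
sar_monitor = {
    date(2024, 3, 31): 42,
    date(2024, 6, 30): 55,
    date(2024, 9, 30): 48,
}

def trend(monitor):
    """Compare the latest reading with the previous one."""
    readings = [monitor[d] for d in sorted(monitor)]
    if len(readings) < 2:
        return "insufficient data"
    return "up" if readings[-1] > readings[-2] else "down or flat"

print(trend(sar_monitor))  # latest reading (48) against the previous one (55)
```

A single measurement can only answer "how many?"; the monitor is what lets the management review ask "is it getting better or worse?"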
What types of breaches have we had? Do we have any breaches? If not, that's suspicious in itself. What about breaches versus incidents? What incidents do we have? If you haven't got breaches, then what incidents do we have, and what lessons can be learned from the security side of the house? How are we doing against frameworks? We might have a governance framework that we've instituted, or a third-party vendor framework, or a law such as the GDPR or HIPAA that we're trying to comply with. How does our privacy program compare against these frameworks?

All of that together we can call the Check phase. To me, it's a case of taking all these monitors and measures and feeding them into some management review. Who performs that management review is up to you, but generally speaking it's a team that includes the privacy and security teams and the relevant senior stakeholders across the business, who can look at all this evidence of what's going on and make changes for the future.

That leads us to the Act phase. Out of the Check phase, we're heading towards the Act phase. How are we going to do that? We're going to correct problems that have occurred, we're going to try to prevent problems that could occur in the future, and we're going to take actions that eliminate recurrence: actions that not only correct a problem or a symptom, but also prevent it recurring in the future. Corrective action stops a problem that has occurred; preventive action stops a problem that has yet to occur; and root-cause elimination stops the problem from ever coming back.

That's as far as we're going to go for this first section. The next section is on actually taking those continual improvement actions. Thank you.