In the previous lecture, we considered a few fundamental legal questions that artificially intelligent programs raise. In this lecture, we will focus on an aspect of AI that is often overlooked: the hardware aspect. By hardware, I mean the physical computer infrastructure on which artificially intelligent software runs. Like the previous lecture, we will try to look briefly at two questions. First, what is the hardware dimension of AI? Second, which legal issues does AI hardware raise? Let us turn to the first question. Any artificially intelligent software requires a shell, a physical piece of computer equipment. The more complex the AI, the more computational power is required to perform a given task. In technical terms, the computational power of a computer is determined in large part by a computer's central processing unit, or CPU. Phones, too, have a processor that determines how quickly tasks can be computed and how complex such tasks can be. For example, when I want to run an artificially intelligent text analysis program on my laptop that tries to recreate legal texts based on judgments that I have supplied to it, it can take a few hours for the analysis to complete. However, when I run the same operation on a more powerful stationary computer, which has a more powerful CPU, it takes only around 10 minutes to complete. More complicated AI programs, involving, for example, the modeling of medical treatments or facial recognition, I cannot run on my personal computers at all. Due to the limitations of ordinary personal computers, many lay users of AI programs, like myself, use programs that are hosted not on private computers but on servers that belong to companies. At present, almost all of these AI programs rely on the so-called classical computer structure. Classical computers store and process information in binary units called bits. These bits can have the value of either one or zero.
This means that any process and any information within this classical computer structure is ultimately represented either by a one or by a zero. However, there is now a different type of computer emerging alongside this classical binary type of computer. This type of computer is called a quantum computer. Quantum computers do not use bits that can be either one or zero, but qubits to store and process information. Qubits can be set to one or zero, like classical bits, but importantly, they can also be set to one and zero at the same time. This technological difference is the reason that quantum computers are vastly more powerful than classical computers. To illustrate, one of the manufacturers of quantum computers recently reported that their quantum computer performed a calculation in one second that would take a classical computer 10,000 years to perform. In general, it is anticipated that fully functioning quantum computers will be 100 million times more powerful than contemporary desktop computers, and at least 3,500 times more powerful than contemporary supercomputers. It is important to appreciate at this stage that, with respect to certain problems, the potential speed of quantum computers is so superior to that of classical computers that problems that used to be impossible for classical computers to solve can now be solved by quantum computers. That does not mean, however, that quantum computers will replace classical computers across the board. Indeed, quantum computers are not, per se, faster or more powerful than classical computers. Both types of computer will co-exist, each within its own domain. But with respect to specific types of complex computations, quantum computers will certainly dramatically enhance the ability to use artificial intelligence for beneficial purposes. But they will also dramatically enhance some of the risks that the utilization of AI entails. This brings us to the second question.
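For readers who want to see the difference between bits and qubits made concrete, the following is a minimal sketch in code. It simulates a single qubit as a pair of probability amplitudes; this is an illustrative toy model I am adding here, not a description of how real quantum hardware works, and all names in it are hypothetical.

```python
import random

# A classical bit holds exactly one of two values at any time.
classical_bit = 0  # or 1, never both

# A qubit's state can be described by two amplitudes (alpha, beta).
# The probability of measuring 0 is |alpha|^2; of measuring 1, |beta|^2.
# An equal superposition ("one and zero at the same time"):
alpha, beta = 2 ** -0.5, 2 ** -0.5

p_zero = abs(alpha) ** 2  # probability of reading out 0 (here 0.5)
p_one = abs(beta) ** 2    # probability of reading out 1 (here 0.5)

def measure(prob_one: float) -> int:
    """Simulate reading out the qubit: the superposition collapses
    to a definite classical value, 0 or 1."""
    return 1 if random.random() < prob_one else 0

# The two probabilities always sum to (approximately) 1, and a
# measurement yields an ordinary classical bit.
result = measure(p_one)
```

The point of the sketch is that before measurement the qubit carries both possibilities at once, which is what gives quantum computers their parallelism, but any readout still produces an ordinary bit.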
What do all of these technicalities have to do with law? There are again many answers to this question, but for now I will just focus on two particular aspects. The first aspect that I would like to focus on mirrors the first aspect discussed in the previous lecture, namely the fact that hardware, just like software, is created by people with specific ideas in mind. Again, this is not necessarily problematic, but it is a fact that one needs to be aware of. A comparison of hardware and architecture might help to illustrate this point. When architects construct a building, they can construct it so that it is wheelchair accessible, or not. They can also construct it in a manner that allows people to congregate in corridors by creating space for gathering, or not. Or they can place the door handles very high so that children cannot reach them, and so on. Depending on what the building will be used for, none of these features is necessarily wrong, but each of them is a choice that is made for a particular reason, and computer architecture is very similar. Some infrastructural choices are of course determined by physical necessities, just as with buildings. But many choices are informed by what the architects of a computer, or of a processor, want a given machine to do. One assumption common to essentially all computational processes, whether quantum or classical, is, for example, that the human decision-making processes which AI mirrors follow certain rules of reason and rationality. This assumption might capture certain aspects of human reasoning, but it might not account for the whole range of human reasoning. Again, this is not necessarily a problem, but it is important to keep in mind from a legal point of view, since there is an inherent risk that such assumptions, coupled with a belief in neutral technology, can come to dominate over alternative modes of thinking.
There is of course also the risk that the views and interests of those who are able to construct hardware prevail over the interests of those who cannot. The second aspect of legal importance relates to the ever-increasing complexity and sophistication of the hardware required to run the most advanced AI programs. Quantum computers, for example, function only in a vacuum, and they need to be cooled down to around minus 272 degrees Celsius. This technical complexity of quantum computers means that only very few companies and countries are actually able to construct and utilize them. This is significant. At present, it is assumed, for example, that quantum computers can overcome any encryption mechanism. This means that quantum computers can break conventional password protection mechanisms. If this is the case, actors with quantum computers, companies or states, have a clear advantage over those who do not have quantum computers, since those without a quantum computer cannot protect their information from those with a quantum computer. Similar issues, though slightly less severe, exist also with respect to classical supercomputers, which also require a lot of technical skill and electricity to maintain. As a result, the possibility of reaping the remarkable benefits of AI is often limited to those who can access the hardware that is required to run AI programs. As I mentioned just now with respect to the password problem, this situation can lead to significant inequality. In fact, it can amplify existing inequalities, since rich countries and large companies will be able to use the full range of AI applications while less well-off actors will be left behind. Thus, from a legal point of view, one should think about ways to bridge these discrepancies, for example, by mandating that those with access to the most advanced AI technologies should share certain percentages of their computing resources with those actors who would otherwise be locked out.
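The claim that quantum computers threaten conventional password protection can be made more tangible with a back-of-the-envelope calculation. The sketch below assumes a Grover-type quadratic speedup for brute-force search, which is a well-known result in quantum computing that I am adding for illustration; it is not mentioned in the lecture itself, and the key length used is hypothetical.

```python
import math

# Brute-force search over all keys of a given length.
# Grover's algorithm searches N possibilities in roughly sqrt(N) steps,
# whereas a classical exhaustive search needs up to N steps.
key_bits = 64                                 # hypothetical key length
classical_steps = 2 ** key_bits               # up to ~1.8e19 classical steps
quantum_steps = math.isqrt(classical_steps)   # ~sqrt(N): 2**32 steps

# The quadratic speedup effectively halves the key length: a 64-bit key
# offers only about 32 bits of security against such an attacker.
effective_bits = key_bits // 2
```

This is why access to quantum hardware translates directly into an advantage over those who must rely on classically protected information, as the lecture argues.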
Ultimately, it is important to keep in mind that a consideration of the legal issues raised by digital processes must always consider both the legal issues raised by software and those raised by hardware. Neither software nor hardware is simply a neutral thing. Both are normative phenomena in the sense that they are shaped by human choices, which can be questioned and debated. Our hope is that this course will enable you to participate in that debate.