p. 13 “The underlying visions of the AI field do not come into being autonomously, but instead have been constructed from a particular set of beliefs and perspectives. The chief designers of the contemporary atlas of AI are a small and homogenous group of people, based in a handful of cities, working in an industry that is currently the wealthiest in the world. Like medieval European mappae mundi, which illustrated religious and classical concepts as much as coordinates, the maps made by the AI industry are political interventions, as opposed to neutral reflections on the world.”
p. 16 Since antiquity, the business of mining has only been profitable because it does not have to account for its true costs, including environmental damage, the illness and death of miners, and the loss to the communities it displaces. In 1555, Georgius Agricola, known as the father of mineralogy, observed that “it is clear to all that there is greater detriment from mining than the value of the metals which the mining produces.”
p. 31 “The mining that makes AI is both literal and metaphorical. The new extractivism of data mining also encompasses and propels the old extractivism of traditional mining. The stack required to power artificial intelligence systems goes well beyond the multilayered technical stack of data modeling, hardware, servers, and networks… reaches into capital, labor and Earth’s resources… the cloud is the backbone of the artificial intelligence industry, and it’s made of rocks and lithium brine and crude oil.”
p. 35 “According to the computer manufacturer Dell, the complexities of the metals and mineral supply chain pose almost insurmountable challenges to the production of conflict-free electronics components. The elements are laundered through such a vast number of entities along the chain that sourcing their provenance proves impossible – or so the end-product manufacturers claim, allowing them a measure of plausible deniability for any exploitative practices that drive their profits.”
p. 38 “At the end of the 19th century, a particular Southeast Asian tree called the Palaquium gutta became the center of a cable boom. These trees, found mainly in Malaysia, produce a milky white natural latex called gutta-percha… rapidly became the darling of the engineering world… the solution to the problem of insulating undersea telegraphic cables to withstand harsh and varying conditions on the ocean floor. The twisted strands of copper wire needed four layers of the soft, organic tree sap to protect them from water intrusion and carry their electrical currents… The historian John Tully describes how local Malay, Chinese, and Dayak workers were paid little for the dangerous work of felling the trees and slowly collecting the latex… as media scholar Nicole Starosielski writes, ‘Military strategists saw cables as the most efficient and secure mode of communication with the colonies – and, by implication, of control over them.’… The jungles of Malaysia and Singapore were stripped: by the early 1880s the Palaquium gutta had vanished. In a last-ditch effort to save their supply chain, the British passed a ban in 1883 to halt harvesting the latex, but the tree was all but extinct.”
p. 8 “I argue that AI is neither artificial nor intelligent. Rather, artificial intelligence is both embodied and material, made from natural resources, fuel, human labour, infrastructure, logistics, histories and classifications. AI systems are not autonomous, rational, or able to discern anything without extensive, computationally intensive training with large datasets or pre-defined rules and rewards. In fact, artificial intelligence as we know it depends entirely on a much wider set of political and social structures. And due to the capital required to build AI at scale and the ways of seeing that it optimizes, AI systems are ultimately designed to serve existing dominant interests. In this sense, artificial intelligence is a registry of power.”
p. 42 “Strubell’s team found that running only a single NLP [natural language processing] model produced more than 660,000 pounds of carbon dioxide emissions, the equivalent of five gas-powered cars over their total lifetime (including manufacturing) or 125 roundtrip flights from New York to Beijing.”
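As a rough sanity check on the scale of those comparisons (not from the book; the per-car and per-flight figures below are simply derived by dividing the quoted total):

```python
# Back-of-the-envelope check of the equivalences in the quoted figure.
# Only the three numbers quoted above are assumed; the per-unit values are
# derived here by simple division, not taken from the book or the paper.
total_co2_lbs = 660_000        # quoted CO2 emissions for training one NLP model
cars = 5                       # quoted equivalent: cars over their full lifetime
flights = 125                  # quoted equivalent: NY–Beijing round-trip flights

per_car = total_co2_lbs / cars        # ≈ 132,000 lb CO2 per car, incl. manufacturing
per_flight = total_co2_lbs / flights  # ≈ 5,280 lb CO2 per round-trip flight

print(f"~{per_car:,.0f} lb CO2 per car lifetime, ~{per_flight:,.0f} lb per flight")
```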
p. 56 “Instead of asking whether robots will replace humans, I’m interested in how humans are increasingly treated like robots and what this means for the role of labour… people are often performing rote tasks to shore up the impression that machines can do the work.”
p. 63 “the experiences of crowdworkers who perform the repetitive digital tasks that underlie AI systems, such as labelling thousands of hours of training data and reviewing suspicious or harmful content. Workers do the repetitive tasks that backstop claims of AI magic, but they rarely receive credit for making the systems function.”
p. 76 “Domino’s Pizza has added to its kitchens machine-vision systems that inspect a finished pizza to ensure the staff made it according to prescribed standards. Surveillance apparatuses are justified for producing inputs for algorithmic scheduling systems that further modulate work time, or to glean behavioural signals that may correlate with signs of high or low work performance, or merely sold to data brokers as a form of insight.”
p. 159 Given that facial expressions are culturally variable, using them to train machine learning systems would inevitably mix together all sorts of different contexts, signals and expectations.
p. 173 “None of these serious questions about the basis for Ekman’s claims have stopped his work from attaining a privileged role in current AI applications. Hundreds of papers cite Ekman’s view of interpretable facial expressions as though it were unproblematic fact, despite decades of scientific controversy. Few computer scientists have even acknowledged this literature of uncertainty.”
p. 174 Why, with so many criticisms, has the approach of “reading emotions” from the face endured? … we can begin to see how military research funding, policing priorities and profit motives have shaped the field… theories seemed ideal for the emerging field of computer vision because they could be automated at scale… powerful institutional and corporate investments in the validity of Ekman’s theories or methodologies. Recognizing that emotions are not easily classified, or that they’re not reliably detectable from facial expressions, could undermine an expanding industry… the more complex issues of context, conditioning, relationality and cultural factors are hard to reconcile with the current disciplinary approaches of computer science or the ambitions of the commercial tech sector.
p. 206 when AI systems are deployed as part of the welfare state, they are used primarily as a way to surveil, assess, and restrict people’s access to public resources rather than as a way to provide for greater support… Michigan… “a matching algorithm be used to implement the state’s ‘fugitive felon’ policy, which sought to automatically disqualify individuals from food assistance based on outstanding felony warrants. Between 2012 and 2015, the new system inaccurately matched more than 19,000 Michigan residents and automatically disqualified each of these from food assistance… in essence these systems are punitive, designed on a threat-targeting model.”
p. 211 “Artificial intelligence is not an objective, universal, or neutral computational technique that makes determinations without human direction. Its systems are embedded in social, political, cultural and economic worlds, shaped by humans, institutions and imperatives that determine what they do and how they do it. They are designed to discriminate, to amplify hierarchies, and to encode narrow classifications. When applied in social contexts such as policing, the court system, health care and education, they can reproduce, optimize and amplify existing structural inequalities. This is no accident: AI systems are built to see and intervene in the world in ways that primarily benefit the states, institutions and corporations that they serve.”
p. 226 What happens… if we begin with the commitment to a more just and sustainable world? How can we intervene to address interdependent issues of social, economic and climate injustice? Where does technology serve that vision? And are there places where AI should not be used, where it undermines justice? This is the basis of a renewed politics of refusal – opposing the narratives of technological inevitability that say ‘If it can be done, it will be.’ Rather than asking where AI will be applied, merely because it can, the emphasis should be on why it ought to be applied… we can question the idea that everything should be subject to the logics of statistical prediction and profit accumulation, what Donna Haraway terms “the informatics of domination”.