Surveillance Capitalism

Title Surveillance Capitalism
Course Fondamenti di psicologia della comunicazione
Institution Libera Università di Lingue e Comunicazione IULM


SURVEILLANCE CAPITALISM

Surveillance capitalism: the definition
1. A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales;
2. A parasitic economic logic in which the production of goods and services is subordinated to a new global architecture of behavioral modification;
3. A rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history;
4. The foundational framework of a surveillance economy;
5. As significant a threat to human nature in the twenty-first century as industrial capitalism was to the natural world in the nineteenth and twentieth;
6. The origin of a new instrumentarian power that asserts dominance over society and presents startling challenges to market democracy;
7. A movement that aims to impose a new collective order based on total certainty;
8. An expropriation of critical human rights that is best understood as a coup from above: an overthrow of the people's sovereignty.

The term "surveillance capitalism" is not an arbitrary one. Why surveillance? «Because these are operations that are engineered to be undetectable, indecipherable, cloaked in rhetoric that aims to misdirect, obfuscate and just downright bamboozle (cheat) all of us, all the time.» (Zuboff in VPRO Documentary, 2019)

«Surveillance capitalists know everything about us, whereas their operations are designed to be unknowable to us. They accumulate vast domains of new knowledge from us, but not for us.» (Zuboff, 2019, p. 11)

The code in the culture

Platforms intervene
«Social media platforms don't just guide, distort, and facilitate social activity, they also delete some of it. They don't just link users together, they also suspend them. They don't just circulate our images and posts, they also algorithmically promote some over others. Platforms pick and choose.» (Gillespie, 2015)

«Platforms intervene, and the public culture that emerges from them is, in important ways, the outcome.» (Gillespie, 2015)

«The web provides many instances where algorithms select, hierarchize, suggest, and so forth: online sellers make automated product 'recommendations', dating sites calculate 'compatibility' coefficients between members and arrange them accordingly, news aggregators generate front pages according to measures of 'noteworthiness', social networking services filter friends' status updates based on 'closeness' metrics, and microblogging services give prominence to 'trending' topics based on sudden spikes in activity. The terms between quotation marks highlight that we are dealing with cultural and thus highly ambiguous tasks being expressed as and delegated to mechanical procedures.» (Rieder, 2017)

Algorithmic Culture
«Google uses electricity, silicon, and plastic, all working in conjunction with an army of human engineers, to rank the importance of people, places, objects, and ideas. Though the means and ends are different, this is akin to what, back in 1869, the English literary critic Matthew Arnold said was the purpose of culture: to determine "the best which has been thought and said."» (Ted Striphas, 2014)

«…engineers now speak with unprecedented authority on the subject, suffusing culture with assumptions, agendas, and understandings consistent with their disciplines. The shifting locus of cultural discourse has helped a broad new sense of the word to emerge - one that may be functionally prevalent, we contend, yet vaguely defined. We refer to it as algorithmic culture: provisionally, the use of computational processes to sort, classify, and hierarchize people, places, objects, and ideas, and also the habits of thought, conduct, and expression that arise in relationship to those processes.» (Hallinan and Striphas, 2016)

«Facebook engages in much the same work in determining which of your friends, and which of their posts, will appear prominently in your news feed. The same goes for shopping sites and video or music streaming services, when they offer you products based on the ones you (or someone purportedly like you) have already consumed. What's important to note, though, is the way in which algorithmic culture then feeds back to produce new habits of thought, conduct, and expression that likely wouldn't exist in its absence - a culture of algorithms, as it were.» (Ted Striphas, 2014)
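Rieder's point above, that ambiguous cultural judgements such as "noteworthiness" or "trending" end up expressed as mechanical procedures, can be made concrete with a toy example. The Python sketch below is purely illustrative: the scoring formula, the topic names and the cut-off are invented for these notes and are not taken from any real platform.

```python
# Illustrative only: a made-up "trending" procedure, not any platform's code.
# It reduces the cultural question "what is noteworthy right now?" to a ratio
# between recent and baseline activity, in the spirit of Rieder's (2017)
# description of cultural tasks delegated to mechanical procedures.

from collections import Counter

def trending_score(recent_mentions: int, baseline_mentions: int) -> float:
    """Score a topic by how sharply it spikes above its usual activity."""
    return recent_mentions / (baseline_mentions + 1)  # +1 avoids division by zero

def trending_topics(recent: Counter, baseline: Counter, k: int = 3) -> list:
    """Return the k topics with the largest relative spike in activity."""
    scores = {topic: trending_score(recent[topic], baseline[topic]) for topic in recent}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Invented toy data: absolute volume does not decide visibility, the spike does.
baseline = Counter({"football": 900, "elections": 400, "local_flood": 2})
recent = Counter({"football": 950, "elections": 420, "local_flood": 60})

print(trending_topics(recent, baseline))
# ['local_flood', 'football', 'elections']
```

Every choice in the snippet (the time window, the formula, how many topics are surfaced) is an editorial decision encoded as a parameter; this is what it means, in practice, for a platform to "pick and choose".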

Recursivity of algorithms: "when the output of a computational process becomes itself embedded in the input of a new iteration" (Airoldi and Rokka, 2019, p. 2). This endless, recursive feedback loop can have extensive implications at both the social and the cultural level: filter bubbles and echo chambers (Pariser, 2011), and the adaptation of taste to automated recommendations (a toy simulation of this loop follows the quotations below).

Filter bubbles: «…a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, e.g. location, past click-behavior and search history.» (Wikipedia)

«…your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.» (Pariser, 2011, TED Talk)

Echo chambers: the result of a filter bubble. «…a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system. By visiting an "echo chamber", people are able to seek out information that reinforces their existing views, potentially as an unconscious exercise of confirmation bias. This may increase social and political polarization and extremism.» (Wikipedia)

«…the rise of echo chambers in which individuals are largely exposed to conforming opinions. Indeed, in controlled experiments, subjects tend to choose news articles from outlets aligned with their political opinions.» (Flaxman et al., 2019, p. 299)

«Machine learning applications and AI technologies filter, order and, ultimately, constitute everyday consumer experiences, as a sort of "technological unconscious" (Beer, 2009). As a result, it is worth asking if what we eat for dinner has become less a matter of our choice and more the computational result of platforms' "production of prediction" (Mackenzie, 2015). Also, to what extent is our musical taste a mere consequence of YouTube or Spotify's automated recommendations? Rather than emancipating and liberating consumers, can we also say algorithms exert a new form of authority that limits consumer agency?» (Airoldi and Rokka, 2019)
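A deliberately simplified simulation can make this recursivity concrete. The sketch below rests on invented assumptions (five topic labels, a rule that recommends whatever the user has engaged with most, random clicks among what is shown); it is not how any real recommender works, but it shows how output feeding back into input narrows exposure, which is the mechanism behind filter bubbles and echo chambers.

```python
# Minimal, invented simulation of the recursive feedback loop described by
# Airoldi and Rokka (2019): the output of the recommender (what gets shown)
# becomes the input of the next iteration (what gets clicked and learned).
# Topics, weights and the recommendation rule are illustrative assumptions only.

import random

TOPICS = ["politics_left", "politics_right", "sports", "science", "music"]

def recommend(profile: dict, n: int = 3) -> list:
    """Show the n topics the user has engaged with most (pure exploitation)."""
    return sorted(TOPICS, key=lambda t: profile[t], reverse=True)[:n]

def simulate(iterations: int = 20, seed: int = 0) -> dict:
    random.seed(seed)
    profile = {t: 1.0 for t in TOPICS}   # start with no real preference
    profile["politics_left"] += 0.1      # one tiny initial nudge
    for _ in range(iterations):
        shown = recommend(profile)       # output of the algorithm...
        clicked = random.choice(shown)   # ...constrains what can be clicked...
        profile[clicked] += 1.0          # ...and feeds back into the model
    return profile

print(simulate())
# After a few iterations the profile concentrates on the narrow subset of topics
# that was shown early on; topics never shown can never be chosen.
```

Nothing in the loop "decides" to isolate the user; isolation emerges from the recursion itself, which is why Pariser stresses that "you don't decide what gets in".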

We shouldn't jettison numbers altogether. They can be empowering and revealing; they can be used to hold power to account or to challenge embedded prejudice and established misconceptions. Numbers that reveal poverty, homelessness, uneven mortality and unequal pay and conditions are all crucial to informed debate. Yet recent events show that we need to maintain our scepticism and be cautious even with trusted sources. It is possible to retain some use of important statistics without numbers becoming the only way that we know ourselves and each other. Some measures can't easily be escaped, but they can be downplayed, subverted or challenged. We should aim to rediscover and celebrate immeasurable qualities and argue for why they matter. (Beer, 2017)

"…while powerful businesses, financial institutions, and government agencies hide their actions behind nondisclosure agreements, (…) our own lives are increasingly open books. Everything we do online is recorded; the only questions left are to whom the data will be available, and for how long. (…) Surveillance cameras, data brokers, sensor networks, and "supercookies" record how fast we drive, what pills we take, what books we read, what websites we visit. The law, so aggressively protective of secrecy in the world of commerce, is increasingly silent when it comes to the privacy of persons." (Pasquale, 2015, p. 2)

"Knowledge is power. To scrutinize others while avoiding scrutiny oneself is one of the most important forms of power. Firms seek out intimate details of potential customers' and employees' lives, but give regulators as little information as they possibly can about their own statistics and procedures. Internet companies collect more and more data on their users but fight regulations that would let those same users exercise some control over the resulting digital dossiers." (Pasquale, 2015, p. 34)

There is an antinomy between powerful but opaque corporations and institutions on one side, and impotent, monitored citizens on the other.

"How has secrecy become so important to industries ranging from Wall Street to Silicon Valley? What are the social implications of the invisible practices that hide the way people and businesses are labeled and treated? How can the law be used to enact the best possible balance between privacy and openness?" (Pasquale, 2015)

"What we do and don't know about the social (as opposed to the natural) world is not inherent in its nature, but is itself a function of social constructs." (Pasquale, 2015, p. 2)

"In aviation, the black box is a source of knowledge: a receptacle of crucial data that might help investigators understand catastrophe. In technology, a black box is the absence of knowledge: a catchall term describing an algorithmic system with mechanics its creators can't - or won't - explain. Algorithms, in this telling, are unknowable, uncontrollable, and independent of human oversight, even as they promote extremist content, make decisions affecting our health, or act in potential violation of antitrust law." (Fussett, 2019)

Ex. In 2016, Amazon Prime was much less likely to offer same-day service to predominantly Black and Latino neighborhoods in Boston and New York. The algorithm that determines eligible neighborhoods, Amazon explained, was designed to estimate the cost of providing Prime, based on an area's distance from warehouses, number of Prime members, and so on. That explanation was a shield for the human designers' choice to ignore how race and poverty correlate with housing, and how that inequality is replicated in Amazon's products. (Fussett, 2019)

AI systems thus reproduce and amplify social discrimination, and the world is categorized opaquely by computational tools owned by private companies.

"Algorithms interpret potentially millions of data points, and the exact path from input to conclusion can be difficult to make plain. But the effects are clear. This is a very powerful asymmetry: anyone can notice a change in search results, but it's extremely difficult to prove what caused it. That gives algorithm designers immense deniability." (Fussett, 2019)

"Because of their opacity, algorithms can privilege or discriminate without their creators designing them to do so, or even being aware of it. Algorithms provide those in power with what's been termed 'strategic ignorance' - essentially, an excuse that arises when it's convenient for the powerful not to know something."

"Try as companies might to minimize personal accountability, it is humans who build, train, and deploy algorithms. Human biases and subjectivities are encoded every step of the way." (Fussett, 2019)
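The Amazon Prime case describes what is often called proxy discrimination: a rule that never looks at race can still produce racially skewed outcomes when its "neutral" inputs correlate with segregated housing. The sketch below uses entirely synthetic neighborhoods and invented thresholds (it is not Amazon's actual model); it is only meant to show the mechanism.

```python
# Hypothetical illustration with synthetic data and invented thresholds.
# A "race-blind" eligibility rule based only on cost-related inputs can still
# reproduce a racial disparity, because those inputs already encode segregated
# housing and past underinvestment.

# Each synthetic neighborhood: (name, majority_group, km_to_warehouse, members_per_km2)
NEIGHBORHOODS = [
    ("A", "white", 4, 320),
    ("B", "white", 6, 280),
    ("C", "black", 5, 120),   # fewer existing members after decades of underinvestment
    ("D", "black", 9, 90),
    ("E", "latino", 8, 110),
    ("F", "white", 3, 350),
]

def eligible(km_to_warehouse: float, members_per_km2: float) -> bool:
    """'Race-blind' cost rule: serve only cheap, dense areas."""
    return km_to_warehouse <= 7 and members_per_km2 >= 150

coverage = {}
for name, group, km, density in NEIGHBORHOODS:
    coverage.setdefault(group, []).append(eligible(km, density))

for group, flags in coverage.items():
    print(group, f"{sum(flags)}/{len(flags)} neighborhoods covered")
# In this synthetic example: white 3/3, black 0/2, latino 0/1. The disparity is
# produced without any reference to race, because the inputs stand in for it.
```

Because the rule itself contains nothing about race, its designers can plausibly deny any discriminatory intent, which is exactly the "strategic ignorance" described above.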

A powerful cultural myth identifies algorithms as neutral entities (Natale and Ballatore, 2018). However, algorithms are not impartial or unbiased (Gillespie, 2015). This is the striking role of "the culture in the code" (Airoldi and Gambetta, 2018): algorithms act as gatekeepers and decision-makers, and AI systems can embed cultural biases and reproduce social discrimination. Hence the question: can algorithms automate inequalities?

Ex. "Black girls", "Latina girls", "Asian girls". Back in 2009, when Safiya Noble conducted a Google search using these keywords, the first page of results was invariably linked to pornography. Noble refers to this process as the "pornification" of Black women and girls on Google. This observation resonates deeply with historical controlling images of the Black woman as a "jezebel" (Black women historically portrayed as innately promiscuous and predatory). Search engine results have a way of symbolically dehumanizing already oppressed groups. These processes affect the circulation of information about events and issues, and the amplification of dehumanizing and damaging cultural representations. Discriminatory stereotypes are reproduced and potentially amplified by AI systems that embody (even unconsciously) certain values and cultural frames.

Ex. The Black hairdresser Kandis promoted her business on the platform Yelp. However, she was invisible: the algorithmic practices of Yelp, in concert with the retreat of affirmative action at the major university near which she works, reduced and nearly erased her visibility and, therefore, her ability to make a living. The same happens when predominantly Black neighborhoods are excluded from search results by platforms: pre-existing discriminatory biases are reproduced, reinforcing oppressed groups' exclusion from emergent digitally dependent markets.

First, Noble's study highlights that our culture is rooted in a system of white supremacy. Historical analyses of employment and representation in media industries show persistent patterns of racialized tropes from advertising to pornography. Racial discrimination and sexual inequality emerge as a hidden feature of systems that billions of individuals use every day to make sense of reality.

Second, the contemporary formation of neoliberal economics turns everything, including matters that were until recently thought to be public goods, such as information and education, into commodities. Google is an advertising engine, not an information search engine: it exists to make a profit, and it is not going to strengthen democracy or provide neutral information. Google is the latest in a series of information technologies whose promise is democratic but whose actual deployment is commercial (again, good claims and hidden goals). Search engines are not simply neutral tools for mindlessly completing tasks. Without substantial reform and restructuring, they are also a source of immense social power and financial domination and, ultimately, mechanisms of ongoing oppression.

According to Rosino, it is not simply stories or even ideas that naturalize forms of oppression, but also the symbols, concepts, characteristics, and images pervasively associated with groups in dominant culture. Social representations matter! Noble's study of search results on Google and other platforms highlights some of the ways in which search engines misrepresent a variety of people, concepts, and types of information and knowledge. Noble exposes a culture of racism and sexism in the way discoverability is created online.

In her findings, search engines appear as a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.
Systems of categorization embody biases in classification: the ways in which we categorize social reality online are themselves biased.
AI systems can embed cultural biases and reproduce social discrimination.
People who are already marginalized and oppressed are subjected by digital media platforms to a reproduction and amplification of those mechanisms online.

Data discrimination ought to be considered a social problem....

