Algorithms and surveillance society 3500-SCC-asn
Algorithms are everywhere, and when they work well we usually do not notice them. They are used, among other things, to assess creditworthiness, screen job candidates, target advertising, and prevent terrorist attacks. They are so common because they make it possible to simplify complex phenomena and to process huge amounts of data, which facilitates decision making. Despite appearances, they are not neutral, objective tools that merely improve the management of complex problems: many normative choices are made in the course of their design, and their use also has numerous negative social and political effects.
The aim of the course is to look at how algorithms are created, how they work, and what the social and political consequences of their application are. The focus will be on revealing the normative assumptions that underlie selected algorithms. We will also focus on the threats they generate: reinforcing inequalities, hampering democratic control of political processes, and interfering with privacy.
The classes also aim to familiarize students with key theoretical texts on surveillance and control (including technology-mediated surveillance) and on the identification and categorization of individuals, i.e. how data collected about individuals is used to "sort them out" into better and worse citizens, customers, and employees.
During the classes, we will analyze selected cases of algorithm-based technologies in use, reflecting on broader issues such as the boundaries of privacy, the dehumanization of decision-making processes, and the weakness of democratic mechanisms when faced with new surveillance technologies. Our primary interest will be scoring systems (credit scoring and prediction models), profiling systems (crime profiling, migrant profiling, profiling of the unemployed), and other automated systems for classifying and assessing individuals (e.g. automated human resource management and employee assessment). We will reflect on the normative assumptions and value judgements about individuals (citizens, customers, employees) that underlie these systems and on the social and political consequences of their use. We will draw on a variety of sources: scientific literature, watchdog reports, media publications, and documentaries.
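To make this concrete, below is a minimal illustrative sketch in Python (not drawn from the course readings and not any real system): a toy credit-scoring rule in which every weight, cap, and cut-off is an arbitrary assumption, i.e. a normative design choice about which traits count and by how much.

# Toy example only: all weights, caps, and the threshold below are arbitrary
# assumptions, not real credit policy.
def toy_credit_score(monthly_income, months_employed, missed_payments):
    """Return a score between 0 and 100 for a hypothetical applicant."""
    score = (
        0.4 * min(monthly_income / 100, 100)   # income is rewarded, but capped
        + 0.3 * min(months_employed, 100)      # employment stability is rewarded
        - 15 * missed_payments                 # past missed payments are penalised heavily
    )
    return max(0.0, min(100.0, score))

APPROVAL_THRESHOLD = 50  # the cut-off itself is a policy decision, not a fact

if __name__ == "__main__":
    score = toy_credit_score(monthly_income=4200, months_employed=18, missed_payments=1)
    print(f"score={score:.1f}, approved={score >= APPROVAL_THRESHOLD}")

Changing any single weight or the threshold reshuffles who is approved and who is rejected; making such embedded value judgements visible is precisely what the course sets out to do.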
Type of course
Mode
Prerequisites (description)
Course coordinators
Assessment criteria
Credit requires attendance (two absences are allowed). The grade is based on activity, class work, and completion of tasks (40% of the grade) and on the preparation of a presentation about a selected technology (60% of the grade). Presentations will be commented on by invited guests: researchers or practitioners who specialize in the particular topics.
Bibliography
Bovens, M. & Zouridis, S. (2002) From Street-Level to System-Level Bureaucracies: How Information and Communication Technology is Transforming Administrative Discretion and Constitutional Control. Public Administration Review, Vol. 62, No. 2, pp. 174-184.
Bowker, G. C. & Star, S. L. (1999) Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
Citron, D. K. & Pasquale, F. (2014) The Scored Society: Due Process for Automated Predictions. University of Maryland Francis King Carey School of Law Legal Studies Research Paper No. 2014-8.
Collingridge, D., and Reeve, C. (1986). Science Speaks to Power: The Role of Experts in Policy Making. New York: St. Martin’s Press
Foucault, M. (1998) Nadzorować i karać: narodziny więzienia [Discipline and Punish: The Birth of the Prison]. Warszawa: Fundacja Aletheia.
Garfinkel, S. (2001) Database Nation: The Death of Privacy in the 21st Century. Cambridge, MA: O’Reilly.
Gilliom, J. (2001) Overseers of the poor: surveillance, resistance, and the limits of privacy, University of Chicago Press, Chicago.
Hacking, I. (1986), Making Up People, in Heller, Sosna, and Wellbery (eds), Reconstructing Individualism: Autonomy, Individuality, and the Self in Western Thought, Stanford: Stanford University Press, pp. 222-236.
Lyon, D. (1994) The Electronic Eye: The Rise of Surveillance Society. Cambridge, MA: Polity Press.
Lyon, D. (ed.) (2006) Theorizing Surveillance: The Panopticon and Beyond. Cullompton, UK: Willan.
Lyon, D. (2015) 'Citizenfour Alert!' and 'Snowden Storm', in Surveillance After Snowden. Cambridge: Polity Press.
Niklas, J., Sztandar-Sztanderska, K. & Szymielewicz, K. (2015) Profiling the Unemployed in Poland: Social and Political Implications of Algorithmic Decision Making. Warszawa: Panoptykon Foundation.
O’Neil, C. (2017) Broń matematycznej zagłady: jak algorytmy zwiększają nierówności i zagrażają demokracji [Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy] (M. Z. Zieliński, trans.). Warszawa: Wydawnictwo Naukowe PWN.
Schneier, B. (2015) Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. New York: W. W. Norton & Company.
Scott, J. C. (1998) Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven: Yale University Press (excerpt to be selected).
Szadkowski, K. (2015) Uniwersytet jako dobro wspólne [The University as a Common Good]. Warszawa: Scholar.
Whitaker, R. (1999) The End of Privacy: How Total Surveillance is Becoming a Reality. New York: The New Press
Additional information
Additional information (registration calendar, class instructors, location and schedule of classes) may be available in the USOSweb system: