If data is the new soil, and the new oil, one might ask whether we are constantly experiencing new and complex ontological futures of technology, and whether it is possible simultaneously to redefine them. At ‘Science and Technology by Other Means: Exploring collectives, spaces and futures’, the 4S/EASST Conference held in Barcelona this year, many such concerns were central to understanding the modern digital conditions we currently negotiate and maneuver through. Technoscientific imagination has always been crucial to conceptualizing particular ways of thinking about the future, and data provides an expanding terrain on which that imagination is made operational. This review reflects on a subset of the discussions the conference enabled: those concerning big data analytics and contemporary institutional practices.
In reviewing the conference and engaging with this year’s theme, my primary objective is to reflect on data-driven institutional practices in the meaning-making, regulation and governance of ‘illegal’ bodies and of ‘potential risks’, and on the implicit notions of illegality embedded in various categorizations of social groups, communities and populations. The discussions most relevant to these issues focused on data-driven practices in regulating social bodies, realities and phenomena, and on the perceptions of risk and illegality embedded in digital interventions. For example, in the session ‘Data-driven cities? Digital urbanism and its proxies’ (T027), presentations examined the meanings and uses of big data in analyzing urban spaces and politics, and in particular how data renders governance calculable and computable. Modern urban spaces are fertile ground for statistical analyses of social reality and phenomena, which institutions often understand as manageable risks; this can be read as resting on the idea of producing predictions (Mackenzie, 2015). Especially insightful were presentations on modern policing practices, where predictive analysis treats crime and responses to it as units of measurement that shape different forms of policing and personnel behavior, driven by analytic models that map social behavioral patterns. These were also discussed as practices of securitization, with embedded biases, in which algorithmic calculation becomes central to the governance of such risks (Amoore, 2009; Ziewitz, 2016).
The question of bias invites further analysis of digital infrastructures. Technologies of policing and of biometrics-based mapping, for instance, are often built on historical data and on definitions of illegality derived from preexisting human practices. These practices carry historical biases and social perceptions regarding individuals or specific groups and communities, which become embedded in processes of data collection and in the programming of algorithms. Since historical biases typically concern sections of populations already categorized as illegal or as risks, such technologies may persistently target certain groups over other sections of the population. Data-driven practices of identification and deterrence can thus end up creating new forms of discrimination.
This was insightful for my own research interest in critically analyzing the centrality of computable big data in describing social realities, and more specifically the movement of human bodies through regulated spaces of governance. Some presentations in the session ‘Infrastructures, subjects, politics’ (T085) examined case studies of infrastructures that seek to regulate populations and spaces, focusing on practices at the intersection of governance and the production and regulation of subjects through digital technologies. Presentations discussed border technologies for monitoring refugees and immigrants, biometrics-based authentication systems, and the various uses of smartphones to circumvent state infrastructures of monitoring and surveillance. While illustrating state surveillance practices, these discussions also raised questions about what forms of subversion and spaces of resistance are possible outside the domain of state infrastructures. One presentation focused on the implicit role of private interests in monitoring other sections of the population, such as transgender people, through health data infrastructures centered on notions of gender and sexuality.
All these questions were important for understanding the spaces we currently occupy and the possible futures one can envision. In response to such questions, Isabelle Stengers’ Keynote Plenary 2 was insightful: she argued that while one exists in the ‘ruins’ of contemporary social conditions and processes, or of sharing a common future, this also offers an opportunity to imagine alternate possibilities. For her, imagination is possibility, and one must therefore take into account the notion of generativity: as ontological, and as of situations that produce the possible. The nature of an event is to produce new moments of possibility and intervention, and thereby to indicate a way of thinking about collective spaces and futures.
For my work, the conference offered important insights into the issues I currently engage with, specifically big data and state practices of policing and monitoring immigrants. As a researcher working on digital infrastructures and big data analytics in India, I feel it is imperative to preserve the possibility of resistance and agency: machine learning that allows for human cooption and coproduction of technologies. Given global issues around immigration and refugees, the practices of surveillance and of managing populations as risks and illegalities constitute a present whose future needs reimagining, against current events that seem to suggest otherwise. One possible way forward is building consensus around policies such as transparency, open data, open government initiatives, and digital rights in connection with biometrics-based human-machine interactions. The idea of technology by other means is possible only when alternative spaces can be imagined and made possible. Data-driven forms of governance and intervention on spaces and human bodies represent one future in which technology is politics by other means: a different set of political practices in which issues and specific moments of human-machine interaction and conflict in infrastructures can be anticipated, critically analyzed and technically resolved.
Amoore, L. (2009). Algorithmic War: Everyday Geographies of the War on Terror. Antipode, 41(1), 49–69. http://doi.org/10.1111/j.1467-8330.2008.00655.x
Mackenzie, A. (2015). The production of prediction: What does machine learning want? European Journal of Cultural Studies, 18(4–5), 429–445. http://doi.org/10.1177/1367549415577384
Ziewitz, M. (2016). Governing Algorithms: Myth, Mess, and Methods. Science, Technology & Human Values, 41(1), 3–16. http://doi.org/10.1177/0162243915608948