datafication

By Daniel Johanne…, 17 June 2021
Pages
97-113
Journal volume and issue
62.1
Abstract (in English)

Digital platforms have become central to interaction and participation in contemporary societies. New forms of ‘platformized education’ are rapidly proliferating across education systems, bringing logics of datafication, automation, surveillance, and interoperability into digitally mediated pedagogies. This article presents a conceptual framework and an original analysis of Google Classroom as an infrastructure for pedagogy. Its aim is to establish how Google configures new forms of pedagogic participation according to platform logics, concentrating on the cross-platform interoperability made possible by application programming interfaces (APIs). The analysis focuses on three components of the Google Classroom infrastructure and its configuration of pedagogic dynamics: Google as platform proprietor, setting the ‘rules’ of participation; the API, which permits third-party integrations and data interoperability, thereby introducing automation and surveillance into pedagogic practices; and the emergence of new ‘divisions of labour’, as the working practices of school system administrators, teachers, and guardians are shaped by the integrated infrastructure, while automated AI processes undertake the ‘reverse pedagogy’ of learning insights from the extraction of digital data. The article concludes by considering the critical legal and practical ramifications of platform operators such as Google participating in education.
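As a concrete illustration of the API-mediated interoperability the abstract refers to, the sketch below (not drawn from the article itself) shows what a minimal read-only, third-party query against the Google Classroom API could look like in Python. The credentials.json file, the chosen scopes, and the google-api-python-client / google-auth-oauthlib libraries are assumptions about a typical integration setup, not details specified by the authors.

```python
# Illustrative sketch only (not from the article): a minimal read-only query
# against the Google Classroom REST API, the kind of third-party integration
# point the abstract describes. Assumes google-api-python-client and
# google-auth-oauthlib are installed and that OAuth client credentials exist
# in a local credentials.json file (hypothetical setup).
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Read-only scopes: course metadata and student coursework only.
SCOPES = [
    "https://www.googleapis.com/auth/classroom.courses.readonly",
    "https://www.googleapis.com/auth/classroom.coursework.students.readonly",
]

def main():
    # Interactive OAuth consent flow for a locally registered app.
    flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
    creds = flow.run_local_server(port=0)

    # Build a client for the Classroom v1 API.
    service = build("classroom", "v1", credentials=creds)

    # List the courses visible to the authenticated account.
    courses = service.courses().list(pageSize=10).execute().get("courses", [])
    for course in courses:
        print(course["id"], course["name"])
        # For each course, list the assigned coursework items.
        work = (
            service.courses()
            .courseWork()
            .list(courseId=course["id"])
            .execute()
            .get("courseWork", [])
        )
        for item in work:
            print("  -", item.get("title"))

if __name__ == "__main__":
    main()
```

Even a minimal client like this returns course and coursework metadata for any account that grants it access, which is the sort of routine cross-platform data flow the abstract's discussion of interoperability points to.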

DOI
10.1080/17508487.2020.1855597
Address

Budapest
Hungary

Short description

In this workshop, we seek to provide possible answers to the question: what does the replacement of writing by code mean for the future of reading and interpretation? With increasing reliance on algorithms and big data, does interpretation even have a future? What constitutes reading today, and what could hermeneutics look like in a digital age? Hermeneutics traditionally refers to the method and study of textual interpretation. Modern hermeneutics has its origin in textual exegesis, the interpretation of the Old Testament. It revolves around building bridges—between the present and the past, the familiar and the strange. In a time of post-truth, filter bubbles, and alternative facts, such perspectives are worth remembering and reiterating.

In our information age, we can predict to an increasingly precise degree what kinds of messages will resonate with us, and we can simply filter out the rest. In the Humanities and Social Sciences, the shift to datafication transforms our research fields in far-reaching ways, including how we think, how we formulate our research questions, and what answers we find. Was interpretation, then, a historically necessary but equally contingent mode? In what terms do we need to think about it as we move into a culture of big data, distributed AI, convergence, and globalization? Where does our influence end and that of the black box begin; and where does the analysis of the machine end and our responsibility begin? After all, data still is, and needs to be, interpreted. The workshop brings together scholars from diverse disciplines in the Humanities and Social Sciences to engage in a cross-disciplinary dialogue on these matters.

(source: event page)

Images

An umbrella shields a person at a desk with a computer from a rain of numbers.