5 Love and Other Data Assets
For years, I’ve led pedagogical development seminars for faculty at numerous institutions on topics like “higher education pedagogy,” “digital knowledge,” and “reimagining assessment.” Participants have included staff, librarians, adjunct faculty, junior faculty, senior faculty, STEM faculty, humanities faculty, social sciences faculty, fine arts faculty, current students, recently graduated students, and more. We’ve discussed practical issues arising from our own teaching, worked through readings, met with special guests, and done microteaching (offering feedback on short lessons taught by participants).
I’ve guided these discussions and chosen many of the readings, but the topics have arisen organically, usually beginning with a group brainstorm at our first session. At any one of these seminars, what we’ve done has been more important than what we set out to do, and we’ve reflected regularly on how we did it and why. The syllabi for these seminars have ended up as an eclectic “mixtape” including non-fiction, documentary films, and even comics, ranging from classics (John Dewey, bell hooks, Seymour Papert) to curiosities (Virginia Woolf, Henry David Thoreau) to more contemporary choices (Sara Goldrick-Rab, Ruha Benjamin, Cathy N. Davidson). Sometimes, there has been a clear path from one week’s topics to the next, but more often the juxtapositions have been haphazard, as much about generating friction as about finding common theses.
There is very little time, support, or funding in higher education for this kind of reflective consideration of our work. Teaching is quite often something we do alone, with very little direct preparation for the work. Conversations between and among faculty, librarians, instructional designers, and technologists are much rarer than they should be. The faculty/staff divide at many institutions limits our ability to talk across structural barriers. And contingent, adjunct, or precarious educators are too often left out of these conversations altogether. Each year I facilitated these seminars at the University of Mary Washington, for example, it was a struggle to include (and compensate) adjuncts, librarians, and staff. Spending faculty development funds on non-faculty, even non-faculty with teaching assignments, was discouraged by administrators. In fact, funding for the project was eliminated altogether every year, and I had to advocate (with the support of previous participants) for it to be reinstated. Even as funding for the seminar was eliminated again in early 2020, the annual bill for the learning management system contract went up. This kind of decision-making sent a message that the administration wanted technical support, not deep inquiry into the complicated work of teaching.
Educational technology is strangely situated at many institutions (usually somewhere vaguely between academics and IT), which further frustrates necessary conversations across the teaching/technology divide. And, quite often, for-profit ed-tech companies take advantage of this situation through predatory marketing tactics — pitching their tools to the most powerful, least knowledgeable folks at an institution. The majority of ed-tech is driven by the bureaucratic traditions of education more than the pedagogical ones.
In “Teaching as Possibility: A Light in Dark Times,” Maxine Greene writes, “It is obvious enough that arguments for the values and possibilities of teaching acts (no matter how enlightened) within the presently existing system cannot be expressed through poetry, even as it is clear that the notion of ‘teaching as possibility’ cannot simply be asserted and left to do persuasive work.” What Greene describes is a conundrum. For her, the space of the imagination, the habitus of poetry, is necessary to the work of education. But how do we reconcile the philosophies of John Dewey with the fact of the learning management system? How does the work of Maria Montessori sit without combusting alongside the increasingly aggressive marketing of remote and algorithmic proctoring tools? How do bell hooks’s words about self-actualization in the classroom not wither in a world of keystroke monitoring and plagiarism-detection software? And how can we honestly approach Virginia Woolf’s A Room of One’s Own with students if we’re complicit in the monetization of their educational data by for-profit companies?
These crises aren’t merely existential, nor are my examples purely hypothetical. The technological tools we’ve widely adopted for education are increasingly out of step with what we say education is for. There’s a serious problem in education if we assume dishonesty on the part of students while failing to acknowledge that for-profit tech companies like Turnitin or ProctorU might care about their bottom line more than they care about students.
During a 2019 investor conference, the (now former) CEO of Instructure (maker of the Canvas learning management system) bragged about the company’s “second growth initiative focused on analytics, data science, and artificial intelligence,” saying: “We have the most comprehensive database on the educational experience in the globe … No one else has those data assets at their fingertips to be able to develop those algorithms and predictive models.” What concerns me are two specific words that fell so easily off that CEO’s tongue: “data assets.” Teachers and students have long been called “users” and “customers” by educational technology companies, and this has made me uncomfortable enough, but reducing us and our work to “data assets” takes this a step further, exposing the role that for-profit companies and technologies play in the increasing precarity of education and educational labor. It’s not a coincidence that more than 70% of university teachers are working off the tenure track and nearly 1 in 2 students in the U.S. is food insecure, even as Turnitin claims its product is used by more than 30 million students at 15,000 institutions in 150 countries and the global learning management system market is expected to reach $28.1 billion by 2025.
As these bureaucratic technological systems become more ubiquitous, educators increasingly accept them as inevitable instead of pushing back when institutions invest more and more in machines (and algorithms) and less and less in teachers (and the work of teaching). The biggest issues for me arise when we adopt tools across an entire educational institution, discipline, or curriculum and give teachers and students no choice but to use them. A student’s degree or grade shouldn’t rest on whether they are willing to sacrifice their privacy or give their data to a for-profit corporation. And institutions shouldn’t allow the worst of these tools, like plagiarism detection and remote proctoring software, to short-circuit our best pedagogical intentions by creating a culture of distrust in education.
Grades are a big reason institutions adopt surveillance technologies like algorithmic retention platforms, remote proctoring tools, and plagiarism and AI detection software, all of which actively encourage a culture of suspicion in education. And the abuses of these tools are not distributed equally. In “Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education,” Shea Swauger writes, “Algorithmic test proctoring encodes ideal student bodies and behaviors and penalizes deviations from that ideal by marking them as suspicious.”
There’s a common end game for tech companies, especially ones that traffic in human data: create a large base of users, collect their data, monetize that data in ways that help assess its value, then leverage that valuation in an acquisition deal. An educational institution should be skeptical of those companies, not suspicious of its own students.
In “Digital Sanctuary: Protection and Refuge on the Web?,” Amy Collier writes, “We in higher education need to seriously consider how we think about and handle student data, and we need to respectfully and empathetically acknowledge where our practices may cause harm.” This work starts by talking openly with one another, and across institutional divides. Collier’s call is for dialogue, not empty bureaucratic structures. We all need to work together to ask hard questions of our technologies: Does the tool educate students about IP and data privacy before collecting data? Can individual students opt out, no matter the university policy? Is there a single button students can click to remove all their data? If the company does monetize the data it has collected, whether permission was given or not, will the owners of that data be compensated?
Asking these questions is necessary when a tool is institutionally adopted, but smaller versions of these conversations should happen every time we consider pointing to a tool on an institutional Web page or require the use of a tool for a course assignment. “If higher education is to ‘save the Web,’” Chris Gilliard writes in “Pedagogy and the Logic of Platforms,” “we need to let students envision that something else is possible, and we need to enact those practices in classrooms. To do that, we need to understand ‘consent’ to mean more than ‘click here if you agree to these terms.’” Our work as educators is not just to question ubiquitous practices, compulsory data collection, and algorithmic decision-making, but also to model what it looks like to think critically about the whens, whys, and hows of technology.
The learning management system isn’t an accident. It exists for very particular historical, bureaucratic, institutional, and pedagogical reasons. The same is true of remote proctoring software, plagiarism detection services, and algorithmically driven mobile apps for student retention. These tools are not neutral. In an interview with Tara Robertson (from the book Feminists Among Us), Chris Bourg offers, “Many people don’t proclaim their agendas, but definitely have agendas, even if they are agendas about maintaining the status quo, and never get asked about how they handle people in their organization who don’t agree with their agendas.” Bourg argues for a feminist administrative practice of radical openness and transparency about our own agendas. That same transparency should extend to educational technology: the onus has to be on the tech companies themselves to educate “users” about data security and data monetization.
Ed-tech companies need to state clearly: “Here’s what we’re collecting, here’s why we’re collecting it, here’s what we hope to do with it, here’s why it should matter to you.” Far too many companies attempt to shift these responsibilities entirely onto educational institutions, when they should be shared by students, educators, institutions, and the ed-tech companies themselves. We all need to talk honestly about what tools are for, how they might shift culture, and who they could disadvantage. We need to actively resist marketing jargon that would have us believe our “culture of academic integrity begins with Turnitin,” or that ProctorU “validates knowledge.” And by “resist,” I mean we should not adopt tools that conceal their monetization strategies or lie to us about their function.
My work in critical digital pedagogy begins with the presumption that there aren’t easy solutions in education — that students and educators bring complex backgrounds, experiences, and contexts to the work — and that this work demands we “gather together a cacophony of voices.” This means acknowledging fractious (and sometimes abusive) faculty/staff divides, drawing students into the work of building their own educational spaces, and creating much more inclusive conversations between technology companies and all “stakeholders,” the human beings who populate (and construct) a university, college, or other school.
In the description for my “digital knowledge” pedagogical development seminar, I write, “Putting the word ‘digital’ in front of ‘knowledge’ begs a whole host of questions: How do digital technologies change the ways we produce, disseminate and consume knowledge? Does social media indeed have a democratizing effect, as some have suggested? What inequities has it unearthed or enhanced? When we say ‘digital knowledge,’ how can we see knowledge as acting also upon the digital? What is the role for educators in helping construct new pathways into, out of, and around the digital?” One of the first things I encourage participants to read in these seminars is Teaching to Transgress by bell hooks. It is productive to hold her work, which has been foundational for my own pedagogy, up against investigations into the past, present, and future of educational technologies.
Henceforth, every time I hear the phrase “data assets” to describe students, educators, and their work, I’m going to recite in my brain (or aloud) these words from another book by hooks, Teaching Critical Thinking:
It is the most militant, most radical intervention anyone can make to not only speak of love, but to engage in the practice of love. For love as the foundation of all social movements for self-determination is the only way we create a world that domination and dominator thinking cannot destroy.
And thinking on those words, I’ll be reminded of this from Paulo Freire’s Pedagogy of the Oppressed: “If I do not love the world, if I do not love life, if I do not love people, I cannot enter into dialogue.”
We are not data assets. And the work of dialogue depends on that. It begins when we:
- build a community of care
- ask genuine, open-ended questions
- wait for answers
- let conversation wander
- model what it looks like to be wrong and to acknowledge when we’re wrong
- recognize that the right to speak isn’t distributed equally
- make listening visible
From there, we can find tools and technologies that support the project of education, ones that allow us to own our data, delete it at will, and export it easily — tools that allow “users” (human faculty, staff, librarians, adjuncts, administrators, technologists, and students) to create our own spaces (and conversations) on and about the Web — tools with an architecture of radical openness, love, imagination, hard questions, possibility, and poetry.