In the living room the voice-clock sang, Tick-tock, seven o’clock, time to get up, time to get up, seven o’clock!
~ Ray Bradbury, “There Will Come Soft Rains”
The more our tools are naturalized, invisible, or inscrutable, the less likely we are to interrogate them.
The 2015 film The Experimenter is based on the true story of Stanley Milgram, the Yale University psychologist who became famous for his 1961 social behavior experiments, which tested the obedience of volunteers who thought they were administering electrical shocks to strangers. In the film, his wife, Alexandra “Sasha” Milgram, is played by Winona Ryder, and she serves as the on-screen stand-in for the film audience. Our ethical response to what happens in the film is registered on her face. In several scenes, the camera focuses on the face of Winona Ryder watching the experiment unfold—her skin twitching, her body shifting uncomfortably, her eyes wide with both horror and a certain awe at what humans are capable of.
In his experiment, Milgram asked a “teacher” (the subject of the experiment) to shock a “learner” (an actor) for getting wrong answers on a simple test. An “experimenter” (working with Milgram) would order the teacher to give increasingly powerful shocks, and more often than not, the teacher complied. The study is not without baggage, but the results remain compelling nonetheless. At one point in the film, Winona Ryder as Sasha Milgram asks to experience the shock herself, the same small sample shock given to the teachers during the setup of the experiment. The scene plays out with a certain menace as the various accoutrements are put into action. Visually, she is overwhelmed by the devices that surround her: the electrodes, the teacher’s microphone, a series of digits that light up to show the learner’s answers, a pen, a clipboard, the gray of the experimenter’s lab coat, a recording device, and the large box of switches through which the teacher delivers the shocks.
Milgram himself describes this particular device as “an impressive shock generator. Its main feature is a horizontal line of thirty switches, ranging from 15 volts to 450 volts, in 15-volt increments. There are also verbal designations which range from SLIGHT SHOCK to DANGER-SEVERE SHOCK.” I sense glee in the language Milgram uses (“impressive”), something theatrical in the excess (“thirty switches”), and a fastidiousness in his attention to detail in reporting all this.
The machine and its mechanisms, but also the clipboard, the lab coat, and the other props, play clear roles in maintaining and even eliciting compliance. And the subtler, more intricate, or more inscrutable the mechanism, the more compliance it appears to generate, because the human brain fails to bend adequately around it. The camera works a similar magic on the film viewers as it ominously traces over these objects. Like our on-screen surrogate, Winona Ryder, we too sit still—complicit, both horrified and awed by what we see and by our inability to stop it.
The less we understand our tools, the more we are beholden to them. The more we imagine our tools as transparent or invisible, the less able we are to take ownership of them.
At the interview for my current job at the University of Mary Washington, the inimitable Martha Burtis asked me to reflect on the statement: “It’s teaching, not tools.” What assumptions does this oft-bandied-about phrase make? What does it overlook? Like Martha, I find myself increasingly concerned by the idea that our tools are without ideologies — that tools are neutral. Of course, they aren’t. Tools are made by people, and most (or even all) educational technologies have pedagogies hard-coded into them in advance. This is why it is so essential that we consider them carefully and critically—that we empty all our LEGOs onto the table and sift through them before we start building. Some tools are decidedly less innocuous than others. And some tools can never be hacked to good use.
In 2014, the EDUCAUSE Learning Initiative (ELI) report “7 Things You Should Know About the Internet of Things” noted: “The Internet of Things (IoT) describes a state in which vast numbers of objects are interconnected over the Internet and can collect data and transmit and receive information.” I find something ominous about the capital-I and capital-T of the acronym IoT, a kind of officiousness in the way these devices are described as proliferating across our social and physical landscapes.
The ELI report continues, “the IoT has its roots in industrial production, where machine-to-machine communication enabled the manufacture of complex items, but it is now expanding in the commercial realm, where small monitoring devices allow such things as ovens, cars, garage doors, and the human heartbeat to be checked from a computing device.” At the point when our relationship to a device (or a connected series of devices) has become this intimate, this pervasive, the relationship cannot be called free of values, ethics, or ideology.
I’ll be candid. I am quite often an unabashed fan of the Internet of Things. I like that my devices talk to one another, and I enjoy tracking my movement and my heart rate. I even find myself almost unable to resist my curiosity about something like the ridiculous Bluetooth-enabled cup that can track how much water I drink. I like controlling my car from my phone and feeling the tickle of an incoming text message on my wrist. But my own personal curiosity and fascination are outweighed by my concern at the degree to which similar devices are being used in education to monitor and police learning.
I am worried by sentences like this one from the ELI report: “E-texts could record how much time is spent in textbook study. All such data could be accessed by the LMS or various other applications for use in analytics for faculty and students.” I am worried by how words like “record,” “accessed,” and “analytics” turn students and faculty into data points. I am worried that students’ own laptop cameras might be used to monitor them while they take tests. I am worried that those cameras will report data about eye movement back to an algorithm that changes the difficulty of questions. I am worried because these things take us further away from what education is actually for. I am worried because these things make education increasingly about obedience, not learning.
Remote proctoring tools can’t ensure that students will not cheat. The LMS can’t ensure that students will learn. Both will, however, ensure that students feel more thoroughly policed. Both will ensure that students (and teachers) are more compliant. In his 1974 book Obedience to Authority: An Experimental View, Milgram described “the tendency of the individual to become so absorbed in the narrow technical aspects of the task that he loses sight of its broader consequences.” Even if I find the experiment itself icky, Milgram offers useful reflections on the bizarre techno-theater that made his experiment go.
When Internet-enabled devices have thoroughly saturated our educational institutions, they run the risk of being able to police students’ behavior without any direct input or mediation from teachers. By merely being in the room, the devices will monitor students’ behavior in the same way that the cameras and switches and lab coats did in Milgram’s experiments. How will learning be changed when everything is tracked? How has it already been changed by the tracking we do now? When our LMS reports how many minutes students have spent accessing a course, what do we do with that information? What will we do with it when we also know the heart rate of students as they’re accessing (or not accessing) a course?
Winona Ryder was caught on camera and arrested for shoplifting at Saks Fifth Avenue in 2001. How do we respond when a security guard peers through the slats of a dressing room to witness a very rich person, “scissors in hand, clipping sensor tags from store items”? The jury convicted her. She was vilified even as “Free Winona” t-shirts started flying off shelves. An early web meme was born.
I maintain a great deal of excitement about the potential of the Internet of Things. At the same time, I find myself pausing to consider what Milgram called “counteranthropomorphism” — the tendency we have to remove the humanity of people we can’t see. These may be people on the other side of a wall, as in Milgram’s experiment, or people mediated by technology in a virtual classroom.
Winona Ryder has very few lines of dialogue in The Experimenter, and yet her performance is a pivotal one, because she offers a guide, a moral compass, for the off-screen audience. She is complicit in her passivity and yet rebellious in her willingness to register raw and genuine emotion, something no other character can muster. And as the film unfolds, the shock and awe on her face give way to compassion. As she looks upon the scene of the experiment, she sees human beings and not the experiment.
We must approach the Internet of Things from a place that doesn’t reduce ourselves, or reduce students, to mere algorithms. We must approach the Internet of Things as a space of learning, not as a way to monitor and regulate. Our best tools in this are ones that encourage compassion more than obedience. The Internet is made of people, not things.