Managing Learning

18 Issues with Data: Personal Identity

Data about us is constantly recorded through our phones and computers, and it is interpreted according to who records it and who looks at it. To take just one example, Google builds a digital version of us – a digital identity – based on what we do on its platforms. It labels us according to this data and then rearranges what we see on its search engine and apps accordingly. It also markets us to companies who might want to market themselves to us.

Activity

Access your “Ad Settings” profile on Google, Facebook or Instagram. If you regularly use another platform, find out whether it has ad settings and whether you can access them. These settings are part of our digital identity.

Questions for discussion:

  • What does your “digital identity” look like? Does it reflect your demographics and interests? Do you agree with this identity?
  • How do you think Google decided each of these interests? What data could have been taken into account? These interest categories change frequently and are recursive: an ad interest you are associated with can determine what ad interest you’ll be categorized with next. What can this tell us about profiling?
  • Do you agree with scholars such as Cheney-Lippold and Bassett that there is an over-reduction of identity here? Why is this an ethical concern?
  • Ethically, does it matter more if these profiles get your interests “right” or “wrong”?
  • Do your gender and race play a part in how you are labelled? How does that make you feel?

This activity has been adapted from Identity, Advertising, and Algorithmic Targeting: Or How (Not) to Target Your “Ideal User”1, licensed under CC BY-NC 4.0.

The labels Google gives us – male, female, young or old – have nothing to do with our identities, needs or values. Someone can be male if they look at certain websites (say, hardware stores) and buy certain items2. Tomorrow, a male can become female if their own activity changes, or if the activities of the million other users whose behaviour defines what counts as male-like change. Different companies give us completely different identities, depending on what interests them.
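To see how thin such labels are, here is a minimal sketch in Python. Everything in it is invented for illustration – the site names, weights and threshold are assumptions, not how Google or any real ad platform actually works – but it captures the mechanism described above: a gender label that is just a weighted sum over browsing behaviour, with weights that shift as the behaviour of millions of other users shifts.

    # Hypothetical sketch of behavioural labelling; site names, weights and
    # the threshold are invented - this is not any real platform's system.
    site_weights = {
        "hardware-store.example": +0.6,   # currently counts as "male-like"
        "cooking-blog.example":   -0.4,   # currently counts as "female-like"
        "news-site.example":       0.0,   # currently uninformative
    }

    def inferred_gender(visits: list[str]) -> str:
        """Label a user from browsing history alone - not from who they are."""
        score = sum(site_weights.get(site, 0.0) for site in visits)
        return "male" if score > 0 else "female"

    history = ["hardware-store.example", "news-site.example"]
    print(inferred_gender(history))    # -> "male"

    # "Tomorrow": the weights are re-derived because millions of other users
    # changed their habits; the same person, with the same history, flips label.
    site_weights["hardware-store.example"] = -0.2
    print(inferred_gender(history))    # -> "female"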

The same is done to our students when they interact with personalised learning software and are subjected to learning analytics. Their digital identity – their performance, engagement and satisfaction as viewed by these systems – is then used to evaluate not only their own performance but also that of their peers, teachers, schools and the educational system itself3.
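A second sketch, in the same spirit, shows what such a profile can look like in learning analytics. The metric names, weights and threshold are again invented, not taken from any real product; the point is how a few easily logged proxies end up standing in for a whole student.

    # Hypothetical sketch of a learning-analytics label; metrics and weights
    # are invented for illustration, not taken from any real product.
    student_log = {
        "logins_per_week": 2,     # proxy for "engagement"
        "avg_quiz_score": 0.65,   # proxy for "performance"
        "forum_posts": 0,         # in-class discussion is invisible to the log
    }

    def engagement_label(log: dict) -> str:
        """Collapse noisy proxies into a single label attached to the student."""
        score = (0.4 * min(log["logins_per_week"] / 5, 1.0)
                 + 0.5 * log["avg_quiz_score"]
                 + 0.1 * min(log["forum_posts"] / 3, 1.0))
        return "at risk" if score < 0.5 else "on track"

    print(engagement_label(student_log))   # -> "at risk"

The label here is driven as much by what the system cannot see (the zero forum posts) as by what it can – which is exactly the kind of problem listed below.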

Why is this a problem?

  1. These profiles are often composed from noisy and incorrect data drawn from various sources and can be very misleading4.
  2. These digital identities can change how a student sees themselves and others, how teachers see each student, how the system sees each teacher, how the society sees education and pedagogy, and how everyone reacts to decisions and feedback3.
  3. Yet these judgements about who someone is are made without their knowledge or consent – by black boxes no one has access to. Often there is no control over what data is recorded, where and when it is recorded, and how decisions are made based on it4,1. Students and teachers lose their expressive power and human agency.
  4. This data and judgements tend to persist as stored data long after the recorded event took place4.
  5. The emphasis on metrics, where students, teachers and staff are constantly assessed, compared and ranked, can induce anxiety and competition instead of motivation and growth3.
  6. Aspects of education that can be automatically captured and analysed are given more importance, pushing us towards outcomes and practices that differ from what we might otherwise value.
  7. The organisations that do the “datafication” have the power to define “what ‘counts’ as quality education – a good student or an effective teacher3.”

Here are countermeasures that experts suggest teachers take:

  1. Consider the people, their identity, integrity and dignity: “Approach people with respect of their intrinsic value and not as a data object or a means-to-an-end”5. People are not just their data; the label that a piece of software gives students to personalise learning pathways or to split them into groups is not their real identity5.
  2. Be data literate: Learn how to handle data correctly. Learn what different data-based systems do, how they do it, what their recommended usage is, and how to interpret the information they generate and the decisions they make.
  3. Maintain a critical distance from AIED companies and software. Question their claims, ask for proof of their validity and reliability, and verify that their systems follow the ethical guidelines of your institution and country3.
  4. Monitor the effects these systems have on you, your students, their learning and the classroom atmosphere.
  5. Call for open systems that give you control and the power to override automated decisions. Pitch in, clarify or override wherever and whenever you feel the need.

1 Kant, T., Identity, Advertising, and Algorithmic Targeting: Or How (Not) to Target Your “Ideal User”, MIT Case Studies in Social and Ethical Responsibilities of Computing, 2021.

2 Cheney-Lippold, J., We Are Data: Algorithms and the Making of Our Digital Selves, NYU Press, 2017.

3 Williamson, B., Bayne, S., Shay, S., The datafication of teaching in Higher Education: critical issues and perspectives, Teaching in Higher Education, 25:4, 351-365, 2020.

4 Kelleher, J.D., Tierney, B., Data Science, MIT Press, London, 2018.

5 Ethical guidelines on the use of artificial intelligence and data in teaching and learning for educators, European Commission, October 2022.

Licence


AI for Teachers: an Open Textbook Copyright © 2024 by Colin de la Higuera and Jotsna Iyer is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
