The (Data) Clouds Have Eyes

“Echo, are you spying on me?”

“No, I’m not spying on you. I value your privacy.”

-Me, to my Amazon Echo Dot


So, may I be the first to say: holy shiitake mushrooms, colleagues. There’s an unbelievable number of things to address in this week’s readings, so I’ll break it all down into a cohesive narrative to the best of my ability. I was adamant about facilitating the discussion on data and surveillance because I began researching this field last semester. I took a course in the Urban Education program titled Immigration and the Intersections of Education, Law, and Psychology, and for my term paper I drafted an article of my own, currently titled “Digital Challenges of Immigration: How Technology is Working Against Diverse Bodies.” The article is about the rapid technological and database advancements made within the United States after September 11, 2001, and how those technologies work against diverse bodies and foster potentially dangerous digital environments for populations such as immigrant college students. When Jones, Thomson, and Arnold (briefly) mentioned the term “biomarkers” in their piece “Questions of Data Ownership on Campus,” I made the connection to Ruha Benjamin’s explanation of biomarkers in Race After Technology.

According to Benjamin (2019), within these large collections of data, names are encoded with racial markers, which means they do more than signal cultural background; they also carry a plethora of historical context. What makes this concept stick with me is the readings’ discussion of universities’ storage and misuse of #BigData, and the startling ways these collections can be linked and used. Looking at the following quotes, one can’t help but get a severely sinister vibe:

“Rarely are such systems outright coercive, but one could imagine developing such systems by, for instance, linking student activity data from a learning management system to financial aid awards. Rather than relying on end-of-semester grades, an institution might condition aid on keeping up on work performed throughout the semester: reading materials accessed, assignments completed, and so forth.” 

“From application to admission through to graduation, students are increasingly losing the ability to find relief from data and information collection. Students are required to reveal sensitive details about their past life and future ambitions, in addition to a host of biographic information, simply to be considered for admission—they are never guaranteed anything in return for these information disclosures.” 

“’College students are perhaps the most desirable category of consumers,’ says Emerson’s Newman. ‘They are the trickiest to reach and the most likely to set trends.’ As a result, he says, their data is some of the most valuable and the most likely to be mined or sold.”

“But the company also claims to see much more than just attendance. By logging the time a student spends in different parts of the campus, Benz said, his team has found a way to identify signs of personal anguish: A student avoiding the cafeteria might suffer from food insecurity or an eating disorder; a student skipping class might be grievously depressed. The data isn’t conclusive, Benz said, but it can ‘shine a light on where people can investigate, so students don’t slip through the cracks.’”

For starters, what the hell? Perhaps we should walk slowly through the problematic features of these quotes because, at first glance, everyday readers (non-edtech enthusiasts) may not pick up on the subtle (or, for some, not so subtle) red flags in each one.

The first red flag I’m referring to is the blatant use of the term “coercion.” This quote was pulled from Jones et al.’s “Student Perspectives on Privacy and Library Participation in Learning Analytics Initiatives,” which focused on a WiFi-based tracking system for which all students had to download an app so that the university could maintain attendance records. I teach my Communications students that what makes “coercion” different from “convincing” is that coercion persuades the target through threats and fear tactics. Suggesting that the future of such technology could include linking these applications to learning management systems (LMSs) in order to turn financial aid into a conditional reward is horrifying, ableist, racist, and classist, to say the least.
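To make concrete just how blunt that would be, here is a minimal sketch of the kind of rule the quote imagines. To be clear, everything in it is hypothetical: the release_aid function, the LMS record fields, and the 80% threshold are my own inventions, not any real institution’s policy.

```python
# Hypothetical illustration of the conditional-aid scenario the quote
# imagines -- NOT any real institution's policy. The release_aid function,
# the LMS record fields, and the 80% threshold are all invented here.

def release_aid(lms_record, threshold=0.8):
    """Release the next aid disbursement only if LMS activity looks 'on track'."""
    completed = lms_record["assignments_completed"] / lms_record["assignments_due"]
    accessed = lms_record["readings_accessed"] / lms_record["readings_assigned"]
    # One bad stretch -- a hospitalization, a family emergency, a chronic
    # condition flaring up -- and the money that pays for rent and food
    # is frozen by a ratio check.
    return completed >= threshold and accessed >= threshold

student = {
    "assignments_completed": 7, "assignments_due": 10,  # 70%: two rough weeks
    "readings_accessed": 9, "readings_assigned": 10,    # 90%: otherwise engaged
}
print(release_aid(student))  # False -> aid withheld mid-semester
```

A single if-statement standing between a student and their rent money: that is the future being floated.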

Dangling financial aid like a carrot in front of students to get them to perform to a university’s liking is extremely dystopian. It bleeds heavily into issues of racism and classism, because this reward system would not touch privileged students who have no need for financial aid. Students coming from lower-income communities rely on that aid not only to pay tuition but to keep food on the table and a roof over their heads. When the rug is pulled out from under them because of, say, a medical condition that occasionally keeps them from attending class, who does that benefit? And as we know, those low-income communities are more often than not predominantly minority groups. This is very reminiscent of the introduction to Virginia Eubanks’s book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, in which she tells the story of how her partner required a life-saving $62,000 surgery. Eubanks’s (2018) insurance company denied coverage for the surgery, despite her domestic partner being covered through her insurance at her new place of employment. Upon further investigation, Eubanks (2018) realized that her family had been red-flagged and placed under investigation for fraud. Who is to say these actions aren’t racially influenced by markers within these big data clouds?

On top of these questionable backend practices, the actual user interface that displays student attendance and progress, mentioned in the same article, looks similar to applications like Credit Karma (I’d share a screenshot for comparison, but I don’t need you all seeing my credit score #DataPrivacy). What struck me was how it resonated with a quote from the same piece about “cradle-to-grave profiles”: students tracked through their entire academic careers and beyond, with these services trailing them into their professional lives in order to evaluate outcomes.

I apologize, but I am fixated on this piece in particular because it addressed so many violations of college students’ independence in a way that really left me disturbed. The piece also describes the ability to pinpoint a student’s exact location, supposedly to keep tabs on their health throughout the semester. As observed in the last quote, by tracking where a student spends their time on campus, the company claims it can guess what they are going through emotionally. Huh?! Making the bold assumption that a student has an eating disorder because they do not spend time in the dining hall breaks so many social boundaries. If we’re being honest, a lot of universities have crap dining options; I know mine did in undergrad. So if a student stays in their dorm building to cook and study, they may get marked in the database as some sort of hazard or concern to the university. The more I pry open this article, the more dystopian it feels, as the little sketch below illustrates.
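Here is a minimal sketch of a “student avoiding the cafeteria” flagger. Again, everything in it is hypothetical, my own invention rather than the vendor’s actual system: the log data, the seven-day window, and the flag_missing_cafeteria function.

```python
from datetime import date, timedelta

# Hypothetical location pings: (student_id, location, day). In a real
# deployment these would come from WiFi access points or Bluetooth beacons.
logs = [
    ("s1", "cafeteria", date(2020, 3, 2)),
    ("s1", "library", date(2020, 3, 9)),
    ("s2", "library", date(2020, 3, 9)),  # s2 never pings the cafeteria
]

def flag_missing_cafeteria(logs, today, window_days=7):
    """Flag every student with no cafeteria ping in the last `window_days` days."""
    last_seen = {}
    for student, location, day in logs:
        if location == "cafeteria" and day > last_seen.get(student, date.min):
            last_seen[student] = day
    cutoff = today - timedelta(days=window_days)
    students = {s for s, _, _ in logs}
    # A student who cooks in their dorm, eats off campus, or just avoids bad
    # dining options is indistinguishable here from one in genuine distress.
    return sorted(s for s in students if last_seen.get(s, date.min) < cutoff)

print(flag_missing_cafeteria(logs, today=date(2020, 3, 10)))
# ['s1', 's2'] -- both flagged, for entirely different (unknowable) reasons
```

Both students get flagged for entirely different, unknowable reasons, which is exactly the “data isn’t conclusive” problem the last quote concedes.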

“Because if [my institution] had the intention of using my data to create better programs or better educational tools, then I’m all for it, you know…. But I could also see certain things that are tracked, maybe being a little embarrassing. I initially didn’t go [to the counseling center] for a long time because I was embarrassed, because I knew that the university was going to be able to track that and look at my record and say, “Oh yeah, she’s been going to counseling.” And maybe if they wanted to, they could somehow find out what exactly it was that I was talking to the therapist about.”

We want to assume that these practices aren’t super present in the magical land that is CUNY, but students have experienced something similar within our very own Graduate Center. A peer of mine and I were talking about the Wellness Center and the counseling services provided through our tuition and fees; they had decided to pursue the counseling services to talk through some stuff. First of all, we only receive SIX (6) sessions a semester as graduate students, like, okay. Second, during the session, my peer was made aware of the fact that the session was being recorded on camera. However, they were assured that there was no audio. Wait, what? Yeah, cameras in the counseling center, apparently. I wanted to investigate this for myself but unfortunately have not had the time. We tried to justify it with the idea that psychology doctoral students in specific disciplines need to meet an hourly requirement, but that just wasn’t enough to make us feel comfortable taking advantage of a space where we are expected to be vulnerable.

I’ve been ranting quite a bit at this point, so I’ll start to wrap up and leave the remainder for our class discussion. With the references I’ve made in this post alone, I want to convey the extreme discomfort I feel over the fact that these unethical educational technologies are invading university campuses in ways that threaten the growth and development of every student’s individuality. Higher education is about wellness and the personal experiences that influence our decisions just as much as (or at least close to as much as) it is about the ways classrooms shape students’ futures. I have so much that I want to talk about, so I am looking forward to probing these horrors even deeper with you all tomorrow.

7 thoughts on “The (Data) Clouds Have Eyes”

  1. Elena Abou Mrad

    Oh wow! I just read the title from my email notification and I knew I had to read the whole post. First of all, I love the title. And I really appreciated how you articulated your discomfort with these surveillance practices. I liked that you seem interested in the whole student population, with a particular focus on vulnerable groups like students with disabilities or those from disadvantaged economic backgrounds. Can’t wait to discuss more in class tomorrow!

  2. Luke Waltzer (he/him)

    Amazing post, Anthony– thank you! A lot to chew on here. Extreme discomfort is a good starting place in reaction to the overview this week’s readings provide. One question that I want to pose for consideration later is this: if we are to construct and protect teaching and learning spaces as liberatory sites, what does that mean for our orientation to these tools? What do skepticism, resistance, rejection, and counterbalancing look like?

  3. Kathleen Begonia

    Thank you for this post, Anthony. This is an extremely interesting topic. It’s eye-opening to me that schools are trying to monitor every step their students take. As a nurse, I believe every individual has a right to privacy, especially because we are taught that ensuring privacy promotes patient dignity. Before any procedure, we make sure the curtains are pulled or cameras are off or turned away. How can you develop trust with someone if there are cameras on? What will the Wellness Center do with the footage? Students should have the option to opt out of the recording or to refuse counseling while a camera is rolling; if the camera must roll (for some reason), then the USB drive or file should be kept with the student, not the counselor.

    These articles reminded me of 23andMe and the All of Us campaign by the National Institutes of Health.

    23andMe:
    https://www.wired.com/story/23andme-glaxosmithkline-pharma-deal/
    23andMe sold access to its databases, filled with the genetic data of everyday citizens, to a major drug company (GlaxoSmithKline). GlaxoSmithKline intends to create better drugs based on our genetic data. Many of 23andMe’s customers did not know the implications of sharing their genetic info because 23andMe was advertised everywhere as a fun thing to do or a fun gift to give someone: you get to find out your biological inheritance! Apparently, all the information about potential sharing of genetic data was within the consent process that customers must agree to before shipping their sample off to the company, but people don’t care to read carefully or don’t have the time.

    I think we really need to caution each other about who we share our personal data with and really question the values and ethics of companies that want to collect our information. What other companies or researchers are they partnered with? Since 23andMe shared its databases with Big Pharma, what happens if insurance companies get hold of our info? Will they deny coverage based on anticipated disease processes?

    The NIH (National Institutes of Health) is also actively campaigning to recruit a million Americans to share their genetic data and answer surveys about how they live and where they work. Researchers with the NIH plan to advance precision medicine by using genetic data paired with family history and social determinants of health to improve health outcomes. This all sounds great, but trusting the government to collect and store data may not sit well with everyone. Between 1932 and 1972, the U.S. Public Health Service conducted the Tuskegee Syphilis Study, collecting data on 600 impoverished African American sharecroppers in Alabama and following them for 40 years to observe the outcomes of syphilis. Even after penicillin was found to be a cure, the sharecroppers were not offered treatment. This violated ethical codes and led to the establishment of the Office for Human Research Protections and the required creation of institutional review boards.
    All of Us: https://allofus.nih.gov/about/all-us-research-program-overview

    1. Carolyn A. McDonough

      Wowza, this unit of reading and these posts are intense; I can’t wait for class today. I want to make a poster from a tri-fold protesting the very clear violations of privacy and ethics, and the replacement of common sense and “brotherly love” (which the USA’s founding documents rely heavily upon) with BIG BRO.

      Anthony, your indignation and observations are spot on. The Wellness Center at the GC uses cameras?! WHY?!

      Kathleen, thank you for pointing out the Tuskegee Syphilis Study, which is a travesty. Unbelievable! I have a lot of trouble with the genetic databases and know people whose family lives have been majorly disrupted by them. In one case, a woman took the test against her family’s will and got a “match” to a sibling but couldn’t understand why, until it was fully revealed that her father had had children with a woman other than her mother, who then contested her father’s will. And now the siblings she grew up with are really mad at her. Yikes. I would certainly call 23andMe and Ancestry.com (which has deep roots in the Mormon community) DISRUPTORS in the business, social, and cultural senses indeed. Not a fan!

      To be cont’d very soon…
