Presentation Document

Hi guys,

For my “presentation” today, I uploaded a draft of my project to our group files page here. I apologize; I meant to get it up sooner, but some things came up. If you have a chance to check it out before we get started, feel free to do so! My presentation will be more of a writer’s workshop format, so no pressure.

See you all very soon!

Intervention week: “lazy consensus” and gardening (vs streaming)

Once again, I apologize for this late post. It’s also going to be a brief write-up.

This week I was particularly moved by two separate ideas & practices in two of our readings: one is “lazy consensus” from “Ethical EdTech” and the other is “the garden/gardening” in Audrey Watters’ “Why ‘A Domain of One’s Own’ Matters (For the Future of Knowledge).” 

To be honest, I was not (and am not) familiar with how “lazy consensus” peacefully operates as a decision-making and voting system; as a critical thinker more interested in the visibility of dissensus in the public or counter-public, I feel somewhat distant from it. Still, what I liked is that, as the author Nowviskie states, “Lazy consensus wielded wisely and justly is capable of galvanizing comatose organizations back into motion and even of reversing terrible inertial trends.” According to Nowviskie, “lazy consensus” as a principle stems from Newton’s First Law of Motion: “it describes how we might apply the concept of inertia to decision-making in organizations. This is because lazy consensus is not only a tactic to use under clinical conditions (such as opinion-polling of members of a committee). It may be a social contract — but in all frankness it’s also practically a natural law.” While I don’t quite like the liberal, progressive tone of this definition or derivation, I find the idea of using inertia to prevent an impasse of inertia in ethical decision-making about ed tech pragmatically inspiring, in that it allows ed-tech practitioners to keep moving toward the future. But I hope we can talk about the kinds of cases and problems involved in this sort of decision-making process.
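Since I admitted I’m fuzzy on the mechanics: in open-source communities (Apache is the usual example), lazy consensus is a concrete protocol in which a proposal passes by default once a review window closes, unless someone objects in the meantime. Here is a minimal sketch in Python; the class design and the three-day window are my own illustrative choices, not anything from the reading:

```python
from datetime import datetime, timedelta

class Proposal:
    """A proposal decided by lazy consensus: it passes by default
    once its review window closes, unless someone has objected."""

    def __init__(self, title, review_days=3):
        self.title = title
        self.opened = datetime.now()
        self.deadline = self.opened + timedelta(days=review_days)
        self.objections = []

    def object_to(self, member, reason):
        # A single objection blocks the default outcome and forces discussion.
        self.objections.append((member, reason))

    def status(self, now=None):
        now = now or datetime.now()
        if self.objections:
            return "blocked: needs discussion"
        if now < self.deadline:
            return "open for review"
        return "passed by lazy consensus"  # silence is counted as assent

# No one objects, so once the window closes the proposal simply passes:
p = Proposal("Adopt an open-source gradebook", review_days=3)
print(p.status(p.opened + timedelta(days=4)))  # passed by lazy consensus
```

Notice how inertia is built in: the “natural law” here is that doing nothing produces motion, which is exactly what makes the mechanism both efficient and (to my dissensus-loving eye) a little unsettling.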

Regarding the garden in Watters’s essay, I loved the poetic implications of the gardening (or designing) metaphor, which contrasts with streaming and the issues of data exhaust and waste it can generate (and hence the further isolation of domain users). As a teacher, I was very encouraged by the practice of students having their own domains for autonomous knowledge and scholarship, but I wasn’t sure how those independent, student-run sites could be delivered to an interested audience. Watters doesn’t articulate precise tactics of gardening for building a network of students’ domains (imagine each student’s domain as a bed of flowers grouped by season, color, or region) that would help students connect to one another’s knowledge and invite external readers to their pages; still, I suspect a similar practice already exists among those pursuing a domain of one’s own, and I hope the class can discuss this further. In addition, regarding teachers’ roles in navigating students’ gardens, I wondered what more we could think about when it comes to grading and otherwise evaluating students’ work and scholarship. Can we just leave students as gardeners on their own terms? Isn’t the idea of autonomy a bit too irresponsibly generalized nowadays? I hope we can talk about this together.

‘Do no harm’: a proposal for an Ethics Committee in Ed Tech

When I was reading the materials for this week, one definition in Pat Reid’s article stood out for me:

“Educational technology is the study and ethical practice of facilitating learning and improving performance by creating, using, and managing appropriate technological processes and resources.”

Ritzhaupt, Albert D., et al. “Development and Validation of the Educational Technologist Competencies Survey (ETCS): Knowledge, Skills, and Abilities.” Journal of Computing in Higher Education, vol. 30, no. 1, Apr. 2018, pp. 3–33. Springer Link, doi:10.1007/s12528-017-9163-z.

The word that really resonated with me was “ethical”. In all the failures of ed tech we’ve read about, the ethical component was the most problematic to me. Ed tech should be used to promote learning, but in many cases it becomes an exploitative practice toward students and faculty. In Austerity Blues, for example, the authors expose how UCLA administrators failed to inform the Faculty Senate about the legal negotiations with THEN-OLN. Another example of failed ed tech, the SJSU-Udacity experiment, caused a faculty uproar: the SJSU Academic Senate protested the administration’s power to unilaterally impose new pedagogical practices and to monetize the faculty’s intellectual property without their consent.

I thought about Audrey Watters’s proposition to have a “Hippocratic Oath for Ed Tech”. She draws a comparison between educators and healthcare professionals, who share the mission “First, do no harm”.

Coming from a background in Comparative Literature, I decided to take the comparison even further, and two words popped into my mind: “comitato etico” (“ethics committee”). In researching what an ethics committee exactly is, I was surprised by two facts:

  1. It is only used for medical studies! For some reason, I thought that every public institution needed an ethics committee – how optimistic of me.
  2. My beloved EU has a directive about ethics committees and how they work.

The Ethics Committee

The definition of an Ethics Committee is in Article 2 of DIRECTIVE 2001/20/EC of the European Parliament (emphasis mine):

“(k) ‘ethics committee’: an independent body in a Member State, consisting of healthcare professionals and non-medical members, whose responsibility it is to protect the rights, safety and wellbeing of human subjects involved in a trial and to provide public assurance of that protection, by, among other things, expressing an opinion on the trial protocol, the suitability of the investigators and the adequacy of facilities, and on the methods and documents to be used to inform trial subjects and obtain their informed consent”

Directive 2001/20/EC of the European Parliament and of the Council of 4 April 2001. Official Journal of the European Communities, L 121, 1 May 2001.

What does independent mean?

  • The ethics committee and the research center must not be in a relationship of hierarchical subordination
  • Personnel external to the research center must be present
  • There must be no conflict of interest
  • Committee members must not have an economic interest in pharmaceutical companies

The goals of the ethics committee

  • Guarantee the feasibility of the project, from an ethical and a scientific standpoint;
  • Protect the rights of the subjects of the research;
  • Ensure an appropriate relationship between the research center and the sponsor of the study.

How can we translate this to Ed Tech?

The Ed Tech Ethics Committee:

  1. Is an independent body
    • no hierarchy among members
    • no conflict of interest: committee members and the institution must not have ties with Ed Tech vendors
  2. Is composed of university personnel (faculty, adjuncts, administrative staff) and a representation of students
  3. Is responsible for ensuring:
    • Respect of faculty and students’ rights (ex. intellectual property, privacy)
    • Pedagogical effectiveness
  4. Needs to provide public assurance
    • Accountability
    • Transparency
  5. Its goals are to:
    • Ensure the economic, pedagogical, and ethical feasibility of a product or service
    • Defend the rights of faculty, staff, and students
    • Ensure an ethical relationship between the institution and the vendors.

CUNY Graduate Center’s First Ethical Committee for Ed Tech: a Simulation Game

Part of my MA in Italy was dedicated to videogame semiotics and game studies, especially as far as boardgames are concerned. Thanks to Professors Peppino Ortoleva, Ivan Mosca, and Riccardo Fassone, I learned how helpful games can be in understanding complex issues.[1]

Therefore, I thought it might be interesting to simulate how an Ethics Committee might work in the Ed Tech field. Yes, it’s going to be nerdy. And yes, I’ve been watching too much Star Trek these days.

[Image: Wesley in the Holodeck]

The Scenario

During a global pandemic (lol), the needs of the CUNY GC have changed. And Blackboard has crashed. The GC needs a new platform for distance learning and for ensuring the safety of students and personnel. Ed Tech company Whiteboard offers CUNY GC a service similar to Blackboard, but harder, better, faster, stronger. It also offers a mobile app that tracks GC students and faculty to monitor the spread of Coronavirus, similar to what Apple and Google are building now. The Ed Tech team at CUNY GC creates an Ethics Committee to evaluate the proposal.

Players and their Roles:

[Image: Star Trek meeting]
  1. The Vendor: Prof. Waltzer. I thought it would be interesting to have him reverse his usual role. The vendor offers a brief presentation of the product and is available to answer questions from the Ethics Committee.
  2. The Ethics Committee: I’m assigning the roles according to the list in our group on the Commons. Feel free to reach out in the comments if you wish to change roles or if you can’t make it to class on April 21. Committee members ask questions of one another and of the Vendor; all you need to do is prepare some questions or discussion prompts beforehand.
    • Student Representatives: Hyemin, Lucas
    • Faculty: Carolyn, Jelissa
    • Adjuncts: Jason, Kathleen
    • Administrative Staff: Zach, Anthony
    • Librarians: Shani
  3. Guide: me. I’ll be there to coordinate the discussion, help out, and let you know when it’s time to make a decision.

The Goal

This is a cooperative game: players need to discuss and come to a shared solution.

Can’t wait to see you all on Tuesday…and I hope the game works!

[Image: Captain Picard saying "engage"]

[1] By the way, Ortoleva coined the term homo ludicus and did extensive research on the ubiquity of game and play in contemporary society, in case you’re interested!

Tools and Platforms: By Kranzberg’s Laws

I was interested in doing my blog post for this week because over the last few years, my various jobs have given me access to K-12 school curricula and to how students, their teachers, and parents react to and interact with today’s educational system. Living in an age when we look to tech for big, solve-all solutions to our inquiries and problems, one question often pops up in my head: at what point does technology cease to be helpful and become harmful to its creators?

I often think back to my time working as a tutor for the NYPL, when one of my students asked me how to spell a word and what it meant. I pointed him to the stack of dictionaries sitting in the middle of the room collecting dust. He told me he had never learned how to use a dictionary, and it was the same with the other students in our tutoring groups. He quickly picked up a laptop and said, “Why can’t I just type in what I think it is spelled like? The laptop will tell me everything I need to know.” This interaction is still so clear to me years later because, even as I try not to fall down the rabbit hole of believing that we currently exist at the start of the prequel to Pixar’s WALL-E, this kid completely blew my mind. Dictionary skills were ingrained in me before I was his age because they were considered a necessary part of learning. And now he had just taught me how drastically life skills can change and how quickly concepts can become obsolete.

What really struck me is historian Melvin Kranzberg’s Six Laws of Technology. The first law states, “Technology is neither good nor bad; nor is it neutral.” This law is referenced at the beginning of Zheng, Binbin, Mark Warschauer, Chin-Hsi Lin, and Chi Chang’s “Learning in One-to-One Laptop Environments: A Meta-Analysis and Research Synthesis,” and I felt it could help synthesize the various arguments flowing through all the readings. You need humans to work with, create, and program the technology, but at what point does technology, especially ed tech, stop being an input for humans’ intellectual and cultural capital (and the biases that can unfortunately come along with it) and become the mechanical void of the neither good, nor bad, nor neutral?

According to the Merriam-Webster Dictionary, a tool in the technological sense is something used in performing an operation or necessary in the practice of a profession, and a platform is a vehicle used for a particular purpose to carry a usually specified kind of equipment. The tools and platforms in this week’s readings illustrate the fine balance between educational technologies serving as positive tools for human learning and being indifferent to educational needs. Gaggle, according to Caroline Haskins, boasts that it not only provides data structure in a school system, supplying tools and a platform for student-teacher work and communication, but also saves lives from suicide, at the cost of constant surveillance and a lack of student privacy. Other algorithms, such as the AI grading programs Lauren Katz describes, are designed and trained by humans to be, optimistically, as useful as possible in assisting students, teachers, and writers; but that comes with the risk of biases seeping into the system, because the technology can only work with what it’s given. We also have situations such as Pearson’s move to a digital-first strategy. As Lindsay McKenzie explains, the plan is to slowly limit the production of print textbooks on a set schedule as the company focuses on its digital platform and course materials. With this shift, the digital platform and its course materials will not only be updated more frequently to include newer research developments, technologies, and breakthroughs, but will also be offered at a less expensive rental rate as opposed to the cost of buying expensive textbooks.

After doing each of the readings for this week, the words “tools” and “platforms” that title this week’s section felt uncanny to me in light of Kranzberg’s law. Every reading this week presents a central situation or problem involving technology as an educational component, and I was left grappling with what makes this technology neither good, nor bad, nor neutral. The readings offered examples of technologies that positively uplift some, negatively impact others, or make no difference at all. I believe Audrey Watters asks some of the important questions for us to discuss together when it comes to educational technology: “What do we need out of educational technology? Are we only interested in raising test scores or learning efficiencies?” These tools and platforms can, in their own right, have a great impact on our learning processes, which should be the main reason behind the major boom in educational technology that does not seem to be dying down any time soon. But human reasoning needs to be analyzed and kept at the focal point of how we proceed with educational technology, because technology does not exist on a moral basis of good, bad, or neutral.

A mini-rant about a link in Watters’ reading

First, thanks to Luke, Kathleen, and Carolyn.
I’m still inclined to try to do some kind of project paper for this course, and I find Luke’s breakdown of options helpful… but I have not yet managed to tackle it. I’m working on our course readings first… and followed Watters’ link to What Faculty Should Know About Adaptive Learning.
I think Feldstein’s article itself is fine, maybe even valuable, but I was really put off by this paragraph:

Here are a few examples of adaptive learning in action:
A student using a physics program answers quiz questions about angular momentum incorrectly, so the program offers supplemental materials and more practice problems on that topic.
A history student answers questions about the War of the Roses correctly the first time, so the program waits an interval of time and then requizzes the student to make sure that she is able to remember the information.
A math student makes a mistake with the specific step of factoring polynomials while attempting to solve a polynomial equation, so the program provides the student with extra hints and supplemental practice problems on that step.
An ESL writing student provides incorrect subject/verb agreement in several places within her essay, so the program provides a lesson on that topic and asks the student to find and correct her mistakes.

This feels to me like a demo of what is wrong with adaptive learning approaches: “customizing” in contrast to authentic “personalizing”. Feldstein plugged word-pair variables into each of his “specific” examples, but they are in fact nearly interchangeable! Take “physics/angular momentum”, “history/War of the Roses”, “math/factoring polynomials while solving a polynomial equation”, and “ESL writing/subject-verb agreement/essay”, move them around within the four examples, and mostly it’ll make no difference. Yes, there’s “information” for history and “practice problems” for physics and math, but much of the language is blandly one-size-fits-all, and those minor tweaks do not change much.
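To make the interchangeability concrete: the four examples can essentially be regenerated from one fill-in-the-blank template. A quick Python sketch (the template wording here is my paraphrase, not Feldstein’s actual text):

```python
# One template, four word-pair fills: the "specific" adaptive learning
# examples collapse into a single customization pattern.
template = ("A {subject} student struggles with {topic}, "
            "so the program offers supplemental materials "
            "and more practice on that topic.")

examples = [
    {"subject": "physics", "topic": "angular momentum"},
    {"subject": "history", "topic": "the War of the Roses"},
    {"subject": "math", "topic": "factoring polynomials"},
    {"subject": "ESL writing", "topic": "subject/verb agreement"},
]

for e in examples:
    print(template.format(**e))
```

Swap any `subject`/`topic` pair into any slot and the output is still a grammatical, plausible-sounding “example”: that is customization by variable substitution, not personalization.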
So, I am feeling frustrated by what I perceive as the inescapable pervasiveness of our #algorithmic_dystopia; and feeling frustrated by my cynicism, but preferring it to the alternative of naivete…. and grateful for this space in which to rant.

Revised Final Project Options

Hi All,

Here’s a proposed set of options for completing our work this semester. Feedback and sidebars welcome (thanks to those of you who have reached out about this already). We can plan to discuss in class, refine this list together, and move on from there.

Option One: Continue with projects.

  • Environmental scan and proposed form for the final artifact due April 7
    • Potential forms
      • Grant Proposal (along these lines)
      • Physical prototype of a project with 5 page description
      • 10-15 page reflective narrative of a project you’re doing elsewhere
      • Proposal for a larger research project
      • Other

Final Artifact due May 18. Presentation optional.

Option Two: Term Paper

Write a final paper of 10-15 pages that draws upon readings from at least three weeks of this semester and describes the critical educational technology toolkit you will take forward in your work. Due May 18.

Option Three: Lexicon Work

Referencing readings from this course, craft definitions for at least 10 terms in the Ed Tech lexicon. Your definitions should be shared with the class before the end of the semester, then revised and submitted by May 18.

EdTech, Economics and a Pandemic

Wow, it’s just wild how much has happened since we all came together last week! I’m sure the microscopic threat called SARS-CoV-2 has changed the daily lives of most of us, and it’s hard to get back to work in our home offices. I was wondering how much space this ubiquitous topic should get in my blog post. Considering the vast number of universities and schools around the globe now transitioning to online learning, it is definitely worth examining current EdTech processes from a critical perspective. However, as I am sure the Coronavirus will dominate our discussion next week anyway, I will only touch on this topic at the end of my post with some open questions and will instead try to give you an insight into our readings.

Even though it feels like a very long time since our last meeting, you may remember that we read texts about Thoma Bravo’s acquisition of Instructure, the provider of Canvas. We learned about the resistance of educators and shareholders who were worried about the sale of Instructure and its implications for the strategies of the company and its LMS. In this week’s reading, Phil Hill examined how Instructure reacted to this criticism, and he seemed surprised by the explicitly profit-driven framing of the company’s response. Instructure is straightforward about its goals: “the Board of Directors has one priority: maximizing value for our stockholders”. I was wondering whether Hill is genuinely surprised that a corporation’s primary objective is increasing its revenue, or rather surprised that the company communicates it in such a transparent way. Either way, the company’s focus on monetization and profit-making reveals that even a company like Instructure, initially thought to be “different” because it brought new vibes into the LMS market, is eventually pushed by market forces to follow the rules of neoliberalism. But can we conclude from this that EdTech companies care more about money and power than about things like pedagogy and data privacy?

– Sure, any answer to this question would simplify the complex settings in which EdTech companies operate. However, it is definitely worth scrutinizing the economic interests of corporations involved in the education sector. Ben Williamson’s paper on Silicon Valley start-up schools is a good example of such an effort, as it examines the role of money and political influence in the EdTech world. He demonstrates how Silicon Valley networks are using their financial and technical means to push a “technocratic mode of corporate education reform” by creating their own schools. With tech-industry executives and engineers serving as leadership and staff, big tech firms try to restructure school institutions based on tech-sector market logics. The ultimate goal is to “scale up” these projects and create a more effective, data-driven alternative to traditional educational institutions. Critically, the narrative used by these companies reveals that their “algorithmic imaginaries” of the future of K-12 education do not envision a renewed public education. Instead, public schooling is described as a “dangerously broken system” and private startup schools are presented as the cure. I think the apparent conflict of interest that arises when these companies promote their schools in a highly lucrative private education market needs no further elaboration.

How desperately tech vendors are waiting to expand their role in education can also be seen in the study by Jennifer Morrison and her colleagues at Johns Hopkins University. While administrators and executives in public education are mostly satisfied with the procurement of EdTech, providers are “extremely dissatisfied”. Almost 80% of the vendors expressed dissatisfaction with the “district’s processes for identifying, evaluating, and acquiring needed ed-tech products”, and more than 70% are dissatisfied with “the time required to complete procurement processes and bring products to end-users”. The latter directly mirrors Williamson’s observation that “corporate philanthropists (many from successful technology companies) are impatient with public bureaucracies and have focused instead on creating a broad network of private and nonprofit alternatives for developing and running schools.”

It may be no surprise that the fast-paced business world conflicts with long-term decision-making procedures in public administration, but the consequent division of educational institutions into private high-tech and public low-tech schools is worrisome. Thus, we should ask ourselves: How do we bring together private and public actors without losing control of our public institutions and without failing to take advantage of private products? The study by Morrison et al. suggests that improved EdTech procurement processes may be able to manage this task. However, according to the authors, there are currently various barriers in procurement processes, such as the insufficient integration of end-user experiences, the lack of comprehensive assessments of schools’ needs, and the mostly inadequate evidence of products’ pedagogical implications.

The Gallup survey, commissioned by the NewSchools Venture Fund, touches on several of these aspects. It examines the opinions of end-users, teachers and students, and aims to provide evidence for the perceived effectiveness of digital learning tools. The study even claims to “provide critical information for educators, leaders, developers and entrepreneurs to maximize the effectiveness of digital learning tools that support teaching and learning today.” Yet I doubt this report can bring together private and public actors in a well-balanced, meaningful way. Instead, it seems to fall into the category of “non-rigorous evidence”, which Morrison et al. criticize for its distortion of decision processes. The report is funded by philanthropies like the Gates, Dell, and Chan Zuckerberg foundations and should be read with scrutiny. Even though some of its findings may indeed be useful for decision-makers, the technocratic framing of the report reveals its support from Silicon Valley.

One aspect that I found particularly suspicious was the use of “effectiveness”. The words “effective” and “effectiveness” appear more than 100 times in the report, and findings like “Most teachers, principals and administrators think digital learning tools are at least as effective as non-digital learning tools” are praised by the authors. Yet, as the authors fail to provide a definition of effectiveness, these statements raise more questions for critical readers than they answer. If digital tools were just as effective as non-digital tools, could I switch to digital learning and nothing would change? Do teachers and administrators even perceive effectiveness in the same way? Unfortunately, “effectiveness” remains a buzzword in this study, complying with the technocratic narrative of Silicon Valley corporations. Considering this, the report is a perfect illustration of how difficult it is to make well-informed decisions about including EdTech in schools in a world dominated by powerful companies which, ultimately, will always have a financial interest in selling their products.

It is clear that the EdTech procurement decision-making processes need sufficient time to assess products independently and think through their long-term effects. But what happens if there is just not enough time to consider all of the implications that the use of digital learning and teaching tools might have? I think it is a question that we should ask ourselves in the current crisis.

In my opinion, Morrison et al. offer a useful guideline for EdTech procurement processes:

1. Assessment: What ed-tech product do we need?

2. Discovery: What ed-tech products are available for our needs?

3. Evaluation: Which available products are the best fit?

4. Acquisition: Can we acquire the products that we select in a timely manner?
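Read strictly, this framework is a gated pipeline: each question must be answered before the next one makes sense. A toy sketch in Python, where everything beyond the four step names and questions from Morrison et al. is my own illustration:

```python
# The four procurement steps as an ordered checklist. Each step gates
# the next; there is no shortcut from Assessment straight to Acquisition.
STEPS = [
    ("Assessment", "What ed-tech product do we need?"),
    ("Discovery", "What ed-tech products are available for our needs?"),
    ("Evaluation", "Which available products are the best fit?"),
    ("Acquisition", "Can we acquire the products that we select in a timely manner?"),
]

def next_step(completed):
    """Return the first step not yet completed, or None if all are done.
    Skipping ahead is not allowed: earlier steps are checked first."""
    for name, question in STEPS:
        if name not in completed:
            return name, question
    return None

print(next_step({"Assessment"}))
```

The point of the sketch is what it forbids: under crisis-mode time pressure, “skipping a step” means `next_step` would have to be ignored, which is precisely the risk with a rushed transition to online learning.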

Is it possible to go through all of these steps sufficiently in the current crisis, now that the transition to online learning needs to happen ASAP? What happens if administrators and executives skip several steps of this framework? What will be the long-term effects of this rapidly implemented online teaching environment? How should EdTech vendors behave in this situation?

I am already looking forward to discussing these questions and the readings with you on Tuesday!

Warm thoughts and stay healthy!

– Lucas

Systems… from a nurse’s perspective

Before diving into edtech, I want to share a healthcare perspective on systems. In my work as a nursing informatics specialist, I observe and analyze nursing workflow, identify gaps, speak with vendors or our internal IT team about solutions, and propose, implement, and maintain those solutions with the help of an interdisciplinary team of information-systems analysts and clinical folks. Some clinicians are not satisfied with the systems they use to document care, because not all systems were built with clinician input.

To provide some background: the American Recovery and Reinvestment Act (ARRA), enacted in 2009, included many measures to modernize our nation’s infrastructure. One particular measure, the Health Information Technology for Economic and Clinical Health (HITECH) Act, introduced the concept of electronic health records – meaningful use (EHR-MU), an effort led by the Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC).

Under Meaningful Use, hospitals needed to implement certain electronic services and technology by government-established timelines in order to receive monetary incentives from CMS and avoid financial penalties. For example, Meaningful Use Stage 1 established base requirements for capturing clinical data, which meant hospitals needed to implement an electronic health record.

As a result of the EHR Incentive Programs and the temptation of incentive payments for EHR adoption, many healthcare organizations rushed to implement technology that became an obstacle to efficient clinical workflows. As Broussard mentions, computers really are proxies for the people who made them! Vendors without clinical staff perspectives jumped at the opportunity, and hospitals made deals quickly to meet Meaningful Use standards and get incentive pay. A decade later, we have EHRs that are barely usable, and we keep layering on APIs (application programming interfaces) to make the systems we have slightly more usable. Depending on the development and implementation, sometimes I feel like it’s a house of cards waiting for the next upgrade or API to bring it all down. But I love being in the middle of it, because I have a say when I test systems: I can advocate for my clinicians.

It’s interesting how government incentive payments have the power to drive implementation. Money speaks!

A perspective on Electronic Medical Record systems

and back to EdTech…

Learning management systems, adaptive learning systems, centralized computer systems, mobile applications, and algorithms built into systems that reinforce oppression. This week’s readings covered various systems that produce educational technology.

I appreciate Broussard’s and Noble’s respective pieces about how people can significantly shape the system experience. Broussard highlights how systems can fail students when not used to their full potential. Students at poor schools are at a severe disadvantage without the textbook resources they need. But how would administrators know what they needed if they didn’t keep accurate tabs on the textbooks in the system they already had? Even the educators’ haphazard method of keeping track of textbooks was a failure of the school system. It’s kind of insane to me that the projector method of checking textbook stock was acceptable… I wouldn’t guesstimate my patients’ stock of IV fluids or medications and just deny them these resources if I ran out. But then again, I haven’t been an educator in an underprivileged school, so I don’t really know the other factors that come into play.

The centralized computer system did not have accurate records of each district’s textbook quantities, and many students did not have a textbook to use, hindering their education and their ability to score well on standardized tests. This touches upon the golden triangle of people, processes, and technology, a framework often discussed in systems development and process improvement. Without enough people to maintain the system, both the process of textbook tracking and the technology (the centralized computer system) fail. There has to be an equal balance among the three; otherwise the system’s stakeholders suffer. In this case, the students suffer.

I’ve felt this frustration when shopping for an item and driving far to retrieve it, only to have the store associate tell me that the online stock isn’t accurate and they’re sold out of the product. So what’s the point of the system if the data isn’t accurate?

Noble’s discussion of algorithms of oppression highlights what’s happening today with the coronavirus and xenophobia. I remember seeing images of Chinese food, bat soup with wild animals, and Chinese people while scrolling through Facebook and Google when coronavirus (COVID-19) was extremely new and rampant in Wuhan, China. Today, these images seem to be gone from the feeds and searches, but they still exist in the real world. Xenophobia and micro-aggressions are trending topics in search engines and in real life. If only I could have a dollar for each time I hear “we can’t eat Chinese food…” everywhere I go…

Noble also mentions the significance of the digital divide and how we leave people behind when we implement technology without considering culture, access, or usability. How do I tell my Ethiopian patients to access their discharge instructions on their patient portal when the instructions are only written in 5th-grade English and they don’t have Internet access or a computer at home? How does an educator tell a student to complete an assignment in a mobile application if they can’t afford a phone? I liked Liz Kolb’s podcast assignment, but I don’t like how it assumed that every student owned a cell phone. There are students who cannot afford smartphones and may not be able to participate in class activities that call for cell phone use. Sometimes we also forget about students with special needs who have physical limitations or cognitive delays. How do we incorporate them into learning activities if they do not have a cell phone or cannot use one?

Cost is always a factor when considering the use of complex edtech tools. As Broussard mentioned, systems are a great responsibility because of 24/7 maintenance. We need a lot of money to pay vendors to upgrade systems, provide help desk support, and account for hardware and data storage.

My new Chief Digital Officer, Claus Jensen, has a phrase he reiterates: “Buy what accelerates, build what differentiates” so we can support our toolmakers and caretakers and keep caring for the people who need help. For some reason, it really stuck with me.

This phrase seems to relate to the private equity deal between Instructure and Thoma Bravo. Thoma Bravo seems to want to “buy what accelerates,” since it already owns Frontline Education, an administrative and HR software solution for educational organizations. Instructure also wants to remain sustainable and “meet the bigger needs of education than just being an LMS” by “exploring strategic alternatives in order to maximize shareholder value.” Once Instructure’s deal closes (if it does), the company will go private, which would allow it to invest more in its software and potentially make more acquisitions. I’m not sure whether Thoma Bravo will decide to merge Frontline and Instructure, or what the future holds for them. I’m curious what they will build to differentiate. How will their strategy change, and do they have the best interests of students and faculty at heart? Whenever I see a company take over, I wonder about its intentions. I looked up Thoma Bravo, and I couldn’t tell from their website whether an educator sits on the board.

Instructure’s pending deal rightfully raises worrisome questions about downstream effects. Employees wrote public letters calling on Instructure to make a legally binding public pledge to protect student data under new ownership, after the CEO boasted to investors about having “the most comprehensive database on the educational experience in the globe.”

How are the students and faculty affected? What will happen to their data? Will their data be used for profit without their consent? Will the data be used to create algorithms and predictive models? Will they be charged for modules or specific uses of the system?

It’ll be interesting to see what happens, and how faculty will play a role in it. Steven Oxman and William Wong give a thorough overview of adaptive learning systems in their white paper. They cover the three core elements and give examples of these systems in corporate training and at different levels of education. I appreciate the overview, but I am left wondering about the role of faculty. I see that adaptive learning systems are starting to integrate sensors to capture the learner’s affect, but something about this doesn’t feel right… it can certainly tell educators how a student is feeling while taking modules or a particular test, and that can help the educator decide how to address the learner, since other factors in life may affect learning.

What I enjoy most about learning is storytelling. Personally, I remember concepts best when faculty find a way to incorporate a personal example. Learning systems can incorporate stories, but there’s a different feel when a professor tells you an emotional story that reinforces the learning concept and you have a live discussion about it.

I’m curious to know, as someone from the health professions looking in from the outside: do you agree with Phil Hill when he says that the market is much healthier and more focused on educators’ needs than it was a decade ago? Or does it depend on where you work (nationwide, internationally, or here in NYC)?

If the government decided to revamp the nation’s educational system by providing millions of dollars in funding, what aspects of systems would you incorporate so that it would be fair to all students?

Apologies for the long post. I was a blogger in a past life. Looking forward to our chat Tuesday!

Please remember to stay calm, wash your hands with soap for 20 seconds, don’t touch your face, cover your coughs, drink a LOT of water, and if you feel slightly ill, stay home! I’d also recommend limiting intake of media stories about this virus and navigate straight to the CDC and WHO websites for information.

The (Data) Clouds Have Eyes

“Echo, are you spying on me?”

“No, I’m not spying on you. I value your privacy.”

-Me, to my Amazon Echo Dot


So, may I be the first to say: holy shiitake mushrooms, my fellow colleagues. There’s an unbelievable amount to address in this week’s readings, so I’ll break it down into a cohesive narrative to the best of my ability. I had been adamant about facilitating the discussion on data and surveillance because I began researching this field last semester. I took a course in the Urban Education program titled Immigration and the Intersections of Education, Law, and Psychology, and for my term paper I wrote an article draft of my own, currently titled “Digital Challenges of Immigration: How Technology is Working Against Diverse Bodies.” My article is about the rapid technological and database advancements made within the United States after September 11, 2001, and how these technologies work against diverse bodies and foster potentially dangerous digital environments for populations such as immigrant college students. When Jones, Thomson, and Arnold (briefly) mentioned the term “biomarkers” in their piece “Questions of Data Ownership on Campus,” I made the connection to Ruha Benjamin’s explanation of biomarkers in Race After Technology.

According to Benjamin (2019), within these large collections of data, names are encoded with racial markers, which do more than signal cultural background: they also carry a plethora of historical context. What makes this concept stick out to me is the mention of universities’ misuse and storage of #BigData and the startling ways these collections can be linked and used. Looking at some of the following quotes, one can’t help but get a severely sinister vibe:

“Rarely are such systems outright coercive, but one could imagine developing such systems by, for instance, linking student activity data from a learning management system to financial aid awards. Rather than relying on end-of-semester grades, an institution might condition aid on keeping up on work performed throughout the semester: reading materials accessed, assignments completed, and so forth.” 

“From application to admission through to graduation, students are increasingly losing the ability to find relief from data and information collection. Students are required to reveal sensitive details about their past life and future ambitions, in addition to a host of biographic information, simply to be considered for admission—they are never guaranteed anything in return for these information disclosures.” 

“’College students are perhaps the most desirable category of consumers,’ says Emerson’s Newman. ‘They are the trickiest to reach and the most likely to set trends.’ As a result, he says, their data is some of the most valuable and the most likely to be mined or sold.”

“But the company also claims to see much more than just attendance. By logging the time a student spends in different parts of the campus, Benz said, his team has found a way to identify signs of personal anguish: A student avoiding the cafeteria might suffer from food insecurity or an eating disorder; a student skipping class might be grievously depressed. The data isn’t conclusive, Benz said, but it can ‘shine a light on where people can investigate, so students don’t slip through the cracks.’”

For starters, what the hell? Perhaps we should slowly walk through the problematic features of these quotes because, at first glance, the everyday reader (the non-edtech enthusiast) may not pick up on the subtle (or, for some, not so subtle) red flags in each quote.

The first red flag is the blatant use of the term “coercion.” This quote was pulled from Jones et al.’s “Student Perspectives on Privacy and Library Participation in Learning Analytics Initiatives,” which focused on a WiFi-based tracking system in which all students had to download an app so the university could maintain attendance records. I teach my Communications students that what makes “coercion” different from “convincing” is that coercion persuades its target through threats and fear tactics. Suggesting that the future of such technology could include practices like linking these applications to learning management systems (LMS) in order to turn financial aid into a conditional reward is horrific, ableist, racist, and classist, to say the least.

Dangling financial aid like a carrot in front of students to get them to perform to a university’s liking is extremely dystopian. This bleeds into issues of racism and classism, because such a reward system would not touch privileged students who have no need of financial aid. Students coming from lower-income communities rely on that aid not only to pay tuition but to have food to eat and a roof over their heads. When the rug is pulled out from under them due to, let’s say, a medical condition that occasionally inhibits their ability to attend class, who does that benefit? And as we know, those low-income communities are more often than not predominantly minority groups. This is very reminiscent of the introduction to Virginia Eubanks’s book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which opens with a story about how her partner required a life-saving $62,000 surgery. Eubanks’s (2018) insurance company denied them coverage for the surgery, despite her domestic partner being covered through the insurance at her new place of employment. Upon further investigation, Eubanks (2018) realized that her family had been red-flagged and placed under investigation for fraud. Who is to say such actions aren’t racially influenced by markers within these big data clouds?

On top of these questionable backend practices, the actual user interface showing student attendance and progress mentioned in the same article looks similar to applications like Credit Karma (I’d share a screenshot for comparison, but I don’t need you guys seeing my credit score #DataPrivacy). What struck me is that it resonated with a quote from the same piece about “cradle-to-grave profiles”: essentially, a student is tracked throughout their entire student career and beyond, with these services trailing them into their professional careers in order to evaluate outcomes.

I apologize, but I am fixated on this piece in particular because it addressed so many violations of college students’ independence in a way that really left me disturbed. The article also describes how these tracking methods can pinpoint a student’s exact location, supposedly to help keep tabs on their health throughout the semester. As seen in the last quote, by tracking where a student spends their time on campus, the system guesses what they are going through emotionally. Huh?! Making the bold assumption that a student has an eating disorder because they do not spend time in the dining hall breaks so many social boundaries. If we’re being honest, a lot of universities have crap dining options; I know mine did in undergrad. So if a student stays in their dorm building to cook and study, they may get marked in the database as some kind of hazard or concern to the university. The more I pry open this article, the more dystopian it feels.
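
To show just how crude this kind of inference is, here is a hypothetical sketch of the heuristic the article describes: flag a student as a "concern" if campus WiFi logs show few dining-hall pings. Every name, threshold, and data point here is my own invented placeholder, not from the vendor's actual system.

```python
from collections import Counter

def flag_dining_concern(location_pings, threshold=3):
    """Naive heuristic: flag a student if their weekly dining-hall WiFi
    pings fall below a threshold. (Invented illustration, not a real API.)"""
    counts = Counter(location_pings)  # missing locations count as zero
    return counts["dining_hall"] < threshold

# A student who cooks and studies in their dorm all week gets flagged,
# regardless of whether anything is actually wrong.
week_of_pings = ["library", "dorm", "dorm", "library", "dining_hall", "gym"]
print(flag_dining_concern(week_of_pings))  # True: one dining-hall ping, so flagged
```

The sketch makes the problem obvious: the heuristic has no way to distinguish "eating disorder" from "cooks at home," so the dorm-cooking student lands in the concern database either way.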

“Because if [my institution] had the intention of using my data to create better programs or better educational tools, then I’m all for it, you know…. But I could also see certain things that are tracked, maybe being a little embarrassing. I initially didn’t go [to the counseling center] for a long time because I was embarrassed, because I knew that the university was going to be able to track that and look at my record and say, “Oh yeah, she’s been going to counseling.” And maybe if they wanted to, they could somehow find out what exactly it was that I was talking to the therapist about.”

We want to assume that these practices aren’t super present in the magical land that is CUNY, but students have experienced something similar within our very own Graduate Center. While having a conversation with a peer of mine about the Wellness Center and the counseling services provided through our tuition and fees, I learned this person had decided to use the counseling services to talk through some stuff. First of all, we only receive SIX (6) sessions a semester as graduate students, like, okay. Second, during the session, my peer was made aware that the session was being recorded on camera. They were assured, however, that there was no audio. Wait, what? Yeah, cameras in the counseling center, apparently. I wanted to investigate this for myself but unfortunately haven’t had the time. We tried to justify it with the idea that psychology doctoral students in specific disciplines need to meet an hourly requirement, but that just wasn’t enough for us to feel comfortable taking advantage of a space where we are expected to be vulnerable.

I’ve been ranting quite a bit at this point, so I’ll start to wrap it up and leave the rest for our class discussion. With the references I’ve made in this post alone, I want to convey the extreme discomfort I feel over the fact that these unethical educational technologies are invading university campuses in a way that threatens the growth and development of every student’s individuality. Higher education is about wellness and the personal experiences that influence our decisions just as much as (or at least nearly as much as) the ways classrooms shape students’ futures. I have so much that I want to talk about, so I am looking forward to probing these horrors even deeper with you all tomorrow.

Pedagogy and Educational Technology

Today while walking to the bus I saw a kid, probably a middle schooler, carrying a tri-fold cardboard display to school. (I inferred this, as I didn’t ask the kid where he was going. But why in the world would a kid be carrying a piece of cardboard at 8 AM otherwise?)

I caught myself thinking, “Oh great! That looks rigorous and scientific.” But on second thought, I realized I had no idea what was on that board. I have been witness to some student science fairs (why are tri-fold boards only used in science, and rarely in other subjects?), and let me tell you… just because it’s on a tri-fold board doesn’t mean it’s going to be something wonderful. Or evidence of thinking. Or learning. Often it might just be evidence of a determined, driven parent.

I was listening to an interesting podcast about branding with a branding guru named Tosh Hall. He said that when working with brands, you have to be a doctor and a Boy Scout.

He explained that you have to be a doctor in taking the Hippocratic oath: “Do no harm!”

And a Boy Scout in “leaving things better than you found them.” (This is in reference to a campsite, i.e., pick up the trash, make improvements.) He used these mottos to explain his philosophy on a brand, like Cheerios or Nice ’n Easy. But I think these same sorts of things need to be considered in teaching.


What is pedagogy?

The practice of teaching. 

How does pedagogy and its traditions dictate the forms of educational technology?

I’d argue that the forms of ed tech exist because they appeal to traditional teaching practices and values.

Any tech practice or tool wouldn’t gain traction if it were not appealing to the practices and values of teachers (and/or “school”).

Can we critically analyze Educational Technology without critically looking at Pedagogy? 

Many of the ideas presented in the readings reflect concerns and wonderings I have had over the years as I have engaged with teaching and learning. The scale of these musings tends to be grand and overarching, such as:

What is learning? 

And what is it good for?

These questions are vital, especially in the context of the 21st century, where a universe of information is instantly available to many. Memorizing the capital of every state IS learning, but what is that good for?

What is learning if it isn’t used?

Ideas of authenticity in practice and in knowledge, and the importance of audience were themes in the readings.