Tuesday 16 March 2021

Using apps to monitor students' emotions

Author Adam Garfinkle wrote that in science fiction the typical story is that machines become human-like, but that in fact the more pressing problem now is that, through the thinning out of our interactions, humans are becoming machine-like.  There may be some truth here - once, a search might have meant a daring existential journey; a friend meant a person, rather than a contact on a network; a 'like' was an emotion, not a status symbol; and recognition was something you earned, not something that computers do to your face on CCTV.


It was against this backdrop that I explored a few of the offers I received this week to trial software aimed at helping students with wellbeing.  Always curious, I looked into a few apps designed to help students monitor their own emotions: to name how they feel and track their feelings over time.  The software responds to the data, offering advice and prompting students to seek support if needed.


Now, we as a school have done similar things from time to time, using questionnaires, so at first this seemed like a small step - or indeed an improvement, since ours was pure data collection, aggregated to give us cohort-level analysis, while these apps offer direct, individualised feedback to the students.  So I was intrigued.  But the more I think about it, the more questions I have.


One notable issue is that some of the services are totally free, which sounds good. But then you have to ask how these companies make any money. Journalist Shelley Buchanan notes that one program used in many US school districts links to a children’s app that collects personally identifiable information and sells it to third parties, and that another app shares information with third parties (despite its company privacy statement).  Examples like these are satirised in this cartoon, but you only have to look at recent scandals around data from Facebook and others to know that if it's free, then you are the product, as they say.

Perhaps that issue is solvable, but there are wider questions here.  Buchanan describes one system that shows a student 'scoring' 57% 'calm', 29% 'stressed' and 14% 'angry' over time.  It is hard not to lament such a reductive system!  Not all anger is the same, by any stretch of the imagination; and some kinds (outrage at injustice, say) are actually positive when channelled correctly.


And that's the first problem: who decides what feelings are acceptable? Which ones are to be encouraged, nurtured, supported? Which ones are alarm signs? I worry about a system which even hints at transferring this role from teachers, parents or counsellors to software.


The second problem, I think, is that if we are using this software to support wellbeing, then the objective is to produce more positive feelings - which will tend to locate any wellbeing issue in the students' emotional competencies rather than in the situations they face. There is a fine balance here - some children's emotional competencies are not well developed, for sure - but the aim should be a nuanced conversation, not an automated dashboard that lights up in an administrator's office.


So if I have all these questions, what might be some answers?


We have to start with the frame that, for this emotional purpose, the right metaphor for school is a family, not a machine. We should not start with an information system and then ask how to collect data, track it and display it, any more than we would with our own children. Let's start by remembering that students are not sources of data to be scraped (and processed and sold), but young people seeking to understand themselves, their relationships with peers, and their place in the world. Then the whole frame changes to: how can we as adults offer the perspectives, wisdom and questions that will help them grow into their best selves?


Put like this, it seems obvious to me that what we need to nurture are trusting relationships, with systems that allow for 1:1 conversations; that create room for reflection, and a space where vulnerabilities and concerns can be shared.  We have such systems in place, and we are in no hurry to introduce digital tracking into them.  We need to keep social-emotional learning as something that happens when people engage with each other, not with software.


I have some grave reservations about the apps I have seen - but ultimately, perhaps this is an empirical question, and there may be breakthroughs ahead that genuinely enrich rather than impoverish social and emotional learning. If software emerges that genuinely helps students achieve sustained wellbeing in the long term without 'thinning out' their rich inner lives, then that has to be a good thing.  I’ll be watching carefully for the data from schools that use it, and who knows, perhaps it’ll be genuinely revolutionary.  But not yet. Watch this space.




4 comments:

  1. Interesting post. I am reminded of an article some time ago citing an Oxford paper that attempted to rank the jobs least likely to be automated. Along with those defined as "creative" in the traditional sense are those where personal, real human connection is perhaps necessary and expected, such as therapists and school teachers.

    Even if a machine may perhaps give "better" or more accurate recommendations than a psychologist would, a person would probably place greater trust in, and be more willing to share information with, another human being, and that factor ultimately wins out. And I think there is a risk, as with exams, of wrongly refocusing the target of mental health support: what is desired is genuine improvement, not maximizing a score.

    As can be seen from the national and international outcry over the awarding of exam grades to the 2020 cohort of students, a lot of people are against the notion of putting their faith in an algorithmic black box for a process that traditionally had a lot of human input. Humans seem to be generally more forgiving of another person's errors than of a machine's.

    1. Thanks Anthony. Agree - it's all about relationships. As long as data can take a back seat, it may be useful. But it will never be central.

  2. Good post. More than anything else I like your principle that emotions cannot be reduced to scores and that the school system is more a family than a machine.

  3. Excellent post. In 'Machines and the Emotions' Bertrand Russell writes: "The great objection to emotions, from the point of view of the machine, is their irregularity. As the machine dominates the thoughts of people who consider themselves 'serious', the highest praise they can give to a man is to suggest that he has the qualities of a machine – that he is reliable, punctual, exact, etc. And an 'irregular' life has come to be synonymous with a bad life."

    We may pay a high price if we try to understand our emotions through the application of technology.
