"You know the mind is just a hard drive"
Challenging times, Emotion AI, and the music of Cassandra Jenkins
She said, "Oh, dear, I can see you've had a rough few months…
I'll count to three
Take a deep breath
Count with me.”
“Hard Drive” - Cassandra Jenkins (2021)
Wow, it’s been a difficult few months, and an even more difficult few weeks. All I can think of is finishing my Master’s. The capstone project keeps me up all night, the anxiety overwhelming me. I’m not going to lie: I’m on the verge of tears every night. But I’m almost there. I’m in the homestretch.
Last week, I presented my capstone project on Emotion AI and the feeling of relief was incredible. I’m not finished with the program yet: I have four more big deliverables, including the final paper for the capstone. But one by one, the work is being completed. And soon enough, I’ll be at Massey Hall, watching Broken Social Scene.
Today, I want to share two things with you. The first is a reflection on Cassandra Jenkins’ music. The other is a draft overview of my Emotion AI capstone project. I’ll share the final paper down the road, but for now, I’d love to hear your feedback!
I took a walk by the lake last week. I just needed to clear my mind. I’m burnt out, just empty. I didn’t know how I’d get through the next few weeks. As I was driving down to Lakeshore, I put on Cassandra Jenkins’ An Overview on Phenomenal Nature (2021).
"Baby, go get in the ocean
If you're bruised, you're scraped, you're any kind of broken
The water, it cures everything"
“New Bikini” - Cassandra Jenkins (2021)
I was incredibly struck by her words. There’s a healing nature to water. At times it can be terrifying. The waves can overwhelm you. But there are days when the water is relatively calm and still. When you stand in the lake, the water up to your ankles, you can feel the inconsistent but steady beat of the waves crashing. Close your eyes. You can hear the nature around you. You can smell and taste the lake around you. Breathe. Just breathe. Inhale. 1 - 2 - 3 - 4. Exhale.
“I'll count to three and tap your shoulder
We're gonna put your heart back together
So all those little pieces they took from you
They're coming back now.”
“Hard Drive” - Cassandra Jenkins (2021)
The power of words, of storytelling, of water can be healing. I’ve felt replenished listening to Jenkins. There’s something terrifying and beautiful about the lake. As I wrap up the final few weeks of my Master’s, it’s time to return to the lake. Breathe. And keep going.
I’ve had a lot of fun researching Emotion AI and thinking about its policy implications for Canada. It may seem like an obscure topic, but the technology is here and it’s increasingly being used. As facial recognition technology (FRT) expands into more areas of our lives, emotion AI will also grow. The industry is predicted to be worth over $90 billion by 2024. It’s already being used in employment and hiring, law enforcement, children’s toys, education, mental health, advertising, driving, and even romance. But there are a lot of questions that need to be answered as emotion AI grows.
First of all, emotion artificial intelligence (AI), at its simplest, refers to technologies that detect or recognize human emotions through machine learning algorithms. I break the definition down into four key factors:
Detecting and recognizing emotions,
Using biometric data,
Predicting and inferring those emotions through machine learning algorithms, and then
Making a decision and acting on the results.
Emotion AI relies on a variety of biometric data collected by different sensors, such as facial expressions, iris scans, body movements, physiological responses (heart rate, skin temperature, and breath), and voice and tone.
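To make those four steps concrete, here is a toy sketch in Python of what such a pipeline looks like end to end. Everything in it is invented for illustration: the feature names, thresholds, and scoring rules are placeholders, not any real emotion AI product, and a real system would run a trained machine learning model rather than hand-written rules.

```python
# A toy, purely illustrative emotion AI pipeline following the four steps
# above. The "model" is a stub; real systems train ML classifiers on
# labeled biometric data (faces, voice, physiology).

from dataclasses import dataclass

EMOTIONS = ["anger", "fear", "happiness", "sadness"]

@dataclass
class BiometricSample:
    # Step 2: biometric data from sensors (all values here are made up).
    brow_furrow: float      # facial expression feature, 0..1
    heart_rate_bpm: float   # physiological response
    voice_pitch_hz: float   # voice and tone

def predict_emotion(sample: BiometricSample) -> dict:
    # Steps 1 and 3: detect and infer emotions. A real system would run a
    # trained machine learning model; this stub just maps features to scores.
    anger = sample.brow_furrow
    fear = min(1.0, max(0.0, (sample.heart_rate_bpm - 60) / 80))
    happiness = max(0.0, 1.0 - anger - 0.2 * fear)
    sadness = max(0.0, 1.0 - happiness - anger)
    total = anger + fear + happiness + sadness or 1.0
    return {e: s / total for e, s in
            zip(EMOTIONS, (anger, fear, happiness, sadness))}

def act_on(scores: dict) -> str:
    # Step 4: make a decision and act on the results -- this is where the
    # policy risk lives (hiring, policing, education decisions).
    top = max(scores, key=scores.get)
    return f"flag for review ({top})" if top in ("anger", "fear") else "no action"

sample = BiometricSample(brow_furrow=0.7, heart_rate_bpm=95, voice_pitch_hz=180)
scores = predict_emotion(sample)
print(scores, "->", act_on(scores))
```

Notice that the consequential part isn't the scoring itself but the final step, where an inferred emotion turns into a decision about a person. That's why my policy analysis focuses on the decisions these systems feed into.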
But studies show that the scientific foundations underlying the technology are shaky. Dr. Lisa Feldman Barrett found that facial expressions are not consistently correlated with their expected emotions, and that emotion recognition accuracy drops even lower when the technology is used "in the wild," outside of ideal lab environments. One example she often gives: a scowl is often interpreted as anger, but it can also signal concentration.
Dr. Lauren Rhue’s study of Microsoft’s Face API and Face++ found that these APIs consistently rated black faces as having more negative emotions. Using images of NBA players’ faces, Face++ calculated that black players were 2x angrier, 3x more afraid, and 20% less happy than white players. And when controlling for smiling by comparing photos of black player Darren Collison with white player Gordon Hayward, Face++ rated Collison as 20% less happy and 180x angrier.
Even more concerning, in a study of commercial APIs used on children's faces, researchers Bryant and Howard found that commercially available emotion AI performed poorly at correctly identifying children's emotions, with average accuracy rates between 43% and 67%. Some services performed even worse: Affectiva correctly recognized “fear” only 8.88% of the time, while Amazon recognized “sadness” only 10% of the time.
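For readers curious how numbers like "2x angrier" or "43% accuracy" are actually computed, here is a small sketch of the audit arithmetic behind studies like these. The scores and labels below are invented for the example; they are not data from Rhue or from Bryant and Howard.

```python
# Illustrative audit arithmetic: group-level score ratios (like Rhue's
# "2x angrier") and per-emotion accuracy (like Bryant and Howard's).
# All data below is made up for the example.

from statistics import mean

# Hypothetical anger scores an emotion API assigned to two groups of photos
anger_scores = {
    "group_a": [0.30, 0.45, 0.38, 0.50],
    "group_b": [0.15, 0.22, 0.18, 0.25],
}
ratio = mean(anger_scores["group_a"]) / mean(anger_scores["group_b"])
print(f"Group A rated {ratio:.1f}x angrier on average")

# Hypothetical (true_label, predicted_label) pairs for children's faces
predictions = [
    ("fear", "surprise"), ("fear", "fear"), ("sadness", "neutral"),
    ("sadness", "sadness"), ("sadness", "neutral"), ("fear", "anger"),
]
for emotion in ("fear", "sadness"):
    cases = [(t, p) for t, p in predictions if t == emotion]
    hits = sum(1 for t, p in cases if t == p)
    print(f"{emotion}: recognized {hits}/{len(cases)} "
          f"({hits / len(cases):.0%} accuracy)")
```

The arithmetic is simple; the point of these audits is that the same simple arithmetic keeps surfacing large disparities and low accuracy rates across vendors.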
In response to these problems, I proposed and analyzed three policy options to address these issues in the Canadian context:
Creating a federal registry and mandating Algorithmic Impact Assessments (AIAs)
Developing new provincial anti-discrimination and privacy legislation based on a recent white paper by the Ontario government
Enacting federal biometric data legislation governing the collection and use of such data
I recommend that provinces update their privacy legislation to address anti-discrimination in emotion AI. This means creating rights to disclosure, to contest decisions, and to request human review of decisions made by "automated decision systems" (ADS). Provincial governments need to further clarify civil rights measures on ADS for housing, employment, and financial applications such as loans or insurance. The legislation should also protect children by prohibiting the use of these systems on children under the age of 18 and banning their use in schools. And provinces should also regulate the use of these systems in making health diagnoses.
The federal government should enact biometric data legislation that establishes minimum responsibilities for operators, prioritizing privacy, a “duty of care,” and liability. Given the concerns I've outlined, it is important to clarify firms' responsibilities and to place higher thresholds on this more sensitive data. The proposed legislation should explicitly address technologies and systems that use biometric data: not just emotion AI but also facial recognition technology, biometric advertising, and other such systems. For emotion AI specifically, the legislative language should cover systems that "recognize, predict, infer, or analyze an individual’s emotional state.” Systems covered by this language should be banned from high-risk applications that could “significantly affect” an individual, though “significantly affect” would itself need a more precise definition.
Lastly, one option worth further exploration in this context is a regulatory sandbox that allows innovative experimentation with emotion AI systems in low-risk applications, which might also help improve the accuracy of these technologies, so long as operators of these systems face stricter restrictions.
In line with the projections in the Fall Economic Statement 2020, which included funding for the implementation of the previously proposed Consumer Privacy Protection Act, the Office of the Privacy Commissioner should have its permanent funding increased by $10 million in the 2023-2024 fiscal year to implement this new biometric data legislation, gradually rising to $20 million per year by 2026. In Ontario, the Information and Privacy Commissioner of Ontario should receive an $8 million funding increase in the 2023-2024 fiscal year over its 2019-2020 budget of $20 million, rising to $12.5 million by 2026.
These recommendations will hopefully prevent or mitigate the potential harms of these technologies: wrong decisions and problems of accuracy and bias. And broader biometric data legislation would also address larger issues around this more sensitive category of data.
These measures might be difficult to implement, and they are admittedly blunt policy solutions, but the cost of doing nothing is too great: these technologies will increasingly be used across many sectors and create real harms for people. This is not just a future concern. Emotion AI is here in Canada, and its use will only grow in the coming years.
Well, wish me luck over the next few weeks. I’m almost done.
Reach out to me if you’ve got any thoughts, as well as any feedback on my capstone as I finish up the final deliverable!
“All those little pieces
One, two, three
We're gonna put 'em back together now
Are you ready?”
“Hard Drive” - Cassandra Jenkins (2021)