EMOTICASH RESPONSE #2

23/04/2018

The second response to our Emoticash project comes from Meri Carrasco, a designer and RCA postgraduate student, who imagines the different scenarios that could arise from the development of emotion-recognition technology.

“Are we being too intrusive? Will these technologies be exploited? Will we lose our sense of self?”

This use of technology to map and read emotions is intriguing. Reading these questions, my mind turns to three different scenarios...

Commercial exploitation
A situation in which your grin activates advertising that targets a particular weakness and consequently tempts you to buy more of a "healing promise product". In this scenario, enterprises target individual aspirations in a crafted and refined way, much as web ads appear in reaction to your recent browsing history, yet far more subtle and efficient.

"All technology is intrusive...yet as humans we have enormous capabilities to adapt and register something unusual as part of our new routine in a matter of days."

Institutional/Militarised exploitation 
In our hyper-vigilant society, facial expressions are presumably already being used to determine who is a threat to society. These readings could be used in court to establish someone's veracity or psychological state, or, more simply, to determine the positive or negative societal value of an individual based on their average emotional state.

Institutional/Health purposes 
We currently stand on the verge of the clog in public healthcare being massively relieved by the use of AI. We are also living in an age where anxiety and poor mental health are on the rise. Although several organisations offer counselling at low cost or for free, the majority of people who seek emotional and psychological help rely heavily on private counsellors. Facial recognition added to an AI counselling service could prove highly beneficial, as the bot would be able to recognise patterns not only in speech rhythm, intonation and breath, but also in facial expressions, composing a better profile in order to trigger the questions the patient is really seeking answers to. All of this would come at a fraction of the cost of a regular therapy session, with hopefully the same or better results as traditional face-to-face therapy. NOTE: This exercise should be attached to a non-exploitative confidentiality agreement between the software developers, researchers and patients. This would be the ideal use of emotion recognition.

"The individual envelopes themselves in a mask of normality, diluting their personality."

All technology, like most policy, is intrusive. Yet we humans have an enormous capacity to adapt, registering something unusual as part of our new routine in a matter of days. Whichever of these scenarios comes to pass, the truth is that we will adopt it in the longer term. It is simply the way our minds work.

I find that one loses oneself when out of touch with oneself. The first two proposed scenarios would contribute to the weakening of the individual. In the smart-advertising scenario, the technology seeks to understand, target and exploit the reactionary nature of human beings.

The second scenario means the individual envelops themselves in a mask of normality and conformist behaviour, thereby diluting their personality.

The third scenario, if operated within non-exploitative parameters, would help individuals know and understand themselves better, becoming a tool for healing and self-discovery rather than an extra layer with which to curate oneself, or a new channel with which to cloud people's existence.

DOING THINGS DIFFERENTLY 

Services Unknown is an open-ended Superimpose Studio project that facilitates new ideas, discussion, events and products.

Read The Emoticash Report here.
Read The Emoticash Response #1 here.

New ideas and responses can be submitted to hello@superimpose.studio