A.I. and Emotions: what do you expect?

A.I. has changed and improved in many ways.
But what is emotionally intelligent technology?
What are its risks and implications?
Let's find out.

First of all, we must understand how human emotions work. Our brain gets information from different sources: our senses receive a stimulus and, in processing it, produce changes in the somatic state at a non-conscious level (the primary emotion). If the emotion is sufficiently intense, cognitive, social, contextual and surroundings-related evaluations are carried out, which we refer to as experiencing emotions, meaning the value these events and circumstances have for us [1].

What we feel is caused by a series of neurotransmitters and other chemicals produced by the brain, which activate different parts of the body and make us act and think in a certain way. Most emotions are useful in survival situations (fear, anger, disgust, sadness), as Darwin's theory of evolution suggests. Others, like joy, drive us toward certain objectives, pleasures and so on [2].

 

In recent years, these technologies have advanced considerably. The main categories are:

 

  • A.I., Artificial Intelligence: intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans. A.I. often goes unnoticed and operates behind the scenes, like the artificial intelligence behind Google Search, Amazon store selections and sophisticated data analysis programs. [3]

  • A.N.I., Artificial Narrow Intelligence (also called weak A.I.): a machine's ability to perform a single task extremely well, even better than humans. [3]

  • A.G.I., Artificial General Intelligence: a machine that can understand or learn any intellectual task, like a human; note that no operational A.G.I. exists today. [3]

  • A.S.I., Artificial Super-Intelligence: a hypothetical agent whose intelligence far surpasses that of the brightest and most gifted human minds. [3]

 

How does A.I. process information?

When A.I. receives a stimulus through its senses (cameras, LADAR, microphones or other sensors), it checks its memory (its database of information) and then runs the program or algorithm that corresponds to what it has sensed and remembered, so that the activated program produces a predictable result.

The actions of a deep-learning A.I. can sometimes be surprising, because it keeps learning on its own from its experiences and its environment, constantly adding information to its memory and adjusting its patterns of behaviour. Just like a human being, its memory changes, and therefore the output of a triggered program changes too.
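As a very rough illustration of this sense, remember, act and learn loop (a minimal sketch with invented names; this is not a real A.I. framework), the idea can be expressed as follows:

```python
# Minimal, illustrative sketch of the loop described above: sense a stimulus,
# look it up in memory, run the matching routine, then adjust memory from experience.
# All names here are invented for illustration only.

class SimpleAgent:
    def __init__(self):
        # "Memory": maps a sensed stimulus to the best known response and its score.
        self.memory = {}

    def sense(self, stimulus: str) -> str:
        # In a real system this would come from cameras, microphones or other sensors.
        return stimulus.lower().strip()

    def act(self, stimulus: str) -> str:
        percept = self.sense(stimulus)
        # Run the "program" that corresponds to what was sensed and remembered.
        response, _score = self.memory.get(percept, ("explore", 0.0))
        return response

    def learn(self, stimulus: str, response: str, reward: float) -> None:
        # Experience changes the memory, so the output for the same
        # stimulus can change later, as the paragraph above describes.
        percept = self.sense(stimulus)
        best, score = self.memory.get(percept, (response, 0.0))
        if reward >= score:
            best, score = response, reward
        self.memory[percept] = (best, score)


agent = SimpleAgent()
print(agent.act("Obstacle ahead"))                      # "explore": nothing learned yet
agent.learn("Obstacle ahead", "turn left", reward=1.0)
print(agent.act("Obstacle ahead"))                      # "turn left": memory has changed
```

The point of the sketch is only the shape of the loop: the same stimulus can trigger a different program once the memory has been updated by experience.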

 

It sounds very similar to a human learning and adapting their behaviour, doesn't it? Because that's how we learn too! [1]

 

Could A.I. have emotions? [4]

Whether A.I. machines can have emotions of their own is open to debate. A.I. and neuroscience researchers agree that current forms of A.I. cannot: they have no body, no hormones, no memory of their interactions with the world, and they have not gone through the learning process of a lifetime, so they have no emotional memory equivalent to a human's. A.I. may someday be able to work in environments as diverse as those a human can handle, but in the short term it is far more likely to augment our labour or do things we simply don't want to do.

Status quo

 

Humans interact with artificial intelligence systems on a daily basis without even realising it.
A.I. is not science fiction; it is already among us: chatbots, facial expression recognition, translators and personal assistants, for example.

What are the pros and cons?

There are obvious advantages to our machines being more sensitive to our needs, that is, understanding better what we are looking for, both economically and emotionally.

“Even today we spend more time with our phone than with our mothers or any other family member, so they just know and see you more than anything else. And the technology is also getting to a place where you can get these facial and vocal reactions, which can be very early cues if you are too stressed for too long. If you can pick those signals up early, you can do something about it.”

 

 

Mihkel Jäätma (CEO of Realeyes, an emotion A.I. start-up) [5]

Prof. Andrew McStay, author of “Emotional A.I.: The Rise of Empathic Media”, thinks that emotion A.I. is dangerous when it affects “future life opportunities”, and he is especially concerned about its impact on the workplace. He also worries about the idea, which he calls “fake”, that it is possible to accurately interpret emotions simply by analysing a person's face (for example, you might scowl when you are angry, but also when you are concentrating or have a headache). [6]

“What we are talking about is 360-degree surveillance. Who benefits from that? I certainly don’t think it’s the mass of people within an organisation, ie its workers. We have this suite of technologies but then we also have this suite of financial opportunities. And it’s whether these financial opportunities are a little bit too lucrative for people to be ethically minded. Follow the money, as it were.”

Prof. Andrew McStay [7]

Emotion A.I. also offers clear advantages when used in retail, and there are two notable examples:
- In December 2017, Facebook patented a camera that could be placed in shopping malls or shops to recognise people via their Facebook profiles. The camera would know our search patterns on Facebook and Google, built on our likes and interests, and would also see our behavioural signals in the shops, that is, what we like, which departments we visit most, and so on. Based on that information, the price of products would be customised: the more you search for or like something, the higher the price will be (see the sketch after these examples). [8]
- Realeyes, an emotion A.I. start-up that uses eye-tracking and facial expressions to analyse mood, ran an experiment with Mothercare, a shop in the UK, and found that customers who were greeted at the door with a smile tended to spend more.
Cameras track you as you move through the shop; they see how you react to different products and what you think of in-store advertising displays. Have you given your consent for this exchange? By entering the store, perhaps you have, tacitly, but it can still be harmful. [9]
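As a purely hypothetical illustration of the pricing mechanism described in the first example (the function, signals and weights below are invented and are not any company's actual system), interest-based price customisation could look something like this:

```python
# Hypothetical sketch of interest-based price personalisation.
# Signal names and weights are invented for illustration only.

def personalised_price(base_price: float, searches: int, likes: int, visits: int) -> float:
    # The more interest a shopper has shown, the higher the markup.
    interest = 0.02 * searches + 0.03 * likes + 0.01 * visits
    markup = min(interest, 0.30)  # cap the markup at 30%
    return round(base_price * (1 + markup), 2)

# A shopper who searched 5 times, liked 3 related posts and visited the department 4 times:
print(personalised_price(50.0, searches=5, likes=3, visits=4))  # 61.5
```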

What do people think?

In November 2015, Prof. Andrew McStay asked 2,000 UK citizens whether they would be happy to be the subject of any kind of emotion detection. He found that 50% of UK citizens are ‘not OK’ with emotion detection in any form; 33% are ‘OK’ with emotion detection if they are not personally identifiable; 8% are ‘OK’ with having data about their emotions connected with personally identifiable information; and 9% do not know. [10]

What do industries think?

Emotion A.I. is now attracting the attention, and funding, of the technology giants. For a long time it was mainly worked on by start-ups, largely because “the Amazons, the Facebooks, the Googles” were put off by “the creepiness factor”. That, however, has changed: Apple bought the emotion A.I. start-up Emotient in 2016, while Facebook is developing its own emotion-based products. In May 2017, Amazon announced that it had improved its smart assistant Alexa by using A.I. to detect human emotion.

 

Emotional A.I. raises concerns too, and it can be misused!
Do we really want our emotions to be machine-readable? How can we know that this data will be used in a way that will benefit citizens? Would we be happy for our employers to profile us at work, and perhaps make judgments on our stress management and overall competence? What about insurance companies using data accrued on our bodies and emotional state?

 

To sum up, we can answer these three important questions:

 

  • Do artificial intelligence systems need to feel emotions?
    At the moment, it does not seem that they need to feel emotions, nor does it seem that having emotions would help them perform their tasks better.
    For systems that interact with humans, however, it could be of great value to be able to detect specific, basic human emotions and react accordingly. Even if this were possible, it would not mean that the technology has emotions! Machines do not have to be empathetic; they only need to appear so.

 

  • Would it be useful for humans if robots had emotions?
    For now, knowing that machines merely seem to have emotions is sufficient but, further down the line, we will no doubt wish to establish emotional connections with these machines. Furthermore, at some point we might want these links to be reciprocal, mirroring real interpersonal relationships.

 

  • Is it possible to provide artificial intelligence with feelings and emotions?
    All of this leaves us far from creating an algorithm capable of replicating how human emotions are produced. The current standard is to use calibration stimuli (for example, several positive and negative images) and then to apply automatic learning algorithms, known as machine learning. But computational models are not the human brain, cannot replicate it, and are far from doing so.
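As a toy sketch of that calibration-plus-machine-learning approach (using scikit-learn; the feature values, labels and replies below are invented for illustration), a system could be calibrated on reactions to positive and negative images and then classify new reactions, reacting accordingly without feeling anything itself:

```python
# Toy sketch of "calibration stimuli + machine learning" for emotion detection.
# Feature values, labels and replies are invented; a real system would use
# facial-expression or physiological measurements recorded during calibration.

from sklearn.linear_model import LogisticRegression

# Each row: hypothetical features measured while showing a calibration image,
# e.g. [smile_intensity, brow_furrow], both scaled to the range 0..1.
calibration_features = [
    [0.9, 0.1],   # reaction to a positive image
    [0.8, 0.2],   # positive
    [0.1, 0.9],   # reaction to a negative image
    [0.2, 0.7],   # negative
]
calibration_labels = ["positive", "positive", "negative", "negative"]

model = LogisticRegression()
model.fit(calibration_features, calibration_labels)

# Classify a new reaction and choose a reply: the machine only *appears*
# empathetic, it does not feel anything.
new_reaction = [[0.85, 0.15]]
mood = model.predict(new_reaction)[0]
reply = "Glad you like it!" if mood == "positive" else "Sorry, let me suggest something else."
print(mood, "->", reply)
```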

 

 

Elisa Mosca

  • Instagram - eelisee_

Wall-E: the robot with feelings. A Disney Pixar fantasy that is quietly becoming reality. In 2008, a robot in love won the hearts of cinema-goers; today it is “emotionalised” software that is taking over our jobs.

When machines learn emotions

A multi-talented co-worker: who doesn't dream of that? But what if this colleague is not human at all, but an algorithm? What advantages does artificial intelligence offer, and does the future belong to it?

Artificial intelligence gives communications companies an advantage

What do these people have in common? A lonely elderly lady in Singapore. Under-age refugees in Italy looking for work. A teenager in Germany who wants to buy fairly produced meat at the supermarket. Automated communication can help them uncover problems, find solutions to individual issues and cope with crisis situations.

Automated communication in humanitarian aid


© 2020 Seminar Automated Communications