Zoom plans to read emotions from the faces of conference participants

Zoom, a globally popular video-conferencing program, plans to scan the emotions of conference participants in the future by reading their faces. Civil rights activists are sounding the alarm and warning emphatically against this emotion scanner: according to the critics, it remains to be seen how far the new function would lead to abuse and violations of the law.

The scanner is meant to determine, for example, whether students find a lecture interesting or are bored, whether a colleague is still paying proper attention, or, via facial recognition, whether a customer considers a price too high. Zoom, made by a US company, is used mainly for online video conferencing. Artificial intelligence is now being deployed in many areas of the product, including reading the body language of conference participants; the AI analyzes this body language in order to interpret their mood.

There are many opponents

More than 30 organizations are now speaking out against these plans. They do not want moods to be read: the artificial intelligence does not work reliably, so it produces false assessments, and the practice violates privacy and human rights at the same time. For these reasons, the organizations argue, the function should not be developed further.

Penalties can be the consequence

Critics fear that people could face punishment for supposedly expressing and showing the "wrong" feelings. They therefore call the measure not only racist but also erroneous and misleading: not everyone has the same facial expressions, and body language and voice pitch also differ from person to person. People with certain disabilities or of certain ethnicities are automatically discriminated against by such measures; the facial recognition and the underlying artificial intelligence reportedly fail for people of Asian appearance or with dark skin. The software can also be misused: if supposedly false feelings are detected, universities or employers could respond with penalties, and the artificial intelligence puts pressure on salespeople. It is nothing new that data from such software is used for marketing. Zoom already offers a similar tool to optimize sales conversations, which afterwards checks, for example, how many filler words were used or whether the speaker showed too little patience.

A tool for the evaluation of employees

The new software is advertised as a way to evaluate employee performance. Some companies already use artificial intelligence to read emotions from camera images; call-center employees in particular are often monitored with such software solutions.