Tech companies using pandemic to push AI surveillance tools: report

As the pandemic has led people and authorities to focus on fighting the coronavirus, some tech companies are using the situation as a pretext to push ‘unproven’ artificial intelligence (AI) tools into workplaces and schools, according to a report in the journal Nature. Amid serious debate about the potential for misuse of these technologies, several emotion-reading tools are being marketed for the remote monitoring of children and workers, claiming to predict their emotions and performance. These tools capture emotions in real time and promise to help organizations and schools better understand their employees and students, respectively.

For example, one of the tools decodes facial expressions and places them in categories such as happiness, sadness, anger, disgust, surprise, and fear.

One such program, called 4 Little Trees, was developed in Hong Kong and claims to assess children’s emotions while they do their classwork. Kate Crawford, academic researcher and author of Atlas of AI, writes in Nature that such technology needs to be regulated for better policymaking and public trust.

An example that could be used to build a case against such AI is the polygraph test, commonly referred to as the “lie detector,” which was invented in the 1920s. The FBI and the US military used the method for decades until its use was eventually banned.

Any use of AI for arbitrary surveillance of the general public, Crawford argues, must be preceded by credible regulatory oversight. “It could also help set standards to counter overreach by companies and governments,” she writes.

She also cited a system developed by psychologist Paul Ekman, who standardized six human emotions to make them suitable for computer vision. After the attacks of September 11, 2001, Ekman sold his system to US authorities to identify airline passengers showing fear or stress, so they could be investigated for involvement in terrorist acts. The system was severely criticized for racial bias and a lack of credibility.

Allowing these technologies without independently auditing their effectiveness would be unfair: job applicants would be judged unfairly because their facial expressions do not match those of existing employees, and students would be flagged at school because a machine read their expressions as angry. Crawford has called for legislative protection against unproven uses of these tools.