New Study Examines Acceptance of Emotional AI Among Gen Z
A new study out of Ritsumeikan Asia Pacific University in Japan examines the socio-cultural factors that influence the acceptance of AI technology among Generation Z.
Emotional AI, artificial intelligence that senses and engages with human emotions, is growing rapidly and is being used in a wide range of applications. However, it remains largely unregulated and takes little account of cultural differences. For this reason, the team considers it crucial to study its acceptance among Gen Z, the demographic most vulnerable to emotional AI.
The team consisted of researchers from Japan and Vietnam, and the research was published in Technology in Society.
The new study was part of the project “Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for an Ethical Life.”
Non-Conscious Data Collection (NCDC)
Algorithms are becoming increasingly adept at sensing and responding to human emotions, and as a result more of them are being embedded into mainstream systems. Emotional AI relies on a process known as “non-conscious data collection,” or NCDC, in which the algorithm gathers data on the user’s heart and respiration rates, voice tone, micro-facial expressions, gestures, and more. These signals enable the system to analyze the user’s mood and personalize its response.
Emotional AI raises many ethical and privacy concerns, which makes it important to explore how Gen Z feels about it. Gen Z makes up 36% of the global workforce and is likely the demographic most vulnerable to emotional AI.
Professor Peter Mantello was one of the researchers involved with the study.
“NCDC represents a new development in human-machine relations and is far more invasive than previous AI technologies. In light of this, there is an urgent need to better understand its impact and acceptance among members of Gen Z,” Prof. Mantello says.
Gen Z’s Response to Emotional AI
The team’s study surveyed 1,015 Gen Z respondents across 48 countries and eight regions. Participants were asked about their attitudes toward the use of NCDC by commercial and state actors. Bayesian multilevel analysis was then used to control for confounding variables and estimate the effect of each factor.
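Multilevel (hierarchical) models of the kind used here "partially pool" information across groups: estimates for small groups are pulled toward the overall average, while large groups stay close to their own data. As a rough illustration of that idea only (this is not the authors' code, and the region names and counts below are invented for the example), a minimal empirical-Bayes sketch in Python:

```python
import numpy as np

# Hypothetical counts of respondents concerned about NCDC, per region.
# These numbers are illustrative only, not data from the study.
concerned = np.array([30, 12, 110, 5])   # concerned respondents per group
totals    = np.array([50, 20, 200, 10])  # total respondents per group

# Beta-Binomial partial pooling: treat the overall concern rate as a
# prior worth `kappa` pseudo-observations for every group.
global_rate = concerned.sum() / totals.sum()
kappa = 20.0  # prior strength; larger values shrink harder

raw    = concerned / totals
pooled = (concerned + kappa * global_rate) / (totals + kappa)

for r, p in zip(raw, pooled):
    # Small groups (e.g. n=10) move furthest toward the global rate.
    print(f"raw={r:.2f}  partially pooled={p:.2f}")
```

A full Bayesian multilevel analysis would additionally model predictors such as gender, income, and religion, but the shrinkage behavior shown here is the core mechanism that lets such models compare many small subgroups reliably.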
The study found that, overall, more than 50% of respondents were concerned about the use of NCDC, though attitudes varied with gender, income, education level, and religion.
Professor Nader Ghotbi was also involved with the study.
“We found that being male and having high income were both correlated with having positive attitudes towards accepting NCDC. In addition, business majors were more likely to be tolerant towards NCDC,” Prof. Ghotbi says.
Cultural factors like region and religion also had an impact. Individuals from Southeast Asia, Muslims, and Christians reported more concern over NCDC.
“Our study clearly demonstrates that sociocultural factors deeply impact the acceptance of new technology. This means that theories based on the traditional technology acceptance model by Davis, which does not account for these factors, need to be modified,” says Prof. Mantello.
In response to these concerns, the team suggested a “mind-sponge” model-based approach that accounts for socio-cultural factors when assessing the acceptance of AI technology. It also recommended building a thorough understanding of the potential risks, which could help enable effective governance and ethical design.
“Public outreach initiatives are needed to sensitize the population about the ethical implications of NCDC. These initiatives need to consider the demographic and cultural differences to be successful,” says Dr. Nguyen.