Journal of Behavioral Data Science, 2023, 3 (1), 59–69.
DOI: https://doi.org/10.35566/jbds/v3n1/wyman

API Face Value: Evaluating the Current Status and Potential of Emotion Detection Software in Emotional Deficit Interventions

Austin T. Wyman and Zhiyong Zhang
University of Notre Dame, Notre Dame, USA
awyman@nd.edu
Abstract. Emotion recognition application programming interface (API) is a recent advancement in computing technology that synthesizes computer vision, machine-learning algorithms, deep-learning neural networks, and other information to detect and label human emotions. The strongest iterations of this technology are produced by technology giants with large cloud infrastructure (e.g., Google and Microsoft), boasting high true positive rates. We review the current status of applications of emotion recognition API in psychological research and find that, despite evidence of spatial, age, and race bias effects, API is improving the accessibility of clinical and educational research. Specifically, emotion detection software can assist individuals with emotion-related deficits (e.g., Autism Spectrum Disorder, Attention Deficit-Hyperactivity Disorder, Alexithymia). API has been incorporated into various computer-assisted interventions for Autism, where it has been used to diagnose, train, and monitor emotional responses to one’s environment. We identify API’s potential to enhance interventions in other emotional dysfunction populations and to address various professional needs. Future work should aim to address the bias limitations of API software and expand its utility in subfields of clinical, educational, neurocognitive, and industrial-organizational psychology.

Keywords: API · Emotion Recognition · Machine Learning · ASD · ADHD · Alexithymia

Emotions, their expression, and their understanding are often described as unique characteristics of human life and development; however, with the growing sophistication of computer vision and machine learning, computing technology is rapidly shrinking the disparity between human and artificial intelligence. This evolution is particularly marked by a redefinition of artificial intelligence (Lisetti & Schiano, 2000). While originally referring to computers’ ability to perform cognitive tasks, artificial intelligence has now expanded to include a variety of subfields, including artificial wisdom (Jeste et al., 2020) and emotional intelligence (Erol et al., 2020; Poria, Majumder, Mihalcea, & Hovy, 2019). These advancements represent the latest hurdles computing technology must jump to match human intelligence capabilities (Schuller & Schuller, 2018), progress that has the potential to enhance psychological research. The expansion of emotion detection software is highly relevant to the improvement of measurement in psychological research and practice.

1 Introduction to Emotion Recognition API

Application programming interface (API) is a broad term that describes any means of communication between two or more computer programs. In particular, emotion recognition APIs allow the synthesis of computer vision, machine learning algorithms, deep learning neural networks, and other components in order to accurately detect and label human emotions (Deshmukh & Jagtap, 2017). Emotion API performance is further enhanced by cloud-based support, which continuously supplies learning algorithms with servers full of facial and emotional data (Khanal, Barroso, Lopes, Sampaio, & Filipe, 2018). Naturally, technology giants with the largest cloud infrastructure (e.g., Amazon, Microsoft, Google) are the most equipped to construct accurate emotion recognition programs. While the specific expressions that can be detected vary from program to program, most algorithms are minimally equipped to identify the six basic human emotions: disgust, contempt, anger, fear, surprise, and sadness (Deshmukh & Jagtap, 2017).
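To make the request-response pattern concrete, the following minimal sketch illustrates how a client program might query a cloud emotion recognition API. The endpoint URL, credential, and JSON response schema are hypothetical placeholders; actual services (e.g., Microsoft Azure, Google Cloud Vision) define their own URLs, authentication schemes, and response formats.

```python
import requests

# Hypothetical endpoint and response schema, for illustration only; real
# services each define their own URLs, authentication, and JSON formats.
API_URL = "https://api.example.com/v1/emotion"  # placeholder URL
API_KEY = "YOUR_API_KEY"                        # placeholder credential

def detect_emotion(image_path: str) -> str:
    """Send an image to the (hypothetical) API and return the top emotion label."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"emotions": {"anger": 0.02, "sadness": 0.91, ...}}
    scores = response.json()["emotions"]
    return max(scores, key=scores.get)

if __name__ == "__main__":
    print(detect_emotion("face.jpg"))  # e.g., "sadness"
```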

The leading iterations of this technology are Microsoft Azure and Google Cloud Vision, which offer distinct advantages over one another in emotion recognition (Khanal et al., 2018). Microsoft’s API leads in overall accuracy, reporting high true positive (TP) rates for straight-facing and partially straight-facing profiles (Half Left Face TP = 60%; Straight Face TP = 74.9%; Half Right TP = 57.4%). Google’s API, however, can detect a wider range of facial profiles, particularly side-facing ones, but with reduced accuracy (Full Left Face TP = 7.3%; Half Left TP = 42.9%; Straight TP = 45.2%; Half Right TP = 43.2%; Full Right TP = 10.4%). Weak non-frontal facial recognition is a significant limitation, but the implementation of new machine learning frameworks is gradually improving detection accuracy (Lin, Ma, Gong, & Wang, 2022).
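The true positive rates reported above are simple proportions: within each facial pose, the share of faces whose predicted emotion matches the ground-truth label. A minimal sketch with toy data (not the Khanal et al., 2018 dataset) shows the computation.

```python
def true_positive_rate(predicted, actual):
    """Fraction of faces whose predicted emotion matches the true label."""
    hits = sum(p == a for p, a in zip(predicted, actual))
    return hits / len(actual)

# Toy predictions and ground truth grouped by facial pose (illustrative only).
results_by_pose = {
    "straight":  (["joy", "anger", "joy", "fear"], ["joy", "anger", "sadness", "fear"]),
    "half_left": (["joy", "joy", "anger", "fear"], ["joy", "anger", "anger", "sadness"]),
}

for pose, (pred, truth) in results_by_pose.items():
    print(f"{pose}: TP rate = {true_positive_rate(pred, truth):.1%}")
```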

It is important to mention that programming limitations are often readily addressed in future software updates, but sampling limitations require more targeted attention. Emotion APIs are typically trained with large samples of facial and corresponding emotion data, but a lack of diverse data often makes it difficult to account for physiological differences in emotion expression among different groups (Hernandez et al., 2021). Due to convenience sampling, training samples predominantly consist of white, young adults in America. This produces significant racial and age bias effects, which further confound previous accuracy estimates. For example, Microsoft’s and Amazon’s APIs are more likely to label Black participants’ neutral faces as angry or contemptuous compared to white participants exhibiting the same emotion (Kyriakou, Kleanthous, Otterbacher, & Papadopoulos, 2020; Rhue, 2018). Additionally, these programs demonstrate reduced accuracy with middle-aged and older adults compared to young adult participants (Kim, Bryant, Srikanth, & Howard, 2021). Gender bias was a serious concern in previous API iterations (Klare, Burge, Klontz, Vorder-Bruegge, & Jain, 2012), but current research suggests that this disparity was addressed in recent updates (Kim et al., 2021). Whether through sampling adjustment or algorithmic improvement (Howard, Zhang, & Horvitz, 2017), it is likely that emotion recognition API will become more accurate across racial and age groups, but progress in this area requires targeted attention to improving representation.
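A bias audit of this kind amounts to grouping API output by a demographic attribute and comparing error rates across groups. The sketch below uses fabricated records, not data from the cited studies, to compute how often true-neutral faces are mislabeled as angry or contemptuous within each group.

```python
import pandas as pd

# Fabricated (group, true label, API label) records, for illustration only.
records = [
    ("group_a", "neutral", "neutral"), ("group_a", "neutral", "anger"),
    ("group_a", "neutral", "neutral"), ("group_b", "neutral", "anger"),
    ("group_b", "neutral", "contempt"), ("group_b", "neutral", "neutral"),
]
df = pd.DataFrame(records, columns=["group", "true", "predicted"])

# Among true-neutral faces, how often does the API output anger or contempt?
neutral = df[df["true"] == "neutral"]
misread = neutral["predicted"].isin(["anger", "contempt"])
print(neutral.assign(misread=misread).groupby("group")["misread"].mean())
```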

2 Current Applications of Emotion Recognition API

The present review searched electronic databases (i.e., Google Scholar and PsycINFO) using five term categories: “emotion API”, “emotion detection”, “psychology”, “intervention”, and “emotion deficit.” Studies published since 2017 were included in the review if they (a) were published in English, (b) developed a new intervention using emotion recognition API, and (c) targeted individuals with emotion-related deficits. Current publications on emotion recognition API have seldom reached the mainstream of psychology research, as most studies exploring this technology have focused more on the computer science and algorithmic strength of the software than on its applications in measuring psychological constructs. Nonetheless, key methodologies have emerged in clinical, neurodevelopmental, and educational psychology research (see Table 1).

Table 1: Current Applications of Emotion Recognition API in Psychology

Author (year)                                              | Area         | Software
-----------------------------------------------------------|--------------|----------------------
Alharbi and Huang (2020)                                   | Clinical     | Microsoft
Bharatharaj, Huang, Mohan, Al-Jumaily, and Krägeloh (2017) | Clinical     | Oxford
Grossard et al. (2017)                                     | Clinical     | -
Jiang et al. (2019)                                        | Clinical     | -
Liu, Wu, Zhao, and Luo (2017)                              | Clinical     | -
Manfredonia et al. (2018)                                  | Clinical     | FACET
Chu, Tsai, Liao, and Chen (2017)                           | Educational  | FACEAPI
Chu, Tsai, Liao, Chen, and Chen (2020)                     | Educational  | Face Tracking API 3.2
Borsos, Jakab, Stefanik, Bogdán, and Gyori (2022)          | Quantitative | FR8
Flynn et al. (2020)                                        | Quantitative | iMotions

In clinical and neurodevelopmental areas, emotion detection software has assisted with the monitoring, treatment, and education of various individuals with emotion-related deficits. Byrne, Bogue, Egan, and Lonergan (2016) write that “psychological mindedness,” the process of identifying and describing emotions, is an “explicit mentalizing capacity that is needed to engage effectively in psychotherapy.” Many talk-therapy techniques rely upon a baseline level of emotional intelligence, requiring that individuals are able to understand their own and others’ emotions. However, clients with emotional deficits struggle with emotion recognition and, therefore, may not benefit from talk therapy. Thus, emotion-related interventions are an important gateway step to other substantive areas of mental health treatment.

An expansive body of literature has investigated interventions for improving psychological mindedness in neurodevelopmental disorders, particularly Autism Spectrum Disorder (ASD) and Attention Deficit-Hyperactivity Disorder (ADHD). These populations often struggle with reduced empathy (Baron-Cohen & Wheelwright, 2004; Da Fonseca, Seguier, Santos, Poinso, & Deruelle, 2009; Uekermann et al., 2010) and emotion self-regulation (Braaten & Rosen, 2000), producing significant behavioral problems (Milton, 2012). The prime window to treat emotional deficits is during early childhood, but many individuals are diagnosed later in life. As the brain matures and patterns of dysfunctional social cognition become fixed, it is incredibly difficult to teach fundamental skills like empathy (Baron-Cohen, 2009). Given the behavioral consequences of ASD and ADHD in adolescents and adults, it is important that current emotional-deficit interventions are expanded to include populations in late-stage treatment.

Emotion recognition API, thus, is valuable because it can be readily incorporated into a variety of intervention settings and stages, from diagnosis to late-stage treatment (Liu et al., 2017). Regarding diagnosis, Manfredonia et al. (2018) used facial expression analysis software to measure differences in emotion expression and replicated diagnoses for ASD participants ranging in age from 9 to 54 years. Similarly, Jiang et al. (2019) synthesized emotion recognition and eye-tracking software to achieve a diagnostic accuracy rate competitive with that of professional psychologists. After diagnosis, emotion API has been used to provide engaging education that builds emotion-related skills in participants with ASD. For example, Bharatharaj et al. (2017) developed a semi-autonomous robot, presented as a toy parrot, that used Oxford API to monitor emotion regulation and practice social interaction with children with ASD. Alharbi and Huang (2020) designed computer games that reward children with ASD for accurately matching facial expressions in order to train empathy and communication skills. Many other popular games have been adapted using emotion API and computer-assisted instruction (Grossard et al., 2017), which improves both the accessibility and the entertainment value of diagnostic and intervention strategies for children with neurodevelopmental disorders.
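As a schematic of how expression-derived features can feed a diagnostic classifier, the sketch below trains a standard classifier on synthetic features standing in for emotion-intensity and gaze summaries. It is illustrative only, assuming made-up features and labels, and is not the pipeline used by Jiang et al. (2019) or Manfredonia et al. (2018).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins: 8 features per participant (e.g., mean intensity per
# emotion plus gaze statistics) and toy diagnostic labels (1 = ASD, 0 = TD).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
y = rng.integers(0, 2, 100)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```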

The concern of emotion regulation in children also emerges within the educational psychology literature, with multiple studies demonstrating that students with better emotion regulation ability perform better in the classroom and have higher levels of academic achievement (Gumora & Arsenio, 2002; Howse, Calkins, Anastopoulos, Keane, & Shelton, 2003). Naturally, students with emotional deficits, such as ASD, report much lower academic achievement than typically developing students (Ashburner, Ziviani, & Rodger, 2010). In e-learning environments, emotion recognition API has been used to detect emotion changes in students with ASD during assessments (Chu et al., 2017), enabling targeted intervention strategies. Chu et al. (2020) followed up with an emotion API-based intervention that used computer adaptive testing to identify and address learning stress in students with ASD; this intervention significantly improved students’ math performance compared with baseline scores.
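At its simplest, the transition detection underlying such e-learning systems scans a stream of per-frame emotion labels for changes, so that a sustained shift toward negative affect can trigger an adaptive response. The sketch below uses toy labels; the actual logic of Chu et al. (2017, 2020) may differ.

```python
def emotion_transitions(frame_labels):
    """Return (frame index, from_label, to_label) for each change in the stream."""
    return [
        (i, frame_labels[i - 1], frame_labels[i])
        for i in range(1, len(frame_labels))
        if frame_labels[i] != frame_labels[i - 1]
    ]

# Toy per-frame labels recorded during an assessment item (illustrative only).
labels = ["neutral"] * 5 + ["fear"] * 3 + ["neutral"] * 2 + ["anger"] * 4
for frame, before, after in emotion_transitions(labels):
    print(f"frame {frame}: {before} -> {after}")
```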

From a measurement perspective, some studies have raised concerns about the reliability of the software’s emotion estimates. Borsos et al. (2022) evaluated the test-retest reliability of emotion API and found small but significant differences in the ratings. Flynn et al. (2020) observed group differences in the accuracy of emotion estimates between children and adults. However, it is important to note that both of these studies used emotion detection software (FR8 and iMotions, respectively) that is scarcely discussed in the literature compared to the API produced by tech giants (e.g., Google Cloud and Microsoft Azure). These limitations are likely not representative of the method as a whole because these studies operated on less-than-standard measurement tools. Nonetheless, some inconsistency in emotion API responses is to be expected, which highlights the imperfect nature of emotion estimates. However, the adaptability of emotion detection software is a critical strength of this measurement approach, and as the software is incrementally improved over time, the accuracy of emotion estimates will also improve.
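Test-retest reliability of this kind can be examined by correlating paired emotion-intensity estimates from two sessions and testing for a systematic shift between them. The sketch below uses simulated scores, not the FaceReader 8.0 output analyzed by Borsos et al. (2022).

```python
import numpy as np
from scipy import stats

# Simulated paired intensity scores for the same stimuli in two sessions,
# with a small systematic shift built in (illustrative only).
rng = np.random.default_rng(1)
session1 = rng.uniform(0, 1, 40)
session2 = session1 + rng.normal(0.03, 0.05, 40)

r, _ = stats.pearsonr(session1, session2)   # consistency across sessions
t, p = stats.ttest_rel(session1, session2)  # systematic mean difference
print(f"test-retest r = {r:.2f}; paired t = {t:.2f}, p = {p:.3f}")
```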

3 Potential Applications of Emotion Recognition API

Beyond neurodevelopmental disorders, the clinical literature expresses a need for interventions that address a wide range of psychopathology exhibiting emotion-related deficits. Alexithymia and empathy-related concerns are present in many other disorder classifications, particularly personality disorders (De Panfilis, Ossola, Tonna, Catania, & Marchesi, 2015; Thoma, Friedmann, & Suchan, 2013), and often lead to interpersonal dysfunction (Cook, Brewer, Shah, & Bird, 2013; Vanheule, Desmet, Meganck, & Bogaerts, 2007) as well as internalizing and externalizing behavior (Aldao et al., 2016). Current personality pathology interventions often rely on self-report instruments, which carry various validity concerns (Haeffel & Howard, 2010). Thus, the increased availability of emotion detection software has the potential to expand the range of options in how emotion-related experiments are designed. Emotion API has demonstrated effectiveness in predicting Big Five personality traits and risk-taking behavior (Gloor et al., 2022), the latter being a significant facet of pathological personality (Watson & Clark, 2020). Detection software could be readily incorporated into studies examining operational ways of measuring emotion dysregulation and psychopathological traits.

Regarding interventions, API strategies for neurodevelopmental disorders have not been tested on other psychopathology, but these interventions could generalize well to disorders that exhibit similar transdiagnostic traits. For example, Antisocial Personality Disorder and Narcissistic Personality Disorder often overlap with ASD and ADHD (Matthies & Philipsen, 2016). Emotion API could be an incredibly valuable tool in the measurement and design of pathological personality interventions, moving the field beyond its current self-report methodology in ways that could benefit researchers and practitioners alike.

Emotion API interventions could also generalize to the broader industrial-organizational need for better emotional intelligence training. Emotional intelligence is frequently measured in industrial-organizational contexts and is associated with multiple occupational outcomes, including job performance, retention, and interpersonal relations (Prentice, Lopes, & Wang, 2020). Thus, many industries have a strong vested interest in screening for candidates with high emotional intelligence or enhancing the emotional intelligence of their current employees. Perceiving facial expressions is often described as a basic facet of emotional intelligence (Hildebrandt, Sommer, Schacht, & Wilhelm, 2015) and is often a targeted topic in emotional intelligence training programs. Employers and industrial-organizational researchers could capitalize on the automated and adaptive features of emotion recognition API to quickly improve employees’ emotional intelligence ability. API-based programs in emotion regulation could be inserted as a complement to existing modules on effective nonverbal communication and empathy.

As mentioned previously, emotion regulation is a critical component of students’ success in the classroom (Gumora & Arsenio, 2002; Howse et al., 2003), but other aspects of emotional functioning are relevant as well. Despite a common reluctance to express negative emotions, the literature shows that expressing negative emotions is a way to elicit support and build stronger relationships (Graham et al., 2008). Students who express their emotions less openly are less likely to receive help when struggling because they are often unable to call attention to signs of distress. Accordingly, emotion expression interventions similar to the ones currently used for ASD could help acclimate these students to the importance of emotional expression. Alternatively, emotion API could be integrated into research focusing on instructors. The literature suggests that the emotion regulation ability of instructors also impacts student engagement and success in the classroom (Sutton et al., 2009; Wang & Ye, 2021). Detection software could complement classroom observation studies, generating ecological momentary assessments of instructors’ emotion regulation ability over the course of a lecture, which may be more accurate and reliable than current self-report or interview assessment strategies; a simple sketch follows.
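A minimal sketch of such an ecological momentary assessment, assuming hypothetical per-minute negative-affect scores from an emotion API, flags sustained spans of elevated negativity over a lecture; the scores, window size, and threshold are illustrative placeholders.

```python
import statistics

# Hypothetical per-minute negative-affect scores over a lecture (placeholders).
minute_scores = [0.1, 0.2, 0.15, 0.6, 0.7, 0.65, 0.2, 0.1]

WINDOW, THRESHOLD = 3, 0.5  # illustrative smoothing window and cutoff
for start in range(len(minute_scores) - WINDOW + 1):
    mean = statistics.mean(minute_scores[start:start + WINDOW])
    if mean > THRESHOLD:
        print(f"minutes {start}-{start + WINDOW - 1}: elevated negativity ({mean:.2f})")
```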

Broadly speaking, the integration of computational research methods would greatly benefit all areas of psychology, and this can especially be seen with emotion recognition API. The software allows researchers to easily collect and assign quantitative values to emotion-related data (Yannakakis, Cowie, & Busso, 2021), which increases the feasibility of collecting larger datasets without compromising data quality. Emotion recognition API triumphs in efficiency over traditional measurement approaches, which are often lengthy, unreliable, and cumbersome. Neurocognitive research could especially benefit from increased efficiency in data collection, as inefficient collection is a contributing factor to concerns of low statistical power in current research (Button et al., 2013). Such an upgrade in statistical power is highly important and has the potential to increase the frequency and reproducibility of emotion-related research in clinical trials and neurocognitive work.
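The statistical power argument can be made concrete with a standard power calculation: holding effect size and alpha fixed, the larger samples that automated emotion measurement makes affordable yield higher power. A minimal sketch using statsmodels, with illustrative numbers:

```python
from statsmodels.stats.power import TTestIndPower

# Power of a two-sample t-test for a medium effect (d = 0.5) at alpha = .05,
# evaluated at several per-group sample sizes (illustrative numbers).
analysis = TTestIndPower()
for n_per_group in (20, 50, 100):
    power = analysis.power(effect_size=0.5, nobs1=n_per_group, alpha=0.05)
    print(f"n = {n_per_group} per group -> power = {power:.2f}")
```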

4 Conclusion

Although the integration of emotion recognition API in psychology is very much in its infancy, several subfields would benefit from an expansion of this highly adaptive area of measurement. Clinical research could enhance current interventions, develop new models of treatment, and establish new methods of measuring domains of emotional functioning. Industrial-organizational research could develop new emotional intelligence indexes and training programs. Educational research could identify new ways of identifying and supporting students in the classroom. And neurocognitive research could gain statistical power and sharpen the precision with which the neural mechanisms behind emotional expression are characterized. As this technology becomes more accessible, future studies should investigate API in all of these important disciplines and in other, as yet unidentified but equally important areas. Although there are significant concerns about reliability and bias in the current software, the incremental improvement of cloud-based programs suggests that API is becoming a more reliable tool. Understanding emotions is a fundamental facet of human life, and emotion recognition API will allow psychologists to understand this phenomenon even further.

Acknowledgement

This study is supported by a grant from the Department of Education (R305D210023). However, the contents of the study do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government.

References

   Alharbi, M., & Huang, S. (2020). An augmentative system with facial and emotion recognition for improving social skills of children with autism spectrum disorders. In 2020 IEEE International Systems Conference (SysCon) (pp. 1–6). doi: https://doi.org/10.1109/SysCon47679.2020.9275659

   Ashburner, J., Ziviani, J., & Rodger, S. (2010). Surviving in the mainstream: Capacity of children with autism spectrum disorders to perform academically and regulate their emotions and behavior at school. Research in Autism Spectrum Disorders, 4(1), 18–27. doi: https://doi.org/10.1016/j.rasd.2009.07.002

   Baron-Cohen, S. (2009). Autism: The empathizing-systemizing (E-S) theory. Annals of the New York Academy of Sciences, 1156(1), 68–80. doi: https://doi.org/10.1111/j.1749-6632.2009.04467.x

   Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high-functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175. doi: https://doi.org/10.1023/B:JADD.0000022607.19833.00

   Bharatharaj, J., Huang, L., Mohan, R. E., Al-Jumaily, A., & Krägeloh, C.  (2017). Robot-assisted therapy for learning and social interaction of children with autism spectrum disorder. Robotics, 6(1), 4. doi: https://doi.org/10.3390/robotics6010004

   Borsos, Z., Jakab, Z., Stefanik, K., Bogdán, B., & Gyori, M. (2022). Test–retest reliability in automated emotional facial expression analysis: Exploring FaceReader 8.0 on data from typically developing children and children with autism. Applied Sciences, 12(15), 7759. doi: https://doi.org/10.3390/app12157759

   Braaten, E. B., & Rosen, L. A. (2000). Self-regulation of affect in attention deficit-hyperactivity disorder (ADHD) and non-ADHD boys: Differences in empathic responding. Journal of Consulting and Clinical Psychology, 68(2), 313–321. doi: https://doi.org/10.1037/0022-006X.68.2.313

   Button, K. S., Ioannidis, J. P., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S., & Munafò, M. R. (2013). Power failure: Why small sample size undermines the reliability of neuroscience. Nature Reviews Neuroscience, 14, 365–376. doi: https://doi.org/10.1038/nrn3475

   Byrne, G., Bogue, J., Egan, R., & Lonergan, E. (2016). “Identifying and describing emotions”: Measuring the effectiveness of a brief, alexithymia-specific intervention for a sex offender population. Sexual Abuse: A Journal of Research and Treatment, 28(7), 599–619. doi: https://doi.org/10.1177/1079063214528822

   Chu, H.-C., Tsai, W.-H., Liao, M.-J., & Chen, Y.-M. (2017). Facial emotion recognition with transition detection for students with high-functioning autism in adaptive e-learning. Soft Computing, 22, 2973–2999. doi: https://doi.org/10.1007/s00500-017-2549-z

   Chu, H.-C., Tsai, W.-H., Liao, M.-J., Chen, Y.-M., & Chen, J.-Y. (2020). Supporting e-learning with emotion regulation for students with autism spectrum disorder. Educational Technology & Society, 23(4), 124–146. Retrieved from https://www.jstor.org/stable/26981748

   Cook, R., Brewer, R., Shah, P., & Bird, G. (2013). Alexithymia, not autism, predicts poor recognition of emotional facial expressions. Psychological Science, 24(5), 723–732. doi: https://doi.org/10.1177/0956797612463582

   Da Fonseca, D., Seguier, V., Santos, A., Poinso, F., & Deruelle, C. (2009). Emotion understanding in children with ADHD. Child Psychiatry and Human Development, 40(1), 111–121. doi: https://doi.org/10.1007/s10578-008-0114-9

   De Panfilis, C., Ossola, P., Tonna, M., Catania, L., & Marchesi, C. (2015). Finding words for feelings: The relationship between personality disorders and alexithymia. Personality and Individual Differences, 74, 285–291. doi: https://doi.org/10.1016/j.paid.2014.10.050

   Deshmukh, R. S., & Jagtap, V. (2017). A survey: Software API and database for emotion recognition. In 2017 International Conference on Intelligent Computing and Control Systems (ICICCS) (pp. 284–289). doi: https://doi.org/10.1109/ICCONS.2017.8250727

   Erol, B. A., Majumdar, A., Benavidez, P., Rad, P., Choo, K.-K. R., & Jamshidi, M.  (2020). Toward artificial emotional intelligence for cooperative social human-machine interaction. IEEE Transactions on Computational Social Systems, 7(1), 234–246. doi: https://doi.org/10.1109/TCSS.2019.2922593

   Flynn, M., Effraimidis, D., Angelopoulou, A., Kapetanios, E., Williams, D., Hemanth, J., & Towell, T. (2020). Assessing the effectiveness of automated emotion recognition in adults and children for clinical investigation. Frontiers in Human Neuroscience, 14. doi: https://doi.org/10.3389/fnhum.2020.00070

   Gloor, P. A., Colladon, A. F., Altuntas, E., Cetinkaya, C., Kaiser, M. F., Ripperger, L., & Schaefer, T. (2022). Your face mirrors your deepest beliefs—predicting personality and morals through facial emotion recognition. Future Internet, 14(1), 5. doi: https://doi.org/10.3390/fi14010005

   Grossard, C., Grynspan, O., Serret, S., Jouen, A.-L., Bailly, K., & Cohen, D.  (2017). Serious games to teach social interactions and emotions to individuals with autism spectrum disorders (ASD). Computers & Education, 113, 195–211. doi: https://doi.org/10.1016/j.compedu.2017.05.002

   Gumora, G., & Arsenio, W. F. (2002). Emotionality, emotion regulation, and school performance in middle school children. Journal of School Psychology, 40(5), 395–413. doi: https://doi.org/10.1016/S0022-4405(02)00108-5

   Haeffel, G. J., & Howard, G. S. (2010). Self-report: Psychology’s four-letter word. The American Journal of Psychology, 123(2), 181–188. doi: https://doi.org/10.2307/40827643

   Hernandez, J., Lovejoy, J., McDuff, D., Suh, J., O’Brien, T., Sethumadhavan, A., … Czerwinski, M. (2021). Guidelines for assessing and minimizing risks of emotion recognition applications. In 2021 9th International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 1–8). doi: https://doi.org/10.1109/ACII52823.2021.9597452

   Hildebrandt, A., Sommer, W., Schacht, A., & Wilhelm, O. (2015). Perceiving and remembering emotional facial expressions–A basic facet of emotional intelligence. Intelligence, 50, 52–67. doi: https://doi.org/10.1016/j.intell.2015.02.003

   Howard, A., Zhang, C., & Horvitz, E. (2017). Addressing bias in machine learning algorithms: A pilot study on emotion recognition for intelligent systems. In 2017 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO) (pp. 1–7). doi: https://doi.org/10.1109/ARSO.2017.8025197

   Howse, R. B., Calkins, S. D., Anastopoulos, A. D., Keane, S. P., & Shelton, T. L. (2003). Regulatory contributors to children’s kindergarten achievement. Early Education and Development, 14(1), 101–120. doi: https://doi.org/10.1207/s15566935eed1401_7

   Jeste, D. V., Graham, S. A., Nguyen, T. T., Depp, C. A., Lee, E. E., & Kim, H. (2020). Beyond artificial intelligence: Exploring artificial wisdom. International Psychogeriatrics, 32(8), 993–1001. doi: https://doi.org/10.1017/S1041610220000927

   Jiang, M., Francis, S. M., Srishyla, D., Conelea, C., Zhao, Q., & Jacob, S.  (2019). Classifying individuals with ASD through facial emotion recognition and eye-tracking. In 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (pp. 6063–6068). doi: https://doi.org/10.1109/EMBC.2019.8857005

   Khanal, S. R., Barroso, J., Lopes, N., Sampaio, J., & Filipe, V. (2018). Performance analysis of Microsoft’s and Google’s emotion recognition API using pose-invariant faces. In Proceedings of the 8th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion (pp. 172–178). doi: https://doi.org/10.1145/3218585.3224223

   Kim, E., Bryant, D., Srikanth, D., & Howard, A. (2021). Age bias in emotion detection: An analysis of facial emotion recognition performance on young, middle-aged, and older adults. In Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (pp. 638–644). doi: https://doi.org/10.1145/3461702.3462609

   Klare, B. F., Burge, M. J., Klontz, J. C., Vorder-Bruegge, R. W., & Jain, A. K. (2012). Face recognition performance: Role of demographic information. IEEE Transactions on Information Forensics and Security, 7(6), 1789–1801. doi: https://doi.org/10.1109/TIFS.2012.2214212

   Kyriakou, K., Kleanthous, S., Otterbacher, J., & Papadopoulos, G. A.  (2020). Emotion-based stereotypes in image analysis services. In Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization (pp. 252–259). doi: https://doi.org/10.1145/3386392.3399567

   Lin, H., Ma, H., Gong, W., & Wang, C.  (2022). Non-frontal face recognition method with a side-face-correction generative adversarial networks. In 2022 3rd International Conference on Computer Vision, Image and Deep Learning & International Conference on Computer Engineering and Applications (CVIDL & ICCEA) (pp. 563–567). doi: https://doi.org/10.1109/CVIDLICCEA56201.2022.9825237

   Lisetti, C. L., & Schiano, D. J. (2000). Automatic facial expression interpretation: Where human-computer interaction, artificial intelligence and cognitive science intersect. Pragmatics and Cognition, 8(1), 185–235. doi: https://doi.org/10.1075/pc.8.1.09lis

   Liu, X., Wu, Q. J., Zhao, W., & Luo, X. (2017). Technology-facilitated diagnosis and treatment of individuals with autism spectrum disorder: An engineering perspective. Applied Sciences, 7(10), 1051. doi: https://doi.org/10.3390/app7101051

   Manfredonia, J., Bangerter, A., Manyakov, N. V., Ness, S., Lewin, D., Skalkin, A., et al. (2018). Automatic recognition of posed facial expression of emotion in individuals with autism spectrum disorder. Journal of Autism and Developmental Disorders, 49, 279–293. doi: https://doi.org/10.1007/s10803-018-3757-9

   Matthies, S., & Philipsen, A. (2016). Comorbidity of personality disorders and adult attention deficit hyperactivity disorder (ADHD)—review of recent findings. Current Psychiatry Reports, 18(4), 1–7. doi: https://doi.org/10.1007/s11920-016-0675-4

   Milton, D. E. M.  (2012). On the ontological status of autism: The ‘double empathy problem’. Disability & Society, 27(6), 883–887. doi: https://doi.org/10.1080/09687599.2012.710008

   Poria, S., Majumder, N., Mihalcea, R., & Hovy, E. (2019). Emotion recognition in conversation: Research challenges, datasets, and recent advances. IEEE Access, 7, 100943–100953. doi: https://doi.org/10.1109/ACCESS.2019.2929050

   Prentice, C., Lopes, S. D., & Wang, X. (2020). Emotional intelligence or artificial intelligence—an employee perspective. Journal of Hospitality Marketing & Management, 29(4), 377–403. doi: https://doi.org/10.1080/19368623.2019.1647124

   Rhue, L. (2018). Racial influence on automated perceptions of emotions. SSRN Electronic Journal. doi: https://doi.org/10.2139/ssrn.3216634

   Schuller, D., & Schuller, B. W. (2018). The age of artificial emotional intelligence. Computer, 51(9), 38–46. doi: https://doi.org/10.1109/MC.2018.3620963

   Thoma, P., Friedmann, C., & Suchan, B. (2013). Empathy and social problem solving in alcohol dependence, mood disorders and selected personality disorders. Neuroscience & Biobehavioral Reviews, 37(3), 448–470. doi: https://doi.org/10.1016/j.neubiorev.2013.01.024

   Uekermann, J., Kraemer, M., Abdel-Hamid, M., Schimmelmann, B. G., Hebebrand, J., Daum, I., … Kis, B. (2010). Social cognition in attention-deficit hyperactivity disorder (ADHD). Neuroscience & Biobehavioral Reviews, 34(5), 734–743. doi: https://doi.org/10.1016/j.neubiorev.2009.10.009

   Vanheule, S., Desmet, M., Meganck, R., & Bogaerts, S. (2007). Alexithymia and interpersonal problems. Journal of Clinical Psychology, 63(1), 109–117. doi: https://doi.org/10.1002/jclp.20324

   Watson, D., & Clark, L. A. (2020). Personality traits as an organizing framework for personality pathology. Personality and Mental Health, 14, 51–75. doi: https://doi.org/10.1002/pmh.1458

   Yannakakis, G. N., Cowie, R., & Busso, C. (2021). The ordinal nature of emotions: An emerging approach. IEEE Transactions on Affective Computing, 12(1), 16–35. doi: https://doi.org/10.1109/TAFFC.2018.2879512