-
Bayesian Optimization with Tree Ensembles to Improve Depression Screening on Textual Datasets IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-08-13 Tingting Zhao, ML Tlachac
-
VAD: A Video Affective Dataset with Danmu IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-28 Shangfei Wang, Xin Li, Feiyi Zheng, Jicai Pan, Xuewei Li, Yanan Chang, Zhou'an Zhu, Qiong Li, Jiahe Wang, Yufei Xiao
-
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-21 Bin Liang, Lin Gui, Yulan He, Erik Cambria, Ruifeng Xu
-
Emotion-Aware Multimodal Fusion for Meme Emotion Detection IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-20 Shivam Sharma, Ramaneswaran S, Md. Shad Akhtar, Tanmoy Chakraborty
-
Contrastive Learning based Modality-Invariant Feature Acquisition for Robust Multimodal Emotion Recognition with Missing Modalities IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-18 Rui Liu, Haolin Zuo, Zheng Lian, Bjorn W. Schuller, Haizhou Li
-
A Multi-Stage Visual Perception Approach for Image Emotion Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-08 Jicai Pan, Jinqiao Lu, Shangfei Wang
-
Can Large Language Models Assess Personality from Asynchronous Video Interviews? A Comprehensive Evaluation of Validity, Reliability, Fairness, and Rating Patterns IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-08 Tianyi Zhang, Antonis Koutsoumpis, Janneke K. Oostrom, Djurre Holtrop, Sina Ghassemi, Reinout E. de Vries
-
Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-03-01 Luz Martinez-Lucas, Wei-Cheng Lin, Carlos Busso
-
GDDN: Graph Domain Disentanglement Network for Generalizable EEG Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-29 Bianna Chen, C. L. Philip Chen, Tong Zhang
-
Guest Editorial: Ethics in Affective Computing IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-29 Jonathan Gratch, Gretchen Greene, Rosalind Picard, Lachlan Urquhart, Michel Valstar
Stunning advances in machine learning are heralding a new era in sensing, interpreting, simulating and stimulating human emotion. In the human sciences, research is increasingly highlighting the explanatory power of emotions, feelings, and other affective processes to predict how we think and behave. This is beginning to translate into an explosion of applications that can improve human wellbeing including
-
Dep-FER: Facial Expression Recognition in Depressed Patients Based on Voluntary Facial Expression Mimicry IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-27 Jiayu Ye, Yanhong Yu, Yunshao Zheng, Yang Liu, Qingxiang Wang
-
Vesper: A Compact and Effective Pretrained Model for Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-26 Weidong Chen, Xiaofen Xing, Peihao Chen, Xiangmin Xu
-
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-22 Ritik Vatsal, Shrivatsa Mishra, Rushil Thareja, Mrinmoy Chakrabarty, Ojaswa Sharma, Jainendra Shukla
-
Continuous Emotion Ambiguity Prediction: Modeling with Beta Distributions IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-20 Deboshree Bose, Vidhyasaharan Sethu, Eliathamby Ambikairajah
-
Facial Action Unit Detection and Intensity Estimation from Self-supervised Representation IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-19 Bowen Ma, Rudong An, Wei Zhang, Yu Ding, Zeng Zhao, Rongsheng Zhang, Tangjie Lv, Changjie Fan, Zhipeng Hu
-
Cross-Task Inconsistency Based Active Learning (CTIAL) for Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-16 Yifan Xu, Xue Jiang, Dongrui Wu
-
Bodily Sensation Map vs. Bodily Motion Map: Visualizing and Analyzing Emotional Body Motions IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-14 Myeongul Jung, Youngwug Cho, Jejoong Kim, Hyungsook Kim, Kwanguk Kim
-
Looking into Gait for Perceiving Emotions via Bilateral Posture and Movement Graph Convolutional Networks IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-13 Yingjie Zhai, Guoli Jia, Yu-Kun Lai, Jing Zhang, Jufeng Yang, Dacheng Tao
-
Multi-Modal Hierarchical Empathetic Framework for Social Robots With Affective Body Control IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-12 Yue Gao, Yangqing Fu, Ming Sun, Feng Gao
-
Avatar-Based Feedback in Job Interview Training Impacts Action Identities and Anxiety IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-08 Sarinasadat Hosseini, Jingyu Quan, Xiaoqi Deng, Yoshihiro Miyake, Takayuki Nozawa
-
Novel VR-Based Biofeedback Systems: A Comparison Between Heart Rate Variability- and Electrodermal Activity-Driven Approaches IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-08 Andrea Baldini, Elisabetta Patron, Claudio Gentili, Enzo Pasquale Scilingo, Alberto Greco
-
An Open-source Benchmark of Deep Learning Models for Audio-visual Apparent and Self-reported Personality Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-08 Rongfan Liao, Siyang Song, Hatice Gunes
-
Emotion Recognition in Conversation Based on a Dynamic Complementary Graph Convolutional Network IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-02-01 Zhenyu Yang, Xiaoyang Li, Yuhu Cheng, Tong Zhang, Xuesong Wang
-
Learning With Rater-Expanded Label Space to Improve Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-31 Shreya G. Upadhyay, Woan-Shiuan Chien, Bo-Hao Su, Chi-Chun Lee
-
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-31 Xiaoheng Zhang, Weigang Cui, Bin Hu, Yang Li
-
Modeling the Interplay Between Cohesion Dimensions: a Challenge for Group Affective Emergent States IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-30 Lucien Maman, Nale Lehmann-Willenbrock, Mohamed Chetouani, Laurence Likforman-Sulem, Giovanna Varni
-
Exploring Retrospective Annotation in Long-videos for Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-29 Patricia Bota, Pablo Cesar, Ana Fred, Hugo Placido da Silva
-
CFDA-CSF: A Multi-modal Domain Adaptation Method for Cross-subject Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-23 Magdiel Jiménez-Guarneros, Gibran Fuentes-Pineda
-
Show me How You Use Your Mouse and I Tell You How You Feel? Sensing Affect with the Computer Mouse IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-23 Paul Freihaut, Anja S. Göritz
-
How Virtual Reality Therapy Affects Refugees from Ukraine - acute stress reduction pilot study IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-10 Dorota Kamińska, Grzegorz Zwoliński, Dorota Merecz-Kot
-
Anthropomorphism and Affective Perception: Dimensions, Measurements, and Interdependencies in Aerial Robotics IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-04 Viviane Herdel, Anastasia Kuzminykh, Yisrael Parmet, Jessica R. Cauchard
-
Gusa: Graph-based Unsupervised Subdomain Adaptation for Cross-Subject EEG Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-04 Xiaojun Li, C. L. Philip Chen, Bianna Chen, Tong Zhang
-
MASANet: Multi-Aspect Semantic Auxiliary Network for Visual Sentiment Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2024-01-04 Jinglun Cen, Chunmei Qing, Haochun Ou, Xiangmin Xu, Junpeng Tan
-
Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-11-28 Ioannis T. Pavlidis, Theodora Chaspari, Daniel McDuff
In the formative years of Affective Computing [1], from the late 1990s into the early 2000s, a significant fraction of research attention was focused on developing methods for unobtrusive physiological measurement. It quickly became obvious that wiring people with electrodes and strapping cumbersome hardware to their bodies was not only restricting the types of experiments that could be
-
Automatic Deceit Detection Through Multimodal Analysis of High-Stake Court-Trials IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-10-05 Berat Biçer, Hamdi Dibeklioğlu
In this article, we propose the use of convolutional self-attention for attention-based representation learning, while replacing traditional vectorization methods with a transformer as the backbone of our speech model for transfer learning within our automatic deceit detection framework. This design performs a multimodal data analysis and applies fusion to merge visual, vocal, and speech (textual) channels;
-
Guest Editorial Neurosymbolic AI for Sentiment Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-09-18 Frank Xing, Björn Schuller, Iti Chaturvedi, Erik Cambria, Amir Hussain
Neural network-based methods, especially deep learning, have been a burgeoning area in AI research and have been successful in tackling the expanding data volume as we move into a digital age. Today, neural network-based methods are not only used for low-level cognitive tasks, such as recognizing objects and spotting keywords, but have also been deployed in various industrial information systems
-
Confidence-Aware Sentiment Quantification via Sentiment Perturbation Modeling IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-08-04 Xiangyun Tang, Dongliang Liao, Meng Shen, Liehuang Zhu, Shen Huang, Gongfu Li, Hong Man, Jin Xu
Sentiment Quantification aims to detect the overall sentiment polarity of users from a set of reviews corresponding to a target. Existing methods equally treat and aggregate individual reviews’ sentiment to judge the overall sentiment polarity. However, the confidence of each review is not equal in sentiment quantification where sentiment perturbation arising from high- and low-confidence reviews may
-
Text-Based Fine-Grained Emotion Prediction IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-07-24 Gargi Singh, Dhanajit Brahma, Piyush Rai, Ashutosh Modi
Text-based emotion prediction is an important task in the field of affective computing. Most prior work has been restricted to predicting emotions corresponding to a few high-level emotion classes. This paper explores and experiments with various techniques for fine-grained (27 classes) emotion prediction$^\dagger$. In particular, 1) we present a method to incorporate multiple annotations from different
-
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-07-24 Tanja Schneeberger, Mirella Hladký, Ann-Kristin Thurner, Jana Volkert, Alexander Heimerl, Tobias Baur, Elisabeth André, Patrick Gebhard
Understanding emotions is key to Affective Computing. Emotion recognition focuses on the communicative component of emotions encoded in social signals. This view alone is insufficient for a deeper understanding and computational representation of the internal, subjectively experienced component of emotions. This article presents a cognition-based method called Deep as a starting point for deeper computational
-
Spatio-Temporal Graph Analytics on Secondary Affect Data for Improving Trustworthy Emotional AI IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-07-20 Md Taufeeq Uddin, Lijun Yin, Shaun Canavan
Ethical affective computing (AC) requires maximizing the benefits to users while minimizing its harm to obtain trust from users. This requires responsible development and deployment to ensure fairness, bias mitigation, privacy preservation, and accountability. To obtain this, we require methodologies that can quantify, visualize, analyze, and mine insights from affect data. Hence, in this article,
-
Encoding Syntactic Information into Transformers for Aspect-Based Sentiment Triplet Extraction IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-07-07 Li Yuan, Jin Wang, Liang-Chih Yu, Xuejie Zhang
Aspect-based sentiment triplet extraction (ASTE) aims to extract triplets consisting of aspect terms and their associated opinion terms and sentiment polarities from sentences, a relatively new and challenging subtask of aspect-based sentiment analysis (ABSA). Previous studies have used either pipeline models or unified tagging schema models. These models ignore the syntactic relationships between
-
A Knowledge-Enhanced and Topic-Guided Domain Adaptation Model for Aspect-Based Sentiment Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-07-04 Yushi Zeng, Guohua Wang, Haopeng Ren, Yi Cai, Ho-Fung Leung, Qing Li, Qingbao Huang
Cross-domain aspect-based sentiment analysis has recently attracted significant attention, as it can effectively alleviate the lack of large-scale labeled data that supervised learning methods require. Most current methods mainly focus on extracting domain-shared syntactic features to conduct the domain adaptation. Because language and syntax differ between domains, these methods
-
Adversarial Domain Generalized Transformer for Cross-Corpus Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-29 Yuan Gao, Longbiao Wang, Jiaxing Liu, Jianwu Dang, Shogo Okada
Speech emotion recognition (SER) promotes the development of intelligent devices, which enable natural and friendly human-computer interactions. However, the recognition performance of existing approaches is significantly reduced on unseen datasets, and the lack of sufficient training data limits the generalizability of deep learning models. In this article, we analyze the impact of the domain generalization
-
Dynamic Alignment and Fusion of Multimodal Physiological Patterns for Stress Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-28 Xiaowei Zhang, Xiangyu Wei, Zhongyi Zhou, Qiqi Zhao, Sipo Zhang, Yikun Yang, Rui Li, Bin Hu
Stress has been identified as one of the major causes of health issues. To detect stress levels with higher accuracy, fusion of multimodal physiological signals is a promising technique. However, there is an asynchrony between physiological signals observed from different perspectives. Exploring the temporal alignment relationship between modalities helps improve the quality of multimodal fusion
-
PR-PL: A Novel Prototypical Representation Based Pairwise Learning Framework for Emotion Recognition Using EEG Signals IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-23 Rushuang Zhou, Zhiguo Zhang, Hong Fu, Li Zhang, Linling Li, Gan Huang, Fali Li, Xin Yang, Yining Dong, Yuan-Ting Zhang, Zhen Liang
Affective brain-computer interface based on electroencephalography (EEG) is an important branch in the field of affective computing. However, the individual differences in EEG emotional data and the noisy labeling problem in the subjective feedback seriously limit the effectiveness and generalizability of existing models. To tackle these two critical issues, we propose a novel transfer learning framework
-
Fine-Grained Interpretability for EEG Emotion Recognition: Concat-Aided Grad-CAM and Systematic Brain Functional Network IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-23 Bingxiu Liu, Jifeng Guo, C. L. Philip Chen, Xia Wu, Tong Zhang
EEG emotion recognition plays a significant role in various mental health services. Deep learning-based methods perform excellently but still suffer from poor interpretability. Although methods such as Gradient-weighted Class Activation Mapping (Grad-CAM) can cope with the above problem, their coarse granularity cannot accurately reveal the mechanism to promote emotional intelligence. In this paper, fine-grained
-
RLWOA-SOFL: A New Learning Model-Based Reinforcement Swarm Intelligence and Self-Organizing Deep Fuzzy Rules for fMRI Pain Decoding IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-22 Ahmed M. Anter, Zhiguo Zhang
Pain is highly subjective, so it is always desirable to develop objective pain assessment methods. Brain imaging techniques, such as functional magnetic resonance imaging (fMRI), have the potential to provide a physiological and quantitative pain assessment tool. However, the ultra-high-dimensional fMRI data and the nonlinear relationship between fMRI and pain greatly degrade the efficiency of fMRI-based
-
Belief Mining in Persian Texts Based on Deep Learning and Users' Opinions IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-21 Hossein Alikarami, Amir Massoud Bidgoli, Hamid Haj Seyed Javadi
Belief mining and the study of public opinion provide valuable information. Analyzing the feelings and beliefs expressed in social media data leads to understanding users' opinions and has wide applications in decision making and policymaking. This article applies a new deep learning-based method to solve the problems of belief mining for Persian comments on Twitter. In this method, first, the data
-
Disagreement Matters: Exploring Internal Diversification for Redundant Attention in Generic Facial Action Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-16 Xiaotian Li, Zheng Zhang, Xiang Zhang, Taoyue Wang, Zhihua Li, Huiyuan Yang, Umur Ciftci, Qiang Ji, Jeffrey Cohn, Lijun Yin
This paper demonstrates the effectiveness of a diversification mechanism for building a more robust multi-attention system in generic facial action analysis. While previous multi-attention (e.g., visual attention and self-attention) research on facial expression recognition (FER) and Action Unit (AU) detection have been thoroughly studied to focus on “external attention diversification”, where attention
-
MGEED: A Multimodal Genuine Emotion and Expression Detection Database IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-15 Yiming Wang, Hui Yu, Weihong Gao, Yifan Xia, Charles Nduka
Multimodal emotion recognition has attracted increasing interest from academia and industry in recent years, since it enables emotion detection using various modalities, such as facial expression images, speech and physiological signals. Although research in this field has grown rapidly, it is still challenging to create a multimodal database containing facial electrical information due to the difficulty
-
Transformer-Augmented Network With Online Label Correction for Facial Expression Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-13 Fuyan Ma, Bin Sun, Shutao Li
Facial expression recognition (FER) in the wild is extremely challenging due to occlusions, variant head poses under unconstrained conditions and incorrect annotations (e.g., label noise). In this article, we aim to improve the performance of in-the-wild FER with Transformers and online label correction. Different from pure CNNs based methods, we propose a Transformer-augmented network (TAN) to dynamically
-
WiFE: WiFi and Vision Based Unobtrusive Emotion Recognition via Gesture and Facial Expression IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-13 Yu Gu, Xiang Zhang, Huan Yan, Jingyang Huang, Zhi Liu, Mianxiong Dong, Fuji Ren
Emotion plays a critical role in making computers more human-like. As the first and most essential step, emotion recognition has recently emerged as a hot but relatively nascent topic: current research mainly focuses on a single modality (e.g., facial expression), while human emotion expressions are multi-modal in nature. To this end, we propose an unobtrusive emotion recognition system leveraging
-
End-to-End Label Uncertainty Modeling in Speech Emotion Recognition Using Bayesian Neural Networks and Label Distribution Learning IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-07 Navin Raj Prabhu, Nale Lehmann-Willenbrock, Timo Gerkmann
To train machine learning algorithms to predict emotional expressions in terms of arousal and valence, annotated datasets are needed. However, as different people perceive others’ emotional expressions differently, their annotations are subjective. To account for this, annotations are typically collected from multiple annotators and averaged to obtain ground-truth labels. However, when exclusively
-
Generating Multiple 4D Expression Transitions by Learning Face Landmark Trajectories IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-05 Naima Otberdout, Claudio Ferrari, Mohamed Daoudi, Stefano Berretti, Alberto Del Bimbo
In this article, we address the problem of 4D facial expressions generation. This is usually addressed by animating a neutral 3D face to reach an expression peak, and then get back to the neutral state. In the real world though, people show more complex expressions, and switch from one expression to another. We thus propose a new model that generates transitions between different expressions, and synthesizes
-
Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-05 Wenbo Zheng, Lan Yan, Fei-Yue Wang
Depression is a critical problem in modern society that affects an estimated 350 million people worldwide, causing feelings of sadness and a lack of interest and pleasure. Emotional disorders are gaining interest and are closely entwined with depression, because one contributes to an understanding of the other. Despite the achievements in the two separate tasks of emotion recognition and depression
-
Multi-Task Momentum Distillation for Multimodal Sentiment Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-06-02 Ronghao Lin, Haifeng Hu
In the field of Multimodal Sentiment Analysis (MSA), the prevailing methods are devoted to developing intricate network architectures to capture the intra- and inter-modal dynamics, which necessitates numerous parameters and poses more difficulties in terms of interpretability in multimodal modeling. Besides, the heterogeneous nature of multiple modalities (text, audio, and vision) introduces significant
-
An (E)Affective Bind: Situated Affectivity and the Prospect of Affect Recognition IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-05-29 Jason Branford
Several prominent criticisms have recently challenged the possibility of algorithmically determining or recognising human affect. This paper ethically evaluates one underexplored avenue for overcoming such deficiencies in categorical affect recognition technologies (ARTs). Specifically, the emerging literature on “situated affectivity” offers valuable guidance on three fronts. First, it conceptually
-
Opacity, Transparency, and the Ethics of Affective Computing IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-05-23 Manohar Kumar, Aisha Aijaz, Omkar Chattar, Jainendra Shukla, Raghava Mutharaju
Human opacity is the intrinsic quality of unknowability of human beings with respect to machines. The descriptive relationship between humans and machines, which captures how much information one can gather about the other, can be explicated using an opacity-transparency relationship. This relationship allows us to describe and normatively evaluate a spectrum of opacity where humans and machines may
-
A Quantum Probability Driven Framework for Joint Multi-Modal Sarcasm, Sentiment and Emotion Analysis IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-05-23 Yaochen Liu, Yazhou Zhang, Dawei Song
Sarcasm, sentiment, and emotion are three typical kinds of spontaneous affective responses of humans to external events and they are tightly intertwined with each other. Such events may be expressed in multiple modalities (e.g., linguistic, visual and acoustic), e.g., multi-modal conversations. Joint analysis of humans’ multi-modal sarcasm, sentiment, and emotion is an important yet challenging topic
-
The Ethics of AI in Games IEEE Trans. Affect. Comput. (IF 9.6) Pub Date : 2023-05-16 David Melhart, Julian Togelius, Benedikte Mikkelsen, Christoffer Holmgård, Georgios N. Yannakakis
Video games are one of the richest and most popular forms of human-computer interaction and, hence, their role is critical for our understanding of human behaviour and affect at a large scale. As artificial intelligence (AI) tools are gradually adopted by the game industry a series of ethical concerns arise. Such concerns, however, have so far not been extensively discussed in a video game context