
An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages

Fatima Isiaka¹, Zainab Adamu², Muhammad A. Adamu³

Section: Research Paper, Product Type: Journal Paper
Vol. 9, Issue 6, pp. 72-78, Dec 2021


Published online on Dec 31, 2021


Copyright © Fatima Isiaka, Zainab Adamu, Muhammad A. Adamu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
 


How to Cite this Paper

IEEE Style Citation: Fatima Isiaka, Zainab Adamu, Muhammad A. Adamu, “An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages,” International Journal of Scientific Research in Computer Science and Engineering, Vol.9, Issue.6, pp.72-78, 2021.

MLA Style Citation: Fatima Isiaka, Zainab Adamu, Muhammad A. Adamu. "An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages." International Journal of Scientific Research in Computer Science and Engineering 9.6 (2021): 72-78.

APA Style Citation: Fatima Isiaka, Zainab Adamu, Muhammad A. Adamu (2021). An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages. International Journal of Scientific Research in Computer Science and Engineering, 9(6), 72-78.

BibTeX Style Citation:
@article{Isiaka_2021,
  author = {Fatima Isiaka and Zainab Adamu and Muhammad A. Adamu},
  title = {An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages},
  journal = {International Journal of Scientific Research in Computer Science and Engineering},
  volume = {9},
  number = {6},
  month = dec,
  year = {2021},
  issn = {2347-2693},
  pages = {72-78},
  url = {https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=2608},
  publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
UR - https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=2608
TI - An Emoji Based Emotion Recognition System for Fixed and Active Corporate Webpages
T2 - International Journal of Scientific Research in Computer Science and Engineering
AU - Isiaka, Fatima
AU - Adamu, Zainab
AU - Adamu, Muhammad A.
PY - 2021
DA - 2021/12/31
PB - IJCSE, Indore, INDIA
SP - 72-78
IS - 6
VL - 9
SN - 2347-2693
ER -


Abstract:
One of AI's major contributions to modern technology is the application of emotion recognition tools, which are mostly based on facial expression and the modification of an inference engine. Facial recognition schemes are typically built to understand user expressions on online business and marketing webpages, but they have limited ability to recognise elusive expressions. The basic emotions are expressed when interacting and socialising with other people online, and understanding user expressions, especially the subtle ones, is often a tedious task. An emotion recognition system can be used to optimise and reduce the complexity of understanding users' subconscious thoughts and reasoning through their pupil changes. This paper demonstrates the use of a PC webcam to read in eye movement data that includes pupil changes as part of distinct user attributes. A custom eye movement algorithm (CAMA) captures user activity and records the data, which serves as input to an inference engine, an artificial neural network (ANN), that predicts the user's emotional response and conveys it as emoticons on the webpage. The dynamic webpage is rendered static for gaze-point and emoji-coordinate reiteration. The performance-error results show that the ANN is highly adaptable to user behaviour prediction and can be used for the system's modification paradigm.
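A minimal sketch, not the authors' CAMA implementation, of the pipeline the abstract outlines: webcam frames, pupil-size features, an ANN, and an emoticon label. It assumes OpenCV (cv2) for capture and eye detection and scikit-learn for the network; the dark-pixel pupil proxy, the synthetic training data, the network shape, and the emotion-to-emoji mapping are all illustrative placeholders.

import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical three-class emotion-to-emoji mapping.
EMOJIS = {0: "🙂", 1: "😐", 2: "😟"}

def pupil_ratio(gray_eye):
    # Crude pupil-size proxy: fraction of dark pixels in the detected eye patch.
    _, dark = cv2.threshold(gray_eye, 40, 255, cv2.THRESH_BINARY_INV)
    return cv2.countNonZero(dark) / dark.size

# Stand-in ANN fitted on synthetic features; a real system would train on
# recorded eye-movement sessions labelled with users' emotional responses.
rng = np.random.default_rng(0)
X_train = rng.random((300, 2))                      # columns: [pupil ratio, ratio change]
y_train = np.clip((X_train[:, 0] * 3).astype(int), 0, 2)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X_train, y_train)

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
cap = cv2.VideoCapture(0)                           # the PC webcam, as in the paper
prev = None
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.3, 5):
        r = pupil_ratio(gray[y:y + h, x:x + w])
        delta = 0.0 if prev is None else r - prev   # pupil change between frames
        prev = r
        label = int(ann.predict([[r, delta]])[0])
        print("predicted emotional response ->", EMOJIS[label])
    cv2.imshow("webcam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):           # press q to stop
        break
cap.release()
cv2.destroyAllWindows()

In the paper's setup the predicted emoticon would be overlaid on the statically rendered webpage at the current gaze coordinates rather than printed to the console.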

Key-Words / Index Terms:
AI, static business webpages, artificial neural network, emotion recognition system, pupil changes, user expression, eye movement behaviour

